CStats:MaxClipMean
The MaxClipMean method computes the maximum-clipped mean value of a data sample, that is, the mean calculated after excluding the maximum value from the sample. Use this method to avoid a biased result when the region is known to contain one deviant, high pixel value. The sample may be a Lua table or a class object of type CImage, CArray, or CMatrix. For CImage and CMatrix objects, an optional CRect object can be used to define the points used in the calculation.
nMean, nStdDev = CStats:MaxClipMean( table )
nMean, nStdDev = CStats:MaxClipMean( CImage )
nMean, nStdDev = CStats:MaxClipMean( CImage, CRect )
nMean, nStdDev = CStats:MaxClipMean( CArray )
nMean, nStdDev = CStats:MaxClipMean( CArray, CRect )
nMean, nStdDev = CStats:MaxClipMean( CMatrix )
where
table is a Lua table containing the data.
CImage, CArray, and CMatrix are class objects containing the data to measure.
CRect is a rectangle object that defines the region to measure.
nMean and nStdDev are the max-clipped mean and standard deviation of the data. On failure, 0,0 is returned.
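For reference, the max-clipped mean excludes one occurrence of the maximum value and averages the rest of the sample. The plain-Lua sketch below illustrates the idea for a table of numbers; it is not part of the CStats class, the function name is hypothetical, and the standard deviation is omitted:

function maxClipMeanSketch( t )          -- hypothetical helper, for illustration only
  local sum, maxVal = 0, -math.huge
  for _, v in ipairs( t ) do
    sum = sum + v
    if v > maxVal then maxVal = v end
  end
  return ( sum - maxVal ) / ( #t - 1 )   -- drop one maximum, average the remainder
end

-- maxClipMeanSketch( { 2, 4, 6, 100 } ) returns 4, the mean of { 2, 4, 6 }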
Suppose that CImage object I and CRect object R exist. The following script returns the mean value inside a rectangle on the image, discarding the single highest pixel value:
S = new_cstats()                                   -- create a CStats object (new_cstats constructor assumed)
nMean, nStdDev = S:MaxClipMean( I, R )             -- returns the mean value
Printf( "Mean= %f  Sdev= %f\n", nMean, nStdDev )   -- list the result (Printf assumed available)
The following script returns the max-clipped mean value for a table of data:
S = new_cstats()                                   -- create a CStats object (new_cstats constructor assumed)
T = { 21, 23, 22, 24, 20, 115, 23, 22 }            -- create some data in a table (illustrative values)
nMean, nStdDev = S:MaxClipMean( T )                -- returns the max-clipped mean value
Printf( "Mean= %f  Sdev= %f\n", nMean, nStdDev )   -- list the result
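The same calling pattern applies to CArray and CMatrix data. A minimal sketch, assuming a CMatrix object M already exists and reusing the new_cstats and Printf calls assumed in the examples above:

S = new_cstats()                                   -- create a CStats object
nMean, nStdDev = S:MaxClipMean( M )                -- max-clipped mean over the entire matrix
Printf( "Mean= %f  Sdev= %f\n", nMean, nStdDev )   -- list the result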