CStats:MinMaxClipMean
The MinMaxClipMean method computes the mean value of a data sample, excluding the single minimum and single maximum value from the sample. Use this method to avoid a biased result when the region is suspected to contain one deviant high value and one deviant low value. The sample may be a Lua table or a class object of type CImage, CArray, or CMatrix. For CImage and CMatrix objects, an optional CRect object can be used to define the points used in the calculation.
nMean, nStdDev = CStats:MinMaxClipMean( table )
nMean, nStdDev = CStats:MinMaxClipMean( CImage )
nMean, nStdDev = CStats:MinMaxClipMean( CImage, CRect )
nMean, nStdDev = CStats:MinMaxClipMean( CArray )
nMean, nStdDev = CStats:MinMaxClipMean( CMatrix )
nMean, nStdDev = CStats:MinMaxClipMean( CMatrix, CRect )
where
table is a Lua table containing the data.
CImage, CArray, and CMatrix are class objects containing the data to measure.
CRect is a rectangle object that defines the region to measure.
nMean and nStdDev are the min/max-clipped mean and the standard deviation of the data. On failure, 0,0 is returned.
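To make the computation concrete, the following plain-Lua sketch shows what a min/max-clipped mean does: it drops one minimum and one maximum value, then averages what remains. This is an illustration only; the helper function minMaxClipMean is not part of the CStats class, and the sample standard deviation convention is an assumption.

-- illustration only: a plain-Lua min/max-clipped mean (not part of the CStats API)
local function minMaxClipMean( t )
  if #t < 3 then return 0, 0 end            -- mirror the documented 0,0 failure return
  -- locate one minimum and one maximum entry
  local iMin, iMax = 1, 1
  for i, v in ipairs( t ) do
    if v < t[iMin] then iMin = i end
    if v > t[iMax] then iMax = i end
  end
  -- mean of the remaining values
  local sum, n = 0, 0
  for i, v in ipairs( t ) do
    if i ~= iMin and i ~= iMax then
      sum = sum + v
      n = n + 1
    end
  end
  local mean = sum / n
  -- spread of the remaining values (sample standard deviation assumed)
  local ss = 0
  for i, v in ipairs( t ) do
    if i ~= iMin and i ~= iMax then
      ss = ss + ( v - mean ) ^ 2
    end
  end
  local sdev = 0
  if n > 1 then sdev = math.sqrt( ss / ( n - 1 ) ) end
  return mean, sdev
end

-- for { 1, 2, 3, 4, 100 } this clips 1 and 100 and averages { 2, 3, 4 }:
print( minMaxClipMean( { 1, 2, 3, 4, 100 } ) )   --> 3    1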
Suppose a CImage I and a CRect R exist. The following script returns the mean value inside a rectangle on the image, discarding both the single highest and single lowest pixel values:
-- create a CStats object (new_stats() is assumed as the constructor here)
S = new_stats()
-- returns the mean value
nMean, nStdDev = S:MinMaxClipMean( I, R )
-- list the results
Printf( "Mean=%lg  StdDev=%lg\n", nMean, nStdDev )
-- when done with S, remove it from memory
delete_object( S )
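If no CRect is given, the calculation uses the entire image, per the CImage signature above. For example, using the same objects:

nMean, nStdDev = S:MinMaxClipMean( I )   -- whole image, no rectangle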
The following script returns the min/max-clipped mean value for a table of data:
-- create a CStats object
S = new_stats()
-- create some data in a table (arbitrary values, with one high and one low outlier)
T = { 9, 11, 10, 12, 250, 8, 1 }
-- compute the min/max-clipped mean, which drops the 250 and the 1
nMean, nStdDev = S:MinMaxClipMean( T )
-- list the result: the mean of the remaining values { 9, 11, 10, 12, 8 } is 10
Printf( "Mean=%lg  StdDev=%lg\n", nMean, nStdDev )
-- when done with S, remove it from memory
delete_object( S )
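Since MinMaxClipMean returns 0,0 on failure, a script can test for that pair before using the result; a minimal sketch using the table T above:

nMean, nStdDev = S:MinMaxClipMean( T )
if nMean == 0 and nStdDev == 0 then
   -- 0,0 is the documented failure return; note that valid data could also yield these values
   Printf( "MinMaxClipMean failed\n" )
end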