I'm using OpenCV to compute the PCA of a set of pixels represented as 2D points, but I'm unsure how PCA calculates the mean of these points, because the result is not what I would expect.
If I have the following 3 points (x,y):
(2,1), (2,2), (2,3)
I then calculate the PCA like this (OpenCvSharp is used, but it is just a thin wrapper around OpenCV):
```csharp
// Three 2D points, one per row: (2,1), (2,2), (2,3).
Mat input = new Mat(3, 2, MatType.CV_32FC1);
input.Set(0, 0, 2f);
input.Set(0, 1, 1f);
input.Set(1, 0, 2f);
input.Set(1, 1, 2f);
input.Set(2, 0, 2f);
input.Set(2, 1, 3f);

// An empty mean Mat asks OpenCV to compute the mean from the data.
Mat mean = new Mat();
PCA pcaResult = new PCA(input, mean, PCA.Flags.DataAsRow);

float meanX = pcaResult.Mean.Get<float>(0, 0);
float meanY = pcaResult.Mean.Get<float>(0, 1);
```
`meanX` and `meanY` come back as `0.6666667` and `0`, respectively, whereas I would have expected `(2, 2)`, i.e. the per-column average of the three points (I'm guessing some form of normalisation is occurring). How is the PCA mean being calculated, and what do I need to do to get the mean I expect? Or do I need to pass the mean into PCA myself, as sketched below?
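For reference, this is how I would compute the mean I'm expecting, by averaging the columns of the same `input` matrix (using `Cv2.Reduce` is my assumption of the right helper here):

```csharp
// Column-wise average of the 3x2 input; I expect this to be (2, 2).
// ReduceDimension.Row collapses the matrix to a single row, so each
// element of the output is the mean of one input column.
Mat expectedMean = new Mat();
Cv2.Reduce(input, expectedMean, ReduceDimension.Row, ReduceTypes.Avg, MatType.CV_32FC1);
float expectedX = expectedMean.Get<float>(0, 0); // 2
float expectedY = expectedMean.Get<float>(0, 1); // 2
```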
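And if passing the mean in is the way to go, is it just a matter of handing PCA a pre-filled 1×2 matrix instead of the empty one? A minimal sketch of what I mean (assuming a non-empty `mean` is used as given rather than recomputed):

```csharp
// Precomputed 1x2 mean (2, 2) supplied to PCA in place of the empty Mat.
Mat precomputedMean = new Mat(1, 2, MatType.CV_32FC1);
precomputedMean.Set(0, 0, 2f);
precomputedMean.Set(0, 1, 2f);
PCA pcaWithMean = new PCA(input, precomputedMean, PCA.Flags.DataAsRow);
```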