Pose estimation: solvePnP and epipolar geometry do not agree

Hi, I have a relative camera pose estimation problem: I am looking at a scene with two differently oriented cameras spaced a certain distance apart. First I compute the essential matrix with the 5-point algorithm and decompose it to get the R and t of camera 2 w.r.t. camera 1. As a sanity check, I then triangulate the two sets of image points into 3D and run solvePnP on the resulting 3D-2D correspondences, but the result I get from solvePnP is way off. I am doing this because bundle adjustment would not work for me (my scale keeps changing), so I thought this would be one way to "refine" my pose; correct me if I am wrong.

In one test case I had a 45 degree rotation between camera 1 and camera 2 about the Z axis, and the epipolar geometry part gave me this answer:

    Relative camera rotation is [1.46774, 4.28483, 40.4676]
    Translation vector is [-0.778165583410928; -0.6242059242696293; -0.06946429947410336]

solvePnP, on the other hand, gives:

    Camera 1:
    rvec  [0.3830144497209735; -0.5153903947692436; -0.001401186630803216]
    tvec  [-1777.451836911453; -1097.111339375749; 3807.545406775675]
    Euler [24.0615, -28.7139, -6.32776]

    Camera 2:
    rvec  [1407374883553280; 1337006420426752; 774194163884064.1]   (!!)
    tvec  [1.249151852575814; -4.060149502748567; -0.06899980661249146]
    Euler [-122.805, -69.3934, 45.7056]

Something is troublingly off with the rvec of camera 2 and the tvec of camera 1. My code for the point triangulation and solvePnP looks like this:

    points1.convertTo(points1, CV_32F);
    points2.convertTo(points2, CV_32F);

    // Normalize image points (subtract principal point, divide by focal length)
    points1.col(0) = (points1.col(0) - pp.x) / focal;
    points2.col(0) = (points2.col(0) - pp.x) / focal;
    points1.col(1) = (points1.col(1) - pp.y) / focal;
    points2.col(1) = (points2.col(1) - pp.y) / focal;

    points1 = points1.t();
    points2 = points2.t();

    cv::triangulatePoints(P1, P2, points1, points2, points3DH);

    cv::Mat points3D;
    cv::convertPointsFromHomogeneous(cv::Mat(points3DH.t()).reshape(4, 1), points3D);

    cv::solvePnP(points3D, points1.t(), K, cv::noArray(), rvec1, tvec1, 1, CV_ITERATIVE);
    cv::solvePnP(points3D, points2.t(), K, cv::noArray(), rvec2, tvec2, 1, CV_ITERATIVE);

I then convert the rvecs through Rodrigues to get the Euler angles, but since the rvecs and tvecs themselves already look wrong, I suspect something is wrong with my process. Any pointers would be helpful. Thanks!
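For context, the epipolar-geometry step I described above looks roughly like this. This is a minimal sketch rather than my exact code, and it assumes OpenCV 3.x, matched points in pixel coordinates, and a known focal length focal and principal point pp taken from K:

    #include <opencv2/opencv.hpp>

    // Sketch of the epipolar-geometry step: 5-point essential matrix plus a
    // cheirality-checked decomposition into R and t (t is only known up to scale).
    void relativePose(const std::vector<cv::Point2f>& pts1,
                      const std::vector<cv::Point2f>& pts2,
                      double focal, cv::Point2d pp,
                      cv::Mat& R, cv::Mat& t)
    {
        cv::Mat mask;
        // 5-point algorithm inside a RANSAC loop; mask flags the inlier matches
        cv::Mat E = cv::findEssentialMat(pts1, pts2, focal, pp, cv::RANSAC, 0.999, 1.0, mask);
        // recoverPose picks the (R, t) decomposition that places the triangulated
        // points in front of both cameras
        cv::recoverPose(E, pts1, pts2, R, t, focal, pp, mask);
    }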
