Open sourcing the calibration process

Hi,
Is it possible to know how you do the calibration so we can do it ourselves, without having to send you the calibration videos and wait for a response? I ask because I want to test multiple settings with multiple cameras over time.

I have been able to get an intrinsic matrix and distortion coefficients for my cameras, but they do not yield a perfectly calibrated image in ActionStitch.

For now, I filmed using a GoPro 9 and 11 Mini, both in 2.7K Superview.

Thanks!

Stitching with my own calibration gave me this result:

All that matters is this line:

ret, intrinsic_matrix, distCoeff, rvecs, tvecs = cv2.calibrateCamera(opts, ipts, grey_image.shape[::-1], None, None, None, None, cv2.CALIB_RATIONAL_MODEL)

cv2.CALIB_RATIONAL_MODEL means it is using the 8-parameter distortion model (k1–k6 plus the tangential p1, p2) instead of the default 5-parameter one.
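Roughly, opts (3D object points) and ipts (2D image points) come from standard chessboard detection before that call, something like the sketch below; adjust the board size and frame paths for your setup, as those values are just assumptions here:

```python
# Sketch of collecting chessboard points for cv2.calibrateCamera.
# Assumptions: a 9x6 inner-corner chessboard and extracted frames in ./frames.
import glob

import cv2
import numpy as np

pattern_size = (9, 6)  # inner corners per row and column (assumption)
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

opts, ipts = [], []  # 3D object points and 2D image points, one entry per frame
for path in sorted(glob.glob("frames/*.png")):  # frame location is an assumption
    grey_image = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(grey_image, pattern_size)
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy before calibrating
    corners = cv2.cornerSubPix(
        grey_image, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    opts.append(objp)
    ipts.append(corners)

ret, intrinsic_matrix, distCoeff, rvecs, tvecs = cv2.calibrateCamera(
    opts, ipts, grey_image.shape[::-1], None, None, None, None,
    cv2.CALIB_RATIONAL_MODEL)
```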

If the total reprojection error is smaller than 0.1, the calibration should be good enough.

Thank you, I was missing the cv2.CALIB_RATIONAL_MODEL flag. How would you compute the reprojection error?
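A common way to compute it from the calibrateCamera outputs above (reusing opts, ipts, rvecs, tvecs, intrinsic_matrix, and distCoeff) is the standard OpenCV per-point projection check; the snippet below is a generic sketch of that, not necessarily how ActionStitch reports it:

```python
# Mean reprojection error across all calibration frames (generic OpenCV sketch).
import cv2

total_error = 0.0
for objp, imgp, rvec, tvec in zip(opts, ipts, rvecs, tvecs):
    # Project the 3D board points back into the image with the estimated pose
    projected, _ = cv2.projectPoints(objp, rvec, tvec, intrinsic_matrix, distCoeff)
    # Average distance between detected corners and reprojected points
    total_error += cv2.norm(imgp, projected, cv2.NORM_L2) / len(projected)

mean_error = total_error / len(opts)
print(f"mean reprojection error: {mean_error:.4f} px")
```

Note also that the first value returned by cv2.calibrateCamera (ret in the line above) is already the overall RMS reprojection error, so it can be checked directly against the 0.1 threshold.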