# Extended Kalman Filtering with OpenCV

OpenCV ships a linear Kalman filter (LKF) implementation, `cv::KalmanFilter`, but the motion model of a tracked object is usually highly nonlinear.

Therefore, I would like to give a concrete example of implementing and using an Extended Kalman filter (EKF) on top of the LKF in OpenCV.

In this example we have information about the orientation of the camera and the pixel coordinates of an object (e.g. from blob detection). We assume that the object moves on a flat surface along the Y axis only, and that the camera can only rotate around the Z axis. A simple model describing the motion of the object uses the state vector $x = (\rho, \dot\rho, \theta)^T$: the lateral position of the object, its velocity, and the rotation angle of the camera.

Important: in the world frame the lateral movement of the object relative to the camera is along the Y axis, but in the image coordinate system the origin is the upper-left corner, the width runs along the X axis and the height along the Y axis! The transfer function f and the measurement function h are

$$f(x) = \begin{pmatrix}\rho + \delta\,\dot\rho/\cos\theta\\ \dot\rho\\ \theta\end{pmatrix},\qquad h(x) = \begin{pmatrix}\rho\\ \theta\end{pmatrix},$$

where δ is the sampling time.

For the EKF we need the Jacobian matrices of the transfer function f and the measurement function h:

$$F = \frac{\partial f}{\partial x} = \begin{pmatrix}1 & \delta/\cos\theta & \delta\,\dot\rho\,\sin\theta/\cos^2\theta\\ 0 & 1 & 0\\ 0 & 0 & 1\end{pmatrix},\qquad H = \frac{\partial h}{\partial x} = \begin{pmatrix}1 & 0 & 0\\ 0 & 0 & 1\end{pmatrix}.$$

Initialization of the EKF:

```cpp
#include <opencv2/video/tracking.hpp>

using namespace cv;

double deltaT = 0.01, omega_w = 8, omega_u = 3.1623;

KalmanFilter KF(3, 2, 0);              // 3 state variables, 2 measurements
Mat_<float> measurement(2, 1);
measurement.setTo(Scalar(0));

KF.statePost.at<float>(0) = 0;         // rho (pixel X coordinate)
KF.statePost.at<float>(1) = 0;         // d rho
KF.statePost.at<float>(2) = 0;         // theta

KF.transitionMatrix  = (Mat_<float>(3, 3) << 1,1,0,  0,1,0,  0,0,1); // F (recomputed every step)
KF.measurementMatrix = (Mat_<float>(2, 3) << 1,0,0,  0,0,1);         // H

KF.processNoiseCov  = (Mat_<float>(3, 3) << 1,0,0,  0,0.1,0,  0,0,0.1);
KF.processNoiseCov *= pow(omega_w, 2);
setIdentity(KF.measurementNoiseCov, Scalar::all(pow(omega_u, 2)));
setIdentity(KF.errorCovPost, Scalar::all(50));
```
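Written out, the covariances configured above are

$$Q = \omega_w^2\begin{pmatrix}1&0&0\\0&0.1&0\\0&0&0.1\end{pmatrix},\qquad R = \omega_u^2\,I_2,\qquad P_0 = 50\,I_3,$$

with $\omega_w = 8$ and $\omega_u \approx 3.16$ (so $\omega_u^2 \approx 10$).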

The main loop of the EKF estimation:

```cpp
float rhoKF, DrhoKF, thetaKF;
float xCoordinateOfObject = 0, thetaOfCamera = 0; // filled from the sensors below
bool stop = false;

while (!stop)
{
    // First, the object position (xCoordinateOfObject) and the camera
    // orientation (thetaOfCamera) should be received from the sensors
    // ...

    // Predict, to update the internal statePre variable
    rhoKF   = KF.statePost.at<float>(0); // rho
    DrhoKF  = KF.statePost.at<float>(1); // d rho
    thetaKF = KF.statePost.at<float>(2); // theta

    double Dcos = tan(thetaKF) * (1 / cos(thetaKF));

    // Jacobian of the transfer function => F
    KF.transitionMatrix = (Mat_<float>(3, 3) <<
        1, deltaT / cos(thetaKF), DrhoKF * deltaT * Dcos, // = 2*sin(2*thetaKF)*cos(thetaKF)/pow(cos(2*thetaKF)+1,2)
        0, 1, 0,
        0, 0, 1);

    Mat prediction = KF.predict();

    // Overwrite the linear prediction with the nonlinear f(x)
    KF.statePre.at<float>(0) = rhoKF + DrhoKF * deltaT / cos(thetaKF);
    KF.statePre.at<float>(1) = DrhoKF;
    KF.statePre.at<float>(2) = thetaKF;

    // Update
    measurement.at<float>(0) = xCoordinateOfObject;
    measurement.at<float>(1) = thetaOfCamera;

    Mat estimated = KF.correct(measurement);

    // Replace the linear innovation with z - h(statePre) and recompute
    // the posterior state using the gain computed by correct()
    KF.temp5.at<float>(0) = measurement.at<float>(0) - KF.statePre.at<float>(0);
    KF.temp5.at<float>(1) = measurement.at<float>(1) - KF.statePre.at<float>(2);
    KF.statePost = KF.statePre + KF.gain * KF.temp5;
}
```