Commit f40725bb authored by Georgios Evangelidis, committed by Andrey Kamaev

Add ECC algorithm

Evangelidis, G.D. and Psarakis, E.Z. "Parametric Image Alignment using Enhanced
Correlation Coefficient Maximization", IEEE Transactions on PAMI, vol. 30, no.
10, 2008
parent b9b42005
@@ -153,6 +153,50 @@ In case of point sets, the problem is formulated as follows: you need to find a
:ocv:func:`getPerspectiveTransform`,
:ocv:func:`findHomography`
findTransformECC
------------------------
Finds the geometric transform (warp) between two images in terms of the ECC criterion [EP08]_.
.. ocv:function:: double findTransformECC( InputArray templateImage, InputArray inputImage, InputOutputArray warpMatrix, int motionType=MOTION_AFFINE, TermCriteria criteria=TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 50, 0.001))
.. ocv:cfunction:: double cvFindTransformECC( const CvArr* templateImage, const CvArr* inputImage, CvMat* warpMatrix, const int motionType, const CvTermCriteria criteria)
:param templateImage: single-channel template image; ``CV_8U`` or ``CV_32F`` array.
:param inputImage: single-channel input image which should be warped with the final ``warpMatrix`` in order to provide an image similar to ``templateImage``; same type as ``templateImage``.
:param warpMatrix: floating-point :math:`2\times 3` or :math:`3\times 3` mapping matrix (warp).
:param motionType: parameter specifying the type of motion (an identity initialization of ``warpMatrix`` for each model is sketched after this parameter list):
* **MOTION_TRANSLATION** sets a translational motion model; ``warpMatrix`` is :math:`2\times 3` with the first :math:`2\times 2` part being the identity matrix and the remaining two parameters being estimated.
* **MOTION_EUCLIDEAN** sets a Euclidean (rigid) transformation as the motion model; three parameters are estimated; ``warpMatrix`` is :math:`2\times 3`.
* **MOTION_AFFINE** sets an affine motion model (DEFAULT); six parameters are estimated; ``warpMatrix`` is :math:`2\times 3`.
* **MOTION_HOMOGRAPHY** sets a homography as the motion model; eight parameters are estimated; ``warpMatrix`` is :math:`3\times 3`.
:param criteria: parameter specifying the termination criteria of the ECC algorithm; ``criteria.epsilon`` defines the threshold of the increment in the correlation coefficient between two iterations (a negative ``criteria.epsilon`` makes ``criteria.maxCount`` the only termination criterion). Default values are shown in the declaration above.
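An identity initialization of ``warpMatrix``, with the shape required by each motion model (this mirrors the initialization used in the performance test added by this commit), can look as follows: ::

    // 2x3 identity warp for MOTION_TRANSLATION, MOTION_EUCLIDEAN and MOTION_AFFINE
    cv::Mat warpMatrix = cv::Mat::eye(2, 3, CV_32F);

    // 3x3 identity warp for MOTION_HOMOGRAPHY
    cv::Mat warpHomography = cv::Mat::eye(3, 3, CV_32F);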
The function estimates the optimum transformation (``warpMatrix``) with respect to the ECC criterion ([EP08]_), that is
.. math::

    \texttt{warpMatrix} = \arg\max_{W} \texttt{ECC}(\texttt{templateImage}(x,y),\texttt{inputImage}(x',y'))
where
.. math::

    \begin{bmatrix} x' \\ y' \end{bmatrix} = W \cdot \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
(the equation holds with homogeneous coordinates for homography). The function returns the final enhanced correlation coefficient, that is, the correlation coefficient between the template image and the final warped input image. When a :math:`3\times 3` matrix is given with ``motionType`` = 0, 1, or 2, the third row is ignored.
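Roughly speaking (see [EP08]_ for the precise definition), the enhanced correlation coefficient of two images is the correlation coefficient between their zero-mean versions:

.. math::

    \texttt{ECC} = \frac{\bar{i}_t^\top \, \bar{i}_w}{\|\bar{i}_t\| \, \|\bar{i}_w\|}

where :math:`\bar{i}_t` and :math:`\bar{i}_w` denote the zero-mean vectorized template and warped input images; this measure is invariant to photometric distortions in brightness and contrast.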
Unlike :ocv:func:`findHomography` and :ocv:func:`estimateRigidTransform`, the function :ocv:func:`findTransformECC` implements an area-based alignment that builds on intensity similarities. In essence, the function updates an initial transformation that roughly aligns the images. If this information is missing, the identity warp should be given as input. Note that if the images undergo strong displacements/rotations, an initial transformation that roughly aligns the images is necessary (e.g., a simple Euclidean/similarity transform under which the images show approximately the same content). Use inverse warping in the second image to obtain an image close to the first one, i.e. use the flag ``WARP_INVERSE_MAP`` with :ocv:func:`warpAffine` or :ocv:func:`warpPerspective`, as in the sketch below. See also the OpenCV sample ``image_alignment.cpp`` that demonstrates the use of the function. Note that the function throws an exception if the algorithm does not converge.
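A minimal usage sketch of the typical pipeline (the file names are placeholders, and the two images are assumed to overlap well enough for the identity warp to be a valid starting point): ::

    #include "opencv2/highgui/highgui.hpp"
    #include "opencv2/imgproc/imgproc.hpp"
    #include "opencv2/video/tracking.hpp"

    using namespace cv;

    int main()
    {
        // single-channel template and input images (placeholder file names)
        Mat templateImage = imread("template.png", 0);
        Mat inputImage    = imread("input.png", 0);

        // start from the identity warp and refine it with ECC
        Mat warpMatrix = Mat::eye(2, 3, CV_32F);
        double cc = findTransformECC(templateImage, inputImage, warpMatrix, MOTION_AFFINE,
                                     TermCriteria(TermCriteria::COUNT + TermCriteria::EPS, 50, 0.001));

        // cc is the final correlation coefficient; bring the input close to the
        // template via inverse warping
        Mat aligned;
        warpAffine(inputImage, aligned, warpMatrix, templateImage.size(),
                   INTER_LINEAR + WARP_INVERSE_MAP);
        return 0;
    }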
.. seealso::
:ocv:func:`estimateRigidTransform`,
:ocv:func:`findHomography`
updateMotionHistory
@@ -726,3 +770,5 @@ Releases all inner buffers.
.. [Zach2007] C. Zach, T. Pock and H. Bischof. "A Duality Based Approach for Realtime TV-L1 Optical Flow", In Proceedings of Pattern Recognition (DAGM), Heidelberg, Germany, pp. 214-223, 2007
.. [Javier2012] Javier Sanchez, Enric Meinhardt-Llopis and Gabriele Facciolo. "TV-L1 Optical Flow Estimation".
.. [EP08] Evangelidis, G.D. and Psarakis, E.Z. "Parametric Image Alignment using Enhanced Correlation Coefficient Maximization", IEEE Transactions on PAMI, vol. 30, no. 10, 2008
@@ -218,6 +218,24 @@ CVAPI(const CvMat*) cvKalmanCorrect( CvKalman* kalman, const CvMat* measurement
#define cvKalmanUpdateByTime cvKalmanPredict
#define cvKalmanUpdateByMeasurement cvKalmanCorrect
/****************************************************************************************\
* Image Alignment (ECC algorithm) *
\****************************************************************************************/
enum
{
MOTION_TRANSLATION,
MOTION_EUCLIDEAN,
MOTION_AFFINE,
MOTION_HOMOGRAPHY
};
/* Estimate the geometric transformation between 2 images (area-based alignment) */
CVAPI(double) cvFindTransformECC (const CvArr* templateImage, const CvArr* inputImage,
CvMat* warpMatrix,
const int motionType,
const CvTermCriteria criteria);
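/* A rough usage sketch of the C interface (illustrative only; "tmpl" and "input"
   stand for already-loaded single-channel images, and error handling as well as
   the cvReleaseMat call are omitted):

       CvMat* warp = cvCreateMat(2, 3, CV_32FC1);
       cvSetIdentity(warp, cvRealScalar(1));
       double cc = cvFindTransformECC(tmpl, input, warp, MOTION_AFFINE,
                                      cvTermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS,
                                                     50, 0.001));
*/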
#ifdef __cplusplus
}
@@ -323,6 +341,21 @@ CV_EXPORTS_W void calcOpticalFlowFarneback( InputArray prev, InputArray next,
CV_EXPORTS_W Mat estimateRigidTransform( InputArray src, InputArray dst,
bool fullAffine);
//! estimates the best-fit Translation, Euclidean, Affine or Perspective transformation
// with respect to the Enhanced Correlation Coefficient criterion that maps one image to
// another (area-based alignment)
//
// see reference:
// Evangelidis, G.D., Psarakis, E.Z., Parametric Image Alignment using
// Enhanced Correlation Coefficient Maximization, PAMI, 30(10), 2008
CV_EXPORTS_W double findTransformECC(InputArray templateImage,
InputArray inputImage,
InputOutputArray warpMatrix,
int motionType=MOTION_AFFINE,
TermCriteria criteria=TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 50, 0.001));
//! computes dense optical flow using Simple Flow algorithm
CV_EXPORTS_W void calcOpticalFlowSF(InputArray from,
InputArray to,
......
#include "perf_precomp.hpp"
using namespace std;
using namespace cv;
using namespace perf;
using std::tr1::make_tuple;
using std::tr1::get;
CV_ENUM(MotionType, MOTION_TRANSLATION, MOTION_EUCLIDEAN, MOTION_AFFINE, MOTION_HOMOGRAPHY)
typedef std::tr1::tuple<MotionType> MotionType_t;
typedef perf::TestBaseWithParam<MotionType_t> TransformationType;
PERF_TEST_P(TransformationType, findTransformECC, /*testing::ValuesIn(MotionType::all())*/
testing::Values((int) MOTION_TRANSLATION, (int) MOTION_EUCLIDEAN,
(int) MOTION_AFFINE, (int) MOTION_HOMOGRAPHY)
)
{
// base image (grayscale), resized to a fixed size
Mat inputImage = imread(getDataPath("cv/shared/fruits.png"),0);
Mat img;
resize(inputImage, img, Size(216,216));
Mat templateImage;
int transform_type = get<0>(GetParam());
Mat warpMat;
Mat warpGround;
double angle;
// build the template by warping the input with a known ground-truth transform
switch (transform_type) {
case MOTION_TRANSLATION:
warpGround = (Mat_<float>(2,3) << 1.f, 0.f, 7.234f,
0.f, 1.f, 11.839f);
warpAffine(img, templateImage, warpGround,
Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
break;
case MOTION_EUCLIDEAN:
angle = CV_PI/30;
warpGround = (Mat_<float>(2,3) << (float)cos(angle), (float)-sin(angle), 12.123f,
(float)sin(angle), (float)cos(angle), 14.789f);
warpAffine(img, templateImage, warpGround,
Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
break;
case MOTION_AFFINE:
warpGround = (Mat_<float>(2,3) << 0.98f, 0.03f, 15.523f,
-0.02f, 0.95f, 10.456f);
warpAffine(img, templateImage, warpGround,
Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
break;
case MOTION_HOMOGRAPHY:
warpGround = (Mat_<float>(3,3) << 0.98f, 0.03f, 15.523f,
-0.02f, 0.95f, 10.456f,
0.0002f, 0.0003f, 1.f);
warpPerspective(img, templateImage, warpGround,
Size(200,200), CV_INTER_LINEAR+CV_WARP_FILL_OUTLIERS+CV_WARP_INVERSE_MAP);
break;
}
// time the ECC estimation, restarting from the identity warp on each cycle
TEST_CYCLE()
{
// 2x3 warp for translation/Euclidean/affine, 3x3 for homography
if (transform_type < 3)
warpMat = Mat::eye(2,3, CV_32F);
else
warpMat = Mat::eye(3,3, CV_32F);
findTransformECC(templateImage, img, warpMat, transform_type,
TermCriteria(TermCriteria::COUNT+TermCriteria::EPS, 5, -1));
}
SANITY_CHECK(warpMat, 1e-3);
}