Commit b2cd954f authored by Vadim Pisarevsky

Merge pull request #3042 from edgarriba:master

parents 57a1343f deec2335
......@@ -325,6 +325,7 @@ extlinks = {
'imgproc_geometric' : ('http://docs.opencv.org/modules/imgproc/doc/geometric_transformations.html#%s', None ),
'miscellaneous_transformations' : ('http://docs.opencv.org/modules/imgproc/doc/miscellaneous_transformations.html#%s', None),
'user_interface' : ('http://docs.opencv.org/modules/highgui/doc/user_interface.html#%s', None),
'video' : ('http://docs.opencv.org/modules/video/doc/motion_analysis_and_object_tracking.html#%s', None),
# 'opencv_group' : ('http://answers.opencv.org/%s', None),
'opencv_qa' : ('http://answers.opencv.org/%s', None),
......@@ -402,6 +403,7 @@ extlinks = {
'contour_area' : ('http://docs.opencv.org/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=contourarea#contourarea%s', None),
'arc_length' : ('http://docs.opencv.org/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=arclength#arclength%s', None),
'point_polygon_test' : ('http://docs.opencv.org/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=pointpolygontest#pointpolygontest%s', None),
'feature_detection_and_description' : ('http://docs.opencv.org/modules/features2d/doc/feature_detection_and_description.html#%s', None),
'feature_detector' : ( 'http://docs.opencv.org/modules/features2d/doc/common_interfaces_of_feature_detectors.html?highlight=featuredetector#FeatureDetector%s', None),
'feature_detector_detect' : ('http://docs.opencv.org/modules/features2d/doc/common_interfaces_of_feature_detectors.html?highlight=detect#featuredetector-detect%s', None ),
'surf_feature_detector' : ('http://docs.opencv.org/modules/features2d/doc/common_interfaces_of_feature_detectors.html?highlight=surffeaturedetector#surffeaturedetector%s', None ),
......
......@@ -45,6 +45,25 @@ Although we got most of our images in a 2D format they do come from a 3D world.
:height: 90pt
:width: 90pt
.. tabularcolumns:: m{100pt} m{300pt}
.. cssclass:: toctableopencv
===================== ==============================================
|PoseEstimation| **Title:** :ref:`realTimePoseEstimation`
*Compatibility:* > OpenCV 2.0
*Author:* Edgar Riba
Real-time pose estimation of a textured object using ORB features, a FLANN-based matcher, a PnP approach plus RANSAC, and a Linear Kalman Filter to reject possible bad poses.
===================== ==============================================
.. |PoseEstimation| image:: images/real_time_pose_estimation.jpg
:height: 90pt
:width: 90pt
.. raw:: latex
\pagebreak
......@@ -54,3 +73,4 @@ Although we got most of our images in a 2D format they do come from a 3D world.
../camera_calibration_square_chess/camera_calibration_square_chess
../camera_calibration/camera_calibration
../real_time_pose/real_time_pose
......@@ -562,7 +562,7 @@ solvePnP
------------
Finds an object pose from 3D-2D point correspondences.
.. ocv:function:: bool solvePnP( InputArray objectPoints, InputArray imagePoints, InputArray cameraMatrix, InputArray distCoeffs, OutputArray rvec, OutputArray tvec, bool useExtrinsicGuess=false, int flags=ITERATIVE )
.. ocv:function:: bool solvePnP( InputArray objectPoints, InputArray imagePoints, InputArray cameraMatrix, InputArray distCoeffs, OutputArray rvec, OutputArray tvec, bool useExtrinsicGuess=false, int flags=SOLVEPNP_ITERATIVE )
.. ocv:pyfunction:: cv2.solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs[, rvec[, tvec[, useExtrinsicGuess[, flags]]]]) -> retval, rvec, tvec
......@@ -584,9 +584,10 @@ Finds an object pose from 3D-2D point correspondences.
:param flags: Method for solving a PnP problem:
* **ITERATIVE** Iterative method is based on Levenberg-Marquardt optimization. In this case the function finds such a pose that minimizes reprojection error, that is the sum of squared distances between the observed projections ``imagePoints`` and the projected (using :ocv:func:`projectPoints` ) ``objectPoints`` .
* **P3P** Method is based on the paper of X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang "Complete Solution Classification for the Perspective-Three-Point Problem". In this case the function requires exactly four object and image points.
* **EPNP** Method has been introduced by F.Moreno-Noguer, V.Lepetit and P.Fua in the paper "EPnP: Efficient Perspective-n-Point Camera Pose Estimation".
* **SOLVEPNP_ITERATIVE** Iterative method is based on Levenberg-Marquardt optimization. In this case the function finds such a pose that minimizes the reprojection error, that is, the sum of squared distances between the observed projections ``imagePoints`` and the projected (using :ocv:func:`projectPoints` ) ``objectPoints`` .
* **SOLVEPNP_P3P** Method is based on the paper by X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang "Complete Solution Classification for the Perspective-Three-Point Problem". In this case the function requires exactly four object and image points.
* **SOLVEPNP_EPNP** Method has been introduced by F. Moreno-Noguer, V. Lepetit and P. Fua in the paper "EPnP: Efficient Perspective-n-Point Camera Pose Estimation".
* **SOLVEPNP_DLS** Method is based on the paper by Joel A. Hesch and Stergios I. Roumeliotis "A Direct Least-Squares (DLS) Method for PnP".
The function estimates the object pose given a set of object points, their corresponding image projections, as well as the camera matrix and the distortion coefficients.
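A minimal usage sketch (not part of the reference text) with the renamed constants; the correspondences and camera matrix below are placeholder values: ::

    std::vector<cv::Point3f> objectPoints;               // known 3D model points
    objectPoints.push_back(cv::Point3f(0, 0, 0));
    objectPoints.push_back(cv::Point3f(1, 0, 0));
    objectPoints.push_back(cv::Point3f(0, 1, 0));
    objectPoints.push_back(cv::Point3f(0, 0, 1));

    std::vector<cv::Point2f> imagePoints;                // their observed 2D projections
    imagePoints.push_back(cv::Point2f(320, 240));
    imagePoints.push_back(cv::Point2f(400, 250));
    imagePoints.push_back(cv::Point2f(330, 160));
    imagePoints.push_back(cv::Point2f(310, 230));

    cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) << 800,   0, 320,
                                                        0, 800, 240,
                                                        0,   0,   1);
    cv::Mat distCoeffs = cv::Mat::zeros(4, 1, CV_64FC1); // assuming no lens distortion

    cv::Mat rvec, tvec;                                  // output rotation and translation vectors
    bool found = cv::solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs,
                              rvec, tvec, false, cv::SOLVEPNP_EPNP);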
......@@ -598,7 +599,7 @@ solvePnPRansac
------------------
Finds an object pose from 3D-2D point correspondences using the RANSAC scheme.
.. ocv:function:: void solvePnPRansac( InputArray objectPoints, InputArray imagePoints, InputArray cameraMatrix, InputArray distCoeffs, OutputArray rvec, OutputArray tvec, bool useExtrinsicGuess=false, int iterationsCount = 100, float reprojectionError = 8.0, int minInliersCount = 100, OutputArray inliers = noArray(), int flags = ITERATIVE )
.. ocv:function:: bool solvePnPRansac( InputArray objectPoints, InputArray imagePoints, InputArray cameraMatrix, InputArray distCoeffs, OutputArray rvec, OutputArray tvec, bool useExtrinsicGuess=false, int iterationsCount = 100, float reprojectionError = 8.0, double confidence = 0.99, OutputArray inliers = noArray(), int flags = SOLVEPNP_ITERATIVE )
.. ocv:pyfunction:: cv2.solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs[, rvec[, tvec[, useExtrinsicGuess[, iterationsCount[, reprojectionError[, minInliersCount[, inliers[, flags]]]]]]]]) -> rvec, tvec, inliers
......@@ -620,15 +621,18 @@ Finds an object pose from 3D-2D point correspondences using the RANSAC scheme.
:param reprojectionError: Inlier threshold value used by the RANSAC procedure. The parameter value is the maximum allowed distance between the observed and computed point projections to consider it an inlier.
:param minInliersCount: Number of inliers. If the algorithm at some stage finds more inliers than ``minInliersCount`` , it finishes.
:param confidence: The probability that the algorithm produces a useful result.
:param inliers: Output vector that contains indices of inliers in ``objectPoints`` and ``imagePoints`` .
:param flags: Method for solving a PnP problem (see :ocv:func:`solvePnP` ).
The function estimates an object pose given a set of object points, their corresponding image projections, as well as the camera matrix and the distortion coefficients. This function finds such a pose that minimizes reprojection error, that is, the sum of squared distances between the observed projections ``imagePoints`` and the projected (using
:ocv:func:`projectPoints` ) ``objectPoints``. The use of RANSAC makes the function resistant to outliers. The function is parallelized with the TBB library.
:ocv:func:`projectPoints` ) ``objectPoints``. The use of RANSAC makes the function resistant to outliers.
.. note::
* An example of how to use solvePNPRansac for object detection can be found at opencv_source_code/samples/cpp/tutorial_code/calib3d/real_time_pose_estimation/
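A hedged sketch (not part of the reference text) of the updated call, where ``confidence`` replaces the former ``minInliersCount`` argument; the correspondence containers and camera matrix are assumed to be filled as in the :ocv:func:`solvePnP` example above: ::

    cv::Mat rvec, tvec, inliers;
    cv::solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs,
                       rvec, tvec,
                       false,                    // useExtrinsicGuess
                       500,                      // iterationsCount
                       2.0f,                     // reprojectionError in pixels
                       0.99,                     // confidence that the result is useful
                       inliers,                  // output indices of inlier correspondences
                       cv::SOLVEPNP_ITERATIVE);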
findFundamentalMat
......
......@@ -56,10 +56,11 @@ enum { LMEDS = 4, //!< least-median algorithm
RANSAC = 8 //!< RANSAC algorithm
};
enum { ITERATIVE = 0,
EPNP = 1, // F.Moreno-Noguer, V.Lepetit and P.Fua "EPnP: Efficient Perspective-n-Point Camera Pose Estimation"
P3P = 2 // X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang; "Complete Solution Classification for the Perspective-Three-Point Problem"
};
enum { SOLVEPNP_ITERATIVE = 0,
SOLVEPNP_EPNP = 1, // F.Moreno-Noguer, V.Lepetit and P.Fua "EPnP: Efficient Perspective-n-Point Camera Pose Estimation"
SOLVEPNP_P3P = 2, // X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang; "Complete Solution Classification for the Perspective-Three-Point Problem"
SOLVEPNP_DLS = 3 // Joel A. Hesch and Stergios I. Roumeliotis. "A Direct Least-Squares (DLS) Method for PnP"
};
enum { CALIB_CB_ADAPTIVE_THRESH = 1,
CALIB_CB_NORMALIZE_IMAGE = 2,
......@@ -152,15 +153,15 @@ CV_EXPORTS_W void projectPoints( InputArray objectPoints,
CV_EXPORTS_W bool solvePnP( InputArray objectPoints, InputArray imagePoints,
InputArray cameraMatrix, InputArray distCoeffs,
OutputArray rvec, OutputArray tvec,
bool useExtrinsicGuess = false, int flags = ITERATIVE );
bool useExtrinsicGuess = false, int flags = SOLVEPNP_ITERATIVE );
//! computes the camera pose from a few 3D points and the corresponding projections. The outliers are possible.
CV_EXPORTS_W void solvePnPRansac( InputArray objectPoints, InputArray imagePoints,
CV_EXPORTS_W bool solvePnPRansac( InputArray objectPoints, InputArray imagePoints,
InputArray cameraMatrix, InputArray distCoeffs,
OutputArray rvec, OutputArray tvec,
bool useExtrinsicGuess = false, int iterationsCount = 100,
float reprojectionError = 8.0, int minInliersCount = 100,
OutputArray inliers = noArray(), int flags = ITERATIVE );
float reprojectionError = 8.0, double confidence = 0.99,
OutputArray inliers = noArray(), int flags = SOLVEPNP_ITERATIVE );
//! initializes camera matrix from a few 3D points and the corresponding projections.
CV_EXPORTS_W Mat initCameraMatrix2D( InputArrayOfArrays objectPoints,
......
......@@ -91,7 +91,8 @@ enum
{
CV_ITERATIVE = 0,
CV_EPNP = 1, // F.Moreno-Noguer, V.Lepetit and P.Fua "EPnP: Efficient Perspective-n-Point Camera Pose Estimation"
CV_P3P = 2 // X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang; "Complete Solution Classification for the Perspective-Three-Point Problem"
CV_P3P = 2, // X.S. Gao, X.-R. Hou, J. Tang, H.-F. Chang; "Complete Solution Classification for the Perspective-Three-Point Problem"
CV_DLS = 3 // Joel A. Hesch and Stergios I. Roumeliotis. "A Direct Least-Squares (DLS) Method for PnP"
};
CVAPI(int) cvFindFundamentalMat( const CvMat* points1, const CvMat* points2,
......
......@@ -10,7 +10,7 @@ using namespace perf;
using std::tr1::make_tuple;
using std::tr1::get;
CV_ENUM(pnpAlgo, ITERATIVE, EPNP /*, P3P*/)
CV_ENUM(pnpAlgo, SOLVEPNP_ITERATIVE, SOLVEPNP_EPNP, SOLVEPNP_P3P, SOLVEPNP_DLS)
typedef std::tr1::tuple<int, pnpAlgo> PointsNum_Algo_t;
typedef perf::TestBaseWithParam<PointsNum_Algo_t> PointsNum_Algo;
......@@ -19,8 +19,8 @@ typedef perf::TestBaseWithParam<int> PointsNum;
PERF_TEST_P(PointsNum_Algo, solvePnP,
testing::Combine(
testing::Values(/*4,*/ 3*9, 7*13), //TODO: find why results on 4 points are too unstable
testing::Values((int)ITERATIVE, (int)EPNP)
testing::Values(4, 3*9, 7*13), //TODO: find why results on 4 points are too unstable
testing::Values((int)SOLVEPNP_ITERATIVE, (int)SOLVEPNP_EPNP)
)
)
{
......@@ -51,6 +51,7 @@ PERF_TEST_P(PointsNum_Algo, solvePnP,
add(points2d, noise, points2d);
declare.in(points3d, points2d);
declare.time(100);
TEST_CYCLE_N(1000)
{
......@@ -58,12 +59,18 @@ PERF_TEST_P(PointsNum_Algo, solvePnP,
}
SANITY_CHECK(rvec, 1e-6);
SANITY_CHECK(tvec, 1e-3);
SANITY_CHECK(tvec, 1e-6);
}
PERF_TEST(PointsNum_Algo, solveP3P)
PERF_TEST_P(PointsNum_Algo, solvePnPSmallPoints,
testing::Combine(
testing::Values(4), //TODO: find why results on 4 points are too unstable
testing::Values((int)SOLVEPNP_P3P, (int)SOLVEPNP_DLS)
)
)
{
int pointsNum = 4;
int pointsNum = get<0>(GetParam());
pnpAlgo algo = get<1>(GetParam());
vector<Point2f> points2d(pointsNum);
vector<Point3f> points3d(pointsNum);
......@@ -93,11 +100,11 @@ PERF_TEST(PointsNum_Algo, solveP3P)
TEST_CYCLE_N(1000)
{
solvePnP(points3d, points2d, intrinsics, distortion, rvec, tvec, false, P3P);
solvePnP(points3d, points2d, intrinsics, distortion, rvec, tvec, false, algo);
}
SANITY_CHECK(rvec, 1e-6);
SANITY_CHECK(tvec, 1e-6);
SANITY_CHECK(rvec, 1e-4);
SANITY_CHECK(tvec, 1e-2);
}
PERF_TEST_P(PointsNum, DISABLED_SolvePnPRansac, testing::Values(4, 3*9, 7*13))
......@@ -117,8 +124,10 @@ PERF_TEST_P(PointsNum, DISABLED_SolvePnPRansac, testing::Values(4, 3*9, 7*13))
Mat dist_coef(1, 8, CV_32F, cv::Scalar::all(0));
vector<cv::Point2f> image_vec;
Mat rvec_gold(1, 3, CV_32FC1);
randu(rvec_gold, 0, 1);
Mat tvec_gold(1, 3, CV_32FC1);
randu(tvec_gold, 0, 1);
projectPoints(object, rvec_gold, tvec_gold, camera_mat, dist_coef, image_vec);
......
......@@ -391,6 +391,7 @@ CV_INIT_ALGORITHM(LMeDSPointSetRegistrator, "PointSetRegistrator.LMeDS",
obj.info()->addParam(obj, "confidence", obj.confidence);
obj.info()->addParam(obj, "maxIters", obj.maxIters))
Ptr<PointSetRegistrator> createRANSACPointSetRegistrator(const Ptr<PointSetRegistrator::Callback>& _cb,
int _modelPoints, double _threshold,
double _confidence, int _maxIters)
......@@ -409,6 +410,7 @@ Ptr<PointSetRegistrator> createLMeDSPointSetRegistrator(const Ptr<PointSetRegist
new LMeDSPointSetRegistrator(_cb, _modelPoints, _confidence, _maxIters));
}
class Affine3DEstimatorCallback : public PointSetRegistrator::Callback
{
public:
......
......@@ -54,9 +54,10 @@ class CV_solvePnPRansac_Test : public cvtest::BaseTest
public:
CV_solvePnPRansac_Test()
{
eps[ITERATIVE] = 1.0e-2;
eps[EPNP] = 1.0e-2;
eps[P3P] = 1.0e-2;
eps[SOLVEPNP_ITERATIVE] = 1.0e-2;
eps[SOLVEPNP_EPNP] = 1.0e-2;
eps[SOLVEPNP_P3P] = 1.0e-2;
eps[SOLVEPNP_DLS] = 1.0e-2;
totalTestsCount = 10;
}
~CV_solvePnPRansac_Test() {}
......@@ -135,7 +136,7 @@ protected:
}
solvePnPRansac(points, projectedPoints, intrinsics, distCoeffs, rvec, tvec,
false, 500, 0.5, -1, inliers, method);
false, 500, 0.5, 0.99, inliers, method);
bool isTestSuccess = inliers.size() >= points.size()*0.95;
......@@ -153,13 +154,12 @@ protected:
{
ts->set_failed_test_info(cvtest::TS::OK);
vector<Point3f> points;
vector<Point3f> points, points_dls;
const int pointsCount = 500;
points.resize(pointsCount);
generate3DPointCloud(points);
const int methodsCount = 3;
const int methodsCount = 4;
RNG rng = ts->get_rng();
......@@ -184,7 +184,7 @@ protected:
}
}
}
double eps[3];
double eps[4];
int totalTestsCount;
};
......@@ -193,9 +193,10 @@ class CV_solvePnP_Test : public CV_solvePnPRansac_Test
public:
CV_solvePnP_Test()
{
eps[ITERATIVE] = 1.0e-6;
eps[EPNP] = 1.0e-6;
eps[P3P] = 1.0e-4;
eps[SOLVEPNP_ITERATIVE] = 1.0e-6;
eps[SOLVEPNP_EPNP] = 1.0e-6;
eps[SOLVEPNP_P3P] = 1.0e-4;
eps[SOLVEPNP_DLS] = 1.0e-4;
totalTestsCount = 1000;
}
......@@ -218,6 +219,10 @@ protected:
{
opoints = std::vector<Point3f>(points.begin(), points.begin()+4);
}
else if(method == 3)
{
opoints = std::vector<Point3f>(points.begin(), points.begin()+50);
}
else
opoints = points;
......@@ -239,7 +244,7 @@ protected:
}
};
TEST(DISABLED_Calib3d_SolvePnPRansac, accuracy) { CV_solvePnPRansac_Test test; test.safe_run(); }
TEST(Calib3d_SolvePnPRansac, accuracy) { CV_solvePnPRansac_Test test; test.safe_run(); }
TEST(Calib3d_SolvePnP, accuracy) { CV_solvePnP_Test test; test.safe_run(); }
......
......@@ -99,9 +99,13 @@ if(BUILD_EXAMPLES AND OCV_DEPENDENCIES_FOUND)
endif()
foreach(sample_filename ${cpp_samples})
get_filename_component(sample ${sample_filename} NAME_WE)
OPENCV_DEFINE_CPP_EXAMPLE(${sample} ${sample_filename})
if(NOT "${sample_filename}" MATCHES "real_time_pose_estimation/")
get_filename_component(sample ${sample_filename} NAME_WE)
OPENCV_DEFINE_CPP_EXAMPLE(${sample} ${sample_filename})
endif()
endforeach()
include("tutorial_code/calib3d/real_time_pose_estimation/CMakeLists.txt")
endif()
if(INSTALL_C_EXAMPLES AND NOT WIN32)
......
set(sample_dir ${CMAKE_CURRENT_SOURCE_DIR}/tutorial_code/calib3d/real_time_pose_estimation/src/)
set(target cpp-tutorial-)
set(sample_pnplib
${sample_dir}CsvReader.cpp
${sample_dir}CsvWriter.cpp
${sample_dir}ModelRegistration.cpp
${sample_dir}Mesh.cpp
${sample_dir}Model.cpp
${sample_dir}PnPProblem.cpp
${sample_dir}Utils.cpp
${sample_dir}RobustMatcher.cpp
)
add_executable( ${target}pnp_registration ${sample_dir}main_registration.cpp ${sample_pnplib} )
add_executable( ${target}pnp_detection ${sample_dir}main_detection.cpp ${sample_pnplib} )
ocv_target_link_libraries( ${target}pnp_registration ${OPENCV_LINKER_LIBS} ${OPENCV_CPP_SAMPLES_REQUIRED_DEPS} )
ocv_target_link_libraries( ${target}pnp_detection ${OPENCV_LINKER_LIBS} ${OPENCV_CPP_SAMPLES_REQUIRED_DEPS} )
ply
format ascii 1.0
comment made by anonymous
comment this file is a cube
element vertex 8
property float32 x
property float32 y
property float32 z
element face 12
property list uint8 int32 vertex_index
end_header
0 0 0
0 25.8 0
18.9 0 0
18.9 25.8 0
0 0 7.5
0 25.8 7.5
18.9 0 7.5
18.9 25.8 7.5
3 5 1 0
3 5 4 0
3 4 0 2
3 4 6 2
3 7 5 4
3 7 6 4
3 3 2 1
3 1 2 0
3 5 7 1
3 7 1 3
3 7 6 3
3 6 3 2
#include "CsvReader.h"
/** The default constructor of the CSV reader Class */
CsvReader::CsvReader(const string &path, const char &separator){
_file.open(path.c_str(), ifstream::in);
_separator = separator;
}
/* Read a plain text file with .ply format */
void CsvReader::readPLY(vector<Point3f> &list_vertex, vector<vector<int> > &list_triangles)
{
std::string line, tmp_str, n;
int num_vertex = 0, num_triangles = 0;
int count = 0;
bool end_header = false;
bool end_vertex = false;
// Read the whole *.ply file
while (getline(_file, line)) {
stringstream liness(line);
// read header
if(!end_header)
{
getline(liness, tmp_str, _separator);
if( tmp_str == "element" )
{
getline(liness, tmp_str, _separator);
getline(liness, n);
if(tmp_str == "vertex") num_vertex = StringToInt(n);
if(tmp_str == "face") num_triangles = StringToInt(n);
}
if(tmp_str == "end_header") end_header = true;
}
// read file content
else if(end_header)
{
// read vertex and add into 'list_vertex'
if(!end_vertex && count < num_vertex)
{
string x, y, z;
getline(liness, x, _separator);
getline(liness, y, _separator);
getline(liness, z);
cv::Point3f tmp_p;
// vertex coordinates may be fractional (see the cube mesh above), so parse them as floats
tmp_p.x = (float)atof(x.c_str());
tmp_p.y = (float)atof(y.c_str());
tmp_p.z = (float)atof(z.c_str());
list_vertex.push_back(tmp_p);
count++;
if(count == num_vertex)
{
count = 0;
end_vertex = !end_vertex;
}
}
// read faces and add into 'list_triangles'
else if(end_vertex && count < num_triangles)
{
string num_pts_per_face, id0, id1, id2;
getline(liness, num_pts_per_face, _separator);
getline(liness, id0, _separator);
getline(liness, id1, _separator);
getline(liness, id2);
std::vector<int> tmp_triangle(3);
tmp_triangle[0] = StringToInt(id0);
tmp_triangle[1] = StringToInt(id1);
tmp_triangle[2] = StringToInt(id2);
list_triangles.push_back(tmp_triangle);
count++;
}
}
}
}
#ifndef CSVREADER_H
#define CSVREADER_H
#include <iostream>
#include <fstream>
#include <opencv2/core/core.hpp>
#include "Utils.h"
using namespace std;
using namespace cv;
class CsvReader {
public:
/**
* The default constructor of the CSV reader Class.
* The default separator is ' ' (empty space)
*
* @param path - The path of the file to read
* @param separator - The separator character between words per line
* @return
*/
CsvReader(const string &path, const char &separator = ' ');
/**
* Read a plain text file with .ply format
*
* @param list_vertex - The container of the vertices list of the mesh
* @param list_triangle - The container of the triangles list of the mesh
* @return
*/
void readPLY(vector<Point3f> &list_vertex, vector<vector<int> > &list_triangles);
private:
/** The current stream file for the reader */
ifstream _file;
/** The separator character between words for each line */
char _separator;
};
#endif
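// Hedged usage sketch (not part of the patch) of the reader declared above; "box.ply" is a
// placeholder path to a mesh in the same plain text *.ply format as the cube listed earlier.
#include "CsvReader.h"

std::vector<cv::Point3f> loadMeshVertices()
{
    CsvReader reader("box.ply");                   // default separator is ' '
    std::vector<cv::Point3f> vertices;
    std::vector<std::vector<int> > triangles;
    reader.readPLY(vertices, triangles);           // fills both containers from the file
    return vertices;
}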
#include "CsvWriter.h"
CsvWriter::CsvWriter(const string &path, const string &separator){
_file.open(path.c_str(), ofstream::out);
_isFirstTerm = true;
_separator = separator;
}
CsvWriter::~CsvWriter() {
_file.flush();
_file.close();
}
void CsvWriter::writeXYZ(const vector<Point3f> &list_points3d)
{
string x, y, z;
for(unsigned int i = 0; i < list_points3d.size(); ++i)
{
x = FloatToString(list_points3d[i].x);
y = FloatToString(list_points3d[i].y);
z = FloatToString(list_points3d[i].z);
_file << x << _separator << y << _separator << z << std::endl;
}
}
void CsvWriter::writeUVXYZ(const vector<Point3f> &list_points3d, const vector<Point2f> &list_points2d, const Mat &descriptors)
{
string u, v, x, y, z, descriptor_str;
for(unsigned int i = 0; i < list_points3d.size(); ++i)
{
u = FloatToString(list_points2d[i].x);
v = FloatToString(list_points2d[i].y);
x = FloatToString(list_points3d[i].x);
y = FloatToString(list_points3d[i].y);
z = FloatToString(list_points3d[i].z);
_file << u << _separator << v << _separator << x << _separator << y << _separator << z;
for(int j = 0; j < 32; ++j)
{
descriptor_str = FloatToString(descriptors.at<float>(i,j));
_file << _separator << descriptor_str;
}
_file << std::endl;
}
}
#ifndef CSVWRITER_H
#define CSVWRITER_H
#include <iostream>
#include <fstream>
#include <opencv2/core/core.hpp>
#include "Utils.h"
using namespace std;
using namespace cv;
class CsvWriter {
public:
CsvWriter(const string &path, const string &separator = " ");
~CsvWriter();
void writeXYZ(const vector<Point3f> &list_points3d);
void writeUVXYZ(const vector<Point3f> &list_points3d, const vector<Point2f> &list_points2d, const Mat &descriptors);
private:
ofstream _file;
string _separator;
bool _isFirstTerm;
};
#endif
/*
* Mesh.cpp
*
* Created on: Apr 9, 2014
* Author: edgar
*/
#include "Mesh.h"
#include "CsvReader.h"
// --------------------------------------------------- //
// TRIANGLE CLASS //
// --------------------------------------------------- //
/** The custom constructor of the Triangle Class */
Triangle::Triangle(int id, cv::Point3f V0, cv::Point3f V1, cv::Point3f V2)
{
id_ = id; v0_ = V0; v1_ = V1; v2_ = V2;
}
/** The default destructor of the Class */
Triangle::~Triangle()
{
// TODO Auto-generated destructor stub
}
// --------------------------------------------------- //
// RAY CLASS //
// --------------------------------------------------- //
/** The custom constructor of the Ray Class */
Ray::Ray(cv::Point3f P0, cv::Point3f P1) {
p0_ = P0; p1_ = P1;
}
/** The default destructor of the Class */
Ray::~Ray()
{
// TODO Auto-generated destructor stub
}
// --------------------------------------------------- //
// OBJECT MESH CLASS //
// --------------------------------------------------- //
/** The default constructor of the ObjectMesh Class */
Mesh::Mesh() : list_vertex_(0) , list_triangles_(0)
{
id_ = 0;
num_vertexs_ = 0;
num_triangles_ = 0;
}
/** The default destructor of the ObjectMesh Class */
Mesh::~Mesh()
{
// TODO Auto-generated destructor stub
}
/** Load a mesh from a *.ply file using the CSV reader **/
void Mesh::load(const std::string path)
{
// Create the reader
CsvReader csvReader(path);
// Clear previous data
list_vertex_.clear();
list_triangles_.clear();
// Read from .ply file
csvReader.readPLY(list_vertex_, list_triangles_);
// Update mesh attributes
num_vertexs_ = (int)list_vertex_.size();
num_triangles_ = (int)list_triangles_.size();
}
/*
* Mesh.h
*
* Created on: Apr 9, 2014
* Author: edgar
*/
#ifndef MESH_H_
#define MESH_H_
#include <iostream>
#include <opencv2/core/core.hpp>
// --------------------------------------------------- //
// TRIANGLE CLASS //
// --------------------------------------------------- //
class Triangle {
public:
explicit Triangle(int id, cv::Point3f V0, cv::Point3f V1, cv::Point3f V2);
virtual ~Triangle();
cv::Point3f getV0() const { return v0_; }
cv::Point3f getV1() const { return v1_; }
cv::Point3f getV2() const { return v2_; }
private:
/** The identifier number of the triangle */
int id_;
/** The three vertices that define the triangle */
cv::Point3f v0_, v1_, v2_;
};
// --------------------------------------------------- //
// RAY CLASS //
// --------------------------------------------------- //
class Ray {
public:
explicit Ray(cv::Point3f P0, cv::Point3f P1);
virtual ~Ray();
cv::Point3f getP0() { return p0_; }
cv::Point3f getP1() { return p1_; }
private:
/** The two points that define the ray */
cv::Point3f p0_, p1_;
};
// --------------------------------------------------- //
// OBJECT MESH CLASS //
// --------------------------------------------------- //
class Mesh
{
public:
Mesh();
virtual ~Mesh();
std::vector<std::vector<int> > getTrianglesList() const { return list_triangles_; }
cv::Point3f getVertex(int pos) const { return list_vertex_[pos]; }
int getNumVertices() const { return num_vertexs_; }
void load(const std::string path_file);
private:
/** The identification number of the mesh */
int id_;
/** The current number of vertices in the mesh */
int num_vertexs_;
/** The current number of triangles in the mesh */
int num_triangles_;
/* The list of vertices of the mesh */
std::vector<cv::Point3f> list_vertex_;
/* The list of triangles of the mesh */
std::vector<std::vector<int> > list_triangles_;
};
#endif /* OBJECTMESH_H_ */
/*
* Model.cpp
*
* Created on: Apr 9, 2014
* Author: edgar
*/
#include "Model.h"
#include "CsvWriter.h"
Model::Model() : list_points2d_in_(0), list_points2d_out_(0), list_points3d_in_(0)
{
n_correspondences_ = 0;
}
Model::~Model()
{
// TODO Auto-generated destructor stub
}
void Model::add_correspondence(const cv::Point2f &point2d, const cv::Point3f &point3d)
{
list_points2d_in_.push_back(point2d);
list_points3d_in_.push_back(point3d);
n_correspondences_++;
}
void Model::add_outlier(const cv::Point2f &point2d)
{
list_points2d_out_.push_back(point2d);
}
void Model::add_descriptor(const cv::Mat &descriptor)
{
descriptors_.push_back(descriptor);
}
void Model::add_keypoint(const cv::KeyPoint &kp)
{
list_keypoints_.push_back(kp);
}
/** Save the 3D/2D correspondences, keypoints and descriptors to a storage (YAML/XML) file */
void Model::save(const std::string path)
{
cv::Mat points3dmatrix = cv::Mat(list_points3d_in_);
cv::Mat points2dmatrix = cv::Mat(list_points2d_in_);
//cv::Mat keyPointmatrix = cv::Mat(list_keypoints_);
cv::FileStorage storage(path, cv::FileStorage::WRITE);
storage << "points_3d" << points3dmatrix;
storage << "points_2d" << points2dmatrix;
storage << "keypoints" << list_keypoints_;
storage << "descriptors" << descriptors_;
storage.release();
}
/** Load a YAML file using OpenCV functions **/
void Model::load(const std::string path)
{
cv::Mat points3d_mat;
cv::FileStorage storage(path, cv::FileStorage::READ);
storage["points_3d"] >> points3d_mat;
storage["descriptors"] >> descriptors_;
points3d_mat.copyTo(list_points3d_in_);
storage.release();
}
/*
* Model.h
*
* Created on: Apr 9, 2014
* Author: edgar
*/
#ifndef MODEL_H_
#define MODEL_H_
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
class Model
{
public:
Model();
virtual ~Model();
std::vector<cv::Point2f> get_points2d_in() const { return list_points2d_in_; }
std::vector<cv::Point2f> get_points2d_out() const { return list_points2d_out_; }
std::vector<cv::Point3f> get_points3d() const { return list_points3d_in_; }
std::vector<cv::KeyPoint> get_keypoints() const { return list_keypoints_; }
cv::Mat get_descriptors() const { return descriptors_; }
int get_numDescriptors() const { return descriptors_.rows; }
void add_correspondence(const cv::Point2f &point2d, const cv::Point3f &point3d);
void add_outlier(const cv::Point2f &point2d);
void add_descriptor(const cv::Mat &descriptor);
void add_keypoint(const cv::KeyPoint &kp);
void save(const std::string path);
void load(const std::string path);
private:
/** The current number of correspondences */
int n_correspondences_;
/** The list of detected 2D keypoints */
std::vector<cv::KeyPoint> list_keypoints_;
/** The list of 2D points on the model surface */
std::vector<cv::Point2f> list_points2d_in_;
/** The list of 2D points outside the model surface */
std::vector<cv::Point2f> list_points2d_out_;
/** The list of 3D points on the model surface */
std::vector<cv::Point3f> list_points3d_in_;
/** The list of 2D points descriptors */
cv::Mat descriptors_;
};
#endif /* OBJECTMODEL_H_ */
/*
* ModelRegistration.cpp
*
* Created on: Apr 18, 2014
* Author: edgar
*/
#include "ModelRegistration.h"
ModelRegistration::ModelRegistration()
{
n_registrations_ = 0;
max_registrations_ = 0;
}
ModelRegistration::~ModelRegistration()
{
// TODO Auto-generated destructor stub
}
void ModelRegistration::registerPoint(const cv::Point2f &point2d, const cv::Point3f &point3d)
{
// add correspondence at the end of the vector
list_points2d_.push_back(point2d);
list_points3d_.push_back(point3d);
n_registrations_++;
}
void ModelRegistration::reset()
{
n_registrations_ = 0;
max_registrations_ = 0;
list_points2d_.clear();
list_points3d_.clear();
}
/*
* ModelRegistration.h
*
* Created on: Apr 18, 2014
* Author: edgar
*/
#ifndef MODELREGISTRATION_H_
#define MODELREGISTRATION_H_
#include <iostream>
#include <opencv2/core/core.hpp>
class ModelRegistration
{
public:
ModelRegistration();
virtual ~ModelRegistration();
void setNumMax(int n) { max_registrations_ = n; }
std::vector<cv::Point2f> get_points2d() const { return list_points2d_; }
std::vector<cv::Point3f> get_points3d() const { return list_points3d_; }
int getNumMax() const { return max_registrations_; }
int getNumRegist() const { return n_registrations_; }
bool is_registrable() const { return (n_registrations_ < max_registrations_); }
void registerPoint(const cv::Point2f &point2d, const cv::Point3f &point3d);
void reset();
private:
/** The current number of registered points */
int n_registrations_;
/** The total number of points to register */
int max_registrations_;
/** The list of 2D points to register the model */
std::vector<cv::Point2f> list_points2d_;
/** The list of 3D points to register the model */
std::vector<cv::Point3f> list_points3d_;
};
#endif /* MODELREGISTRATION_H_ */
/*
* PnPProblem.cpp
*
* Created on: Mar 28, 2014
* Author: Edgar Riba
*/
#include <iostream>
#include <sstream>
#include "PnPProblem.h"
#include "Mesh.h"
#include <opencv2/calib3d/calib3d.hpp>
/* Functions headers */
cv::Point3f CROSS(cv::Point3f v1, cv::Point3f v2);
double DOT(cv::Point3f v1, cv::Point3f v2);
cv::Point3f SUB(cv::Point3f v1, cv::Point3f v2);
cv::Point3f get_nearest_3D_point(std::vector<cv::Point3f> &points_list, cv::Point3f origin);
/* Functions for Möller–Trumbore intersection algorithm */
cv::Point3f CROSS(cv::Point3f v1, cv::Point3f v2)
{
cv::Point3f tmp_p;
tmp_p.x = v1.y*v2.z - v1.z*v2.y;
tmp_p.y = v1.z*v2.x - v1.x*v2.z;
tmp_p.z = v1.x*v2.y - v1.y*v2.x;
return tmp_p;
}
double DOT(cv::Point3f v1, cv::Point3f v2)
{
return v1.x*v2.x + v1.y*v2.y + v1.z*v2.z;
}
cv::Point3f SUB(cv::Point3f v1, cv::Point3f v2)
{
cv::Point3f tmp_p;
tmp_p.x = v1.x - v2.x;
tmp_p.y = v1.y - v2.y;
tmp_p.z = v1.z - v2.z;
return tmp_p;
}
/* End of functions for Möller–Trumbore intersection algorithm */
// Function to get the nearest 3D point to the Ray origin
cv::Point3f get_nearest_3D_point(std::vector<cv::Point3f> &points_list, cv::Point3f origin)
{
cv::Point3f p1 = points_list[0];
cv::Point3f p2 = points_list[1];
double d1 = std::sqrt( std::pow(p1.x-origin.x, 2) + std::pow(p1.y-origin.y, 2) + std::pow(p1.z-origin.z, 2) );
double d2 = std::sqrt( std::pow(p2.x-origin.x, 2) + std::pow(p2.y-origin.y, 2) + std::pow(p2.z-origin.z, 2) );
if(d1 < d2)
{
return p1;
}
else
{
return p2;
}
}
// Custom constructor given the intrinsic camera parameters
PnPProblem::PnPProblem(const double params[])
{
_A_matrix = cv::Mat::zeros(3, 3, CV_64FC1); // intrinsic camera parameters
_A_matrix.at<double>(0, 0) = params[0]; // [ fx 0 cx ]
_A_matrix.at<double>(1, 1) = params[1]; // [ 0 fy cy ]
_A_matrix.at<double>(0, 2) = params[2]; // [ 0 0 1 ]
_A_matrix.at<double>(1, 2) = params[3];
_A_matrix.at<double>(2, 2) = 1;
_R_matrix = cv::Mat::zeros(3, 3, CV_64FC1); // rotation matrix
_t_matrix = cv::Mat::zeros(3, 1, CV_64FC1); // translation matrix
_P_matrix = cv::Mat::zeros(3, 4, CV_64FC1); // rotation-translation matrix
}
PnPProblem::~PnPProblem()
{
// TODO Auto-generated destructor stub
}
void PnPProblem::set_P_matrix( const cv::Mat &R_matrix, const cv::Mat &t_matrix)
{
// Rotation-Translation Matrix Definition
_P_matrix.at<double>(0,0) = R_matrix.at<double>(0,0);
_P_matrix.at<double>(0,1) = R_matrix.at<double>(0,1);
_P_matrix.at<double>(0,2) = R_matrix.at<double>(0,2);
_P_matrix.at<double>(1,0) = R_matrix.at<double>(1,0);
_P_matrix.at<double>(1,1) = R_matrix.at<double>(1,1);
_P_matrix.at<double>(1,2) = R_matrix.at<double>(1,2);
_P_matrix.at<double>(2,0) = R_matrix.at<double>(2,0);
_P_matrix.at<double>(2,1) = R_matrix.at<double>(2,1);
_P_matrix.at<double>(0,3) = t_matrix.at<double>(0);
_P_matrix.at<double>(1,3) = t_matrix.at<double>(1);
_P_matrix.at<double>(2,3) = t_matrix.at<double>(2);
}
// Estimate the pose given a list of 2D/3D correspondences and the method to use
bool PnPProblem::estimatePose( const std::vector<cv::Point3f> &list_points3d,
const std::vector<cv::Point2f> &list_points2d,
int flags)
{
cv::Mat distCoeffs = cv::Mat::zeros(4, 1, CV_64FC1);
cv::Mat rvec = cv::Mat::zeros(3, 1, CV_64FC1);
cv::Mat tvec = cv::Mat::zeros(3, 1, CV_64FC1);
bool useExtrinsicGuess = false;
// Pose estimation
bool correspondence = cv::solvePnP( list_points3d, list_points2d, _A_matrix, distCoeffs, rvec, tvec,
useExtrinsicGuess, flags);
// Transforms Rotation Vector to Matrix
Rodrigues(rvec,_R_matrix);
_t_matrix = tvec;
// Set projection matrix
this->set_P_matrix(_R_matrix, _t_matrix);
return correspondence;
}
// Estimate the pose given a list of 2D/3D correspondences with RANSAC and the method to use
void PnPProblem::estimatePoseRANSAC( const std::vector<cv::Point3f> &list_points3d, // list with model 3D coordinates
const std::vector<cv::Point2f> &list_points2d, // list with scene 2D coordinates
int flags, cv::Mat &inliers, int iterationsCount, // PnP method; inliers container
float reprojectionError, double confidence ) // Ransac parameters
{
cv::Mat distCoeffs = cv::Mat::zeros(4, 1, CV_64FC1); // vector of distortion coefficients
cv::Mat rvec = cv::Mat::zeros(3, 1, CV_64FC1); // output rotation vector
cv::Mat tvec = cv::Mat::zeros(3, 1, CV_64FC1); // output translation vector
bool useExtrinsicGuess = false; // if true the function uses the provided rvec and tvec values as
// initial approximations of the rotation and translation vectors
cv::solvePnPRansac( list_points3d, list_points2d, _A_matrix, distCoeffs, rvec, tvec,
useExtrinsicGuess, iterationsCount, reprojectionError, confidence,
inliers, flags );
Rodrigues(rvec,_R_matrix); // converts Rotation Vector to Matrix
_t_matrix = tvec; // set translation matrix
this->set_P_matrix(_R_matrix, _t_matrix); // set rotation-translation matrix
}
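// Hedged usage sketch (not part of the original sample) of the two estimators above; the
// intrinsic parameters are placeholders and the 2D/3D correspondences come from the caller.
void examplePoseEstimation( const std::vector<cv::Point3f> &list_points3d,
                            const std::vector<cv::Point2f> &list_points2d )
{
  const double params[] = { 800, 800, 320, 240 };   // placeholder fx, fy, cx, cy
  PnPProblem pnp(params);

  cv::Mat inliers;
  // RANSAC variant: 500 iterations, 2-pixel reprojection threshold, 0.99 confidence
  pnp.estimatePoseRANSAC(list_points3d, list_points2d, cv::SOLVEPNP_EPNP,
                         inliers, 500, 2.0f, 0.99);

  cv::Mat P = pnp.get_P_matrix();                   // resulting 3x4 [R|t] matrix
  (void)P;
}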
// Given the mesh, backproject the 3D points to 2D to verify the pose estimation
std::vector<cv::Point2f> PnPProblem::verify_points(Mesh *mesh)
{
std::vector<cv::Point2f> verified_points_2d;
for( int i = 0; i < mesh->getNumVertices(); i++)
{
cv::Point3f point3d = mesh->getVertex(i);
cv::Point2f point2d = this->backproject3DPoint(point3d);
verified_points_2d.push_back(point2d);
}
return verified_points_2d;
}
// Backproject a 3D point to 2D using the estimated pose parameters
cv::Point2f PnPProblem::backproject3DPoint(const cv::Point3f &point3d)
{
// 3D point vector [x y z 1]'
cv::Mat point3d_vec = cv::Mat(4, 1, CV_64FC1);
point3d_vec.at<double>(0) = point3d.x;
point3d_vec.at<double>(1) = point3d.y;
point3d_vec.at<double>(2) = point3d.z;
point3d_vec.at<double>(3) = 1;
// 2D point vector [u v 1]'
cv::Mat point2d_vec = cv::Mat(4, 1, CV_64FC1);
point2d_vec = _A_matrix * _P_matrix * point3d_vec;
// Normalization of [u v]'
cv::Point2f point2d;
point2d.x = (float)(point2d_vec.at<double>(0) / point2d_vec.at<double>(2));
point2d.y = (float)(point2d_vec.at<double>(1) / point2d_vec.at<double>(2));
return point2d;
}
// Backproject a 2D point to 3D and return whether it lies on the object surface
bool PnPProblem::backproject2DPoint(const Mesh *mesh, const cv::Point2f &point2d, cv::Point3f &point3d)
{
// Triangles list of the object mesh
std::vector<std::vector<int> > triangles_list = mesh->getTrianglesList();
double lambda = 8;
double u = point2d.x;
double v = point2d.y;
// Point in vector form
cv::Mat point2d_vec = cv::Mat::ones(3, 1, CV_64F); // 3x1
point2d_vec.at<double>(0) = u * lambda;
point2d_vec.at<double>(1) = v * lambda;
point2d_vec.at<double>(2) = lambda;
// Point in camera coordinates
cv::Mat X_c = _A_matrix.inv() * point2d_vec ; // 3x1
// Point in world coordinates
cv::Mat X_w = _R_matrix.inv() * ( X_c - _t_matrix ); // 3x1
// Center of projection
cv::Mat C_op = cv::Mat(_R_matrix.inv()).mul(-1) * _t_matrix; // 3x1
// Ray direction vector
cv::Mat ray = X_w - C_op; // 3x1
ray = ray / cv::norm(ray); // 3x1
// Set up Ray
Ray R((cv::Point3f)C_op, (cv::Point3f)ray);
// A vector to store the intersections found
std::vector<cv::Point3f> intersections_list;
// Loop for all the triangles and check the intersection
for (unsigned int i = 0; i < triangles_list.size(); i++)
{
cv::Point3f V0 = mesh->getVertex(triangles_list[i][0]);
cv::Point3f V1 = mesh->getVertex(triangles_list[i][1]);
cv::Point3f V2 = mesh->getVertex(triangles_list[i][2]);
Triangle T(i, V0, V1, V2);
double out;
if(this->intersect_MollerTrumbore(R, T, &out))
{
cv::Point3f tmp_pt = R.getP0() + out*R.getP1(); // P = O + t*D
intersections_list.push_back(tmp_pt);
}
}
// If there are intersections, find the nearest one
if (!intersections_list.empty())
{
point3d = get_nearest_3D_point(intersections_list, R.getP0());
return true;
}
else
{
return false;
}
}
// Möller–Trumbore intersection algorithm
bool PnPProblem::intersect_MollerTrumbore(Ray &Ray, Triangle &Triangle, double *out)
{
const double EPSILON = 0.000001;
cv::Point3f e1, e2;
cv::Point3f P, Q, T;
double det, inv_det, u, v;
double t;
cv::Point3f V1 = Triangle.getV0(); // Triangle vertices
cv::Point3f V2 = Triangle.getV1();
cv::Point3f V3 = Triangle.getV2();
cv::Point3f O = Ray.getP0(); // Ray origin
cv::Point3f D = Ray.getP1(); // Ray direction
//Find vectors for two edges sharing V1
e1 = SUB(V2, V1);
e2 = SUB(V3, V1);
// Begin calculation determinant - also used to calculate U parameter
P = CROSS(D, e2);
// If the determinant is near zero, the ray lies in the plane of the triangle
det = DOT(e1, P);
//NOT CULLING
if(det > -EPSILON && det < EPSILON) return false;
inv_det = 1.f / det;
//calculate distance from V1 to ray origin
T = SUB(O, V1);
//Calculate u parameter and test bound
u = DOT(T, P) * inv_det;
//The intersection lies outside of the triangle
if(u < 0.f || u > 1.f) return false;
//Prepare to test v parameter
Q = CROSS(T, e1);
//Calculate V parameter and test bound
v = DOT(D, Q) * inv_det;
//The intersection lies outside of the triangle
if(v < 0.f || u + v > 1.f) return false;
t = DOT(e2, Q) * inv_det;
if(t > EPSILON) { //ray intersection
*out = t;
return true;
}
// No hit, no win
return false;
}
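// Hedged sanity check (not part of the original sample) for the intersection routine above:
// a ray fired along +Z from below a triangle lying in the z = 1 plane should report a hit
// at t = 1, i.e. at P = O + t*D.
bool checkMollerTrumbore(PnPProblem &pnp)
{
  Triangle tri(0, cv::Point3f(0, 0, 1), cv::Point3f(1, 0, 1), cv::Point3f(0, 1, 1));
  Ray ray(cv::Point3f(0.25f, 0.25f, 0.f), cv::Point3f(0.f, 0.f, 1.f)); // origin, direction
  double t = 0;
  bool hit = pnp.intersect_MollerTrumbore(ray, tri, &t);
  return hit && t > 0.999 && t < 1.001;
}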
/*
* PnPProblem.h
*
* Created on: Mar 28, 2014
* Author: Edgar Riba
*/
#ifndef PNPPROBLEM_H_
#define PNPPROBLEM_H_
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include "Mesh.h"
#include "ModelRegistration.h"
class PnPProblem
{
public:
explicit PnPProblem(const double param[]); // custom constructor
virtual ~PnPProblem();
bool backproject2DPoint(const Mesh *mesh, const cv::Point2f &point2d, cv::Point3f &point3d);
bool intersect_MollerTrumbore(Ray &R, Triangle &T, double *out);
std::vector<cv::Point2f> verify_points(Mesh *mesh);
cv::Point2f backproject3DPoint(const cv::Point3f &point3d);
bool estimatePose(const std::vector<cv::Point3f> &list_points3d, const std::vector<cv::Point2f> &list_points2d, int flags);
void estimatePoseRANSAC( const std::vector<cv::Point3f> &list_points3d, const std::vector<cv::Point2f> &list_points2d,
int flags, cv::Mat &inliers,
int iterationsCount, float reprojectionError, double confidence );
cv::Mat get_A_matrix() const { return _A_matrix; }
cv::Mat get_R_matrix() const { return _R_matrix; }
cv::Mat get_t_matrix() const { return _t_matrix; }
cv::Mat get_P_matrix() const { return _P_matrix; }
void set_P_matrix( const cv::Mat &R_matrix, const cv::Mat &t_matrix);
private:
/** The calibration matrix */
cv::Mat _A_matrix;
/** The computed rotation matrix */
cv::Mat _R_matrix;
/** The computed translation matrix */
cv::Mat _t_matrix;
/** The computed projection matrix */
cv::Mat _P_matrix;
};
// Functions for Möller–Trumbore intersection algorithm
cv::Point3f CROSS(cv::Point3f v1, cv::Point3f v2);
double DOT(cv::Point3f v1, cv::Point3f v2);
cv::Point3f SUB(cv::Point3f v1, cv::Point3f v2);
#endif /* PNPPROBLEM_H_ */
/*
* RobustMatcher.cpp
*
* Created on: Jun 4, 2014
* Author: eriba
*/
#include "RobustMatcher.h"
#include <time.h>
#include <opencv2/features2d/features2d.hpp>
RobustMatcher::~RobustMatcher()
{
// TODO Auto-generated destructor stub
}
void RobustMatcher::computeKeyPoints( const cv::Mat& image, std::vector<cv::KeyPoint>& keypoints)
{
detector_->detect(image, keypoints);
}
void RobustMatcher::computeDescriptors( const cv::Mat& image, std::vector<cv::KeyPoint>& keypoints, cv::Mat& descriptors)
{
extractor_->compute(image, keypoints, descriptors);
}
int RobustMatcher::ratioTest(std::vector<std::vector<cv::DMatch> > &matches)
{
int removed = 0;
// for all matches
for ( std::vector<std::vector<cv::DMatch> >::iterator
matchIterator= matches.begin(); matchIterator!= matches.end(); ++matchIterator)
{
// if 2 nearest neighbours have been identified
if (matchIterator->size() > 1)
{
// check distance ratio
if ((*matchIterator)[0].distance / (*matchIterator)[1].distance > ratio_)
{
matchIterator->clear(); // remove match
removed++;
}
}
else
{ // does not have 2 neighbours
matchIterator->clear(); // remove match
removed++;
}
}
return removed;
}
void RobustMatcher::symmetryTest( const std::vector<std::vector<cv::DMatch> >& matches1,
const std::vector<std::vector<cv::DMatch> >& matches2,
std::vector<cv::DMatch>& symMatches )
{
// for all matches image 1 -> image 2
for (std::vector<std::vector<cv::DMatch> >::const_iterator
matchIterator1 = matches1.begin(); matchIterator1 != matches1.end(); ++matchIterator1)
{
// ignore deleted matches
if (matchIterator1->empty() || matchIterator1->size() < 2)
continue;
// for all matches image 2 -> image 1
for (std::vector<std::vector<cv::DMatch> >::const_iterator
matchIterator2 = matches2.begin(); matchIterator2 != matches2.end(); ++matchIterator2)
{
// ignore deleted matches
if (matchIterator2->empty() || matchIterator2->size() < 2)
continue;
// Match symmetry test
if ((*matchIterator1)[0].queryIdx ==
(*matchIterator2)[0].trainIdx &&
(*matchIterator2)[0].queryIdx ==
(*matchIterator1)[0].trainIdx)
{
// add symmetrical match
symMatches.push_back(
cv::DMatch((*matchIterator1)[0].queryIdx,
(*matchIterator1)[0].trainIdx,
(*matchIterator1)[0].distance));
break; // next match in image 1 -> image 2
}
}
}
}
void RobustMatcher::robustMatch( const cv::Mat& frame, std::vector<cv::DMatch>& good_matches,
std::vector<cv::KeyPoint>& keypoints_frame, const cv::Mat& descriptors_model )
{
// 1a. Detection of the ORB features
this->computeKeyPoints(frame, keypoints_frame);
// 1b. Extraction of the ORB descriptors
cv::Mat descriptors_frame;
this->computeDescriptors(frame, keypoints_frame, descriptors_frame);
// 2. Match the two image descriptors
std::vector<std::vector<cv::DMatch> > matches12, matches21;
// 2a. From image 1 to image 2
matcher_->knnMatch(descriptors_frame, descriptors_model, matches12, 2); // return 2 nearest neighbours
// 2b. From image 2 to image 1
matcher_->knnMatch(descriptors_model, descriptors_frame, matches21, 2); // return 2 nearest neighbours
// 3. Remove matches for which the NN ratio is greater than the threshold
// clean image 1 -> image 2 matches
ratioTest(matches12);
// clean image 2 -> image 1 matches
ratioTest(matches21);
// 4. Remove non-symmetrical matches
symmetryTest(matches12, matches21, good_matches);
}
void RobustMatcher::fastRobustMatch( const cv::Mat& frame, std::vector<cv::DMatch>& good_matches,
std::vector<cv::KeyPoint>& keypoints_frame,
const cv::Mat& descriptors_model )
{
good_matches.clear();
// 1a. Detection of the ORB features
this->computeKeyPoints(frame, keypoints_frame);
// 1b. Extraction of the ORB descriptors
cv::Mat descriptors_frame;
this->computeDescriptors(frame, keypoints_frame, descriptors_frame);
// 2. Match the two image descriptors
std::vector<std::vector<cv::DMatch> > matches;
matcher_->knnMatch(descriptors_frame, descriptors_model, matches, 2);
// 3. Remove matches for which the NN ratio is greater than the threshold
ratioTest(matches);
// 4. Fill good matches container
for ( std::vector<std::vector<cv::DMatch> >::iterator
matchIterator= matches.begin(); matchIterator!= matches.end(); ++matchIterator)
{
if (!matchIterator->empty()) good_matches.push_back((*matchIterator)[0]);
}
}
/*
* RobustMatcher.h
*
* Created on: Jun 4, 2014
* Author: eriba
*/
#ifndef ROBUSTMATCHER_H_
#define ROBUSTMATCHER_H_
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/features2d/features2d.hpp>
class RobustMatcher {
public:
RobustMatcher() : ratio_(0.8f)
{
// ORB is the default feature
detector_ = new cv::OrbFeatureDetector();
extractor_ = new cv::OrbDescriptorExtractor();
// BruteForce matcher with Hamming norm is the default matcher
matcher_ = new cv::BFMatcher(cv::NORM_HAMMING, false);
}
virtual ~RobustMatcher();
// Set the feature detector
void setFeatureDetector(cv::FeatureDetector * detect) { detector_ = detect; }
// Set the descriptor extractor
void setDescriptorExtractor(cv::DescriptorExtractor * desc) { extractor_ = desc; }
// Set the matcher
void setDescriptorMatcher(cv::DescriptorMatcher * match) { matcher_ = match; }
// Compute the keypoints of an image
void computeKeyPoints( const cv::Mat& image, std::vector<cv::KeyPoint>& keypoints);
// Compute the descriptors of an image given its keypoints
void computeDescriptors( const cv::Mat& image, std::vector<cv::KeyPoint>& keypoints, cv::Mat& descriptors);
// Set ratio parameter for the ratio test
void setRatio( float rat) { ratio_ = rat; }
// Clear matches for which the NN ratio is greater than the threshold
// return the number of removed points
// (corresponding entries being cleared,
// i.e. size will be 0)
int ratioTest(std::vector<std::vector<cv::DMatch> > &matches);
// Insert symmetrical matches in symMatches vector
void symmetryTest( const std::vector<std::vector<cv::DMatch> >& matches1,
const std::vector<std::vector<cv::DMatch> >& matches2,
std::vector<cv::DMatch>& symMatches );
// Match feature points using ratio and symmetry test
void robustMatch( const cv::Mat& frame, std::vector<cv::DMatch>& good_matches,
std::vector<cv::KeyPoint>& keypoints_frame,
const cv::Mat& descriptors_model );
// Match feature points using ratio test
void fastRobustMatch( const cv::Mat& frame, std::vector<cv::DMatch>& good_matches,
std::vector<cv::KeyPoint>& keypoints_frame,
const cv::Mat& descriptors_model );
private:
// pointer to the feature point detector object
cv::FeatureDetector * detector_;
// pointer to the feature descriptor extractor object
cv::DescriptorExtractor * extractor_;
// pointer to the matcher object
cv::DescriptorMatcher * matcher_;
// max ratio between 1st and 2nd NN
float ratio_;
};
#endif /* ROBUSTMATCHER_H_ */
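// Hedged usage sketch (not part of the patch) of the matching pipeline declared above; the
// camera frame and the model descriptors are assumed to come from the registration step.
#include <vector>
#include "RobustMatcher.h"

void matchAgainstModel( const cv::Mat &frame, const cv::Mat &descriptors_model )
{
  RobustMatcher rmatcher;                       // ORB + brute-force Hamming by default
  rmatcher.setRatio(0.70f);                     // tighter ratio test than the 0.8 default

  std::vector<cv::DMatch> good_matches;
  std::vector<cv::KeyPoint> keypoints_frame;
  // two-directional ratio test followed by the symmetry test
  rmatcher.robustMatch(frame, good_matches, keypoints_frame, descriptors_model);
}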
/*
* Utils.cpp
*
* Created on: Mar 28, 2014
* Author: Edgar Riba
*/
#include <iostream>
#include "PnPProblem.h"
#include "ModelRegistration.h"
#include "Utils.h"
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/calib3d/calib3d.hpp>
// For text
int fontFace = cv::FONT_ITALIC;
double fontScale = 0.75;
int thickness_font = 2;
// For circles
int lineType = 8;
int radius = 4;
double thickness_circ = -1;
// Draw a text with the question point
void drawQuestion(cv::Mat image, cv::Point3f point, cv::Scalar color)
{
std::string x = IntToString((int)point.x);
std::string y = IntToString((int)point.y);
std::string z = IntToString((int)point.z);
std::string text = " Where is point (" + x + "," + y + "," + z + ") ?";
cv::putText(image, text, cv::Point(25,50), fontFace, fontScale, color, thickness_font, 8);
}
// Draw a text with the number of entered points
void drawText(cv::Mat image, std::string text, cv::Scalar color)
{
cv::putText(image, text, cv::Point(25,50), fontFace, fontScale, color, thickness_font, 8);
}
// Draw a text with the number of entered points
void drawText2(cv::Mat image, std::string text, cv::Scalar color)
{
cv::putText(image, text, cv::Point(25,75), fontFace, fontScale, color, thickness_font, 8);
}
// Draw a text with the frame rate
void drawFPS(cv::Mat image, double fps, cv::Scalar color)
{
std::string fps_str = IntToString((int)fps);
std::string text = fps_str + " FPS";
cv::putText(image, text, cv::Point(500,50), fontFace, fontScale, color, thickness_font, 8);
}
// Draw a text with the confidence percentage
void drawConfidence(cv::Mat image, double confidence, cv::Scalar color)
{
std::string conf_str = IntToString((int)confidence);
std::string text = conf_str + " %";
cv::putText(image, text, cv::Point(500,75), fontFace, fontScale, color, thickness_font, 8);
}
// Draw a text with the number of entered points
void drawCounter(cv::Mat image, int n, int n_max, cv::Scalar color)
{
std::string n_str = IntToString(n);
std::string n_max_str = IntToString(n_max);
std::string text = n_str + " of " + n_max_str + " points";
cv::putText(image, text, cv::Point(500,50), fontFace, fontScale, color, thickness_font, 8);
}
// Draw the points and the coordinates
void drawPoints(cv::Mat image, std::vector<cv::Point2f> &list_points_2d, std::vector<cv::Point3f> &list_points_3d, cv::Scalar color)
{
for (unsigned int i = 0; i < list_points_2d.size(); ++i)
{
cv::Point2f point_2d = list_points_2d[i];
cv::Point3f point_3d = list_points_3d[i];
// Draw Selected points
cv::circle(image, point_2d, radius, color, -1, lineType );
std::string idx = IntToString(i+1);
std::string x = IntToString((int)point_3d.x);
std::string y = IntToString((int)point_3d.y);
std::string z = IntToString((int)point_3d.z);
std::string text = "P" + idx + " (" + x + "," + y + "," + z +")";
point_2d.x = point_2d.x + 10;
point_2d.y = point_2d.y - 10;
cv::putText(image, text, point_2d, fontFace, fontScale*0.5, color, thickness_font, 8);
}
}
// Draw only the 2D points
void draw2DPoints(cv::Mat image, std::vector<cv::Point2f> &list_points, cv::Scalar color)
{
for( size_t i = 0; i < list_points.size(); i++)
{
cv::Point2f point_2d = list_points[i];
// Draw Selected points
cv::circle(image, point_2d, radius, color, -1, lineType );
}
}
// Draw an arrow into the image
void drawArrow(cv::Mat image, cv::Point2i p, cv::Point2i q, cv::Scalar color, int arrowMagnitude, int thickness, int line_type, int shift)
{
// Draw the principal line
cv::line(image, p, q, color, thickness, line_type, shift);
const double PI = CV_PI;
//compute the angle alpha
double angle = atan2((double)p.y-q.y, (double)p.x-q.x);
//compute the coordinates of the first segment
p.x = (int) ( q.x + arrowMagnitude * cos(angle + PI/4));
p.y = (int) ( q.y + arrowMagnitude * sin(angle + PI/4));
//Draw the first segment
cv::line(image, p, q, color, thickness, line_type, shift);
//compute the coordinates of the second segment
p.x = (int) ( q.x + arrowMagnitude * cos(angle - PI/4));
p.y = (int) ( q.y + arrowMagnitude * sin(angle - PI/4));
//Draw the second segment
cv::line(image, p, q, color, thickness, line_type, shift);
}
// Draw the 3D coordinate axes
void draw3DCoordinateAxes(cv::Mat image, const std::vector<cv::Point2f> &list_points2d)
{
cv::Scalar red(0, 0, 255);
cv::Scalar green(0,255,0);
cv::Scalar blue(255,0,0);
cv::Scalar black(0,0,0);
cv::Point2i origin = list_points2d[0];
cv::Point2i pointX = list_points2d[1];
cv::Point2i pointY = list_points2d[2];
cv::Point2i pointZ = list_points2d[3];
drawArrow(image, origin, pointX, red, 9, 2);
drawArrow(image, origin, pointY, blue, 9, 2);
drawArrow(image, origin, pointZ, green, 9, 2);
cv::circle(image, origin, radius/2, black, -1, lineType );
}
// Draw the object mesh
void drawObjectMesh(cv::Mat image, const Mesh *mesh, PnPProblem *pnpProblem, cv::Scalar color)
{
std::vector<std::vector<int> > list_triangles = mesh->getTrianglesList();
for( size_t i = 0; i < list_triangles.size(); i++)
{
std::vector<int> tmp_triangle = list_triangles.at(i);
cv::Point3f point_3d_0 = mesh->getVertex(tmp_triangle[0]);
cv::Point3f point_3d_1 = mesh->getVertex(tmp_triangle[1]);
cv::Point3f point_3d_2 = mesh->getVertex(tmp_triangle[2]);
cv::Point2f point_2d_0 = pnpProblem->backproject3DPoint(point_3d_0);
cv::Point2f point_2d_1 = pnpProblem->backproject3DPoint(point_3d_1);
cv::Point2f point_2d_2 = pnpProblem->backproject3DPoint(point_3d_2);
cv::line(image, point_2d_0, point_2d_1, color, 1);
cv::line(image, point_2d_1, point_2d_2, color, 1);
cv::line(image, point_2d_2, point_2d_0, color, 1);
}
}
// Computes the norm of the translation error
double get_translation_error(const cv::Mat &t_true, const cv::Mat &t)
{
return cv::norm( t_true - t );
}
// Computes the norm of the rotation error
double get_rotation_error(const cv::Mat &R_true, const cv::Mat &R)
{
cv::Mat error_vec, error_mat;
error_mat = R_true * cv::Mat(R.inv()).mul(-1);
cv::Rodrigues(error_mat, error_vec);
return cv::norm(error_vec);
}
// Converts a given Rotation Matrix to Euler angles
cv::Mat rot2euler(const cv::Mat & rotationMatrix)
{
cv::Mat euler(3,1,CV_64F);
double m00 = rotationMatrix.at<double>(0,0);
double m02 = rotationMatrix.at<double>(0,2);
double m10 = rotationMatrix.at<double>(1,0);
double m11 = rotationMatrix.at<double>(1,1);
double m12 = rotationMatrix.at<double>(1,2);
double m20 = rotationMatrix.at<double>(2,0);
double m22 = rotationMatrix.at<double>(2,2);
double x, y, z;
// Assuming the angles are in radians.
if (m10 > 0.998) { // singularity at north pole
x = 0;
y = CV_PI/2;
z = atan2(m02,m22);
}
else if (m10 < -0.998) { // singularity at south pole
x = 0;
y = -CV_PI/2;
z = atan2(m02,m22);
}
else
{
x = atan2(-m12,m11);
y = asin(m10);
z = atan2(-m20,m00);
}
euler.at<double>(0) = x;
euler.at<double>(1) = y;
euler.at<double>(2) = z;
return euler;
}
// Converts given Euler angles to a Rotation Matrix
cv::Mat euler2rot(const cv::Mat & euler)
{
cv::Mat rotationMatrix(3,3,CV_64F);
double x = euler.at<double>(0);
double y = euler.at<double>(1);
double z = euler.at<double>(2);
// Assuming the angles are in radians.
double ch = cos(z);
double sh = sin(z);
double ca = cos(y);
double sa = sin(y);
double cb = cos(x);
double sb = sin(x);
double m00, m01, m02, m10, m11, m12, m20, m21, m22;
m00 = ch * ca;
m01 = sh*sb - ch*sa*cb;
m02 = ch*sa*sb + sh*cb;
m10 = sa;
m11 = ca*cb;
m12 = -ca*sb;
m20 = -sh*ca;
m21 = sh*sa*cb + ch*sb;
m22 = -sh*sa*sb + ch*cb;
rotationMatrix.at<double>(0,0) = m00;
rotationMatrix.at<double>(0,1) = m01;
rotationMatrix.at<double>(0,2) = m02;
rotationMatrix.at<double>(1,0) = m10;
rotationMatrix.at<double>(1,1) = m11;
rotationMatrix.at<double>(1,2) = m12;
rotationMatrix.at<double>(2,0) = m20;
rotationMatrix.at<double>(2,1) = m21;
rotationMatrix.at<double>(2,2) = m22;
return rotationMatrix;
}
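// Hedged round-trip check (not part of the original sample) for the two conversions above;
// the angles are placeholder values chosen away from the +/-90 degree singularity, so
// rot2euler(euler2rot(e)) should recover e and the returned norm should be ~0.
double checkEulerRoundTrip()
{
  cv::Mat euler = (cv::Mat_<double>(3, 1) << 0.1, 0.4, -0.3); // radians
  cv::Mat R = euler2rot(euler);        // 3x3 rotation matrix
  cv::Mat euler_back = rot2euler(R);   // should recover the input angles
  return cv::norm(euler - euler_back);
}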
// Converts a given string to an integer
int StringToInt ( const std::string &Text )
{
std::istringstream ss(Text);
int result;
return ss >> result ? result : 0;
}
// Converts a given float to a string
std::string FloatToString ( float Number )
{
std::ostringstream ss;
ss << Number;
return ss.str();
}
// Converts a given integer to a string
std::string IntToString ( int Number )
{
std::ostringstream ss;
ss << Number;
return ss.str();
}
/*
* Utils.h
*
* Created on: Mar 28, 2014
* Author: Edgar Riba
*/
#ifndef UTILS_H_
#define UTILS_H_
#include <iostream>
#include "PnPProblem.h"
// Draw a text with the question point
void drawQuestion(cv::Mat image, cv::Point3f point, cv::Scalar color);
// Draw a text with the number of entered points
void drawText(cv::Mat image, std::string text, cv::Scalar color);
// Draw a text with the number of entered points
void drawText2(cv::Mat image, std::string text, cv::Scalar color);
// Draw a text with the frame rate
void drawFPS(cv::Mat image, double fps, cv::Scalar color);
// Draw a text with the confidence percentage
void drawConfidence(cv::Mat image, double confidence, cv::Scalar color);
// Draw a text with the number of entered points
void drawCounter(cv::Mat image, int n, int n_max, cv::Scalar color);
// Draw the points and the coordinates
void drawPoints(cv::Mat image, std::vector<cv::Point2f> &list_points_2d, std::vector<cv::Point3f> &list_points_3d, cv::Scalar color);
// Draw only the 2D points
void draw2DPoints(cv::Mat image, std::vector<cv::Point2f> &list_points, cv::Scalar color);
// Draw an arrow into the image
void drawArrow(cv::Mat image, cv::Point2i p, cv::Point2i q, cv::Scalar color, int arrowMagnitude = 9, int thickness=1, int line_type=8, int shift=0);
// Draw the 3D coordinate axes
void draw3DCoordinateAxes(cv::Mat image, const std::vector<cv::Point2f> &list_points2d);
// Draw the object mesh
void drawObjectMesh(cv::Mat image, const Mesh *mesh, PnPProblem *pnpProblem, cv::Scalar color);
// Computes the norm of the translation error
double get_translation_error(const cv::Mat &t_true, const cv::Mat &t);
// Computes the norm of the rotation error
double get_rotation_error(const cv::Mat &R_true, const cv::Mat &R);
// Converts a given Rotation Matrix to Euler angles
cv::Mat rot2euler(const cv::Mat & rotationMatrix);
// Converts given Euler angles to a Rotation Matrix
cv::Mat euler2rot(const cv::Mat & euler);
// Converts a given string to an integer
int StringToInt ( const std::string &Text );
// Converts a given float to a string
std::string FloatToString ( float Number );
// Converts a given integer to a string
std::string IntToString ( int Number );
#endif /* UTILS_H_ */