Commit 43f3eb9f authored by biagio montesano's avatar biagio montesano

New warnings corrected

parent b433ac58
.. _LSDDetector:
Line Segments Detector
======================
Line extraction methodology
----------------------------
The line extraction methodology described in the following is mainly based on [EDL]_.
The extraction starts with a Gaussian pyramid generated from the original image, downsampled N-1 times and blurred N times, to obtain N layers (one for each octave), with layer 0 corresponding to the input image. Then, from each layer (octave) in the pyramid, lines are extracted using the LSD algorithm.
Differently from the EDLine extractor used in the original article, LSD furnishes information only about a line's extremes; additional information such as the line's slope and equation is therefore computed analytically. The number of pixels is obtained using `LineIterator <http://docs.opencv.org/modules/core/doc/drawing_functions.html#lineiterator>`_. Extracted lines are returned in the form of KeyLine objects but, since the extraction is based on a method different from the one used in the `BinaryDescriptor <binary_descriptor.html>`_ class, the data associated with a line's extremes in the original image and in the octave it was extracted from coincide. KeyLine's field *class_id* is used as an index to indicate the order of extraction of a line inside a single octave.
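A minimal sketch of the pyramid construction described above (it mirrors the ``GaussianBlur``/``pyrDown`` steps used by the detector's internal Gaussian pyramid computation; the free function ``buildPyramidSketch`` is purely illustrative and not part of the API)::

    #include <opencv2/core.hpp>
    #include <opencv2/imgproc.hpp>

    #include <vector>

    /* blur the input once, store it as octave 0, then downsample by 'scale'
       until numOctaves layers have been collected */
    static std::vector<cv::Mat> buildPyramidSketch( const cv::Mat& image, int numOctaves, int scale )
    {
      std::vector<cv::Mat> octaves;
      cv::Mat current = image.clone();
      cv::GaussianBlur( current, current, cv::Size( 5, 5 ), 1 );
      octaves.push_back( current );
      for ( int o = 1; o < numOctaves; o++ )
      {
        /* pyrDown blurs and downsamples, producing the next octave */
        cv::pyrDown( current, current, cv::Size( current.cols / scale, current.rows / scale ) );
        octaves.push_back( current );
      }
      return octaves;
    }
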
LSDDetector::createLSDDetector
------------------------------
Creates an LSDDetector object, using smart pointers.
.. ocv:function:: Ptr<LSDDetector> LSDDetector::createLSDDetector()
LSDDetector::detect
-------------------
Detects lines inside an image or a set of images.
.. ocv:function:: void LSDDetector::detect( const Mat& image, std::vector<KeyLine>& keylines, int scale, int numOctaves, const Mat& mask=Mat())
.. ocv:function:: void LSDDetector::detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, int scale, int numOctaves, const std::vector<Mat>& masks=std::vector<Mat>() ) const
:param image: input image
:param images: input images
:param keylines: vector or set of vectors that will store extracted lines for one or more images
:param mask: mask matrix to detect only KeyLines of interest
:param masks: vector of mask matrices to detect only KeyLines of interest from each input image
:param scale: scale factor used in pyramid generation
:param numOctaves: number of octaves inside the pyramid
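A minimal usage sketch of the two functions above (the image path and the scale/octave values are placeholders chosen for illustration)::

    #include <opencv2/line_descriptor/descriptor.hpp>
    #include <opencv2/opencv.hpp>

    #include <iostream>
    #include <vector>

    int main()
    {
      /* load a test image (placeholder path) as 8-bit grayscale */
      cv::Mat image = cv::imread( "building.jpg", 0 );
      if( image.empty() )
        return -1;

      /* create an LSDDetector object through its smart-pointer factory */
      cv::Ptr<cv::LSDDetector> lsd = cv::LSDDetector::createLSDDetector();

      /* detect lines on a 2-octave pyramid built with scale factor 2;
         the default empty mask keeps every extracted KeyLine */
      std::vector<cv::KeyLine> keylines;
      lsd->detect( image, keylines, 2, 2 );

      std::cout << "Detected " << keylines.size() << " lines" << std::endl;
      return 0;
    }
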
References
----------
.. [EDL] Von Gioi, R. Grompone, et al. *LSD: A fast line segment detector with a false detection control*, IEEE Transactions on Pattern Analysis and Machine Intelligence 32.4 (2010): 722-732.
......@@ -125,6 +125,8 @@ References
.. [LBD] Zhang, Lilian, and Reinhard Koch. *An efficient and robust line segment matching approach based on LBD descriptor and pairwise geometric consistency*, Journal of Visual Communication and Image Representation 24.7 (2013): 794-805.
.. [EDL] Von Gioi, R. Grompone, et al. *LSD: A fast line segment detector with a false detection control*, IEEE Transactions on Pattern Analysis and Machine Intelligence 32.4 (2010): 722-732.
Summary
-------
......
......@@ -43,7 +43,6 @@
#define __OPENCV_LINE_DESCRIPTOR_HPP__
#include "opencv2/line_descriptor/descriptor.hpp"
//#include "opencv2/core.hpp"
namespace cv
{
......
......@@ -134,15 +134,6 @@ class CV_EXPORTS_W BinaryDescriptor : public Algorithm
};
struct CV_EXPORTS LineDetectionMode
{
enum
{
LSD_DETECTOR = 0, // detect lines using LSD
EDL_DETECTOR = 1 // detect lines using EDLines
};
};
/* constructor */
CV_WRAP
BinaryDescriptor( const BinaryDescriptor::Params &parameters = BinaryDescriptor::Params() );
......@@ -162,53 +153,48 @@ class CV_EXPORTS_W BinaryDescriptor : public Algorithm
int getReductionRatio();
void setReductionRatio( int rRatio );
/* read parameters from a FileNode object and store them (class function ) */
/* reads parameters from a FileNode object and store them (class function ) */
virtual void read( const cv::FileNode& fn );
/* store parameters to a FileStorage object (class function) */
/* stores parameters to a FileStorage object (class function) */
virtual void write( cv::FileStorage& fs ) const;
/* requires line detection (only one image) */
CV_WRAP
void detect( const Mat& image, CV_OUT std::vector<KeyLine>& keypoints, const Mat& mask = Mat(), int flags = LineDetectionMode::LSD_DETECTOR );
void detect( const Mat& image, CV_OUT std::vector<KeyLine>& keypoints, const Mat& mask = Mat() );
/* requires line detection (more than one image) */
void detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, const std::vector<Mat>& masks = std::vector<Mat>(),
int flags = LineDetectionMode::LSD_DETECTOR ) const;
void detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, const std::vector<Mat>& masks =
std::vector<Mat>() ) const;
/* requires descriptors computation (only one image) */
CV_WRAP
void compute( const Mat& image, CV_OUT CV_IN_OUT std::vector<KeyLine>& keylines, CV_OUT Mat& descriptors, bool returnFloatDescr = false, int flags =
LineDetectionMode::LSD_DETECTOR ) const;
void compute( const Mat& image, CV_OUT CV_IN_OUT std::vector<KeyLine>& keylines, CV_OUT Mat& descriptors, bool returnFloatDescr = false ) const;
/* requires descriptors computation (more than one image) */
void compute( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, std::vector<Mat>& descriptors, bool returnFloatDescr =
false,
int flags = LineDetectionMode::LSD_DETECTOR ) const;
false ) const;
/*return descriptor size */
/* returns descriptor size */
int descriptorSize() const;
/* return data type */
/* returns data type */
int descriptorType() const;
/* return norm mode */
/* returns norm mode */
int defaultNorm() const;
/* check whether Gaussian pyramids were created */
bool empty() const;
/* definition of operator () */
CV_WRAP_AS(detectAndCompute)
virtual void operator()( InputArray image, InputArray mask, CV_OUT std::vector<KeyLine>& keylines, OutputArray descriptors,
bool useProvidedKeyLines = false, bool returnFloatDescr = false, int flags = LineDetectionMode::LSD_DETECTOR ) const;
bool useProvidedKeyLines = false, bool returnFloatDescr = false ) const;
protected:
/* implementation of line detection */
virtual void detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, int flags, const Mat& mask = Mat() ) const;
virtual void detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, const Mat& mask = Mat() ) const;
/* implementation of descriptors' computation */
virtual void computeImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, Mat& descriptors, bool returnFloatDescr, int flags ) const;
virtual void computeImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, Mat& descriptors, bool returnFloatDescr ) const;
/* function inherited from Algorithm */
AlgorithmInfo* info() const;
......@@ -217,49 +203,24 @@ class CV_EXPORTS_W BinaryDescriptor : public Algorithm
/* conversion of an LBD descriptor to its binary representation */
unsigned char binaryConversion( float* f1, float* f2 );
/* compute LBD descriptors */
int computeLBD( ScaleLines &keyLines, int flags );
/* compute LBD descriptors using EDLine extractor */
int computeLBD_EDL( ScaleLines &keyLines );
/* compute Gaussian pyramid of input image */
void computeGaussianPyramid( const Mat& image );
/* gather lines in groups.
Each group contains the same line, detected in different octaves */
int OctaveKeyLines( ScaleLines &keyLines );
int computeLBD( ScaleLines &keyLines );
/* gather lines in groups using EDLine extractor.
/* gathers lines in groups using EDLine extractor.
Each group contains the same line, detected in different octaves */
int OctaveKeyLines_EDL( cv::Mat& image, ScaleLines &keyLines );
/* get coefficients of line passing by two points (in line_extremes) */
void getLineParameters( cv::Vec4i &line_extremes, cv::Vec3i &lineParams );
/* compute the angle between line and X axis */
float getLineDirection( cv::Vec3i &lineParams );
int OctaveKeyLines( cv::Mat& image, ScaleLines &keyLines );
/* the local gaussian coefficient applied to the orthogonal line direction within each band */
std::vector<float> gaussCoefL_;
/* the global gaussian coefficient applied to each Row within line support region */
/* the global gaussian coefficient applied to each row within line support region */
std::vector<float> gaussCoefG_;
/* vector to store horizontal and vertical derivatives of octave images */
std::vector<cv::Mat> dxImg_vector, dyImg_vector;
/* vector to store sizes of octave images */
std::vector<cv::Size> images_sizes;
/* structure to store lines extracted from each octave image */
std::vector<std::vector<cv::Vec4i> > extractedLines;
/* descriptor parameters */
Params params;
/* vector to store the Gaussian pyramid of an input image */
std::vector<cv::Mat> octaveImages;
/* vector of sizes of downsampled and blurred images */
std::vector<cv::Size> images_sizes;
/*For each octave of image, we define an EDLineDetector, because we can get gradient images (dxImg, dyImg, gImg)
*from the EDLineDetector class without extra computation cost. Another reason is that, if we use
......@@ -269,6 +230,42 @@ class CV_EXPORTS_W BinaryDescriptor : public Algorithm
};
class CV_EXPORTS_W LSDDetector : public Algorithm
{
public:
/* constructor */
LSDDetector()
{
}
/* constructor with smart pointer */
static Ptr<LSDDetector> createLSDDetector();
/* requires line detection (only one image) */
CV_WRAP
void detect( const Mat& image, CV_OUT std::vector<KeyLine>& keypoints, int scale, int numOctaves, const Mat& mask = Mat() );
/* requires line detection (more than one image) */
void detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, int scale, int numOctaves,
const std::vector<Mat>& masks = std::vector<Mat>() ) const;
private:
/* compute Gaussian pyramid of input image */
void computeGaussianPyramid( const Mat& image, int numOctaves, int scale );
/* implementation of line detection */
void detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, int numOctaves, int scale, const Mat& mask ) const;
/* matrices for Gaussian pyramids */
std::vector<cv::Mat> gaussianPyrs;
protected:
/* function inherited from Algorithm */
AlgorithmInfo* info() const;
};
class CV_EXPORTS_W BinaryDescriptorMatcher : public Algorithm
{
......
......@@ -61,20 +61,19 @@ static void help()
<< std::endl;
}
inline void writeMat(cv::Mat m, std::string name, int n)
inline void writeMat( cv::Mat m, std::string name, int n )
{
std::stringstream ss;
std::string s;
ss << n;
ss >> s;
std::string fileNameConf = name + s;
cv::FileStorage fsConf(fileNameConf, cv::FileStorage::WRITE);
cv::FileStorage fsConf( fileNameConf, cv::FileStorage::WRITE );
fsConf << "m" << m;
fsConf.release();
}
int main( int argc, char** argv )
{
/* get parameters from command line */
......@@ -102,7 +101,7 @@ int main( int argc, char** argv )
/* compute lines */
std::vector<KeyLine> keylines;
bd->detect( imageMat, keylines, mask, 1 );
bd->detect( imageMat, keylines, mask );
/* select only lines from first octave */
std::vector<KeyLine> octave0;
......@@ -115,7 +114,7 @@ int main( int argc, char** argv )
/* compute descriptors */
cv::Mat descriptors;
bd->compute( imageMat, octave0, descriptors, false, 1 );
writeMat(descriptors, "bd_descriptors", 0);
bd->compute( imageMat, octave0, descriptors, 1);
writeMat( descriptors, "bd_descriptors", 0 );
}
......@@ -80,7 +80,7 @@ int main( int argc, char** argv )
std::cout << "Error, image could not be loaded. Please, check its path" << std::endl;
}
/* create a ramdom binary mask */
/* create a random binary mask */
cv::Mat mask = Mat::ones( imageMat.size(), CV_8UC1 );
/* create a pointer to a BinaryDescriptor object with default parameters */
......@@ -91,16 +91,15 @@ int main( int argc, char** argv )
/* extract lines */
cv::Mat output = imageMat.clone();
bd->detect( imageMat, lines, mask, 1 );
bd->detect( imageMat, lines, mask );
/* draw lines extracted from octave 0 */
if( output.channels() == 1 )
cvtColor( output, output, COLOR_GRAY2BGR );
for ( size_t i = 0; i < lines.size(); i++ )
{
KeyLine kl = lines[i];
if( kl.octave == 0 /*&& kl.response >0.08*/)
if( kl.octave == 0)
{
/* get a random color */
int R = ( rand() % (int) ( 255 + 1 ) );
......@@ -112,7 +111,7 @@ int main( int argc, char** argv )
Point pt2 = Point( kl.endPointX, kl.endPointY );
/* draw line */
line( output, pt1, pt2, Scalar( B, G, R ), 5 );
line( output, pt1, pt2, Scalar( B, G, R ), 3 );
}
}
......
......@@ -128,13 +128,4 @@ int main( int argc, char** argv )
std::vector<std::vector<DMatch> > matches;
bdm->radiusMatch( queries, matches, 30 );
/* print matches */
for ( size_t q = 0; q < matches.size(); q++ )
{
for ( size_t m = 0; m < matches[q].size(); m++ )
{
DMatch dm = matches[q][m];
std::cout << "Descriptor: " << q << " Image: " << dm.imgIdx << " Distance: " << dm.distance << std::endl;
}
}
}
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2014, Biagio Montesano, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#include "precomp.hpp"
using namespace cv;
Ptr<LSDDetector> LSDDetector::createLSDDetector()
{
return Ptr<LSDDetector>( new LSDDetector() );
}
/* compute Gaussian pyramid of input image */
void LSDDetector::computeGaussianPyramid( const Mat& image, int numOctaves, int scale )
{
/* clear class fields */
gaussianPyrs.clear();
/* insert input image into pyramid */
cv::Mat currentMat = image.clone();
cv::GaussianBlur( currentMat, currentMat, cv::Size( 5, 5 ), 1 );
gaussianPyrs.push_back( currentMat );
/* fill Gaussian pyramid */
for ( int pyrCounter = 1; pyrCounter < numOctaves; pyrCounter++ )
{
/* compute and store next image in pyramid and its size */
pyrDown( currentMat, currentMat, Size( currentMat.cols / scale, currentMat.rows / scale ) );
gaussianPyrs.push_back( currentMat );
}
}
/* check lines' extremes */
inline void checkLineExtremes( cv::Vec4i& extremes, cv::Size imageSize )
{
if( extremes[0] < 0 )
extremes[0] = 0;
if( extremes[0] >= imageSize.width )
extremes[0] = imageSize.width - 1;
if( extremes[2] < 0 )
extremes[2] = 0;
if( extremes[2] >= imageSize.width )
extremes[2] = imageSize.width - 1;
if( extremes[1] < 0 )
extremes[1] = 0;
if( extremes[1] >= imageSize.height )
extremes[1] = imageSize.height - 1;
if( extremes[3] < 0 )
extremes[3] = 0;
if( extremes[3] >= imageSize.height )
extremes[3] = imageSize.height - 1;
}
/* requires line detection (only one image) */
void LSDDetector::detect( const Mat& image, CV_OUT std::vector<KeyLine>& keylines, int scale, int numOctaves, const Mat& mask )
{
if( mask.data != NULL && ( mask.size() != image.size() || mask.type() != CV_8UC1 ) )
{
std::cout << "Mask error while detecting lines: " << "please check its dimensions and that data type is CV_8UC1" << std::endl;
CV_Assert( false );
}
else
detectImpl( image, keylines, numOctaves, scale, mask );
}
/* requires line detection (more than one image) */
void LSDDetector::detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, int scale, int numOctaves,
const std::vector<Mat>& masks ) const
{
/* detect lines from each image */
for ( size_t counter = 0; counter < images.size(); counter++ )
{
if( masks[counter].data != NULL && ( masks[counter].size() != images[counter].size() || masks[counter].type() != CV_8UC1 ) )
{
std::cout << "Masks error while detecting lines: " << "please check their dimensions and that data types are CV_8UC1" << std::endl;
CV_Assert( false );
}
detectImpl( images[counter], keylines[counter], numOctaves, scale, masks[counter] );
}
}
/* implementation of line detection */
void LSDDetector::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, int numOctaves, int scale, const Mat& mask ) const
{
cv::Mat image;
if( imageSrc.channels() != 1 )
cvtColor( imageSrc, image, COLOR_BGR2GRAY );
else
image = imageSrc.clone();
/*check whether image depth is different from 0 */
if( image.depth() != 0 )
{
std::cout << "Warning, depth image!= 0" << std::endl;
CV_Assert( false );
}
/* create a pointer to self */
LSDDetector *lsd = const_cast<LSDDetector*>( this );
/* compute Gaussian pyramids */
lsd->computeGaussianPyramid( image, numOctaves, scale );
/* create an LSD extractor */
cv::Ptr<cv::LineSegmentDetector> ls = cv::createLineSegmentDetector( cv::LSD_REFINE_ADV );
/* prepare a vector to host extracted segments */
std::vector<std::vector<cv::Vec4i> > lines_lsd;
/* extract lines */
for ( int i = 0; i < numOctaves; i++ )
{
std::vector<Vec4i> octave_lines;
ls->detect( gaussianPyrs[i], octave_lines );
lines_lsd.push_back( octave_lines );
}
/* create keylines */
for ( int j = 0; j < (int) lines_lsd.size(); j++ )
{
for ( int k = 0; k < (int) lines_lsd[j].size(); k++ )
{
KeyLine kl;
cv::Vec4i extremes = lines_lsd[j][k];
/* check data validity */
checkLineExtremes( extremes, gaussianPyrs[j].size() );
/* fill KeyLine's fields */
kl.startPointX = extremes[0];
kl.startPointY = extremes[1];
kl.endPointX = extremes[2];
kl.endPointY = extremes[3];
kl.sPointInOctaveX = extremes[0];
kl.sPointInOctaveY = extremes[1];
kl.ePointInOctaveX = extremes[2];
kl.ePointInOctaveY = extremes[3];
kl.lineLength = sqrt( pow( extremes[0] - extremes[2], 2 ) + pow( extremes[1] - extremes[3], 2 ) );
/* compute number of pixels covered by line */
LineIterator li( gaussianPyrs[j], Point( extremes[0], extremes[1] ), Point( extremes[2], extremes[3] ) );
kl.numOfPixels = li.count;
kl.angle = atan2( ( kl.endPointY - kl.startPointY ), ( kl.endPointX - kl.startPointX ) );
kl.class_id = k;
kl.octave = j;
kl.size = ( kl.endPointX - kl.startPointX ) * ( kl.endPointY - kl.startPointY );
kl.response = kl.lineLength / max( gaussianPyrs[j].cols, gaussianPyrs[j].rows );
kl.pt = Point( ( kl.endPointX + kl.startPointX ) / 2, ( kl.endPointY + kl.startPointY ) / 2 );
keylines.push_back( kl );
}
}
/* delete undesired KeyLines, according to input mask */
if( !mask.empty() )
{
for ( size_t keyCounter = 0; keyCounter < keylines.size(); keyCounter++ )
{
KeyLine kl = keylines[keyCounter];
if( mask.at<uchar>( kl.startPointY, kl.startPointX ) == 0 && mask.at<uchar>( kl.endPointY, kl.endPointX ) == 0 )
keylines.erase( keylines.begin() + keyCounter );
}
}
}
......@@ -154,9 +154,10 @@ BinaryDescriptor::BinaryDescriptor( const BinaryDescriptor::Params &parameters )
params( parameters )
{
/* reserve enough space for EDLine objects */
/* reserve enough space for EDLine objects and images in Gaussian pyramid */
edLineVec_.resize( params.numOfOctave_ );
images_sizes.resize( params.numOfOctave_ );
for ( unsigned int i = 0; i < params.numOfOctave_; i++ )
edLineVec_[i] = new EDLineDetector;
......@@ -196,7 +197,7 @@ BinaryDescriptor::BinaryDescriptor( const BinaryDescriptor::Params &parameters )
/* definition of operator () */
void BinaryDescriptor::operator()( InputArray image, InputArray mask, CV_OUT std::vector<KeyLine>& keylines, OutputArray descriptors,
bool useProvidedKeyLines, bool returnFloatDescr, int flags ) const
bool useProvidedKeyLines, bool returnFloatDescr ) const
{
/* create some matrix objects */
......@@ -214,10 +215,10 @@ void BinaryDescriptor::operator()( InputArray image, InputArray mask, CV_OUT std
/* require drawing KeyLines detection if demanded */
if( !useProvidedKeyLines )
detectImpl( imageMat, keylines, flags, maskMat );
detectImpl( imageMat, keylines, maskMat );
/* compute descriptors */
computeImpl( imageMat, keylines, descrMat, returnFloatDescr, flags );
computeImpl( imageMat, keylines, descrMat, returnFloatDescr );
}
BinaryDescriptor::~BinaryDescriptor()
......@@ -254,12 +255,6 @@ int BinaryDescriptor::descriptorSize() const
return 32 * 8;
}
/* check whether Gaussian pyramids were created */
bool BinaryDescriptor::empty() const
{
return octaveImages.empty();
}
/* power function with error management */
static inline int get2Pow( int i )
{
......@@ -287,109 +282,8 @@ unsigned char BinaryDescriptor::binaryConversion( float* f1, float* f2 )
}
/* get coefficients of line passing by two points in (line_extremes) */
void BinaryDescriptor::getLineParameters( cv::Vec4i& line_extremes, cv::Vec3i& lineParams )
{
int x1 = line_extremes[0];
int x2 = line_extremes[2];
int y1 = line_extremes[1];
int y2 = line_extremes[3];
/* line is parallel to Y axis */
if( x1 == x2 )
{
lineParams[0] = 1;
lineParams[1] = 0;
lineParams[2] = x1 /* or x2 */;
}
/* line is parallel to X axis */
else if( y1 == y2 )
{
lineParams[0] = 0;
lineParams[1] = 1;
lineParams[2] = y1 /* or y2 */;
}
/* line is not parallel to any axis */
else
{
lineParams[0] = y1 - y2;
lineParams[1] = x2 - x1;
lineParams[2] = -y1 * ( x2 - x1 ) + x1 * ( y2 - y1 );
}
}
/* compute the angle between line and X axis */
float BinaryDescriptor::getLineDirection( cv::Vec3i &lineParams )
{
/* line is parallel to X axis */
if( lineParams[0] == 0 )
return 0;
/* line is parallel to Y axis */
else if( lineParams[1] == 0 )
return M_PI / 2;
/* line is not parallel to any axis */
else
return atan2( -lineParams[0], lineParams[1] );
}
/* compute Gaussian pyramid of input image */
void BinaryDescriptor::computeGaussianPyramid( const Mat& image )
{
/* clear class fields */
images_sizes.clear();
octaveImages.clear();
extractedLines.clear();
/* insert input image into pyramid */
cv::Mat currentMat = image.clone();
cv::GaussianBlur( currentMat, currentMat, cv::Size( 5, 5 ), 1 );
octaveImages.push_back( currentMat );
images_sizes.push_back( currentMat.size() );
/* fill Gaussian pyramid */
for ( int pyrCounter = 1; pyrCounter < params.numOfOctave_; pyrCounter++ )
{
/* compute and store next image in pyramid and its size */
pyrDown( currentMat, currentMat, Size( currentMat.cols / params.reductionRatio, currentMat.rows / params.reductionRatio ) );
octaveImages.push_back( currentMat );
images_sizes.push_back( currentMat.size() );
}
}
/* check lines' extremes */
inline void checkLineExtremes( cv::Vec4i& extremes, cv::Size imageSize )
{
if( extremes[0] < 0 )
extremes[0] = 0;
if( extremes[0] >= imageSize.width )
extremes[0] = imageSize.width - 1;
if( extremes[2] < 0 )
extremes[2] = 0;
if( extremes[2] >= imageSize.width )
extremes[2] = imageSize.width - 1;
if( extremes[1] < 0 )
extremes[1] = 0;
if( extremes[1] >= imageSize.height )
extremes[1] = imageSize.height - 1;
if( extremes[3] < 0 )
extremes[3] = 0;
if( extremes[3] >= imageSize.height )
extremes[3] = imageSize.height - 1;
}
/* requires line detection (only one image) */
void BinaryDescriptor::detect( const Mat& image, CV_OUT std::vector<KeyLine>& keylines, const Mat& mask, int flags )
void BinaryDescriptor::detect( const Mat& image, CV_OUT std::vector<KeyLine>& keylines, const Mat& mask )
{
if( mask.data != NULL && ( mask.size() != image.size() || mask.type() != CV_8UC1 ) )
{
......@@ -400,12 +294,11 @@ void BinaryDescriptor::detect( const Mat& image, CV_OUT std::vector<KeyLine>& ke
}
else
detectImpl( image, keylines, flags, mask );
detectImpl( image, keylines, mask );
}
/* requires line detection (more than one image) */
void BinaryDescriptor::detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, const std::vector<Mat>& masks,
int flags ) const
void BinaryDescriptor::detect( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, const std::vector<Mat>& masks ) const
{
/* detect lines from each image */
for ( size_t counter = 0; counter < images.size(); counter++ )
......@@ -418,11 +311,11 @@ void BinaryDescriptor::detect( const std::vector<Mat>& images, std::vector<std::
CV_Assert( false );
}
detectImpl( images[counter], keylines[counter], flags, masks[counter] );
detectImpl( images[counter], keylines[counter], masks[counter] );
}
}
void BinaryDescriptor::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, int flags, const Mat& mask ) const
void BinaryDescriptor::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, const Mat& mask ) const
{
cv::Mat image;
......@@ -441,33 +334,9 @@ void BinaryDescriptor::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& ke
/* create a pointer to self */
BinaryDescriptor *bn = const_cast<BinaryDescriptor*>( this );
/* compute Gaussian pyramid */
if( flags == 0 )
bn->computeGaussianPyramid( image );
/* detect and arrange lines across octaves */
ScaleLines sl;
Mat m = image.clone();
cvtColor( m, m, COLOR_GRAY2BGR );
if( flags == 0 )
bn->OctaveKeyLines( sl );
else
bn->OctaveKeyLines_EDL( image, sl );
Mat temp = image.clone();
cvtColor( temp, temp, COLOR_GRAY2BGR );
for ( size_t i = 0; i < sl.size(); i++ )
{
for ( size_t j = 0; j < sl[i].size(); j++ )
{
OctaveSingleLine tempOSL = sl[i][j];
line( m, Point( tempOSL.startPointX, tempOSL.startPointY ), Point( tempOSL.endPointX, tempOSL.endPointY ), Scalar( 255, 0, 0 ), 5 );
}
}
imshow( "Immagine", m );
waitKey();
bn->OctaveKeyLines( image, sl );
/* fill KeyLines vector */
for ( int i = 0; i < (int) sl.size(); i++ )
......@@ -480,9 +349,6 @@ void BinaryDescriptor::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& ke
/* create a KeyLine object */
KeyLine kl;
/* check data validity */
// cv::Vec4i extremes( osl.startPointX, osl.startPointY, osl.endPointX, osl.endPointY );
// checkLineExtremes( extremes, imageSrc.size() );
/* fill KeyLine's fields */
kl.startPointX = osl.startPointX; //extremes[0];
kl.startPointY = osl.startPointY; //extremes[1];
......@@ -522,22 +388,22 @@ void BinaryDescriptor::detectImpl( const Mat& imageSrc, std::vector<KeyLine>& ke
}
/* requires descriptors computation (only one image) */
void BinaryDescriptor::compute( const Mat& image, CV_OUT CV_IN_OUT std::vector<KeyLine>& keylines, CV_OUT Mat& descriptors, bool returnFloatDescr,
int flags ) const
void BinaryDescriptor::compute( const Mat& image, CV_OUT CV_IN_OUT std::vector<KeyLine>& keylines, CV_OUT Mat& descriptors,
bool returnFloatDescr ) const
{
computeImpl( image, keylines, descriptors, returnFloatDescr, flags );
computeImpl( image, keylines, descriptors, returnFloatDescr );
}
/* requires descriptors computation (more than one image) */
void BinaryDescriptor::compute( const std::vector<Mat>& images, std::vector<std::vector<KeyLine> >& keylines, std::vector<Mat>& descriptors,
bool returnFloatDescr, int flags ) const
bool returnFloatDescr ) const
{
for ( size_t i = 0; i < images.size(); i++ )
computeImpl( images[i], keylines[i], descriptors[i], returnFloatDescr, flags );
computeImpl( images[i], keylines[i], descriptors[i], returnFloatDescr );
}
/* implementation of descriptors computation */
void BinaryDescriptor::computeImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, Mat& descriptors, bool returnFloatDescr, int flags ) const
void BinaryDescriptor::computeImpl( const Mat& imageSrc, std::vector<KeyLine>& keylines, Mat& descriptors, bool returnFloatDescr ) const
{
/* convert input image to gray scale */
cv::Mat image;
......@@ -620,421 +486,64 @@ void BinaryDescriptor::computeImpl( const Mat& imageSrc, std::vector<KeyLine>& k
}
}
/* compute Gaussian pyramid, if image is new or pyramid was not
computed before */
BinaryDescriptor *bn = const_cast<BinaryDescriptor*>( this );
/* all structures cleared in computeGaussianPyramid */
bn->computeGaussianPyramid( image );
/* compute Sobel's derivatives */
bn->dxImg_vector.clear();
bn->dyImg_vector.clear();
bn->dxImg_vector.resize( params.numOfOctave_ );
bn->dyImg_vector.resize( params.numOfOctave_ );
for ( size_t sobelCnt = 0; sobelCnt < octaveImages.size(); sobelCnt++ )
{
bn->dxImg_vector[sobelCnt].create( images_sizes[sobelCnt].height, images_sizes[sobelCnt].width, CV_16SC1 );
bn->dyImg_vector[sobelCnt].create( images_sizes[sobelCnt].height, images_sizes[sobelCnt].width, CV_16SC1 );
cv::Sobel( octaveImages[sobelCnt], bn->dxImg_vector[sobelCnt], CV_16SC1, 1, 0, 3 );
cv::Sobel( octaveImages[sobelCnt], bn->dyImg_vector[sobelCnt], CV_16SC1, 0, 1, 3 );
}
/* compute LBD descriptors */
if(flags == 0)
bn->computeLBD( sl, flags );
else
bn->computeLBD_EDL(sl);
/* resize output matrix */
if( !returnFloatDescr )
descriptors = cv::Mat( keylines.size(), 32, CV_8UC1 );
else
descriptors = cv::Mat( keylines.size(), NUM_OF_BANDS * 8, CV_32FC1 );
/* fill output matrix with descriptors */
for ( size_t k = 0; k < sl.size(); k++ )
{
for ( size_t lineC = 0; lineC < sl[k].size(); lineC++ )
{
/* get original index of keypoint */
int lineOctave = ( sl[k][lineC] ).octaveCount;
int originalIndex = correspondences.find( std::pair<int, int>( k, lineOctave ) )->second;
if( !returnFloatDescr )
{
/* get a pointer to correspondent row in output matrix */
uchar* pointerToRow = descriptors.ptr( originalIndex );
/* get LBD data */
float* desVec = sl[k][lineC].descriptor.data();
/* fill current row with binary descriptor */
for ( int comb = 0; comb < 32; comb++ )
{
*pointerToRow = bn->binaryConversion( &desVec[8 * combinations[comb][0]], &desVec[8 * combinations[comb][1]] );
pointerToRow++;
}
}
else
{
/* get a pointer to correspondent row in output matrix */
uchar* pointerToRow = descriptors.ptr( originalIndex );
/* get LBD data */
std::vector<float> desVec = sl[k][lineC].descriptor;
for ( size_t count = 0; count < desVec.size(); count++ )
{
*pointerToRow = desVec[count];
pointerToRow++;
}
}
}
}
}
/* gather lines in groups. Each group contains the same line, detected in different octaves */
int BinaryDescriptor::OctaveKeyLines( ScaleLines &keyLines )
{
/* final number of extracted lines */
unsigned int numOfFinalLine = 0;
std::vector<float> prec, w_idth, nfa;
for ( size_t scaleCounter = 0; scaleCounter < octaveImages.size(); scaleCounter++ )
{
/* get current scaled image */
cv::Mat currentScaledImage = octaveImages[scaleCounter];
/* create an LSD detector and store a pointer to it */
cv::Ptr<cv::LineSegmentDetector> ls = cv::createLineSegmentDetector( cv::LSD_REFINE_ADV, 0.8, 0.6, 2.0, 22.5 );
/* prepare a vector to host extracted segments */
std::vector<cv::Vec4i> lines_std;
/* use detector to extract segments */
ls->detect( currentScaledImage, lines_std, w_idth, prec/*, nfa*/);
/* store lines extracted from current image */
extractedLines.push_back( lines_std );
/* update lines counter */
numOfFinalLine += lines_std.size();
}
/* prepare a vector to store octave information associated to extracted lines */
std::vector<OctaveLine> octaveLines( numOfFinalLine );
/* set lines' counter to 0 for reuse */
numOfFinalLine = 0;
/* counter to give a unique ID to lines in LineVecs */
unsigned int lineIDInScaleLineVec = 0;
/* floats to compute lines' lengths */
float dx, dy;
/* loop over lines extracted from scale 0 (original image) */
for ( unsigned int lineCurId = 0; lineCurId < extractedLines[0].size(); lineCurId++ )
{
/* set octave from which it was extracted */
octaveLines[numOfFinalLine].octaveCount = 0;
/* set ID within its octave */
octaveLines[numOfFinalLine].lineIDInOctave = lineCurId;
/* set a unique ID among all lines extracted in all octaves */
octaveLines[numOfFinalLine].lineIDInScaleLineVec = lineIDInScaleLineVec;
/* compute absolute value of difference between X coordinates of line's extreme points */
dx = fabs( ( extractedLines[0][lineCurId] )[0] - ( extractedLines[0][lineCurId] )[2] );
/* compute absolute value of difference between Y coordinates of line's extreme points */
dy = fabs( ( extractedLines[0][lineCurId] )[1] - ( extractedLines[0][lineCurId] )[3] );
/* compute line's length */
octaveLines[numOfFinalLine].lineLength = sqrt( dx * dx + dy * dy );
/* update counters */
numOfFinalLine++;
lineIDInScaleLineVec++;
}
/* create and fill an array to store scale factors */
float *scale = new float[params.numOfOctave_];
scale[0] = 1;
for ( unsigned int octaveCount = 1; octaveCount < (unsigned int) params.numOfOctave_; octaveCount++ )
{
scale[octaveCount] = params.reductionRatio * scale[octaveCount - 1];
}
/* some variables' declarations */
float rho1, rho2, tempValue;
float direction, near, length;
unsigned int octaveID, lineIDInOctave;
/*more than one octave image, organize lines in scale space.
*lines corresponding to the same line in octave images should have the same index in the ScaleLineVec */
if( params.numOfOctave_ > 1 )
{
/* some other variables' declarations */
float twoPI = 2 * M_PI;
unsigned int closeLineID;
float endPointDis, minEndPointDis, minLocalDis, maxLocalDis;
float lp0, lp1, lp2, lp3, np0, np1, np2, np3;
/* loop over list of octaves */
for ( unsigned int octaveCount = 1; octaveCount < (unsigned int) params.numOfOctave_; octaveCount++ )
{
/*for each line in current octave image, find their corresponding lines
in the octaveLines,
give them the same value of lineIDInScaleLineVec*/
/* loop over list of lines extracted from current octave */
for ( unsigned int lineCurId = 0; lineCurId < extractedLines[octaveCount].size(); lineCurId++ )
{
/* get (scaled) known term from equation of current line */
cv::Vec3i line_equation;
getLineParameters( extractedLines[octaveCount][lineCurId], line_equation );
rho1 = scale[octaveCount] * fabs( line_equation[2] );
/*nearThreshold depends on the distance of the image coordinate origin to current line.
*so nearThreshold = rho1 * nearThresholdRatio, where nearThresholdRatio = 1-cos(10*pi/180) = 0.0152*/
tempValue = rho1 * 0.0152;
float nearThreshold = ( tempValue > 6 ) ? ( tempValue ) : 6;
nearThreshold = ( nearThreshold < 12 ) ? nearThreshold : 12;
/* compute scaled length of current line */
dx = fabs( ( extractedLines[octaveCount][lineCurId] )[0] - ( extractedLines[octaveCount][lineCurId][2] ) ); //x1-x2
dy = fabs( ( extractedLines[octaveCount][lineCurId] )[1] - ( extractedLines[octaveCount][lineCurId][3] ) ); //y1-y2
length = scale[octaveCount] * sqrt( dx * dx + dy * dy );
minEndPointDis = 12;
/* loop over the octave representations of all lines */
for ( unsigned int lineNextId = 0; lineNextId < numOfFinalLine; lineNextId++ )
{
/* if a line from same octave is encountered,
a comparison with it shouldn't be considered */
octaveID = octaveLines[lineNextId].octaveCount;
if( octaveID == octaveCount )
break;
/* take ID (in octave) of line to be compared */
lineIDInOctave = octaveLines[lineNextId].lineIDInOctave;
/* compute difference between lines' directions, to check
whether they are parallel */
cv::Vec3i line_equation_to_compare;
getLineParameters( extractedLines[octaveID][lineIDInOctave], line_equation_to_compare );
direction = fabs( getLineDirection( line_equation ) - getLineDirection( line_equation_to_compare ) );
/* if the angle between the two lines is larger than 10 degrees
(i.e. 10*pi/180=0.1745), they are not close to parallel */
if( direction > 0.1745 && ( twoPI - direction > 0.1745 ) )
continue;
/*now check whether current line and next line are near to each other.
Get known term from equation to be compared */
rho2 = scale[octaveID] * fabs( line_equation_to_compare[2] );
/* compute difference between known terms */
near = fabs( rho1 - rho2 );
/* two lines are not close in the image */
if( near > nearThreshold )
continue;
/* get the extreme points of the two lines */
lp0 = scale[octaveCount] * ( extractedLines[octaveCount][lineCurId] )[0];
lp1 = scale[octaveCount] * ( extractedLines[octaveCount][lineCurId] )[1];
lp2 = scale[octaveCount] * ( extractedLines[octaveCount][lineCurId] )[2];
lp3 = scale[octaveCount] * ( extractedLines[octaveCount][lineCurId] )[3];
np0 = scale[octaveID] * ( extractedLines[octaveID][lineIDInOctave] )[0];
np1 = scale[octaveID] * ( extractedLines[octaveID][lineIDInOctave] )[1];
np2 = scale[octaveID] * ( extractedLines[octaveID][lineIDInOctave] )[2];
np3 = scale[octaveID] * ( extractedLines[octaveID][lineIDInOctave] )[3];
/* get the distance between the two leftmost extremes of lines
L1(0,1)<->L2(0,1) */
dx = lp0 - np0;
dy = lp1 - np1;
endPointDis = sqrt( dx * dx + dy * dy );
/* temporarily set min and max distance between lines to
the one between left extremes */
minLocalDis = endPointDis;
maxLocalDis = endPointDis;
/* compute distance between right extremes
L1(2,3)<->L2(2,3) */
dx = lp2 - np2;
dy = lp3 - np3;
endPointDis = sqrt( dx * dx + dy * dy );
/* update (if necessary) min and max distance between lines */
minLocalDis = ( endPointDis < minLocalDis ) ? endPointDis : minLocalDis;
maxLocalDis = ( endPointDis > maxLocalDis ) ? endPointDis : maxLocalDis;
/* compute distance between left extreme of current line and
right extreme of line to be compared
L1(0,1)<->L2(2,3) */
dx = lp0 - np2;
dy = lp1 - np3;
endPointDis = sqrt( dx * dx + dy * dy );
/* update (if necessary) min and max distance between lines */
minLocalDis = ( endPointDis < minLocalDis ) ? endPointDis : minLocalDis;
maxLocalDis = ( endPointDis > maxLocalDis ) ? endPointDis : maxLocalDis;
BinaryDescriptor* bd = const_cast<BinaryDescriptor*>( this );
bd->computeLBD( sl );
/* compute distance between right extreme of current line and
left extreme of line to be compared
L1(2,3)<->L2(0,1) */
dx = lp2 - np0;
dy = lp3 - np1;
endPointDis = sqrt( dx * dx + dy * dy );
/* update (if necessary) min and max distance between lines */
minLocalDis = ( endPointDis < minLocalDis ) ? endPointDis : minLocalDis;
maxLocalDis = ( endPointDis > maxLocalDis ) ? endPointDis : maxLocalDis;
/* check whether the conditions for considering the line being compared
worth inserting in the same LineVec are satisfied */
if( ( maxLocalDis < 0.8 * ( length + octaveLines[lineNextId].lineLength ) ) && ( minLocalDis < minEndPointDis ) )
{
/* keep the closest line */
minEndPointDis = minLocalDis;
closeLineID = lineNextId;
}
}
/* add current line into octaveLines */
if( minEndPointDis < 12 )
octaveLines[numOfFinalLine].lineIDInScaleLineVec = octaveLines[closeLineID].lineIDInScaleLineVec;
else
{
octaveLines[numOfFinalLine].lineIDInScaleLineVec = lineIDInScaleLineVec;
lineIDInScaleLineVec++;
}
octaveLines[numOfFinalLine].octaveCount = octaveCount;
octaveLines[numOfFinalLine].lineIDInOctave = lineCurId;
octaveLines[numOfFinalLine].lineLength = length;
numOfFinalLine++;
}
}
}
/* Reorganize the detected lines into keyLines */
keyLines.clear();
keyLines.resize( lineIDInScaleLineVec );
unsigned int tempID;
float s1, e1, s2, e2;
bool shouldChange;
OctaveSingleLine singleLine;
for ( unsigned int lineID = 0; lineID < numOfFinalLine; lineID++ )
{
lineIDInOctave = octaveLines[lineID].lineIDInOctave;
octaveID = octaveLines[lineID].octaveCount;
cv::Vec3i tempParams;
getLineParameters( extractedLines[octaveID][lineIDInOctave], tempParams );
singleLine.octaveCount = octaveID;
singleLine.lineLength = octaveLines[lineID].lineLength;
// decide the start point and end point
shouldChange = false;
s1 = ( extractedLines[octaveID][lineIDInOctave] )[0]; //sx
s2 = ( extractedLines[octaveID][lineIDInOctave] )[1]; //sy
e1 = ( extractedLines[octaveID][lineIDInOctave] )[2]; //ex
e2 = ( extractedLines[octaveID][lineIDInOctave] )[3]; //ey
dx = e1 - s1; //ex-sx
dy = e2 - s2; //ey-sy
if( direction >= -0.75 * M_PI && direction < -0.25 * M_PI )
{
if( dy > 0 )
shouldChange = true;
}
if( direction >= -0.25 * M_PI && direction < 0.25 * M_PI )
if( dx < 0 )
{
shouldChange = true;
}
if( direction >= 0.25 * M_PI && direction < 0.75 * M_PI )
if( dy < 0 )
{
shouldChange = true;
}
if( ( direction >= 0.75 * M_PI && direction < M_PI ) || ( direction >= -M_PI && direction < -0.75 * M_PI ) )
{
if( dx > 0 )
shouldChange = true;
}
tempValue = scale[octaveID];
if( shouldChange )
{
singleLine.sPointInOctaveX = e1;
singleLine.sPointInOctaveY = e2;
singleLine.ePointInOctaveX = s1;
singleLine.ePointInOctaveY = s2;
singleLine.startPointX = tempValue * e1;
singleLine.startPointY = tempValue * e2;
singleLine.endPointX = tempValue * s1;
singleLine.endPointY = tempValue * s2;
}
/* resize output matrix */
if( !returnFloatDescr )
descriptors = cv::Mat( keylines.size(), 32, CV_8UC1 );
else
descriptors = cv::Mat( keylines.size(), NUM_OF_BANDS * 8, CV_32FC1 );
/* fill output matrix with descriptors */
for ( int k = 0; k < (int)sl.size(); k++ )
{
singleLine.sPointInOctaveX = s1;
singleLine.sPointInOctaveY = s2;
singleLine.ePointInOctaveX = e1;
singleLine.ePointInOctaveY = e2;
singleLine.startPointX = tempValue * s1;
singleLine.startPointY = tempValue * s2;
singleLine.endPointX = tempValue * e1;
singleLine.endPointY = tempValue * e2;
}
for ( int lineC = 0; lineC < (int)sl[k].size(); lineC++ )
{
/* get original index of keypoint */
int lineOctave = ( sl[k][lineC] ).octaveCount;
int originalIndex = correspondences.find( std::pair<int, int>( k, lineOctave ) )->second;
singleLine.direction = atan2( ( singleLine.endPointY - singleLine.startPointY ), ( singleLine.endPointX - singleLine.startPointX ) );
if( !returnFloatDescr )
{
/* get a pointer to correspondent row in output matrix */
uchar* pointerToRow = descriptors.ptr( originalIndex );
tempID = octaveLines[lineID].lineIDInScaleLineVec;
/* get LBD data */
float* desVec = sl[k][lineC].descriptor.data();
/* compute number of pixels covered by line */
LineIterator li( octaveImages[octaveID], Point( singleLine.startPointX, singleLine.startPointY ),
Point( singleLine.endPointX, singleLine.endPointY ) );
/* fill current row with binary descriptor */
for ( int comb = 0; comb < 32; comb++ )
{
*pointerToRow = bd->binaryConversion( &desVec[8 * combinations[comb][0]], &desVec[8 * combinations[comb][1]] );
pointerToRow++;
}
}
singleLine.numOfPixels = li.count;
else
{
std::cout << "Descrittori float" <<std::endl;
/* get a pointer to correspondent row in output matrix */
float* pointerToRow = descriptors.ptr<float>( originalIndex );
/* store line */
keyLines[tempID].push_back( singleLine );
/* get LBD data */
std::vector<float> desVec = sl[k][lineC].descriptor;
for ( int count = 0; count < (int)desVec.size(); count++ )
{
*pointerToRow = desVec[count];
pointerToRow++;
}
}
delete[] scale;
return 1;
}
}
}
int BinaryDescriptor::OctaveKeyLines_EDL( cv::Mat& image, ScaleLines &keyLines )
int BinaryDescriptor::OctaveKeyLines( cv::Mat& image, ScaleLines &keyLines )
{
/* final number of extracted lines */
......@@ -1056,6 +565,7 @@ int BinaryDescriptor::OctaveKeyLines_EDL( cv::Mat& image, ScaleLines &keyLines )
float increaseSigma = sqrt( curSigma2 - preSigma2 );
cv::GaussianBlur( image, blur, cv::Size( params.ksize_, params.ksize_ ), increaseSigma );
images_sizes[octaveCount] = blur.size();
/* for current octave, extract lines */
if( ( edLineVec_[octaveCount]->EDline( blur, true ) ) != true )
{
......@@ -1074,9 +584,6 @@ int BinaryDescriptor::OctaveKeyLines_EDL( cv::Mat& image, ScaleLines &keyLines )
} /* end of loop over number of octaves */
/*lines which correspond to the same line in the octave images will be stored
in the same element of ScaleLines.*/
/* prepare a vector to store octave information associated to extracted lines */
std::vector<OctaveLine> octaveLines( numOfFinalLine );
......@@ -1370,21 +877,18 @@ int BinaryDescriptor::OctaveKeyLines_EDL( cv::Mat& image, ScaleLines &keyLines )
keyLines[tempID].push_back( singleLine );
}
////////////////////////////////////
delete[] scale;
return 1;
}
/* compute LBD descriptors */
int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
int BinaryDescriptor::computeLBD( ScaleLines &keyLines )
{
//the default length of the band is the line length.
short numOfFinalLine = keyLines.size();
float *dL = new float[2]; //line direction cos(dir), sin(dir)
float *dO = new float[2]; //the clockwise orthogonal vector of line direction.
short heightOfLSP = params.widthOfBand_ * NUM_OF_BANDS; //the height of line support region;
short descriptor_size = NUM_OF_BANDS * 8; //each band, we compute the m( pgdL, ngdL, pgdO, ngdO) and std( pgdL, ngdL, pgdO, ngdO);
short descriptorSize = NUM_OF_BANDS * 8; //each band, we compute the m( pgdL, ngdL, pgdO, ngdO) and std( pgdL, ngdL, pgdO, ngdO);
float pgdLRowSum; //the summation of {g_dL |g_dL>0 } for each row of the region;
float ngdLRowSum; //the summation of {g_dL |g_dL<0 } for each row of the region;
float pgdL2RowSum; //the summation of {g_dL^2 |g_dL>0 } for each row of the region;
......@@ -1433,26 +937,14 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
pSingleLine = & ( keyLines[lineIDInScaleVec][lineIDInSameLine] );
octaveCount = pSingleLine->octaveCount;
/* retrieve associated dxImg and dyImg*/
if( flags == 1 )
{
/* retrieve associated dxImg and dyImg */
pdxImg = edLineVec_[octaveCount]->dxImg_.ptr<short>();
pdyImg = edLineVec_[octaveCount]->dyImg_.ptr<short>();
}
else
{
pdxImg = dxImg_vector[octaveCount].ptr<short>();
pdyImg = dyImg_vector[octaveCount].ptr<short>();
}
/* get image size to work on from real one
/* get image size to work on from real one */
realWidth = edLineVec_[octaveCount]->imageWidth;
imageWidth = realWidth -1;
imageHeight = edLineVec_[octaveCount]->imageHeight-1; */
realWidth = images_sizes[octaveCount].width;
imageWidth = realWidth - 1;
imageHeight = images_sizes[octaveCount].height - 1;
imageHeight = edLineVec_[octaveCount]->imageHeight - 1;
/* initialize memory areas */
memset( pgdLBandSum, 0, numOfBitsBand );
......@@ -1518,22 +1010,18 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
{
pgdLRowSum += gDL;
}
else
{
ngdLRowSum -= gDL;
}
if( gDO > 0 )
{
pgdORowSum += gDO;
}
else
{
ngdORowSum -= gDO;
}
sCorX += dL[0];
sCorY += dL[1];
/* gDLMat[hID][wID] = gDL; */
......@@ -1552,7 +1040,7 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
/* compute {g_dL |g_dL>0 }, {g_dL |g_dL<0 },
{g_dO |g_dO>0 }, {g_dO |g_dO<0 } of each band in the line support region
first, current row belongs to current band */
first, current row belong to current band */
bandID = hID / params.widthOfBand_;
coefInGaussion = gaussCoefL_[hID % params.widthOfBand_ + params.widthOfBand_];
pgdLBandSum[bandID] += coefInGaussion * pgdLRowSum;
......@@ -1580,7 +1068,6 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
pgdO2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdO2RowSum;
ngdO2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdO2RowSum;
}
bandID = bandID + 2;
if( bandID < NUM_OF_BANDS )
{/*the band below the current band */
......@@ -1599,7 +1086,7 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
return 0; */
/* construct line descriptor */
pSingleLine->descriptor.resize( descriptor_size );
pSingleLine->descriptor.resize( descriptorSize );
desVec = pSingleLine->descriptor.data();
short desID;
......@@ -1675,7 +1162,7 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
* vector no larger than this threshold. In Z.Wang's work, a value of 0.4 is found
* empirically to be a proper threshold.*/
desVec = pSingleLine->descriptor.data();
for ( short i = 0; i < descriptor_size; i++ )
for ( short i = 0; i < descriptorSize; i++ )
{
if( desVec[i] > 0.4 )
{
......@@ -1685,19 +1172,30 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
//re-normalize desVec;
temp = 0;
for ( short i = 0; i < descriptor_size; i++ )
for ( short i = 0; i < descriptorSize; i++ )
{
temp += desVec[i] * desVec[i];
}
temp = 1 / sqrt( temp );
for ( short i = 0; i < descriptor_size; i++ )
for ( short i = 0; i < descriptorSize; i++ )
{
desVec[i] = desVec[i] * temp;
}
}/* end for(short lineIDInSameLine = 0; lineIDInSameLine<sameLineSize;
lineIDInSameLine++) */
cv::Mat appoggio = cv::Mat( 1, 32, CV_32FC1 );
float* pointerToRow = appoggio.ptr<float>( 0 );
for ( int g = 0; g < 32; g++ )
{
/* get LBD data */
float* desVec = keyLines[lineIDInScaleVec][0].descriptor.data();
*pointerToRow = desVec[g];
pointerToRow++;
}
}/* end for(short lineIDInScaleVec = 0;
lineIDInScaleVec<numOfFinalLine; lineIDInScaleVec++) */
......@@ -1712,316 +1210,6 @@ int BinaryDescriptor::computeLBD( ScaleLines &keyLines, int flags )
delete[] pgdO2BandSum;
delete[] ngdO2BandSum;
return 1;
}
int BinaryDescriptor::computeLBD_EDL( ScaleLines &keyLines )
{
//the default length of the band is the line length.
short numOfFinalLine = keyLines.size();
float *dL = new float[2];//line direction cos(dir), sin(dir)
float *dO = new float[2];//the clockwise orthogonal vector of line direction.
short heightOfLSP = params.widthOfBand_*NUM_OF_BANDS;//the height of line support region;
short descriptorSize = NUM_OF_BANDS * 8;//each band, we compute the m( pgdL, ngdL, pgdO, ngdO) and std( pgdL, ngdL, pgdO, ngdO);
float pgdLRowSum;//the summation of {g_dL |g_dL>0 } for each row of the region;
float ngdLRowSum;//the summation of {g_dL |g_dL<0 } for each row of the region;
float pgdL2RowSum;//the summation of {g_dL^2 |g_dL>0 } for each row of the region;
float ngdL2RowSum;//the summation of {g_dL^2 |g_dL<0 } for each row of the region;
float pgdORowSum;//the summation of {g_dO |g_dO>0 } for each row of the region;
float ngdORowSum;//the summation of {g_dO |g_dO<0 } for each row of the region;
float pgdO2RowSum;//the summation of {g_dO^2 |g_dO>0 } for each row of the region;
float ngdO2RowSum;//the summation of {g_dO^2 |g_dO<0 } for each row of the region;
float *pgdLBandSum = new float[NUM_OF_BANDS];//the summation of {g_dL |g_dL>0 } for each band of the region;
float *ngdLBandSum = new float[NUM_OF_BANDS];//the summation of {g_dL |g_dL<0 } for each band of the region;
float *pgdL2BandSum = new float[NUM_OF_BANDS];//the summation of {g_dL^2 |g_dL>0 } for each band of the region;
float *ngdL2BandSum = new float[NUM_OF_BANDS];//the summation of {g_dL^2 |g_dL<0 } for each band of the region;
float *pgdOBandSum = new float[NUM_OF_BANDS];//the summation of {g_dO |g_dO>0 } for each band of the region;
float *ngdOBandSum = new float[NUM_OF_BANDS];//the summation of {g_dO |g_dO<0 } for each band of the region;
float *pgdO2BandSum = new float[NUM_OF_BANDS];//the summation of {g_dO^2 |g_dO>0 } for each band of the region;
float *ngdO2BandSum = new float[NUM_OF_BANDS];//the summation of {g_dO^2 |g_dO<0 } for each band of the region;
short numOfBitsBand = NUM_OF_BANDS*sizeof(float);
short lengthOfLSP; //the length of line support region, varies with lines
short halfHeight = (heightOfLSP-1)/2;
short halfWidth;
short bandID;
float coefInGaussion;
float lineMiddlePointX, lineMiddlePointY;
float sCorX, sCorY,sCorX0, sCorY0;
short tempCor, xCor, yCor;//pixel coordinates in image plane
short dx, dy;
float gDL;//store the gradient projection of pixels in support region along dL vector
float gDO;//store the gradient projection of pixels in support region along dO vector
short imageWidth, imageHeight, realWidth;
short *pdxImg, *pdyImg;
float *desVec;
short sameLineSize;
short octaveCount;
OctaveSingleLine *pSingleLine;
/* loop over list of LineVec */
for(short lineIDInScaleVec = 0; lineIDInScaleVec<numOfFinalLine; lineIDInScaleVec++){
sameLineSize = keyLines[lineIDInScaleVec].size();
/* loop over current LineVec's lines */
for(short lineIDInSameLine = 0; lineIDInSameLine<sameLineSize; lineIDInSameLine++){
/* get a line in current LineVec and its original ID in its octave */
pSingleLine = &(keyLines[lineIDInScaleVec][lineIDInSameLine]);
octaveCount = pSingleLine->octaveCount;
/* retrieve associated dxImg and dyImg */
pdxImg = edLineVec_[octaveCount]->dxImg_.ptr<short>();
pdyImg = edLineVec_[octaveCount]->dyImg_.ptr<short>();
/* get image size to work on from real one */
realWidth = edLineVec_[octaveCount]->imageWidth;
imageWidth = realWidth -1;
imageHeight = edLineVec_[octaveCount]->imageHeight-1;
/* initialize memory areas */
memset(pgdLBandSum, 0, numOfBitsBand);
memset(ngdLBandSum, 0, numOfBitsBand);
memset(pgdL2BandSum, 0, numOfBitsBand);
memset(ngdL2BandSum, 0, numOfBitsBand);
memset(pgdOBandSum, 0, numOfBitsBand);
memset(ngdOBandSum, 0, numOfBitsBand);
memset(pgdO2BandSum, 0, numOfBitsBand);
memset(ngdO2BandSum, 0, numOfBitsBand);
/* get length of line and its half */
lengthOfLSP = keyLines[lineIDInScaleVec][lineIDInSameLine].numOfPixels;
halfWidth = (lengthOfLSP-1)/2;
/* get middlepoint of line */
lineMiddlePointX = 0.5 * (pSingleLine->sPointInOctaveX + pSingleLine->ePointInOctaveX);
lineMiddlePointY = 0.5 * (pSingleLine->sPointInOctaveY + pSingleLine->ePointInOctaveY);
/*1.rotate the local coordinate system to the line direction (direction is the angle
between positive line direction and positive X axis)
*2.compute the gradient projection of pixels in line support region*/
/* get the vector representing original image reference system after rotation to align with
line's direction */
dL[0] = cos(pSingleLine->direction);
dL[1] = sin(pSingleLine->direction);
/* set the clockwise orthogonal vector of line direction */
dO[0] = -dL[1];
dO[1] = dL[0];
/* get rotated reference frame */
sCorX0= -dL[0]*halfWidth + dL[1]*halfHeight + lineMiddlePointX;//hID =0; wID = 0;
sCorY0= -dL[1]*halfWidth - dL[0]*halfHeight + lineMiddlePointY;
/* BIAS::Matrix<float> gDLMat(heightOfLSP,lengthOfLSP) */
for(short hID = 0; hID <heightOfLSP; hID++){
/*initialization */
sCorX = sCorX0;
sCorY = sCorY0;
pgdLRowSum = 0;
ngdLRowSum = 0;
pgdORowSum = 0;
ngdORowSum = 0;
for(short wID = 0; wID <lengthOfLSP; wID++){
tempCor = round(sCorX);
xCor = (tempCor<0)?0:(tempCor>imageWidth)?imageWidth:tempCor;
tempCor = round(sCorY);
yCor = (tempCor<0)?0:(tempCor>imageHeight)?imageHeight:tempCor;
/* To achieve rotation invariance, each simple gradient is rotated aligned with
* the line direction and clockwise orthogonal direction.*/
dx = pdxImg[yCor*realWidth+xCor];
dy = pdyImg[yCor*realWidth+xCor];
gDL = dx * dL[0] + dy * dL[1];
gDO = dx * dO[0] + dy * dO[1];
if(gDL>0){
pgdLRowSum += gDL;
}else{
ngdLRowSum -= gDL;
}
if(gDO>0){
pgdORowSum += gDO;
}else{
ngdORowSum -= gDO;
}
sCorX +=dL[0];
sCorY +=dL[1];
/* gDLMat[hID][wID] = gDL; */
}
sCorX0 -=dL[1];
sCorY0 +=dL[0];
coefInGaussion = gaussCoefG_[hID];
pgdLRowSum = coefInGaussion * pgdLRowSum;
ngdLRowSum = coefInGaussion * ngdLRowSum;
pgdL2RowSum = pgdLRowSum * pgdLRowSum;
ngdL2RowSum = ngdLRowSum * ngdLRowSum;
pgdORowSum = coefInGaussion * pgdORowSum;
ngdORowSum = coefInGaussion * ngdORowSum;
pgdO2RowSum = pgdORowSum * pgdORowSum;
ngdO2RowSum = ngdORowSum * ngdORowSum;
/* compute {g_dL |g_dL>0 }, {g_dL |g_dL<0 },
{g_dO |g_dO>0 }, {g_dO |g_dO<0 } of each band in the line support region
first, current row belong to current band */
bandID = hID/params.widthOfBand_;
coefInGaussion = gaussCoefL_[hID%params.widthOfBand_+params.widthOfBand_];
pgdLBandSum[bandID] += coefInGaussion * pgdLRowSum;
ngdLBandSum[bandID] += coefInGaussion * ngdLRowSum;
pgdL2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdL2RowSum;
ngdL2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdL2RowSum;
pgdOBandSum[bandID] += coefInGaussion * pgdORowSum;
ngdOBandSum[bandID] += coefInGaussion * ngdORowSum;
pgdO2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdO2RowSum;
ngdO2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdO2RowSum;
/* In order to reduce boundary effect along the line gradient direction,
* a row's gradient will contribute not only to its current band, but also
* to its nearest upper and down band with gaussCoefL_.*/
bandID--;
if(bandID>=0){/* the band above the current band */
coefInGaussion = gaussCoefL_[hID%params.widthOfBand_ + 2*params.widthOfBand_];
pgdLBandSum[bandID] += coefInGaussion * pgdLRowSum;
ngdLBandSum[bandID] += coefInGaussion * ngdLRowSum;
pgdL2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdL2RowSum;
ngdL2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdL2RowSum;
pgdOBandSum[bandID] += coefInGaussion * pgdORowSum;
ngdOBandSum[bandID] += coefInGaussion * ngdORowSum;
pgdO2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdO2RowSum;
ngdO2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdO2RowSum;
}
bandID = bandID+2;
if(bandID<NUM_OF_BANDS){/*the band below the current band */
coefInGaussion = gaussCoefL_[hID%params.widthOfBand_];
pgdLBandSum[bandID] += coefInGaussion * pgdLRowSum;
ngdLBandSum[bandID] += coefInGaussion * ngdLRowSum;
pgdL2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdL2RowSum;
ngdL2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdL2RowSum;
pgdOBandSum[bandID] += coefInGaussion * pgdORowSum;
ngdOBandSum[bandID] += coefInGaussion * ngdORowSum;
pgdO2BandSum[bandID] += coefInGaussion * coefInGaussion * pgdO2RowSum;
ngdO2BandSum[bandID] += coefInGaussion * coefInGaussion * ngdO2RowSum;
}
}
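/* Note (an illustrative summary of the weighting above, not additional logic): assuming
 * gaussCoefL_ stores 3*widthOfBand_ local Gaussian weights laid out as
 * [below | current | above] blocks, row hID contributes to at most three bands with
 *   k = hID % widthOfBand_;
 *   own band (bandID)      : weight gaussCoefL_[k + widthOfBand_]
 *   band above (bandID - 1): weight gaussCoefL_[k + 2*widthOfBand_]
 *   band below (bandID + 1): weight gaussCoefL_[k]
 * so the band sums fall off smoothly across band boundaries. */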
/* gDLMat.Save("gDLMat.txt");
return 0; */
/* construct line descriptor */
pSingleLine->descriptor.resize(descriptorSize);
desVec = pSingleLine->descriptor.data();
short desID;
/* Note that the first and last bands gather contributions from only
* (lengthOfLSP * widthOfBand_ * 2.0) pixels, hence the different normalization factor below. */
float invN2 = 1.0/(params.widthOfBand_ * 2.0);
float invN3 = 1.0/(params.widthOfBand_ * 3.0);
float invN, temp;
for(bandID = 0; bandID<NUM_OF_BANDS; bandID++){
if(bandID==0||bandID==NUM_OF_BANDS-1){ invN = invN2;
}else{ invN = invN3;}
desID = bandID * 8;
temp = pgdLBandSum[bandID] * invN;
desVec[desID] = temp;/* mean value of pgdL; */
desVec[desID+4] = sqrt(pgdL2BandSum[bandID] * invN - temp*temp);//std value of pgdL;
temp = ngdLBandSum[bandID] * invN;
desVec[desID+1] = temp;//mean value of ngdL;
desVec[desID+5] = sqrt(ngdL2BandSum[bandID] * invN - temp*temp);//std value of ngdL;
temp = pgdOBandSum[bandID] * invN;
desVec[desID+2] = temp;//mean value of pgdO;
desVec[desID+6] = sqrt(pgdO2BandSum[bandID] * invN - temp*temp);//std value of pgdO;
temp = ngdOBandSum[bandID] * invN;
desVec[desID+3] = temp;//mean value of ngdO;
desVec[desID+7] = sqrt(ngdO2BandSum[bandID] * invN - temp*temp);//std value of ngdO;
}
// normalize;
float tempM, tempS;
tempM = 0;
tempS = 0;
desVec = pSingleLine->descriptor.data();
int base = 0;
for(short i=0; i<NUM_OF_BANDS*8; ++base, i=base*8){
tempM += *(desVec+i) * *(desVec+i);//desVec[8*i+0] * desVec[8*i+0];
tempM += *(desVec+i+1) * *(desVec+i+1);//desVec[8*i+1] * desVec[8*i+1];
tempM += *(desVec+i+2) * *(desVec+i+2);//desVec[8*i+2] * desVec[8*i+2];
tempM += *(desVec+i+3) * *(desVec+i+3);//desVec[8*i+3] * desVec[8*i+3];
tempS += *(desVec+i+4) * *(desVec+i+4);//desVec[8*i+4] * desVec[8*i+4];
tempS += *(desVec+i+5) * *(desVec+i+5);//desVec[8*i+5] * desVec[8*i+5];
tempS += *(desVec+i+6) * *(desVec+i+6);//desVec[8*i+6] * desVec[8*i+6];
tempS += *(desVec+i+7) * *(desVec+i+7);//desVec[8*i+7] * desVec[8*i+7];
}
tempM = 1/sqrt(tempM);
tempS = 1/sqrt(tempS);
desVec = pSingleLine->descriptor.data();
base = 0;
for(short i=0; i<NUM_OF_BANDS*8; ++base, i=base*8){
*(desVec+i) = *(desVec+i) * tempM;//desVec[8*i] = desVec[8*i] * tempM;
*(desVec+1+i) = *(desVec+1+i) * tempM;//desVec[8*i+1] = desVec[8*i+1] * tempM;
*(desVec+2+i) = *(desVec+2+i) * tempM;//desVec[8*i+2] = desVec[8*i+2] * tempM;
*(desVec+3+i) = *(desVec+3+i) * tempM;//desVec[8*i+3] = desVec[8*i+3] * tempM;
*(desVec+4+i) = *(desVec+4+i) * tempS;//desVec[8*i+4] = desVec[8*i+4] * tempS;
*(desVec+5+i) = *(desVec+5+i) * tempS;//desVec[8*i+5] = desVec[8*i+5] * tempS;
*(desVec+6+i) = *(desVec+6+i) * tempS;//desVec[8*i+6] = desVec[8*i+6] * tempS;
*(desVec+7+i) = *(desVec+7+i) * tempS;//desVec[8*i+7] = desVec[8*i+7] * tempS;
}
/* In order to reduce the influence of non-linear illumination,
* a threshold is used so that no element of the unit feature vector
* is larger than this threshold. In Z. Wang's work, a value of 0.4 was found
* empirically to be a proper threshold.*/
desVec = pSingleLine->descriptor.data();
for(short i=0; i<descriptorSize; i++ ){
if(desVec[i]>0.4){
desVec[i]=0.4;
}
}
//re-normalize desVec;
temp = 0;
for(short i=0; i<descriptorSize; i++){
temp += desVec[i] * desVec[i];
}
temp = 1/sqrt(temp);
for(short i=0; i<descriptorSize; i++){
desVec[i] = desVec[i] * temp;
}
}/* end for(short lineIDInSameLine = 0; lineIDInSameLine<sameLineSize;
lineIDInSameLine++) */
cv::Mat appoggio = cv::Mat(1, 32, CV_32FC1);
float* pointerToRow = appoggio.ptr<float>(0);
for(int g = 0; g<32; g++)
{
/* get LBD data */
float* desVec = keyLines[lineIDInScaleVec][0].descriptor.data();
*pointerToRow = desVec[g];
pointerToRow++;
}
}/* end for(short lineIDInScaleVec = 0;
lineIDInScaleVec<numOfFinalLine; lineIDInScaleVec++) */
delete [] dL;
delete [] dO;
delete [] pgdLBandSum;
delete [] ngdLBandSum;
delete [] pgdL2BandSum;
delete [] ngdL2BandSum;
delete [] pgdOBandSum;
delete [] ngdOBandSum;
delete [] pgdO2BandSum;
delete [] ngdO2BandSum;
return 1;
}
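/* Illustrative sketch (not part of the original sources): how the descriptor built above
 * can be read back. Each band occupies 8 consecutive floats,
 * [mean pgdL, mean ngdL, mean pgdO, mean ngdO, std pgdL, std ngdL, std pgdO, std ngdO];
 * the mean and std halves are L2-normalized separately, every element is clipped to 0.4,
 * and the whole vector is re-normalized. The helper name below is hypothetical. */
static inline void readLBDBandStats( const float* desVec, int bandID, float mean[4], float stddev[4] )
{
  for ( int k = 0; k < 4; k++ )
  {
    mean[k] = desVec[bandID * 8 + k];        /* means of pgdL, ngdL, pgdO, ngdO */
    stddev[k] = desVec[bandID * 8 + 4 + k];  /* standard deviations of the same quantities */
  }
}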
......@@ -159,7 +159,7 @@ void BinaryDescriptorMatcher::match( const Mat& queryDescriptors, std::vector<DM
CV_Assert( false );
}
/* create a DMatch object if required by mask of if there is
/* create a DMatch object if required by mask or if there is
no mask at all */
else if( masks.empty() || masks[itup->second].at<uchar>( counter ) != 0 )
{
......@@ -212,7 +212,7 @@ void BinaryDescriptorMatcher::match( const Mat& queryDescriptors, const Mat& tra
/* compose matches */
for ( int counter = 0; counter < queryDescriptors.rows; counter++ )
{
/* create a DMatch object if required by mask of if there is
/* create a DMatch object if required by mask or if there is
no mask at all */
if( mask.empty() || ( !mask.empty() && mask.at<uchar>( counter ) != 0 ) )
{
......
......@@ -8,12 +8,12 @@
License Agreement
For Open Source Computer Vision Library
Copyright (C) 2011-2012, Lilian Zhang, all rights reserved.
Copyright (C) 2013, Manuele Tamburrano, Stefano Fabri, all rights reserved.
Third party copyrights are property of their respective owners.
Copyright (C) 2011-2012, Lilian Zhang, all rights reserved.
Copyright (C) 2013, Manuele Tamburrano, Stefano Fabri, all rights reserved.
Third party copyrights are property of their respective owners.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
......@@ -25,18 +25,17 @@ are permitted provided that the following conditions are met:
* The name of the copyright holders may not be used to endorse or promote products
derived from this software without specific prior written permission.
This software is provided by the copyright holders and contributors "as is" and
any express or implied warranties, including, but not limited to, the implied
warranties of merchantability and fitness for a particular purpose are disclaimed.
In no event shall the Intel Corporation or contributors be liable for any direct,
indirect, incidental, special, exemplary, or consequential damages
(including, but not limited to, procurement of substitute goods or services;
loss of use, data, or profits; or business interruption) however caused
and on any theory of liability, whether in contract, strict liability,
or tort (including negligence or otherwise) arising in any way out of
the use of this software, even if advised of the possibility of such damage.
*/
This software is provided by the copyright holders and contributors "as is" and
any express or implied warranties, including, but not limited to, the implied
warranties of merchantability and fitness for a particular purpose are disclaimed.
In no event shall the Intel Corporation or contributors be liable for any direct,
indirect, incidental, special, exemplary, or consequential damages
(including, but not limited to, procurement of substitute goods or services;
loss of use, data, or profits; or business interruption) however caused
and on any theory of liability, whether in contract, strict liability,
or tort (including negligence or otherwise) arising in any way out of
the use of this software, even if advised of the possibility of such damage.
*/
#include "precomp.hpp"
......@@ -55,18 +54,17 @@ the use of this software, even if advised of the possibility of such damage.
using namespace std;
EDLineDetector::EDLineDetector()
{
// cout<<"Call EDLineDetector constructor function"<<endl;
//set parameters for line segment detection
ksize_ = 15; //15
sigma_ = 30.0; //30
gradienThreshold_ = 80; // ***** ORIGINAL WAS 25
anchorThreshold_ = 8;//8
scanIntervals_ = 2;//2
minLineLen_ = 15;//15
lineFitErrThreshold_ = 1.6;//1.4
anchorThreshold_ = 8; //8
scanIntervals_ = 2; //2
minLineLen_ = 15; //15
lineFitErrThreshold_ = 1.6; //1.4
InitEDLine_();
}
EDLineDetector::EDLineDetector(EDLineParam param)
EDLineDetector::EDLineDetector( EDLineParam param )
{
//set parameters for line segment detection
ksize_ = param.ksize;
......@@ -81,13 +79,14 @@ EDLineDetector::EDLineDetector(EDLineParam param)
void EDLineDetector::InitEDLine_()
{
bValidate_ = true;
ATA = cv::Mat_<int>(2,2);
ATV = cv::Mat_<int>(1,2);
tempMatLineFit = cv::Mat_<int>(2,2);
tempVecLineFit = cv::Mat_<int>(1,2);
fitMatT = cv::Mat_<int>(2,minLineLen_);
fitVec = cv::Mat_<int>(1,minLineLen_);
for(int i=0; i<minLineLen_; i++){
ATA = cv::Mat_<int>( 2, 2 );
ATV = cv::Mat_<int>( 1, 2 );
tempMatLineFit = cv::Mat_<int>( 2, 2 );
tempVecLineFit = cv::Mat_<int>( 1, 2 );
fitMatT = cv::Mat_<int>( 2, minLineLen_ );
fitVec = cv::Mat_<int>( 1, minLineLen_ );
for ( int i = 0; i < minLineLen_; i++ )
{
fitMatT[1][i] = 1;
}
dxImg_.create( 1, 1, CV_16SC1 );
......@@ -103,55 +102,53 @@ void EDLineDetector::InitEDLine_()
pAnchorY_ = NULL;
}
EDLineDetector::~EDLineDetector(){
if(pFirstPartEdgeX_!=NULL){
delete [] pFirstPartEdgeX_;
delete [] pFirstPartEdgeY_;
delete [] pSecondPartEdgeX_;
delete [] pSecondPartEdgeY_;
delete [] pAnchorX_;
delete [] pAnchorY_;
}
if(pFirstPartEdgeS_ != NULL){
delete [] pFirstPartEdgeS_;
delete [] pSecondPartEdgeS_;
EDLineDetector::~EDLineDetector()
{
if( pFirstPartEdgeX_ != NULL )
{
delete[] pFirstPartEdgeX_;
delete[] pFirstPartEdgeY_;
delete[] pSecondPartEdgeX_;
delete[] pSecondPartEdgeY_;
delete[] pAnchorX_;
delete[] pAnchorY_;
}
if( pFirstPartEdgeS_ != NULL )
{
delete[] pFirstPartEdgeS_;
delete[] pSecondPartEdgeS_;
}
}
int EDLineDetector::EdgeDrawing(cv::Mat &image, EdgeChains &edgeChains, bool smoothed )
int EDLineDetector::EdgeDrawing( cv::Mat &image, EdgeChains &edgeChains, bool smoothed )
{
imageWidth = image.cols;
imageHeight= image.rows;
unsigned int pixelNum = imageWidth*imageHeight;
imageHeight = image.rows;
unsigned int pixelNum = imageWidth * imageHeight;
#ifdef DEBUGEdgeDrawing
#ifdef DEBUGEdgeDrawing
cv::imshow("prima blur", image);
cv::waitKey();
#endif
if(!smoothed){//input image hasn't been smoothed.
#endif
if( !smoothed )
{ //input image hasn't been smoothed.
std::cout << "Dentro smoothed " << std::endl;
cv::Mat InImage = image.clone();
cv::GaussianBlur(InImage, image, cv::Size(ksize_,ksize_), sigma_);
cv::GaussianBlur( InImage, image, cv::Size( ksize_, ksize_ ), sigma_ );
}
#ifdef DEBUGEdgeDrawing
#ifdef DEBUGEdgeDrawing
cv::imshow("dopo blur", image);
cv::waitKey();
#endif
#endif
//Is this useful? Doesn't seem to have references
//else{
// unsigned char *pImage = image.GetImageData();
// memcpy(cvImage->data.ptr, pImage, pixelNum*sizeof(unsigned char));
// }
unsigned int edgePixelArraySize = pixelNum/5;
unsigned int maxNumOfEdge = edgePixelArraySize/20;
std::cout<<"imageHeight: "<<imageHeight<<" - imageWidth:"<<imageWidth<<std::endl;
unsigned int edgePixelArraySize = pixelNum / 5;
unsigned int maxNumOfEdge = edgePixelArraySize / 20;
//compute dx, dy images
if(gImg_.cols!= imageWidth||gImg_.rows!= imageHeight){
/*if(pFirstPartEdgeX_!= NULL){
if( gImg_.cols != (int)imageWidth || gImg_.rows != (int)imageHeight )
{
if(pFirstPartEdgeX_!= NULL){
delete [] pFirstPartEdgeX_;
delete [] pFirstPartEdgeY_;
delete [] pSecondPartEdgeX_;
......@@ -160,14 +157,14 @@ int EDLineDetector::EdgeDrawing(cv::Mat &image, EdgeChains &edgeChains, bool smo
delete [] pSecondPartEdgeS_;
delete [] pAnchorX_;
delete [] pAnchorY_;
}*/
}
dxImg_.create(imageHeight, imageWidth, CV_16SC1);
dyImg_.create(imageHeight, imageWidth, CV_16SC1 );
gImgWO_.create(imageHeight, imageWidth, CV_16SC1 );
gImg_.create(imageHeight, imageWidth, CV_16SC1 );
dirImg_.create(imageHeight, imageWidth, CV_8UC1 );
edgeImage_.create(imageHeight, imageWidth, CV_8UC1 );
dxImg_.create( imageHeight, imageWidth, CV_16SC1 );
dyImg_.create( imageHeight, imageWidth, CV_16SC1 );
gImgWO_.create( imageHeight, imageWidth, CV_16SC1 );
gImg_.create( imageHeight, imageWidth, CV_16SC1 );
dirImg_.create( imageHeight, imageWidth, CV_8UC1 );
edgeImage_.create( imageHeight, imageWidth, CV_8UC1 );
pFirstPartEdgeX_ = new unsigned int[edgePixelArraySize];
pFirstPartEdgeY_ = new unsigned int[edgePixelArraySize];
pSecondPartEdgeX_ = new unsigned int[edgePixelArraySize];
......@@ -177,96 +174,94 @@ int EDLineDetector::EdgeDrawing(cv::Mat &image, EdgeChains &edgeChains, bool smo
pAnchorX_ = new unsigned int[edgePixelArraySize];
pAnchorY_ = new unsigned int[edgePixelArraySize];
}
cv::Sobel( image, dxImg_, CV_16SC1, 1, 0, 3);
cv::Sobel( image, dyImg_, CV_16SC1, 0, 1, 3);
cv::Sobel( image, dxImg_, CV_16SC1, 1, 0, 3 );
cv::Sobel( image, dyImg_, CV_16SC1, 0, 1, 3 );
#ifdef DEBUGEdgeDrawing
#ifdef DEBUGEdgeDrawing
cv::imshow("dxImg_", dxImg_);
cv::imshow("dyImg_", dyImg_);
cv::waitKey();
#endif
#endif
//compute gradient and direction images
// double t = (double)cv::getTickCount();
cv::Mat dxABS_m = cv::abs(dxImg_);
cv::Mat dyABS_m = cv::abs(dyImg_);
cv::Mat dxABS_m = cv::abs( dxImg_ );
cv::Mat dyABS_m = cv::abs( dyImg_ );
cv::Mat sumDxDy;
cv::add(dyABS_m, dxABS_m, sumDxDy);
cv::add( dyABS_m, dxABS_m, sumDxDy );
cv::threshold( sumDxDy, gImg_, gradienThreshold_ + 1, 255, cv::THRESH_TOZERO );
gImg_ = gImg_ / 4;
gImgWO_ = sumDxDy / 4;
cv::compare( dxABS_m, dyABS_m, dirImg_, cv::CMP_LT );
cv::threshold(sumDxDy,gImg_, gradienThreshold_+1, 255, cv::THRESH_TOZERO);
gImg_ = gImg_/4;
gImgWO_ = sumDxDy/4;
cv::compare(dxABS_m, dyABS_m, dirImg_, cv::CMP_LT);
// t = ((double)cv::getTickCount() - t)/cv::getTickFrequency();
// std::cout<<"FOR ABS: "<<t<<"s"<<std::endl;
short *pdxImg = dxImg_.ptr<short>();
short *pdyImg = dyImg_.ptr<short>();
short *pgImg = gImg_.ptr<short>();
unsigned char *pdirImg = dirImg_.ptr();
//extract the anchors in the gradient image, store into a vector
memset(pAnchorX_, 0, edgePixelArraySize*sizeof(unsigned int));//initialization
memset(pAnchorY_, 0, edgePixelArraySize*sizeof(unsigned int));
memset( pAnchorX_, 0, edgePixelArraySize * sizeof(unsigned int) ); //initialization
memset( pAnchorY_, 0, edgePixelArraySize * sizeof(unsigned int) );
unsigned int anchorsSize = 0;
int indexInArray;
unsigned char gValue1, gValue2, gValue3;
for(unsigned int w=1; w<imageWidth-1; w=w+scanIntervals_){
for(unsigned int h=1; h<imageHeight-1; h=h+scanIntervals_){
indexInArray = h*imageWidth+w;
for ( unsigned int w = 1; w < imageWidth - 1; w = w + scanIntervals_ )
{
for ( unsigned int h = 1; h < imageHeight - 1; h = h + scanIntervals_ )
{
indexInArray = h * imageWidth + w;
//gValue1 = pdirImg[indexInArray];
if(pdirImg[indexInArray]==Horizontal){//if the direction of pixel is horizontal, then compare with up and down
if( pdirImg[indexInArray] == Horizontal )
{ //if the direction of the pixel is horizontal, then compare with up and down
//gValue2 = pgImg[indexInArray];
if(pgImg[indexInArray]>=pgImg[indexInArray-imageWidth]+anchorThreshold_
&&pgImg[indexInArray]>=pgImg[indexInArray+imageWidth]+anchorThreshold_){// (w,h) is accepted as an anchor
if( pgImg[indexInArray] >= pgImg[indexInArray - imageWidth] + anchorThreshold_
&& pgImg[indexInArray] >= pgImg[indexInArray + imageWidth] + anchorThreshold_ )
{ // (w,h) is accepted as an anchor
pAnchorX_[anchorsSize] = w;
pAnchorY_[anchorsSize++] = h;
}
}else{// if(pdirImg[indexInArray]==Vertical){//it is vertical edge, should be compared with left and right
}
else
{ //else the pixel direction is vertical; compare with left and right
//gValue2 = pgImg[indexInArray];
if(pgImg[indexInArray]>=pgImg[indexInArray-1]+anchorThreshold_
&&pgImg[indexInArray]>=pgImg[indexInArray+1]+anchorThreshold_){// (w,h) is accepted as an anchor
if( pgImg[indexInArray] >= pgImg[indexInArray - 1] + anchorThreshold_ && pgImg[indexInArray] >= pgImg[indexInArray + 1] + anchorThreshold_ )
{ // (w,h) is accepted as an anchor
pAnchorX_[anchorsSize] = w;
pAnchorY_[anchorsSize++] = h;
}
}
}
}
if(anchorsSize>edgePixelArraySize){
cout<<"anchor size is larger than its maximal size. anchorsSize="<<anchorsSize
<<", maximal size = "<<edgePixelArraySize <<endl;
if( anchorsSize > edgePixelArraySize )
{
cout << "anchor size is larger than its maximal size. anchorsSize=" << anchorsSize << ", maximal size = " << edgePixelArraySize << endl;
return -1;
}
#ifdef DEBUGEdgeDrawing
cout<<"Anchor point detection, anchors.size="<<anchorsSize<<endl;
#endif
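/* In short (an illustrative restatement of the scan above): pixel (w,h) is accepted as an
 * anchor when its gradient magnitude exceeds both neighbours taken orthogonally to its
 * quantized edge direction by at least anchorThreshold_:
 *   horizontal pixel direction: g(w,h) >= g(w,h-1) + T  &&  g(w,h) >= g(w,h+1) + T
 *   vertical pixel direction  : g(w,h) >= g(w-1,h) + T  &&  g(w,h) >= g(w+1,h) + T
 * with T = anchorThreshold_; the image is scanned every scanIntervals_ pixels. */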
//link the anchors by smart routing
edgeImage_.setTo(0);
edgeImage_.setTo( 0 );
unsigned char *pEdgeImg = edgeImage_.data;
memset(pFirstPartEdgeX_, 0, edgePixelArraySize*sizeof(unsigned int));//initialization
memset(pFirstPartEdgeY_, 0, edgePixelArraySize*sizeof(unsigned int));
memset(pSecondPartEdgeX_, 0, edgePixelArraySize*sizeof(unsigned int));
memset(pSecondPartEdgeY_, 0, edgePixelArraySize*sizeof(unsigned int));
memset(pFirstPartEdgeS_, 0, maxNumOfEdge*sizeof(unsigned int));
memset(pSecondPartEdgeS_, 0, maxNumOfEdge*sizeof(unsigned int));
unsigned int offsetPFirst=0, offsetPSecond=0;
unsigned int offsetPS=0;
memset( pFirstPartEdgeX_, 0, edgePixelArraySize * sizeof(unsigned int) ); //initialization
memset( pFirstPartEdgeY_, 0, edgePixelArraySize * sizeof(unsigned int) );
memset( pSecondPartEdgeX_, 0, edgePixelArraySize * sizeof(unsigned int) );
memset( pSecondPartEdgeY_, 0, edgePixelArraySize * sizeof(unsigned int) );
memset( pFirstPartEdgeS_, 0, maxNumOfEdge * sizeof(unsigned int) );
memset( pSecondPartEdgeS_, 0, maxNumOfEdge * sizeof(unsigned int) );
unsigned int offsetPFirst = 0, offsetPSecond = 0;
unsigned int offsetPS = 0;
unsigned int x, y;
unsigned int lastX, lastY;
unsigned char lastDirection;//up = 1, right = 2, down = 3, left = 4;
unsigned char shouldGoDirection;//up = 1, right = 2, down = 3, left = 4;
unsigned char lastDirection; //up = 1, right = 2, down = 3, left = 4;
unsigned char shouldGoDirection; //up = 1, right = 2, down = 3, left = 4;
int edgeLenFirst, edgeLenSecond;
for(unsigned int i=0; i<anchorsSize; i++){
for ( unsigned int i = 0; i < anchorsSize; i++ )
{
x = pAnchorX_[i];
y = pAnchorY_[i];
indexInArray = y*imageWidth+x;
if(pEdgeImg[indexInArray]){//if anchor i is already been an edge pixel.
indexInArray = y * imageWidth + x;
if( pEdgeImg[indexInArray] )
{ //if anchor i has already been marked as an edge pixel.
continue;
}
/*The walk stops under 3 conditions:
......@@ -279,458 +274,635 @@ int EDLineDetector::EdgeDrawing(cv::Mat &image, EdgeChains &edgeChains, bool smo
* in somewhere on the chain.)
* 3. We encounter a previously detected edge pixel. */
pFirstPartEdgeS_[offsetPS] = offsetPFirst;
if(pdirImg[indexInArray]==Horizontal){//if the direction of this pixel is horizontal, then go left and right.
if( pdirImg[indexInArray] == Horizontal )
{ //if the direction of this pixel is horizontal, then go left and right.
//first go right, pixel direction may be different during linking.
lastDirection = RightDir;
while(pgImg[indexInArray]>0&&!pEdgeImg[indexInArray]){
while ( pgImg[indexInArray] > 0 && !pEdgeImg[indexInArray] )
{
pEdgeImg[indexInArray] = 1; // Mark this pixel as an edge pixel
pFirstPartEdgeX_[offsetPFirst] = x;
pFirstPartEdgeY_[offsetPFirst++] = y;
shouldGoDirection = 0;//unknown
if(pdirImg[indexInArray]==Horizontal){//should go left or right
if(lastDirection == UpDir ||lastDirection == DownDir){//change the pixel direction now
if(x>lastX){//should go right
shouldGoDirection = 0; //unknown
if( pdirImg[indexInArray] == Horizontal )
{ //should go left or right
if( lastDirection == UpDir || lastDirection == DownDir )
{ //change the pixel direction now
if( x > lastX )
{ //should go right
shouldGoDirection = RightDir;
}else{//should go left
}
else
{ //should go left
shouldGoDirection = LeftDir;
}
}
lastX = x;
lastY = y;
if(lastDirection == RightDir||shouldGoDirection == RightDir){//go right
if(x==imageWidth-1||y==0||y==imageHeight-1){//reach the image border
if( lastDirection == RightDir || shouldGoDirection == RightDir )
{ //go right
if( x == imageWidth - 1 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the right and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray-imageWidth+1];
gValue2 = pgImg[indexInArray+1];
gValue3 = pgImg[indexInArray+imageWidth+1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-right
x = x+1;
y = y+1;
}else{//straight-right
x = x+1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray + 1];
gValue3 = pgImg[indexInArray + imageWidth + 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-right
x = x + 1;
y = y + 1;
}
else
{ //straight-right
x = x + 1;
}
lastDirection = RightDir;
} else if(lastDirection == LeftDir || shouldGoDirection==LeftDir){//go left
if(x==0||y==0||y==imageHeight-1){//reach the image border
}
else if( lastDirection == LeftDir || shouldGoDirection == LeftDir )
{ //go left
if( x == 0 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the left and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray-imageWidth-1];
gValue2 = pgImg[indexInArray-1];
gValue3 = pgImg[indexInArray+imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-left
x = x-1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-left
x = x-1;
gValue1 = pgImg[indexInArray - imageWidth - 1];
gValue2 = pgImg[indexInArray - 1];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-left
x = x - 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-left
x = x - 1;
}
lastDirection = LeftDir;
}
}else{//should go up or down.
if(lastDirection == RightDir || lastDirection == LeftDir){//change the pixel direction now
if(y>lastY){//should go down
}
else
{ //should go up or down.
if( lastDirection == RightDir || lastDirection == LeftDir )
{ //change the pixel direction now
if( y > lastY )
{ //should go down
shouldGoDirection = DownDir;
}else{//should go up
}
else
{ //should go up
shouldGoDirection = UpDir;
}
}
lastX = x;
lastY = y;
if(lastDirection==DownDir || shouldGoDirection == DownDir){//go down
if(x==0||x==imageWidth-1||y==imageHeight-1){//reach the image border
if( lastDirection == DownDir || shouldGoDirection == DownDir )
{ //go down
if( x == 0 || x == imageWidth - 1 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the down and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray+imageWidth+1];
gValue2 = pgImg[indexInArray+imageWidth];
gValue3 = pgImg[indexInArray+imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//down-right
x = x+1;
y = y+1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-down
y = y+1;
gValue1 = pgImg[indexInArray + imageWidth + 1];
gValue2 = pgImg[indexInArray + imageWidth];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //down-right
x = x + 1;
y = y + 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-down
y = y + 1;
}
lastDirection = DownDir;
}else if(lastDirection==UpDir || shouldGoDirection == UpDir){//go up
if(x==0||x==imageWidth-1||y==0){//reach the image border
}
else if( lastDirection == UpDir || shouldGoDirection == UpDir )
{ //go up
if( x == 0 || x == imageWidth - 1 || y == 0 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the up and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray-imageWidth+1];
gValue2 = pgImg[indexInArray-imageWidth];
gValue3 = pgImg[indexInArray-imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//up-left
x = x-1;
y = y-1;
}else{//straight-up
y = y-1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray - imageWidth];
gValue3 = pgImg[indexInArray - imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //up-left
x = x - 1;
y = y - 1;
}
else
{ //straight-up
y = y - 1;
}
lastDirection = UpDir;
}
}
indexInArray = y*imageWidth+x;
}//end while go right
indexInArray = y * imageWidth + x;
} //end while go right
//then go left, pixel direction may be different during linking.
x = pAnchorX_[i];
y = pAnchorY_[i];
indexInArray = y*imageWidth+x;
pEdgeImg[indexInArray] = 0;//mark the anchor point be a non-edge pixel and
indexInArray = y * imageWidth + x;
pEdgeImg[indexInArray] = 0; //mark the anchor point as a non-edge pixel, then walk in the opposite direction
lastDirection = LeftDir;
pSecondPartEdgeS_[offsetPS] = offsetPSecond;
while(pgImg[indexInArray]>0&&!pEdgeImg[indexInArray]){
while ( pgImg[indexInArray] > 0 && !pEdgeImg[indexInArray] )
{
pEdgeImg[indexInArray] = 1; // Mark this pixel as an edge pixel
pSecondPartEdgeX_[offsetPSecond] = x;
pSecondPartEdgeY_[offsetPSecond++] = y;
shouldGoDirection = 0;//unknown
if(pdirImg[indexInArray]==Horizontal){//should go left or right
if(lastDirection == UpDir ||lastDirection == DownDir){//change the pixel direction now
if(x>lastX){//should go right
shouldGoDirection = 0; //unknown
if( pdirImg[indexInArray] == Horizontal )
{ //should go left or right
if( lastDirection == UpDir || lastDirection == DownDir )
{ //change the pixel direction now
if( x > lastX )
{ //should go right
shouldGoDirection = RightDir;
}else{//should go left
}
else
{ //should go left
shouldGoDirection = LeftDir;
}
}
lastX = x;
lastY = y;
if(lastDirection == RightDir||shouldGoDirection == RightDir){//go right
if(x==imageWidth-1||y==0||y==imageHeight-1){//reach the image border
if( lastDirection == RightDir || shouldGoDirection == RightDir )
{ //go right
if( x == imageWidth - 1 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the right and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray-imageWidth+1];
gValue2 = pgImg[indexInArray+1];
gValue3 = pgImg[indexInArray+imageWidth+1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-right
x = x+1;
y = y+1;
}else{//straight-right
x = x+1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray + 1];
gValue3 = pgImg[indexInArray + imageWidth + 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-right
x = x + 1;
y = y + 1;
}
else
{ //straight-right
x = x + 1;
}
lastDirection = RightDir;
}else if(lastDirection == LeftDir || shouldGoDirection==LeftDir){//go left
if(x==0||y==0||y==imageHeight-1){//reach the image border
}
else if( lastDirection == LeftDir || shouldGoDirection == LeftDir )
{ //go left
if( x == 0 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the left and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray-imageWidth-1];
gValue2 = pgImg[indexInArray-1];
gValue3 = pgImg[indexInArray+imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-left
x = x-1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-left
x = x-1;
gValue1 = pgImg[indexInArray - imageWidth - 1];
gValue2 = pgImg[indexInArray - 1];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-left
x = x - 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-left
x = x - 1;
}
lastDirection = LeftDir;
}
}else{//should go up or down.
if(lastDirection == RightDir || lastDirection == LeftDir){//change the pixel direction now
if(y>lastY){//should go down
}
else
{ //should go up or down.
if( lastDirection == RightDir || lastDirection == LeftDir )
{ //change the pixel direction now
if( y > lastY )
{ //should go down
shouldGoDirection = DownDir;
}else{//should go up
}
else
{ //should go up
shouldGoDirection = UpDir;
}
}
lastX = x;
lastY = y;
if(lastDirection==DownDir || shouldGoDirection == DownDir){//go down
if(x==0||x==imageWidth-1||y==imageHeight-1){//reach the image border
if( lastDirection == DownDir || shouldGoDirection == DownDir )
{ //go down
if( x == 0 || x == imageWidth - 1 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the down and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray+imageWidth+1];
gValue2 = pgImg[indexInArray+imageWidth];
gValue3 = pgImg[indexInArray+imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//down-right
x = x+1;
y = y+1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-down
y = y+1;
gValue1 = pgImg[indexInArray + imageWidth + 1];
gValue2 = pgImg[indexInArray + imageWidth];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //down-right
x = x + 1;
y = y + 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-down
y = y + 1;
}
lastDirection = DownDir;
}else if(lastDirection==UpDir || shouldGoDirection == UpDir){//go up
if(x==0||x==imageWidth-1||y==0){//reach the image border
}
else if( lastDirection == UpDir || shouldGoDirection == UpDir )
{ //go up
if( x == 0 || x == imageWidth - 1 || y == 0 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the up and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth+1];
gValue2 = pgImg[indexInArray- imageWidth];
gValue3 = pgImg[indexInArray- imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//up-left
x = x-1;
y = y-1;
}else{//straight-up
y = y-1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray - imageWidth];
gValue3 = pgImg[indexInArray - imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //up-left
x = x - 1;
y = y - 1;
}
else
{ //straight-up
y = y - 1;
}
lastDirection = UpDir;
}
}
indexInArray = y*imageWidth+x;
}//end while go left
indexInArray = y * imageWidth + x;
} //end while go left
//end anchor is Horizontal
}else{//the direction of this pixel is vertical, go up and down
}
else
{ //the direction of this pixel is vertical, go up and down
//first go down, pixel direction may be different during linking.
lastDirection = DownDir;
while(pgImg[indexInArray]>0&&!pEdgeImg[indexInArray]){
while ( pgImg[indexInArray] > 0 && !pEdgeImg[indexInArray] )
{
pEdgeImg[indexInArray] = 1; // Mark this pixel as an edge pixel
pFirstPartEdgeX_[offsetPFirst] = x;
pFirstPartEdgeY_[offsetPFirst++] = y;
shouldGoDirection = 0;//unknown
if(pdirImg[indexInArray]==Horizontal){//should go left or right
if(lastDirection == UpDir ||lastDirection == DownDir){//change the pixel direction now
if(x>lastX){//should go right
shouldGoDirection = 0; //unknown
if( pdirImg[indexInArray] == Horizontal )
{ //should go left or right
if( lastDirection == UpDir || lastDirection == DownDir )
{ //change the pixel direction now
if( x > lastX )
{ //should go right
shouldGoDirection = RightDir;
}else{//should go left
}
else
{ //should go left
shouldGoDirection = LeftDir;
}
}
lastX = x;
lastY = y;
if(lastDirection == RightDir||shouldGoDirection == RightDir){//go right
if(x==imageWidth-1||y==0||y==imageHeight-1){//reach the image border
if( lastDirection == RightDir || shouldGoDirection == RightDir )
{ //go right
if( x == imageWidth - 1 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the right and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth+1];
gValue2 = pgImg[indexInArray+1];
gValue3 = pgImg[indexInArray+ imageWidth+1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-right
x = x+1;
y = y+1;
}else{//straight-right
x = x+1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray + 1];
gValue3 = pgImg[indexInArray + imageWidth + 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-right
x = x + 1;
y = y + 1;
}
else
{ //straight-right
x = x + 1;
}
lastDirection = RightDir;
}else if(lastDirection == LeftDir || shouldGoDirection==LeftDir){//go left
if(x==0||y==0||y==imageHeight-1){//reach the image border
}
else if( lastDirection == LeftDir || shouldGoDirection == LeftDir )
{ //go left
if( x == 0 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the left and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth-1];
gValue2 = pgImg[indexInArray-1];
gValue3 = pgImg[indexInArray+ imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-left
x = x-1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-left
x = x-1;
gValue1 = pgImg[indexInArray - imageWidth - 1];
gValue2 = pgImg[indexInArray - 1];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-left
x = x - 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-left
x = x - 1;
}
lastDirection = LeftDir;
}
}else{//should go up or down.
if(lastDirection == RightDir || lastDirection == LeftDir){//change the pixel direction now
if(y>lastY){//should go down
}
else
{ //should go up or down.
if( lastDirection == RightDir || lastDirection == LeftDir )
{ //change the pixel direction now
if( y > lastY )
{ //should go down
shouldGoDirection = DownDir;
}else{//should go up
}
else
{ //should go up
shouldGoDirection = UpDir;
}
}
lastX = x;
lastY = y;
if(lastDirection==DownDir || shouldGoDirection == DownDir){//go down
if(x==0||x==imageWidth-1||y==imageHeight-1){//reach the image border
if( lastDirection == DownDir || shouldGoDirection == DownDir )
{ //go down
if( x == 0 || x == imageWidth - 1 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the down and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray+ imageWidth+1];
gValue2 = pgImg[indexInArray+ imageWidth];
gValue3 = pgImg[indexInArray+ imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//down-right
x = x+1;
y = y+1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-down
y = y+1;
gValue1 = pgImg[indexInArray + imageWidth + 1];
gValue2 = pgImg[indexInArray + imageWidth];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //down-right
x = x + 1;
y = y + 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-down
y = y + 1;
}
lastDirection = DownDir;
}else if(lastDirection==UpDir || shouldGoDirection == UpDir){//go up
if(x==0||x==imageWidth-1||y==0){//reach the image border
}
else if( lastDirection == UpDir || shouldGoDirection == UpDir )
{ //go up
if( x == 0 || x == imageWidth - 1 || y == 0 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the up and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth+1];
gValue2 = pgImg[indexInArray- imageWidth];
gValue3 = pgImg[indexInArray- imageWidth-1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//up-left
x = x-1;
y = y-1;
}else{//straight-up
y = y-1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray - imageWidth];
gValue3 = pgImg[indexInArray - imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //up-left
x = x - 1;
y = y - 1;
}
else
{ //straight-up
y = y - 1;
}
lastDirection = UpDir;
}
}
indexInArray = y*imageWidth+x;
}//end while go down
indexInArray = y * imageWidth + x;
} //end while go down
//then go up, pixel direction may be different during linking.
lastDirection = UpDir;
x = pAnchorX_[i];
y = pAnchorY_[i];
indexInArray = y*imageWidth+x;
pEdgeImg[indexInArray] = 0;//mark the anchor point be a non-edge pixel and
indexInArray = y * imageWidth + x;
pEdgeImg[indexInArray] = 0; //mark the anchor point as a non-edge pixel, then walk in the opposite direction
pSecondPartEdgeS_[offsetPS] = offsetPSecond;
while(pgImg[indexInArray]>0&&!pEdgeImg[indexInArray]){
while ( pgImg[indexInArray] > 0 && !pEdgeImg[indexInArray] )
{
pEdgeImg[indexInArray] = 1; // Mark this pixel as an edge pixel
pSecondPartEdgeX_[offsetPSecond] = x;
pSecondPartEdgeY_[offsetPSecond++] = y;
shouldGoDirection = 0;//unknown
if(pdirImg[indexInArray]==Horizontal){//should go left or right
if(lastDirection == UpDir ||lastDirection == DownDir){//change the pixel direction now
if(x>lastX){//should go right
shouldGoDirection = 0; //unknown
if( pdirImg[indexInArray] == Horizontal )
{ //should go left or right
if( lastDirection == UpDir || lastDirection == DownDir )
{ //change the pixel direction now
if( x > lastX )
{ //should go right
shouldGoDirection = RightDir;
}else{//should go left
}
else
{ //should go left
shouldGoDirection = LeftDir;
}
}
lastX = x;
lastY = y;
if(lastDirection == RightDir||shouldGoDirection == RightDir){//go right
if(x==imageWidth-1||y==0||y==imageHeight-1){//reach the image border
if( lastDirection == RightDir || shouldGoDirection == RightDir )
{ //go right
if( x == imageWidth - 1 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the right and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth+1];
gValue2 = pgImg[indexInArray+1];
gValue3 = pgImg[indexInArray+ imageWidth +1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-right
x = x+1;
y = y+1;
}else{//straight-right
x = x+1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray + 1];
gValue3 = pgImg[indexInArray + imageWidth + 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-right
x = x + 1;
y = y + 1;
}
else
{ //straight-right
x = x + 1;
}
lastDirection = RightDir;
}else if(lastDirection == LeftDir || shouldGoDirection==LeftDir){//go left
if(x==0||y==0||y==imageHeight-1){//reach the image border
}
else if( lastDirection == LeftDir || shouldGoDirection == LeftDir )
{ //go left
if( x == 0 || y == 0 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the left and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth -1];
gValue2 = pgImg[indexInArray-1];
gValue3 = pgImg[indexInArray+ imageWidth -1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-left
x = x-1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-left
x = x-1;
gValue1 = pgImg[indexInArray - imageWidth - 1];
gValue2 = pgImg[indexInArray - 1];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-left
x = x - 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-left
x = x - 1;
}
lastDirection = LeftDir;
}
}else{//should go up or down.
if(lastDirection == RightDir || lastDirection == LeftDir){//change the pixel direction now
if(y>lastY){//should go down
}
else
{ //should go up or down.
if( lastDirection == RightDir || lastDirection == LeftDir )
{ //change the pixel direction now
if( y > lastY )
{ //should go down
shouldGoDirection = DownDir;
}else{//should go up
}
else
{ //should go up
shouldGoDirection = UpDir;
}
}
lastX = x;
lastY = y;
if(lastDirection==DownDir || shouldGoDirection == DownDir){//go down
if(x==0||x==imageWidth-1||y==imageHeight-1){//reach the image border
if( lastDirection == DownDir || shouldGoDirection == DownDir )
{ //go down
if( x == 0 || x == imageWidth - 1 || y == imageHeight - 1 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the down and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray+ imageWidth +1];
gValue2 = pgImg[indexInArray+ imageWidth];
gValue3 = pgImg[indexInArray+ imageWidth -1];
if(gValue1>=gValue2&&gValue1>=gValue3){//down-right
x = x+1;
y = y+1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//down-left
x = x-1;
y = y+1;
}else{//straight-down
y = y+1;
gValue1 = pgImg[indexInArray + imageWidth + 1];
gValue2 = pgImg[indexInArray + imageWidth];
gValue3 = pgImg[indexInArray + imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //down-right
x = x + 1;
y = y + 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //down-left
x = x - 1;
y = y + 1;
}
else
{ //straight-down
y = y + 1;
}
lastDirection = DownDir;
}else if(lastDirection==UpDir || shouldGoDirection == UpDir){//go up
if(x==0||x==imageWidth-1||y==0){//reach the image border
}
else if( lastDirection == UpDir || shouldGoDirection == UpDir )
{ //go up
if( x == 0 || x == imageWidth - 1 || y == 0 )
{ //reach the image border
break;
}
// Look at 3 neighbors to the up and pick the one with the max. gradient value
gValue1 = pgImg[indexInArray- imageWidth +1];
gValue2 = pgImg[indexInArray- imageWidth];
gValue3 = pgImg[indexInArray- imageWidth -1];
if(gValue1>=gValue2&&gValue1>=gValue3){//up-right
x = x+1;
y = y-1;
}else if(gValue3>=gValue2&&gValue3>=gValue1){//up-left
x = x-1;
y = y-1;
}else{//straight-up
y = y-1;
gValue1 = pgImg[indexInArray - imageWidth + 1];
gValue2 = pgImg[indexInArray - imageWidth];
gValue3 = pgImg[indexInArray - imageWidth - 1];
if( gValue1 >= gValue2 && gValue1 >= gValue3 )
{ //up-right
x = x + 1;
y = y - 1;
}
else if( gValue3 >= gValue2 && gValue3 >= gValue1 )
{ //up-left
x = x - 1;
y = y - 1;
}
else
{ //straight-up
y = y - 1;
}
lastDirection = UpDir;
}
}
indexInArray = y*imageWidth+x;
}//end while go up
}//end anchor is Vertical
indexInArray = y * imageWidth + x;
} //end while go up
} //end anchor is Vertical
//only keep the edge chains whose length is larger than the minLineLen_;
edgeLenFirst = offsetPFirst - pFirstPartEdgeS_[offsetPS];
edgeLenSecond = offsetPSecond - pSecondPartEdgeS_[offsetPS];
if(edgeLenFirst+edgeLenSecond<minLineLen_+1){//short edge, drop it
if( edgeLenFirst + edgeLenSecond < minLineLen_ + 1 )
{ //short edge, drop it
offsetPFirst = pFirstPartEdgeS_[offsetPS];
offsetPSecond = pSecondPartEdgeS_[offsetPS];
}else{
}
else
{
offsetPS++;
}
}
//store the last index
pFirstPartEdgeS_[offsetPS] = offsetPFirst;
pSecondPartEdgeS_[offsetPS] = offsetPSecond;
if(offsetPS>maxNumOfEdge){
cout<<"Edge drawing Error: The total number of edges is larger than MaxNumOfEdge, "
"numofedge = "<<offsetPS<<", MaxNumOfEdge="<<maxNumOfEdge<<endl;
if( offsetPS > maxNumOfEdge )
{
cout << "Edge drawing Error: The total number of edges is larger than MaxNumOfEdge, "
"numofedge = "
<< offsetPS << ", MaxNumOfEdge=" << maxNumOfEdge << endl;
return -1;
}
if(offsetPFirst>edgePixelArraySize||offsetPSecond>edgePixelArraySize){
cout<<"Edge drawing Error: The total number of edge pixels is larger than MaxNumOfEdgePixels, "
"numofedgePixel1 = "<<offsetPFirst<<", numofedgePixel2 = "<<offsetPSecond <<
", MaxNumOfEdgePixel="<<edgePixelArraySize<<endl;
if( offsetPFirst > edgePixelArraySize || offsetPSecond > edgePixelArraySize )
{
cout << "Edge drawing Error: The total number of edge pixels is larger than MaxNumOfEdgePixels, "
"numofedgePixel1 = "
<< offsetPFirst << ", numofedgePixel2 = " << offsetPSecond << ", MaxNumOfEdgePixel=" << edgePixelArraySize << endl;
return -1;
}
......@@ -738,75 +910,52 @@ int EDLineDetector::EdgeDrawing(cv::Mat &image, EdgeChains &edgeChains, bool smo
*pFirstPartEdgeS_, pSecondPartEdgeX_, pSecondPartEdgeY_, pSecondPartEdgeS_;
*we should reorganize them into edgeChains for easy use. */
int tempID;
edgeChains.xCors.resize(offsetPFirst+offsetPSecond);
edgeChains.yCors.resize(offsetPFirst+offsetPSecond);
edgeChains.sId.resize(offsetPS+1);
edgeChains.xCors.resize( offsetPFirst + offsetPSecond );
edgeChains.yCors.resize( offsetPFirst + offsetPSecond );
edgeChains.sId.resize( offsetPS + 1 );
unsigned int *pxCors = edgeChains.xCors.data();
unsigned int *pyCors = edgeChains.yCors.data();
unsigned int *psId = edgeChains.sId.data();
offsetPFirst = 0;
offsetPSecond= 0;
offsetPSecond = 0;
unsigned int indexInCors = 0;
unsigned int numOfEdges = 0;
for(unsigned int edgeId = 0; edgeId<offsetPS; edgeId++){
for ( unsigned int edgeId = 0; edgeId < offsetPS; edgeId++ )
{
//step 1: put the first- and second-part edge coordinates together, from edge start to edge end
psId[numOfEdges++] = indexInCors;
indexInArray = pFirstPartEdgeS_[edgeId];
offsetPFirst = pFirstPartEdgeS_[edgeId+1];
for(tempID = offsetPFirst-1; tempID>= indexInArray; tempID--){//add first part edge
offsetPFirst = pFirstPartEdgeS_[edgeId + 1];
for ( tempID = offsetPFirst - 1; tempID >= indexInArray; tempID-- )
{ //add first part edge
pxCors[indexInCors] = pFirstPartEdgeX_[tempID];
pyCors[indexInCors++] = pFirstPartEdgeY_[tempID];
}
indexInArray = pSecondPartEdgeS_[edgeId];
offsetPSecond= pSecondPartEdgeS_[edgeId+1];
for(tempID = indexInArray+1; tempID<offsetPSecond; tempID++){//add second part edge
offsetPSecond = pSecondPartEdgeS_[edgeId + 1];
for ( tempID = indexInArray + 1; tempID < (int)offsetPSecond; tempID++ )
{ //add second part edge
pxCors[indexInCors] = pSecondPartEdgeX_[tempID];
pyCors[indexInCors++]= pSecondPartEdgeY_[tempID];
pyCors[indexInCors++] = pSecondPartEdgeY_[tempID];
}
}
psId[numOfEdges] = indexInCors;//the end index of the last edge
psId[numOfEdges] = indexInCors; //the end index of the last edge
edgeChains.numOfEdges = numOfEdges;
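/* Sketch of how the assembled chains can be traversed (illustrative only; processPixel is
 * a hypothetical callback). edgeChains stores all edge pixels contiguously and edge i
 * occupies indices sId[i] .. sId[i+1]-1 of xCors/yCors, a CSR-like layout:
 *
 *   for ( unsigned int i = 0; i < edgeChains.numOfEdges; i++ )
 *     for ( unsigned int k = edgeChains.sId[i]; k < edgeChains.sId[i + 1]; k++ )
 *       processPixel( edgeChains.xCors[k], edgeChains.yCors[k] );
 */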
//#ifdef DEBUGEdgeDrawing
// cout<<"Edge Drawing, numofedge = "<<numOfEdges <<endl;
// /*Show the extracted edge cvImage in color. Each chain is in different color.*/
// IplImage* cvColorImg = cvCreateImage(cvSize(imageWidth,imageHeight),IPL_DEPTH_8U, 3);
// cvSet(cvColorImg, cvScalar(0,0,0));
// CvScalar s;
// srand((unsigned)time(0));
//
// int lowest=100, highest=255;
// int range=(highest-lowest)+1;
// int r, g, b; //the color of lines
// for(unsigned int i=0; i<edgeChains.numOfEdges; i++){
// r = lowest+int(rand()%range);
// g = lowest+int(rand()%range);
// b = lowest+int(rand()%range);
// s.val[0] = r; s.val[1] = g; s.val[2] = b;
// for(indexInCors = psId[i]; indexInCors<psId[i+1]; indexInCors++){
// cvSet2D(cvColorImg,pyCors[indexInCors],pxCors[indexInCors],s);
// }
// }
//
// cvNamedWindow("EdgeColorImage", CV_WINDOW_AUTOSIZE);
// cvShowImage("EdgeColorImage", cvColorImg);
// cvWaitKey(0);
// cvReleaseImage(&cvColorImg);
//#endif
return 1;
}
int EDLineDetector::EDline(cv::Mat &image, LineChains &lines, bool smoothed)
int EDLineDetector::EDline( cv::Mat &image, LineChains &lines, bool smoothed )
{
//first, call EdgeDrawing function to extract edges
EdgeChains edges;
if((EdgeDrawing(image, edges, smoothed))!=true){
cout<<"Line Detection not finished"<<endl;
if( ( EdgeDrawing( image, edges, smoothed ) ) != true )
{
cout << "Line Detection not finished" << endl;
return -1;
}
// bValidate_ =false;
#ifdef DEBUGEDLine
cv::imshow("EdgeDrawing", image);
cv::waitKey();
......@@ -814,9 +963,9 @@ int EDLineDetector::EDline(cv::Mat &image, LineChains &lines, bool smoothed)
#endif
//detect lines
unsigned int linePixelID = edges.sId[edges.numOfEdges];
lines.xCors.resize(linePixelID);
lines.yCors.resize(linePixelID);
lines.sId.resize(5*edges.numOfEdges);
lines.xCors.resize( linePixelID );
lines.yCors.resize( linePixelID );
lines.sId.resize( 5 * edges.numOfEdges );
unsigned int *pEdgeXCors = edges.xCors.data();
unsigned int *pEdgeYCors = edges.yCors.data();
unsigned int *pEdgeSID = edges.sId.data();
......@@ -824,192 +973,218 @@ int EDLineDetector::EDline(cv::Mat &image, LineChains &lines, bool smoothed)
unsigned int *pLineYCors = lines.yCors.data();
unsigned int *pLineSID = lines.sId.data();
logNT_ = 2.0 * ( log10( (double) imageWidth ) + log10( (double) imageHeight ) );
double lineFitErr;//the line fit error;
std::vector<double> lineEquation(2, 0);
//lineEquation.reserve(2);
double lineFitErr; //the line fit error;
std::vector<double> lineEquation( 2, 0 );
lineEquations_.clear();
lineEndpoints_.clear();
lineDirection_.clear();
unsigned char *pdirImg = dirImg_.data;
unsigned int numOfLines = 0;
unsigned int offsetInEdgeArrayS, offsetInEdgeArrayE, newOffsetS;//start index and end index
unsigned int offsetInLineArray=0;
float direction;//line direction
unsigned int offsetInEdgeArrayS, offsetInEdgeArrayE, newOffsetS; //start index and end index
unsigned int offsetInLineArray = 0;
float direction; //line direction
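/* Reference note for the endpoint computation used further below (illustrative): the
 * best-fit line is stored normalized as w1*x + w2*y + w3 = 0 with w1^2 + w2^2 = 1.
 * Orthogonally projecting a chain endpoint (x0, y0) onto that line gives
 *   xp = w2*w2*x0 - w1*w2*y0 - w3*w1
 *   yp = w1*w1*y0 - w1*w2*x0 - w3*w2
 * (the general formulas divided by w1^2 + w2^2, which is 1 here); these are exactly the
 * a1..a5 terms combined when lineEndP is filled. */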
for(unsigned int edgeID=0; edgeID<edges.numOfEdges; edgeID++){
for ( unsigned int edgeID = 0; edgeID < edges.numOfEdges; edgeID++ )
{
offsetInEdgeArrayS = pEdgeSID[edgeID];
offsetInEdgeArrayE = pEdgeSID[edgeID+1];
while(offsetInEdgeArrayE > offsetInEdgeArrayS+minLineLen_){//extract line segments from an edge, may find more than one segments
offsetInEdgeArrayE = pEdgeSID[edgeID + 1];
while ( offsetInEdgeArrayE > offsetInEdgeArrayS + minLineLen_ )
{ //extract line segments from an edge; may find more than one segment
//find an initial line segment
while(offsetInEdgeArrayE > offsetInEdgeArrayS+minLineLen_){
lineFitErr = LeastSquaresLineFit_(pEdgeXCors,pEdgeYCors,
offsetInEdgeArrayS,lineEquation);
if(lineFitErr<=lineFitErrThreshold_) break;//ok, an initial line segment detected
offsetInEdgeArrayS +=SkipEdgePoint; //skip the first two pixel in the chain and try with the remaining pixels
}
if(lineFitErr>lineFitErrThreshold_) break; //no line is detected
while ( offsetInEdgeArrayE > offsetInEdgeArrayS + minLineLen_ )
{
lineFitErr = LeastSquaresLineFit_( pEdgeXCors, pEdgeYCors, offsetInEdgeArrayS, lineEquation );
if( lineFitErr <= lineFitErrThreshold_ )
break; //ok, an initial line segment detected
offsetInEdgeArrayS += SkipEdgePoint; //skip the first two pixels in the chain and try again with the remaining pixels
}
if( lineFitErr > lineFitErrThreshold_ )
break; //no line is detected
//An initial line segment is detected. Try to extend this line segment
pLineSID[numOfLines] = offsetInLineArray;
double coef1;//for a line ax+by+c=0, coef1 = 1/sqrt(a^2+b^2);
double pointToLineDis;//for a line ax+by+c=0 and a point(xi, yi), pointToLineDis = coef1*|a*xi+b*yi+c|
bool bExtended=true;
double coef1; //for a line ax+by+c=0, coef1 = 1/sqrt(a^2+b^2);
double pointToLineDis; //for a line ax+by+c=0 and a point(xi, yi), pointToLineDis = coef1*|a*xi+b*yi+c|
bool bExtended = true;
bool bFirstTry = true;
int numOfOutlier;//to against noise, we accept a few outlier of a line.
int numOfOutlier; //to be robust against noise, we accept a few outliers on a line.
int tryTimes = 0;
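/* Note (illustrative): in the horizontal branch the segment is fitted as y = a*x + b,
 * i.e. a*x - y + b = 0, so the distance of an edge pixel (xi, yi) from the fitted line is
 * |a*xi - yi + b| / sqrt(a*a + 1); coef1 caches 1/sqrt(a*a + 1) and pointToLineDis below
 * is exactly that distance. The vertical branch fits x = a*y + b and proceeds symmetrically. */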
if(pdirImg[pEdgeYCors[offsetInEdgeArrayS]*imageWidth + pEdgeXCors[offsetInEdgeArrayS]]==Horizontal){//y=ax+b, i.e. ax-y+b=0
while(bExtended){
if( pdirImg[pEdgeYCors[offsetInEdgeArrayS] * imageWidth + pEdgeXCors[offsetInEdgeArrayS]] == Horizontal )
{ //y=ax+b, i.e. ax-y+b=0
while ( bExtended )
{
tryTimes++;
if(bFirstTry){
if( bFirstTry )
{
bFirstTry = false;
for(int i=0; i<minLineLen_; i++){//First add the initial line segment to the line array
for ( int i = 0; i < minLineLen_; i++ )
{ //First add the initial line segment to the line array
pLineXCors[offsetInLineArray] = pEdgeXCors[offsetInEdgeArrayS];
pLineYCors[offsetInLineArray++] = pEdgeYCors[offsetInEdgeArrayS++];
}
}else{//after each try, line is extended, line equation should be re-estimated
}
else
{ //after each try, line is extended, line equation should be re-estimated
//adjust the line equation
lineFitErr = LeastSquaresLineFit_(pLineXCors,pLineYCors,pLineSID[numOfLines],
newOffsetS,offsetInLineArray,lineEquation);
lineFitErr = LeastSquaresLineFit_( pLineXCors, pLineYCors, pLineSID[numOfLines], newOffsetS, offsetInLineArray, lineEquation );
}
coef1 = 1/sqrt(lineEquation[0]*lineEquation[0]+1);
numOfOutlier=0;
coef1 = 1 / sqrt( lineEquation[0] * lineEquation[0] + 1 );
numOfOutlier = 0;
newOffsetS = offsetInLineArray;
while(offsetInEdgeArrayE > offsetInEdgeArrayS){
pointToLineDis = fabs(lineEquation[0]*pEdgeXCors[offsetInEdgeArrayS] -
pEdgeYCors[offsetInEdgeArrayS] + lineEquation[1])*coef1;
while ( offsetInEdgeArrayE > offsetInEdgeArrayS )
{
pointToLineDis = fabs( lineEquation[0] * pEdgeXCors[offsetInEdgeArrayS] - pEdgeYCors[offsetInEdgeArrayS] + lineEquation[1] ) * coef1;
pLineXCors[offsetInLineArray] = pEdgeXCors[offsetInEdgeArrayS];
pLineYCors[offsetInLineArray++] = pEdgeYCors[offsetInEdgeArrayS++];
if(pointToLineDis>lineFitErrThreshold_){
if( pointToLineDis > lineFitErrThreshold_ )
{
numOfOutlier++;
if(numOfOutlier>3) break;
}else{//we count number of connective outliers.
numOfOutlier=0;
if( numOfOutlier > 3 )
break;
}
else
{ //we count the number of consecutive outliers.
numOfOutlier = 0;
}
}
//pop back the last few outliers from lines and return them to edge chain
offsetInLineArray -=numOfOutlier;
offsetInEdgeArrayS -=numOfOutlier;
if(offsetInLineArray - newOffsetS>0&&tryTimes<TryTime){//some new pixels are added to the line
}else{
bExtended = false;//no new pixels are added.
offsetInLineArray -= numOfOutlier;
offsetInEdgeArrayS -= numOfOutlier;
if( offsetInLineArray - newOffsetS > 0 && tryTimes < TryTime )
{ //some new pixels were added to the line; keep extending
}
else
{
bExtended = false; //no new pixels are added.
}
}
//the line equation coefficients: for line w1*x + w2*y + w3 = 0, we normalize so that w1^2 + w2^2 = 1.
std::vector<double> lineEqu(3,0);
//lineEqu.reserve(3);
lineEqu[0] = lineEquation[0]*coef1;
lineEqu[1] = -1*coef1;
lineEqu[2] = lineEquation[1]*coef1;
if(LineValidation_(pLineXCors,pLineYCors,pLineSID[numOfLines],offsetInLineArray,lineEqu,direction)){//check the line
std::vector<double> lineEqu( 3, 0 );
lineEqu[0] = lineEquation[0] * coef1;
lineEqu[1] = -1 * coef1;
lineEqu[2] = lineEquation[1] * coef1;
if( LineValidation_( pLineXCors, pLineYCors, pLineSID[numOfLines], offsetInLineArray, lineEqu, direction ) )
{ //check the line
//store the line equation coefficients
lineEquations_.push_back(lineEqu);
lineEquations_.push_back( lineEqu );
/*At last, compute the line endpoints and store them.
*we project the first and last pixels in the pixelChain onto the best fit line
*to get the line endpoints.
*xp= (w2^2*x0-w1*w2*y0-w3*w1)/(w1^2+w2^2)
*yp= (w1^2*y0-w1*w2*x0-w3*w2)/(w1^2+w2^2) */
std::vector<float> lineEndP(4, 0);//line endpoints
//lineEndP.resize(4);
double a1 = lineEqu[1]*lineEqu[1];
double a2 = lineEqu[0]*lineEqu[0];
double a3 = lineEqu[0]*lineEqu[1];
double a4 = lineEqu[2]*lineEqu[0];
double a5 = lineEqu[2]*lineEqu[1];
unsigned int Px = pLineXCors[pLineSID[numOfLines] ];//first pixel
unsigned int Py = pLineYCors[pLineSID[numOfLines] ];
lineEndP[0] = a1*Px-a3*Py-a4;//x
lineEndP[1] = a2*Py-a3*Px-a5;//y
Px = pLineXCors[offsetInLineArray -1 ];//last pixel
Py = pLineYCors[offsetInLineArray -1 ];
lineEndP[2] = a1*Px-a3*Py-a4;//x
lineEndP[3] = a2*Py-a3*Px-a5;//y
lineEndpoints_.push_back(lineEndP);
lineDirection_.push_back(direction);
std::vector<float> lineEndP( 4, 0 ); //line endpoints
double a1 = lineEqu[1] * lineEqu[1];
double a2 = lineEqu[0] * lineEqu[0];
double a3 = lineEqu[0] * lineEqu[1];
double a4 = lineEqu[2] * lineEqu[0];
double a5 = lineEqu[2] * lineEqu[1];
unsigned int Px = pLineXCors[pLineSID[numOfLines]]; //first pixel
unsigned int Py = pLineYCors[pLineSID[numOfLines]];
lineEndP[0] = a1 * Px - a3 * Py - a4; //x
lineEndP[1] = a2 * Py - a3 * Px - a5; //y
Px = pLineXCors[offsetInLineArray - 1]; //last pixel
Py = pLineYCors[offsetInLineArray - 1];
lineEndP[2] = a1 * Px - a3 * Py - a4; //x
lineEndP[3] = a2 * Py - a3 * Px - a5; //y
lineEndpoints_.push_back( lineEndP );
lineDirection_.push_back( direction );
numOfLines++;
}
else
{
offsetInLineArray = pLineSID[numOfLines]; // line was not accepted, the offset is set back
}
}
else
{ //x=ay+b, i.e. x-ay-b=0
while ( bExtended )
{
tryTimes++;
if( bFirstTry )
{
bFirstTry = false;
for ( int i = 0; i < minLineLen_; i++ )
{ //First add the initial line segment to the line array
pLineXCors[offsetInLineArray] = pEdgeXCors[offsetInEdgeArrayS];
pLineYCors[offsetInLineArray++] = pEdgeYCors[offsetInEdgeArrayS++];
}
}
else
{ //after each try, line is extended, line equation should be re-estimated
//adjust the line equation
lineFitErr = LeastSquaresLineFit_( pLineXCors, pLineYCors, pLineSID[numOfLines], newOffsetS, offsetInLineArray, lineEquation );
}
coef1 = 1 / sqrt( 1 + lineEquation[0] * lineEquation[0] );
numOfOutlier = 0;
newOffsetS = offsetInLineArray;
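//for the model x = a*y + b, the distance from a point (x, y) to the line is |x - a*y - b| / sqrt(1 + a^2);
//coef1 caches the factor 1 / sqrt(1 + a^2) computed above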
while ( offsetInEdgeArrayE > offsetInEdgeArrayS )
{
pointToLineDis = fabs( pEdgeXCors[offsetInEdgeArrayS] - lineEquation[0] * pEdgeYCors[offsetInEdgeArrayS] - lineEquation[1] ) * coef1;
pLineXCors[offsetInLineArray] = pEdgeXCors[offsetInEdgeArrayS];
pLineYCors[offsetInLineArray++] = pEdgeYCors[offsetInEdgeArrayS++];
if( pointToLineDis > lineFitErrThreshold_ )
{
numOfOutlier++;
if( numOfOutlier > 3 )
break;
}
else
{ //we count the number of consecutive outliers
numOfOutlier = 0;
}
}
//pop the last few outliers back off the line and return them to the edge chain
offsetInLineArray -= numOfOutlier;
offsetInEdgeArrayS -= numOfOutlier;
if( offsetInLineArray - newOffsetS > 0 && tryTimes < TryTime )
{ //some new pixels are added to the line
}
else
{
bExtended = false; //no new pixels are added.
}
}
//the line equation coefficients: for a line w1*x + w2*y + w3 = 0, we normalize them so that w1^2 + w2^2 = 1
std::vector<double> lineEqu( 3, 0 );
lineEqu[0] = 1 * coef1;
lineEqu[1] = -lineEquation[0] * coef1;
lineEqu[2] = -lineEquation[1] * coef1;
if( LineValidation_( pLineXCors, pLineYCors, pLineSID[numOfLines], offsetInLineArray, lineEqu, direction ) )
{ //check the line
//store the line equation coefficients
lineEquations_.push_back( lineEqu );
/*Finally, compute the line endpoints and store them.
*We project the first and last pixels of the pixel chain onto the best-fit line
*to get the line endpoints:
*xp = (w2^2*x0 - w1*w2*y0 - w3*w1)/(w1^2+w2^2)
*yp = (w1^2*y0 - w1*w2*x0 - w3*w2)/(w1^2+w2^2) */
std::vector<float> lineEndP( 4, 0 ); //line endpoints
double a1 = lineEqu[1] * lineEqu[1];
double a2 = lineEqu[0] * lineEqu[0];
double a3 = lineEqu[0] * lineEqu[1];
double a4 = lineEqu[2] * lineEqu[0];
double a5 = lineEqu[2] * lineEqu[1];
unsigned int Px = pLineXCors[pLineSID[numOfLines]]; //first pixel
unsigned int Py = pLineYCors[pLineSID[numOfLines]];
lineEndP[0] = a1 * Px - a3 * Py - a4; //x
lineEndP[1] = a2 * Py - a3 * Px - a5; //y
Px = pLineXCors[offsetInLineArray - 1]; //last pixel
Py = pLineYCors[offsetInLineArray - 1];
lineEndP[2] = a1 * Px - a3 * Py - a4; //x
lineEndP[3] = a2 * Py - a3 * Px - a5; //y
lineEndpoints_.push_back( lineEndP );
lineDirection_.push_back( direction );
numOfLines++;
}
else
{
offsetInLineArray = pLineSID[numOfLines]; // line was not accepted, the offset is set back
}
}
//Extract line segments from the remaining pixels; the current chain has already been shortened.
}
} //end for(unsigned int edgeID=0; edgeID<edges.numOfEdges; edgeID++)
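//store the final offset as a sentinel: line i occupies the index range [pLineSID[i], pLineSID[i+1])
//in the shared coordinate arrays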
pLineSID[numOfLines] = offsetInLineArray;
lines.numOfLines = numOfLines;
#ifdef DEBUGEDLine
@@ -1046,14 +1221,9 @@ int EDLineDetector::EDline(cv::Mat &image, LineChains &lines, bool smoothed)
return 1;
}
double EDLineDetector::LeastSquaresLineFit_( unsigned int *xCors, unsigned int *yCors, unsigned int offsetS, std::vector<double> &lineEquation )
{
float * pMatT;
float * pATA;
double fitError = 0;
@@ -1062,246 +1232,261 @@ double EDLineDetector::LeastSquaresLineFit_( unsigned int *xCors, unsigned in
unsigned int offset = offsetS;
/*If the first pixel in this chain is horizontal,
*then we try to find a horizontal line, y=ax+b;*/
if( pdirImg[yCors[offsetS] * imageWidth + xCors[offsetS]] == Horizontal )
{
/*Build the system,and solve it using least square regression: mat * [a,b]^T = vec
* [x0,1] [y0]
* [x1,1] [a] [y1]
* . [b] = .
* [xn,1] [yn]*/
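//fitMatT is 2 x minLineLen_; its constant second row of ones is already in place (see the
//commented-out assignment below), so only the first row, the pixel coordinates, is refreshed here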
pMatT = fitMatT.ptr<float>(); //fitMatT = [x0, x1, ... xn; 1,1,...,1];
for ( int i = 0; i < minLineLen_; i++ )
{
//*(pMatT+minLineLen_) = 1; //the values are not changed;
* ( pMatT++ ) = xCors[offsetS];
fitVec[0][i] = yCors[offsetS++];
}
ATA = fitMatT * fitMatT.t();
ATV = fitMatT * fitVec.t();
/* [a,b]^T = Inv(mat^T * mat) * mat^T * vec */
pATA = ATA.ptr<float>();
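//ATA is a 2x2 matrix, so it is inverted in closed form: coef is 1/det(ATA), and the two
//assignments below apply adj(ATA)/det(ATA) to the right-hand side ATV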
coef = 1.0 / ( double( pATA[0] ) * double( pATA[3] ) - double( pATA[1] ) * double( pATA[2] ) );
// lineEquation = svd.Invert(ATA) * matT * vec;
lineEquation[0] = coef * ( double( pATA[3] ) * double( ATV[0][0] ) - double( pATA[1] ) * double( ATV[0][1] ) );
lineEquation[1] = coef * ( double( pATA[0] ) * double( ATV[0][1] ) - double( pATA[2] ) * double( ATV[0][0] ) );
/*compute line fit error */
for ( int i = 0; i < minLineLen_; i++ )
{
//coef = double(yCors[offset]) - double(xCors[offset++]) * lineEquation[0] - lineEquation[1];
coef = double( yCors[offset] ) - double( xCors[offset] ) * lineEquation[0] - lineEquation[1];
offset++;
fitError += coef * coef;
}
return sqrt( fitError );
}
/*If the first pixel in this chain is vertical,
*then we try to find a vertical line, x=ay+b;*/
if( pdirImg[yCors[offsetS] * imageWidth + xCors[offsetS]] == Vertical )
{
/*Build the system,and solve it using least square regression: mat * [a,b]^T = vec
* [y0,1] [x0]
* [y1,1] [a] [x1]
* . [b] = .
* [yn,1] [xn]*/
pMatT = fitMatT.ptr<float>(); //fitMatT = [y0, y1, ... yn; 1,1,...,1];
for ( int i = 0; i < minLineLen_; i++ )
{
//*(pMatT+minLineLen_) = 1;//the values are not changed;
* ( pMatT++ ) = yCors[offsetS];
fitVec[0][i] = xCors[offsetS++];
}
ATA = fitMatT * ( fitMatT.t() );
ATV = fitMatT * fitVec.t();
/* [a,b]^T = Inv(mat^T * mat) * mat^T * vec */
pATA = ATA.ptr<float>();
coef = 1.0 / ( double( pATA[0] ) * double( pATA[3] ) - double( pATA[1] ) * double( pATA[2] ) );
// lineEquation = svd.Invert(ATA) * matT * vec;
lineEquation[0] = coef * ( double( pATA[3] ) * double( ATV[0][0] ) - double( pATA[1] ) * double( ATV[0][1] ) );
lineEquation[1] = coef * ( double( pATA[0] ) * double( ATV[0][1] ) - double( pATA[2] ) * double( ATV[0][0] ) );
/*compute line fit error */
for ( int i = 0; i < minLineLen_; i++ )
{
//coef = double(xCors[offset]) - double(yCors[offset++]) * lineEquation[0] - lineEquation[1];
coef = double( xCors[offset] ) - double( yCors[offset] ) * lineEquation[0] - lineEquation[1];
offset++;
fitError += coef * coef;
}
return sqrt( fitError );
}
return 0;
}
double EDLineDetector::LeastSquaresLineFit_( unsigned int *xCors, unsigned int *yCors, unsigned int offsetS, unsigned int newOffsetS,
unsigned int offsetE, std::vector<double> &lineEquation )
{
int length = offsetE - offsetS;
int newLength = offsetE - newOffsetS;
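//[offsetS, offsetE) is the full pixel range of the line fitted so far, while [newOffsetS, offsetE)
//holds only the pixels added since the previous fit; the normal equations accumulated in ATA and ATV
//are updated with just those new pixels below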
if( length <= 0 || newLength <= 0 )
{
cout << "EDLineDetector::LeastSquaresLineFit_ Error:"
" the expected line index is wrong...offsetE = "
<< offsetE << ", offsetS=" << offsetS << ", newOffsetS=" << newOffsetS << endl;
return -1;
}
if( lineEquation.size() != 2 )
{
std::cout << "SHOULD NOT BE != 2" << std::endl;
}
cv::Mat_<float> matT( 2, newLength );
cv::Mat_<float> vec( newLength, 1 );
float * pMatT;
float * pATA;
// double fitError = 0;
double coef;
unsigned char *pdirImg = dirImg_.data;
/*If the first pixel in this chain is horizontal,
*then we try to find a horizontal line, y=ax+b;*/
if( pdirImg[yCors[offsetS] * imageWidth + xCors[offsetS]] == Horizontal )
{
/*Build the new system,and solve it using least square regression: mat * [a,b]^T = vec
* [x0',1] [y0']
* [x1',1] [a] [y1']
* . [b] = .
* [xn',1] [yn']*/
pMatT = matT.ptr<float>(); //matT = [x0', x1', ... xn'; 1,1,...,1]
for ( int i = 0; i < newLength; i++ )
{
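//matT is stored row-major as a 2 x newLength matrix: pMatT walks the first row (the coordinates),
//while pMatT + newLength fills the matching entry of the constant-one second row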
* ( pMatT + newLength ) = 1;
* ( pMatT++ ) = xCors[newOffsetS];
vec[0][i] = yCors[newOffsetS++];
}
/* [a,b]^T = Inv(ATA + mat^T * mat) * (ATV + mat^T * vec) */
tempMatLineFit = matT * matT.t();
tempVecLineFit = matT * vec;
ATA = ATA + tempMatLineFit;
ATV = ATV + tempVecLineFit;
pATA = ATA.ptr<float>();
coef = 1.0 / ( double( pATA[0] ) * double( pATA[3] ) - double( pATA[1] ) * double( pATA[2] ) );
lineEquation[0] = coef * ( double( pATA[3] ) * double( ATV[0][0] ) - double( pATA[1] ) * double( ATV[0][1] ) );
lineEquation[1] = coef * ( double( pATA[0] ) * double( ATV[0][1] ) - double( pATA[2] ) * double( ATV[0][0] ) );
return 0;
}
/*If the first pixel in this chain is vertical,
*then we try to find a vertical line, x=ay+b;*/
if( pdirImg[yCors[offsetS] * imageWidth + xCors[offsetS]] == Vertical )
{
/*Build the system,and solve it using least square regression: mat * [a,b]^T = vec
* [y0',1] [x0']
* [y1',1] [a] [x1']
* . [b] = .
* [yn',1] [xn']*/
// pMatT = matT.GetData();//matT = [y0', y1', ... yn'; 1,1,...,1]
pMatT = matT.ptr<float>(); //matT = [y0', y1', ... yn'; 1,1,...,1]
for ( int i = 0; i < newLength; i++ )
{
* ( pMatT + newLength ) = 1;
* ( pMatT++ ) = yCors[newOffsetS];
vec[0][i] = xCors[newOffsetS++];
}
/* [a,b]^T = Inv(ATA + mat^T * mat) * (ATV + mat^T * vec) */
// matT.MultiplyWithTransposeOf(matT, tempMatLineFit);
tempMatLineFit = matT * matT.t();
tempVecLineFit = matT * vec;
ATA = ATA + tempMatLineFit;
ATV = ATV + tempVecLineFit;
// pATA = ATA.GetData();
pATA = ATA.ptr<float>();
coef = 1.0 / ( double( pATA[0] ) * double( pATA[3] ) - double( pATA[1] ) * double( pATA[2] ) );
lineEquation[0] = coef * ( double( pATA[3] ) * double( ATV[0][0] ) - double( pATA[1] ) * double( ATV[0][1] ) );
lineEquation[1] = coef * ( double( pATA[0] ) * double( ATV[0][1] ) - double( pATA[2] ) * double( ATV[0][0] ) );
}
return 0;
}
bool EDLineDetector::LineValidation_( unsigned int *xCors, unsigned int *yCors, unsigned int offsetS, unsigned int offsetE,
std::vector<double> &lineEquation, float &direction )
{
if( bValidate_ )
{
int n = offsetE - offsetS;
/*first compute the direction of the line, making sure that the dark side always lies
*on the left side of the line.*/
int meanGradientX = 0, meanGradientY = 0;
short *pdxImg = dxImg_.ptr<short>();
short *pdyImg = dyImg_.ptr<short>();
double dx, dy;
std::vector<double> pointDirection;
int index;
for ( int i = 0; i < n; i++ )
{
index = yCors[offsetS] * imageWidth + xCors[offsetS];
offsetS++;
meanGradientX += pdxImg[index];
meanGradientY += pdyImg[index];
dx = (double) pdxImg[index];
dy = (double) pdyImg[index];
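//the level-line angle at this pixel: the gradient (dx, dy) rotated clockwise by 90 degrees, i.e. atan2(-dx, dy)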
pointDirection.push_back( atan2( -dx, dy ) );
}
dx = fabs( lineEquation[1] );
dy = fabs( lineEquation[0] );
if( meanGradientX == 0 && meanGradientY == 0 )
{ //not possible; if it happens, it must be a wrong line
return false;
}
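//choose the line direction perpendicular to the mean gradient so that, as stated above,
//the darker side of the edge lies to the left of the direction vector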
if( meanGradientX > 0 && meanGradientY >= 0 )
{ //first quadrant, and positive direction of X axis.
direction = atan2( -dy, dx ); //line direction is in fourth quadrant
}
if( meanGradientX <= 0 && meanGradientY > 0 )
{ //second quadrant, and positive direction of Y axis.
direction = atan2( dy, dx ); //line direction is in first quadrant
}
if( meanGradientX < 0 && meanGradientY <= 0 )
{ //third quadrant, and negative direction of X axis.
direction = atan2( dy, -dx ); //line direction is in second quadrant
}
if( meanGradientX >= 0 && meanGradientY < 0 )
{ //fourth quadrant, and negative direction of Y axis.
direction = atan2( -dy, -dx ); //line direction is in third quadrant
}
/*then check whether the line is on the border of the image. We don't keep the border line.*/
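//for a border-parallel line w1*x + w2*y + w3 = 0 with its dominant coefficient close to 1,
//|lineEquation[2]| roughly equals the line's distance from the top (or left) border, so the
//checks below drop lines within about 10 pixels of an image border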
if( fabs( direction ) < 0.15 || M_PI - fabs( direction ) < 0.15 )
{ //Horizontal line
if( fabs( lineEquation[2] ) < 10 || fabs( imageHeight - fabs( lineEquation[2] ) ) < 10 )
{ //upper border or lower border
return false;
}
}
if( fabs( fabs( direction ) - M_PI * 0.5 ) < 0.15 )
{ //Vertical line
if( fabs( lineEquation[2] ) < 10 || fabs( imageWidth - fabs( lineEquation[2] ) ) < 10 )
{ //left border or right border
return false;
}
}
//count the aligned points on the line which have the same direction as the line.
double disDirection;
int k = 0;
for ( int i = 0; i < n; i++ )
{
disDirection = fabs( direction - pointDirection[i] );
if( fabs( 2 * M_PI - disDirection ) < 0.392699 || disDirection < 0.392699 )
{ //same direction, pi/8 = 0.392699081698724
k++;
}
}
//now compute NFA(Number of False Alarms)
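//a contrario validation in the spirit of LSD: n is the number of pixels on the line, k the number
//whose gradient direction agrees with the line direction within pi/8 (hence the precision
//0.125 = (pi/4)/(2*pi)), and logNT_ is the precomputed logarithm of the number of tests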
double ret = nfa( n, k, 0.125, logNT_ );
return ( ret > 0 ); //0 corresponds to 1 mean false alarm
}
else
{
return true;
}
}
int EDLineDetector::EDline( cv::Mat &image, bool smoothed )
{
if( ( EDline( image, lines_, smoothed ) ) != true )
{
return -1;
}
lineSalience_.clear();
lineSalience_.resize( lines_.numOfLines );
unsigned char *pgImg = gImgWO_.ptr();
unsigned int indexInLineArray;
unsigned int *pXCor = lines_.xCors.data();
unsigned int *pYCor = lines_.yCors.data();
unsigned int *pSID = lines_.sId.data();
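//salience of line i: accumulate the per-pixel response stored in gImgWO_ over every pixel of the
//line, i.e. over the index range [pSID[i], pSID[i+1])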
for ( unsigned int i = 0; i < lineSalience_.size(); i++ )
{
int salience = 0;
for ( indexInLineArray = pSID[i]; indexInLineArray < pSID[i + 1]; indexInLineArray++ )
{
salience += pgImg[pYCor[indexInLineArray] * imageWidth + pXCor[indexInLineArray]];
}
lineSalience_[i] = (float) salience;
}
return 1;
}
@@ -46,12 +46,14 @@ namespace cv
CV_INIT_ALGORITHM( BinaryDescriptor, "BINARY.DESCRIPTOR", );
CV_INIT_ALGORITHM( BinaryDescriptorMatcher, "BINARY.DESCRIPTOR.MATCHER", );
CV_INIT_ALGORITHM( LSDDetector, "LSDDETECTOR", );
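/* The CV_INIT_ALGORITHM macros above register the module's classes with OpenCV's Algorithm
 * machinery; initModule_line_descriptor() simply checks that each registration produced a
 * non-empty algorithm name. */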
bool initModule_line_descriptor( void )
{
bool all = true;
all &= !BinaryDescriptor_info_auto.name().empty();
all &= !BinaryDescriptorMatcher_info_auto.name().empty();
all &= !LSDDetector_info_auto.name().empty();
return all;
}
@@ -58,8 +58,6 @@
#include <algorithm>
#include <bitset>
#include "opencv2/line_descriptor.hpp"
#endif