Commit 46b2cb24 authored by Kirill Kornyakov

Merge pull request #1 from lenlen/tracking_api

Added initial version of the tracking module
parents e3aa8bf9 9e30b50d
set(the_description "Tracking API")
ocv_define_module(tracking opencv_imgproc)
Common Interfaces of Tracker
============================
.. highlight:: cpp
Tracker : Algorithm
-------------------
.. ocv:class:: Tracker
Base abstract class for the long-term tracker::
class CV_EXPORTS_W Tracker : public virtual Algorithm
{
virtual ~Tracker();
bool init( const Mat& image, const Rect& boundingBox );
bool update( const Mat& image, Rect& boundingBox );
static Ptr<Tracker> create( const String& trackerType );
};
Tracker::init
-------------
Initialize the tracker with a known bounding box that surrounds the target
.. ocv:function:: bool Tracker::init( const Mat& image, const Rect& boundingBox )
:param image: The initial frame
:param boundingBox: The initial bounding box
Tracker::update
---------------
Update the tracker and find the new most likely bounding box for the target
.. ocv:function:: bool Tracker::update( const Mat& image, Rect& boundingBox )
:param image: The current frame
:param boundingBox: The bounding box that represents the new target location
Tracker::create
---------------
Creates a tracker by its name.
.. ocv:function:: static Ptr<Tracker> Tracker::create( const String& trackerType )
:param trackerType: Tracker type
The following tracker types are supported:
* ``"MIL"`` -- :ocv:class:`TrackerMIL`
* ``"BOOSTING"`` -- :ocv:class:`TrackerBoosting`
Creating Own Tracker
--------------------
If you want to create a new tracker, you should follow some simple rules.
First, your tracker should inherit from :ocv:class:`Tracker`, so you must implement two methods:
* Tracker::initImpl, it is called once in the first frame; here you should initialize all structures. The second argument is the initial bounding box of the target.
* Tracker::updateImpl, it is called at the beginning of the loop through video frames. Here you should overwrite the bounding box with the new location.
Example of creating specialized Tracker ``TrackerMIL`` : ::
class CV_EXPORTS_W TrackerMIL : public Tracker
{
public:
TrackerMIL( const TrackerMIL::Params &parameters = TrackerMIL::Params() );
virtual ~TrackerMIL();
...
protected:
bool initImpl( const Mat& image, const Rect& boundingBox );
bool updateImpl( const Mat& image, Rect& boundingBox );
...
};
Every tracker has three components: :ocv:class:`TrackerSampler`, :ocv:class:`TrackerFeatureSet` and :ocv:class:`TrackerModel`.
The first two are instantiated by the Tracker base class, while the last component is abstract, so you must implement your own TrackerModel.
Finally, register your tracker in the file tracking_init.cpp
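The subsections below describe each component separately; as an orientation, a hypothetical initImpl could wire the three components together roughly as follows (``MyTracker`` and ``MyTrackerModel`` are placeholder names, parameter values are omitted) : ::
bool MyTracker::initImpl( const Mat& image, const Rect& boundingBox )
{
//sampler and featureSet are created by the Tracker base class, here they are only configured
Ptr<TrackerSamplerAlgorithm> CSCSampler = new TrackerSamplerCSC( CSCparameters );
if( !sampler->addTrackerSamplerAlgorithm( CSCSampler ) )
return false;
Ptr<TrackerFeature> trackerFeature = new TrackerFeatureHAAR( HAARparameters );
featureSet->addTrackerFeature( trackerFeature );
//model is abstract: instantiate your own TrackerModel implementation
model = new MyTrackerModel( boundingBox );
return true;
}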
TrackerSampler
..............
The TrackerSampler is already instantiated, but you should define the sampling algorithm and add the classes (or a single class) to the TrackerSampler.
You can choose one of the ready implementations, such as TrackerSamplerCSC, or you can implement your own sampling method; in this case
the class must inherit from :ocv:class:`TrackerSamplerAlgorithm`. Fill the samplingImpl method, which writes the result into the "sample" output argument.
Example of creating specialized TrackerSamplerAlgorithm ``TrackerSamplerCSC`` : ::
class CV_EXPORTS_W TrackerSamplerCSC : public TrackerSamplerAlgorithm
{
public:
TrackerSamplerCSC( const TrackerSamplerCSC::Params &parameters = TrackerSamplerCSC::Params() );
~TrackerSamplerCSC();
...
protected:
bool samplingImpl( const Mat& image, Rect boundingBox, std::vector<Mat>& sample );
...
};
Example of adding TrackerSamplerAlgorithm to TrackerSampler : ::
//sampler is the TrackerSampler
Ptr<TrackerSamplerAlgorithm> CSCSampler = new TrackerSamplerCSC( CSCparameters );
if( !sampler->addTrackerSamplerAlgorithm( CSCSampler ) )
return false;
//or add CSC sampler with default parameters
//sampler->addTrackerSamplerAlgorithm( "CSC" );
.. seealso::
:ocv:class:`TrackerSamplerCSC`, :ocv:class:`TrackerSamplerAlgorithm`
TrackerFeatureSet
.................
The TrackerFeatureSet is already instantiated (it is created first), but you should define what kinds of features you will use in your tracker.
You can use multiple feature types, so you can add a ready implementation such as :ocv:class:`TrackerFeatureHAAR` to your TrackerFeatureSet or develop your own implementation.
In the latter case, put the code that extracts the features in the computeImpl method and,
optionally, put the code for the refinement and selection of the features in the selection method.
Example of creating specialized TrackerFeature ``TrackerFeatureHAAR`` : ::
class CV_EXPORTS_W TrackerFeatureHAAR : public TrackerFeature
{
public:
TrackerFeatureHAAR( const TrackerFeatureHAAR::Params &parameters = TrackerFeatureHAAR::Params() );
~TrackerFeatureHAAR();
void selection( Mat& response, int npoints );
...
protected:
bool computeImpl( const std::vector<Mat>& images, Mat& response );
...
};
Example of adding TrackerFeature to TrackerFeatureSet : ::
//featureSet is the TrackerFeatureSet
Ptr<TrackerFeature> trackerFeature = new TrackerFeatureHAAR( HAARparameters );
featureSet->addTrackerFeature( trackerFeature );
.. seealso::
:ocv:class:`TrackerFeatureHAAR`, :ocv:class:`TrackerFeatureSet`
TrackerModel
............
TrackerModel is abstract, so in your implementation you must develop your own TrackerModel that inherits from :ocv:class:`TrackerModel`.
Fill the method for the estimation of the state, "modelEstimationImpl", which estimates the most likely target location;
see [AAM]_ table I (ME) for further information. Fill "modelUpdateImpl" in order to update the model; see [AAM]_ table I (MU).
In this class you can use the :c:type:`ConfidenceMap` and :c:type:`Trajectory` to store the model. The first represents the model over all
possible candidate states and the second represents the list of all estimated states.
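For reference, a ConfidenceMap can be thought of as a list of candidate states paired with a confidence value, while a Trajectory is the sequence of estimated states; the typedefs look roughly like this (treat the definitions in tracker.hpp as authoritative) : ::
typedef std::vector<std::pair<Ptr<TrackerTargetState>, float> > ConfidenceMap;
typedef std::vector<Ptr<TrackerTargetState> > Trajectory;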
Example of creating specialized TrackerModel ``TrackerMILModel`` : ::
class TrackerMILModel : public TrackerModel
{
public:
TrackerMILModel( const Rect& boundingBox );
~TrackerMILModel();
...
protected:
void modelEstimationImpl( const std::vector<Mat>& responses );
void modelUpdateImpl();
...
};
And add it in your Tracker : ::
bool TrackerMIL::initImpl( const Mat& image, const Rect& boundingBox )
{
...
//model is the general TrackerModel field of the general Tracker
model = new TrackerMILModel( boundingBox );
...
}
In the last step you should define the TrackerStateEstimator based on your implementation, or you can use one of the ready classes such as :ocv:class:`TrackerStateEstimatorMILBoosting`.
It represents the statistical part of the model that estimates the most likely target state.
Example of creating specialized TrackerStateEstimator ``TrackerStateEstimatorMILBoosting`` : ::
class CV_EXPORTS_W TrackerStateEstimatorMILBoosting : public TrackerStateEstimator
{
class TrackerMILTargetState : public TrackerTargetState
{
...
};
public:
TrackerStateEstimatorMILBoosting( int nFeatures = 250 );
~TrackerStateEstimatorMILBoosting();
...
protected:
Ptr<TrackerTargetState> estimateImpl( const std::vector<ConfidenceMap>& confidenceMaps );
void updateImpl( std::vector<ConfidenceMap>& confidenceMaps );
...
};
And add it in your TrackerModel : ::
//model is the TrackerModel of your Tracker
Ptr<TrackerStateEstimatorMILBoosting> stateEstimator = new TrackerStateEstimatorMILBoosting( params.featureSetNumFeatures );
model->setTrackerStateEstimator( stateEstimator );
.. seealso::
:ocv:class:`TrackerModel`, :ocv:class:`TrackerStateEstimatorMILBoosting`, :ocv:class:`TrackerTargetState`
During this step, you should define your TrackerTargetState based on your implementation. The :ocv:class:`TrackerTargetState` base class has only the bounding box (upper-left position, width and height); you can
enrich it by adding a scale factor, the target rotation, etc.
Example of creating specialized TrackerTargetState ``TrackerMILTargetState`` : ::
class TrackerMILTargetState : public TrackerTargetState
{
public:
TrackerMILTargetState( const Point2f& position, int targetWidth, int targetHeight, bool foreground, const Mat& features );
~TrackerMILTargetState();
...
private:
bool isTarget;
Mat targetFeatures;
...
};
Try it
......
To try your tracker you can use the demo at https://github.com/lenlen/opencv/blob/tracking_api/samples/cpp/tracker.cpp.
The first argument is the name of the tracker and the second is a video source.
Common Interfaces of TrackerFeatureSet
======================================
.. highlight:: cpp
TrackerFeatureSet
-----------------
Class that manages the extraction and selection of features
[AAM]_ Feature Extraction and Feature Set Refinement (Feature Processing and Feature Selection). See table I and section III C
[AMVOT]_ Appearance modelling -> Visual representation (Table II, section 3.1 - 3.2)
.. ocv:class:: TrackerFeatureSet
TrackerFeatureSet class::
class CV_EXPORTS_W TrackerFeatureSet
{
public:
TrackerFeatureSet();
~TrackerFeatureSet();
void extraction( const std::vector<Mat>& images );
void selection();
void removeOutliers();
bool addTrackerFeature( String trackerFeatureType );
bool addTrackerFeature( Ptr<TrackerFeature>& feature );
const std::vector<std::pair<String, Ptr<TrackerFeature> > >& getTrackerFeature() const;
const std::vector<Mat>& getResponses() const;
};
TrackerFeatureSet is an aggregation of :ocv:class:`TrackerFeature`
.. seealso::
:ocv:class:`TrackerFeature`
TrackerFeatureSet::extraction
-----------------------------
Extract features from the images collection
.. ocv:function:: void TrackerFeatureSet::extraction( const std::vector<Mat>& images )
:param images: The input images
TrackerFeatureSet::selection
----------------------------
Identify most effective features for all feature types (optional)
.. ocv:function:: void TrackerFeatureSet::selection()
TrackerFeatureSet::removeOutliers
---------------------------------
Remove outliers for all feature types (optional)
.. ocv:function:: void TrackerFeatureSet::removeOutliers()
TrackerFeatureSet::addTrackerFeature
------------------------------------
Add a TrackerFeature to the collection. Returns true if the TrackerFeature is added, false otherwise
.. ocv:function:: bool TrackerFeatureSet::addTrackerFeature( String trackerFeatureType )
:param trackerFeatureType: The TrackerFeature name
.. ocv:function:: bool TrackerFeatureSet::addTrackerFeature( Ptr<TrackerFeature>& feature )
:param feature: The TrackerFeature class
The modes available now:
* ``"HAAR"`` -- Haar Feature-based
The modes available soon:
* ``"HOG"`` -- Histogram of Oriented Gradients features
* ``"LBP"`` -- Local Binary Pattern features
* ``"FEATURE2D"`` -- All types of Feature2D
Example ``TrackerFeatureSet::addTrackerFeature`` : ::
//sample usage:
Ptr<TrackerFeature> trackerFeature = new TrackerFeatureHAAR( HAARparameters );
featureSet->addTrackerFeature( trackerFeature );
//or add a HAAR feature with default parameters
//featureSet->addTrackerFeature( "HAAR" );
.. note:: If you use the second method, you must initialize the TrackerFeature
TrackerFeatureSet::getTrackerFeature
------------------------------------
Get the TrackerFeature collection (TrackerFeature name, TrackerFeature pointer)
.. ocv:function:: const std::vector<std::pair<String, Ptr<TrackerFeature> > >& TrackerFeatureSet::getTrackerFeature() const
TrackerFeatureSet::getResponses
-------------------------------
Get the responses
.. ocv:function:: const std::vector<Mat>& TrackerFeatureSet::getResponses() const
.. note:: Be sure to call extraction before getResponses
Example ``TrackerFeatureSet::getResponses`` : ::
//get the patches from sampler
std::vector<Mat> detectSamples = sampler->getSamples();
if( detectSamples.empty() )
return false;
//features extraction
featureSet->extraction( detectSamples );
//get responses
std::vector<Mat> response = featureSet->getResponses();
TrackerFeature
--------------
Abstract base class for TrackerFeature that represents the feature.
.. ocv:class:: TrackerFeature
TrackerFeature class::
class CV_EXPORTS_W TrackerFeature
{
public:
virtual ~TrackerFeature();
static Ptr<TrackerFeature> create( const String& trackerFeatureType );
void compute( const std::vector<Mat>& images, Mat& response );
virtual void selection( Mat& response, int npoints ) = 0;
String getClassName() const;
};
TrackerFeature::create
----------------------
Create TrackerFeature by tracker feature type
.. ocv:function:: static Ptr<TrackerFeature> TrackerFeature::create( const String& trackerFeatureType )
:param trackerFeatureType: The TrackerFeature name
The modes available now:
* ``"HAAR"`` -- Haar Feature-based
The modes available soon:
* ``"HOG"`` -- Histogram of Oriented Gradients features
* ``"LBP"`` -- Local Binary Pattern features
* ``"FEATURE2D"`` -- All types of Feature2D
TrackerFeature::compute
-----------------------
Compute the features in the images collection
.. ocv:function:: void TrackerFeature::compute( const std::vector<Mat>& images, Mat& response )
:param images: The images
:param response: The output response
TrackerFeature::selection
-------------------------
Identify most effective features
.. ocv:function:: void TrackerFeature::selection( Mat& response, int npoints )
:param response: Collection of response for the specific TrackerFeature
:param npoints: Max number of features
.. note:: This method modifies the response parameter
TrackerFeature::getClassName
----------------------------
Get the name of the specific TrackerFeature
.. ocv:function:: String TrackerFeature::getClassName() const
Specialized TrackerFeature
==========================
The most well-known feature types are described in [AAM]_ table I and section III C. At the moment only :ocv:class:`TrackerFeatureHAAR` is implemented.
TrackerFeatureHAAR : TrackerFeature
-----------------------------------
TrackerFeature based on HAAR features, used by TrackerMIL and many other algorithms
.. ocv:class:: TrackerFeatureHAAR
TrackerFeatureHAAR class::
class CV_EXPORTS_W TrackerFeatureHAAR : public TrackerFeature
{
public:
TrackerFeatureHAAR( const TrackerFeatureHAAR::Params &parameters = TrackerFeatureHAAR::Params() );
~TrackerFeatureHAAR();
void selection( Mat& response, int npoints );
bool extractSelected( const std::vector<int> selFeatures, const std::vector<Mat>& images, Mat& response );
std::vector<std::pair<float, float> >& getMeanSigmaPairs();
bool swapFeature( int source, int target );
bool swapFeature( int id, CvHaarEvaluator::FeatureHaar& feature );
CvHaarEvaluator::FeatureHaar& getFeatureAt( int id );
};
.. note:: HAAR features implementation is copied from apps/traincascade and modified according to MIL implementation
TrackerFeatureHAAR::Params
--------------------------
.. ocv:struct:: TrackerFeatureHAAR::Params
List of TrackerFeatureHAAR parameters::
struct CV_EXPORTS Params
{
Params();
int numFeatures; // # of rects
Size rectSize; // rect size
bool isIntegral; // true if input images are integral, false otherwise
};
TrackerFeatureHAAR::TrackerFeatureHAAR
--------------------------------------
Constructor
.. ocv:function:: TrackerFeatureHAAR::TrackerFeatureHAAR( const TrackerFeatureHAAR::Params &parameters = TrackerFeatureHAAR::Params() )
:param parameters: TrackerFeatureHAAR parameters :ocv:struct:`TrackerFeatureHAAR::Params`
TrackerFeatureHAAR::selection
-----------------------------
Identify most effective features
.. ocv:function:: void TrackerFeatureHAAR::selection( Mat& response, int npoints )
:param response: Collection of response for the specific TrackerFeature
:param npoints: Max number of features
.. note:: This method modifies the response parameter
TrackerFeatureHAAR::extractSelected
-----------------------------------
Compute the features only for the selected indices in the images collection
.. ocv:function:: bool TrackerFeatureHAAR::extractSelected( const std::vector<int> selFeatures, const std::vector<Mat>& images, Mat& response )
:param selFeatures: indices of selected features
:param images: The images
:param response: Collection of response for the specific TrackerFeature
TrackerFeatureHAAR::getMeanSigmaPairs
-------------------------------------
Get the list of mean/sigma pairs
.. ocv:function:: std::vector<std::pair<float, float> >& TrackerFeatureHAAR::getMeanSigmaPairs()
TrackerFeatureHAAR::swapFeature
-------------------------------
Swap the feature in position source with the feature in position target
.. ocv:function:: bool TrackerFeatureHAAR::swapFeature( int source, int target )
:param source: The source position
:param target: The target position
Swap the feature in position id with the feature input
.. ocv:function:: bool TrackerFeatureHAAR::swapFeature( int id, CvHaarEvaluator::FeatureHaar& feature )
:param id: The position
:param feature: The feature
TrackerFeatureHAAR::getFeatureAt
--------------------------------
Get the feature in position id
.. ocv:function:: CvHaarEvaluator::FeatureHaar& TrackerFeatureHAAR::getFeatureAt( int id )
:param id: The position
TrackerFeatureHOG
-----------------
TODO To be implemented
TrackerFeatureLBP
-----------------
TODO To be implemented
TrackerFeatureFeature2d
-----------------------
TODO To be implemented
Common Interfaces of TrackerSampler
===================================
.. highlight:: cpp
TrackerSampler
--------------
Class that manages the samplers in order to select regions used to update the model of the tracker
[AAM]_ Sampling and Labeling. See table I and section III B
.. ocv:class:: TrackerSampler
TrackerSampler class::
class CV_EXPORTS_W TrackerSampler
{
public:
TrackerSampler();
~TrackerSampler();
void sampling( const Mat& image, Rect boundingBox );
const std::vector<std::pair<String, Ptr<TrackerSamplerAlgorithm> > >& getSamplers() const;
const std::vector<Mat>& getSamples() const;
bool addTrackerSamplerAlgorithm( String trackerSamplerAlgorithmType );
bool addTrackerSamplerAlgorithm( Ptr<TrackerSamplerAlgorithm>& sampler );
};
TrackerSampler is an aggregation of :ocv:class:`TrackerSamplerAlgorithm`
.. seealso::
:ocv:class:`TrackerSamplerAlgorithm`
TrackerSampler::sampling
------------------------
Computes the regions starting from a position in an image
.. ocv:function:: void TrackerSampler::sampling( const Mat& image, Rect boundingBox )
:param image: The current frame
:param boundingBox: The bounding box from which regions can be calculated
TrackerSampler::getSamplers
---------------------------
Return the collection of the :ocv:class:`TrackerSamplerAlgorithm`
.. ocv:function:: const std::vector<std::pair<String, Ptr<TrackerSamplerAlgorithm> > >& TrackerSampler::getSamplers() const
TrackerSampler::getSamples
--------------------------
Return the samples from all :ocv:class:`TrackerSamplerAlgorithm`, [AAM]_ Fig. 1 variable Sk
.. ocv:function:: const std::vector<Mat>& TrackerSampler::getSamples() const
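Example of a typical sampling step inside a tracker (a sketch; sampler is the TrackerSampler of the tracker) : ::
//compute the patches around the current target location
sampler->sampling( image, boundingBox );
//collect the samples produced by all the added TrackerSamplerAlgorithm
std::vector<Mat> samples = sampler->getSamples();
if( samples.empty() )
return false;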
TrackerSampler::addTrackerSamplerAlgorithm
------------------------------------------
Add a TrackerSamplerAlgorithm to the collection.
Returns true if the sampler is added, false otherwise
.. ocv:function:: bool TrackerSampler::addTrackerSamplerAlgorithm( String trackerSamplerAlgorithmType )
:param trackerSamplerAlgorithmType: The TrackerSamplerAlgorithm name
.. ocv:function:: bool TrackerSampler::addTrackerSamplerAlgorithm( Ptr<TrackerSamplerAlgorithm>& sampler )
:param sampler: The TrackerSamplerAlgorithm class
The modes available now:
* ``"CSC"`` -- Current State Center
* ``"CS"`` -- Current State
Example ``TrackerSamplerAlgorithm::addTrackerSamplerAlgorithm`` : ::
//sample usage:
TrackerSamplerCSC::Params CSCparameters;
Ptr<TrackerSamplerAlgorithm> CSCSampler = new TrackerSamplerCSC( CSCparameters );
if( !sampler->addTrackerSamplerAlgorithm( CSCSampler ) )
return false;
//or add CSC sampler with default parameters
//sampler->addTrackerSamplerAlgorithm( "CSC" );
.. note:: If you use the second method, you must initialize the TrackerSamplerAlgorithm
TrackerSamplerAlgorithm
-----------------------
Abstract base class for TrackerSamplerAlgorithm that represents the algorithm for the specific sampler.
.. ocv:class:: TrackerSamplerAlgorithm
TrackerSamplerAlgorithm class::
class CV_EXPORTS_W TrackerSamplerAlgorithm
{
public:
virtual ~TrackerSamplerAlgorithm();
static Ptr<TrackerSamplerAlgorithm> create( const String& trackerSamplerType );
bool sampling( const Mat& image, Rect boundingBox, std::vector<Mat>& sample );
String getClassName() const;
};
TrackerSamplerAlgorithm::create
-------------------------------
Create TrackerSamplerAlgorithm by tracker sampler type.
.. ocv:function:: static Ptr<TrackerSamplerAlgorithm> TrackerSamplerAlgorithm::create( const String& trackerSamplerType )
:param trackerSamplerType: The trackerSamplerType name
The modes available now:
* ``"CSC"`` -- Current State Center
* ``"CS"`` -- Current State
TrackerSamplerAlgorithm::sampling
---------------------------------
Computes the regions starting from a position in an image. Returns true if samples are computed, false otherwise
.. ocv:function:: bool TrackerSamplerAlgorithm::sampling( const Mat& image, Rect boundingBox, std::vector<Mat>& sample )
:param image: The current frame
:param boundingBox: The bounding box from which regions can be calculated
:param sample: The computed samples [AAM]_ Fig. 1 variable Sk
TrackerSamplerAlgorithm::getClassName
-------------------------------------
Get the name of the specific TrackerSamplerAlgorithm
.. ocv:function:: String TrackerSamplerAlgorithm::getClassName() const
Specialized TrackerSamplerAlgorithm
===================================
The most well-known sampling strategies are described in [AAM]_ table I. At the moment :ocv:class:`TrackerSamplerCSC` and :ocv:class:`TrackerSamplerCS` are implemented.
TrackerSamplerCSC : TrackerSamplerAlgorithm
-------------------------------------------
TrackerSamplerAlgorithm based on CSC (current state centered), used by the MIL algorithm (TrackerMIL)
.. ocv:class:: TrackerSamplerCSC
TrackerSamplerCSC class::
class CV_EXPORTS_W TrackerSamplerCSC : public TrackerSamplerAlgorithm
{
public:
TrackerSamplerCSC( const TrackerSamplerCSC::Params &parameters = TrackerSamplerCSC::Params() );
void setMode( int samplingMode );
~TrackerSamplerCSC();
};
TrackerSamplerCSC::Params
-------------------------
.. ocv:struct:: TrackerSamplerCSC::Params
List of TrackerSamplerCSC parameters::
struct CV_EXPORTS Params
{
Params();
float initInRad; // radius for gathering positive instances during init
float trackInPosRad; // radius for gathering positive instances during tracking
float searchWinSize; // size of search window
int initMaxNegNum; // # negative samples to use during init
int trackMaxPosNum; // # positive samples to use during training
int trackMaxNegNum; // # negative samples to use during training
};
TrackerSamplerCSC::TrackerSamplerCSC
------------------------------------
Constructor
.. ocv:function:: TrackerSamplerCSC::TrackerSamplerCSC( const TrackerSamplerCSC::Params &parameters = TrackerSamplerCSC::Params() )
:param parameters: TrackerSamplerCSC parameters :ocv:struct:`TrackerSamplerCSC::Params`
TrackerSamplerCSC::setMode
--------------------------
Set the sampling mode of TrackerSamplerCSC
.. ocv:function:: void TrackerSamplerCSC::setMode( int samplingMode )
:param samplingMode: The sampling mode
The modes are:
* ``"MODE_INIT_POS = 1"`` -- for the positive sampling in initialization step
* ``"MODE_INIT_NEG = 2"`` -- for the negative sampling in initialization step
* ``"MODE_TRACK_POS = 3"`` -- for the positive sampling in update step
* ``"MODE_TRACK_NEG = 4"`` -- for the negative sampling in update step
* ``"MODE_DETECT = 5"`` -- for the sampling in detection step
TrackerSamplerCS : TrackerSamplerAlgorithm
-------------------------------------------
TrackerSamplerAlgorithm based on CS (current state), used by the boosting algorithm (TrackerBoosting)
.. ocv:class:: TrackerSamplerCS
TrackerSamplerCS class::
class CV_EXPORTS_W TrackerSamplerCS : public TrackerSamplerAlgorithm
{
public:
TrackerSamplerCS( const TrackerSamplerCS::Params &parameters = TrackerSamplerCS::Params() );
void setMode( int samplingMode );
~TrackerSamplerCS();
};
TrackerSamplerCS::Params
-------------------------
.. ocv:struct:: TrackerSamplerCS::Params
List of TrackerSamplerCS parameters::
struct CV_EXPORTS Params
{
Params();
float overlap; //overlapping for the search windows
float searchFactor; //search region parameter
};
TrackerSamplerCS::TrackerSamplerCS
------------------------------------
Constructor
.. ocv:function:: TrackerSamplerCS::TrackerSamplerCS( const TrackerSamplerCS::Params &parameters = TrackerSamplerCS::Params() )
:param parameters: TrackerSamplerCS parameters :ocv:struct:`TrackerSamplerCS::Params`
TrackerSamplerCS::setMode
--------------------------
Set the sampling mode of TrackerSamplerCS
.. ocv:function:: void TrackerSamplerCS::setMode( int samplingMode )
:param samplingMode: The sampling mode
The modes are:
* ``"MODE_POSITIVE = 1"`` -- for the positive sampling
* ``"MODE_NEGATIVE = 2"`` -- for the negative sampling
* ``"MODE_CLASSIFY = 3"`` -- for the sampling in classification step
@startuml
package "Tracker package" #DDDDDD {
class Algorithm {
}
class Tracker{
Ptr<TrackerFeatureSet> featureSet;
Ptr<TrackerSampler> sampler;
Ptr<TrackerModel> model;
---
+static Ptr<Tracker> create(const string& trackerType);
+bool init(const Mat& image, const Rect& boundingBox);
+bool update(const Mat& image, Rect& boundingBox);
}
class Tracker
note right: Tracker is the general interface for each specialized tracker
class TrackerMIL{
+Params
---
TrackerMIL(TrackerMIL::Params parameters);
+bool init(const Mat& image, const Rect& boundingBox);
+bool update(const Mat& image, Rect& boundingBox);
}
class TrackerBoosting{
+Params
---
TrackerBoosting(TrackerBoosting::Params parameters);
+bool init(const Mat& image, const Rect& boundingBox);
+bool update(const Mat& image, Rect& boundingBox);
}
Algorithm <|-- Tracker : virtual inheritance
Tracker <|-- TrackerMIL
Tracker <|-- TrackerBoosting
note "Single instance of the Tracker" as N1
TrackerBoosting .. N1
TrackerMIL .. N1
}
@enduml
@startuml
package "TrackerFeature package" #DDDDDD {
class TrackerFeatureSet{
-vector<pair<string, Ptr<TrackerFeature> > > features
-vector<Mat> responses
...
TrackerFeatureSet();
~TrackerFeatureSet();
--
+extraction(const std::vector<Mat>& images);
+selection();
+removeOutliers();
+vector<Mat> getResponses();
+vector<pair<string TrackerFeatureType, Ptr<TrackerFeature> > > getTrackerFeatures();
+bool addTrackerFeature(string trackerFeatureType);
+bool addTrackerFeature(Ptr<TrackerFeature>& feature);
-clearResponses();
}
class TrackerFeature <<virtual>>{
static Ptr<TrackerFeature> create(const string& trackerFeatureType);
compute(const std::vector<Mat>& images, Mat& response);
selection(Mat& response, int npoints);
}
note bottom: Can be specialized as in table II\nA tracker can use more types of features
class TrackerFeatureFeature2D{
-vector<Keypoints> keypoints
---
TrackerFeatureFeature2D(string detectorType, string descriptorType);
~TrackerFeatureFeature2D();
---
compute(const std::vector<Mat>& images, Mat& response);
selection( Mat& response, int npoints);
}
class TrackerFeatureHOG{
TrackerFeatureHOG();
~TrackerFeatureHOG();
---
compute(const std::vector<Mat>& images, Mat& response);
selection(Mat& response, int npoints);
}
TrackerFeatureSet *-- TrackerFeature
TrackerFeature <|-- TrackerFeatureHOG
TrackerFeature <|-- TrackerFeatureFeature2D
note "Per readability and simplicity in this diagram\n there are only two TrackerFeature but you\n can considering the implementation of the other TrackerFeature" as N1
TrackerFeatureHOG .. N1
TrackerFeatureFeature2D .. N1
}
@enduml
@startuml
package "TrackerModel package" #DDDDDD {
class Typedef << (T,#FF7700) >>{
ConfidenceMap
Trajectory
}
class TrackerModel{
-vector<ConfidenceMap> confidenceMaps;
-Trajectory trajectory;
-Ptr<TrackerStateEstimator> stateEstimator;
...
TrackerModel();
~TrackerModel();
+bool setTrackerStateEstimator(Ptr<TrackerStateEstimator> trackerStateEstimator);
+Ptr<TrackerStateEstimator> getTrackerStateEstimator();
+void modelEstimation(const vector<Mat>& responses);
+void modelUpdate();
+void setLastTargetState(const Ptr<TrackerTargetState> lastTargetState);
+void runStateEstimator();
+const vector<ConfidenceMap>& getConfidenceMaps();
+const ConfidenceMap& getLastConfidenceMap();
}
class TrackerTargetState <<virtual>>{
Point2f targetPosition;
---
Point2f getTargetPosition();
void setTargetPosition(Point2f position);
}
class TrackerTargetState
note bottom: Each TrackerStateEstimator can create own state
class TrackerStateEstimator <<virtual>>{
~TrackerStateEstimator();
static Ptr<TrackerStateEstimator> create(const String& trackerStateEstimatorType);
Ptr<TrackerTargetState> estimate(const vector<ConfidenceMap>& confidenceMaps)
void update(vector<ConfidenceMap>& confidenceMaps)
}
class TrackerStateEstimatorSVM{
TrackerStateEstimatorSVM()
~TrackerStateEstimatorSVM()
Ptr<TrackerTargetState> estimate(const vector<ConfidenceMap>& confidenceMaps)
void update(vector<ConfidenceMap>& confidenceMaps)
}
class TrackerStateEstimatorMILBoosting{
TrackerStateEstimatorMILBoosting()
~TrackerStateEstimatorMILBoosting()
Ptr<TrackerTargetState> estimate(const vector<ConfidenceMap>& confidenceMaps)
void update(vector<ConfidenceMap>& confidenceMaps)
}
TrackerModel -> TrackerStateEstimator: create
TrackerModel *-- TrackerTargetState
TrackerStateEstimator <|-- TrackerStateEstimatorMILBoosting
TrackerStateEstimator <|-- TrackerStateEstimatorSVM
}
@enduml
@startuml
package "TrackerSampler package" #DDDDDD {
class TrackerSampler{
-vector<pair<String, Ptr<TrackerSamplerAlgorithm> > > samplers
-vector<Mat> samples;
...
TrackerSampler();
~TrackerSampler();
+sampling(const Mat& image, Rect boundingBox);
+const vector<pair<String, Ptr<TrackerSamplerAlgorithm> > >& getSamplers();
+const vector<Mat>& getSamples();
+bool addTrackerSamplerAlgorithm(String trackerSamplerAlgorithmType);
+bool addTrackerSamplerAlgorithm(Ptr<TrackerSamplerAlgorithm>& sampler);
---
-void clearSamples();
}
class TrackerSamplerAlgorithm{
~TrackerSamplerAlgorithm();
+static Ptr<TrackerSamplerAlgorithm> create(const String& trackerSamplerType);
+bool sampling(const Mat& image, Rect boundingBox, vector<Mat>& sample);
}
note bottom: A tracker could sample the target\nor it could sample the target and the background
class TrackerSamplerCS{
TrackerSamplerCS();
~TrackerSamplerCS();
+bool sampling(const Mat& image, Rect boundingBox, vector<Mat>& sample);
}
class TrackerSamplerCSC{
TrackerSamplerCSC();
~TrackerSamplerCSC();
+bool sampling(const Mat& image, Rect boundingBox, vector<Mat>& sample);
}
TrackerSampler *-- TrackerSamplerAlgorithm
TrackerSamplerAlgorithm <|-- TrackerSamplerCS
TrackerSamplerAlgorithm <|-- TrackerSamplerCSC
}
@enduml
@startuml
package "Tracker" #DDDDDD {
}
package "TrackerFeature" #DDDDDD {
}
package "TrackerSampler" #DDDDDD {
}
package "TrackerModel" #DDDDDD {
}
Tracker -> TrackerModel: create
Tracker -> TrackerSampler: create
Tracker -> TrackerFeature: create
@enduml
Tracker Algorithms
==================
.. highlight:: cpp
Two algorithms will be implemented soon: the first is MIL (Multiple Instance Learning) [MIL]_ and the second is Online Boosting [OLB]_.
.. [MIL] B Babenko, M-H Yang, and S Belongie, Visual Tracking with Online Multiple Instance Learning, In CVPR, 2009
.. [OLB] H Grabner, M Grabner, and H Bischof, Real-time tracking via on-line boosting, In Proc. BMVC, volume 1, pages 47– 56, 2006
TrackerBoosting
---------------
This is a real-time object tracker based on a novel on-line version of the AdaBoost algorithm.
The classifier uses the surrounding background as negative examples in the update step to avoid the drifting problem.
.. ocv:class:: TrackerBoosting
Implementation of TrackerBoosting from :ocv:class:`Tracker`::
class CV_EXPORTS_W TrackerBoosting : public Tracker
{
public:
TrackerBoosting( const TrackerBoosting::Params &parameters = TrackerBoosting::Params() );
virtual ~TrackerBoosting();
void read( const FileNode& fn );
void write( FileStorage& fs ) const;
};
TrackerMIL
----------
The MIL algorithm trains a classifier in an online manner to separate the object from the background. Multiple Instance Learning avoids the drift problem, enabling robust tracking.
Original code can be found here http://vision.ucsd.edu/~bbabenko/project_miltrack.shtml
.. ocv:class:: TrackerMIL
Implementation of TrackerMIL from :ocv:class:`Tracker`::
class CV_EXPORTS_W TrackerMIL : public Tracker
{
public:
TrackerMIL( const TrackerMIL::Params &parameters = TrackerMIL::Params() );
virtual ~TrackerMIL();
void read( const FileNode& fn );
void write( FileStorage& fs ) const;
};
TrackerMIL::Params
------------------
.. ocv:struct:: TrackerMIL::Params
List of MIL parameters::
struct CV_EXPORTS Params
{
Params();
//parameters for sampler
float samplerInitInRadius; // radius for gathering positive instances during init
int samplerInitMaxNegNum; // # negative samples to use during init
float samplerSearchWinSize; // size of search window
float samplerTrackInRadius; // radius for gathering positive instances during tracking
int samplerTrackMaxPosNum; // # positive samples to use during tracking
int samplerTrackMaxNegNum; // # negative samples to use during tracking
int featureSetNumFeatures; // # features
void read( const FileNode& fn );
void write( FileStorage& fs ) const;
};
TrackerMIL::TrackerMIL
----------------------
Constructor
.. ocv:function:: TrackerMIL::TrackerMIL( const TrackerMIL::Params &parameters = TrackerMIL::Params() )
:param parameters: MIL parameters :ocv:struct:`TrackerMIL::Params`
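Example of creating a TrackerMIL with custom parameters (a sketch; the values shown are illustrative, not recommended defaults) : ::
TrackerMIL::Params params;
params.samplerSearchWinSize = 25; //illustrative value
params.featureSetNumFeatures = 250; //illustrative value
Ptr<TrackerMIL> tracker = new TrackerMIL( params );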
TrackerBoosting::Params
-----------------------
.. ocv:struct:: TrackerBoosting::Params
List of BOOSTING parameters::
struct CV_EXPORTS Params
{
Params();
int numClassifiers; //the number of classifiers to use in a OnlineBoosting algorithm
float samplerOverlap; //search region parameters to use in a OnlineBoosting algorithm
float samplerSearchFactor; // search region parameters to use in a OnlineBoosting algorithm
int iterationInit; //the initial iterations
int featureSetNumFeatures; // #features
void read( const FileNode& fn );
void write( FileStorage& fs ) const;
};
TrackerBoosting::TrackerBoosting
--------------------------------
Constructor
.. ocv:function:: TrackerBoosting::TrackerBoosting( const TrackerBoosting::Params &parameters = TrackerBoosting::Params() )
:param parameters: BOOSTING parameters :ocv:struct:`TrackerBoosting::Params`
Tracking API
============
.. highlight:: cpp
Long-term optical tracking API
------------------------------
Long-term optical tracking is one of the most important issues for many computer vision applications in real-world scenarios.
The development in this area is very fragmented, and this API provides a unified interface that is useful for plugging in several algorithms and comparing them.
This work is partially based on [AAM]_ and [AMVOT]_.
These algorithms start from a bounding box of the target and, with their internal representation, they avoid drift during the tracking.
These long-term trackers are able to evaluate online the quality of the location of the target in the new frame, without ground truth.
There are three main components: the TrackerSampler, the TrackerFeatureSet and the TrackerModel. The first component is the object that computes the patches over the frame based on the last target location.
The TrackerFeatureSet is the class that manages the features; it is possible to plug in many kinds of these (HAAR, HOG, LBP, Feature2D, etc.).
The last component is the internal representation of the target; it is the appearance model. It stores all state candidates and computes the trajectory (the most likely target states). The class TrackerTargetState represents a possible state of the target.
The TrackerSampler and the TrackerFeatureSet are the visual representation of the target, while the TrackerModel is the statistical model.
A recent benchmark between these algorithms can be found in [OOT]_.
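To make the flow concrete, a minimal (hypothetical) client looks like the following sketch; the video file name and the initial bounding box are placeholders, and error handling is reduced to the essentials : ::
#include <opencv2/tracking.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/highgui.hpp>
using namespace cv;
int main()
{
VideoCapture cap( "video.avi" ); //placeholder video source
Mat frame;
cap >> frame;
Rect boundingBox( 100, 100, 50, 50 ); //placeholder initial target location
//create the tracker once and initialize it on the first frame
Ptr<Tracker> tracker = Tracker::create( "MIL" );
if( tracker == NULL || !tracker->init( frame, boundingBox ) )
return -1;
for ( ;; )
{
cap >> frame;
if( frame.empty() )
break;
//update the tracker on each new frame and draw the estimated location
if( tracker->update( frame, boundingBox ) )
rectangle( frame, boundingBox, Scalar( 255, 0, 0 ), 2, 1 );
imshow( "tracking", frame );
if( waitKey( 1 ) == 'q' )
break;
}
return 0;
}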
UML design:
-----------
**General diagram**
.. image:: pics/package.png
:width: 50%
:alt: General diagram
:align: center
**Tracker diagram**
.. image:: pics/Tracker.png
:width: 80%
:alt: Tracker diagram
:align: center
**TrackerSampler diagram**
.. image:: pics/TrackerSampler.png
:width: 100%
:alt: TrackerSampler diagram
:align: center
**TrackerFeatureSet diagram**
.. image:: pics/TrackerFeature.png
:width: 100%
:alt: TrackerFeatureSet diagram
:align: center
**TrackerModel diagram**
.. image:: pics/TrackerModel.png
:width: 100%
:alt: TrackerModel diagram
:align: center
To see how API works, try tracker demo:
https://github.com/lenlen/opencv/blob/tracking_api/samples/cpp/tracker.cpp
.. note:: This Tracking API has been designed with PlantUML. If you modify this API please change UML files under modules/tracking/misc/
The following references were used in the API:
.. [AAM] S Salti, A Cavallaro, L Di Stefano, Adaptive Appearance Modeling for Video Tracking: Survey and Evaluation, IEEE Transactions on Image Processing, Vol. 21, Issue 10, October 2012, pp. 4334-4348
.. [AMVOT] X Li, W Hu, C Shen, Z Zhang, A Dick, A van den Hengel, A Survey of Appearance Models in Visual Object Tracking, ACM Transactions on Intelligent Systems and Technology (TIST), 2013
.. [OOT] Yi Wu and Jongwoo Lim and Ming-Hsuan Yang, Online Object Tracking: A Benchmark, The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013
Tracker classes:
----------------
.. toctree::
:maxdepth: 2
tracker_algorithms
common_interfaces_tracker
common_interfaces_tracker_sampler
common_interfaces_tracker_feature_set
common_interfaces_tracker_model
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifndef __OPENCV_TRACKING_HPP__
#define __OPENCV_TRACKING_HPP__
#include "opencv2/tracking/tracker.hpp"
namespace cv
{
CV_EXPORTS bool initModule_tracking(void);
}
#endif //__OPENCV_TRACKING_HPP__
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifndef __OPENCV_ONLINEBOOSTING_HPP__
#define __OPENCV_ONLINEBOOSTING_HPP__
#include "opencv2/core.hpp"
namespace cv
{
//TODO based on the original implementation
//http://vision.ucsd.edu/~bbabenko/project_miltrack.shtml
class BaseClassifier;
class WeakClassifierHaarFeature;
class EstimatedGaussDistribution;
class ClassifierThreshold;
class Detector;
class StrongClassifierDirectSelection
{
public:
StrongClassifierDirectSelection( int numBaseClf, int numWeakClf, Size patchSz, const Rect& sampleROI, bool useFeatureEx = false, int iterationInit = 0 );
virtual ~StrongClassifierDirectSelection();
void initBaseClassifier();
bool update( const Mat& image, int target, float importance = 1.0 );
float eval( const Mat& response );
std::vector<int> getSelectedWeakClassifier();
float classifySmooth( const std::vector<Mat>& images, const Rect& sampleROI, int& idx );
int getNumBaseClassifier();
Size getPatchSize() const;
Rect getROI() const;
bool getUseFeatureExchange() const;
int getReplacedClassifier() const;
void replaceWeakClassifier( int idx );
int getSwappedClassifier() const;
private:
//StrongClassifier
int numBaseClassifier;
int numAllWeakClassifier;
int numWeakClassifier;
int iterInit;
BaseClassifier** baseClassifier;
std::vector<float> alpha;
cv::Size patchSize;
bool useFeatureExchange;
//StrongClassifierDirectSelection
std::vector<bool> m_errorMask;
std::vector<float> m_errors;
std::vector<float> m_sumErrors;
Detector* detector;
Rect ROI;
int replacedClassifier;
int swappedClassifier;
};
class BaseClassifier
{
public:
BaseClassifier( int numWeakClassifier, int iterationInit );
BaseClassifier( int numWeakClassifier, int iterationInit, WeakClassifierHaarFeature** weakCls );
WeakClassifierHaarFeature** getReferenceWeakClassifier()
{
return weakClassifier;
}
void trainClassifier( const Mat& image, int target, float importance, std::vector<bool>& errorMask );
int selectBestClassifier( std::vector<bool>& errorMask, float importance, std::vector<float> & errors );
int computeReplaceWeakestClassifier( const std::vector<float> & errors );
void replaceClassifierStatistic( int sourceIndex, int targetIndex );
int getIdxOfNewWeakClassifier()
{
return m_idxOfNewWeakClassifier;
}
int eval( const Mat& image );
virtual ~BaseClassifier();
float getError( int curWeakClassifier );
void getErrors( float* errors );
int getSelectedClassifier() const;
void replaceWeakClassifier( int index );
protected:
void generateRandomClassifier();
WeakClassifierHaarFeature** weakClassifier;
bool m_referenceWeakClassifier;
int m_numWeakClassifier;
int m_selectedClassifier;
int m_idxOfNewWeakClassifier;
std::vector<float> m_wCorrect;
std::vector<float> m_wWrong;
int m_iterationInit;
};
class EstimatedGaussDistribution
{
public:
EstimatedGaussDistribution();
EstimatedGaussDistribution( float P_mean, float R_mean, float P_sigma, float R_sigma );
virtual ~EstimatedGaussDistribution();
void update( float value ); //, float timeConstant = -1.0);
float getMean();
float getSigma();
void setValues( float mean, float sigma );
private:
float m_mean;
float m_sigma;
float m_P_mean;
float m_P_sigma;
float m_R_mean;
float m_R_sigma;
};
class WeakClassifierHaarFeature
{
public:
WeakClassifierHaarFeature();
virtual ~WeakClassifierHaarFeature();
bool update( float value, int target );
int eval( float value );
private:
float sigma;
float mean;
ClassifierThreshold* m_classifier;
void getInitialDistribution( EstimatedGaussDistribution *distribution );
void generateRandomClassifier( EstimatedGaussDistribution* m_posSamples, EstimatedGaussDistribution* m_negSamples );
};
class Detector
{
public:
Detector( StrongClassifierDirectSelection* classifier );
virtual ~Detector( void );
void classifySmooth( const std::vector<Mat>& image, float minMargin = 0 );
int getNumDetections();
float getConfidence( int patchIdx );
float getConfidenceOfDetection( int detectionIdx );
float getConfidenceOfBestDetection()
{
return m_maxConfidence;
}
int getPatchIdxOfBestDetection();
int getPatchIdxOfDetection( int detectionIdx );
const std::vector<int>& getIdxDetections() const
{
return m_idxDetections;
}
const std::vector<float>& getConfidences() const
{
return m_confidences;
}
const cv::Mat& getConfImageDisplay() const
{
return m_confImageDisplay;
}
private:
void prepareConfidencesMemory( int numPatches );
void prepareDetectionsMemory( int numDetections );
StrongClassifierDirectSelection* m_classifier;
std::vector<float> m_confidences;
int m_sizeConfidences;
int m_numDetections;
std::vector<int> m_idxDetections;
int m_sizeDetections;
int m_idxBestDetection;
float m_maxConfidence;
cv::Mat_<float> m_confMatrix;
cv::Mat_<float> m_confMatrixSmooth;
cv::Mat_<unsigned char> m_confImageDisplay;
};
class ClassifierThreshold
{
public:
ClassifierThreshold( EstimatedGaussDistribution* posSamples, EstimatedGaussDistribution* negSamples );
virtual ~ClassifierThreshold();
void update( float value, int target );
int eval( float value );
void* getDistribution( int target );
private:
EstimatedGaussDistribution* m_posSamples;
EstimatedGaussDistribution* m_negSamples;
float m_threshold;
int m_parity;
};
} /* namespace cv */
#endif
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifndef __OPENCV_ONLINEMIL_HPP__
#define __OPENCV_ONLINEMIL_HPP__
#include "opencv2/core.hpp"
#include <limits>
namespace cv
{
//TODO based on the original implementation
//http://vision.ucsd.edu/~bbabenko/project_miltrack.shtml
#define sign(s) ((s > 0 ) ? 1 : ((s<0) ? -1 : 0))
class ClfOnlineStump;
class ClfMilBoost
{
public:
struct CV_EXPORTS Params
{
Params();
int _numSel;
int _numFeat;
float _lRate;
};
ClfMilBoost();
~ClfMilBoost();
void init( const ClfMilBoost::Params &parameters = ClfMilBoost::Params() );
void update( const Mat& posx, const Mat& negx );
std::vector<float> classify( const Mat& x, bool logR = true );
inline float sigmoid( float x )
{
return 1.0f / ( 1.0f + exp( -x ) );
}
private:
uint _numsamples;
ClfMilBoost::Params _myParams;
std::vector<int> _selectors;
std::vector<ClfOnlineStump*> _weakclf;
uint _counter;
};
class ClfOnlineStump
{
public:
float _mu0, _mu1, _sig0, _sig1;
float _q;
int _s;
float _log_n1, _log_n0;
float _e1, _e0;
float _lRate;
ClfOnlineStump();
ClfOnlineStump( int ind );
void init();
void update( const Mat& posx, const Mat& negx, const cv::Mat_<float> & posw = cv::Mat_<float>(), const cv::Mat_<float> & negw = cv::Mat_<float>() );
bool classify( const Mat& x, int i );
float classifyF( const Mat& x, int i );
std::vector<float> classifySetF( const Mat& x );
private:
bool _trained;
int _ind;
};
} /* namespace cv */
#endif
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifdef __OPENCV_BUILD
#error this is a compatibility header which should not be used inside the OpenCV library
#endif
#include "opencv2/tracking.hpp"
#include "perf_precomp.hpp"
CV_PERF_TEST_MAIN(tracking)
#ifdef __GNUC__
# pragma GCC diagnostic ignored "-Wmissing-declarations"
# if defined __clang__ || defined __APPLE__
# pragma GCC diagnostic ignored "-Wmissing-prototypes"
# pragma GCC diagnostic ignored "-Wextra"
# endif
#endif
#ifndef __OPENCV_TRACKING_PRECOMP_HPP__
#define __OPENCV_TRACKING_PRECOMP_HPP__
#include "opencv2/ts.hpp"
#include <opencv2/imgproc.hpp>
#include <opencv2/tracking.hpp>
#include <opencv2/highgui.hpp>
#ifdef GTEST_CREATE_SHARED_LIBRARY
#error no modules except ts should have GTEST_CREATE_SHARED_LIBRARY defined
#endif
#endif
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#include "perf_precomp.hpp"
using namespace std;
using namespace cv;
using namespace perf;
#include <opencv2/core/utility.hpp>
#include <opencv2/tracking.hpp>
#include <opencv2/highgui.hpp>
#include <iostream>
using namespace std;
using namespace cv;
static Mat image;
static Rect boundingBox;
static bool paused;
static bool selectObject = false;
static bool startSelection = false;
static const char* keys =
{ "{@tracker_algorithm | | tracker algorithm }"
"{@video_name | | video name }" };
static void help()
{
cout << "\nThis example shows the functionality of \"Long-term optical tracking API\""
"-- pause video [p] and draw a bounding box around the target to start the tracker\n"
"Call:\n"
"./tracker <tracker_algorithm> <video_name>\n"
<< endl;
cout << "\n\nHot keys: \n"
"\tq - quit the program\n"
"\tp - pause video\n";
}
static void onMouse( int event, int x, int y, int, void* )
{
if( !selectObject )
{
switch ( event )
{
case EVENT_LBUTTONDOWN:
//set origin of the bounding box
startSelection = true;
boundingBox.x = x;
boundingBox.y = y;
break;
case EVENT_LBUTTONUP:
//set width and height of the bounding box
boundingBox.width = std::abs( x - boundingBox.x );
boundingBox.height = std::abs( y - boundingBox.y );
paused = false;
selectObject = true;
break;
case EVENT_MOUSEMOVE:
if( startSelection && !selectObject )
{
//draw the bounding box
Mat currentFrame;
image.copyTo( currentFrame );
rectangle( currentFrame, Point( boundingBox.x, boundingBox.y ), Point( x, y ), Scalar( 255, 0, 0 ), 2, 1 );
imshow( "Tracking API", currentFrame );
}
break;
}
}
}
int main( int argc, char** argv )
{
CommandLineParser parser( argc, argv, keys );
String tracker_algorithm = parser.get<String>( 0 );
String video_name = parser.get<String>( 1 );
if( tracker_algorithm.empty() || video_name.empty() )
{
help();
return -1;
}
//open the capture
VideoCapture cap;
cap.open( video_name );
if( !cap.isOpened() )
{
help();
cout << "***Could not initialize capturing...***\n";
cout << "Current parameter's value: \n";
parser.printMessage();
return -1;
}
Mat frame;
paused = true;
namedWindow( "Tracking API", 1 );
setMouseCallback( "Tracking API", onMouse, 0 );
//instantiates the specific Tracker
Ptr<Tracker> tracker = Tracker::create( tracker_algorithm );
if( tracker == NULL )
{
cout << "***Error in the instantiation of the tracker...***\n";
return -1;
}
//get the first frame
cap >> frame;
frame.copyTo( image );
imshow( "Tracking API", image );
bool initialized = false;
for ( ;; )
{
if( !paused )
{
cap >> frame;
frame.copyTo( image );
if( !initialized && selectObject )
{
//initializes the tracker
if( !tracker->init( frame, boundingBox ) )
{
cout << "***Could not initialize tracker...***\n";
return -1;
}
initialized = true;
}
else if( initialized )
{
//updates the tracker
if( tracker->update( frame, boundingBox ) )
{
rectangle( image, boundingBox, Scalar( 255, 0, 0 ), 2, 1 );
}
}
imshow( "Tracking API", image );
}
char c = (char) waitKey( 2 );
if( c == 'q' )
break;
if( c == 'p' )
paused = !paused;
}
return 0;
}
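Once the module is built, the sample above can be launched from the command line. A minimal invocation (the video file name is only a placeholder, and ``MIL`` stands for any of the registered tracker types): ::

    ./tracker MIL my_video.avi

The video starts paused: draw a bounding box around the target with the mouse to initialize the tracker, use [p] to toggle pause and [q] to quit, as described by the help text.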
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifndef __OPENCV_PRECOMP_H__
#define __OPENCV_PRECOMP_H__
#include "opencv2/tracking.hpp"
#include "opencv2/core/utility.hpp"
#include "opencv2/core/private.hpp"
#endif
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#include "precomp.hpp"
namespace cv
{
/*
* Tracker
*/
Tracker::~Tracker()
{
}
bool Tracker::init( const Mat& image, const Rect& boundingBox )
{
if( isInit )
{
return false;
}
if( image.empty() )
return false;
sampler = Ptr<TrackerSampler>( new TrackerSampler() );
featureSet = Ptr<TrackerFeatureSet>( new TrackerFeatureSet() );
model = Ptr<TrackerModel>();
bool initTracker = initImpl( image, boundingBox );
//check if the model component is initialized
if( model == 0 )
{
CV_Error( -1, "The model is not initialized" );
return false;
}
if( initTracker )
{
isInit = true;
}
return initTracker;
}
bool Tracker::update( const Mat& image, Rect& boundingBox )
{
if( !isInit )
{
return false;
}
if( image.empty() )
return false;
return updateImpl( image, boundingBox );
}
Ptr<Tracker> Tracker::create( const String& trackerType )
{
return Algorithm::create<Tracker>( "TRACKER." + trackerType );
}
} /* namespace cv */
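Since ``Tracker::create`` simply looks the requested name up in the ``Algorithm`` factory under the ``"TRACKER."`` prefix, it returns an empty pointer when that name has not been registered. A minimal sketch of the check a caller is expected to perform (``"MIL"`` is used here only as an example of a registered type): ::

    #include <opencv2/tracking.hpp>
    #include <iostream>

    int main()
    {
      // "MIL" is resolved to the Algorithm name "TRACKER.MIL"
      cv::Ptr<cv::Tracker> tracker = cv::Tracker::create( "MIL" );
      if( tracker.empty() )
      {
        std::cout << "The requested tracker type is not registered" << std::endl;
        return -1;
      }
      return 0;
    }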
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#ifndef __OPENCV_TRACKER_BOOSTING_MODEL_HPP__
#define __OPENCV_TRACKER_BOOSTING_MODEL_HPP__
#include "precomp.hpp"
#include "opencv2/core.hpp"
namespace cv
{
/**
* \brief Implementation of TrackerModel for the BOOSTING algorithm
*/
class TrackerBoostingModel : public TrackerModel
{
public:
enum
{
MODE_POSITIVE = 1, // mode for positive features
MODE_NEGATIVE = 2, // mode for negative features
MODE_CLASSIFY = 3 // mode for the classification step
};
/**
* \brief Constructor
* \param boundingBox The initial bounding box
*/
TrackerBoostingModel( const Rect& boundingBox );
/**
* \brief Destructor
*/
~TrackerBoostingModel()
{
}
/**
* \brief Set the mode
*/
void setMode( int trainingMode, const std::vector<Mat>& samples );
/**
* \brief Create the ConfidenceMap from a list of responses
* \param responses The list of responses
* \param confidenceMap The output confidence map
*/
void responseToConfidenceMap( const std::vector<Mat>& responses, ConfidenceMap& confidenceMap );
/**
* \brief Return the selected weak classifiers used in the detection step
* \return The selected weak classifiers
*/
std::vector<int> getSelectedWeakClassifier();
protected:
void modelEstimationImpl( const std::vector<Mat>& responses );
void modelUpdateImpl();
private:
std::vector<Mat> currentSample;
std::vector<std::pair<float, float> > meanSigmaPair;
int mode;
};
} /* namespace cv */
#endif
#include "test_precomp.hpp"
CV_TEST_MAIN("cv")