Commit 468d3ecc authored by Vadim Pisarevsky

Merge pull request #61 from lluisgomez/master

Adds the OCRHMMDecoder class (API interface and implementation) and updates the webcam_demo sample to use it
parents 37af0432 0898fb9d
@@ -7,11 +7,12 @@ OCRTesseract
------------
.. ocv:class:: OCRTesseract : public BaseOCR
OCRTesseract class provides an interface with the tesseract-ocr API (v3.02.02) in C++. Notice that it is compiled only when tesseract-ocr is correctly installed. ::
OCRTesseract class provides an interface with the tesseract-ocr API (v3.02.02) in C++. Notice that it is compiled only when tesseract-ocr is correctly installed.
.. note::
* (C++) An example of OCRTesseract recognition combined with scene text detection can be found at the end_to_end_recognition demo: https://github.com/Itseez/opencv_contrib/blob/master/modules/text/samples/end_to_end_recognition.cpp
* (C++) Another example of OCRTesseract recognition combined with scene text detection can be found at the webcam_demo: https://github.com/Itseez/opencv_contrib/blob/master/modules/text/samples/webcam_demo.cpp
OCRTesseract::create
--------------------
@@ -37,3 +38,69 @@ Recognize text using the tesseract-ocr API. Takes image on input and returns rec
:param component_texts: If provided the method will output a list of text strings for the recognition of individual text elements found (e.g. words or text lines).
:param component_confidences: If provided the method will output a list of confidence values for the recognition of individual text elements found (e.g. words or text lines).
:param component_level: ``OCR_LEVEL_WORD`` (by default), or ``OCR_LEVEL_TEXT_LINE``.
OCRHMMDecoder
-------------
.. ocv:class:: OCRHMMDecoder : public BaseOCR
OCRHMMDecoder class provides an interface for OCR using Hidden Markov Models.
.. note::
* (C++) An example on using OCRHMMDecoder recognition combined with scene text detection can be found at the webcam_demo sample: https://github.com/Itseez/opencv_contrib/blob/master/modules/text/samples/webcam_demo.cpp
OCRHMMDecoder::ClassifierCallback
---------------------------------
The character classifier is wrapped in a callback class so that the feature extractor and the classifier itself are hidden; this way developers can plug in their own OCR code.
.. ocv:class:: OCRHMMDecoder::ClassifierCallback
The default character classifier and feature extractor can be loaded using the utility function ``loadOCRHMMClassifierNM`` and the KNN model provided at https://github.com/Itseez/opencv_contrib/blob/master/modules/text/samples/OCRHMM_knn_model_data.xml.gz.
OCRHMMDecoder::ClassifierCallback::eval
---------------------------------------
The character classifier must return the id of the predicted class (or a ranked list of class ids).
.. ocv:function:: void OCRHMMDecoder::ClassifierCallback::eval( InputArray image, std::vector<int>& out_class, std::vector<double>& out_confidence)
:param image: Input image ``CV_8UC1`` or ``CV_8UC3`` with a single letter.
:param out_class: The classifier returns the character class categorical label, or list of class labels, to which the input image corresponds.
:param out_confidence: The classifier returns the probability of the input image corresponding to each of the classes in ``out_class``.
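A custom classifier can be plugged in by deriving from this callback. A minimal sketch, where ``MyClassifier`` and its feature extraction are hypothetical placeholders::

    class MyClassifier : public OCRHMMDecoder::ClassifierCallback
    {
    public:
        void eval( InputArray image, std::vector<int>& out_class, std::vector<double>& out_confidence )
        {
            out_class.clear();
            out_confidence.clear();
            // ... extract features from the single-character image and classify them here ...
            out_class.push_back(0);        // index of the most likely class in the vocabulary
            out_confidence.push_back(1.0); // and its estimated probability
        }
    };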
OCRHMMDecoder::create
---------------------
Creates an instance of the OCRHMMDecoder class. Initializes the HMM decoder.
.. ocv:function:: Ptr<OCRHMMDecoder> OCRHMMDecoder::create(const Ptr<OCRHMMDecoder::ClassifierCallback> classifier, const std::string& vocabulary, InputArray transition_probabilities_table, InputArray emission_probabilities_table, decoder_mode mode = OCR_DECODER_VITERBI)
:param classifier: The character classifier with built-in feature extractor.
:param vocabulary: The language vocabulary (characters when working with ASCII English text). ``vocabulary.size()`` must be equal to the number of classes of the classifier.
:param transition_probabilities_table: Table with transition probabilities between character pairs. cols == rows == vocabulary.size().
:param emission_probabilities_table: Table with observation emission probabilities. cols == rows == vocabulary.size().
:param mode: HMM Decoding algorithm. Only ``OCR_DECODER_VITERBI`` is available for the moment (http://en.wikipedia.org/wiki/Viterbi_algorithm).
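For example, the webcam_demo sample builds a decoder from the default KNN classifier and the transition probabilities table provided with the samples; a minimal sketch, assuming ``using namespace cv`` and ``using namespace cv::text``::

    // Load the 62x62 table with character pair transition probabilities
    Mat transition_p;
    FileStorage fs("OCRHMM_transitions_table.xml", FileStorage::READ);
    fs["transition_probabilities"] >> transition_p;
    fs.release();

    // Use the identity matrix when no better emission model is available
    Mat emission_p = Mat::eye(62, 62, CV_64FC1);

    string voc = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";

    Ptr<OCRHMMDecoder> decoder = OCRHMMDecoder::create(
            loadOCRHMMClassifierNM("OCRHMM_knn_model_data.xml.gz"),
            voc, transition_p, emission_p);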
OCRHMMDecoder::run
------------------
Recognize text using HMM. Takes an image as input and returns the recognized text in the output_text parameter. Optionally it also provides the Rects for individual text elements found (e.g. words), and the list of those text elements with their confidence values.
.. ocv:function:: void OCRHMMDecoder::run(Mat& image, string& output_text, vector<Rect>* component_rects=NULL, vector<string>* component_texts=NULL, vector<float>* component_confidences=NULL, int component_level=0)
:param image: Input image ``CV_8UC1`` with a single text line (or word).
:param output_text: Output text. Most likely character sequence found by the HMM decoder.
:param component_rects: If provided the method will output a list of Rects for the individual text elements found (e.g. words).
:param component_texts: If provided the method will output a list of text strings for the recognition of individual text elements found (e.g. words).
:param component_confidences: If provided the method will output a list of confidence values for the recognition of individual text elements found (e.g. words).
:param component_level: Only ``OCR_LEVEL_WORD`` is supported.
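A minimal usage sketch, assuming ``decoder`` was obtained from ``OCRHMMDecoder::create`` and ``group_img`` is a ``CV_8UC1`` binary image containing a single text line or word::

    string output;
    vector<Rect> boxes;
    vector<string> words;
    vector<float> confidences;
    decoder->run(group_img, output, &boxes, &words, &confidences, OCR_LEVEL_WORD);
    cout << "Recognized text: " << output << endl;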
loadOCRHMMClassifierNM
----------------------
Allows loading the default character classifier implicitly when creating an OCRHMMDecoder object.
.. ocv:function:: Ptr<OCRHMMDecoder::ClassifierCallback> loadOCRHMMClassifierNM(const std::string& filename)
:param filename: The XML or YAML file with the classifier model (e.g. OCRHMM_knn_model_data.xml)
The default classifier is based on the scene text recognition method proposed by Lukás Neumann & Jiri Matas in [Neumann11b]. Basically, the region (contour) in the input image is normalized to a fixed size, while retaining the centroid and aspect ratio, in order to extract a feature vector based on gradient orientations along the chain-code of its perimeter. Then, the region is classified using a KNN model trained with synthetic data of rendered characters with different standard font types.
.. [Neumann11b] Neumann L., Matas J.: Text Localization in Real-world Images using Efficiently Pruned Exhaustive Search, ICDAR 2011. The paper is available online at http://cmp.felk.cvut.cz/~neumalu1/icdar2011_article.pdf
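A short sketch of using the default classifier on its own, assuming ``character_mask`` is a ``CV_8UC1`` binary mask containing a single character::

    Ptr<OCRHMMDecoder::ClassifierCallback> classifier =
            loadOCRHMMClassifierNM("OCRHMM_knn_model_data.xml.gz");

    vector<int> out_classes;
    vector<double> out_confidences;
    classifier->eval(character_mask, out_classes, out_confidences);
    // out_classes[0] is the most likely class label, out_confidences[0] its probability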
@@ -52,7 +52,6 @@ namespace cv
namespace text
{
enum
{
OCR_LEVEL_WORD,
@@ -69,6 +68,8 @@ public:
int component_level=0) = 0;
};
/* OCR Tesseract */
class CV_EXPORTS OCRTesseract : public BaseOCR
{
public:
@@ -81,6 +82,52 @@ public:
};
/* OCR HMM Decoder */
enum decoder_mode
{
OCR_DECODER_VITERBI = 0 // Other algorithms may be added
};
class CV_EXPORTS OCRHMMDecoder : public BaseOCR
{
public:
//! The character classifier is wrapped in a callback class, hiding the feature extractor and the classifier itself
class CV_EXPORTS ClassifierCallback
{
public:
virtual ~ClassifierCallback() { }
//! The classifier must return a (ranked list of) class(es) id('s)
virtual void eval( InputArray image, std::vector<int>& out_class, std::vector<double>& out_confidence);
};
public:
//! Decode a group of regions and output the most likely sequence of characters
virtual void run(Mat& image, std::string& output_text, std::vector<Rect>* component_rects=NULL,
std::vector<std::string>* component_texts=NULL, std::vector<float>* component_confidences=NULL,
int component_level=0);
static Ptr<OCRHMMDecoder> create(const Ptr<OCRHMMDecoder::ClassifierCallback> classifier,// The character classifier with built in feature extractor
const std::string& vocabulary, // The language vocabulary (chars when ascii english text)
// size() must be equal to the number of classes
InputArray transition_probabilities_table, // Table with transition probabilities between character pairs
// cols == rows == vocabulary.size()
InputArray emission_probabilities_table, // Table with observation emission probabilities
// cols == rows == vocabulary.size()
decoder_mode mode = OCR_DECODER_VITERBI); // HMM Decoding algorithm (only Viterbi for the moment)
protected:
Ptr<OCRHMMDecoder::ClassifierCallback> classifier;
std::string vocabulary;
Mat transition_p;
Mat emission_p;
decoder_mode mode;
};
CV_EXPORTS Ptr<OCRHMMDecoder::ClassifierCallback> loadOCRHMMClassifierNM(const std::string& filename);
}
}
#endif // _OPENCV_TEXT_OCR_HPP_
<?xml version="1.0"?>
<opencv_storage>
<transition_probabilities type_id="opencv-matrix">
<rows>62</rows>
<cols>62</cols>
<dt>d</dt>
<data>
8.50520944078e-05 0.0544758664682 0.0617478205401 0.0385285987667 0.00146714862853 0.00848394641718 0.0317882202849 0.000893046991282 0.0399106953009 0.000786731873272 0.013140548586 0.118371252392 0.0354667233681 0.116691473528 0.000233893259622 0.034509887306 0.000403997448437 0.113565809058 0.0640442270891 0.196980650649 0.0180948330853 0.0160748458431 0.00967467573889 0.00372102913034 0.0153519030406 0.00550712311291 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.100866163733 0.0314799292167 0.00242153301667 0.00502933780386 0.126292260408 0.00204898947564 0.00130390239359 0.00130390239359 0.112973828816 0.00614696842693 0.0 0.289000651951 0.00484306603334 0.00186271770513 0.0848467914688 0.00186271770513 0.0 0.0757194747136 0.053366862252 0.0113625780013 0.0739498928937 0.0027940765577 0.00111763062308 0.0 0.00940672441092 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.123990600299 0.0 0.0241828669088 0.00046998504593 0.136594744713 0.0 0.0 0.107583849605 0.0869472334971 0.0 0.0763939329203 0.0394787438581 0.000299081392865 0.00128177739799 0.152574236274 0.000341807306131 0.00290536210211 0.0535355693228 0.0138431958983 0.115274513993 0.053407391583 0.0 0.0 0.0 0.0107242042299 0.000170903653066 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0556214577567 0.00263608804534 0.00263608804534 0.0282061420851 0.318834849084 0.00355871886121 0.0217477263741 0.00329511005668 0.226110452089 0.00724924212469 0.000527217609068 0.0645841571108 0.00948991696323 0.0267562936602 0.0452089099776 0.00237247924081 0.000131804402267 0.0322261763543 0.067615658363 0.00105443521814 0.0506787926717 0.011862396204 0.00546988269408 0.0 0.0119942006063 0.000131804402267 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0467747490338 0.00432414505657 0.0376672576757 0.147403599628 0.0215824585124 0.0132020357922 0.0106891845351 0.00250009566693 0.00761508731201 0.000982180440578 0.00145413727566 0.0484584869319 0.0227814839854 0.112134392898 0.00517876959578 0.0174496473079 0.00329094225544 0.22472798704 0.197022845262 0.0348482722553 0.0022194726839 0.0134061252344 0.00769162085284 0.0118626988278 0.00371187672998 0.00102044721099 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0729166666667 0.0 0.000508130081301 0.0 0.146214430894 0.111788617886 0.000762195121951 0.0 0.21468495935 0.0 0.0 0.0782520325203 0.0 0.00279471544715 0.111280487805 0.0 0.0 0.0581808943089 0.0162601626016 0.0429369918699 0.114456300813 0.0 0.000762195121951 0.0 0.0282012195122 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0787207872079 0.0 0.0 0.000351432085749 0.212440695836 0.000527148128624 0.0419961342471 0.0789843612722 0.107977508347 0.0 0.0 0.125812686698 0.00773150588649 0.0492883500264 0.0347917764892 0.000615006150062 0.0 0.110964681075 0.081356527851 0.00404146898612 0.0533298190125 0.0 0.000527148128624 0.0 0.0103672465296 0.000175716042875 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.154653160894 0.00144704711947 0.000542642669802 0.00271321334901 0.298996111061 0.00614995025776 0.0 0.00271321334901 0.184136745953 0.0 0.0 0.016460160984 0.0103102107262 0.0121190196256 0.133309215881 0.00090440444967 0.000361761779868 0.0371710228814 0.0124807814054 0.0618612643574 0.0365379397667 0.0 0.00614995025776 0.0 0.0205299810075 0.000452202224835 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0319377484564 0.0157151315233 0.0745496066988 0.0267783134568 0.0643998985029 0.0226507654572 0.0288928359976 0.000439820688489 0.000473653049142 0.000405988327836 0.00314640954073 0.0478558741436 0.0279116975387 0.299247229975 0.0865431785503 0.0175928275395 0.00182694747526 0.0263554089487 0.0856297048127 0.0803856889114 0.00159012095069 0.0491584200288 6.76647213059e-05 0.00296033155713 0.0 0.00348473314726 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.125310173697 0.0 0.0 0.0 0.263027295285 0.0 0.0 0.0 0.014888337469 0.0 0.00248138957816 0.0 0.0 0.0 0.233250620347 0.0 0.0 0.00124069478908 0.0 0.0 0.359801488834 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0458695168114 0.00668002672011 0.000890670229348 0.00267201068804 0.422845691383 0.00400801603206 0.000890670229348 0.0060120240481 0.197951458473 0.000890670229348 0.00222667557337 0.0721442885772 0.00668002672011 0.0334001336005 0.0106880427522 0.00534402137609 0.0 0.00489868626141 0.135381874861 0.00668002672011 0.00935203740815 0.0 0.00801603206413 0.0 0.0164773992429 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.10850279394 0.00172821015035 0.00357163431073 0.0179157785587 0.21712080189 0.00385966933579 0.0033988132957 0.000460856040094 0.153148222824 0.0 0.00558787948615 0.111066305663 0.00483898842099 0.0107149029322 0.0684659254565 0.00397488334581 0.0 0.00138256812028 0.0326055648367 0.0247710121551 0.0354859150873 0.00639437755631 0.000806498070165 0.0 0.18414079152 5.76070050118e-05 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.142847308275 0.0638854467851 0.000619578686493 0.000275368305108 0.226215062646 0.00385515627151 0.0 0.000481894533939 0.159920143192 0.0 0.0 0.00530083987333 0.0549359768691 0.00901831199229 0.0927991188214 0.153242461793 0.0 0.000963789067878 0.037518931571 0.000413052457662 0.0410298774611 0.00068842076277 0.000275368305108 0.0 0.00571389233099 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0555165707641 0.00324799005554 0.0619123042685 0.0696914409447 0.155843374702 0.016380295527 0.238707219761 0.00453114662069 0.0559175571907 0.00272670770094 0.0130721575075 0.0076989393909 0.00364897648215 0.0156384706378 0.0215730697516 0.00425045612206 0.00272670770094 0.00411011087275 0.0990837460152 0.134691340698 0.00978406880927 0.011588507729 0.00312769412755 0.000280690498627 0.0034885819115 0.000761874210558 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0216214746118 0.0155565829911 0.0320378579782 0.0258913758873 0.00579292338655 0.00976365960456 0.0219206396693 0.00244771410699 0.0185754303897 0.000815904702331 0.0101444151323 0.0608664907939 0.0677744839403 0.23911447143 0.0419103048764 0.0337512578531 0.000435149174576 0.141885827735 0.0423726508744 0.0431341619299 0.0949985041747 0.0226005602546 0.0346215562022 0.00329081563273 0.00709837091028 0.00157741575784 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0966494041305 0.00112310476072 0.000623947089287 0.000686341798216 0.202595619891 0.00124789417857 0.000748736507144 0.0483558994197 0.0851687776877 0.0 0.000873525925002 0.101516191427 0.00174705185 0.00118549946964 0.104760716291 0.0722530729394 0.0 0.145878829475 0.0374992200661 0.0552817121108 0.0366880888501 0.0 0.000998315342859 0.0 0.00411805078929 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.000904977375566 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.999095022624 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.143023601179 0.00941305490187 0.0207224291165 0.0281934702643 0.211610957527 0.00680847174941 0.0125659713496 0.0013479860175 0.139322351436 9.13888825424e-05 0.0106924992575 0.0147136100893 0.0268683314675 0.021065137426 0.0920742991615 0.0123831935845 0.00036555553017 0.029153053531 0.110146450684 0.050538052046 0.0292215951929 0.00959583266696 0.00228472206356 0.0 0.0177065959926 9.13888825424e-05 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0315670455934 0.00171128223934 0.0422320009779 0.001466813348 0.125473658477 0.00409485392984 0.002933626696 0.0628896222956 0.108330277472 0.000733406674001 0.00928981787068 0.0301002322454 0.0154626573768 0.0203825938149 0.0375565334311 0.0439738418286 0.00443099865542 0.00161960640508 0.200647842562 0.199425498105 0.0422931182007 0.0 0.00589781200342 6.11172228334e-05 0.00742574257426 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0828099447776 0.000805820870761 0.011542198943 0.000237006138459 0.250088877302 0.00355509207688 0.000805820870761 0.0456473822672 0.276775768492 0.0 4.74012276918e-05 0.028914748892 0.00312848102766 0.006920579243 0.055056525964 0.00109022823691 0.0 0.0702723200531 0.0615741947716 0.0335126679781 0.0341762851658 0.000189604910767 0.0037683976015 0.0 0.0284170360012 0.000663617187685 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0407358156028 0.0375886524823 0.0419769503546 0.0322695035461 0.043085106383 0.0132092198582 0.0246010638298 8.86524822695e-05 0.0436170212766 0.0 0.000886524822695 0.105230496454 0.0573581560284 0.148670212766 0.00833333333333 0.0339095744681 0.000177304964539 0.131826241135 0.144060283688 0.0859485815603 0.000487588652482 0.000531914893617 0.0 0.00168439716312 0.000975177304965 0.00274822695035 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.131552670994 0.0 0.000124812780829 0.0 0.623689465801 0.0 0.0 0.0 0.18072890664 0.0 0.0 0.0 0.000124812780829 0.0 0.0567898152771 0.0 0.0 0.0 0.0 0.0 0.00399400898652 0.0 0.0 0.0 0.00299550673989 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.196585140748 0.00646054453161 0.00230733733272 0.0133825565298 0.193124134749 0.00599907706507 0.000922934933087 0.0539916935856 0.174665436087 0.0 0.0073834794647 0.0535302261191 0.00323027226581 0.0779880018459 0.095062298108 0.000922934933087 0.0 0.03737886479 0.0673742501154 0.00553760959852 0.00138440239963 0.0 0.0 0.0 0.00276880479926 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.092132505176 0.0 0.123188405797 0.0 0.15734989648 0.0 0.0 0.0310559006211 0.166149068323 0.0 0.0 0.00621118012422 0.0 0.00103519668737 0.0207039337474 0.230848861284 0.00310559006211 0.0 0.0 0.142857142857 0.0175983436853 0.00207039337474 0.00103519668737 0.000517598343685 0.00414078674948 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0538403614458 0.0143072289157 0.042921686747 0.0143072289157 0.140436746988 0.00753012048193 0.00978915662651 0.00451807228916 0.181852409639 0.0 0.0 0.0504518072289 0.0730421686747 0.0557228915663 0.0368975903614 0.0798192771084 0.0 0.0286144578313 0.157379518072 0.0316265060241 0.00263554216867 0.0 0.0135542168675 0.000753012048193 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.104328523862 0.0 0.0 0.0 0.369589345172 0.0 0.0 0.0 0.19089900111 0.0 0.00221975582686 0.0732519422863 0.00443951165372 0.0 0.0532741398446 0.0 0.0 0.0 0.0 0.0 0.00887902330744 0.00887902330744 0.00443951165372 0.0 0.0332963374029 0.146503884573 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0094510076442 0.0116747741487 0.012022237665 0.000799166087561 0.00371785962474 0.00357887421821 6.94927032662e-05 0.00267546907575 3.47463516331e-05 6.94927032662e-05 0.0102501737318 0.00681028492008 0.0127171646977 3.47463516331e-05 0.0097637248089 0.00045170257123 0.0105281445448 0.0102849200834 0.00535093815149 0.00615010423905 0.00277970813065 0.00156358582349 0.000868658790827 0.000138985406532 0.000173731758165 6.94927032662e-05 0.0492355802641 0.0562890896456 0.0374913134121 0.00159833217512 0.00879082696317 0.0277623349548 0.000764419735928 0.0339471855455 0.000660180681028 0.0107713690063 0.101841556637 0.032383599722 0.10170257123 0.000208478109798 0.0330785267547 0.000555941626129 0.0980542043085 0.0574704656011 0.16362056984 0.0178596247394 0.0145239749826 0.00868658790827 0.00347463516331 0.0126129256428 0.00458651841557 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0593910159783 0.0 0.0 0.0 0.0530599939705 0.0 0.0 0.0 0.0276354135263 0.0 0.0 0.0354738217265 0.0 0.0 0.0400964727163 0.0 0.0 0.0477338960908 0.000100492412823 0.0 0.0410009044317 0.0 0.0 0.0 0.00251231032057 0.0 0.0841121495327 0.0169832177671 0.0013064013667 0.00271329514622 0.0946638528791 0.00110541654105 0.00070344688976 0.00070344688976 0.0747663551402 0.00331624962315 0.0 0.173650889358 0.00261280273339 0.00100492412823 0.065822530399 0.00100492412823 0.0 0.0647171138579 0.0288413224802 0.00613003718219 0.0603959401065 0.00150738619234 0.000602954476937 0.0 0.00633102200784 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0294084251164 0.0 0.0 5.67730214602e-05 0.00573407516748 0.0 0.0 0.0255478596571 0.00482570682412 0.0 0.0 0.0181673668673 5.67730214602e-05 0.0 0.106335869195 0.00011354604292 0.0 0.0204950607471 0.0 0.0 0.0112410582491 0.0 0.0 0.0 0.00164641762235 0.00011354604292 0.0970818666969 0.0 0.0160667650732 0.000340638128761 0.0936187123879 0.0 0.0 0.0842511638469 0.0601794027478 0.0 0.0507550811854 0.0353128193482 0.000227092085841 0.000851595321903 0.154536164415 0.000283865107301 0.00193028272965 0.0458158283184 0.00919722947655 0.0765868059498 0.0411036675372 0.0 0.0 0.0 0.00794822300443 0.000170319064381 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0157669237361 0.0 0.0 0.0 0.0894601542416 0.0 0.0 0.0 0.0814910025707 0.0 0.0 0.0 0.0 0.0 0.0176520994002 0.0 0.0 0.0177377892031 0.0 0.0 0.0083119108826 0.0 0.0012853470437 0.0 0.00154241645244 0.0 0.0440445586975 0.00171379605827 0.00171379605827 0.0183376178235 0.252013710368 0.00231362467866 0.0141388174807 0.00214224507284 0.187746358183 0.00471293916024 0.000342759211654 0.0419880034276 0.00616966580977 0.0173950299914 0.0382176520994 0.00154241645244 8.56898029135e-05 0.0298200514139 0.0439588688946 0.000685518423308 0.0371036846615 0.00771208226221 0.00419880034276 0.0 0.00856898029135 8.56898029135e-05 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0019322058157 0.000119271963932 0.000596359819661 0.00100188449703 9.54175711457e-05 0.000548651034088 0.000190835142291 0.0 0.000310107106224 0.000262398320651 9.54175711457e-05 0.00388826602419 0.00329190620453 0.00970873786408 0.0 0.000763340569166 0.000811049354739 0.00114501085375 0.00166980749505 0.000429379070156 0.000190835142291 0.00236158488586 7.15631783593e-05 0.0133107511748 0.000500942248515 0.0 0.0447031320818 0.00410295555927 0.035519190859 0.138331623769 0.0202285250829 0.012618973784 0.0100904081487 0.00233773049307 0.00727558979986 0.0010495932826 0.0014074091744 0.0472555521099 0.0229479258605 0.109706352425 0.00484244173565 0.0166980749505 0.00348274134682 0.210705851483 0.185062379237 0.0327997900813 0.00217074974357 0.0137162758522 0.00722788101429 0.0177476682331 0.00372128527468 0.000954175711457 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0470732080086 0.0 0.0 0.0 0.0255234601865 0.0 0.0 0.0 0.0354577410974 0.0 0.0 0.0409598043711 0.0 0.0 0.0602170258291 0.0 0.0 0.038820113098 0.0 0.0 0.0175760354577 0.0 0.0 0.0 0.0 0.0 0.0674002751032 0.0 0.000305670181874 0.0 0.100718324927 0.0672474400122 0.000458505272811 0.0 0.14687452239 0.0 0.0 0.0675531101941 0.0 0.00168118600031 0.0970502827449 0.0 0.0 0.0544092923735 0.00978144581996 0.0258291303683 0.0776402261959 0.0 0.000458505272811 0.0 0.016964695094 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0279106858054 0.0 0.0 0.0 0.0154173312068 0.0 0.0 0.00146198830409 0.0104997341839 0.0 0.0 0.0204678362573 0.0 0.000930356193514 0.0151515151515 0.000132908027645 0.0 0.0530303030303 0.0 0.0 0.0152844231792 0.0 0.0 0.0 0.00212652844232 0.0 0.0734981392876 0.0 0.0 0.00026581605529 0.168394471026 0.000398724082935 0.0317650186071 0.0604731525784 0.0869218500797 0.0 0.0 0.105396065922 0.00584795321637 0.0377458798511 0.0338915470494 0.000531632110579 0.0 0.110446570973 0.0615364167996 0.00305688463583 0.0479797979798 0.0 0.000398724082935 0.0 0.00890483785221 0.000132908027645 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0484241850399 0.0 0.0 0.0 0.0346273501961 0.0 0.0 0.0 0.0175842012715 0.0 0.0 0.0 0.0 0.0 0.0416610307047 0.0 0.0 0.000135263086704 0.0 0.0 0.020559989179 0.0 0.0 0.0 0.00500473420803 0.000135263086704 0.139862031652 0.00108210469363 0.000405789260111 0.00202894630055 0.240903557419 0.00459894494792 0.0 0.00202894630055 0.1464899229 0.0 0.0 0.01230894089 0.00770999594211 0.00906262680914 0.120519410253 0.000676315433518 0.000270526173407 0.0278641958609 0.00933315298255 0.0462599756526 0.0376031381036 0.0 0.00459894494792 0.0 0.0178547274449 0.000405789260111 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 3.0466441215e-05 0.000396063735795 0.00143192273711 3.0466441215e-05 3.0466441215e-05 0.00036559729458 0.0 6.093288243e-05 0.0 0.0 0.00137098985468 0.010480455778 0.0476495140603 0.00012186576486 0.0 0.0 0.00225451664991 0.00109679188374 0.00079212747159 0.0 0.00018279864729 0.0 3.0466441215e-05 0.0 0.0 0.028760320507 0.014166895165 0.0673308350852 0.0248301495902 0.0580081040734 0.0204125156141 0.0262011394449 0.000396063735795 0.000456996618225 0.00036559729458 0.002833379033 0.043780276026 0.0303750418914 0.293300429577 0.0779940895104 0.0158425494318 0.00164518782561 0.0248606160314 0.077658958657 0.0727843280626 0.00143192273711 0.044359138409 6.093288243e-05 0.00268104682692 0.0 0.00313804344515 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0665859564165 0.0 0.0 0.0 0.0556900726392 0.0 0.0 0.0 0.00968523002421 0.0 0.0 0.0 0.0 0.0 0.089588377724 0.0 0.0 0.00121065375303 0.0 0.0 0.118644067797 0.0 0.0 0.0 0.0 0.0 0.0944309927361 0.0 0.0 0.0 0.15617433414 0.0 0.0 0.0 0.0121065375303 0.0 0.00121065375303 0.0 0.0 0.0 0.158595641646 0.0 0.0 0.00121065375303 0.0 0.0 0.234866828087 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.017523364486 0.0 0.0 0.000389408099688 0.0323208722741 0.0 0.0 0.00545171339564 0.0 0.0272585669782 0.0 0.0 0.0 0.0 0.0 0.0 0.000778816199377 0.0 0.0 0.0 0.0 0.0 0.0401090342679 0.00584112149533 0.000778816199377 0.00233644859813 0.378504672897 0.0035046728972 0.000778816199377 0.00545171339564 0.189252336449 0.000778816199377 0.00194704049844 0.0658099688474 0.00584112149533 0.0428348909657 0.00934579439252 0.00467289719626 0.0 0.00428348909657 0.118380062305 0.00584112149533 0.00856697819315 0.0 0.00700934579439 0.0 0.0144080996885 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0148755154236 0.0 0.0 0.0 0.0127355289942 0.0 0.0 0.0 0.0153974633332 0.0 0.0 0.0 0.0 0.0 0.0140403987682 0.0 0.0 0.0 0.0 0.0 0.00480192076831 0.0 0.0 0.0 0.000782921864398 0.0 0.105746646485 0.0015658437288 0.00323607703951 0.0162325799885 0.203089931625 0.00349705099431 0.00307949266663 0.000417558327679 0.146458583433 0.0 0.00506289472311 0.100631556971 0.00438436244063 0.00970823111853 0.0690537084399 0.00360144057623 0.0 0.00125267498304 0.0295422516833 0.0224437601127 0.0345529516154 0.00579362179654 0.000730727073438 0.0 0.167232110235 5.21947909599e-05 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0529742527375 0.0 0.00029594554602 0.0 0.0317648219394 0.0 0.0 9.86485153398e-05 0.0415310249581 0.0 0.0 9.86485153398e-05 0.0 0.00029594554602 0.0378810298905 0.0 0.0 0.00019729703068 9.86485153398e-05 0.0 0.0217026733748 0.0 0.0 0.0 0.00207161882214 0.0 0.128834961034 0.0457729111177 0.000591891092039 0.00019729703068 0.177961921673 0.00276215842952 0.0 0.000394594061359 0.135345763046 0.0 0.0 0.00384729209825 0.0393607576206 0.00660945052777 0.0854296142843 0.109795797573 0.0 0.000789188122719 0.0269310446878 0.00029594554602 0.0402485942587 0.000493242576699 0.00019729703068 0.0 0.00512972279767 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.00405734379226 0.0 0.0 0.0 0.00676223965377 3.86413694501e-05 0.0 0.0 0.00274353723096 0.0 0.0 0.0 0.0 0.0 0.00811468758453 0.0 0.0 3.86413694501e-05 0.0 0.0 0.00239576490591 0.0 0.0 0.0 7.72827389003e-05 0.0 0.0555276478998 0.00312995092546 0.059662274431 0.0671587001043 0.153560802195 0.0158043201051 0.230032072337 0.00436647474787 0.0552571583137 0.00262761312261 0.0125970864407 0.00741914293443 0.00351636461996 0.0150701340856 0.0248464005564 0.00409598516171 0.00262761312261 0.00398006105336 0.0954828239113 0.129796359983 0.0106263765988 0.0111673557711 0.00301402681711 0.000270489586151 0.00340044051161 0.000734186019553 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.000760726239984 0.00699868140785 0.00213003347195 0.00101430165331 0.000152145247997 0.00238360888528 0.0 0.000101430165331 0.000659296074653 0.0 0.000253575413328 0.000405720661325 0.000811441322649 0.00121716198397 0.000253575413328 0.0047165026879 0.0 0.00583223450654 0.000710011157318 0.000304290495994 0.00542651384522 0.00968658078913 0.000760726239984 0.000355005578659 0.000152145247997 0.0 0.0205396084796 0.0180038543463 0.030936200426 0.0246475301755 0.00547722892788 0.0102951617811 0.0204381783142 0.00233289380262 0.0176488487676 0.000760726239984 0.0095851506238 0.0569530378335 0.0635967136626 0.22355208439 0.0392027589005 0.0338269601379 0.000405720661325 0.135206410386 0.0398620549751 0.0403692058018 0.0912871487981 0.0259154072421 0.0326605132366 0.0032457652906 0.00669439091186 0.0014707373973 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0404912836767 0.0 0.000158478605388 7.92393026941e-05 0.0355784469097 0.0 0.0 0.00625990491284 0.01735340729 0.0 0.0 0.0184627575277 0.0 7.92393026941e-05 0.0282884310618 0.0 0.0 0.0790808240887 0.00118858954041 0.0 0.0160063391442 0.0 0.0 0.0 0.000316957210777 0.0 0.081616481775 0.000713153724247 0.000475435816165 0.000475435816165 0.146434231379 0.000792393026941 0.000475435816165 0.0338351822504 0.0627575277338 0.0 0.000554675118859 0.0736925515055 0.00110935023772 0.000792393026941 0.0806656101426 0.0458795562599 0.0 0.132171156894 0.0244057052298 0.0351030110935 0.0312995245642 0.0 0.000633914421553 0.0 0.00277337559429 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.00115473441109 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.240184757506 0.0 0.0 0.0 0.0 0.0 0.00115473441109 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.757505773672 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0120088165995 0.0 3.80025841757e-05 0.0 0.0800334422741 0.0 0.0 0.000570038762636 0.00646043930987 0.0 0.0 0.0 0.0 0.0 0.00805654784525 0.0 0.0 0.0 3.80025841757e-05 0.0 0.00497833852702 0.0 0.0 0.0 3.80025841757e-05 0.0 0.12495249677 0.0078285323402 0.0172531732158 0.0234475944364 0.216006688455 0.00566238504218 0.0104507106483 0.0014060956145 0.119100098807 7.60051683514e-05 0.00889260469712 0.0122368321046 0.0223455194953 0.017519191305 0.0806034810367 0.0102987003116 0.000304020673406 0.0242456487041 0.0916242304477 0.0420308580984 0.0267918218439 0.0079805426769 0.00190012920879 0.0 0.0147450026602 7.60051683514e-05 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0124152054269 0.0 0.0137377874483 0.0 0.0185161482998 0.0 0.0 0.0170655744699 0.0123298775545 0.0 0.00435172148982 0.00908741840522 0.00443704936217 0.00482102478775 0.0107086479799 0.0176202056402 0.00285848372371 4.26639361748e-05 0.0 0.0348990997909 0.0287554929818 0.0 0.00627159861769 0.0 0.00337045095781 0.0 0.0282435257477 0.00119459021289 0.0363496736209 0.00102393446819 0.0968471351167 0.00285848372371 0.00204786893639 0.0524339775588 0.081786765647 0.000511967234097 0.00866077904347 0.0255556977687 0.0130125005333 0.0166389351082 0.0315713127693 0.0395068048978 0.00452237723452 0.00115192627672 0.140065702462 0.156661973634 0.0439011903238 0.0 0.00725286914971 4.26639361748e-05 0.00686889372413 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.010395010395 0.0 4.158004158e-05 0.0 0.0128898128898 0.0 0.0 0.0122245322245 0.00756756756757 0.0 0.0 0.0 0.0 0.0 0.00935550935551 0.0 0.0 0.0216632016632 0.0 8.31600831601e-05 0.00390852390852 0.0 0.00253638253638 0.0 0.00120582120582 0.0 0.0778378378378 0.000706860706861 0.0101455301455 0.0002079002079 0.225821205821 0.0031185031185 0.000706860706861 0.0461538461538 0.24656964657 0.0 4.158004158e-05 0.0253638253638 0.00274428274428 0.00607068607069 0.052972972973 0.000956340956341 0.0 0.072474012474 0.0540124740125 0.0294386694387 0.0319334719335 0.00016632016632 0.0045738045738 0.0 0.0255301455301 0.000582120582121 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
7.35672772751e-05 0.0002942691091 7.35672772751e-05 0.0 0.0 0.0 0.000367836386375 0.0 0.0 0.0 0.0 0.000588538218201 0.00044140366365 0.101081438976 0.0 0.00537041124108 0.0 0.0014713455455 0.0023541528728 0.00125064371368 7.35672772751e-05 0.0 0.0 0.0 0.0 0.0 0.0338409475465 0.0313396601192 0.0348708894284 0.0267784889281 0.0357536967557 0.010961524314 0.020598837637 7.35672772751e-05 0.0361951004193 0.0 0.000735672772751 0.0876186272346 0.0478187302288 0.173913043478 0.00691532406386 0.0308246891783 0.00014713455455 0.110130214081 0.120723902008 0.071948797175 0.00044140366365 0.00044140366365 0.0 0.00139777826823 0.000809240050026 0.00228058559553 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0331849453662 0.0 0.000202347227843 0.0 0.0333872925941 0.0 0.0 0.0 0.0400647511129 0.0 0.0 0.0 0.000202347227843 0.0 0.0176042088223 0.0 0.0 0.0 0.0 0.0 0.00161877782274 0.0 0.0 0.0 0.0 0.0 0.123229461756 0.0 0.000202347227843 0.0 0.522258195063 0.0 0.0 0.0 0.166531768515 0.0 0.0 0.0 0.000202347227843 0.0 0.0548360987454 0.0 0.0 0.0 0.0 0.0 0.00404694455686 0.0 0.0 0.0 0.00242816673412 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0719386405713 0.0 0.0 0.0 0.0409944459138 0.0 0.0 0.0444326897646 0.0573922242793 0.0 0.0 0.0 0.0 0.0 0.0454906109495 0.0 0.0 0.024332187252 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.148637926474 0.00370272414705 0.00132240148109 0.00766992859032 0.131182226924 0.00343824385083 0.000528960592436 0.0531605395398 0.128801904258 0.0 0.00423168473949 0.0306797143613 0.00185136207353 0.0446971700608 0.0772282464956 0.000528960592436 0.0 0.0335889976197 0.0386141232478 0.00317376355462 0.000793440888654 0.0 0.0 0.0 0.00158688177731 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.00405268490375 0.0 0.0 0.0 0.00506585612969 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00405268490375 0.0 0.00101317122594 0.0 0.0 0.0901722391084 0.0 0.120567375887 0.0 0.156028368794 0.0 0.0 0.0303951367781 0.165146909828 0.0 0.0 0.00607902735562 0.0 0.00101317122594 0.0202634245187 0.225937183384 0.00303951367781 0.0 0.0 0.139817629179 0.0172239108409 0.00405268490375 0.00101317122594 0.00101317122594 0.00405268490375 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0142083897158 0.0 0.0 0.0 0.0290933694181 0.0 0.0 0.0 0.00338294993234 0.0 0.0 0.0 0.0 0.0 0.0175913396482 0.0 0.0 0.0 0.0 0.0 0.00338294993234 0.0 0.0 0.0 0.0 0.0 0.0554803788904 0.0128552097429 0.0385656292287 0.0128552097429 0.140730717185 0.00676589986468 0.00879566982409 0.00405953991881 0.165087956698 0.0 0.0 0.0453315290934 0.0656292286874 0.0500676589986 0.041948579161 0.0717185385656 0.0 0.0257104194858 0.141407307172 0.0284167794317 0.00405953991881 0.0 0.0121786197564 0.000676589986468 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0078125 0.0 0.0 0.0 0.029296875 0.0 0.0 0.0 0.00390625 0.0 0.0 0.0 0.0 0.0 0.03515625 0.0 0.0 0.0 0.0 0.0 0.00390625 0.0 0.0 0.0 0.0 0.0 0.095703125 0.0 0.0 0.0 0.33984375 0.0 0.0 0.0 0.169921875 0.0 0.001953125 0.064453125 0.00390625 0.0 0.064453125 0.0 0.0 0.0 0.0 0.0 0.009765625 0.0078125 0.00390625 0.0 0.029296875 0.12890625 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1
</data></transition_probabilities>
</opencv_storage>
@@ -46,6 +46,7 @@ public:
};
//OCR recognition is done in parallel for different detections
template <class T>
class Parallel_OCR: public cv::ParallelLoopBody
{
private:
@@ -54,12 +55,12 @@ private:
vector< vector<Rect> > &boxes;
vector< vector<string> > &words;
vector< vector<float> > &confidences;
vector< Ptr<OCRTesseract> > &ocrs;
vector< Ptr<T> > &ocrs;
public:
Parallel_OCR(vector<Mat> &_detections, vector<string> &_outputs, vector< vector<Rect> > &_boxes,
vector< vector<string> > &_words, vector< vector<float> > &_confidences,
vector< Ptr<OCRTesseract> > &_ocrs)
vector< Ptr<T> > &_ocrs)
: detections(_detections), outputs(_outputs), boxes(_boxes), words(_words),
confidences(_confidences), ocrs(_ocrs)
{}
@@ -88,6 +89,7 @@ int main(int argc, char* argv[])
cout << " Usage: " << argv[0] << " [camera_index]" << endl << endl;
cout << " Press 'e' to switch between MSER/CSER regions." << endl;
cout << " Press 'g' to switch between Horizontal and Arbitrary oriented grouping." << endl;
cout << " Press 'o' to switch between OCRTesseract/OCRHMMDecoder recognition." << endl;
cout << " Press 's' to scale down frame size to 320x240." << endl;
cout << " Press 'ESC' to exit." << endl << endl;
@@ -98,7 +100,7 @@ int main(int argc, char* argv[])
int RECOGNITION = 0;
char *region_types_str[2] = {const_cast<char *>("ERStats"), const_cast<char *>("MSER")};
char *grouping_algorithms_str[2] = {const_cast<char *>("exhaustive_search"), const_cast<char *>("multioriented")};
char *recognitions_str[3] = {const_cast<char *>("Tesseract"), const_cast<char *>("NM_chain_features + KNN"), const_cast<char *>("NM_chain_features + MLP")};
char *recognitions_str[2] = {const_cast<char *>("Tesseract"), const_cast<char *>("NM_chain_features + KNN")};
Mat frame,grey,orig_grey,out_img;
vector<Mat> channels;
@@ -119,6 +121,7 @@ int main(int argc, char* argv[])
//double t_r = getTickCount();
//Initialize OCR engine (we initialize 10 instances in order to run several recognitions in parallel)
cout << "Initializing OCR engines ..." << endl;
int num_ocrs = 10;
vector< Ptr<OCRTesseract> > ocrs;
for (int o=0; o<num_ocrs; o++)
@@ -126,6 +129,22 @@ int main(int argc, char* argv[])
ocrs.push_back(OCRTesseract::create());
}
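// Load the character pair transition probabilities table (62x62, one row/column per vocabulary character)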
Mat transition_p;
string filename = "OCRHMM_transitions_table.xml";
FileStorage fs(filename, FileStorage::READ);
fs["transition_probabilities"] >> transition_p;
fs.release();
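// Emission probabilities start as the identity matrix; the decoder overwrites them per character with the classifier confidences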
Mat emission_p = Mat::eye(62,62,CV_64FC1);
string voc = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
vector< Ptr<OCRHMMDecoder> > decoders;
for (int o=0; o<num_ocrs; o++)
{
decoders.push_back(OCRHMMDecoder::create(loadOCRHMMClassifierNM("OCRHMM_knn_model_data.xml.gz"),
voc, transition_p, emission_p));
}
cout << " Done!" << endl;
//cout << "TIME_OCR_INITIALIZATION_ALT = "<< ((double)getTickCount() - t_r)*1000/getTickFrequency() << endl;
@@ -216,7 +235,12 @@ int main(int argc, char* argv[])
float scale_img = (float)(600.f/frame.rows);
float scale_font = (float)(2-scale_img)/1.4f;
vector<string> words_detection;
float min_confidence1 = 51.f, min_confidence2 = 60.f;
float min_confidence1 = 0.f, min_confidence2 = 0.f;
if (RECOGNITION == 0)
{
min_confidence1 = 51.f; min_confidence2 = 60.f;
}
vector<Mat> detections;
@@ -238,25 +262,32 @@ int main(int argc, char* argv[])
vector< vector<string> > words((int)detections.size());
vector< vector<float> > confidences((int)detections.size());
// parallel process detections in batches of ocrs.size()
for (int i=0; i<(int)detections.size(); i=i+(int)ocrs.size())
// parallel process detections in batches of ocrs.size() (== num_ocrs)
for (int i=0; i<(int)detections.size(); i=i+(int)num_ocrs)
{
Range r;
if (i+(int)ocrs.size() <= (int)detections.size())
r = Range(i,i+(int)ocrs.size());
if (i+(int)num_ocrs <= (int)detections.size())
r = Range(i,i+(int)num_ocrs);
else
r = Range(i,(int)detections.size());
parallel_for_(r, Parallel_OCR(detections, outputs, boxes, words, confidences, ocrs));
switch(RECOGNITION)
{
case 0:
parallel_for_(r, Parallel_OCR<OCRTesseract>(detections, outputs, boxes, words, confidences, ocrs));
break;
case 1:
parallel_for_(r, Parallel_OCR<OCRHMMDecoder>(detections, outputs, boxes, words, confidences, decoders));
break;
}
}
for (int i=0; i<(int)detections.size(); i++)
{
outputs[i].erase(remove(outputs[i].begin(), outputs[i].end(), '\n'), outputs[i].end());
//cout << "OCR output = \"" << output << "\" lenght = " << output.size() << endl;
//cout << "OCR output = \"" << outputs[i] << "\" lenght = " << outputs[i].size() << endl;
if (outputs[i].size() < 3)
continue;
@@ -308,15 +339,15 @@ int main(int argc, char* argv[])
{
case 103: //g
GROUPING_ALGORITHM = (GROUPING_ALGORITHM+1)%2;
cout << "Grouping switched to " << GROUPING_ALGORITHM << endl;
cout << "Grouping switched to " << grouping_algorithms_str[GROUPING_ALGORITHM] << endl;
break;
case 111: //o
RECOGNITION = (RECOGNITION+1)%2;
cout << "OCR switched to " << recognitions_str[RECOGNITION] << endl;
break;
//case 111: //o
// RECOGNITION = (RECOGNITION+1)%3;
// cout << "OCR switched to " << RECOGNITION << endl;
// break;
case 114: //r
REGION_TYPE = (REGION_TYPE+1)%2;
cout << "Regions switched to " << REGION_TYPE << endl;
cout << "Regions switched to " << region_types_str[REGION_TYPE] << endl;
break;
case 115: //s
downsize = !downsize;
/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2000-2008, Intel Corporation, all rights reserved.
// Copyright (C) 2009, Willow Garage Inc., all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistribution's of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistribution's in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/
#include "precomp.hpp"
#include "opencv2/imgproc.hpp"
#include "opencv2/ml.hpp"
#include <iostream>
#include <fstream>
#include <queue>
namespace cv
{
namespace text
{
using namespace std;
using namespace cv::ml;
/* OCR HMM Decoder */
void OCRHMMDecoder::run(Mat& image, string& output_text, vector<Rect>* component_rects,
vector<string>* component_texts, vector<float>* component_confidences,
int component_level)
{
CV_Assert( (image.type() == CV_8UC1) || (image.type() == CV_8UC3) );
CV_Assert( (component_level == OCR_LEVEL_TEXTLINE) || (component_level == OCR_LEVEL_WORD) );
output_text.clear();
if (component_rects != NULL)
component_rects->clear();
if (component_texts != NULL)
component_texts->clear();
if (component_confidences != NULL)
component_confidences->clear();
}
void OCRHMMDecoder::ClassifierCallback::eval( InputArray image, vector<int>& out_class, vector<double>& out_confidence)
{
CV_Assert(( image.getMat().type() == CV_8UC3 ) || ( image.getMat().type() == CV_8UC1 ));
out_class.clear();
out_confidence.clear();
}
bool sort_rect_horiz (Rect a,Rect b);
bool sort_rect_horiz (Rect a,Rect b) { return (a.x<b.x); }
class OCRHMMDecoderImpl : public OCRHMMDecoder
{
public:
//Default constructor
OCRHMMDecoderImpl( Ptr<OCRHMMDecoder::ClassifierCallback> _classifier,
const string& _vocabulary,
InputArray transition_probabilities_table,
InputArray emission_probabilities_table,
decoder_mode _mode)
{
classifier = _classifier;
transition_p = transition_probabilities_table.getMat();
emission_p = emission_probabilities_table.getMat();
vocabulary = _vocabulary;
mode = _mode;
}
~OCRHMMDecoderImpl()
{
}
void run( Mat& image,
string& out_sequence,
vector<Rect>* component_rects,
vector<string>* component_texts,
vector<float>* component_confidences,
int component_level)
{
CV_Assert( (image.type() == CV_8UC1) || (image.type() == CV_8UC3) );
CV_Assert( (image.cols > 0) && (image.rows > 0) );
CV_Assert( component_level == OCR_LEVEL_WORD );
out_sequence.clear();
if (component_rects != NULL)
component_rects->clear();
if (component_texts != NULL)
component_texts->clear();
if (component_confidences != NULL)
component_confidences->clear();
// First we split a line into words
vector<Mat> words_mask;
vector<Rect> words_rect;
/// Find contours
vector<vector<Point> > contours;
vector<Vec4i> hierarchy;
Mat tmp;
image.copyTo(tmp);
findContours( tmp, contours, hierarchy, RETR_EXTERNAL, CHAIN_APPROX_SIMPLE, Point(0, 0) );
if (contours.size() < 6)
{
//do not split lines with less than 6 characters
words_mask.push_back(image);
words_rect.push_back(Rect(0,0,image.cols,image.rows));
}
else
{
Mat_<float> vector_w((int)image.cols,1);
reduce(image, vector_w, 0, REDUCE_SUM, -1);
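// vector_w holds the column-wise sums of the binary mask: zero-valued columns are gaps between characters/words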
vector<int> spaces;
vector<int> spaces_start;
vector<int> spaces_end;
int space_count=0;
int last_one_idx;
int s_init = 0, s_end=vector_w.cols;
for (int s=0; s<vector_w.cols; s++)
{
if (vector_w.at<float>(0,s) == 0)
s_init = s+1;
else
break;
}
for (int s=vector_w.cols-1; s>=0; s--)
{
if (vector_w.at<float>(0,s) == 0)
s_end = s;
else
break;
}
for (int s=s_init; s<s_end; s++)
{
if (vector_w.at<float>(0,s) == 0)
{
space_count++;
} else {
if (space_count!=0)
{
spaces.push_back(space_count);
spaces_start.push_back(last_one_idx);
spaces_end.push_back(s-1);
}
space_count = 0;
last_one_idx = s;
}
}
Scalar mean_space,std_space;
meanStdDev(Mat(spaces),mean_space,std_space);
int num_word_spaces = 0;
int last_word_space_end = 0;
for (int s=0; s<(int)spaces.size(); s++)
{
if (spaces_end.at(s)-spaces_start.at(s) > mean_space[0]+(mean_space[0]*1.1)) //this 1.1 is a param?
{
if (num_word_spaces == 0)
{
//cout << " we have a word from 0 to " << spaces_start.at(s) << endl;
Mat word_mask;
Rect word_rect = Rect(0,0,spaces_start.at(s),image.rows);
image(word_rect).copyTo(word_mask);
words_mask.push_back(word_mask);
words_rect.push_back(word_rect);
}
else
{
//cout << " we have a word from " << last_word_space_end << " to " << spaces_start.at(s) << endl;
Mat word_mask;
Rect word_rect = Rect(last_word_space_end,0,spaces_start.at(s)-last_word_space_end,image.rows);
image(word_rect).copyTo(word_mask);
words_mask.push_back(word_mask);
words_rect.push_back(word_rect);
}
num_word_spaces++;
last_word_space_end = spaces_end.at(s);
}
}
//cout << " we have a word from " << last_word_space_end << " to " << vector_w.cols << endl << endl << endl;
Mat word_mask;
Rect word_rect = Rect(last_word_space_end,0,vector_w.cols-last_word_space_end,image.rows);
image(word_rect).copyTo(word_mask);
words_mask.push_back(word_mask);
words_rect.push_back(word_rect);
}
for (int w=0; w<(int)words_mask.size(); w++)
{
vector< vector<int> > observations;
vector< vector<double> > confidences;
vector<int> obs;
// First find contours and sort by x coordinate of bbox
words_mask[w].copyTo(tmp);
if (tmp.empty())
continue;
contours.clear();
hierarchy.clear();
/// Find contours
findContours( tmp, contours, hierarchy, RETR_EXTERNAL, CHAIN_APPROX_SIMPLE, Point(0, 0) );
vector<Rect> contours_rect;
for (int i=0; i<(int)contours.size(); i++)
{
contours_rect.push_back(boundingRect(contours[i]));
}
sort(contours_rect.begin(), contours_rect.end(), sort_rect_horiz);
// Do character recognition foreach contour
for (int i=0; i<(int)contours.size(); i++)
{
Mat tmp_mask;
words_mask[w](contours_rect.at(i)).copyTo(tmp_mask);
vector<int> out_class;
vector<double> out_conf;
classifier->eval(tmp_mask,out_class,out_conf);
if (!out_class.empty())
obs.push_back(out_class[0]);
observations.push_back(out_class);
confidences.push_back(out_conf);
}
//Start probabilities: these should be extracted from a dictionary, or just assumed to be equal for all characters
vector<double> start_p(vocabulary.size());
for (int i=0; i<(int)vocabulary.size(); i++)
start_p[i] = 1.0/vocabulary.size();
Mat V = Mat::zeros((int)observations.size(),(int)vocabulary.size(),CV_64FC1);
vector<string> path(vocabulary.size());
// Initialize base cases (t == 0)
for (int i=0; i<(int)vocabulary.size(); i++)
{
for (int j=0; j<(int)observations[0].size(); j++)
{
emission_p.at<double>(observations[0][j],obs[0]) = confidences[0][j];
}
V.at<double>(0,i) = start_p[i] * emission_p.at<double>(i,obs[0]);
path[i] = vocabulary.at(i);
}
// Run Viterbi for t > 0
for (int t=1; t<(int)obs.size(); t++)
{
//The emission matrix has to be rebuilt at every time step
emission_p = Mat::eye(62,62,CV_64FC1);
for (int e=0; e<(int)observations[t].size(); e++)
{
emission_p.at<double>(observations[t][e],obs[t]) = confidences[t][e];
}
vector<string> newpath(vocabulary.size());
for (int i=0; i<(int)vocabulary.size(); i++)
{
double max_prob = 0;
int best_idx = 0;
for (int j=0; j<(int)vocabulary.size(); j++)
{
double prob = V.at<double>(t-1,j) * transition_p.at<double>(j,i) * emission_p.at<double>(i,obs[t]);
if ( prob > max_prob)
{
max_prob = prob;
best_idx = j;
}
}
V.at<double>(t,i) = max_prob;
newpath[i] = path[best_idx] + vocabulary.at(i);
}
// Don't need to remember the old paths
path.swap(newpath);
}
double max_prob = 0;
int best_idx = 0;
for (int i=0; i<(int)vocabulary.size(); i++)
{
double prob = V.at<double>((int)obs.size()-1,i);
if ( prob > max_prob)
{
max_prob = prob;
best_idx = i;
}
}
//cout << path[best_idx] << endl;
out_sequence = out_sequence+" "+path[best_idx];
if (component_rects != NULL)
component_rects->push_back(words_rect[w]);
if (component_texts != NULL)
component_texts->push_back(path[best_idx]);
if (component_confidences != NULL)
component_confidences->push_back((float)max_prob);
}
return;
}
};
Ptr<OCRHMMDecoder> OCRHMMDecoder::create( Ptr<OCRHMMDecoder::ClassifierCallback> _classifier,
const string& _vocabulary,
InputArray transition_p,
InputArray emission_p,
decoder_mode _mode)
{
return makePtr<OCRHMMDecoderImpl>(_classifier, _vocabulary, transition_p, emission_p, _mode);
}
class CV_EXPORTS OCRHMMClassifierKNN : public OCRHMMDecoder::ClassifierCallback
{
public:
//constructor
OCRHMMClassifierKNN(const std::string& filename);
// Destructor
~OCRHMMClassifierKNN() {}
void eval( InputArray mask, vector<int>& out_class, vector<double>& out_confidence );
private:
Ptr<KNearest> knn;
};
OCRHMMClassifierKNN::OCRHMMClassifierKNN (const string& filename)
{
knn = KNearest::create();
if (ifstream(filename.c_str()))
{
Mat hus, labels;
cv::FileStorage storage(filename.c_str(), cv::FileStorage::READ);
storage["hus"] >> hus;
storage["labels"] >> labels;
storage.release();
knn->train(hus, ROW_SAMPLE, labels);
}
else
CV_Error(Error::StsBadArg, "Default classifier data file not found!");
}
void OCRHMMClassifierKNN::eval( InputArray _mask, vector<int>& out_class, vector<double>& out_confidence )
{
CV_Assert( _mask.getMat().type() == CV_8UC1 );
out_class.clear();
out_confidence.clear();
int image_height = 35;
int image_width = 35;
int num_features = 200;
Mat img = _mask.getMat();
Mat tmp;
img.copyTo(tmp);
vector<vector<Point> > contours;
vector<Vec4i> hierarchy;
/// Find contours
findContours( tmp, contours, hierarchy, RETR_EXTERNAL, CHAIN_APPROX_SIMPLE, Point(0, 0) );
if (contours.empty())
return;
int idx = 0;
if (contours.size() > 1)
{
// this is to make sure we have the mask with a single contour
// e.g "i" and "j" have two contours, but it may be also a part of a neighbour character
// we take the larger one and clean the outside in order to have a single contour
int max_area = 0;
for (int cc=0; cc<(int)contours.size(); cc++)
{
int area_c = boundingRect(contours[cc]).area();
if ( area_c > max_area)
{
idx = cc;
max_area = area_c;
}
}
// clean-up the outside of the contour
Mat tmp_c = Mat::zeros(tmp.rows, tmp.cols, CV_8UC1);
drawContours(tmp_c, contours, idx, Scalar(255), FILLED);
img = img & tmp_c;
}
Rect bbox = boundingRect(contours[idx]);
//Crop to the exact bounding rect of the contour and resize to a fixed-size 35 x 35 pixel matrix, while retaining the centroid of the region and its aspect ratio.
Mat mask = Mat::zeros(image_height,image_width,CV_8UC1);
img(bbox).copyTo(tmp);
if (tmp.cols>tmp.rows)
{
int height = image_width*tmp.rows/tmp.cols;
resize(tmp,tmp,Size(image_width,height));
tmp.copyTo(mask(Rect(0,(image_height-height)/2,image_width,height)));
}
else
{
int width = image_height*tmp.cols/tmp.rows;
resize(tmp,tmp,Size(width,image_height));
tmp.copyTo(mask(Rect((image_width-width)/2,0,width,image_height)));
}
//find contours again (now resized)
mask.copyTo(tmp);
findContours( tmp, contours, hierarchy, RETR_LIST, CHAIN_APPROX_SIMPLE, Point(0, 0) );
vector<Mat> maps;
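// Build 8 orientation bitmaps, one per 45-degree bin of the contour chain-code direction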
for (int i=0; i<8; i++)
{
Mat map = Mat::zeros(image_height,image_width,CV_8UC1);
maps.push_back(map);
}
for (int c=0; c<(int)contours.size(); c++)
{
for (int i=0; i<(int)contours[c].size(); i++)
{
//cout << contours[c][i] << " -- " << contours[c][(i+1)%contours[c].size()] << endl;
double dy = contours[c][i].y - contours[c][(i+1)%contours[c].size()].y;
double dx = contours[c][i].x - contours[c][(i+1)%contours[c].size()].x;
double angle = atan2 (dy,dx) * 180 / 3.14159265;
//cout << " angle = " << angle << endl;
int idx_a = 0;
if ((angle>=157.5)||(angle<=-157.5))
idx_a = 0;
else if ((angle>=-157.5)&&(angle<=-112.5))
idx_a = 1;
else if ((angle>=-112.5)&&(angle<=-67.5))
idx_a = 2;
else if ((angle>=-67.5)&&(angle<=-22.5))
idx_a = 3;
else if ((angle>=-22.5)&&(angle<=22.5))
idx_a = 4;
else if ((angle>=22.5)&&(angle<=67.5))
idx_a = 5;
else if ((angle>=67.5)&&(angle<=112.5))
idx_a = 6;
else if ((angle>=112.5)&&(angle<=157.5))
idx_a = 7;
line(maps[idx_a],contours[c][i],contours[c][(i+1)%contours[c].size()],Scalar(255));
}
}
//On each bitmap, regular 7x7 Gaussian masks are evenly placed
for (int i=0; i<(int)maps.size(); i++)
{
copyMakeBorder(maps[i],maps[i],7,7,7,7,BORDER_CONSTANT,Scalar(0));
GaussianBlur(maps[i], maps[i], Size(7,7), 2, 2);
normalize(maps[i],maps[i],0,255,NORM_MINMAX);
resize(maps[i],maps[i],Size(image_width,image_height));
}
//Generate features for each bitmap
Mat sample = Mat(1,num_features,CV_32FC1);
Mat patch;
for (int i=0; i<(int)maps.size(); i++)
{
for(int y=0; y<image_height; y=y+7)
{
for(int x=0; x<image_width; x=x+7)
{
maps[i](Rect(x,y,7,7)).copyTo(patch);
Scalar mean,std;
meanStdDev(patch,mean,std);
sample.at<float>(0,i*25+((int)x/7)+((int)y/7)*5) = (float)(mean[0]/255);
//cout << " avg " << mean[0] << " in patch " << x << "," << y << " channel " << i << " idx = " << i*25+((int)x/7)+((int)y/7)*5<< endl;
}
}
}
Mat responses,dists,predictions;
knn->findNearest( sample, 11, predictions, responses, dists);
Scalar dist_sum = sum(dists);
Mat class_predictions = Mat::zeros(1,62,CV_64FC1);
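// Visually equivalent classes (e.g. 'c'/'C', 'o'/'O'/'0', 'i'/'l'/'I') share their KNN votes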
vector<vector<int> > equivalency_mat(62);
equivalency_mat[2].push_back(28); // c -> C
equivalency_mat[28].push_back(2); // C -> c
equivalency_mat[8].push_back(34); // i -> I
equivalency_mat[8].push_back(11); // i -> l
equivalency_mat[11].push_back(8); // l -> i
equivalency_mat[11].push_back(34); // l -> I
equivalency_mat[34].push_back(8); // I -> i
equivalency_mat[34].push_back(11); // I -> l
equivalency_mat[9].push_back(35); // j -> J
equivalency_mat[35].push_back(9); // J -> j
equivalency_mat[14].push_back(40); // o -> O
equivalency_mat[14].push_back(52); // o -> 0
equivalency_mat[40].push_back(14); // O -> o
equivalency_mat[40].push_back(52); // O -> 0
equivalency_mat[52].push_back(14); // 0 -> o
equivalency_mat[52].push_back(40); // 0 -> O
equivalency_mat[15].push_back(41); // p -> P
equivalency_mat[41].push_back(15); // P -> p
equivalency_mat[18].push_back(44); // s -> S
equivalency_mat[44].push_back(18); // S -> s
equivalency_mat[20].push_back(46); // u -> U
equivalency_mat[46].push_back(20); // U -> u
equivalency_mat[21].push_back(47); // v -> V
equivalency_mat[47].push_back(21); // V -> v
equivalency_mat[22].push_back(48); // w -> W
equivalency_mat[48].push_back(22); // W -> w
equivalency_mat[23].push_back(49); // x -> X
equivalency_mat[49].push_back(23); // X -> x
equivalency_mat[25].push_back(51); // z -> Z
equivalency_mat[51].push_back(25); // Z -> z
for (int j=0; j<responses.cols; j++)
{
if (responses.at<float>(0,j)<0)
continue;
class_predictions.at<double>(0,(int)responses.at<float>(0,j)) += dists.at<float>(0,j);
for (int e=0; e<(int)equivalency_mat[(int)responses.at<float>(0,j)].size(); e++)
{
class_predictions.at<double>(0,equivalency_mat[(int)responses.at<float>(0,j)][e]) += dists.at<float>(0,j);
dist_sum[0] += dists.at<float>(0,j);
}
}
class_predictions = class_predictions/dist_sum[0];
out_class.push_back((int)predictions.at<float>(0,0));
out_confidence.push_back(class_predictions.at<double>(0,(int)predictions.at<float>(0,0)));
for (int i=0; i<class_predictions.cols; i++)
{
if ((class_predictions.at<double>(0,i) > 0) && (i != out_class[0]))
{
out_class.push_back(i);
out_confidence.push_back(class_predictions.at<double>(0,i));
}
}
}
Ptr<OCRHMMDecoder::ClassifierCallback> loadOCRHMMClassifierNM(const std::string& filename)
{
return makePtr<OCRHMMClassifierKNN>(filename);
}
}
}