Commit 9b7dfd67 authored by Andrey Kamaev

Merge branch '2.4'

parents 9b8c8718 980fc93b
......@@ -5,8 +5,6 @@
Introduction to Java Development
********************************
Last updated: 28 February, 2013.
As of OpenCV 2.4.4, OpenCV supports desktop Java development using nearly the same interface as for
Android development. This guide will help you to create your first Java (or Scala) application using OpenCV.
We will use either `Eclipse <http://eclipse.org/>`_, `Apache Ant <http://ant.apache.org/>`_ or the
......@@ -15,7 +13,7 @@ We will use either `Eclipse <http://eclipse.org/>`_, `Apache Ant <http://ant.apa
For further reading after this guide, look at the :ref:`Android_Dev_Intro` tutorials.
What we'll do in this guide
***************************
===========================
In this guide, we will:
......@@ -29,12 +27,12 @@ The same process was used to create the samples in the :file:`samples/java` fold
so consult those files if you get lost.
Get proper OpenCV
*****************
=================
Starting from version 2.4.4, OpenCV includes desktop Java bindings.
Download
########
--------
The simplest way to get it is to download the appropriate package of **version 2.4.4 or higher** from the
`OpenCV SourceForge repository <http://sourceforge.net/projects/opencvlibrary/files/>`_.
......@@ -50,11 +48,11 @@ In order to build OpenCV with Java bindings you need :abbr:`JDK (Java Developmen
`Apache Ant <http://ant.apache.org/>`_ and `Python` v2.6 or higher to be installed.
Build
#####
-----
Let's build OpenCV:
.. code-block:: bash
.. code-block:: bash
git clone git://github.com/Itseez/opencv.git
cd opencv
......@@ -65,13 +63,13 @@ Let's build OpenCV:
Generate a Makefile, an MS Visual Studio* solution, or whatever you use for
building executables on your system:
.. code-block:: bash
.. code-block:: bash
cmake -DBUILD_SHARED_LIBS=OFF ..
or
.. code-block:: bat
.. code-block:: bat
cmake -DBUILD_SHARED_LIBS=OFF -G "Visual Studio 10" ..
......@@ -83,7 +81,7 @@ Examine the output of CMake and ensure ``java`` is one of the modules "To be bui
If not, it's likely you're missing a dependency. You should troubleshoot by looking
through the CMake output for any Java-related tools that aren't found and installing them.
.. image:: images/cmake_output.png
.. image:: images/cmake_output.png
:alt: CMake output
:align: center
......@@ -99,23 +97,23 @@ through the CMake output for any Java-related tools that aren't found and instal
Now start the build:
.. code-block:: bash
.. code-block:: bash
make -j8
or
.. code-block:: bat
.. code-block:: bat
msbuild /m OpenCV.sln /t:Build /p:Configuration=Release /v:m
In addition, this will create a ``jar`` containing the Java interface (:file:`bin/opencv-244.jar`)
and a native dynamic library containing the Java bindings and all the OpenCV functionality
(:file:`bin/Release/opencv_java244.dll` or :file:`lib/libopencv_java244.so` respectively).
(:file:`lib/libopencv_java244.so` or :file:`bin/Release/opencv_java244.dll` respectively).
We'll use these files later.
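For a quick, illustrative sketch of how these artifacts are used (the names below assume the 2.4.4 build described above), a desktop Java application puts the ``jar`` on its classpath and loads the native library once before calling any OpenCV function:

.. code-block:: java

    public class LoadOpenCV {
        static {
            // Loads libopencv_java244.so / opencv_java244.dll from java.library.path;
            // adjust the suffix to match your OpenCV version.
            System.loadLibrary("opencv_java244");
        }

        public static void main(String[] args) {
            System.out.println("OpenCV native library loaded");
        }
    }

Run it with ``java -Djava.library.path=<dir containing the native library> -cp .:opencv-244.jar LoadOpenCV`` (use ``;`` as the classpath separator on Windows).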
Java sample with Ant
********************
====================
.. note::
The described sample is provided with OpenCV library in the :file:`opencv/samples/java/ant` folder.
......@@ -236,7 +234,7 @@ Java sample with Ant
:align: center
Java project in Eclipse
***********************
=======================
Now let's look at the possibility of using OpenCV in Java when developing in the Eclipse IDE.
......@@ -256,49 +254,48 @@ Now let's look at the possiblity of using OpenCV in Java when developing in Ecli
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib2.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib3.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib4.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib5.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib6.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib7.png
:alt: Eclipse: external JAR
:align: center
` `
|
.. image:: images/eclipse_user_lib8.png
:alt: Eclipse: external JAR
:align: center
` `
* Add a new Java class (say ``Main``) containing the application entry:
......@@ -307,6 +304,7 @@ Now let's look at the possiblity of using OpenCV in Java when developing in Ecli
:align: center
* Put some simple OpenCV calls there, e.g.:
.. code-block:: java
import org.opencv.core.Core;
......@@ -328,7 +326,7 @@ Now let's look at the possiblity of using OpenCV in Java when developing in Ecli
:align: center
SBT project for Java and Scala
******************************
==============================
Now we'll create a simple Java application using SBT. This serves as a brief introduction for
those unfamiliar with this build tool. We're using SBT because it is particularly easy and powerful.
......@@ -338,14 +336,14 @@ First, download and install `SBT <http://www.scala-sbt.org/>`_ using the instruc
Next, navigate to a new directory where you'd like the application source to live (outside the :file:`opencv` directory).
Let's call it "JavaSample" and create a directory for it:
.. code-block:: bash
.. code-block:: bash
cd <somewhere outside opencv>
mkdir JavaSample
Now we will create the necessary folders and an SBT project:
.. code-block:: bash
.. code-block:: bash
cd JavaSample
mkdir -p src/main/java # This is where SBT expects to find Java sources
......@@ -354,7 +352,7 @@ Now we will create the necessary folders and an SBT project:
Now open :file:`project/build.scala` in your favorite editor and paste the following.
It defines your project:
.. code-block:: scala
.. code-block:: scala
import sbt._
import Keys._
......@@ -382,20 +380,20 @@ It defines your project:
Now edit :file:`project/plugins.sbt` and paste the following.
This will enable auto-generation of an Eclipse project:
.. code-block:: scala
.. code-block:: scala
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.0")
Now run ``sbt`` from the :file:`JavaSample` root and from within SBT run ``eclipse`` to generate an Eclipse project:
.. code-block:: bash
.. code-block:: bash
sbt # Starts the sbt console
> eclipse # Running "eclipse" from within the sbt console
You should see something like this:
.. image:: images/sbt_eclipse.png
.. image:: images/sbt_eclipse.png
:alt: SBT output
:align: center
......@@ -406,7 +404,7 @@ we'll be using SBT to build the project, so if you choose to use Eclipse it will
To test that everything is working, create a simple "Hello OpenCV" application.
Do this by creating a file :file:`src/main/java/HelloOpenCV.java` with the following contents:
.. code-block:: java
.. code-block:: java
public class HelloOpenCV {
public static void main(String[] args) {
......@@ -416,18 +414,18 @@ Do this by creating a file :file:`src/main/java/HelloOpenCV.java` with the follo
Now execute ``run`` from the sbt console, or more concisely, run ``sbt run`` from the command line:
.. code-block:: bash
.. code-block:: bash
sbt run
You should see something like this:
.. image:: images/sbt_run.png
.. image:: images/sbt_run.png
:alt: SBT run
:align: center
Running SBT samples
###################
-------------------
Now we'll create a simple face detection application using OpenCV.
......@@ -435,7 +433,7 @@ First, create a :file:`lib/` folder and copy the OpenCV jar into it.
By default, SBT adds jars in the lib folder to the Java library search path.
You can optionally rerun ``sbt eclipse`` to update your Eclipse project.
.. code-block:: bash
.. code-block:: bash
mkdir lib
cp <opencv_dir>/build/bin/opencv_<version>.jar lib/
......@@ -443,7 +441,7 @@ You can optionally rerun ``sbt eclipse`` to update your Eclipse project.
Next, create the directory :file:`src/main/resources` and download this Lena image into it:
.. image:: images/lena.png
.. image:: images/lena.png
:alt: Lena
:align: center
......@@ -453,7 +451,7 @@ Items in the resources directory are available to the Java application at runtim
Next, copy :file:`lbpcascade_frontalface.xml` from :file:`opencv/data/lbpcascades/` into the :file:`resources`
directory:
.. code-block:: bash
.. code-block:: bash
cp <opencv_dir>/data/lbpcascades/lbpcascade_frontalface.xml src/main/resources/
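For illustration only (this is not the sample's actual code), loading that cascade from the classpath can look like the sketch below; the class name is arbitrary, and the OpenCV ``jar`` and native library are assumed to be set up as described above:

.. code-block:: java

    import org.opencv.core.Core;
    import org.opencv.objdetect.CascadeClassifier;

    public class CascadeFromResources {
        static {
            // NATIVE_LIBRARY_NAME resolves to e.g. "opencv_java244"; pass the literal
            // library name instead if your OpenCV version lacks this constant.
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        }

        public static void main(String[] args) {
            // Files under src/main/resources end up on the runtime classpath,
            // so they can be resolved with getResource().
            String cascadePath = CascadeFromResources.class
                    .getResource("/lbpcascade_frontalface.xml").getPath();
            CascadeClassifier faceDetector = new CascadeClassifier(cascadePath);
            System.out.println("Cascade loaded, empty = " + faceDetector.empty());
        }
    }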
......@@ -519,19 +517,19 @@ You will also get errors if you try to load OpenCV when it has already been load
Now run the face detection app using ``sbt run``:
.. code-block:: bash
.. code-block:: bash
sbt run
You should see something like this:
.. image:: images/sbt_run_face.png
.. image:: images/sbt_run_face.png
:alt: SBT run
:align: center
It should also write the following image to :file:`faceDetection.png`:
.. image:: images/faceDetection.png
.. image:: images/faceDetection.png
:alt: Detected face
:align: center
......
......@@ -1044,18 +1044,32 @@ enum
COLOR_RGBA2mRGBA = 125,
COLOR_mRGBA2RGBA = 126,
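// RGB/BGR(A) to planar YUV 4:2:0 (I420/IYUV and YV12)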
COLOR_RGB2YUV_I420 = 127,
COLOR_BGR2YUV_I420 = 128,
COLOR_RGB2YUV_IYUV = COLOR_RGB2YUV_I420,
COLOR_BGR2YUV_IYUV = COLOR_BGR2YUV_I420,
COLOR_RGBA2YUV_I420 = 129,
COLOR_BGRA2YUV_I420 = 130,
COLOR_RGBA2YUV_IYUV = COLOR_RGBA2YUV_I420,
COLOR_BGRA2YUV_IYUV = COLOR_BGRA2YUV_I420,
COLOR_RGB2YUV_YV12 = 131,
COLOR_BGR2YUV_YV12 = 132,
COLOR_RGBA2YUV_YV12 = 133,
COLOR_BGRA2YUV_YV12 = 134,
// Edge-Aware Demosaicing
COLOR_BayerBG2BGR_EA = 127,
COLOR_BayerGB2BGR_EA = 128,
COLOR_BayerRG2BGR_EA = 129,
COLOR_BayerGR2BGR_EA = 130,
COLOR_BayerBG2BGR_EA = 135,
COLOR_BayerGB2BGR_EA = 136,
COLOR_BayerRG2BGR_EA = 137,
COLOR_BayerGR2BGR_EA = 138,
COLOR_BayerBG2RGB_EA = COLOR_BayerRG2BGR_EA,
COLOR_BayerGB2RGB_EA = COLOR_BayerGR2BGR_EA,
COLOR_BayerRG2RGB_EA = COLOR_BayerBG2BGR_EA,
COLOR_BayerGR2RGB_EA = COLOR_BayerGB2BGR_EA,
COLOR_COLORCVT_MAX = 131
COLOR_COLORCVT_MAX = 139
};
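For illustration, these new planar conversions can also be driven from the Java bindings, assuming the generated constant names mirror the C++ ones (e.g. ``Imgproc.COLOR_RGB2YUV_I420``); a minimal sketch:

.. code-block:: java

    import org.opencv.core.Core;
    import org.opencv.core.CvType;
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;

    public class Rgb2I420 {
        static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

        public static void main(String[] args) {
            Mat rgb = new Mat(480, 640, CvType.CV_8UC3);   // even width and height are required
            Mat yuv = new Mat();
            Imgproc.cvtColor(rgb, yuv, Imgproc.COLOR_RGB2YUV_I420);
            // Planar 4:2:0 output: single channel, 1.5x the source height (prints 720x640 ch=1).
            System.out.println(yuv.rows() + "x" + yuv.cols() + " ch=" + yuv.channels());
        }
    }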
......
......@@ -310,18 +310,32 @@ enum
CV_RGBA2mRGBA = 125,
CV_mRGBA2RGBA = 126,
CV_RGB2YUV_I420 = 127,
CV_BGR2YUV_I420 = 128,
CV_RGB2YUV_IYUV = CV_RGB2YUV_I420,
CV_BGR2YUV_IYUV = CV_BGR2YUV_I420,
CV_RGBA2YUV_I420 = 129,
CV_BGRA2YUV_I420 = 130,
CV_RGBA2YUV_IYUV = CV_RGBA2YUV_I420,
CV_BGRA2YUV_IYUV = CV_BGRA2YUV_I420,
CV_RGB2YUV_YV12 = 131,
CV_BGR2YUV_YV12 = 132,
CV_RGBA2YUV_YV12 = 133,
CV_BGRA2YUV_YV12 = 134,
// Edge-Aware Demosaicing
CV_BayerBG2BGR_EA = 127,
CV_BayerGB2BGR_EA = 128,
CV_BayerRG2BGR_EA = 129,
CV_BayerGR2BGR_EA = 130,
CV_BayerBG2BGR_EA = 135,
CV_BayerGB2BGR_EA = 136,
CV_BayerRG2BGR_EA = 137,
CV_BayerGR2BGR_EA = 138,
CV_BayerBG2RGB_EA = CV_BayerRG2BGR_EA,
CV_BayerGB2RGB_EA = CV_BayerGR2BGR_EA,
CV_BayerRG2RGB_EA = CV_BayerBG2BGR_EA,
CV_BayerGR2RGB_EA = CV_BayerGB2BGR_EA,
CV_COLORCVT_MAX = 131
CV_COLORCVT_MAX = 139
};
......
......@@ -115,6 +115,9 @@ CV_ENUM(CvtMode2, CV_YUV2BGR_NV12, CV_YUV2BGRA_NV12, CV_YUV2RGB_NV12, CV_YUV2RGB
COLOR_YUV2GRAY_420, CV_YUV2RGB_UYVY, CV_YUV2BGR_UYVY, CV_YUV2RGBA_UYVY, CV_YUV2BGRA_UYVY, CV_YUV2RGB_YUY2, CV_YUV2BGR_YUY2, CV_YUV2RGB_YVYU,
CV_YUV2BGR_YVYU, CV_YUV2RGBA_YUY2, CV_YUV2BGRA_YUY2, CV_YUV2RGBA_YVYU, CV_YUV2BGRA_YVYU)
CV_ENUM(CvtMode3, CV_RGB2YUV_IYUV, CV_BGR2YUV_IYUV, CV_RGBA2YUV_IYUV, CV_BGRA2YUV_IYUV,
CV_RGB2YUV_YV12, CV_BGR2YUV_YV12, CV_RGBA2YUV_YV12, CV_BGRA2YUV_YV12)
struct ChPair
{
ChPair(int _scn, int _dcn): scn(_scn), dcn(_dcn) {}
......@@ -162,6 +165,8 @@ ChPair getConversionInfo(int cvtMode)
case CV_BGR5652BGRA: case CV_BGR5652RGBA:
return ChPair(2,4);
case CV_BGR2GRAY: case CV_RGB2GRAY:
case CV_RGB2YUV_IYUV: case CV_RGB2YUV_YV12:
case CV_BGR2YUV_IYUV: case CV_BGR2YUV_YV12:
return ChPair(3,1);
case CV_BGR2BGR555: case CV_BGR2BGR565:
case CV_RGB2BGR555: case CV_RGB2BGR565:
......@@ -204,6 +209,8 @@ ChPair getConversionInfo(int cvtMode)
case CX_YUV2BGRA: case CX_YUV2RGBA:
return ChPair(3,4);
case CV_BGRA2GRAY: case CV_RGBA2GRAY:
case CV_RGBA2YUV_IYUV: case CV_RGBA2YUV_YV12:
case CV_BGRA2YUV_IYUV: case CV_BGRA2YUV_YV12:
return ChPair(4,1);
case CV_BGRA2BGR555: case CV_BGRA2BGR565:
case CV_RGBA2BGR555: case CV_RGBA2BGR565:
......@@ -307,6 +314,31 @@ PERF_TEST_P(Size_CvtMode2, cvtColorYUV420,
SANITY_CHECK(dst, 1);
}
typedef std::tr1::tuple<Size, CvtMode3> Size_CvtMode3_t;
typedef perf::TestBaseWithParam<Size_CvtMode3_t> Size_CvtMode3;
PERF_TEST_P(Size_CvtMode3, cvtColorRGB2YUV420p,
testing::Combine(
testing::Values(szVGA, sz720p, sz1080p, Size(130, 60)),
testing::ValuesIn(CvtMode3::all())
)
)
{
Size sz = get<0>(GetParam());
int mode = get<1>(GetParam());
ChPair ch = getConversionInfo(mode);
Mat src(sz, CV_8UC(ch.scn));
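// Planar 4:2:0 destination: single channel, 1.5x the source height (Y plane plus the U and V planes).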
Mat dst(sz.height + sz.height / 2, sz.width, CV_8UC(ch.dcn));
declare.time(100);
declare.in(src, WARMUP_RNG).out(dst);
TEST_CYCLE() cvtColor(src, dst, mode, ch.dcn);
SANITY_CHECK(dst, 1);
}
CV_ENUM(EdgeAwareBayerMode, COLOR_BayerBG2BGR_EA, COLOR_BayerGB2BGR_EA, COLOR_BayerRG2BGR_EA, COLOR_BayerGR2BGR_EA)
typedef std::tr1::tuple<Size, EdgeAwareBayerMode> EdgeAwareParams;
......
......@@ -1802,6 +1802,16 @@ const int ITUR_BT_601_CVG = -852492;
const int ITUR_BT_601_CVR = 1673527;
const int ITUR_BT_601_SHIFT = 20;
// Coefficients for RGB to YUV420p conversion
const int ITUR_BT_601_CRY = 269484;
const int ITUR_BT_601_CGY = 528482;
const int ITUR_BT_601_CBY = 102760;
const int ITUR_BT_601_CRU = -155188;
const int ITUR_BT_601_CGU = -305135;
const int ITUR_BT_601_CBU = 460324;
const int ITUR_BT_601_CGV = -385875;
const int ITUR_BT_601_CBV = -74448;
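// With ITUR_BT_601_SHIFT == 20 these fixed-point constants correspond (approximately)
// to the usual BT.601 "studio swing" matrix:
//   Y =  0.257*R + 0.504*G + 0.098*B +  16
//   U = -0.148*R - 0.291*G + 0.439*B + 128
//   V =  0.439*R - 0.368*G - 0.071*B + 128
// There is no separate CRV constant: V's R coefficient equals U's B coefficient
// (~0.439), so ITUR_BT_601_CBU is reused for it in the conversion code below.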
template<int bIdx, int uIdx>
struct YUV420sp2RGB888Invoker
{
......@@ -2134,6 +2144,84 @@ inline void cvtYUV420p2RGBA(Mat& _dst, int _stride, const uchar* _y1, const ucha
converter(BlockedRange(0, _dst.rows/2));
}
///////////////////////////////////// RGB -> YUV420p /////////////////////////////////////
template<int bIdx>
struct RGB888toYUV420pInvoker: public ParallelLoopBody
{
RGB888toYUV420pInvoker( const Mat& src, Mat* dst, const int uIdx )
: src_(src),
dst_(dst),
uIdx_(uIdx) { }
void operator()(const Range& rowRange) const
{
const int w = src_.cols;
const int h = src_.rows;
const int cn = src_.channels();
for( int i = rowRange.start; i < rowRange.end; i++ )
{
const uchar* row0 = src_.ptr<uchar>(2 * i);
const uchar* row1 = src_.ptr<uchar>(2 * i + 1);
uchar* y = dst_->ptr<uchar>(2*i);
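// Planar 4:2:0 layout: dst holds the full-resolution Y plane (h rows) followed by the
// half-resolution U and V planes. Each dst row packs two w/2-sample chroma rows, hence
// the i/2 row index and the (i % 2) * (w/2) column offset below; for YV12 (uIdx_ == 2)
// the U and V pointers are swapped.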
uchar* u = dst_->ptr<uchar>(h + i/2) + (i % 2) * (w/2);
uchar* v = dst_->ptr<uchar>(h + (i + h/2)/2) + ((i + h/2) % 2) * (w/2);
if( uIdx_ == 2 ) std::swap(u, v);
for( int j = 0, k = 0; j < w * cn; j += 2 * cn, k++ )
{
int r00 = row0[2-bIdx + j]; int g00 = row0[1 + j]; int b00 = row0[bIdx + j];
int r01 = row0[2-bIdx + cn + j]; int g01 = row0[1 + cn + j]; int b01 = row0[bIdx + cn + j];
int r10 = row1[2-bIdx + j]; int g10 = row1[1 + j]; int b10 = row1[bIdx + j];
int r11 = row1[2-bIdx + cn + j]; int g11 = row1[1 + cn + j]; int b11 = row1[bIdx + cn + j];
const int shifted16 = (16 << ITUR_BT_601_SHIFT);
const int halfShift = (1 << (ITUR_BT_601_SHIFT - 1));
int y00 = ITUR_BT_601_CRY * r00 + ITUR_BT_601_CGY * g00 + ITUR_BT_601_CBY * b00 + halfShift + shifted16;
int y01 = ITUR_BT_601_CRY * r01 + ITUR_BT_601_CGY * g01 + ITUR_BT_601_CBY * b01 + halfShift + shifted16;
int y10 = ITUR_BT_601_CRY * r10 + ITUR_BT_601_CGY * g10 + ITUR_BT_601_CBY * b10 + halfShift + shifted16;
int y11 = ITUR_BT_601_CRY * r11 + ITUR_BT_601_CGY * g11 + ITUR_BT_601_CBY * b11 + halfShift + shifted16;
y[2*k + 0] = saturate_cast<uchar>(y00 >> ITUR_BT_601_SHIFT);
y[2*k + 1] = saturate_cast<uchar>(y01 >> ITUR_BT_601_SHIFT);
y[2*k + dst_->step + 0] = saturate_cast<uchar>(y10 >> ITUR_BT_601_SHIFT);
y[2*k + dst_->step + 1] = saturate_cast<uchar>(y11 >> ITUR_BT_601_SHIFT);
const int shifted128 = (128 << ITUR_BT_601_SHIFT);
int u00 = ITUR_BT_601_CRU * r00 + ITUR_BT_601_CGU * g00 + ITUR_BT_601_CBU * b00 + halfShift + shifted128;
int v00 = ITUR_BT_601_CBU * r00 + ITUR_BT_601_CGV * g00 + ITUR_BT_601_CBV * b00 + halfShift + shifted128;
u[k] = saturate_cast<uchar>(u00 >> ITUR_BT_601_SHIFT);
v[k] = saturate_cast<uchar>(v00 >> ITUR_BT_601_SHIFT);
}
}
}
static bool isFit( const Mat& src )
{
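// Heuristic: use the parallel code path only for images of at least QVGA size;
// smaller images are converted on the calling thread (see cvtRGBtoYUV420p below).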
return (src.total() >= 320*240);
}
private:
RGB888toYUV420pInvoker& operator=(const RGB888toYUV420pInvoker&);
const Mat& src_;
Mat* const dst_;
const int uIdx_;
};
template<int bIdx, int uIdx>
static void cvtRGBtoYUV420p(const Mat& src, Mat& dst)
{
RGB888toYUV420pInvoker<bIdx> colorConverter(src, &dst, uIdx);
if( RGB888toYUV420pInvoker<bIdx>::isFit(src) )
parallel_for_(Range(0, src.rows/2), colorConverter);
else
colorConverter(Range(0, src.rows/2));
}
///////////////////////////////////// YUV422 -> RGB /////////////////////////////////////
template<int bIdx, int uIdx, int yIdx>
......@@ -2736,6 +2824,31 @@ void cv::cvtColor( InputArray _src, OutputArray _dst, int code, int dcn )
src(Range(0, dstSz.height), Range::all()).copyTo(dst);
}
break;
case CV_RGB2YUV_YV12: case CV_BGR2YUV_YV12: case CV_RGBA2YUV_YV12: case CV_BGRA2YUV_YV12:
case CV_RGB2YUV_IYUV: case CV_BGR2YUV_IYUV: case CV_RGBA2YUV_IYUV: case CV_BGRA2YUV_IYUV:
{
if (dcn <= 0) dcn = 1;
const int bIdx = (code == CV_BGR2YUV_IYUV || code == CV_BGRA2YUV_IYUV || code == CV_BGR2YUV_YV12 || code == CV_BGRA2YUV_YV12) ? 0 : 2;
const int uIdx = (code == CV_BGR2YUV_IYUV || code == CV_BGRA2YUV_IYUV || code == CV_RGB2YUV_IYUV || code == CV_RGBA2YUV_IYUV) ? 1 : 2;
CV_Assert( (scn == 3 || scn == 4) && depth == CV_8U );
CV_Assert( dcn == 1 );
CV_Assert( sz.width % 2 == 0 && sz.height % 2 == 0 );
Size dstSz(sz.width, sz.height / 2 * 3);
_dst.create(dstSz, CV_MAKETYPE(depth, dcn));
dst = _dst.getMat();
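// Dispatch on (bIdx, uIdx): bIdx is the index of the blue channel in the source
// (0 for BGR*, 2 for RGB*); uIdx selects the chroma plane order
// (1 = I420/IYUV, U plane first; 2 = YV12, V plane first).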
switch(bIdx + uIdx*10)
{
case 10: cvtRGBtoYUV420p<0, 1>(src, dst); break;
case 12: cvtRGBtoYUV420p<2, 1>(src, dst); break;
case 20: cvtRGBtoYUV420p<0, 2>(src, dst); break;
case 22: cvtRGBtoYUV420p<2, 2>(src, dst); break;
default: CV_Error( CV_StsBadFlag, "Unknown/unsupported color conversion code" ); break;
};
}
break;
case CV_YUV2RGB_UYVY: case CV_YUV2BGR_UYVY: case CV_YUV2RGBA_UYVY: case CV_YUV2BGRA_UYVY:
case CV_YUV2RGB_YUY2: case CV_YUV2BGR_YUY2: case CV_YUV2RGB_YVYU: case CV_YUV2BGR_YVYU:
case CV_YUV2RGBA_YUY2: case CV_YUV2BGRA_YUY2: case CV_YUV2RGBA_YVYU: case CV_YUV2BGRA_YVYU:
......
......@@ -301,16 +301,14 @@ endif()
# Additional target properties
set_target_properties(${the_module} PROPERTIES
OUTPUT_NAME "${the_module}${LIB_NAME_SUFIX}"
#DEBUG_POSTFIX "${OPENCV_DEBUG_POSTFIX}"
ARCHIVE_OUTPUT_DIRECTORY ${LIBRARY_OUTPUT_PATH}
LIBRARY_OUTPUT_DIRECTORY ${LIBRARY_OUTPUT_PATH}
RUNTIME_OUTPUT_DIRECTORY ${EXECUTABLE_OUTPUT_PATH}
INSTALL_NAME_DIR ${OPENCV_LIB_INSTALL_PATH}
LINK_INTERFACE_LIBRARIES ""
)
if(ANDROID)
set_target_properties(${the_module} PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${LIBRARY_OUTPUT_PATH})
else()
if(WIN32)
set_target_properties(${the_module} PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${EXECUTABLE_OUTPUT_PATH})
endif()
......
......@@ -514,7 +514,7 @@ static bool pyopencv_to(PyObject* obj, double& value, const char* name = "<unkno
(void)name;
if(!obj || obj == Py_None)
return true;
if(PyInt_CheckExact(obj))
if(!!PyInt_CheckExact(obj))
value = (double)PyInt_AS_LONG(obj);
else
value = PyFloat_AsDouble(obj);
......@@ -531,7 +531,7 @@ static bool pyopencv_to(PyObject* obj, float& value, const char* name = "<unknow
(void)name;
if(!obj || obj == Py_None)
return true;
if(PyInt_CheckExact(obj))
if(!!PyInt_CheckExact(obj))
value = (float)PyInt_AS_LONG(obj);
else
value = (float)PyFloat_AsDouble(obj);
......@@ -627,7 +627,7 @@ static inline bool pyopencv_to(PyObject* obj, Point& p, const char* name = "<unk
(void)name;
if(!obj || obj == Py_None)
return true;
if(PyComplex_CheckExact(obj))
if(!!PyComplex_CheckExact(obj))
{
Py_complex c = PyComplex_AsCComplex(obj);
p.x = saturate_cast<int>(c.real);
......@@ -642,7 +642,7 @@ static inline bool pyopencv_to(PyObject* obj, Point2f& p, const char* name = "<u
(void)name;
if(!obj || obj == Py_None)
return true;
if(PyComplex_CheckExact(obj))
if(!!PyComplex_CheckExact(obj))
{
Py_complex c = PyComplex_AsCComplex(obj);
p.x = saturate_cast<float>(c.real);
......@@ -993,7 +993,7 @@ static bool pyopencv_to(PyObject *o, cv::flann::IndexParams& p, const char *name
const char* value = PyString_AsString(item);
p.setString(k, value);
}
else if( PyBool_Check(item) )
else if( !!PyBool_Check(item) )
p.setBool(k, item == Py_True);
else if( PyInt_Check(item) )
{
......
......@@ -1158,7 +1158,7 @@ static PyObject* cvseq_map_getitem(PyObject *o, PyObject *item)
if (i < 0)
i += (int)cvseq_seq_length(o);
return cvseq_seq_getitem(o, i);
} else if (PySlice_Check(item)) {
} else if (!!PySlice_Check(item)) {
Py_ssize_t start, stop, step, slicelength, cur, i;
PyObject* result;
......@@ -1975,7 +1975,7 @@ struct dims
static int convert_to_dim(PyObject *item, int i, dims *dst, CvArr *cva, const char *name = "no_name")
{
if (PySlice_Check(item)) {
if (!!PySlice_Check(item)) {
Py_ssize_t start, stop, step, slicelength;
PySlice_GetIndicesEx((PySliceObject*)item, cvGetDimSize(cva, i), &start, &stop, &step, &slicelength);
dst->i[i] = (int)start;
......