Commit e551d15c by Vadim Pisarevsky
enabled convolution & activation fusion (#1245)
    * enabled convolution & activation fusion
    
    * a few more optimizations:
    + optimized the common case when the indices of the max pooling layer are not used; in that case we take the more efficient branch that computes just the maximums over the aperture (see the first sketch below).
    + optimized convolution + activation fusion when the activation is ReLU, which is another common case (see the second sketch below).
    + convolution can now be fused with batch norm. This is a zero-cost fusion: the batch-norm parameters are folded into the convolution ahead of time (see the third sketch below). If the batch norm is followed by ReLU, all three layers (conv + batchnorm + ReLU) are fused together; this change significantly improved ENet performance.
    
    * hopefully fixed warnings on Windows
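The max-pooling fast path mentioned above can be illustrated as follows. This is a minimal standalone sketch, not the actual cv::dnn kernel; the helper names (poolMaxWithIndex, poolMax) are hypothetical. The point is that when no downstream layer consumes the argmax indices, the per-element index bookkeeping can be skipped entirely.

```cpp
// Hypothetical helpers illustrating the two max-pooling branches.
#include <algorithm>
#include <limits>

// Slow path: also records where the maximum came from
// (needed, for example, by unpooling layers that consume the indices).
float poolMaxWithIndex(const float* window, int len, int& argMax)
{
    float best = -std::numeric_limits<float>::infinity();
    argMax = 0;
    for (int i = 0; i < len; ++i)
        if (window[i] > best) { best = window[i]; argMax = i; }
    return best;
}

// Fast path: just the maximum over the pooling aperture.
float poolMax(const float* window, int len)
{
    float best = -std::numeric_limits<float>::infinity();
    for (int i = 0; i < len; ++i)
        best = std::max(best, window[i]);   // no index bookkeeping
    return best;
}
```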
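Fusing ReLU into the convolution avoids a second pass over the output buffer: the activation is applied while the accumulator is still in a register. The sketch below is an assumption-level illustration with hypothetical helper names, not the OpenCV implementation.

```cpp
// Hypothetical illustration of convolution + ReLU fusion.
#include <algorithm>
#include <cstddef>

// Unfused path: the convolution writes its output, then a separate ReLU
// pass sweeps over the same memory a second time.
void reluSeparate(float* data, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        data[i] = std::max(data[i], 0.0f);
}

// Fused path: the inner product for one output element, clamped on the spot.
inline float convOutputFusedRelu(const float* patch, const float* weights,
                                 int len, float bias)
{
    float acc = bias;
    for (int i = 0; i < len; ++i)
        acc += patch[i] * weights[i];
    return std::max(acc, 0.0f);   // ReLU applied before the store
}
```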
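The zero-cost conv + batch-norm fusion works by folding the batch-norm affine transform into the convolution weights and bias once, before inference, so the batch-norm layer can be dropped and no extra work is done per forward pass. A minimal sketch, assuming a per-output-channel weight layout and hypothetical struct names (ConvParams, BatchNormParams), not the actual cv::dnn data structures:

```cpp
// For each output channel c:
//   scale_c = gamma_c / sqrt(var_c + eps)
//   W'_c    = W_c * scale_c
//   b'_c    = (b_c - mean_c) * scale_c + beta_c
#include <cmath>
#include <cstddef>
#include <vector>

struct ConvParams {
    int outChannels;                 // number of output channels
    int weightsPerChannel;           // kH * kW * inChannels
    std::vector<float> weights;      // outChannels * weightsPerChannel
    std::vector<float> bias;         // outChannels
};

struct BatchNormParams {
    std::vector<float> mean, var, gamma, beta; // per output channel
    float eps;
};

// Fold the batch-norm affine transform into the convolution weights/bias.
void fuseBatchNorm(ConvParams& conv, const BatchNormParams& bn)
{
    for (int c = 0; c < conv.outChannels; ++c)
    {
        float scale = bn.gamma[c] / std::sqrt(bn.var[c] + bn.eps);
        float* w = &conv.weights[(std::size_t)c * conv.weightsPerChannel];
        for (int i = 0; i < conv.weightsPerChannel; ++i)
            w[i] *= scale;
        conv.bias[c] = (conv.bias[c] - bn.mean[c]) * scale + bn.beta[c];
    }
}
```

If a ReLU follows the batch norm, it can then be fused into the already-folded convolution exactly as in the previous sketch, giving the conv + batchnorm + ReLU chain described in the commit message.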
Directory listing:
cityscapes_semsegm_test_enet.py
cnpy.cpp
cnpy.h
imagenet_cls_test_alexnet.py
imagenet_cls_test_googlenet.py
imagenet_cls_test_inception.py
npy_blob.hpp
pascal_semsegm_test_fcn.py
test_caffe_importer.cpp
test_common.hpp
test_googlenet.cpp
test_halide_layers.cpp
test_halide_nets.cpp
test_layers.cpp
test_main.cpp
test_precomp.hpp
test_tf_importer.cpp
test_torch_importer.cpp