    enabled convolution & activation fusion (#1245) · e551d15c
    Vadim Pisarevsky authored
    * enabled convolution & activation fusion
    
    * a few more optimizations:
    + optimized the common case where the indices of the max pooling layer are not used; in this case we take the more efficient branch that computes just the maximums over the aperture (see the pooling sketch below)
    + optimized the convolution + activation fusion when the activation is ReLU, which is another common case (see the ReLU sketch below)
    + convolution can now be fused with batch normalization. This fusion is zero-cost: the normalization is folded into the convolution weights ahead of time (see the folding sketch below). If the batch norm is followed by ReLU, all three layers (conv + batch norm + ReLU) are fused together. This change significantly improved ENet performance.
    
    * hopefully fixed warnings on Windows
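    
    For illustration, the index-free pooling branch boils down to tracking only a running maximum. A minimal sketch under assumed names (poolMaxNoIndices, src, ofs are illustrative, not the actual layers_common.hpp code):
    
        #include <algorithm>
        #include <cfloat>
    
        // Max pooling over one aperture when the argmax indices are not
        // consumed downstream: keep only the running maximum, with no
        // index bookkeeping or extra stores.
        static float poolMaxNoIndices(const float* src, const int* ofs, int n)
        {
            float best = -FLT_MAX;
            for (int k = 0; k < n; k++)
                best = std::max(best, src[ofs[k]]);
            return best;
        }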
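    
    The ReLU fusion amounts to clamping each output value while it is still in a register, instead of re-reading the whole output tensor in a separate activation pass. A hedged sketch (finalizeOutput is an illustrative name, not the actual API):
    
        #include <algorithm>
    
        // Applied to each accumulated convolution sum before it is stored;
        // with ReLU fused, the activation costs one max() per element and
        // the separate activation layer disappears from the pipeline.
        static inline float finalizeOutput(float sum, bool fuseRelu)
        {
            return fuseRelu ? std::max(sum, 0.f) : sum;
        }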
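    
    Batch-norm folding is zero-cost at inference time because y = gamma*(conv(x) - mean)/sqrt(var + eps) + beta can be absorbed into the convolution's weights and bias once, before any data flows. A minimal sketch with illustrative names (foldBatchNorm and the flat weight layout are assumptions, not the actual OpenCV code):
    
        #include <cmath>
        #include <vector>
    
        // Rescale each output channel's weights by gamma/sqrt(var + eps)
        // and adjust its bias accordingly; the batch-norm layer can then
        // be dropped from the graph entirely.
        static void foldBatchNorm(std::vector<float>& weights, // outCh*kElems
                                  std::vector<float>& bias,    // outCh
                                  const std::vector<float>& gamma,
                                  const std::vector<float>& beta,
                                  const std::vector<float>& mean,
                                  const std::vector<float>& var,
                                  int outCh, int kElems, float eps = 1e-5f)
        {
            for (int c = 0; c < outCh; c++)
            {
                float scale = gamma[c] / std::sqrt(var[c] + eps);
                for (int k = 0; k < kElems; k++)
                    weights[c*kElems + k] *= scale;
                bias[c] = (bias[c] - mean[c])*scale + beta[c];
            }
        }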