Yashas Samaga B L authored
cuda4dnn: fuse activations with convolutions

* fuse ReLU, ReLU6, TanH, Sigmoid with conv
* fix OpenCL errors
* improve ReLU, add power, swish and mish
* fix missing fusion entries
* fix handling of unsetAttached
* remove whole file indentation
* optimize power = 1.0, use IDENTITY instead of NONE
* handle edge case: change backend and then clear
17c485eb
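The fusion described in this commit amounts to applying the activation in the same kernel pass that writes the convolution output (typically the bias-add epilogue), so the intermediate tensor never makes a second round trip through global memory for a standalone activation layer. Below is a minimal CUDA sketch of that idea, assuming an NCHW layout; the names (`ActivationType`, `apply_activation`, `bias_activation_kernel`) and parameters are illustrative and are not the actual cuda4dnn API.

```cpp
// Hypothetical sketch of fusing an activation into a convolution epilogue.
// None of these names come from the cuda4dnn sources.
#include <cmath>

enum class ActivationType { IDENTITY, RELU, CLIPPED_RELU, TANH, SIGMOID, POWER, SWISH, MISH };

template <ActivationType act>
__device__ float apply_activation(float x, float slope, float floor_, float ceil_, float power)
{
    // `act` is a compile-time constant, so each kernel instantiation keeps exactly one branch.
    switch (act) {
    case ActivationType::RELU:         return x >= 0.0f ? x : slope * x;         // leaky when slope != 0
    case ActivationType::CLIPPED_RELU: return fminf(fmaxf(x, floor_), ceil_);    // ReLU6 with [0, 6]
    case ActivationType::TANH:         return tanhf(x);
    case ActivationType::SIGMOID:      return 1.0f / (1.0f + expf(-x));
    case ActivationType::POWER:        return powf(x, power);
    case ActivationType::SWISH:        return x / (1.0f + expf(-x));
    case ActivationType::MISH:         return x * tanhf(log1pf(expf(x)));        // naive softplus form
    default:                           return x;                                 // IDENTITY: plain bias add
    }
}

// Fused epilogue: adds the per-channel bias and applies the activation in the
// same pass that writes the convolution output.
template <ActivationType act>
__global__ void bias_activation_kernel(float* output, const float* bias,
                                       int n, int inner_size, int channels,
                                       float slope, float floor_, float ceil_, float power)
{
    for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n; i += gridDim.x * blockDim.x) {
        const int c = (i / inner_size) % channels;   // NCHW: inner_size = H * W
        output[i] = apply_activation<act>(output[i] + bias[c], slope, floor_, ceil_, power);
    }
}
```

Dispatching on the activation at compile time also makes the "optimize power = 1.0, use IDENTITY instead of NONE" item natural: when the exponent is 1 the caller can instantiate the IDENTITY variant, so the epilogue degenerates to a plain bias add rather than calling `powf` with a no-op exponent.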
Subdirectories: csl, cxx_utils, kernels, primitives