    Pruthvi/fix rnn precision (#1874) · 73da681a
    Pruthvi authored
    * - Added reorder support for rnn weights_layer/iter
    
    * i) fixed compilation issues ii) working but still observing precision error
    
    * i) fixed failing rnn unit test for DEX ii) refactored workspace in RNN mkldnn emitter
    
    * i) added support for src reorder to TNC from NTC
    
    * reorder support for rnn output from NTC to TNC
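The two reorder commits above swap the batch and time axes of the activation buffers, since the MKL-DNN RNN primitive expects TNC (time, batch, channels) while the graph may produce NTC. A minimal sketch of that index permutation (an illustrative helper, not the actual MKL-DNN reorder primitive):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Copy a flat NTC buffer [batch N, time T, channels C] into TNC order
// [time T, batch N, channels C]. Toy illustration of the reorder the
// commits above delegate to an MKL-DNN reorder primitive.
std::vector<float> ntc_to_tnc(const std::vector<float>& src,
                              std::size_t N, std::size_t T, std::size_t C)
{
    std::vector<float> dst(src.size());
    for (std::size_t n = 0; n < N; ++n)
        for (std::size_t t = 0; t < T; ++t)
            for (std::size_t c = 0; c < C; ++c)
                // dst is time-major, src is batch-major.
                dst[(t * N + n) * C + c] = src[(n * T + t) * C + c];
    return dst;
}
```

The inverse (TNC back to NTC, used for the RNN output) is the same loop with source and destination indices swapped.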
    
    * - added support for rnn weight reorder ldgoi -> ldigo
    - code refactor for lstm/rnn kernel in mkldnn emitter
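The weight reorder above moves from ldgoi (layers, directions, gates, output channels, input channels) to ldigo (layers, directions, input channels, gates, output channels), the layout the MKL-DNN RNN primitive consumes. A hedged sketch of that permutation (illustrative helper name; the real code uses an MKL-DNN reorder):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Permute RNN weights from ldgoi [L][D][G][O][I] to ldigo [L][D][I][G][O].
// Toy index shuffle mirroring the ldgoi -> ldigo reorder described above.
std::vector<float> ldgoi_to_ldigo(const std::vector<float>& src,
                                  std::size_t L, std::size_t D,
                                  std::size_t G, std::size_t O, std::size_t I)
{
    std::vector<float> dst(src.size());
    for (std::size_t l = 0; l < L; ++l)
        for (std::size_t d = 0; d < D; ++d)
            for (std::size_t g = 0; g < G; ++g)
                for (std::size_t o = 0; o < O; ++o)
                    for (std::size_t i = 0; i < I; ++i)
                        dst[(((l * D + d) * I + i) * G + g) * O + o] =
                            src[(((l * D + d) * G + g) * O + o) * I + i];
    return dst;
}
```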
    
    * - refactor rnn mkldnn kernel, change variable names
    
    * fix RNN codegen kernel
    
    * disable rnn layer fusion pass to test CI
    
    * method to validate recurrent rnn inputs
    
    * add correlated matches for Recurrent RNN PM
    
    * - simplify reorder logic for rnn_weights
    - fix graph pattern for fusing rnn cell across time steps
    
    * do weight reorders in rnn time-step fusion
    
    * refactored LSTM graph pass
    
    * - Bug fix for finding the lstm inputs deterministically
    - Refactored LSTM graph pass to single pass
    - made changes to LSTM RNN time step fusion graph pass
    
    * - use replace_node instead of replace_output in Lstm_step_wise fusion graph pass
    
    * fix compilation error
    
    * Fix GNMT rnn fusion
    
    * check if the node is in use before replacing in RNN graph passes
    
    * i) fix style ii) fix topo sort issue in RNN graph pass
    
    * style fix
    
    * fix bug in simplify_concat pass
    
    * replaces Lstm1 -> {GOE1, GOE2} -> {Slice1, Slice2} -> Concat -> Lstm2 with Lstm1 -> Lstm2
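The pass above collapses the redundant GOE/Slice/Concat plumbing between consecutive LSTM cells so the second cell consumes the first one's outputs directly. A toy sketch of the rewrite on a linearized op sequence (names are illustrative, not the nGraph pattern-matcher API):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Collapse Lstm -> GOE -> Slice -> Concat -> Lstm into Lstm -> Lstm in a
// linearized op list. Toy stand-in for the graph pass described above;
// the real pass matches a DAG pattern and rewires node inputs.
std::vector<std::string> fuse_lstm_chain(std::vector<std::string> ops)
{
    const std::vector<std::string> pattern = {"Lstm", "GOE", "Slice",
                                              "Concat", "Lstm"};
    for (std::size_t i = 0; i + pattern.size() <= ops.size(); ++i)
    {
        bool match = true;
        for (std::size_t j = 0; j < pattern.size(); ++j)
            if (ops[i + j] != pattern[j])
            {
                match = false;
                break;
            }
        if (match)
            // Drop the intermediate GOE/Slice/Concat nodes.
            ops.erase(ops.begin() + i + 1, ops.begin() + i + 4);
    }
    return ops;
}
```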
    
    * cse for convert layout
    
    * addressed PR comments
    
    * - optimization pass to remove Lstm1 -> {GOE1, GOE2} -> {Slice1, Slice2} -> Lstm2
    - conditional fusing of LSTM cells only for the decoder
    
    * made changes to multi layer RNN fusion callback
    
    * fix asserts in RNN op
    
    * - added support to fuse layers when slc=dlc for RNN cells
    - bug fix on the sanity checks for RNN Op
    
    * - support RNN layer fusion till slc = dlc
    - bug fixes in multi layer rnn fusion call back
    
    * capture reshape in the RNN weights
    
    * Addressed PR comments
    
    * - added comments in multi layer PM call back
    - fuse only if slc == dlc across layers
    
    * restore deleted 3_lstm_cell_forward.json file
    
    * fix typo
    
    * fix failing unit tests
    
    * When processing in place slice, do not change the offset of the slice node if the argument pointer comes from function input.
    
    * Address PR feedback: process in place slice after propagating in place input.
    
    * Set INTERMEDIATE role before propagating in place input.
    
    * Do not add temporaries to the variable name map before propagating in place input in codegen.
    
    * Fix a bug in codegen.
    
    * Fix a bug in codegen slice.
    
    * reenable disabled rnn unit test
    
    * fix compiler error
    
    * - bug fix in the slicing logic for the layer fused rnn cell
    - fix failing rnn unit test
    
    * - Addressed PR comments
    - removed redundant checks from the rnn graph pass
    - simplified rnn call back replace node logic
    
    * - added new multilayer rnn *.json file
    - fix test case
    
    * [PRIVATE BRANCH] Style fixes (#2080)
    
    * Style fixes
    
    * change order of lstm gates
    
    * [PRIVATE BRANCH] Jbobba/rnn fusion review (#2113)
    
    * Style fixes for single-layer RNN fusion
    
    * Style fixes to multi-layer RNN
    
    * style fix
    
    * disable GPU test