Tanh is a saturating activation function: once the input magnitude grows past a certain point, the output barely changes, staying close to ±1. This bounded output is what makes it suitable as a gate. An unsaturated activation such as ReLU, whose output grows without bound, cannot achieve this gating effect.
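To make the contrast concrete, here is a minimal NumPy sketch (an illustrative example, not from the original text) showing how tanh's output saturates while ReLU's keeps growing with the input:

```python
import numpy as np

x = np.array([1.0, 5.0, 10.0, 100.0])

# Tanh saturates: outputs are bounded in (-1, 1) and barely change
# once |x| is large, so it behaves like a soft on/off gate.
print(np.tanh(x))        # [0.7616 0.9999 1.     1.    ]

# ReLU is unbounded: the output scales with the input, so it cannot
# squash activations into the bounded range a gate requires.
print(np.maximum(0, x))  # [  1.   5.  10. 100.]
```

Because tanh pins large inputs near ±1, multiplying another signal by a tanh (or sigmoid) branch acts like a switch that opens or closes; a ReLU branch would instead rescale the signal arbitrarily.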