cudnnAddTensor: supports tensors with up to 5 dimensions.
https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/

Functions for running a convolutional neural network:
https://python-school.ru/wiki/convolutional-neural-network/
1) cudnnPoolingForward / cudnnPoolingBackward. Pooling speeds up training by downscaling the feature map (e.g., a 2x2 window reduces its area 4x).

2) cudnnConvolutionForward, cudnnConvolutionBackwardFilter (for weight gradients),
   cudnnConvolutionBackwardBias, cudnnConvolutionBackwardData (for data gradients)

3) cudnnSoftmaxForward / cudnnSoftmaxBackward

4) cudnnActivationForward / cudnnActivationBackward
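As a rough sketch of how the convolution function above is driven, here is one 2D forward convolution with cuDNN. The shapes (1x3x32x32 input, eight 3x3 filters, padding 1) and the choice of the IMPLICIT_GEMM algorithm are assumptions for illustration, not from the notes; a real program would also fill the buffers with data and check CUDA return codes.

```cpp
#include <cudnn.h>
#include <cuda_runtime.h>
#include <cstdio>

// Minimal cuDNN status check for the sketch.
#define CHECK(call) do { cudnnStatus_t s = (call); \
    if (s != CUDNN_STATUS_SUCCESS) { \
        std::printf("cuDNN error: %s\n", cudnnGetErrorString(s)); return 1; } } while (0)

int main() {
    cudnnHandle_t handle;
    CHECK(cudnnCreate(&handle));

    // Input: 1 image, 3 channels, 32x32, NCHW float.
    cudnnTensorDescriptor_t xDesc, yDesc;
    CHECK(cudnnCreateTensorDescriptor(&xDesc));
    CHECK(cudnnSetTensor4dDescriptor(xDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                                     1, 3, 32, 32));

    // Filters: 8 output channels, 3 input channels, 3x3 kernel.
    cudnnFilterDescriptor_t wDesc;
    CHECK(cudnnCreateFilterDescriptor(&wDesc));
    CHECK(cudnnSetFilter4dDescriptor(wDesc, CUDNN_DATA_FLOAT, CUDNN_TENSOR_NCHW,
                                     8, 3, 3, 3));

    // Convolution: padding 1, stride 1, dilation 1.
    cudnnConvolutionDescriptor_t convDesc;
    CHECK(cudnnCreateConvolutionDescriptor(&convDesc));
    CHECK(cudnnSetConvolution2dDescriptor(convDesc, 1, 1, 1, 1, 1, 1,
                                          CUDNN_CROSS_CORRELATION, CUDNN_DATA_FLOAT));

    // Ask cuDNN for the output shape, then describe the output tensor.
    int n, c, h, w;
    CHECK(cudnnGetConvolution2dForwardOutputDim(convDesc, xDesc, wDesc, &n, &c, &h, &w));
    CHECK(cudnnCreateTensorDescriptor(&yDesc));
    CHECK(cudnnSetTensor4dDescriptor(yDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                                     n, c, h, w));

    // Device buffers (left uninitialized to keep the sketch short).
    float *x, *wts, *y;
    cudaMalloc(&x,   1 * 3 * 32 * 32 * sizeof(float));
    cudaMalloc(&wts, 8 * 3 * 3 * 3   * sizeof(float));
    cudaMalloc(&y,   (size_t)n * c * h * w * sizeof(float));

    // Query and allocate the scratch workspace the chosen algorithm needs.
    cudnnConvolutionFwdAlgo_t algo = CUDNN_CONVOLUTION_FWD_ALGO_IMPLICIT_GEMM;
    size_t wsBytes = 0;
    CHECK(cudnnGetConvolutionForwardWorkspaceSize(handle, xDesc, wDesc, convDesc,
                                                  yDesc, algo, &wsBytes));
    void* workspace = nullptr;
    if (wsBytes > 0) cudaMalloc(&workspace, wsBytes);

    const float alpha = 1.0f, beta = 0.0f;   // y = alpha * conv(x, w) + beta * y
    CHECK(cudnnConvolutionForward(handle, &alpha, xDesc, x, wDesc, wts, convDesc,
                                  algo, workspace, wsBytes, &beta, yDesc, y));
    std::printf("output shape: %dx%dx%dx%d\n", n, c, h, w);

    cudaFree(workspace); cudaFree(y); cudaFree(wts); cudaFree(x);
    cudnnDestroy(handle);
    return 0;
}
```

The backward functions (cudnnConvolutionBackwardData/Filter/Bias) follow the same pattern: descriptors, an algorithm choice, a workspace query, then the call.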


cuDNN supports a variety of activation functions, including:

    Sigmoid
    Hyperbolic Tangent
    Rectified Linear Unit (ReLU)
    Clipped Rectified Linear Unit (Clipped ReLU)
    Exponential Linear Unit (ELU)
    GELU (Gaussian Error Linear Unit)
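A sketch of selecting one of these activations and applying it with cudnnActivationForward (assuming an existing handle, tensor descriptors, and device buffers x and y from earlier setup):

```cpp
// Configure and apply a ReLU activation with cuDNN.
// Assumes handle, xDesc/yDesc, and device pointers x/y already exist.
cudnnActivationDescriptor_t actDesc;
cudnnCreateActivationDescriptor(&actDesc);

// Mode picks the function: CUDNN_ACTIVATION_SIGMOID, _TANH, _RELU,
// _CLIPPED_RELU, _ELU. The last argument (coef) is the clipping
// threshold for clipped ReLU or alpha for ELU; unused for plain ReLU.
cudnnSetActivationDescriptor(actDesc, CUDNN_ACTIVATION_RELU,
                             CUDNN_PROPAGATE_NAN, 0.0);

const float alpha = 1.0f, beta = 0.0f;   // y = alpha * act(x) + beta * y
cudnnActivationForward(handle, actDesc, &alpha, xDesc, x, &beta, yDesc, y);

cudnnDestroyActivationDescriptor(actDesc);
```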

https://readmedium.com/what-is-cudnn-3f67d535a580

Recurrent Neural Networks (RNNs) are a type of neural network architecture 
particularly well-suited for processing sequential data, such as text, speech, and time-series data.
cuDNN provides optimized implementations for various RNN operations, including:

    LSTM (Long Short-Term Memory): A widely used variant of RNNs that addresses the vanishing gradient
         problem by incorporating gating mechanisms to control information flow, enabling the learning
         of long-range dependencies in sequential data.

    GRU (Gated Recurrent Unit): Similar to LSTMs but with a simpler architecture, GRUs also employ gating
        mechanisms to control information flow and have shown comparable performance to LSTMs
        in many applications.

    Simple RNN: The basic variant of RNNs, which processes sequential data one step at a time and updates
                its internal state accordingly.

    Bidirectional RNNs: These architectures process sequential data in both forward and backward directions,
                allowing the model to capture context from both past and future inputs, improving
                performance in tasks like speech recognition and language modeling.

By providing optimized implementations of these RNN operations, cuDNN enables efficient processing of
sequential data, accelerating the training and inference of models used in natural language processing,
speech recognition, and time-series analysis.
