Activation Functions and Their Outputs

Let's create a 5×5 matrix to see how activation functions work:

import numpy as np
Y2=np.arange(25).reshape(5,5)
print(Y2)

array([[ 0,  1,  2,  3,  4],
       [ 5,  6,  7,  8,  9],
       [10, 11, 12, 13, 14],
       [15, 16, 17, 18, 19],
       [20, 21, 22, 23, 24]])

Now create a weight matrix of the same shape:

W3=np.arange(25).reshape(5,5)
print(W3)

array([[ 0,  1,  2,  3,  4],
       [ 5,  6,  7,  8,  9],
       [10, 11, 12, 13, 14],
       [15, 16, 17, 18, 19],
       [20, 21, 22, 23, 24]])

Now create a bias matrix of ones:

B3=np.ones_like(W3)
B3

array([[1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1]])
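Every activation in this section transforms the same linear pre-activation Z = Y·W + B element-wise, so it is worth computing Z once in plain NumPy before handing it to TensorFlow. A minimal sketch, using the same /100 scaling the TensorFlow examples below apply to keep values small:

```python
import numpy as np

Y2 = np.arange(25).reshape(5, 5)
W3 = np.arange(25).reshape(5, 5)
B3 = np.ones_like(W3)

# Scale inputs down, then form the linear pre-activation Z = (Y/100) @ (W/100) + B.
Z = (Y2 / 100) @ (W3 / 100) + B3
print(Z)  # every entry is a little above 1.0
```

Because every entry of Z is positive, the piecewise-linear activations below (ReLU, ELU) will simply pass it through unchanged.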

Now apply sigmoid:

import tensorflow as tf

sess = tf.Session()
Ys = Y2 / 100  # scale inputs to small values
Ws = W3 / 100
Y5 = tf.nn.sigmoid(tf.matmul(Ys, Ws) + B3)
sess.run(Y5)

array([[ 0.73399752,  0.73419272,  0.73438782,  0.73458284,  0.73477777],
       [ 0.73885001,  0.73952477,  0.7401984 ,  0.7408709 ,  0.74154227],
       [ 0.74364489,  0.74478704,  0.74592584,  0.74706129,  0.74819337],
       [ 0.74838172,  0.74997895,  0.7515694 ,  0.75315306,  0.75472992],
       [ 0.75306009,  0.75509996,  0.75712841,  0.75914542,  0.76115096]])

Now apply ReLU:

import tensorflow as tf

sess = tf.Session()
Y3 = tf.nn.relu(tf.matmul(Y2 / 100, W3 / 100) + B3)
sess.run(Y3)

array([[ 1.015 ,  1.016 ,  1.017 ,  1.018 ,  1.019 ],
       [ 1.04  ,  1.0435,  1.047 ,  1.0505,  1.054 ],
       [ 1.065 ,  1.071 ,  1.077 ,  1.083 ,  1.089 ],
       [ 1.09  ,  1.0985,  1.107 ,  1.1155,  1.124 ],
       [ 1.115 ,  1.126 ,  1.137 ,  1.148 ,  1.159 ]])

Now apply tanh:

import tensorflow as tf

sess = tf.Session()
Yt = Y2 / 100
Wt = W3 / 100
Y5 = tf.nn.tanh(tf.matmul(Yt, Wt) + B3)
sess.run(Y5)

array([[ 0.76782216,  0.76823229,  0.76864179,  0.76905067,  0.76945892],
       [ 0.77788807,  0.77926642,  0.78063728,  0.78200067,  0.78335662],
       [ 0.78757002,  0.78983767,  0.79208394,  0.79430896,  0.79651287],
       [ 0.79687814,  0.79995957,  0.80299938,  0.80599797,  0.80895576],
       [ 0.80582272,  0.80964582,  0.81340143,  0.81709044,  0.82071372]])

Now apply ELU:

import tensorflow as tf

sess = tf.Session()
Ye = Y2 / 100
We = W3 / 100
Y5 = tf.nn.elu(tf.matmul(Ye, We) + B3)
sess.run(Y5)

array([[ 1.015 ,  1.016 ,  1.017 ,  1.018 ,  1.019 ],
       [ 1.04  ,  1.0435,  1.047 ,  1.0505,  1.054 ],
       [ 1.065 ,  1.071 ,  1.077 ,  1.083 ,  1.089 ],
       [ 1.09  ,  1.0985,  1.107 ,  1.1155,  1.124 ],
       [ 1.115 ,  1.126 ,  1.137 ,  1.148 ,  1.159 ]])
