Deep Learning

Environment Setup

!pip install numpy scipy matplotlib ipython scikit-learn pandas pillow

Introduction to Artificial Neural Networks

Activation Function

Step Function
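For reference, the step function in closed form, matching the `x > 0` comparison in the code below:

$$ f(x) = \begin{cases} 1 & (x > 0) \\ 0 & (x \le 0) \end{cases} $$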

import numpy as np
import matplotlib.pyplot as plt

def step_function(x):
    # Returns 1 where x > 0, else 0 (np.int was removed in NumPy 1.24+, so use int)
    return np.array(x > 0, dtype=int)
    
x = np.arange(-5.0, 5.0, 0.1)
y = step_function(x)

plt.plot(x, y)
plt.ylim(-0.1, 1.1)
plt.show()

(Figure: plot of the step function)

Sigmoid Function
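In closed form, as implemented below:

$$ f(x) = \frac{1}{1 + e^{-x}} $$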

import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Example: sigmoid of a small array
# x = np.array([-1.0, 1.0, 2.0])
# print(sigmoid(x))  # [0.26894142 0.73105858 0.88079708]

x = np.arange(-5.0, 5.0, 0.1)
y = sigmoid(x)

plt.plot(x, y)
plt.ylim(-0.1, 1.1)
plt.show()

(Figure: plot of the sigmoid function)

ReLU Function

$$ f(x) = \max(0, x) $$

import numpy as np
import matplotlib.pyplot as plt

def relu(x):
    return np.maximum(0, x)

# Example: ReLU of a small array
# x = np.array([-5.0, 5.0, 0.1])
# print(relu(x))  # [0.  5.  0.1]

x = np.arange(-6.0, 6.0, 0.1)
y = relu(x)
plt.plot(x, y)
plt.ylim(-1, 6)
plt.show()

(Figure: plot of the ReLU function)

Loss Functions

Sum of Squared Error

$$ E = \frac{1}{2} \sum_{k} (y_k - t_k)^2 $$

where $y_k$ is the network output for element $k$ and $t_k$ is the corresponding target value.
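
A minimal NumPy sketch of this formula (the function name `sum_squared_error` and the sample arrays here are illustrative, not from the original):

import numpy as np

def sum_squared_error(y, t):
    # E = 1/2 * sum_k (y_k - t_k)^2
    return 0.5 * np.sum((y - t) ** 2)

# One-hot target for class 2 and a network output that favors the same class
t = np.array([0, 0, 1, 0, 0])
y = np.array([0.1, 0.05, 0.6, 0.15, 0.1])
print(sum_squared_error(y, t))  # ≈ 0.1025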

Cross-Entropy Error

$$ E = -\sum_k t_k \log y_k $$
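
A matching sketch for the cross-entropy error; the small `delta` added inside the log is an assumption here to avoid `log(0)` when an output is exactly zero:

import numpy as np

def cross_entropy_error(y, t):
    # E = -sum_k t_k * log(y_k); delta prevents log(0)
    delta = 1e-7
    return -np.sum(t * np.log(y + delta))

t = np.array([0, 0, 1, 0, 0])
y = np.array([0.1, 0.05, 0.6, 0.15, 0.1])
print(cross_entropy_error(y, t))  # ≈ 0.51, only the target class contributes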
