19. Multi Layer Neural Network

Why a multi-layer neural network is necessary


Single Layer Neural Network

  • Until now, only single-layer neural networks have been considered, not multi-layer ones.
  • A single-layer neural network has only one hidden layer between the input layer and the output layer. - WikiBooks
Image 1. Single layer neural network
  • Even though a single-layer neural network is not complicated, it performs well on linear regression, binary classification, and softmax regression.
  • A single-layer neural network finds the optimal weights and bias for a line that divides the True group from the False group. Therefore, a single-layer neural network can handle logical operations such as AND, OR, NAND, and NOR.

Logical Operations with Single Layer Neural Network

  • The weights and bias are the coefficients of the line that divides the red and blue dots in these examples.

AND

  • Truth table
x1 x2 Y
 0  0  0
 0  1  0
 1  0  0
 1  1  1
Table 1. AND
import numpy as np
import matplotlib.pyplot as plt

def AND(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement AND
    w = np.array([0.5, 0.5])
    b = -0.7
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1
    
X = [[0,0], [0,1], [1,0], [1,1]]
for x in X:
    y = AND(x[0], x[1])
    color = None
    if y == 0:
        color = "b"
    else:
        color = "r"
    plt.plot(x[0], x[1], "o"+color)

# Decision boundary: 0.5*x1 + 0.5*x2 - 0.7 = 0  ->  x2 = -x1 + 1.4
eq = lambda _x : -_x + 1.4
x1 = [i*0.1 for i in range(-4, 14)]
y1 = list(map(eq, x1))
plt.plot(x1, y1, "g")

plt.grid()
plt.xlim(-0.2, 1.2)
plt.ylim(-0.2, 1.2)
plt.title("AND")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
Image 2. AND

OR

  • Truth table
x1 x2 Y
 0  0  0
 0  1  1
 1  0  1
 1  1  1
Table 2. OR
import numpy as np
import matplotlib.pyplot as plt

def OR(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement OR
    w = np.array([0.5, 0.5])
    b = -0.2
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1
    
X = [[0,0], [0,1], [1,0], [1,1]]
for x in X:
    y = OR(x[0], x[1])
    color = None
    if y == 0:
        color = "b"
    else:
        color = "r"
    plt.plot(x[0], x[1], "o"+color)

# Decision boundary: 0.5*x1 + 0.5*x2 - 0.2 = 0  ->  x2 = -x1 + 0.4
eq = lambda _x : -_x + 0.4
x1 = [i*0.1 for i in range(-4, 14)]
y1 = list(map(eq, x1))
plt.plot(x1, y1, "g")

plt.grid()
plt.xlim(-0.2, 1.2)
plt.ylim(-0.2, 1.2)
plt.title("OR")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
Image 3. OR

NAND

  • Truth table
x1 x2 Y
 0  0  1
 0  1  1
 1  0  1
 1  1  0
Table 3. NAND
import numpy as np
import matplotlib.pyplot as plt

def NAND(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement NAND
    w = np.array([-0.5, -0.5])
    b = 0.7
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1
    
X = [[0,0], [0,1], [1,0], [1,1]]
for x in X:
    y = NAND(x[0], x[1])
    color = None
    if y == 0:
        color = "b"
    else:
        color = "r"
    plt.plot(x[0], x[1], "o"+color)

# Decision boundary: -0.5*x1 - 0.5*x2 + 0.7 = 0  ->  x2 = -x1 + 1.4
eq = lambda _x : -_x + 1.4
x1 = [i*0.1 for i in range(-4, 14)]
y1 = list(map(eq, x1))
plt.plot(x1, y1, "g")

plt.grid()
plt.xlim(-0.2, 1.2)
plt.ylim(-0.2, 1.2)
plt.title("NAND")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
Image 4. NAND

XOR Problem

  • However, XOR cannot be solved with a single-layer neural network.
  • Truth table
x1 x2 Y
 0  0  0
 0  1  1
 1  0  1
 1  1  0
Table 4. XOR
import numpy as np
import matplotlib.pyplot as plt
    
X = [[0,0], [0,1], [1,0], [1,1]]
Y = [0, 1, 1, 0]
for i in range(len(X)):
    x = X[i]
    y = Y[i]
    color = None
    if y == 0:
        color = "b"
    else:
        color = "r"
    plt.plot(x[0], x[1], "o"+color)

plt.grid()
plt.xlim(-0.2, 1.2)
plt.ylim(-0.2, 1.2)
plt.title("XOR")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
Image 5. XOR
  • It is impossible to divide the red and blue dots with a single line. - University of Torino
  • This problem can be solved with a multi-layer neural network.

Multi Layer Neural Network

  • A multi-layer neural network means there are several stacked hidden layers. - HackerNoon
Image 6. Multi layer neural network
  • If there are many hidden layers, the neural network is called a Deep Neural Network.
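The idea of stacked hidden layers can be sketched in NumPy. This is an illustrative sketch, not code from this post: the layer sizes (2 → 3 → 3 → 1) and the random weights are arbitrary choices, and each layer simply feeds its sigmoid activation into the next.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, layers):
    # Feed the activation of each layer into the next one
    a = x
    for W, b in layers:
        a = sigmoid(a @ W + b)
    return a

rng = np.random.default_rng(0)
# Two hidden layers (2 -> 3 -> 3) and one output layer (3 -> 1)
layers = [
    (rng.normal(size=(2, 3)), np.zeros(3)),
    (rng.normal(size=(3, 3)), np.zeros(3)),
    (rng.normal(size=(3, 1)), np.zeros(1)),
]

y = forward(np.array([1.0, 0.0]), layers)
print(y)  # a single sigmoid output between 0 and 1
```

Adding more `(W, b)` pairs to the list deepens the network without changing the forward-pass code at all.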

XOR Problem with Multi Layer Neural Network

  • XOR can be represented by combination of AND, OR and NAND. - Wiki
Image 7. XOR gate
  • In the diagram, NAND and OR are placed in the same layer, so XOR can be solved with a two-layer neural network.
import numpy as np
import matplotlib.pyplot as plt

def NAND(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement NAND
    w = np.array([-0.5, -0.5])
    b = 0.7
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1
    
def OR(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement OR
    w = np.array([0.5, 0.5])
    b = -0.2
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1

def AND(x1, x2):
    i = np.array([x1, x2])
    # Weight and bias are defined manually to implement AND
    w = np.array([0.5, 0.5])
    b = -0.7
    tmp = np.sum(i*w) + b
    if tmp <= 0:
        return 0
    else:
        return 1

def layer1(x1, x2):
    s1 = NAND(x1, x2)
    s2 = OR(x1, x2)
    return s1, s2
    
def layer2(x1, x2):
    return AND(x1, x2)

X = [[0,0], [0,1], [1,0], [1,1]]
for x in X:
    s1, s2 = layer1(x[0], x[1])
    y = layer2(s1, s2)
    color = None
    if y == 0:
        color = "b"
    else:
        color = "r"
    plt.plot(x[0], x[1], "o"+color)

plt.grid()
plt.xlim(-0.2, 1.2)
plt.ylim(-0.2, 1.2)
plt.title("XOR")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
Image 8. XOR

Training Multi Layer Neural Network

  • A single-layer neural network is trained with the gradient descent algorithm.
  • A multi-layer neural network is also trained with gradient descent, but the process is more complicated.
  • For a multi-layer neural network, the cost calculated at the output layer must be passed back to the first hidden layer. Furthermore, it is necessary to know how much each neuron affects the cost. These processes demand a lot of computing power.
  • Fortunately, there is a way to reduce the computation dramatically: back propagation. It will be introduced in the next post.
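Before back propagation is introduced, a small two-layer network can still be trained with plain gradient descent by estimating every gradient numerically. This is a sketch, not the post's code: the hidden-layer size, random seed, learning rate, and step count are arbitrary choices. Note that the central-difference gradient needs two cost evaluations per parameter element per step, which is exactly the computational burden described above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

rng = np.random.default_rng(42)
params = {
    "W1": rng.normal(size=(2, 4)), "b1": np.zeros(4),  # hidden layer
    "W2": rng.normal(size=(4, 1)), "b2": np.zeros(1),  # output layer
}

def predict(p, x):
    h = sigmoid(x @ p["W1"] + p["b1"])
    return sigmoid(h @ p["W2"] + p["b2"])

def cost(p):
    return np.mean((predict(p, X) - Y) ** 2)  # mean squared error

def numerical_gradient(p, key, eps=1e-4):
    # Central difference for every element: two cost evaluations
    # each, which is why this brute-force approach is so expensive.
    grad = np.zeros_like(p[key])
    it = np.nditer(p[key], flags=["multi_index"])
    while not it.finished:
        idx = it.multi_index
        orig = p[key][idx]
        p[key][idx] = orig + eps
        c_plus = cost(p)
        p[key][idx] = orig - eps
        c_minus = cost(p)
        p[key][idx] = orig  # restore the parameter
        grad[idx] = (c_plus - c_minus) / (2 * eps)
        it.iternext()
    return grad

c_initial = cost(params)
lr = 0.5
for step in range(2000):
    grads = {k: numerical_gradient(params, k) for k in params}
    for k in params:
        params[k] -= lr * grads[k]

print(c_initial, "->", cost(params))  # the cost should shrink
```

Back propagation computes the same gradients with a single backward pass instead of two cost evaluations per parameter, which is what makes training deep networks practical.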
