Multilayer Perceptron

A neural network model (a three-layer perceptron) learns z = f(x, y) under supervision. The value of z is shown as the color in the left square region. In the right region, the white curve is the training error and the black curve is the prediction error. The goal is for the area to the left of the sinusoidal boundary to be red and the area to the right to be blue. The user can place training examples with the mouse, indicated by the white circle in the left region.

The learning algorithm is backpropagation, and the hidden layer consists of 30 neurons. The sketch exhibits the typical phenomenon called overfitting: the training error becomes very small even though the prediction error remains relatively large.

I wrote this ten years ago in Processing, the Java-based IDE. At the time I thought neural network models were a classic, even obsolete, topic. Recently, however, I have seen the "return of the neuro." Is deep learning really that great? I have to catch up with the new topic.

Corresponding website: http://www.openprocessing.org/sketch/... (hidden layer: 20 neurons)
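The setup described above can be sketched in plain Java (the language underlying Processing). The original source is not reproduced here, so the class and method names (`MlpSketch`, `forward`, `train`), the learning rate, the weight initialization, and the exact boundary curve `x = sin(pi*y)` are illustrative assumptions; only the 2-input / 30-hidden / 1-output architecture and the use of online backpropagation come from the description.

```java
import java.util.Random;

/** Minimal 2-30-1 sigmoid perceptron trained by online backpropagation (a sketch). */
public class MlpSketch {
    static final int HIDDEN = 30;                 // hidden-layer size from the description
    static double[][] w1 = new double[HIDDEN][3]; // input->hidden weights (x, y, bias)
    static double[] w2 = new double[HIDDEN + 1];  // hidden->output weights + bias
    static double[] h = new double[HIDDEN];       // last hidden activations

    static double sigmoid(double a) { return 1.0 / (1.0 + Math.exp(-a)); }

    /** Forward pass: returns the network output in (0, 1). */
    static double forward(double x, double y) {
        for (int j = 0; j < HIDDEN; j++)
            h[j] = sigmoid(w1[j][0] * x + w1[j][1] * y + w1[j][2]);
        double a = w2[HIDDEN];
        for (int j = 0; j < HIDDEN; j++) a += w2[j] * h[j];
        return sigmoid(a);
    }

    /** One backpropagation step on a single example (squared-error loss). */
    static void train(double x, double y, double t, double eta) {
        double out = forward(x, y);
        double dOut = (out - t) * out * (1 - out);          // delta at the output unit
        for (int j = 0; j < HIDDEN; j++) {
            double dHid = dOut * w2[j] * h[j] * (1 - h[j]); // delta at hidden unit j
            w2[j] -= eta * dOut * h[j];
            w1[j][0] -= eta * dHid * x;
            w1[j][1] -= eta * dHid * y;
            w1[j][2] -= eta * dHid;
        }
        w2[HIDDEN] -= eta * dOut;
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int j = 0; j < HIDDEN; j++) {
            for (int k = 0; k < 3; k++) w1[j][k] = rnd.nextGaussian() * 0.5;
            w2[j] = rnd.nextGaussian() * 0.5;
        }
        // Points in [-1,1]^2 labeled by an assumed sinusoidal boundary x = sin(pi*y):
        // left of the curve is "red" (target 1), right of it is "blue" (target 0).
        int nTrain = 200, nTest = 100, n = nTrain + nTest;
        double[][] pts = new double[n][2];
        double[] labels = new double[n];
        for (int i = 0; i < n; i++) {
            pts[i][0] = rnd.nextDouble() * 2 - 1;
            pts[i][1] = rnd.nextDouble() * 2 - 1;
            labels[i] = pts[i][0] < Math.sin(Math.PI * pts[i][1]) ? 1.0 : 0.0;
        }
        for (int epoch = 0; epoch < 2000; epoch++)
            for (int i = 0; i < nTrain; i++)
                train(pts[i][0], pts[i][1], labels[i], 0.5);
        // Training vs. held-out error: a gap between the two is the overfitting
        // that the white and black curves of the applet visualize.
        int trainWrong = 0, testWrong = 0;
        for (int i = 0; i < n; i++) {
            boolean wrong = (forward(pts[i][0], pts[i][1]) > 0.5 ? 1.0 : 0.0) != labels[i];
            if (wrong) { if (i < nTrain) trainWrong++; else testWrong++; }
        }
        System.out.println("train misclassified: " + trainWrong + "/" + nTrain);
        System.out.println("test  misclassified: " + testWrong + "/" + nTest);
    }
}
```

The applet instead draws the output color at every pixel of the left square and plots the two error curves over time; the training loop and the two error measurements above are the corresponding computation without the graphics.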
