
Deep Learning - ANN - Artificial Neural Network - Forward Propagation Tutorial

Forward propagation is the process of passing the inputs through the network with their weights. In each hidden layer, the weighted sum of the inputs plus a bias is computed and passed through an activation function, and that output becomes the input to the next layer. It is called forward propagation because the computation starts at the input layer and moves toward the final output layer.

LAYER 1-

Trainable Parameters in L1 = 4 x 3 + 3 = 12 weights + 3 biases = 15

On matrix multiplication of the 4 x 3 weight matrix (12 weights) with the 4 inputs, and addition of the 3 biases based on the formula- \(\displaystyle\sum_{i=1}^{m}(w_ix_i) + bias\) , the outputs are O11, O12, and O13.

LAYER 1 outputs O11, O12, and O13 will be used as input for LAYER 2.

LAYER 2-

Trainable Parameters in L2 = 3 x 2 + 2 = 6 weights + 2 biases = 8

On matrix multiplication of the 3 x 2 weight matrix (6 weights) with the 3 inputs, and addition of the 2 biases based on the formula- \(\displaystyle\sum_{i=1}^{m}(w_ix_i) + bias\) , the outputs are O21 and O22.

LAYER 2 outputs O21 and O22 will be used as input for LAYER 3.

LAYER 3-

Trainable Parameters in L3 = 2 x 1 + 1 = 2 weights + 1 bias = 3

On matrix multiplication of the 2 x 1 weight matrix (2 weights) with the 2 inputs, and addition of the 1 bias based on the formula- \(\displaystyle\sum_{i=1}^{m}(w_ix_i) + bias\) , the output is Y'i.

Final Output (Y'i) comes from LAYER 3.

Total Trainable Parameters = 15 + 8 + 3 = 26
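The layer-by-layer arithmetic above can be sketched in NumPy. The weights and the input vector below are randomly initialised illustrative values, and sigmoid is assumed as the activation at every layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # activation assumed for this sketch
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes from the walkthrough: 4 inputs -> 3 -> 2 -> 1 output
sizes = [4, 3, 2, 1]

# Randomly initialised weights and biases (illustrative only)
weights = [rng.normal(size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [rng.normal(size=n) for n in sizes[1:]]

# Trainable parameters per layer: weight entries + bias entries
params = [w.size + b.size for w, b in zip(weights, biases)]
print(params)       # [15, 8, 3]
print(sum(params))  # 26

# Forward propagation: z = x @ W + b, then activation, layer by layer
x = np.array([0.5, -1.0, 2.0, 0.1])
a = x
for w, b in zip(weights, biases):
    a = sigmoid(a @ w + b)  # sum of w_i * x_i plus bias, per neuron

print(a.shape)      # (1,) -> the final prediction Y'
```

A Keras `Dense` stack with the same 4-3-2-1 shape reports the same per-layer parameter counts in `model.summary()`, since a `Dense` layer has inputs x units weights plus units biases.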

Practical 1 - Customer Churn Prediction using ANN | Keras and TensorFlow

Practical 2 - Handwritten Digit Classification using ANN | MNIST Dataset

Practical 3 - Graduate Admission Prediction using ANN

Loss Function-

The loss function is a method of evaluating how well your algorithm is modeling your dataset.

Loss Function in Deep Learning- (Please check in Machine Learning Notes)

1] Regression

  • Mean Squared Error
  • Mean Absolute Error
  • Huber Loss

2] Classification

  • Binary Cross Entropy
  • Categorical Cross Entropy
  • Hinge Loss

3] Autoencoders

  • KL Divergence

4] GAN

  • Discriminator Loss
  • Min Max GAN Loss

5] Object Detection

  • Focal Loss

6] Embedding

  • Triplet Loss
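As a concrete check of the regression losses listed above, here is a minimal NumPy sketch of MSE, MAE, and Huber loss; the sample values are illustrative, and the Huber delta of 1.0 is an assumed default:

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

def mse(y, yhat):
    # Mean Squared Error: squares amplify large (outlier) errors
    return np.mean((y - yhat) ** 2)

def mae(y, yhat):
    # Mean Absolute Error: every error contributes linearly
    return np.mean(np.abs(y - yhat))

def huber(y, yhat, delta=1.0):
    # Quadratic for small errors, linear beyond delta (robust compromise)
    err = y - yhat
    small = np.abs(err) <= delta
    return np.mean(np.where(small,
                            0.5 * err ** 2,
                            delta * (np.abs(err) - 0.5 * delta)))

print(mse(y_true, y_pred))    # 0.375
print(mae(y_true, y_pred))    # 0.5
print(huber(y_true, y_pred))  # 0.1875
```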

Loss Function (Error Function) vs Cost Function

The loss function measures the error on a single training example, while the cost function is the average of the loss over the entire training set (or a mini-batch).

  • MSE - Mean Squared Error (when there are no outliers)
  • MAE - Mean Absolute Error (more robust when outliers are present)
  • Huber Loss - when a sizeable fraction (e.g. ~25%) of points are outliers
  • BCE - Binary Cross Entropy (2 classes)
  • CCE - Categorical Cross Entropy (more than 2 classes, one-hot labels)
  • SCE - Sparse Categorical Cross Entropy (more than 2 classes, integer labels)
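The cross-entropy variants in the guide above can be sketched in NumPy; the probability table and labels below are illustrative, and the clipping epsilon is an assumed safeguard against log(0):

```python
import numpy as np

def bce(y, p, eps=1e-12):
    # Binary cross-entropy: 2 classes, p = predicted probability of class 1
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def cce(y_onehot, p, eps=1e-12):
    # Categorical cross-entropy: >2 classes, one-hot labels
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(np.sum(y_onehot * np.log(p), axis=1))

def scce(y_int, p, eps=1e-12):
    # Sparse categorical cross-entropy: same loss, integer labels
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(np.log(p[np.arange(len(y_int)), y_int]))

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
onehot = np.array([[1, 0, 0],
                   [0, 1, 0]])
labels = np.array([0, 1])

# CCE and sparse CCE agree when the integer labels match the one-hot rows
print(np.isclose(cce(onehot, probs), scce(labels, probs)))  # True
```

The only practical difference between CCE and SCE is the label format, which is why Keras exposes them as separate losses (`categorical_crossentropy` vs `sparse_categorical_crossentropy`).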

