Backpropagation Algorithm – an important mathematical tool for making better, higher-accuracy predictions in machine learning. It is a supervised learning technique for training artificial neural networks. The whole idea of training a multi-layer perceptron is to compute the derivatives of the error function with respect to the weights, and backpropagation is the algorithm that does exactly that. It rests on simple linear-algebra operations and the chain rule, with the goal of minimising the error function by propagating the error backwards through the network and using the resulting gradients to update the weights.

In this post, we will focus on backpropagation and the basic details around it, at a high level and in simple English.


 

What is the Backpropagation Algorithm?

As mentioned above, “Backpropagation” is an algorithm that uses supervised learning to compute the gradient of the error function with respect to the weights (the delta rule), which gradient descent then uses to update those weights.
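To make that concrete, here is the standard textbook weight-update rule that these gradients feed into. The symbols are my own choice for illustration and are not defined in the post: E is the error function, w_ij a single weight, and η the learning rate.

```latex
% Gradient-descent (delta-rule style) weight update:
% each weight moves a small step against its own error gradient.
\Delta w_{ij} = -\,\eta \, \frac{\partial E}{\partial w_{ij}},
\qquad
w_{ij} \leftarrow w_{ij} + \Delta w_{ij}
```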


As per Wikipedia – “Backpropagation is a method used in artificial neural networks to calculate a gradient that is needed in the calculation of the weights to be used in the network.”

This algorithm is used to find the weights that minimise the error function while the model is being trained. The core idea of backpropagation is to work out what impact a small change in each weight would have on the overall cost of the neural network, and to nudge the weights in the direction that lowers that cost.
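As a rough sketch of that idea (not code from the post), the snippet below trains a tiny two-layer network with NumPy: the forward pass computes the cost, the backward pass uses the chain rule to get the gradient of the cost with respect to each weight matrix, and gradient descent applies the update. All names (x, y, W1, W2, lr, the network sizes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 target value each.
x = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Random weights for a 3 -> 5 -> 1 network with a sigmoid hidden layer.
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1  # learning rate (eta)

for step in range(100):
    # Forward pass: prediction and cost (mean squared error).
    h = sigmoid(x @ W1)          # hidden activations
    y_hat = h @ W2               # network output
    cost = np.mean((y_hat - y) ** 2)

    # Backward pass: chain rule gives dCost/dW2 and dCost/dW1.
    d_out = 2 * (y_hat - y) / y.shape[0]      # dCost/dy_hat
    grad_W2 = h.T @ d_out                     # dCost/dW2
    d_hidden = (d_out @ W2.T) * h * (1 - h)   # error pushed back through sigmoid
    grad_W1 = x.T @ d_hidden                  # dCost/dW1

    # Gradient-descent update: move each weight against its gradient.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print("final cost:", cost)
```

Running it shows the cost shrinking over the 100 steps, which is the whole point: the gradients tell us how each weight affects the cost, and the updates exploit that.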

 


 

About the Author V Sharma

Specialised in Financial Technology (FinTech) and Artificial Intelligence for FinTech; Mobile Financial Services (Cross-Border Remittances, Mobile Money, Mobile Banking, Mobile Payments), Data Science, IT Service Management, Machine Learning, Neural Networks and Deep Learning techniques in FinTech; and Mobile Data, Billing & Prepaid Charging Services (IN, OCS & CVBS), with over 15 years of experience. Has led start-ups and new business units successfully at local and international levels with hands-on engineering and business strategy.