When I began my tryst with machine learning and neural networks, working through dozens of websites, books, and tutorials, I realized something was missing: a buildup hinting that its development was not miraculous or a fluke, but more or less a natural progression of conventional regression, made possible by rapidly advancing processing power and some of the neat little mathematical tricks we had or discovered over the last century - while also shaping the ideas behind it from the ground up, for the semi-initiated.
Most of the courses and introductions I read explained that we needed something that could mimic biological/human cognition and decision making. But there are endless ways of doing that effectively. One way, of course, is to start by copying the design of the biological neuron, but that reason alone doesn't sound very convincing. I would not believe that, without mimicking biology, it is impossible to build something immensely more efficient and effective.
My enlightenment came from realizing the regression basis of neural networks and gaining an intuitive understanding of most of the mathematics at play, while appreciating their incidental similarity to biological neurons.
This book is a humble effort to explain the basic working of a neural network from a student's viewpoint, along the lines mentioned above, mostly heuristically. Your feedback and comments are heartily welcome!
This is an ongoing effort: more chapters are being added, and the existing content is enhanced wherever needed.
For attribution or use in other works, please cite this book as: Manisar, "Neural Networks - What's Happening? An Intuitive Introduction to Machine Learning", SKKN Press, 2020.
Learn about LSTMs, and see why they work the way they do by interacting with one!
It's astonishing to see that, using a very simple mechanism, we can roughly generate the pattern that long- and short-term memory are supposed to follow.
In this chapter, we'll be looking at the recurrent neural network (RNN) and how it is described.
After constructing the theoretical framework in the last chapter, we will now be dealing with some of the practical difficulties.
From traditional regression to neural networks - it's not as big a leap as you might think. In this book, let's take a peek at this transition while appreciating how the animal kingdom is already using this strategy. We will be taking help from our friend - intuition - time and again.
In this chapter, we will sharpen our theoretical tools and sneak our way into the mathematics of neural networks.
In this chapter, we will be looking at the basics - the idea of prediction, using traditional regression, and moving towards learning-based methods.
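As a small taste of that starting point, here is a minimal sketch of traditional regression: fitting a straight line y = m·x + b to a handful of points using the closed-form least-squares estimates. The data values are made up purely for illustration.

```python
# Made-up data, roughly following y = 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 8.1, 9.9]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares estimates for the slope and intercept.
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - m * mean_x

def predict(x):
    """Predict y for a new x using the fitted line."""
    return m * x + b

print(m, b, predict(6.0))
```

A neural network with no hidden layer and a linear activation is doing essentially this; the chapters ahead build up from here.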