When I began my tryst with machine learning and neural networks, working through dozens of websites, books, and tutorials, I realized something was missing: a buildup hinting that their development was not anything miraculous or a fluke, but more or less a natural progression of conventional regression, made possible by rapidly advancing processing power and some of the neat little mathematical tricks we had, or discovered, in the latter half of the last century. I also wanted an account that shapes the underlying ideas from the ground up, for the semi-initiated.
Most of the courses and introductions I read explained that we needed something that could mimic biological/human cognition and decision making. But there are endless ways of doing that effectively. One way, of course, is to start by copying the design of the biological neuron, but that reason alone does not sound very convincing. I, for one, would not believe that without mimicking biology it is impossible to build something immensely more efficient and effective.
My enlightenment came from recognizing the regression basis of neural networks and gaining an intuitive understanding of most of the mathematics at play, while appreciating their incidental similarity to biological neurons.
This book is a humble effort to explain the basic working of a neural network from a student's viewpoint, along the lines mentioned above, mostly heuristically. Your feedback and comments are heartily welcome!
This is an ongoing effort: more chapters are being added, and the existing content is being enhanced wherever needed.
For attribution or use in other works, please cite this book as: Manisar, "Neural Networks - What's Happening? An Intuitive Introduction to Machine Learning", SKKN Press, 2020.