Have you ever wondered how neural networks actually work? They're the little boxes that appear around faces in a digital camera, the small programs that read the dollar amounts on checks, and the tiny minds behind the recommendation systems at YouTube, Netflix, and others.
These systems are certainly complicated in their own right, but they all follow the same basic principles common to every neural network. And you may be surprised to hear that the overarching concepts are rather simple. At heart, it's just: inputs, outputs, errors, and weights.
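To make those four ingredients concrete, here is a toy sketch of my own (not from the book): a single "neuron" with one weight that learns to double its input, by using the error between its output and the target to nudge the weight.

```python
# Toy illustration of inputs, outputs, errors, and weights:
# a single weight learns to double its input.

inputs  = [1.0, 2.0, 3.0, 4.0]   # training inputs
targets = [2.0, 4.0, 6.0, 8.0]   # desired outputs (2 * input)

weight = 0.5          # start with a guess
learning_rate = 0.05  # how strongly each error adjusts the weight

for _ in range(100):  # repeat over the training data many times
    for x, target in zip(inputs, targets):
        output = weight * x                   # input * weight -> output
        error = target - output               # how far off we are
        weight += learning_rate * error * x   # nudge weight toward target

print(round(weight, 3))  # converges to 2.0
```

Real networks have many weights, layers, and a nonlinear activation function, but the learn-from-error loop is the same idea.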
I recently read "Make Your Own Neural Network" by Tariq Rashid. This was a fantastic read and he breaks down each concept in a really easy-to-understand way, step by step. He also makes it interesting (and funny) with visuals! At the time of this writing, you can get it on Amazon for about $4.00 for Kindle: Make Your Own Neural Network.
If you don't want to buy it, he has a blog dedicated to the book as well: https://makeyourownneuralnetwork.blogspot.com. He even has all the Python code available on GitHub here.
In his book, he walks you through the steps of creating a neural network that can 'read' handwritten numbers, 0-9. I'm happy to say that I was able to follow along and get the program to effectively 'learn' from the training dataset and accurately read the digits. I even tested it out on my own handwriting!
My handwritten numbers
I had to manually crop and resize the image down to 28 x 28 pixels. I did this with Gimp.
Then I was able to feed it into the Python script:
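Before the script can use the image, the 28 x 28 grid of pixels has to be turned into the flat list of input values the network expects. As I recall from the book, the pixels are inverted (MNIST digits are light ink on a dark background, the opposite of a scan) and rescaled into a small positive range. A sketch of that preprocessing, using Pillow and NumPy as assumed stand-ins for the book's image-loading code:

```python
# Sketch of preparing a 28 x 28 scan as network input:
# flatten, invert so ink is "bright", and rescale to 0.01 - 0.99.
import numpy as np
from PIL import Image

img = Image.new("L", (28, 28), color=255)        # stand-in for a real scan
pixels = np.array(img, dtype=float).reshape(784)  # 28 * 28 = 784 inputs

# Invert (255 - pixel) and rescale 0-255 into 0.01-0.99,
# avoiding exact zeros, which can stall learning.
inputs = (255.0 - pixels) / 255.0 * 0.99 + 0.01

print(inputs.shape)  # -> (784,)
# "inputs" is what gets passed to the trained network for a prediction
```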
If you enjoy his writing, I would also recommend his other book, "Make Your Own Mandelbrot". Check out my other post here: https://www.antaresanalytics.net/single-post/2018/09/10/Creating-a-Zooming-Mandelbrot-Fractal-with-Python-in-Power-BI