July 15th, 2024

Show HN: I created a Neural Network from scratch, in scratch

The article describes implementing a 1-layer feed-forward network in Scratch for image classification. The author faced challenges handling multi-dimensional data, but achieved promising results after training on a limited portion of the MNIST dataset.


This article describes the author's experience implementing a 1-layer feed-forward network from scratch to classify images from the MNIST dataset. The model is built and trained entirely in the Scratch programming language, which offers limited support for multi-dimensional data and lacks functions and variable scope. The implementation covers initializing weights with the Xavier initialization method, matrix multiplication, the ReLU activation function, Softmax for converting outputs into a probability distribution, and the cross-entropy loss calculation. The backward pass for gradient descent computes the gradient of the loss with respect to each trainable parameter using the chain rule. The author made incorrect assumptions about memory order and had to rewrite parts of the code, but despite these setbacks the model produced promising results on dummy data points. Because of Scratch's performance limitations, only a subset of the MNIST dataset was used for training. The article highlights the difficulty of implementing deep learning concepts in a constrained programming environment like Scratch.
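The pipeline summarized above (Xavier initialization, matrix multiply, ReLU, Softmax, cross-entropy, and chain-rule gradients) can be sketched outside of Scratch in a few lines of NumPy. This is not the author's Scratch code, just a hypothetical illustration of the same 1-layer network and one gradient-descent step; all names, sizes, and the learning rate are assumptions.

```python
# Hypothetical NumPy sketch of the 1-layer feed-forward network described
# in the post: Xavier init, matmul, ReLU, Softmax, cross-entropy loss,
# and chain-rule gradients for one gradient-descent step.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 784, 32, 10   # MNIST: 28x28 inputs, 10 digit classes

# Xavier initialization: scale weights by sqrt(1 / fan_in)
W1 = rng.normal(0.0, np.sqrt(1.0 / n_in), (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, np.sqrt(1.0 / n_hidden), (n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(x):
    z1 = x @ W1 + b1
    a1 = np.maximum(z1, 0.0)             # ReLU
    z2 = a1 @ W2 + b2
    z2 = z2 - z2.max(axis=1, keepdims=True)  # numerically stable Softmax
    p = np.exp(z2) / np.exp(z2).sum(axis=1, keepdims=True)
    return z1, a1, p

def train_step(x, y, lr=0.1):
    """One gradient-descent step on a batch; y holds integer class labels."""
    global W1, b1, W2, b2
    n = x.shape[0]
    z1, a1, p = forward(x)
    loss = -np.log(p[np.arange(n), y] + 1e-12).mean()  # cross-entropy

    # Backward pass via the chain rule
    dz2 = p.copy()
    dz2[np.arange(n), y] -= 1.0          # dL/dz2 = p - one_hot(y)
    dz2 /= n
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0)
    da1 = dz2 @ W2.T
    dz1 = da1 * (z1 > 0)                 # ReLU gradient gates the backward signal
    dW1 = x.T @ dz1
    db1 = dz1.sum(axis=0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return loss
```

Repeatedly calling `train_step` on the same small batch should drive the loss down from roughly ln(10) ≈ 2.3 toward zero, which mirrors the sanity check on dummy data points mentioned in the summary.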

2 comments
By @RateMyPE - 7 months
This is really cool. You could try running it on Turbowarp for the training, which is a faster "version" of Scratch.