Neural Network from Scratch

Designed and implemented a basic neural network using only NumPy to build a ground-up understanding of foundational deep learning concepts.

This project involved implementing a neural network from scratch using only NumPy to classify handwritten digits from the MNIST dataset. The architecture consists of an input layer, a single hidden layer, and an output layer. With forward propagation, backpropagation, gradient descent, and dropout all implemented by hand, the model reached 98% accuracy on the classification task.
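As a minimal sketch of what the forward pass of such a network can look like, the snippet below uses ReLU in the hidden layer, softmax at the output, and inverted dropout. The layer sizes, activation choices, and dropout rate are illustrative assumptions, not the repository's exact code:

```python
import numpy as np

def forward(X, W1, b1, W2, b2, drop_rate=0.2, training=True):
    """One forward pass: input -> hidden (ReLU + dropout) -> softmax output."""
    z1 = X @ W1 + b1            # hidden-layer pre-activation
    a1 = np.maximum(0.0, z1)    # ReLU activation
    if training:
        # Inverted dropout: zero out units and rescale so the expected
        # activation is unchanged, letting inference skip dropout entirely.
        mask = (np.random.rand(*a1.shape) > drop_rate) / (1.0 - drop_rate)
        a1 *= mask
    else:
        mask = None
    z2 = a1 @ W2 + b2           # output-layer logits
    # Numerically stable softmax over the 10 digit classes
    exp = np.exp(z2 - z2.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    return probs, (z1, a1, mask)
```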

I coded every part of the network by hand, including the activation functions and the dropout mechanism, to build a deeper understanding of how each element contributes to learning. Implementing backpropagation myself clarified how gradients are computed and used to update weights during training. This hands-on experience gave me a solid grasp of neural network mechanics and the grounding to build and analyze more complex models in the future.
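To illustrate that gradient flow, a backward pass matching the forward sketch above might look like the following. It assumes softmax outputs with cross-entropy loss and one-hot targets, followed by a plain gradient-descent update; this is an assumed layout, not the repository's implementation:

```python
def backward(X, y_onehot, probs, cache, W2):
    """Gradients of mean cross-entropy loss w.r.t. all parameters."""
    z1, a1, mask = cache
    n = X.shape[0]
    # Softmax + cross-entropy combine into a simple output-layer gradient
    dz2 = (probs - y_onehot) / n
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0)
    da1 = dz2 @ W2.T
    if mask is not None:
        da1 *= mask              # route gradients through the dropout mask
    dz1 = da1 * (z1 > 0)         # ReLU derivative: 1 where z1 > 0, else 0
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    return dW1, db1, dW2, db2

# Vanilla gradient-descent step with learning rate lr:
# W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
```

Combining softmax and cross-entropy into the single term (probs - y_onehot) is a standard simplification that avoids computing the full softmax Jacobian explicitly.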

GitHub Repository