Implement a two-hidden-layer neural network classifier from scratch in JAX
₹600-1000 INR
Closed
Posted: about 2 years ago
Paid on delivery
Two hidden layers here means (input - hidden1 - hidden2 - output).
You must not use flax, optax, or any other library for this task.
Use MNIST dataset with 80:20 train:test split.
Manually optimize the number of neurons in hidden layers.
Use gradient descent from scratch to optimize your network. You should use the Pytree concept of JAX to do this elegantly.
Plot the loss vs. iterations curve with matplotlib.
Evaluate the model on test data with various classification metrics and briefly discuss their implications.
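A minimal sketch of how the pytree-based gradient descent requirement could be approached. The layer widths (128 and 64), He initialization, and learning rate are illustrative assumptions, not part of the brief; the point is that the parameters live in one pytree so a single `tree_map` updates every weight and bias:

```python
import jax
import jax.numpy as jnp

def init_params(key, sizes):
    # sizes like [784, 128, 64, 10]: input - hidden1 - hidden2 - output.
    # The returned value is a pytree: a list of dicts of arrays.
    params = []
    keys = jax.random.split(key, len(sizes) - 1)
    for k, (m, n) in zip(keys, zip(sizes[:-1], sizes[1:])):
        params.append({
            "W": jax.random.normal(k, (m, n)) * jnp.sqrt(2.0 / m),  # He init (assumed)
            "b": jnp.zeros(n),
        })
    return params

def forward(params, x):
    # ReLU on both hidden layers; the last layer returns raw logits.
    for layer in params[:-1]:
        x = jax.nn.relu(x @ layer["W"] + layer["b"])
    last = params[-1]
    return x @ last["W"] + last["b"]

def loss_fn(params, x, y):
    # Mean cross-entropy; y is one-hot encoded.
    logp = jax.nn.log_softmax(forward(params, x))
    return -jnp.mean(jnp.sum(y * logp, axis=1))

@jax.jit
def sgd_step(params, x, y, lr):
    grads = jax.grad(loss_fn)(params, x, y)
    # One tree_map walks the whole pytree: every W and b updated in one line.
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

In the training loop you would record `loss_fn(params, x, y)` at each iteration into a Python list and pass it to `matplotlib.pyplot.plot` for the required loss curve.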
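Since the brief forbids external libraries, the test-set metrics would also need to be computed by hand. One possible sketch, building per-class precision, recall, and F1 from a confusion matrix with `jnp` only (the function name and return format are my own choices):

```python
import jax.numpy as jnp

def classification_metrics(logits, labels, num_classes):
    preds = jnp.argmax(logits, axis=1)
    # Confusion matrix: rows = true class, cols = predicted class.
    # .at[...].add(1) scatter-adds, so duplicate (row, col) pairs accumulate.
    cm = jnp.zeros((num_classes, num_classes), dtype=jnp.int32)
    cm = cm.at[labels, preds].add(1)
    tp = jnp.diag(cm).astype(jnp.float32)
    precision = tp / jnp.maximum(cm.sum(axis=0), 1)  # per-class
    recall = tp / jnp.maximum(cm.sum(axis=1), 1)     # per-class
    f1 = 2 * precision * recall / jnp.maximum(precision + recall, 1e-8)
    accuracy = tp.sum() / labels.shape[0]
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

For MNIST, comparing per-class recall values is a quick way to spot which digits the network confuses, which supports the required discussion of the metrics' implications.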
Hello, this is Collins. I am a MATLAB expert who has handled dozens of machine learning projects, including implementations with hidden layers.
I would like to discuss the project further. Please get in touch.