Knowledge Distillation and Transfer Learning

Completed · Posted 2 years ago · Paid on delivery

What the project should include:

Two models: one bigger and one smaller.

Two datasets: one more complex (CIFAR-10) and one simpler (MNIST).

Task-1: Knowledge distillation, then transfer learning

Train the bigger model on CIFAR-10 (teacher training; this becomes model-A).
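As a sketch, the teacher (model-A) stage could look like the following. The architecture, optimizer, and epoch count below are assumptions, since the brief fixes only the dataset (CIFAR-10) and the rough "bigger model" requirement:

```python
import tensorflow as tf

def build_teacher(input_shape=(32, 32, 3), num_classes=10):
    """Hypothetical 'bigger' CNN; the brief does not specify an architecture."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes),  # raw logits; softmax lives in the loss
    ])

teacher = build_teacher()
teacher.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# CIFAR-10 ships with Keras; uncomment to actually train model-A:
# (x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
# teacher.fit(x_train / 255.0, y_train, epochs=20, validation_split=0.1)
```

Keeping the output as logits (no final softmax) makes the later distillation step cleaner, since temperature scaling is applied to logits.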

Train the bigger model and the smaller model on CIFAR-10 simultaneously (this is the knowledge distillation; the smaller model becomes model-B).
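Reading "simultaneously" as the standard setup where both models see each batch but only the student is updated, a minimal Hinton-style distillation step might look like this. The temperature and alpha values are assumptions, not part of the brief:

```python
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, labels,
                      temperature=4.0, alpha=0.1):
    """Weighted sum of the hard cross-entropy loss and the soft
    teacher-student KL term (scaled by T^2, as is conventional)."""
    hard = tf.keras.losses.sparse_categorical_crossentropy(
        labels, student_logits, from_logits=True)
    soft = tf.keras.losses.kl_divergence(
        tf.nn.softmax(teacher_logits / temperature),
        tf.nn.softmax(student_logits / temperature)) * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft

@tf.function
def distill_step(teacher, student, optimizer, x, y):
    """One training step: the teacher is run in inference mode and
    stays frozen; only the student's weights are updated."""
    teacher_logits = teacher(x, training=False)
    with tf.GradientTape() as tape:
        student_logits = student(x, training=True)
        loss = tf.reduce_mean(
            distillation_loss(teacher_logits, student_logits, y))
    grads = tape.gradient(loss, student.trainable_variables)
    optimizer.apply_gradients(zip(grads, student.trainable_variables))
    return loss
```

Looping `distill_step` over batches of CIFAR-10 with the trained model-A as teacher yields model-B as the student.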

Remove the last few linear layers from the smaller model (model-B) and train it for a few epochs (3-5) on the MNIST dataset. This becomes model-C, trained on MNIST.
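One way to sketch this transfer step is below. The number of layers to remove and the new head size are hypothetical; note also that MNIST images (28x28x1) do not match the CIFAR-trained backbone's input (32x32x3), so some adapter is needed — resizing and repeating the channel is one workable assumption:

```python
import tensorflow as tf

def make_transfer_model(model_b, num_new_classes=10, n_remove=2):
    """Drop the last n_remove layers of model_b and attach a fresh head.
    n_remove and the head width (64) are assumptions; adjust to the
    actual architecture used for model-B."""
    backbone = tf.keras.Model(
        inputs=model_b.input,
        outputs=model_b.layers[-(n_remove + 1)].output)
    backbone.trainable = False  # optional: freeze for pure feature extraction
    x = tf.keras.layers.Dense(64, activation="relu")(backbone.output)
    out = tf.keras.layers.Dense(num_new_classes)(x)  # logits
    return tf.keras.Model(backbone.input, out)

def mnist_to_backbone_shape(x):
    """Adapt MNIST (batch, 28, 28) to the CIFAR-shaped input
    (batch, 32, 32, 3) by resizing and repeating the gray channel."""
    x = tf.image.resize(tf.cast(x, tf.float32)[..., tf.newaxis], (32, 32))
    return tf.repeat(x, 3, axis=-1)
```

Fine-tuning `make_transfer_model(model_b)` for 3-5 epochs on the adapted MNIST images produces model-C.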

Evaluate model-C on the MNIST test set.

Task-2: Transfer learning, then knowledge distillation

Train the bigger model on CIFAR-10 (model-A).

Remove the last few linear layers from it, add new layers in their place, and train the resulting model (model-B) for a few epochs (3-5) on the MNIST dataset.

Train model-B and the smaller model on MNIST simultaneously (knowledge distillation again; the smaller model becomes model-C).

Evaluate model-C on the MNIST test set.

Compare the step-4 results of the two tasks as the final result of the project.

Accuracies are required for model-A, model-B, and model-C (before and after transfer learning), for both Task-1 and Task-2.

A train-test split must also be performed for this.
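Both CIFAR-10 and MNIST already ship with a train/test split in Keras, so in practice this likely means carving a validation set out of the training portion. A minimal sketch (the 10% fraction and fixed seed are assumptions):

```python
import numpy as np

def train_val_split(x, y, val_fraction=0.1, seed=42):
    """Shuffle and split paired arrays into train and validation parts.
    The Keras datasets already provide the held-out test set, so this
    only carves a validation set from the training data."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    n_val = int(len(x) * val_fraction)
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    return (x[train_idx], y[train_idx]), (x[val_idx], y[val_idx])
```

Fixing the seed keeps the split reproducible across the two tasks, which matters when comparing their step-4 results.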

Please also point out anything wrong with this description after reviewing the uploaded file, which contains a brief description.

Python Tensorflow Neural Networks Keras

Project ID: #29957196

About the project

2 bids · Remote project · Active 2 years ago

Awarded to:

vndeee

Hi, I am a Data Scientist (2 years of experience in NLP and CV) and a former competitive programmer. It would be my pleasure to help you. Please have a look at my profile: [login to view URL] [login to view URL] https:// More

(7 reviews)
3.7