Closed

TensorFlow data pipeline improvements

I currently have a Google Colab notebook with a data generator and a model, and I train the model using that generator. Training is currently very slow, and I would like to speed it up, ideally by using the [login to view URL] API or any other approach that works. Let me know if you are interested in helping. Relevant files/documents are attached below.

COLAB: https://colab.research.google.com/drive/1oOnjeCP0VuGJOljlJt9fW7EUCCjcmrNp#scrollTo=Ie8-s2T9I7H1

LINK TO JSON DATA:

[login to view URL]

The JSON data we read is too large to fit into memory all at once, so our preprocessor splits it into these smaller JSON files. I need the dataset to load them progressively rather than all at once.
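
For reference, here is a rough sketch of the kind of pipeline I have in mind. I am assuming the tf.data API here, and the shard file pattern and the "features"/"label" field names are placeholders, not the actual format of the attached data:

import glob
import json

import numpy as np
import tensorflow as tf

# NOTE: the shard pattern and the "features"/"label" keys below are
# assumptions for illustration, not the actual layout of the data.
SHARD_PATTERN = "data/shard_*.json"

def records_from_shard(path):
    # One shard is small enough to parse in memory; yield its records one by one.
    with open(path) as f:
        for record in json.load(f):
            yield (np.asarray(record["features"], dtype=np.float32),
                   np.int64(record["label"]))

def make_dataset(batch_size=32):
    shard_paths = sorted(glob.glob(SHARD_PATTERN))

    def generator():
        # Shards are opened lazily, one at a time, so the full dataset
        # never has to sit in memory at once.
        for path in shard_paths:
            yield from records_from_shard(path)

    ds = tf.data.Dataset.from_generator(
        generator,
        output_signature=(
            tf.TensorSpec(shape=(None,), dtype=tf.float32),  # assumed feature vector
            tf.TensorSpec(shape=(), dtype=tf.int64),         # assumed integer label
        ),
    )
    return (ds
            .shuffle(10_000)                 # shuffle within a buffer, not the whole set
            .batch(batch_size)
            .prefetch(tf.data.AUTOTUNE))     # overlap data loading with training

# model.fit(make_dataset(), epochs=10)

The shuffle/batch/prefetch at the end is where I would expect most of the speedup to come from, but I am open to other approaches.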

Skills: TensorFlow, Python, Keras, Machine Learning (ML), Neural Networks


About the Employer:
( 0 reviews ) Gates Mills Blvd, United States

Project ID: #29911837

2 freelancers are bidding an average of $50/hour for this job
