I have a Python script that scrapes data from roughly 375 pages across 20-25 different websites. It uses Selenium for some of the sites, but not all, and it writes all the data to a single table in a MySQL database on my remote server.
Several of these pages are currently broken and no longer scrape: 9 pages on one website, 5 on another, several API endpoints on a third source, and about 8 or 9 other standalone pages (all very easy-to-scrape sites).
I need somebody to fix these bugs and get the code working again for all scraping sources.