Data scraping jobs
Python web scraping: pull the event listings from roughly 10–15 ticketing platforms and build a cross-platform "which event is listed where" comparison. Input: 1–2 listing URLs per platform (e.g. /events or a category page). Fields to scrape: event name, date and time, city/venue (where available), event URL, platform name. Matching: the same event must be matched across platforms by name + date (+ city/venue). Fuzzy matching should be used to handle name differences (e.g. similar team/event names). Output (MANDATORY): CSV and Excel export. The Excel file must contain 3 separate sheets: Sheet 1: all events + a platform availability matrix; Sheet 2: events listed on only one platform; Sh...
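A minimal sketch of the matching step, using the standard library's difflib in place of a dedicated fuzzy-matching package; the 0.85 threshold and the event names are assumptions to be tuned against real listings:

```python
from difflib import SequenceMatcher

def normalize(name):
    # Lowercase and collapse whitespace so cosmetic differences don't block a match.
    return " ".join(name.lower().split())

def same_event(a, b, threshold=0.85):
    # Events must share the date exactly; names are compared fuzzily to
    # tolerate platform-specific spellings.
    if a["date"] != b["date"]:
        return False
    ratio = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    return ratio >= threshold

e1 = {"name": "Duman Konseri", "date": "2024-05-01"}
e2 = {"name": "DUMAN  konseri", "date": "2024-05-01"}
e3 = {"name": "Duman Konseri", "date": "2024-05-02"}
print(same_event(e1, e2), same_event(e1, e3))  # True False
```

In a real build, a library such as rapidfuzz would replace SequenceMatcher, and city/venue could act as an extra tiebreaker when two different events share a name and date.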
...I don't want to waste time checking these one by one on the MNG Kargo screen. What I need is a lightweight desktop application that submits these codes to the MNG Kargo system in bulk and exports the recipient name, phone number, and full address for each. The application can take a CSV, Excel, or plain text file from me; behind the scenes it should use MNG Kargo's official API if one exists, otherwise a reliable scraping approach. The results must come back in the same format regardless of the source; Excel or JSON output is sufficient for me. For successful delivery: • I prefer a portable .exe or .jar that requires no installation. • At the same time...
We want to build a bot that collects sports odds data from different platforms in real time, compares them, detects arbitrage opportunities, and sends web push notifications. Technologies: APIs, web scraping, data matching, real-time alert systems (Web Push). Stack: Python / Node.js, Playwright/Puppeteer, Docker, Redis, PostgreSQL. Timeline: 1 week. I expect references and a description of your approach from developers experienced in this area.
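For reference, the core arbitrage test is standard: if the sum of the inverse best odds across bookmakers is below 1.0, a guaranteed-margin stake split exists. A sketch with made-up prices:

```python
def arbitrage_margin(odds):
    # Sum of implied probabilities across the best available prices;
    # a total below 1.0 means a guaranteed-margin stake split exists.
    return sum(1.0 / o for o in odds)

def is_arbitrage(odds):
    return arbitrage_margin(odds) < 1.0

# Best home/draw/away prices collected from different bookmakers (hypothetical):
best = [2.10, 3.60, 4.20]
print(round(arbitrage_margin(best), 4))  # 0.9921 -> below 1.0, an arbitrage
print(is_arbitrage(best))                # True
```

The scraping layer (Playwright/Puppeteer) only has to keep this per-fixture odds matrix fresh; the detection itself is this one-line sum.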
I want to commission an automation tool for my WooCommerce website that performs hourly stock tracking for 500 products across 40 different websites. I need software that tracks, every hour, the stock of the products uploaded to my site against other sites I will specify. My site has more than 500 products, each sourced from a different site; the software will monitor stock on those sites, and when a product runs out on a tracked site, the sale on my own site should also ... via scraping and WooComm...
...Regulation Watchdog: Web Scraping for Local Regulations: The system will use web scraping tools to gather data from local government websites regarding new regulations, zoning laws, or changes in building codes. Automatic Alerts: Whenever there is a change in relevant regulations or laws, the system will automatically notify the team and adjust project planning accordingly. 10. Daily Motivation and Reminder System: Daily Messages: The AI will send motivational and inspirational messages to staff at 10:00 AM and 9:00 PM, encouraging productivity and keeping the team engaged. WhatsApp Notifications: All reminders and daily messages will be delivered via WhatsApp to ensure high engagement rates and timely delivery. 11. Reporting and Analytics: ...
...processed and integrated into our system; a website designed to serve in 10 languages; hotel prices automatically published at a 10% discount; a user-friendly, mobile-compatible interface; an SEO-friendly, fast, and reliable infrastructure; an automated system that keeps hotel information regularly updated. Technology preferences: Backend: Python (web scraping libraries such as Scrapy, Selenium, BeautifulSoup), Node.js, or PHP. Frontend: React.js, Vue.js, or Angular. Database: PostgreSQL, MySQL, or MongoDB. Language support: gettext, i18n, or a similar internationalization approach for the multilingual infrastructure. Hosting & deployment: AWS, Google Cloud, or DigitalOcean. Delivery time: [estimated...
We are looking for a Python developer to take our Telegram bot to the next level, adding new features and optimizing the existing setup. The bot currently handles automated posts, giveaways, and affiliate tracking. For version 2 we plan to add new integrations and advanced automation features. Required skills: Python (3+ years of experience); experience with API integrations, web scraping, and automation tools; familiarity with the betting/casino industry is preferred. tg: segujohnson. Remote work and flexible hours available.
...percentage difference filter. The comparison should cover the entire sports schedule (fixture) and refresh every 5 seconds to ensure up-to-date data. This system should function similarly to platforms like Oddsmath and Betburger. Developers can review these platforms to understand the expected functionality. Scope and Key Features 1. Websites to Compare: • (Football odds) • (Football odds) • Betfair Exchange (Football odds) 2. Odds Types to Compare: • Full-time Result (Home - Draw - Away) • Over/Under 2.5 Goals • Over/Under 3.5 Goals • First Half Result (Home - Draw - Away) 3. System Functionality: • Data Refresh: The system should scan and update the odds every 5 seconds for all available football fixtures. • Od...
## Python Web Scraping Project

### Project: Collecting and Analyzing Weather Data

#### Goal:
Collect weather data for a given city and analyze it to extract figures such as average temperature and humidity.

#### Requirements:
- Python
- BeautifulSoup
- Requests
- Pandas
- Matplotlib

#### Steps:
1. **Data collection:**
   - Scrape data from a weather website.
   - For example: ``
2. **Data processing:**
   - Make sense of the scraped data and convert it into a Pandas DataFrame.
   - Clean out unnecessary fields and keep the ones needed for analysis.
3. **Data analysis:**
   - Compute figures such as the daily average temperature and humidity.
   - ...
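Step 3 (analysis) can be sketched with the standard library alone; in the actual project, Pandas would compute this over the scraped DataFrame, and the records below are invented stand-ins for rows parsed out of the weather page:

```python
from statistics import mean

# Hypothetical records, standing in for rows the scraping step (Requests +
# BeautifulSoup) would produce before loading into a DataFrame.
records = [
    {"day": "Mon", "temp_c": 21.5, "humidity": 60},
    {"day": "Tue", "temp_c": 23.0, "humidity": 55},
    {"day": "Wed", "temp_c": 19.5, "humidity": 70},
]

avg_temp = mean(r["temp_c"] for r in records)
avg_humidity = mean(r["humidity"] for r in records)
print(f"average temperature: {avg_temp:.1f} C")   # average temperature: 21.3 C
print(f"average humidity: {avg_humidity:.1f} %")  # average humidity: 61.7 %
```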
I have an existing scraping project built with React and Node.js, but I couldn't get it working quite the way I want. It performs click and navigation actions on a specific site, then downloads a PDF through a proxy. I need help getting it fully working.
...But then the app got an update. Newly created accounts become unusable after the first login. We need to make the cookie last one year. The program I have covers all of this, but it gets stuck on the captcha. There are a few issues of this sort; for instance the headers are refreshed every day in the early morning. With zero technical knowledge, this is as much detail as I can give. I'm looking for someone experienced in Python, APIs, web (Android) scraping, Charles, and so on. Please only reply if you genuinely trust your skills; a few people have tried, failed, and given up. People who built this program are using it right now, so it can be done; a solid, skilled developer will handle it. IN SUMMARY: it will open accounts on a platform. After that...
Hello, we need a program that collects data from certain websites. We want to export the collected data to Excel and save it in a suitable format. The websites are booking-related and contain many options.
Hello, I need expert help with web scraping.
I need to scrape product title and price information from mobile applications, crawling prices as they change. Example apps: Getir, Cepteşok, A101 Kapıda.
Web scraping of a sample website, with prices written to a MySQL table.
Paribu, screen scraping requirements. Relevant CCYs: BTC, ETH, XRP, LTC, BCH, XLM, USDT, EOS. 1) Log in to the site () (2-factor authentication). 2) From the “Market” tab, read and transmit the “Pending Sell Orders” (Sell Order Book) for the relevant CCYs, at a frequency we will specify. 3) From the “Market” tab, read and transmit the “Pending Buy Orders” (Buy Order Book) for the relevant CCYs, at a frequency we will specify. 4) From the “Market” tab, on a command we send, place a Buy Order for the relevant CCY with the given quantity and price. 5) From the “Market” tab, on a command we send, for the relevant CCY, with quantity and price...
Hello, I have a two-stage web scraping project. I want you to build a scraper that pulls price information every week (sometimes twice a week) for roughly 40,000 and 15,000 products from two sites. I have the manufacturer part numbers. 1. First, using these manufacturer part numbers, you will find the ASINs on both sites. 2. Some manufacturer part numbers will return more than one product and ASIN. At first there will be an interface where the correct ones are selected manually; later you will put this through a learning process so the program makes the selection itself. 3. Once the ASINs are collected, they will be scanned every week and the product prices retrieved. 4. However, two prices will be collected: Price1: the price shown when the seller is Amazon (if Amazon doesn't sell a product
Web URL scraping (URL link collector). Hello, the project is simple: like Googlebot, after a domain name is entered it goes to the landing (default) page, collects the links, extracts all of the site's links (sitemap), and writes them to a MySQL database. The code will be written in open-source PHP and MySQL. It will collect a maximum of 50,000 links.
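For illustration only (the brief itself asks for PHP/MySQL), the per-page link-collection step might look like this in Python with the standard library; the example.com URLs are placeholders:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects same-domain links from one fetched page, capped at a limit;
    crawling page by page from the default page yields the sitemap."""
    def __init__(self, base_url, limit=50000):
        super().__init__()
        self.base = base_url
        self.domain = urlparse(base_url).netloc
        self.limit = limit
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a" or len(self.links) >= self.limit:
            return
        href = dict(attrs).get("href")
        if href:
            absolute = urljoin(self.base, href)
            if urlparse(absolute).netloc == self.domain:
                self.links.add(absolute)

page = '<a href="/about">About</a> <a href="https://other.example/x">Ext</a> <a href="/contact">C</a>'
c = LinkCollector("https://example.com/")
c.feed(page)
print(sorted(c.links))  # ['https://example.com/about', 'https://example.com/contact']
```

The PHP version would do the same with DOMDocument and an INSERT per discovered link, deduplicating on a unique URL column.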
Hello friends, I will be working on the details of the venues listed in the wedding-venues category on dugun.com... I need to finish the job within 2–3 days. Please bid per page, and add “tamamını okudum” (“I read it all”) at the end of your proposal. Hi guys, I need to get data into Excel from dugun.com. There are 1,400 wedding venue pages on dugun.com. I just need their text content (title, address, about, etc.). I attached a sample Excel doc. Please give me an offer per page. I want to split this task up so it finishes quickly. If you have a team and can commit to finishing in 3 days, you can bid for all 1,400 pages. You can do this task manually or with data scraping...
Exporting data from a web and desktop program to an Excel file. Data scraping from a Flash website or program.
I need a custom-built agent that searches real estate websites for homes that meet specific criteria and alerts me. Key requirements: - Search real estate websites AND -MLS - Filter homes based on price reductions and a price per square foot of 110 and under - Additional criteria to be discussed Ideal Skills and Experience: - Experience with web scraping or API integration - Familiarity with real estate data - Ability to build customizable alert systems Please include a detailed project proposal in your application.
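The price-per-square-foot filter could be sketched as follows; the listing field names are assumptions about whatever the scrape or API returns:

```python
def qualifies(listing, max_ppsf=110.0):
    # A listing qualifies when its price has been reduced AND its price per
    # square foot is at or below the threshold.
    ppsf = listing["price"] / listing["sqft"]
    return listing["price_reduced"] and ppsf <= max_ppsf

hit = {"price": 220_000, "sqft": 2_100, "price_reduced": True}   # ~104.8/sqft
miss = {"price": 300_000, "sqft": 2_000, "price_reduced": True}  # 150/sqft
print(qualifies(hit), qualifies(miss))  # True False
```

The "additional criteria to be discussed" would slot in as further boolean conditions inside the same predicate, which the alerting loop then applies to each new listing.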
I need a reliable researcher who can compile a database of 500 Indian-based YouTube channels, centred on the “Data entry” niche, and capture a full-page screenshot of each channel’s “About” tab that clearly shows the Promotion/Business email address. Scope of work • Data collection and research: locate active Indian YouTubers whose content relates to data entry (tutorials, tips, freelancing, software demos, etc.). • Contact details: extract the publicly listed Promotion/Business email visible in the About section. • Screenshot capture: take a clear PNG or JPG screenshot of the About page for each channel so the email is readable. • Organise results: create a spreadsheet (Google Sheets or Excel) with Channel Nam...
Nationwide Property Auction Web Scraping & Intelligent Alert System (Ongoing) About Us We're a commercial real estate investment firm that acquires distressed properties nationwide. We have the capital to close on any deal in the U.S. — our bottleneck is finding opportunities before competitors. We're building an automated system that monitors every property auction source in the country, filters against our criteria, and alerts us only on qualified deals. This is not a data dump project. We don't want spreadsheets with thousands of rows. We want a smart radar system that scans everything, filters ruthlessly, and only pings us when something m...
...phrase) which asks me what I want and where to get it from. Example: I want the agent to monitor the website of the Royal Mint, the Perth Mint, and APMEX and alert me for new one-ounce silver coins. - there should be another routine to fine-tune this - eg give exclusions (do not give me Niue coloured coins, or anything that costs more than $150) - be able to set up a cron job with different scraping periods per site - persistent database (SQLite) with a) what needs to be scraped, b) what we are looking for, and c) results of previous runs - daily routine to email a report to a given email address, and also the ability to ask for that report and get it generated live in OpenClaw webchat There should be a "main" agent to manage the orchestration, and a sub-agent for ...
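The persistent-store requirement above (what to scrape, what to look for, results of previous runs, per-site periods) could be sketched as a small SQLite schema; all table and column names here are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the real agent would use a file path
conn.executescript("""
CREATE TABLE sites (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    url TEXT NOT NULL,
    scrape_period_minutes INTEGER NOT NULL   -- per-site cron period
);
CREATE TABLE watch_rules (
    id INTEGER PRIMARY KEY,
    description TEXT NOT NULL,               -- e.g. "new one-ounce silver coins"
    max_price REAL,                          -- NULL = no price cap
    exclude_pattern TEXT                     -- e.g. "Niue coloured"
);
CREATE TABLE results (
    id INTEGER PRIMARY KEY,
    site_id INTEGER REFERENCES sites(id),
    rule_id INTEGER REFERENCES watch_rules(id),
    item_title TEXT,
    price REAL,
    seen_at TEXT DEFAULT CURRENT_TIMESTAMP
);
""")
conn.execute("INSERT INTO sites (name, url, scrape_period_minutes) VALUES (?, ?, ?)",
             ("Perth Mint", "https://example.com", 60))
row = conn.execute("SELECT name, scrape_period_minutes FROM sites").fetchone()
print(row)  # ('Perth Mint', 60)
```

The daily report routine then becomes a join over `results` filtered by `seen_at`, which the same query can serve both for the email and for the live webchat request.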
...for businesses in hospitality, food production, bakeries, and retail. Our ideal customer is a business doing consistent, high volume on core packaging lines (e.g. burger boxes, bags, coffee cups, greaseproof paper, snack boxes) who is currently overpaying through a local distributor. What We Need Built A fully automated agent pipeline with the following stages: Stage 1 — Lead Scraping Integrate with a lead scraping tool (e.g. , Outscraper, or ) via API Pull business name, website URL, phone number and email address Target industries: cafes, restaurants, bakeries, fast food, food production, hospitality Target location: Australia (initially Perth, WA) Output: structured list of leads with contact details Stage 2 — Website & Menu Analysis Agent visits e...
Florida Judiciary Web Scraper — Config-Driven, Resilient Architecture I need a Python-based web scraping application to collect judge data from all 20 Florida judicial circuits and output it to a standardized CSV. The tool must be built for long-term maintainability — when a circuit website changes layout, only minimal configuration updates should be needed, not code rewrites. Background: Florida has 20 circuits covering 67 counties. Each circuit publishes judge data differently: some offer Excel/CSV downloads, others publish HTML pages and subpages with varying structures. The master data source is: Required Output Fields: (CSV)ID, Type, Name, Lastname, Assistant, Phone, Location, Street, City, State, Zip, County, Circuit,
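The config-driven idea might look like this: each circuit gets a declarative entry (source type plus field mapping), so a layout change means editing configuration rather than rewriting code. The circuit keys and field names below are assumptions, not the actual site schemas:

```python
# Declarative per-circuit configuration: a layout change on one circuit's
# site should only require editing its entry here.
CIRCUIT_CONFIG = {
    "circuit_1": {
        "source": "csv",
        "field_map": {"Judge Name": "Name", "Phone #": "Phone"},
    },
    "circuit_9": {
        "source": "html",
        "field_map": {"name": "Name", "telephone": "Phone"},
    },
}

def normalize_record(raw, circuit):
    # Translate a circuit-specific record into the standardized CSV schema,
    # dropping fields the output does not need.
    field_map = CIRCUIT_CONFIG[circuit]["field_map"]
    return {out: raw.get(src, "") for src, out in field_map.items()}

raw = {"Judge Name": "Jane Doe", "Phone #": "555-0100", "extra": "ignored"}
print(normalize_record(raw, "circuit_1"))  # {'Name': 'Jane Doe', 'Phone': '555-0100'}
```

A per-circuit `source` dispatcher (CSV download vs. HTML page) plus this mapping layer covers most of the resilience requirement; only genuinely new page structures need code.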
Project Title: Automated Sales Territory & Route Optimization System (Airtable + Make + OpenRouteservice + Softr/Glide) Project Description: We are looking for an experienced automation specialist to build a lightweight, no‑code system that processes open government construction permit data (CSV format, 100–200 records per month) and generates optimized daily visit routes for a small sales team (2–3 reps). The system will import CSV files that are externally downloaded and provided (e.g., placed in a cloud folder or manually uploaded). It will then filter recent permits (rolling 3–4 month window), apply target profile rules (project type, value, contractor type, exclusions), deduplicate entries, geocode addresses, group them into geographic regions using a fix...
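The rolling-window filter and deduplication stages could be sketched as follows; the field names ("issued", "permit_id") are assumptions about the permit CSV schema, and the real build would do this inside Make/Airtable rather than Python:

```python
from datetime import date, timedelta

def recent_unique(permits, months=4):
    # Keep permits inside a rolling window and drop duplicate permit ids,
    # preserving first-seen order.
    cutoff = date.today() - timedelta(days=months * 30)
    seen, out = set(), []
    for p in permits:
        if p["issued"] >= cutoff and p["permit_id"] not in seen:
            seen.add(p["permit_id"])
            out.append(p)
    return out

today = date.today()
permits = [
    {"permit_id": "A1", "issued": today - timedelta(days=10)},
    {"permit_id": "A1", "issued": today - timedelta(days=12)},   # duplicate
    {"permit_id": "B2", "issued": today - timedelta(days=400)},  # outside window
]
print([p["permit_id"] for p in recent_unique(permits)])  # ['A1']
```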
I need a quick-turnaround data scrape focused exclusively on retail businesses. Your task is to locate and compile accurate, up-to-date contact information—specifically email addresses, phone numbers, and physical addresses—for a sizable sample of companies that fit the retail profile I will provide once we start. To keep the work clean and usable, please record each company on its own row in a Google Sheet or Excel file and label the columns clearly. I will spot-check a portion of the entries, so only verified details count toward completion. Deliverables • Spreadsheet (.xlsx or Google Sheet) containing: company name, website, email address, phone number, full street address, city, state/province, and country • Source URL for every data point so...
I have roughly 5,000 DEF 14A proxy statements in HTML format and I need the key compensation details for each named executive pulled out and placed into a clean, structured file. The fields I must end up with are: base salary, stock options and awards, bonuses / incentive pay, plus any other compensation figures that appear in the summary or grants tables. Because the data are scattered in both narrative text blocks and embedded HTML tables, a purely scripted scrape misses too much, while a purely manual effort would be too slow. I’m therefore looking for a balanced workflow that blends solid Python-based parsing (BeautifulSoup, pandas, regex, maybe an LLM call for tricky passages) with targeted human review to catch formatting quirks and footnotes. Deliverables • A s...
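The scripted half of the proposed hybrid workflow largely reduces to pulling rows out of HTML tables; a stdlib-only sketch (BeautifulSoup/pandas would be sturdier on real DEF 14A filings, and the row below is invented):

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Pulls rows of cell text out of an HTML table."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._cell, self._in_cell = [], [], [], False

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self._in_cell, self._cell = True, []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
            self._row.append("".join(self._cell).strip())
        elif tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = []

    def handle_data(self, data):
        if self._in_cell:
            self._cell.append(data)

# An invented fragment shaped like a summary compensation table:
sample = ("<table><tr><th>Name</th><th>Salary</th><th>Bonus</th></tr>"
          "<tr><td>J. Smith</td><td>$1,000,000</td><td>$250,000</td></tr></table>")
p = TableExtractor()
p.feed(sample)
print(p.rows[1])  # ['J. Smith', '$1,000,000', '$250,000']
```

Rows the extractor cannot classify (footnoted figures, narrative-only amounts) are exactly the ones to route to the LLM pass or the human-review queue.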
...rows. A short “read-me” tab or text file that explains any data cleaning or assumptions you had to make will also be appreciated. Because this is a one-off job, efficient turnaround is important to me. Please outline: 1. the approach and tools you will use (e.g., Python, Scrapy, Selenium, BeautifulSoup, Playwright, etc.) while respecting the site’s pagination and anti-bot measures; 2. the estimated time you need from award to delivery; 3. a realistic fixed price for the full scrape, including any post-processing needed to ensure clean, accurate data. If you can optionally supply the scraping script as part of the hand-off, note that in your proposal—it’s a plus but not mandatory. I will review submissions mainly on data...
Web scraping using Python, plus API extraction. Details to be discussed.
...Scope of work – Research and compile a list of relevant businesses that match our ideal client profile – Capture accurate contact details (email, Instagram, LinkedIn and/or direct phone) for key decision-makers – Enter each prospect into a shared Google Sheet or our HubSpot CRM, tagged and de-duplicated – Validate that every lead is genuinely qualified for social-media support (no mass scraping or irrelevant entries) Acceptance criteria • 100% of leads fit the target market and include at least one direct contact channel • Spreadsheet/CRM is clean, consistently formatted and ready for outreach • First batch delivered ASAP, with clear notes on research sources When you reply, please attach past work that shows you have success...
...Developer with API Integration & Data Scraping Expertise We are seeking an experienced SaaS developer to create production-grade, error-free software solutions. The ideal candidate will have a proven track record in building robust applications with seamless API integrations, efficient data scraping and Paid Ads APIs (Meta, Google, Snapchat, TikTok) capabilities. Key Requirements: - Experience with Apify scrapers and API integration - Proficiency with Lovable app for SaaS development - Strong experience with Meta, Google, TikTok, and Snapchat Ads APIs - Ability to integrate with Vibe coded SaaS platforms - Full-stack web development skills - Experience with Python, JavaScript, PHP - API design and integration (REST, GraphQL, OAuth, webhooks) - Data...
I’m a Hindustan Unilever distributor who currently exports data from two separate LeverEdge (Retail and Wholesale) instances and then uploads it manually into Zoho Books. I’d like that entire flow replaced by a Python-driven, Zoho RPA-assisted bot that runs end-to-end without human touch. Here is the scope you’ll be taking over: • Dual login sequence – the bot must authenticate into each LeverEdge account one after the other, pull the day’s XML/Excel dumps, and keep the two data sets isolated for reporting. • Nine functional modules must stay in perfect sync: Sales Register, Collection, Product & Party Masters, Purchases, Purchase Returns, Credit Notes, Debit Notes, Tax Analysis, and HUL-specific Scheme calculations. • ...
We are looking for a highly skilled AI and automation developer with experience in sports betting platforms and automated systems to help build and maintain an advanced betting automation solution. The ideal candidate should have experience working with sportsbook platforms, data scraping, AI prediction models, and automation tools to create a reliable system capable of processing large volumes of betting data and executing automated workflows. This is a freelance remote project with the potential for long-term collaboration.
...design and content creation (blog and email) (primary focus) - Experience with Figma and Adobe Illustrator is a bonus (but not essential) - Strong proficiency in using generative AI for content, video, graphics and vibe coding - Strong written English for various tasks Secondary: - Automation experience in n8n, sumopod, APIs etc (also not essential but a bonus) - Ability to perform web scraping tasks efficiently, email marketing and excellent English (not just AI) Ideal Skills and Experience: - Previous experience as a virtual assistant with a heavy marketing focus or in a mid-weight marketing or more senior role - Strong organizational and multitasking abilities - Creative flair for design and presentation - Excellent English communication skills Ideally looking for ...
...layout changes Acceptance criteria 1. When I start the script and open the provided workbook I see new rows appear automatically as bids are placed. 2. Each row contains the exact bid price and bidder name shown on the site, with no missing or duplicate entries. 3. The solution runs for at least three hours straight without manual intervention. If you have prior experience with live-data scraping or Excel streaming, that will help, but clear, maintainable code is my top priority....
I need a bot written in Python. Let's discuss the details.
...based on data scraped from a supplier website. The system should scrape products, stock availability, prices, images and product details from the supplier website, then publish them on my own website under my own branding, with my own higher prices. Main goal: scrape supplier products automatically sync stock and prices regularly apply my own markup rules display products on my own website accept customer orders create reorder workflow for supplier purchases Important: I do NOT need a copy of another company’s branding, logo, or copyrighted design. I need a custom-built website with similar functionality, powered by scraping-based product and stock synchronization. Required features: automatic scraping of products from supplier website scraping...
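The "apply my own markup rules" step could be as simple as a tiered multiplier table applied during each sync; the tier bounds and multipliers below are hypothetical:

```python
def apply_markup(supplier_price, rules):
    # rules: ordered (upper_price_bound, multiplier) tiers; the first tier
    # whose bound covers the supplier price wins.
    for bound, multiplier in rules:
        if supplier_price <= bound:
            return round(supplier_price * multiplier, 2)
    raise ValueError("no tier covers this price")

# Hypothetical tiers: cheap items get 40% markup, mid-range 25%, the rest 15%.
RULES = [(50.0, 1.40), (200.0, 1.25), (float("inf"), 1.15)]
print(apply_markup(30.0, RULES), apply_markup(150.0, RULES))  # 42.0 187.5
```

Keeping the tiers as data (a table or config file) lets the shop owner adjust pricing without touching the sync code.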
...and automation**. We are specifically looking for professionals who are experienced in **modern AI-driven marketing workflows** rather than traditional manual outreach methods. **Important Requirement – AI-Based Approach Only** This project must rely heavily on **AI tools, automation platforms, and scalable lead-generation systems**. Freelancers who rely primarily on **manual research, manual data entry, or manual outreach without automation** will not be considered. We expect the freelancer to propose and implement **automated, scalable marketing systems** using AI tools wherever possible. --- **Scope of Work** The freelancer will implement at least **three or four of the following marketing strategies using AI-powered tools**: **1. AI Directory & Startup Platf...
I’m putting together a master spreadsheet of Australian businesses and need an organised, accurate data-gatherer to make it happen. The first priority is every golf club in Australia; once those are complete, we’ll move straight on to other business categories so the file ultimately covers both golf clubs and a broad cross-section of Australian enterprises. What I need from you • Source publicly available information online—whether via targeted web scraping, reputable directories, or manual research—and capture each business’s name and primary phone number. • Include suburb, state and website where they’re easy to obtain so the list is genuinely useful, but the phone number is mandatory. • Deliver the results in a clea...
No scraping, as it is not able to get this data. I need a current, easy-to-filter spreadsheet that captures every business operating in the five south-eastern Sydney suburbs of Malabar, Matraville, Little Bay, La Perouse and Maroubra (search in this order). For each entry, please research reliable public sources (official websites, Google Maps, directories, local chamber listings, etc.) and supply: • Business name • Full street address • Suburb name • Category • Email address • Link to contact form on website • Website • Facebook page link • Instagram page link • A short description of the main service or product offered • Source of how you found that info. Make sure you don't miss any shops, cafes, restaurants, retail, etc. Accuracy matters...
...Monthly Bulk Member Adding Script (Telethon / Pyrogram) Description I am looking for an experienced Python developer who has previous experience working with Telegram automation using Telethon or Pyrogram. This is a recurring monthly task. I already have a list of Telegram usernames stored in a .txt file. The users are already filtered based on my requirements, so no additional filtering or scraping is needed. Your only responsibility will be adding these users to my Telegram group using a safe and reliable script. Task Details • I will send a .txt file containing around 5,000 Telegram usernames. • The usernames are already prepared and filtered. • Your task is only to add these users to my Telegram group using a script. • No group management, no moderat...
I need a robust, real-time scraping tool that monitors an accounting / legal / financial-services website and records every visitor’s key details the moment they arrive. The script must capture: • IP address • Name, phone and email • The exact Google or other search engine search keyword that led the user to the site • The specific page URL they land on and continue to browse • Any sign-up or form-fill actions performed • Geolocation (city/region/PIN Code) and device type Once each record is assembled it has to flow through my own verification endpoint first; if my custom API flags the data as unverified, the workflow should automatically fall back on a designated paid API to complete validation. All calls need to log their stat...
I need a Python-based solution that automatically gathers companies and shareholders data, pulls supplementary details via external APIs, and outputs a clean, unified dataset I can query at any time. Scope of the scrape • Sources: company websites, financial databases and relevant public records. • Website focus: company profiles, turnover figures and any available Demat / share-holding particulars. What the tool should do 1. Crawl or call the above sources, respecting and rate limits. 2. Parse the required fields, normalise names and IDs, then enrich each record through one or more APIs (for example OpenCorporates, Clearbit or any better suggestion you have). 3. Store results in a structured format (CSV plus an SQLite or Postgres option). 4. Offer a simpl...
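The normalise-and-store steps (points 2 and 3) might be sketched with SQLite upserts keyed on a normalized company name; the normalization rule, schema, and figures below are assumptions:

```python
import sqlite3

def normalize_company(name):
    # Crude normalization so "ACME Ltd." and "Acme Ltd" merge into one record;
    # a real pipeline would use a richer rule set or an entity-resolution step.
    return " ".join(name.upper().replace(".", "").split())

conn = sqlite3.connect(":memory:")  # swap for a file path or Postgres DSN
conn.execute("""CREATE TABLE companies (
    norm_name TEXT PRIMARY KEY, display_name TEXT, turnover REAL)""")

# Two scraped variants of the same company: the second refreshes the first.
for name, turnover in [("ACME Ltd.", 1_200_000), ("Acme Ltd", 1_250_000)]:
    conn.execute(
        "INSERT INTO companies VALUES (?, ?, ?) "
        "ON CONFLICT(norm_name) DO UPDATE SET turnover = excluded.turnover",
        (normalize_company(name), name, turnover),
    )

rows = conn.execute("SELECT norm_name, turnover FROM companies").fetchall()
print(rows)  # [('ACME LTD', 1250000.0)]
```

The enrichment APIs (OpenCorporates, Clearbit, or similar) would hang off the same primary key, so each external call updates one canonical record instead of creating duplicates.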
...clean, ready-to-use list of prospects pulled directly from Google Maps. Please capture each company’s business name, physical address, phone number, and any email you can locate inside the listing so I can feed the data straight into my sales pipeline. Speed matters: ideally the first batch should land in my inbox as soon as possible, with the full file delivered shortly after. To keep the workflow smooth, deliver the results in a single Excel or CSV sheet, one row per record, free of duplicates and obvious errors. If you already have a proven method or custom tool for bulk scraping Google Maps without hitting quota limits, let me know. Accuracy and freshness of contact details will be the key success metric. Once you confirm you can hit these requirements, I...