
Completed
Posted:
Paid on delivery
I'm midway through building a professional horse-racing prediction and betting intelligence platform and need an experienced backend/data engineer for an initial 2–3 week sprint. The immediate goal is to get the prediction engine running on real race data as quickly as possible.

What you'll tackle
• Design a clean PostgreSQL schema to store race entries, official results and derived running performance data.
• Build a flexible ingestion pipeline that accepts JSON and CSV from public race results, manual uploads and future licensed data providers.
• Ensure ingestion is source-agnostic, de-duplicates records and timestamps all loads.
• Persist raw race data and derived basic performance metrics (finishes, times, odds, running lines) so they are query-ready for modelling.
• Prepare structured feature tables that a separate prediction engine will consume.
• Expose a lightweight internal REST endpoint (Python preferred, Node acceptable) that returns prediction-ready race data. Public API delivery will later run via Cloudflare Workers.

Acceptance criteria
• New data files dropped into a storage bucket are automatically ingested, validated and queryable in PostgreSQL within 5 minutes.
• Horse-level performance views correctly reflect finishes, odds, speed proxies and jockey/trainer stats when spot-checked.
• Prediction data endpoint responds in <200ms using precomputed data.
• Clear README covering the schema, ingestion flow and commands to rebuild the system from scratch.

Notes
The prediction model logic and weighting will be provided separately. The primary objective is rapid deployment of a clean, scalable data foundation for the prediction engine. Successful delivery will lead to significant ongoing work as we scale into a full betting intelligence platform.
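The de-duplication and timestamping requirements above can be sketched in plain Python before any database work begins. This is a minimal illustration, not a delivered design: the field names `race_id` and `horse` and the natural-key choice are assumptions, since the brief does not fix a schema.

```python
import csv
import io
import json
from datetime import datetime, timezone


def normalize_records(payload: bytes, fmt: str) -> list[dict]:
    """Parse a JSON or CSV payload into a list of row dicts (source-agnostic)."""
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload.decode("utf-8"))))
    raise ValueError(f"unsupported format: {fmt}")


def dedupe(records: list[dict], key_fields=("race_id", "horse")) -> list[dict]:
    """Drop rows that repeat the (assumed) natural key; stamp every kept row
    with the UTC load time so each load is traceable."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    seen, out = set(), []
    for rec in records:
        key = tuple(rec.get(f) for f in key_fields)
        if key in seen:
            continue
        seen.add(key)
        out.append({**rec, "loaded_at": loaded_at})
    return out
```

In a real pipeline the same key would back a unique index in PostgreSQL, with `ON CONFLICT DO NOTHING` as the second line of defence.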
Project ID: 40243859
132 bids
Remote project
Last activity: 2 months ago

Hello, After reviewing your project description, I feel confident and excited to take on this project. I do have a few important questions to clarify first. Please leave a message in chat so we can discuss, and I can share recent work similar to your requirements. Thanks for your time! I look forward to hearing from you soon. Best Regards.
$700 CAD in 7 days
7.9
132 freelancers are bidding an average of $507 CAD for this project

⭐⭐⭐⭐⭐ We at CnELIndia, led by Raman Ladhani, can help rapidly deploy a robust backend for your horse-racing prediction platform. We will design a clean PostgreSQL schema capturing race entries, results, and derived performance metrics, ensuring query-ready feature tables for your prediction engine. Our team will implement a source-agnostic ingestion pipeline handling JSON and CSV from multiple sources, with automatic de-duplication, timestamping, and validation so new files are queryable within 5 minutes. Raw and derived data will be persisted efficiently, enabling fast REST endpoints (<200ms) for prediction-ready data. We will provide comprehensive documentation covering the schema, ingestion flow, and rebuild commands. Leveraging our expertise in Python, PostgreSQL, data modeling, and REST API design, we can deliver rapidly and at scale, setting a strong foundation for your prediction engine and future platform expansion.
$500 CAD in 7 days
8.3

⭐⭐⭐⭐⭐ Build a Robust Backend for Your Horse Racing Prediction Platform ❇️ Hi My Friend, I hope you're doing well. I've reviewed your project details and see you are looking for a backend/data engineer. You don't need to look any further; Zohaib is here to help you! My team has completed over 50 similar projects for backend development. I will create a clean PostgreSQL schema, build a flexible ingestion pipeline, and ensure your prediction engine runs smoothly with real race data. ➡️ Why Me? I have 5 years of experience in backend development and data engineering, focusing on database design, data ingestion, and API development. My expertise includes PostgreSQL, Python, and building scalable systems. I also have a strong grip on data validation and performance optimization, ensuring a solid foundation for your prediction engine. ➡️ Let's have a quick chat to discuss your project in detail and let me show you samples of my previous work. I look forward to discussing this with you in our chat. ➡️ Skills & Experience: ✅ PostgreSQL Design ✅ Data Ingestion ✅ API Development ✅ Python Programming ✅ JSON and CSV Handling ✅ Data Validation ✅ Performance Metrics ✅ RESTful Services ✅ Cloudflare Workers ✅ Data Modeling ✅ Data De-duplication ✅ System Documentation Waiting for your response! Best Regards, Zohaib
$350 CAD in 2 days
8.0

Hi there, I am excited to present my expertise for your horse racing data backend project. As a top California freelancer with extensive experience in data engineering, I have successfully completed numerous tasks, earning 5-star reviews on this platform. I understand the urgency of getting your prediction engine operational with real race data quickly, and I'm confident in my ability to design an optimal PostgreSQL schema that meets your needs. I will create a flexible ingestion pipeline to ensure seamless data flow from various sources, be it JSON, CSV, or manual uploads. My approach will ensure that data is de-duplicated, timestamped, and ready for querying within your specified time frame. With a focus on scalability and speed, I will also develop an efficient internal REST endpoint to serve prediction-ready data to your prediction engine. I would love to discuss your project in detail and get started right away. What specific data sources do you envision for the ingestion pipeline, and are there any particular constraints on the data formats? Thanks,
$610 CAD in 12 days
6.4

Hi, I'm a backend/data engineer specializing in high-performance ingestion pipelines and sports analytics infrastructure. I'll deliver a production-ready PostgreSQL foundation and REST API to power your prediction engine in 2–3 weeks.

What You'll Get:
✅ PostgreSQL schema: Race entries, results, horse/jockey/trainer profiles, performance metrics, feature tables optimized for ML queries
✅ Automated ingestion pipeline: JSON/CSV from S3/GCS buckets → validation → deduplication → timestamped loads
✅ Source-agnostic ETL: Handles public results, manual uploads, future licensed feeds with unified transform layer
✅ Performance views: Finishes, odds, speed proxies, jockey/trainer stats, precomputed & indexed
✅ REST API (Python/FastAPI): Sub-200ms prediction-ready race data endpoint
✅ Event-driven triggers: New file → auto-ingest → queryable in <5 min (Lambda/Cloud Functions)
✅ Complete documentation: Schema diagrams, ingestion flow, rebuild scripts, sample queries

Stack: PostgreSQL, Python (Pandas/SQLAlchemy), FastAPI, Docker, AWS/GCP storage triggers
Timeline: 2–3 weeks
Experience: 7+ years data engineering, sports betting platforms, real-time ETL.

Let's build your winning edge. Thanks!
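The "performance views" this bid lists (finishes, odds, per-horse stats) reduce to a small aggregation step. A minimal sketch in Python, assuming illustrative column names (`horse`, `finish`, `odds`) rather than any agreed schema:

```python
from collections import defaultdict
from statistics import mean


def horse_performance(results: list[dict]) -> dict:
    """Aggregate raw result rows into per-horse stats: starts, wins,
    average finish position, and average odds."""
    by_horse = defaultdict(list)
    for row in results:
        by_horse[row["horse"]].append(row)

    views = {}
    for horse, rows in by_horse.items():
        finishes = [r["finish"] for r in rows]
        views[horse] = {
            "starts": len(rows),
            "wins": sum(1 for f in finishes if f == 1),
            "avg_finish": round(mean(finishes), 2),
            "avg_odds": round(mean(r["odds"] for r in rows), 2),
        }
    return views
```

In production the same aggregation would live in an indexed (possibly materialized) PostgreSQL view, so the API only ever reads precomputed rows.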
$750 CAD in 7 days
6.6

Hi, As per my understanding: you need a robust PostgreSQL foundation and a source-agnostic ingestion pipeline to feed your horse-racing prediction engine within a tight sprint.

My approach:
* Architecting normalized schemas for race entries and performance metrics.
* Building a Python-based automated validator for JSON/CSV storage triggers.
* Optimizing materialized views to ensure <200ms API response times.

Are we using AWS S3 for the storage buckets? Also, should the de-duplication logic prioritize manual uploads over public data? I will share my portfolio in my first message, and I am confident about delivery after an initial meeting. Best regards,
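The bidder's question about prioritizing manual uploads over public data can be made concrete with a small sketch. The source names and priority ranking below are hypothetical, purely to show the mechanism:

```python
# Hypothetical source trust ranking: higher rank wins when the same
# (race_id, horse) record arrives from more than one source.
SOURCE_PRIORITY = {"manual_upload": 2, "licensed_feed": 1, "public_results": 0}


def resolve_duplicates(rows: list[dict]) -> list[dict]:
    """Keep one row per (race_id, horse), preferring the most trusted source.
    Unknown sources rank below all known ones."""
    best = {}
    for row in rows:
        key = (row["race_id"], row["horse"])
        rank = SOURCE_PRIORITY.get(row["source"], -1)
        current = best.get(key)
        if current is None or rank > SOURCE_PRIORITY.get(current["source"], -1):
            best[key] = row
    return list(best.values())
```

Whether manual uploads should actually outrank licensed feeds is a product decision; the table makes that policy explicit and easy to change.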
$650 CAD in 18 days
6.7

Hello! I can build a clean, scalable data foundation so your prediction engine runs on real race data quickly and reliably. I have strong experience designing PostgreSQL schemas, ingestion pipelines, and feature-ready datasets for prediction systems. I'll create a normalized schema that stores raw race entries, results, and derived performance metrics, while keeping ingestion source-agnostic, de-duplicated, and fully timestamped.

My approach:
• PostgreSQL schema optimized for fast querying and feature generation
• Automated ingestion pipeline (Python preferred) supporting JSON, CSV, and storage bucket triggers
• Raw data preservation plus structured feature tables for modelling
• Derived performance views (finishes, odds, speed proxies, trainer/jockey stats)
• Lightweight REST endpoint delivering prediction-ready data in <200ms using precomputed tables

Tech stack: Python (FastAPI), PostgreSQL, background workers, and cloud storage triggers. This ensures new data is ingested, validated, and queryable within minutes. You'll receive clean code, rebuild instructions, and a clear README explaining the schema, ingestion, and deployment so the system is easy to maintain and scale.

Quick question: which cloud environment are you using for storage and deployment (AWS S3, GCP, or other)?
$500 CAD in 1 day
6.3

Hello, {{{ I HAVE CREATED SIMILAR APPS BEFORE AND I CAN SHOW YOU }}} I am a backend/data engineer with 10+ years of experience working on data-intensive platforms, PostgreSQL schema design, ingestion pipelines, and performance-critical APIs, including analytics and prediction-ready systems.

I can support your 2–3 week sprint by delivering:
• Clean, normalized PostgreSQL schema for races, results, raw data, and derived performance metrics
• Source-agnostic ingestion pipeline (CSV/JSON) with de-duplication, validation, and timestamped loads
• Automated ingestion from storage buckets with sub-5-minute availability in the database
• Persisted raw + derived tables optimized for modelling and fast queries
• Precomputed feature tables for prediction engine consumption
• Lightweight internal REST API (Python preferred) returning prediction-ready data with <200ms response
• Clear README and rebuild instructions for full system reproducibility

I will provide complete source code and two years of free ongoing support. We will work with an agile methodology, and I will assist you from zero to production-ready delivery. I have hands-on experience stabilizing and scaling data foundations for analytics and betting-style platforms, with a strong focus on correctness, performance, and clean handover. I look forward to your response. Thanks
$500 CAD in 7 days
6.4

Hi I can build the backend foundation for your racing prediction platform by addressing the main technical challenge: creating a clean, scalable ingestion + storage pipeline that transforms messy race data into fast, prediction-ready tables. I’ll design a normalized PostgreSQL schema for entries, results, and derived metrics, and implement a source-agnostic loader that ingests JSON/CSV from buckets, validates records, de-duplicates entries, and timestamps every load. Using Python, I’ll persist raw race data alongside computed performance metrics so feature tables remain instantly queryable for modelling. Precomputed aggregates and indexed views will power an internal REST endpoint that returns race data in under 200ms. You’ll receive a clear README covering schema, ingestion flow, rebuild steps, and tooling for ongoing expansion. This sets the stage for your prediction engine to run immediately on clean, structured racing data. Thanks, Hercules
$500 CAD in 7 days
6.4

With the pivotal role data plays in predicting and analyzing horse racing outcomes, a solid, efficient and well-organized backend system is imperative. As the founder and CEO of Web Crest with a decade-long experience in building intelligent solutions, Data Management is our bread and butter. Through our vast expertise in PostgreSQL and Python as well as extensive experience designing clean schemas for efficient data storage, we can offer an unparalleled solution to your project. One of our core strengths at Web Crest is creating robust internal infrastructures encapsulating processes such as ingestion, de-duplication, validation, timestamping, and query readiness. Our track record proves we are proficient at building systems that are source-agnostic to streamline data flows from multiple channels efficiently. We guarantee that any new data arriving at a storage bucket will be automatically ingested, validated, and will be queryable within just five minutes!
$700 CAD in 5 days
6.5

Hey, I do backend data engineering work and this project is exactly my kind of thing: clean PostgreSQL schema design, ingestion pipelines, and building the data foundation for a prediction system.

For your use case:
- I'd design a PostgreSQL schema for race entries, results, running performance, and jockey/trainer stats: normalized but query-optimized
- Build a source-agnostic ingestion pipeline that handles JSON and CSV, deduplicates records, and timestamps everything
- Create precomputed feature tables ready for the prediction engine to consume
- Set up the REST endpoint in Python (FastAPI or Flask) returning prediction-ready data in under 200ms, which is easy with precomputed data

I've built similar data pipelines before for sports and financial data. The 5-minute ingestion SLA and clean README you're asking for are standard deliverables I include by default. This sounds like it could grow into a long-term platform build, which I'm interested in too; I'd like to understand the prediction model side better as we go.

Happy to start this week. What data format are you currently working with for the race results? - Usama
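For reference, the sub-200ms precomputed endpoint several bids mention can be prototyped with nothing but the standard library. The `PRECOMPUTED` table and its field names below are stand-ins for the real PostgreSQL-backed feature views, and a production build would use FastAPI or Flask as the bids suggest:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

# Hypothetical precomputed feature table keyed by race_id; in the real
# system this lookup would hit an indexed view or cache, not a dict.
PRECOMPUTED = {
    "R1": {"race_id": "R1", "entries": [{"horse": "A", "avg_finish": 2.1, "avg_odds": 3.4}]},
}


class PredictionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve only precomputed data: no aggregation happens per request,
        # which is what keeps response times well under 200ms.
        qs = parse_qs(urlparse(self.path).query)
        race = PRECOMPUTED.get(qs.get("race_id", [""])[0])
        body = json.dumps(race if race else {"error": "not found"}).encode()
        self.send_response(200 if race else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging in this sketch
```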
$700 CAD in 14 days
6.1

Hello, Your project immediately stood out to me because it aligns strongly with the kind of work I specialize in and consistently deliver successfully. I have worked on similar projects where clarity, performance, and reliability were essential. My focus is always on building solutions that are clean, scalable, and easy to maintain. Beyond just completing tasks, I aim to improve workflows and ensure the final product creates real value for both the client and end users. You’re welcome to review my profile to see examples of my previous work and the type of projects I’ve successfully delivered. I’d be happy to discuss your specific requirements in more detail and explore how I can support your goals effectively. I am available to start immediately and can dedicate focused time and attention to ensure timely delivery and smooth collaboration throughout the project. Thank you for considering my application. I look forward to the opportunity to speak with you. Kind regards, Abhishek Saini
$750 CAD in 7 days
6.2

Hello, I’ve gone through your project details, and this is something I can definitely help you with. I have 10+ years of experience in mobile and web app development, working with a strong focus on backend systems, databases, and APIs. I specialize in designing scalable solutions and efficient data ingestion pipelines, ensuring a clean architecture that meets your immediate goals. I will first review your requirements and propose the best technical approach to create a PostgreSQL schema that effectively stores race data and supports your prediction engine. I’ll ensure rapid ingestion of data with a robust validation process and expose a lightweight REST endpoint that meets your performance criteria. Here is my portfolio: https://www.freelancer.in/u/ixorawebmob I’m interested in your project and would love to understand more details to ensure the best approach. Could you clarify: are there specific data providers you want to prioritize in the ingestion pipeline? Let’s discuss over chat! Regards, Arpit Jaiswal
$250 CAD in 20 days
5.7

Hi there, I'm offering a 25% discount on this project. With expertise in sports data engineering and backend development, I will build a robust horse racing data backend system—collecting, processing, and serving comprehensive racing data for your applications, analytics, or betting systems. I'll start by understanding your specific data needs including which racetracks, types of data (race results, horse history, jockey stats, odds, etc.), and how you plan to use the information. I will then develop a complete backend solution including automated data collection from reliable racing data sources, data cleaning and normalization, database design optimized for racing data queries, API development for serving data to your applications, historical data storage and management, real‑time data updates for live racing, calculation of statistics and performance metrics, scalability for growing data volumes, and comprehensive documentation. You'll receive a fully functional horse racing data backend with API access, along with documentation for integrating with your applications and maintaining the system. Let's build the data engine that powers your racing applications. Best regards, Sohail
$250 CAD in 1 day
6.0

Hello, I’m excited about the opportunity to contribute to your project. With strong backend/data engineering experience in PostgreSQL and Python pipelines, I can design a clean schema for race entries/results/derived metrics, build a source-agnostic ingestion flow for CSV/JSON that de-duplicates, timestamps, and keeps raw + processed tables query-ready for modelling. I’ll tailor the pipeline so new files dropped into your storage bucket are validated and ingested within minutes, then generate structured feature tables and fast horse-level performance views (finishes, odds, speed proxies, jockey/trainer stats) that can be spot-checked reliably. You can expect a lightweight internal REST endpoint returning prediction-ready race data in sub-200ms from precomputed tables, plus a clear README covering the schema, ingestion process, and full rebuild commands for long-term scalability. Best regards, Juan
$500 CAD in 3 days
5.8

⭐Hello, I’m ready to assist you right away!⭐ I believe I’d be a great fit for your project since I specialize in designing clean PostgreSQL schemas and building flexible ingestion pipelines for data platforms. My expertise includes persisting raw data, preparing structured feature tables, and optimizing REST endpoints for efficient data access. With a focus on streamlining race data processing and prediction-ready outputs, I am well-equipped to contribute to the success of your horse racing prediction platform. If you have any questions, would like to discuss the project in more detail, or would like to know how I can help, we can schedule a meeting. Thank you. Maxim
$250 CAD in 5 days
5.4

As a seasoned Full Stack Developer with over 6 years of experience and command of MySQL, PostgreSQL, and Python, I am confident I have what it takes to build out your horse racing prediction and betting intelligence platform. My expertise in designing efficient, robust schemas that can house significant volumes of data would contribute directly to a clean, scalable foundation for your prediction engine. I can build an agile ingestion pipeline that accepts diverse file formats from public race results, manual uploads, and future licensed data providers. My proposed solution will be source-agnostic, incorporate effective de-duplication, and timestamp all loads, guaranteeing accurate identification and tracking of each record. I will persist raw race data alongside the derived performance metrics so they are query-ready for modelling. For the endpoint delivery, I am experienced at building lightweight internal REST endpoints in both Python (my preferred choice) and Node, responding with prediction-ready race data in under 200ms using precomputed data.
$300 CAD in 3 days
5.6

Hi, I'm eager to power your horse-racing prediction engine in this 2–3 week sprint. I'll design a robust PostgreSQL schema for race data, create a source-agnostic ingestion pipeline (JSON/CSV uploads to a bucket, auto-ingested in under 5 minutes) and build feature tables with performance metrics. Plus, a fast Python REST endpoint (<200ms responses) and a comprehensive README. I'm ready to deliver a query-ready foundation for your models. Available immediately. Let's align on kickoff! Thanks, Singh
$550 CAD in 21 days
6.2

Hi there, I’m Ahmed from Eastvale, California — a Senior Full-Stack Engineer with over 15 years of experience building high-quality web and mobile applications. After reviewing your job posting, I’m confident that my background and skill set make me an excellent fit for your project — Horse Racing Data Backend Build . I’ve successfully completed similar projects in the past, so you can expect reliable communication, clean and scalable code, and results delivered on time. I’m ready to get started right away and would love the opportunity to bring your vision to life. Looking forward to working with you. Best regards, Ahmed Hassan
$500 CAD in 5 days
5.1

Having spent over 5+ years as a Python developer, I specialize in building production-grade and scalable backend systems. The combination of this experience with my background in automation is a perfect match for your Horse Racing Data Backend Build project. I have a solid grasp of working with PostgreSQL, designing schemata, and implementing efficient data ingestion pipelines - all skills that align impeccably with your project needs. Not only do I prioritize clean architecture and error handling in my code, but I also emphasize long-term maintainability - an essential aspect given that your project will scale into a full betting intelligence platform. Moreover, my robust understanding of Web Development using Django / FastAPI / Flask makes me highly adept at quickly building REST endpoints for prediction-ready race data. Lastly, my proficiency in web scraping and data extraction will be beneficial in preventing potential duplication issues while acquiring race data, and ensuring the pipeline is adaptable to different file formats. My dedicated approach combined with regular updates and transparent communication mean you'll be kept in the loop about every step of the project and have peace of mind about its timely completion. Let's connect and discuss how we can make your platform a reality!
$500 CAD in 7 days
5.4

With my extensive experience in data modeling, I am confident that I can design a clean PostgreSQL schema to meet your needs for storing race entries, results and running performance data. I have a strong grasp of efficient database designs and understand the significance of timestamps and data deduplication - ensuring the integrity and longevity of your data. My primary objective is to provide you with a rapid yet scalable deployment of the prediction engine through a well-thought-out ingestion system. Having handled over 1000 projects in Excel, VBA, and Google Sheets, I am well-versed in processing various file formats, including JSON and CSV as required for your project. I can build you a dependable ingestion pipeline that is source-agnostic and accepts race data from public sources, manual uploads or any future licensed providers you may bring on board. Furthermore, my skills extend into API development including Python and Node.js which will be essential for your internal REST endpoint requirement. As desired by the project description, I promise speedy API response times with high quality prediction-ready race data. Finally, my successful track record means that you will be working with someone who knows how to deliver exactly what you need without any unnecessary handoffs. Let's get started on building this horse racing powerhouse together!
$500 CAD in 7 days
6.2

Fort Erie, Canada
Payment method verified
Member since Feb 19, 2026