
Closed
Posted:
I'm seeking an experienced ETL developer to help with data extraction and loading from database systems into Google Cloud.

Key Tasks:
- Set up and configure the Google Cloud environment for ETL processes.
- Extract data from various database systems (specific ones to be confirmed).
- Load extracted data into Google Cloud storage or a specified destination.

Ideal Skills and Experience:
- Proficiency in Google Cloud services, especially for ETL.
- Strong experience with database systems, particularly MySQL, PostgreSQL, and Oracle.
- Familiarity with ETL tools and data pipeline development.
- Knowledge of data security and compliance in cloud environments.
- Ability to troubleshoot and optimize ETL processes.

Please provide examples of similar work done and your approach to ensure data integrity during extraction and loading.
Project ID: 40038081
12 bids
Remote project
Last activity: 2 months ago
12 freelancers are bidding an average of ₹1,066 INR/hour for this project

Hello Ranjit, We are Codenia Technologies LLP, a company with over 10 years of experience in Database Management. We have carefully reviewed your project requirements for a Google Cloud ETL Specialist. Our approach involves setting up and configuring the Google Cloud environment for efficient ETL processes, extracting data from various database systems, and loading it into Google Cloud storage securely. We have expertise in Google Cloud services, MySQL, PostgreSQL, and Oracle databases, as well as in data security and compliance. We would like to discuss your project further to ensure we provide a tailored solution that meets your needs. Thanks, Rupesh Kumar
₹1,000 INR in 40 days

Hello Ranjit, I hope you are doing well. I've built Google Cloud ETL pipelines that move data from MySQL, PostgreSQL, and Oracle into GCP with strong data integrity and security. I'll tailor an end-to-end extraction, transformation, and load process to your sources and destination.

Deliverables: a GCP-based environment with Dataflow for transforms, Cloud Composer for orchestration, and storage in Cloud Storage or BigQuery. I'll add secure connectors, incremental loads, schema validation, and thorough logging/monitoring to meet your compliance needs.

Why me: as a full-stack developer with .NET, Python, and cloud experience, I've implemented idempotent pipelines and audit trails, minimizing duplicates and enabling easy troubleshooting.

Approach & timeline: 1) set up the project, IAM, and networking; 2) connect to the databases and define mappings; 3) implement the Dataflow pipelines and load targets; 4) test, validate, and deploy. Estimated 4 days.

Please feel free to contact me so we can discuss the details; I am looking forward to the opportunity to work together. Best regards, Billy Bryan
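The incremental-load idea this bid mentions can be sketched in plain Python, independent of any GCP service: a high-water-mark filter keeps each run from re-extracting rows it has already moved. The row shape and the `updated_at` column below are hypothetical, not taken from the project description.

```python
from datetime import datetime, timezone

def filter_incremental(rows, watermark):
    """Return rows modified after `watermark`, plus the advanced watermark.

    Persisting the returned watermark between runs means each run only
    picks up rows changed since the previous one (an incremental load).
    """
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Illustrative rows; in a real pipeline these come from a source query.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
fresh, wm = filter_incremental(rows, datetime(2024, 1, 2, tzinfo=timezone.utc))
# Only row 2 is newer than the watermark, and the watermark advances to its timestamp.
```

Combined with an idempotent write (e.g. upsert by primary key), this is also what keeps re-runs from producing duplicates.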
₹1,797 INR in 15 days

I'm interested in supporting your ETL setup and data extraction workflow for Google Cloud. While I am not yet a senior ETL specialist, I do have solid experience working with data, spreadsheets, organizing information, and handling structured datasets, and I am familiar with cloud-based tools and SQL concepts. Here's how I can help:

✅ What I can do for your project
1. Assist in configuring the Google Cloud environment for ETL processes.
2. Support in extracting data from various databases (MySQL, PostgreSQL, etc.).
3. Help with loading data into Google Cloud Storage or other destinations.
4. Maintain data accuracy and consistency during transfer.
5. Follow your instructions step-by-step to ensure data integrity.

✅ Why I'm a good fit
1. Detail-oriented and very comfortable working with structured data.
2. Fast learner, able to follow technical workflows with guidance.
3. Strong communication skills, ensuring smooth collaboration.
4. Committed to delivering accurate, clean, and well-organized results.
5. If needed, I can also complete small test tasks so you can see the quality of my work first.

I'd be happy to discuss your requirements, timelines, and the specific databases involved. Thank you for considering my proposal — looking forward to working with you!
₹1,000 INR in 40 days

Dear Client, I have reviewed your requirements for an ETL developer to extract and load data from various database systems into Google Cloud, and I am confident in my ability to deliver a reliable and efficient solution. With extensive experience in ETL pipeline development and Google Cloud services, I can ensure seamless data integration while maintaining data integrity and security. I am eager to discuss your project further and tailor the solution to your specific needs. Please feel free to reach out for a detailed consultation; I will reply within a few hours. Skills: Python, JavaScript. Best regards, Dilshad Ahmad, ETL & Google Cloud Specialist
₹1,000 INR in 40 days

I have successfully designed and implemented end-to-end data pipelines for numerous clients leveraging GCP's ecosystem.

Financial Services Client: Developed a scalable ETL solution to migrate 500 GB of daily transaction data from on-premise Oracle databases into a BigQuery data warehouse via Cloud Storage. The pipeline used Cloud Dataflow (Apache Beam) for transformation and ensured sub-second latency for critical reporting.

E-commerce Platform: Created a data ingestion framework using Cloud Data Fusion (a managed instance of CDAP) to extract product and inventory data from multiple MySQL and PostgreSQL instances.
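Pipelines like the ones in this bid typically validate that extracted rows match the structure the target warehouse expects before loading them. A minimal, library-free sketch of that kind of schema check; the field names and types here are invented for illustration, not taken from the project:

```python
# Hypothetical expected schema for an inventory row.
EXPECTED = {"product_id": int, "sku": str, "stock": int}

def validate_row(row, expected=EXPECTED):
    """Return a list of field-level problems; an empty list means the row conforms."""
    problems = []
    for field, typ in expected.items():
        if field not in row:
            problems.append(f"missing field: {field}")
        elif not isinstance(row[field], typ):
            problems.append(f"{field}: expected {typ.__name__}")
    return problems

# A conforming row yields no problems; a malformed one lists each issue.
ok = validate_row({"product_id": 1, "sku": "A-1", "stock": 5})
bad = validate_row({"product_id": "1", "sku": "A-1"})
```

Rows that fail the check would usually be routed to a dead-letter location rather than dropped silently, so nothing is lost and failures stay auditable.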
₹1,000 INR in 36 days

Trust me, I will never disappoint you. I need this work, and it is entirely up to you whether you give the project to me. I am bidding low because I really want this project.
₹1,000 INR in 36 days

I have experience with ETL pipelines on Google Cloud, working with MySQL, PostgreSQL, and Oracle. I can configure the environment, extract data securely, optimize the workflow, and ensure full data integrity and performance.
₹950 INR in 40 days

20+ years of experience in development, with expert and professional commitment. Recently completed a Python ETL project: fetched data from Jira Cloud using the PySpark framework and the delta-sharing API and, after transformation, uploaded the data to SQL Server. The pipeline runs continuously at a fixed interval to move data from the cloud to SQL Server. Expert knowledge of Linux, Windows, scripting, GitHub, Jira, and tracking tools.
₹950 INR in 40 days

I'm an ETL and data-integration developer with 20+ years of experience designing, optimizing, and delivering secure pipelines across MySQL, PostgreSQL, Oracle, and large-scale cloud environments. I can jump in immediately to set up your Google Cloud environment and build a clean, reliable extraction-and-load flow tailored to your target architecture. Here's what you can expect from my approach:

• Configure the required Google Cloud components (GCS, Cloud SQL, Storage Transfer, Dataproc/Dataflow as needed) with proper IAM, networking, and security controls.
• Extract data from your source systems using proven, audit-friendly methods that prevent data loss, duplication, or partial loads.
• Load and validate data in Google Cloud storage or your preferred destination with structured logging, error handling, and performance-tuned batching.
• Build ETL jobs that are maintainable, well-documented, and easy to extend as new sources come online.
• Apply best practices for security, compliance, and cost-efficient operation.

I've delivered numerous enterprise ETL projects involving cross-database migrations, cloud onboarding, and high-volume batch processing. Happy to share examples and outline a step-by-step plan for ensuring data integrity throughout the entire process. Let's get your pipeline running smoothly and reliably.
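The "performance-tuned batching with structured logging and error handling" this bid describes can be illustrated with a small, self-contained sketch. `write_batch` here is a hypothetical placeholder for whatever sink call the real pipeline would use (an insert API, a GCS upload, etc.), not a GCP function:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def load_in_batches(rows, write_batch, batch_size=500):
    """Write rows in fixed-size batches.

    On failure, the error is logged with the offset of the failed batch and
    re-raised, so a retry can resume without re-sending earlier batches.
    """
    loaded = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            write_batch(batch)
        except Exception:
            log.exception("batch starting at row %d failed", start)
            raise
        loaded += len(batch)
        log.info("loaded %d/%d rows", loaded, len(rows))
    return loaded

# Demo with an in-memory sink: 1,200 rows go out as batches of 500, 500, 200.
sink = []
count = load_in_batches(list(range(1200)), sink.extend, batch_size=500)
```

Batch size is the main tuning knob: larger batches cut per-request overhead, smaller ones bound memory use and the cost of a retry.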
₹1,000 INR in 40 days

I'm an ETL & GCP Data Engineer experienced in MySQL, PostgreSQL, Oracle and building secure, scalable ETL pipelines on Google Cloud. I can extract, transform and load data with high accuracy and ensure integrity, monitoring and optimization.
₹1,100 INR in 35 days

Hello, my name is Jean Palomeque. I am a Data Engineer with over 7 years of experience working across banking, digital payments, e-commerce, and other industries. I specialize in large-scale data migration from multiple sources to diverse destinations, ensuring high performance and reliability. I deliver end-to-end data solutions with exceptional quality, from planning and architecture to implementation and optimization. My core technical expertise includes, but is not limited to, Python, SQL, and AWS.
₹1,000 INR in 40 days

Hello! I am an AI Engineer and Complex Systems Analyst specializing in high-integrity data migration and cloud ETL architectures. I understand your core need: to engineer a robust, scalable pipeline to extract data from various database systems (MySQL, PostgreSQL, Oracle) and load it securely into Google Cloud.

Google Cloud Environment Configuration: I will establish a secure and optimized environment, leveraging Google Cloud Storage (GCS) as the staging layer and BigQuery as the scalable final data warehouse destination. I will implement the necessary IAM roles for granular access control.

Data Integrity Checkpoints: During extraction, I will implement schema validation (to ensure all fields and types match the target) and row-count verification against the source before staging. Orchestration and transformation logic will be managed via Cloud Dataflow or Cloud Composer (Airflow). This supports complex transformations (if needed) and ensures the pipeline is horizontally scalable and resilient to failure.

Checksum Validation: Implementing validation hashes on data batches to ensure no corruption occurs between the source and GCS staging.

Security: All data in transit (source to cloud) will use SSL/TLS encryption, and data at rest in GCS and BigQuery is encrypted by default, fully meeting compliance requirements.

Sincerely, Christopher, Founder, ActuarialOS
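The checksum-validation step this bid describes can be sketched with the standard library alone: hash each batch deterministically at the source, hash it again after staging, and compare the digests. The row contents below are made up for illustration:

```python
import hashlib
import json

def batch_checksum(rows):
    """Deterministic SHA-256 digest over a batch of rows.

    Serializing with sorted keys makes the digest independent of dict
    ordering, so the same logical batch always hashes to the same value.
    """
    payload = json.dumps(rows, sort_keys=True, default=str).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

# Compute once at the source, again after staging, and compare.
source_batch = [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "7.25"}]
staged_batch = [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "7.25"}]
match = batch_checksum(source_batch) == batch_checksum(staged_batch)
```

A mismatch flags the batch for re-extraction rather than letting silently corrupted data reach the warehouse; pairing this with the row-count check above covers both content and completeness.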
₹1,000 INR in 5 days

Birmingham, United States
Payment method verified
Member since Jan 6, 2017