
GCP Data Engineer
Houston, Texas, United States
Apply by 14 Mar 2026
Competitive
Job Ref.: BH-56770
Job Description
Location: Houston, TX
Work Model: 100% Onsite
Duration: 12 months, with extension expected
Overview
We are seeking a highly experienced GCP Data Engineer to join a data-driven team supporting large-scale analytics and cloud-native data platforms. This role requires deep hands-on experience with Google Cloud Platform services, strong Python and SQL skills, and the ability to build, deploy, and maintain robust data pipelines in a production environment.
Key Responsibilities
- Design, build, and maintain scalable data pipelines and workflows on Google Cloud Platform
- Develop and optimize data solutions using BigQuery and Google Cloud Storage (GCS)
- Implement event-driven and batch processing using Pub/Sub
- Deploy and manage services using Cloud Run, Cloud Functions, and Cloud Composer
- Collaborate with DevOps teams to support CI/CD pipelines and cloud infrastructure best practices
- Manage service accounts, IAM permissions, and Secret Manager configurations
- Build and maintain data transformations using dbt
- Support analytics and reporting use cases, including integrations with Power Apps
- Ensure data quality, performance optimization, and security compliance
- Work closely with cross-functional teams including data scientists, analysts, and application developers
Required Qualifications
- 10 years of overall software or data engineering experience
- 3–4 years of hands-on experience with the GCP data stack, including:
  - BigQuery, GCS
  - Cloud Run, Cloud Functions, Cloud Composer
  - Pub/Sub
- Proficiency in Python and SQL
- Experience with GitHub and modern DevOps practices
- Solid understanding of cloud security concepts, including service accounts and secrets management
- Hands-on experience with dbt
- Familiarity with Power Apps
- Google Cloud certification(s)
Preferred Skills
- Experience working in highly regulated or enterprise environments
- Strong troubleshooting and performance tuning skills