Job Description
You'll be responsible for designing and maintaining scalable data pipelines on Google Cloud Platform (GCP) to ensure reliable, high-performance data workflows.
You'll also implement ETL/ELT processes, uphold data governance standards, and continuously improve data engineering practices through innovation and troubleshooting.
Job Responsibilities
Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow (Apache Beam), Dataform/dbt, and Cloud Storage.
Implement ETL/ELT processes for data ingestion, transformation, and loading.
Optimize data workflows for performance, scalability, and reliability.
Ensure data governance, security, and compliance best practices.
Troubleshoot and resolve data infrastructure issues, minimizing downtime and performance bottlenecks.
Stay up to date with industry trends and emerging technologies, continuously improving data engineering practices.
Job Requirements
2–4 years of experience in a Data Engineering or similar role
Strong verbal and written communication skills in Indonesian and English
Strong experience with GCP services such as BigQuery, Cloud Storage, and Cloud Functions
Hands-on experience with Dataform (or dbt)
Proficiency in SQL and scripting languages (e.g., Python or JavaScript)
Experience with orchestration tools (e.g., Airflow, Cloud Composer) is a plus
Solid understanding of data warehousing principles and ETL/ELT best practices