TalentAQ

Data Engineer (GCP)

Engineering · 3+ years · Bengaluru, Karnataka

Required Skills
22 skills

GCP
BigQuery
Dataflow
Pub/Sub
Cloud Storage
Python
SQL
Java
ETL
Apache Beam
Apache Airflow
Spark
Terraform
Kubernetes
Docker
MLOps
Firestore
Bigtable
MongoDB
Looker
Tableau
Power BI

Job Description

Join our data-driven team! We are seeking a highly skilled and passionate Data Engineer with expertise in Google Cloud Platform to join our growing team in Bengaluru.

Key Responsibilities:
Design, develop, and maintain data pipelines and data warehouses.
Leverage GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Extract, transform, and load (ETL) data from various sources.
Build and optimize data pipelines for high-volume data processing.
Collaborate with data scientists, analysts, and engineers.

Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of experience in data engineering.
Hands-on experience with Google Cloud Platform (GCP) services.
Strong programming skills in Python, SQL, or Java.

Required Skills:
Expertise in ETL development, data modeling, and data warehousing.
Experience with Apache Beam, Apache Airflow, or Spark.
Proficiency in writing complex SQL queries and optimizing database performance.

Preferred Qualifications:
Experience with Terraform, Kubernetes, or Docker.
Understanding of data security, compliance, and governance.
Experience with streaming data architectures.
Google Cloud Professional Data Engineer certification.
Experience with machine learning pipelines and MLOps.
Knowledge of NoSQL databases (Firestore, Bigtable, or MongoDB).
Experience with BI tools like Looker, Tableau, or Power BI.
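For context on the core stack the posting names (Apache Beam on Dataflow, Pub/Sub, BigQuery), below is a minimal sketch of the kind of streaming pipeline this role describes. It is an illustration only, not part of the posting; the project, topic, and table identifiers are hypothetical placeholders.

# Minimal sketch: read JSON events from Pub/Sub with Apache Beam and
# stream them into BigQuery. Names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    # On Dataflow, runner and project flags would be passed in here as well.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events"  # hypothetical
            )
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical dataset.table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()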
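The posting also lists Apache Airflow and complex SQL against BigQuery. A hedged sketch of batch ETL orchestration under those assumptions follows, using the Airflow Google provider operators; the DAG id, bucket, dataset, and table names are again hypothetical.

# Sketch: a daily Airflow DAG that loads raw files from Cloud Storage into a
# BigQuery staging table, then transforms them with SQL. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_etl",  # hypothetical
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load the day's raw CSV exports from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-raw-bucket",  # hypothetical
        source_objects=["events/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.events_staging",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging data into a daily reporting table with SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": """
                    INSERT INTO analytics.events_daily
                    SELECT DATE(event_ts) AS event_date, user_id, COUNT(*) AS events
                    FROM analytics.events_staging
                    GROUP BY event_date, user_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform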
