TalentAQ

Data Engineer

Engineering · Contract · Remote

Required Skills

Spark
PySpark
Snowflake
Azure/AWS/GCP

Job Description

We are seeking a Data Engineer to build and maintain our data pipelines and infrastructure. You will be responsible for designing, developing, and testing ETL processes, as well as managing our data warehouse and data lake. Experience with Spark, PySpark, Snowflake, and cloud platforms (Azure/AWS/GCP) is essential.

    Key Responsibilities:

  • Design and develop ETL processes using Spark and PySpark (see the sketch after this list).
  • Manage data warehouse and data lake.
  • Implement data quality checks and monitoring.
  • Optimize data pipelines for performance and scalability.
  • Work with cloud platforms such as Azure, AWS, and GCP.
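
For a rough sense of the day-to-day work, a minimal PySpark batch ETL job with a simple data quality gate is sketched below. The source path, column names, and drop-rate threshold are illustrative placeholders, not details of this role or our stack.

    # Minimal PySpark ETL sketch: extract from the data lake, clean, check quality, load.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw events from the data lake (placeholder path).
    raw = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Transform: deduplicate, drop invalid rows, stamp the ingest date.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_total").isNotNull() & (F.col("order_total") >= 0))
           .withColumn("ingest_date", F.current_date())
    )

    # Data quality check: fail the run if more than 5% of rows were dropped.
    raw_count = raw.count()
    dropped = raw_count - cleaned.count()
    if raw_count > 0 and dropped / raw_count > 0.05:
        raise ValueError(f"Data quality check failed: {dropped} of {raw_count} rows dropped")

    # Load: write the curated table, partitioned by ingest date.
    # (Loading into Snowflake would typically go through the Snowflake Spark connector instead.)
    cleaned.write.mode("overwrite").partitionBy("ingest_date").parquet("s3://example-bucket/curated/orders/")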

    Required Skills:

  • Spark
  • PySpark
  • Snowflake
  • Azure/AWS/GCP
