TalentAQ

Data Engineer

Engineering · Contract · Remote

Required Skills

Spark
PySpark
Snowflake
Azure/AWS/GCP

Job Description

We are seeking a Data Engineer to build and maintain our data pipelines and infrastructure. You will be responsible for designing, developing, and testing ETL processes, as well as managing our data warehouse and data lake. Experience with Spark, PySpark, Snowflake, and cloud platforms (Azure/AWS/GCP) is essential.

Key Responsibilities:

  • Design and develop ETL processes using Spark and PySpark.
  • Manage the data warehouse and data lake.
  • Implement data quality checks and monitoring.
  • Optimize data pipelines for performance and scalability.
  • Work with cloud platforms such as Azure, AWS, and GCP.

Required Skills:

  • Spark
  • PySpark
  • Snowflake
  • Azure/AWS/GCP
