TalentAQ

Data Engineer

Engineering · Contract · Remote

Required Skills
4 skills

Spark
PySpark
Snowflake
Azure/AWS/GCP

Job Description

We are seeking a Data Engineer to build and maintain our data pipelines and infrastructure. You will be responsible for designing, developing, and testing ETL processes, as well as managing our data warehouse and data lake. Experience with Spark, PySpark, Snowflake, and cloud platforms (Azure/AWS/GCP) is essential.

Key Responsibilities:

  • Design and develop ETL processes using Spark and PySpark.
  • Manage data warehouse and data lake.
  • Implement data quality checks and monitoring.
  • Optimize data pipelines for performance and scalability.
  • Work with cloud platforms such as Azure, AWS, and GCP.

Required Skills:

  • Spark
  • PySpark
  • Snowflake
  • Azure/AWS/GCP
