TalentAQ

Data Engineer

Engineering · Contract · Remote

Required Skills

Spark
PySpark
Snowflake
Azure/AWS/GCP

Job Description

We are seeking a Data Engineer to build and maintain our data pipelines and infrastructure. You will be responsible for designing, developing, and testing ETL processes, as well as managing our data warehouse and data lake. Experience with Spark, PySpark, Snowflake, and cloud platforms (Azure/AWS/GCP) is essential.

Key Responsibilities:

• Design and develop ETL processes using Spark and PySpark.
• Manage the data warehouse and data lake.
• Implement data quality checks and monitoring.
• Optimize data pipelines for performance and scalability.
• Work with cloud platforms such as Azure, AWS, and GCP.

Required Skills:

• Spark
• PySpark
• Snowflake
• Azure/AWS/GCP
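To illustrate the "data quality checks" responsibility above, here is a minimal sketch in plain Python (standard library only, so it runs without a Spark cluster; the column names, rules, and 1% failure threshold are invented for the example, not taken from this posting):

```python
# Minimal data-quality check sketch for a pipeline batch.
# Assumed record shape: dicts with "user_id" and "amount" keys.

def check_batch(rows):
    """Return quality metrics for a batch of record dicts."""
    issues = {"null_user_id": 0, "negative_amount": 0}
    for row in rows:
        if row.get("user_id") is None:
            issues["null_user_id"] += 1
        if (row.get("amount") or 0) < 0:
            issues["negative_amount"] += 1
    total = len(rows)
    failed = sum(issues.values())
    # Example rule: the batch passes only if fewer than 1% of rows
    # triggered any issue (an empty batch passes trivially).
    passed = total == 0 or failed < 0.01 * total
    return {"total": total, "issues": issues, "passed": passed}

# Tiny illustrative batch with one null key and one negative amount.
batch = [
    {"user_id": 1, "amount": 10.0},
    {"user_id": None, "amount": 5.0},
    {"user_id": 3, "amount": -2.0},
]
report = check_batch(batch)
```

In a real PySpark pipeline the same rules would typically be expressed as DataFrame filters or aggregations so they run distributed, with the resulting metrics pushed to a monitoring system.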
