We are looking for a Data Engineer to build and maintain scalable data pipelines. You will be responsible for designing, implementing, and optimizing data storage and processing systems. The ideal candidate should have a strong background in data warehousing, ETL processes, and big data technologies.
Key Responsibilities:
* Develop and maintain data pipelines using Apache Airflow
* Design and implement data models and schemas
* Optimize data storage and retrieval processes
* Collaborate with data scientists and analysts to support their data needs
Required Skills:
* Proficiency in SQL and Python
* Experience with Apache Airflow
* Knowledge of data warehousing concepts
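To give candidates a feel for the day-to-day work described above, here is a minimal, hypothetical ETL step in Python using the standard-library sqlite3 module. The `orders` and `customer_totals` tables and their columns are invented for illustration; production pipelines at this scale would typically run such steps as Airflow tasks against a real warehouse.

```python
import sqlite3

def run_etl(conn):
    """Extract raw order rows, aggregate per customer, load a summary table."""
    cur = conn.cursor()
    # Extract: read raw rows (hypothetical schema)
    rows = cur.execute("SELECT customer_id, amount FROM orders").fetchall()
    # Transform: aggregate totals per customer
    totals = {}
    for customer_id, amount in rows:
        totals[customer_id] = totals.get(customer_id, 0) + amount
    # Load: write the summary table idempotently
    cur.execute(
        "CREATE TABLE IF NOT EXISTS customer_totals "
        "(customer_id INTEGER PRIMARY KEY, total REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO customer_totals VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()
    return totals

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [(1, 10.0), (1, 5.0), (2, 7.5)],
    )
    print(run_etl(conn))  # {1: 15.0, 2: 7.5}
```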