TalentAQ

Data/ML Engineer

Engineering · Full Time · 8+ years

Required Skills
39 skills

Machine Learning
Deep Learning
PyTorch
TensorFlow
Scikit-learn
NLP
Computer Vision
Time Series Forecasting
Model Deployment
MLOps
MLflow
Airflow
Docker
Kubernetes
Experiment tracking
CI/CD
ETL
ELT
Apache Spark
dbt
Data warehousing
Redshift
BigQuery
Snowflake
Streaming data
Kafka
Spark Streaming
SQL
Python
Scala
Data architecture
AWS
S3
EC2
SageMaker
Glue
GCP
Vertex AI
Azure

Job Description


Job Overview

We are looking for a skilled Data/ML Engineer to join our team. In this role, you will be responsible for building and maintaining the data infrastructure that supports our machine learning models. You will work closely with data scientists and other engineers to ensure the reliability and scalability of our AI systems.

Key Responsibilities

  • Develop and maintain ETL/ELT pipelines using Apache Spark, Airflow, and dbt.
  • Build and optimize data warehousing solutions using Redshift, BigQuery, or Snowflake.
  • Implement streaming data solutions using Kafka and Spark Streaming.
  • Deploy and monitor machine learning models using MLOps tools.
  • Collaborate with data scientists to productionize machine learning models.

Required Skills

  • Proficiency in Python, Scala, and SQL.
  • Experience with cloud platforms such as AWS, GCP, or Azure.
  • Strong understanding of data architecture and workflow orchestration.
  • Experience with machine learning model deployment and monitoring.
