TalentAQ

Data/ML Engineer

Engineering · Full Time · 8+ years

Required Skills
39 skills

Machine Learning
Deep Learning
PyTorch
TensorFlow
Scikit-learn
NLP
Computer Vision
Time Series Forecasting
Model Deployment
MLOps
MLflow
Airflow
Docker
Kubernetes
Experiment tracking
CI/CD
ETL
ELT
Apache Spark
dbt
Data warehousing
Redshift
BigQuery
Snowflake
Streaming data
Kafka
Spark Streaming
SQL
Python
Scala
Data architecture
AWS
S3
EC2
SageMaker
Glue
GCP
Vertex AI
Azure

Job Description

Job Overview

We are looking for a skilled Data/ML Engineer to join our team. In this role, you will be responsible for building and maintaining the data infrastructure that supports our machine learning models. You will work closely with data scientists and other engineers to ensure the reliability and scalability of our AI systems.

Key Responsibilities

  • Develop and maintain ETL/ELT pipelines using Apache Spark, Airflow, and dbt.
  • Build and optimize data warehousing solutions using Redshift, BigQuery, or Snowflake.
  • Implement streaming data solutions using Kafka and Spark Streaming.
  • Deploy and monitor machine learning models using MLOps tooling such as MLflow, Docker, and Kubernetes.
  • Collaborate with data scientists to productionize machine learning models.
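To give a concrete sense of the ETL work described above, here is a toy extract-transform-load step in pure Python. It is illustrative only: a pipeline at this role's scale would use Apache Spark for the transform and Airflow or dbt for orchestration, and all record fields and table names here are hypothetical.

```python
import sqlite3

def extract(rows):
    """Extract: accept raw event records (here, an in-memory list)."""
    return rows

def transform(rows):
    """Transform: keep valid events and normalize amounts to integer cents."""
    return [
        (r["user_id"], int(round(r["amount"] * 100)))
        for r in rows
        if r.get("user_id") and r.get("amount", 0) > 0
    ]

def load(records, conn):
    """Load: write the cleaned records into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS purchases (user_id TEXT, amount_cents INTEGER)"
    )
    conn.executemany("INSERT INTO purchases VALUES (?, ?)", records)
    conn.commit()

raw = [
    {"user_id": "u1", "amount": 9.99},
    {"user_id": "u2", "amount": -5.00},  # dropped: negative amount
    {"user_id": None, "amount": 3.50},   # dropped: missing user
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
count, total = conn.execute(
    "SELECT COUNT(*), SUM(amount_cents) FROM purchases"
).fetchone()
print(count, total)  # 1 999
```

The same extract/transform/load split carries over directly when the in-memory list becomes a Kafka topic or S3 prefix and the SQLite table becomes Redshift, BigQuery, or Snowflake.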

Required Skills

  • Proficiency in Python, Scala, and SQL.
  • Experience with cloud platforms such as AWS, GCP, or Azure.
  • Strong understanding of data architecture and workflow orchestration.
  • Experience with machine learning model deployment and monitoring.
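The "workflow orchestration" requirement above boils down to declaring tasks as a DAG and running them in dependency order, which is the core idea behind tools like Airflow. A minimal sketch using only the standard library (task names and dependencies are hypothetical, not from this posting):

```python
from graphlib import TopologicalSorter

def run_pipeline():
    """Execute hypothetical pipeline tasks in dependency order."""
    # Each key lists the tasks it depends on, forming a DAG.
    tasks = {
        "extract": [],
        "transform": ["extract"],
        "load": ["transform"],
        "notify": ["load"],
    }
    executed = []
    for task in TopologicalSorter(tasks).static_order():
        executed.append(task)  # a real orchestrator would run the task here
    return executed

print(run_pipeline())  # ['extract', 'transform', 'load', 'notify']
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of this same topological-ordering idea.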
