
Senior Systems Engineer - Data DevOps/MLOps

Hybrid in Coimbatore
We are looking for a detail-oriented and motivated Senior Systems Engineer with a strong focus on Data DevOps/MLOps to join our team.

The ideal candidate should possess a deep understanding of data engineering, automation of data pipelines, and integration of machine learning models into operational environments. This role is for a collaborative professional adept at building, deploying, and managing scalable data and ML pipelines aligned with strategic objectives.

Responsibilities
  • Design CI/CD pipelines for data integration and machine learning model deployment
  • Deploy and maintain infrastructure for data processing and model training using cloud services
  • Automate processes like data validation, transformation, and workflow orchestration
  • Coordinate with data scientists, software engineers, and product teams to integrate ML models into production environments
  • Enhance performance and reliability by optimizing model serving and monitoring processes
  • Ensure data versioning, lineage tracking, and reproducibility across ML experiments
  • Identify improvements for deployment processes, scalability, and infrastructure resilience
  • Implement security measures to safeguard data integrity and maintain compliance
  • Troubleshoot and resolve issues across the data and ML pipeline lifecycle
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
  • 4 or more years of experience in Data DevOps, MLOps, or related roles
  • Proficiency in cloud platforms such as Azure, AWS, or GCP
  • Background in Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or Ansible
  • Expertise in containerization and orchestration tools such as Docker and Kubernetes
  • Skills in using data processing frameworks like Apache Spark or Databricks
  • Proficiency in Python and familiarity with data manipulation and ML libraries such as Pandas, TensorFlow, or PyTorch
  • Familiarity with CI/CD tools like Jenkins, GitLab CI/CD, or GitHub Actions
  • Knowledge of version control systems, such as Git, and MLOps platforms like MLflow or Kubeflow
  • Understanding of monitoring, logging, and alerting systems like Prometheus or Grafana
  • Strong problem-solving abilities with the capability to work both independently and collaboratively
  • Effective communication and documentation skills
Nice to have
  • Familiarity with DataOps practices and tools like Airflow or dbt
  • Understanding of data governance frameworks and tools like Collibra
  • Knowledge of Big Data technologies such as Hadoop or Hive
  • Certifications in cloud platforms or data engineering