
Senior Data DevOps - MLOps Expertise

Offices in Chennai and Coimbatore

We are looking for a dedicated and proficient Senior Data DevOps Engineer with extensive MLOps expertise to join our team.

The ideal candidate will have comprehensive knowledge of data engineering, data pipeline automation, and machine learning model operationalization. The role calls for a collaborative professional skilled in designing, deploying, and managing large-scale data and ML pipelines in alignment with organizational objectives.

Responsibilities
  • Develop, deploy, and manage Continuous Integration/Continuous Deployment (CI/CD) pipelines for data integration and machine learning model deployment
  • Set up and maintain infrastructure for data processing and model training using cloud-based resources and services
  • Automate data validation, transformation, and workflow orchestration processes
  • Work closely with data scientists, software engineers, and product teams to integrate ML models smoothly into production
  • Enhance model serving and monitoring to improve performance and reliability
  • Manage data versioning, lineage tracking, and the reproducibility of ML experiments (see the illustrative MLflow sketch after this list)
  • Continuously identify improvements to deployment processes, scalability, and infrastructure resilience
  • Implement stringent security measures to safeguard data integrity and ensure regulatory compliance
  • Troubleshoot and resolve issues across the data and ML pipeline lifecycle
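
For illustration only, the sketch below shows one way the experiment-reproducibility responsibility might look in practice, using MLflow in Python. The tracking URI, experiment name, model, and dataset are hypothetical placeholders, not a description of our actual stack.

    # Illustrative sketch only: logging a run with MLflow for reproducibility.
    # The tracking URI, experiment name, and model below are hypothetical.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    mlflow.set_tracking_uri("http://localhost:5000")  # assumed local tracking server
    mlflow.set_experiment("churn-model")               # hypothetical experiment name

    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        params = {"n_estimators": 200, "max_depth": 8}
        model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

        mlflow.log_params(params)  # hyperparameters for the run
        mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
        mlflow.sklearn.log_model(model, "model")  # versioned model artifact

Pointing such runs at a shared tracking server lets parameters, metrics, and artifacts be compared and reproduced across the team.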
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
  • 4+ years of experience in Data DevOps, MLOps, or similar roles
  • Proficiency in cloud platforms such as Azure, AWS, or GCP
  • Experience with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Ansible
  • Expertise in containerization and orchestration technologies including Docker and Kubernetes
  • Hands-on experience with data processing frameworks such as Apache Spark and Databricks
  • Proficiency in Python, with an understanding of data manipulation and ML libraries such as Pandas, TensorFlow, and PyTorch
  • Familiarity with CI/CD tools including Jenkins, GitLab CI/CD, and GitHub Actions
  • Experience with version control tools and MLOps platforms such as Git, MLflow, and Kubeflow
  • Strong understanding of monitoring, logging, and alerting systems, including Prometheus and Grafana (a minimal metrics sketch follows this list)
  • Excellent problem-solving abilities, with the capacity to work both independently and in teams
  • Strong communication and documentation skills
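
As a minimal sketch of the monitoring requirement above, assuming the prometheus_client Python package, the snippet below exposes prediction count and latency metrics for scraping. The metric names, port, and stand-in predict() function are hypothetical.

    # Illustrative sketch only: exposing model-serving metrics with prometheus_client.
    # Metric names, the port, and the fake predict() below are hypothetical.
    import random
    import time

    from prometheus_client import Counter, Histogram, start_http_server

    PREDICTIONS = Counter("model_predictions_total", "Number of predictions served")
    LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency in seconds")

    def predict(features):
        # Stand-in for a real model call.
        time.sleep(random.uniform(0.01, 0.05))
        return sum(features)

    @LATENCY.time()
    def handle_request(features):
        PREDICTIONS.inc()
        return predict(features)

    if __name__ == "__main__":
        start_http_server(8000)  # metrics exposed at http://localhost:8000/metrics
        while True:
            handle_request([random.random() for _ in range(5)])

A Prometheus server can then scrape the /metrics endpoint and feed Grafana dashboards or alerting rules.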
Nice to have
  • Background in DataOps concepts and tools such as Airflow and dbt (see the illustrative DAG sketch at the end of this list)
  • Knowledge of data governance platforms like Collibra
  • Familiarity with Big Data technologies including Hadoop and Hive
  • Certifications in cloud platforms or data engineering
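
Purely as an illustrative sketch of the DataOps tooling mentioned above, and assuming Airflow 2.x, the snippet below defines a minimal daily DAG with one Python task. The DAG id, schedule, and validation logic are hypothetical placeholders.

    # Illustrative sketch only: a minimal Airflow DAG for a daily data-validation step.
    # The dag_id, schedule, and validation logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def validate_data():
        # Stand-in for a real validation step (e.g. row counts, schema checks).
        print("validating incoming data partition")

    with DAG(
        dag_id="daily_data_validation",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="validate_data",
            python_callable=validate_data,
        )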