Snowflake Data Engineer
Data Integration, Snowflake, Python, Apache Airflow, dbt, DevOps, Amazon Web Services, Microsoft Azure
Hyderabad, Bangalore, Pune, Chennai, Gurgaon
We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions.
The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements.
Responsibilities
- Snowflake Data Modeling: Design and implement scalable Snowflake data models optimized for data ingestion and analytics requirements
- ETL Pipeline Development: Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency
- Performance Optimization: Optimize Snowflake compute and storage usage, tuning query performance and managing clustering and partitioning to ensure quick, reliable access to data
- Data Security & Governance: Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards
- Automation & Workflow Management: Utilize tools such as dbt and Apache Airflow to schedule data processing and automate pipeline monitoring (see the illustrative sketch after this list)
- Collaboration & Troubleshooting: Partner with data scientists, business analysts, and other stakeholders to address complex data challenges and troubleshoot Snowflake-related issues effectively
- Documentation & Reporting: Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team
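
To give a flavor of the orchestration work described above, below is a minimal, illustrative sketch of an Airflow DAG that schedules dbt transformations against Snowflake. It is not a description of this role's actual stack: the DAG id, schedule, project path, and task layout are hypothetical placeholders, and it assumes Airflow 2.4+ and a dbt project already configured with a Snowflake profile.

```python
# Illustrative sketch only: schedule a daily dbt run and test cycle with Airflow.
# The DAG id, project path, and schedule below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_dbt_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumes Airflow 2.4+ ("schedule" argument)
    catchup=False,
) as dag:
    # Run dbt models that transform raw data already loaded into Snowflake.
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt tests so data-quality failures surface in Airflow monitoring.
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    run_models >> test_models
```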
Requirements
- 3 to 5 years of experience in data engineering or a related field
- Proficiency in Python
- Experience with AWS as the primary cloud provider
- Expertise in Snowflake data modeling, Data Vault modeling, and the dbt framework for data transformation pipelines
- Knowledge of workflow management tools such as Argo, Oozie, and Apache Airflow
- Understanding of the requirements for a scalable, secure, and high-performance data warehouse that supports integration with monitoring and observability tools