This role is for one of Weekday's clients.
Salary range: Rs 15,00,000 - Rs 25,00,000 (i.e., INR 15-25 LPA)
Min Experience: 5 years
Location: Pune
Job Type: Full-time
Responsibilities:
- Design, develop, and maintain scalable ELT/ETL data pipelines, ensuring data accuracy, completeness, and reliability.
- Collaborate with stakeholders to gather data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using Snowflake, Airflow, ElasticSearch, AWS S3, and NFS.
- Develop and maintain data warehouse schemas to support business intelligence and analytics needs.
- Implement data quality checks and monitoring mechanisms to ensure data integrity.
- Work closely with data scientists and analysts to improve data accessibility and usability for various analytical use cases.
- Stay up to date with best practices in CI/CD, DevSecOps, FinOps, Scrum, and emerging trends in data engineering.
- Contribute to the architecture and enhancement of the data warehouse.
Requirements (Mandatory):
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in Data Engineering, focusing on ELT/ETL processes.
- 3+ years of hands-on experience with Snowflake and data warehousing technologies.
- 3+ years of experience creating and maintaining Airflow ETL pipelines.
- 3+ years of professional experience using Python for data manipulation and automation.
- Working experience with ElasticSearch in data pipelines.
- Strong proficiency in SQL and data modeling techniques.
- Expertise in cloud-based data storage solutions such as AWS S3.
- Experience working with NFS and other file storage systems.
- Strong problem-solving, analytical, and collaboration skills.