EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions. The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements.

Responsibilities

- Snowflake Data Modeling: Design and implement scalable Snowflake data models optimized for data ingestion and analytics requirements
- ETL Pipeline Development: Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency
- Performance Optimization: Optimize Snowflake usage and storage, tuning query performance and managing data partitions to ensure quick, reliable access to data
- Data Security & Governance: Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards (see the first sketch below for a flavor of this work)
- Automation & Workflow Management: Utilize tools such as dbt and Apache Airflow to schedule data processing and automate pipeline monitoring (see the second sketch below)
- Collaboration & Troubleshooting: Partner with data scientists, business analysts, and other stakeholders to address complex data challenges and troubleshoot Snowflake-related issues effectively
- Documentation & Reporting: Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team

Requirements

- 3 to 5 years of experience in data engineering or a related field
- Proficiency in Python
- Experience with AWS as the primary cloud provider
- Expertise in Snowflake modeling, data modeling using Data Vault, and the dbt framework for data transformation pipelines
- Knowledge of workflow management tools such as Argo, Oozie, and Apache Airflow
- Understanding of the requirements for a scalable, secure, and high-performance data warehouse that supports integration with monitoring and observability tools

We offer

- Opportunity to work on technical challenges that may have an impact across geographies
- Vast opportunities for self-development: an online university, global knowledge-sharing opportunities, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn Learning solutions
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health benefits, retirement benefits, paid time off, and flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
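To give candidates a flavor of the data security and governance work described above, here is a minimal sketch of role-based access control and column masking in Snowflake, driven from Python with the snowflake-connector-python package. All account, role, schema, table, and policy names are hypothetical placeholders, not a description of any actual EPAM environment.

```python
# A minimal sketch, assuming the snowflake-connector-python package and a
# role privileged to create roles and masking policies. Every identifier
# below (account, roles, warehouse, schemas, table, policy) is hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account",   # hypothetical account identifier
    user="SECURITY_ADMIN_USER",
    password="...",                # in practice, prefer key-pair auth or a secrets manager
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
    database="ANALYTICS",
    schema="GOVERNANCE",
)

statements = [
    # Role-based access: a read-only role for analysts.
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    # Column masking: hide raw email addresses from non-privileged roles.
    """CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.GOVERNANCE.EMAIL_MASK
       AS (val STRING) RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_FULL') THEN val
            ELSE '***MASKED***' END""",
    """ALTER TABLE ANALYTICS.MARTS.CUSTOMERS
       MODIFY COLUMN EMAIL
       SET MASKING POLICY ANALYTICS.GOVERNANCE.EMAIL_MASK""",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```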
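Likewise, the automation and workflow management responsibility can be illustrated with a short Apache Airflow sketch that schedules a dbt run against Snowflake and then tests the results. It assumes Airflow 2.4+ and a dbt CLI installed on the worker; the DAG id, schedule, and project paths are hypothetical.

```python
# A minimal sketch, assuming Apache Airflow 2.4+ and dbt available on the
# worker. The DAG id, owner, and the dbt project/profiles paths are
# placeholders for a real deployment.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="snowflake_daily_transform",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Build the Snowflake marts with dbt; paths are placeholders.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Validate the freshly built models with dbt tests so failures surface
    # before downstream consumers read stale or bad data.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```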