Data Engineer - AWS, Snowflake, Python
IntraEdge • Chandler, AZ • Full-time
Job Description
The Data Engineer will be responsible for designing, developing, and maintaining robust and scalable data pipelines and data solutions. This role requires strong expertise in cloud-based data platforms, particularly AWS services like Glue and Lambda, combined with proficiency in Python, PySpark, and Snowflake for data warehousing. The ideal candidate will ensure efficient data ingestion, transformation, and availability for analytics and reporting, contributing to data-driven decision-making.
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL/ELT pipelines using AWS Glue, AWS Lambda, and PySpark for data ingestion, transformation, and loading into Snowflake (an illustrative sketch follows this list).
- Develop and optimize data models within Snowflake, ensuring high performance and adherence to best practices for data warehousing.
- Write, test, and deploy production-grade Python and PySpark code for data processing and manipulation.
- Implement and manage data orchestration and scheduling using AWS services or other relevant tools.
- Monitor data pipeline performance, troubleshoot issues, and implement optimizations for improved efficiency and reliability.
- Ensure data quality, integrity, and security across all data solutions, adhering to compliance standards.
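For context on the kind of pipeline work described above, the following is a minimal, illustrative sketch of a Glue-style PySpark job that reads raw data from S3, applies a simple aggregation, and loads the result into Snowflake. It is not part of the original posting: all bucket names, table names, and connection options are hypothetical placeholders, and it assumes the Snowflake Spark connector is available to the job.

```python
# Illustrative sketch only: minimal AWS Glue / PySpark job loading S3 data into Snowflake.
# All paths, table names, and credentials below are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: read raw Parquet files landed in S3 (placeholder path).
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# Transform: basic filtering and a daily aggregate.
daily_totals = (
    orders
    .filter(F.col("order_status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("order_amount").alias("total_amount"))
)

# Load: write to Snowflake via the Spark-Snowflake connector
# (assumes the connector is on the job's classpath; use Secrets Manager
# rather than hard-coded credentials in practice).
snowflake_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "ETL_USER",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
    "dbtable": "DAILY_ORDER_TOTALS",
}
(
    daily_totals.write
    .format("net.snowflake.spark.snowflake")
    .options(**snowflake_options)
    .mode("overwrite")
    .save()
)

job.commit()
```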
Required Skills & Qualifications:
Good to Have: