Job Description
Experience : 5–10 Years
Employment Type : W2
Work Authorization : GC / US Citizens Only
Location : Englewood, CO (Remote / Other locations may be considered)
Job Summary :
We are seeking an experienced ETL Developer with strong Python and SQL expertise to design, develop, and maintain scalable data pipelines. The ideal candidate will be responsible for building reliable ETL / ELT workflows, ensuring data quality, and supporting analytics and business intelligence initiatives across the organization.
Roles & Responsibilities
- Design, develop, test, and maintain robust ETL / ELT data pipelines from multiple data sources (databases, APIs, flat files) to data warehouses or data lakes
- Write and optimize complex SQL queries, scripts, stored procedures, and functions for data extraction, transformation, and loading
- Use Python (and shell scripting where required) to automate ETL workflows, implement custom business logic, manage file transfers, and handle error processing
- Implement data quality checks and validation rules to ensure data accuracy, completeness, and consistency
- Monitor ETL jobs, troubleshoot failures, and optimize performance and scalability of data pipelines
- Collaborate with data analysts, data architects, and business stakeholders to gather requirements and define data models
- Create and maintain technical documentation, including data flow diagrams, source-to-target mappings, and process documentation
- Support and enhance data warehouse solutions, applying data modeling best practices such as star and snowflake schemas
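As an illustrative sketch of the kind of pipeline work described above (the file path, table name, and column names here are hypothetical, not part of this role's actual systems), a minimal extract-transform-load flow in Python using pandas and SQLite might look like:

```python
import sqlite3

import pandas as pd


def run_pipeline(source_csv: str, db_path: str) -> int:
    """Extract rows from a flat-file source, validate them, and load into a staging table.

    Returns the number of rows rejected by data quality checks.
    """
    # Extract: read the flat-file source (path is hypothetical)
    df = pd.read_csv(source_csv)

    # Transform: normalize column names and coerce types
    df.columns = [c.strip().lower() for c in df.columns]
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

    # Data quality checks: enforce completeness and validity rules
    before = len(df)
    df = df.dropna(subset=["customer_id", "amount"])  # completeness
    df = df[df["amount"] >= 0]                        # validity
    rejected = before - len(df)

    # Load: append validated rows to the warehouse staging table
    with sqlite3.connect(db_path) as conn:
        df.to_sql("stg_orders", conn, if_exists="append", index=False)
    return rejected
```

In a production setting the same structure would typically be wrapped in an orchestrator task (e.g. an Airflow operator) with the rejected-row count emitted as a monitoring metric.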
Required Skills & Qualifications :
Technical Skills
- Strong expertise in SQL, including complex queries, joins, aggregations, performance tuning, and database optimization
- Proficiency in Python for data processing, automation, and scripting (experience with libraries such as Pandas, NumPy, SQLAlchemy is a plus)
- Solid understanding of relational databases, data warehousing concepts, and data modeling
- Experience with ETL tools or frameworks such as Informatica, Talend, SSIS, Apache Airflow, AWS Glue, or Azure Data Factory (preferred)
- Experience with version control systems (Git)
Education & Experience
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5–10 years of hands-on experience in ETL development, data engineering, or related roles