Job Title: Senior ETL Developer
Location: North Carolina
MOH: Onsite / W2
Experience Level: 10–15 years

Job Summary
We are seeking a highly experienced ETL Developer to design, develop, and optimize enterprise-level data integration solutions. The ideal candidate will have a deep understanding of ETL processes, data warehousing, and performance tuning, along with experience working in complex data ecosystems. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate with cross-functional teams to ensure the accuracy, integrity, and performance of data pipelines.

Key Responsibilities
- Design, develop, and maintain robust ETL processes to support data movement, transformation, and loading across multiple systems.
- Analyze business and data requirements to translate them into technical specifications and ETL workflows.
- Optimize ETL performance, including data loading, transformation logic, and job scheduling.
- Work closely with data architects, analysts, and application teams to ensure consistent and accurate data delivery.
- Implement best practices for data quality, validation, and governance.
- Troubleshoot and resolve issues in existing ETL jobs and data pipelines.
- Develop and maintain technical documentation for ETL design, mappings, and workflows.
- Participate in data migration and modernization initiatives, including cloud data integration.
- Ensure compliance with data security, privacy, and audit standards.
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred).
- 10–15 years of experience in ETL development and data warehousing.
- Strong proficiency in one or more ETL tools such as Informatica PowerCenter, Talend, SSIS, DataStage, or Azure Data Factory.
- Advanced SQL and PL/SQL skills with strong database experience (Oracle, SQL Server, Snowflake, or similar).
- Hands-on experience with data modeling, data integration, and data quality frameworks.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data architectures (e.g., data lakes, data mesh).
- Strong understanding of data warehousing concepts (star/snowflake schemas, slowly changing dimensions, etc.).
- Experience with scripting languages (Python, Shell, etc.) for automation.
- Excellent analytical, problem-solving, and communication skills.

Preferred Skills
- Experience with real-time data integration and streaming platforms (Kafka, AWS Glue, etc.).
- Exposure to DevOps, CI/CD pipelines, and version control tools (Git, Jenkins).
- Working knowledge of data governance and metadata management tools.
- Background in finance, healthcare, or retail domains is a plus.