Job Title: Senior Data Engineer
Location: Columbia, SC (Hybrid)
Duration: 12+ Months
Position Type: Contract

Role Summary:
As a Senior Data Engineer, you will be a core member of the Team, responsible for designing, developing, and implementing enterprise-scale data solutions. This position requires a strong foundation in data architecture, data modeling, and modern data engineering practices. You will collaborate with cross-functional teams to deliver scalable, secure, and efficient data systems that support the agency's strategic objectives.
Key Responsibilities:
- Represent the Data Operations Hub in internal and external technical forums.
- Lead the architecture and development of data solutions (e.g., data lakes, data warehouses).
- Collaborate with stakeholders to ensure data products meet business requirements.
- Evaluate, recommend, and adopt emerging data technologies and best practices.
- Design and maintain architectural models and supporting documentation.
- Act as a subject matter expert for data architecture and data engineering.
- Implement modern data pipelines and solutions using Python, Java, Airflow, etc.
- Develop reusable frameworks to improve data integration and transformation processes.
- Contribute to internal data engineering standards and practices.
- Optimize system performance and resolve data-related issues proactively.
Required Skills & Experience:
- 10+ years of experience as a Data Engineer or in a similar technical leadership role.
- 8+ years of hands-on experience in Python, Java development, and system design.
- 10+ years of experience with architecture principles in data systems.
- 6+ years of experience in data modeling, data integration, and cloud data services.
- 8+ years of experience with AWS or Azure cloud platforms.
- 10+ years of experience in SQL and cloud-native data warehouses (e.g., Snowflake, Athena, Postgres).
Preferred Qualifications:
- AWS or Azure Cloud Practitioner or Developer certification.
- Prior experience with Medicaid or other healthcare programs.
- Familiarity with data mesh, data lake architecture, and domain-driven data design.
Tools & Technologies:
- Languages: Python, Java, SQL
- Tools: Sparx (or equivalent data modeling tools), Airflow
- Platforms: AWS or Azure, Snowflake, Athena, Postgres