Data Developer with Snowflake, P&C Insurance, Data Engineering, Python, and Talend
Role Overview
We are seeking an experienced Data Developer / Data Engineer with strong insurance domain expertise to design, develop, and maintain scalable data solutions supporting analytics, reporting, and operational use cases. This role requires hands-on experience with modern cloud data platforms, including Snowflake, strong programming skills in Python, and a deep understanding of insurance data.
The ideal candidate will bridge technical data engineering capabilities with insurance business knowledge to deliver high-quality, reliable data assets.
Key Responsibilities
- Design, build, and maintain end-to-end data engineering pipelines (ETL / ELT) for insurance data
- Develop and optimize data solutions using Snowflake as a cloud data warehouse
- Use Python to support data ingestion, transformation, automation, and orchestration
- Integrate data from core insurance systems (Policy, Claims, Billing, Underwriting, Reinsurance)
- Model insurance data for analytical, actuarial, financial, and regulatory reporting use cases
- Write complex, high-performance SQL queries and transformations
- Ensure data quality, validation, lineage, and governance standards
- Collaborate with business stakeholders, analysts, and architects to translate insurance requirements into technical solutions
- Troubleshoot and resolve data pipeline, performance, and data integrity issues
- Document data models, pipelines, and best practices
Required Qualifications
- 5 years of experience in Data Development or Data Engineering roles
- Strong insurance domain experience (P&C, Life, Health, or Specialty Insurance)
- Hands-on experience with Snowflake (data modeling, performance tuning, security, cost optimization)
- Advanced SQL skills
- Proficiency in Python for data processing and automation
- Experience building and maintaining scalable data pipelines
- Strong understanding of insurance data concepts (policies, premiums, claims, losses, exposures)
- Experience working with large, complex datasets
- Strong analytical, troubleshooting, and communication skills
Preferred / Nice-to-Have Skills
- Cloud platforms: Azure, AWS, or GCP
- Azure Data Factory, Azure Data Lake, Databricks, Synapse
- Experience with orchestration tools (Airflow, Azure Data Factory, dbt, or similar)
- Familiarity with BI and reporting tools (Power BI, Tableau, Looker)
- Experience with insurance platforms such as Guidewire, Duck Creek, Majesco
- Knowledge of data governance, metadata management, and regulatory reporting
- Experience working in Agile / Scrum environments
Education
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience)