Location: NJ is preferred, but remote is acceptable.
Job Title: Databricks Architect
Job Summary
Databricks Architect with expertise in building scalable data solutions on Azure
- P&C Insurance (commercial lines) domain knowledge is required.
- Experience modernizing enterprise data platforms from legacy systems to the Azure cloud, covering both solution design and hands-on execution
- Deep knowledge of and experience with data modelling (OLAP and OLTP), data lakes, data warehousing, and ETL/ELT pipelines, both on legacy on-premises platforms and on Azure
- Lead migration and modernization of legacy ETL processes from SSIS, SSRS, and SQL Server to cloud-native solutions.
- Design and optimize data workflows for ingestion, transformation, and analytics using Azure-native services
- Design and build data pipelines and solutions using Databricks (PySpark and Spark SQL) on Azure
- Experience building Medallion architecture-based data estates
- Experience building Delta Lake-based lakehouses on Databricks using Delta Live Tables (DLT), PySpark jobs, and Databricks Workflows
- Proficient in SQL, Python, PySpark, ADF, and the Azure stack
- Working knowledge of Git, CI/CD, and VS Code
- Proficient in the Azure data ingestion stack
- Implementation experience with key data concepts such as CDC (Change Data Capture), streaming and/or batch ingestion, pull vs. push paradigms, and source-to-target mapping
- Collaborate with cross-functional teams to gather requirements and deliver scalable, secure, and high-performance data solutions.
- Strong semantic layer modelling and implementation experience
- Establish best practices for data governance, lineage, and quality across hybrid environments.
- Provide technical leadership and mentoring to data engineers and developers.
- Monitor and troubleshoot performance issues across Databricks and Azure services.
- Understanding of key reporting stacks such as Power BI, Tableau, and Excel BI add-ins is a plus
Certifications
Databricks or Azure Certified Data Engineer Associate certification is a plus