A company is looking for a Data Engineer to support a post-merger integration between two tier-1 financial institutions.
Key Responsibilities
Design and implement low-latency streaming pipelines using Apache Flink and Kafka for banking transactions
Manage data extraction from legacy systems using Postgres WAL-based Change Data Capture (CDC) to ensure data consistency
Build and optimize Lakehouse architectures using Databricks and Snowflake, ensuring compliance with applicable financial regulations
Required Qualifications
Minimum 5 years of professional experience in Data Engineering, preferably in Financial Services or Fintech
Expert-level proficiency in Apache Flink, Kafka, and Databricks
Strong hands-on experience with Postgres (CDC) and Snowflake data warehousing
Deep understanding of Delta Lake and Lakehouse design principles
Proficiency in Python, Scala, or Java for streaming applications
Data Engineer • Raleigh, North Carolina, United States