About FAC Services
Want to build your career helping those who build the world?
At FAC Services, we handle the business side so architecture, engineering, and construction firms can focus on shaping the future. Our trusted, high-quality solutions empower our partners, and our people, to achieve excellence with integrity, precision, and a personal touch.
Job Purpose
FAC Services is expanding our Quality Assurance team with a strategic hire: a Data Quality Engineer. You will build and operate the quality assurance controls that keep our Data Mart trustworthy. You will design automated data quality tests, implement schema drift detection, and own release gating in CI / CD so only conformant datasets reach Gold. You'll instrument observability (freshness, integrity, SLOs) and validate API contracts for datasets exposed via REST, partnering closely with the Data Engineer and the Database Administrator.
This is a hybrid position, and candidates must reside within 60 miles of Madison, WI.
Primary Responsibilities
Automated Data-Quality Testing (ELT / ETL)
- Design unit, integration, and regression tests for pipelines and marts. Gate promotions based on test outcomes.
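For illustration only, a minimal sketch of the kind of automated check and promotion gate this role would build. Table and column names are hypothetical, and an in-memory SQLite database stands in for the mart (real checks would run against the warehouse, e.g. via Great Expectations or pytest):

```python
import sqlite3

def check_not_null_and_unique(conn, table, key_column):
    """Data-quality assertions for a key column: no NULLs, no duplicates."""
    null_count = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    # COUNT(*) minus COUNT(DISTINCT key) counts surplus (duplicate) rows.
    duplicate_count = conn.execute(
        f"SELECT COUNT(*) - COUNT(DISTINCT {key_column}) FROM {table}"
    ).fetchone()[0]
    return {"nulls": null_count, "duplicates": duplicate_count}

def gate_promotion(results):
    """Release gate: promote only if every check is clean."""
    return all(count == 0 for count in results.values())

# Hypothetical mart table, in-memory for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_project (project_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO dim_project VALUES (?, ?)",
    [(1, "HQ Tower"), (2, "Bridge Retrofit"), (2, "Bridge Retrofit")],
)

results = check_not_null_and_unique(conn, "dim_project", "project_id")
passed = gate_promotion(results)  # False: the duplicate key blocks promotion
```

In a CI / CD pipeline, a failing gate like this would block the deployment stage rather than merely log a warning.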
Schema-Drift Detection & Controlled Evolution
- Detect added / removed / renamed columns and datatype changes; enforce fail-fast behavior with documented exceptions and coordinated fixes. Partner with the Data Engineer to coordinate safe schema evolution and promotion.
API Contract Validation
- For datasets exposed via REST, validate Swagger / OpenAPI conformance, authentication / authorization, pagination, rate limits, and error handling (e.g., Postman / Newman).
Observability & Telemetry
- Stand up dashboards / alerts (Azure Monitor, Log Analytics / KQL) for freshness and data SLOs; triage incidents and drive root-cause analysis.
Runbooks & Backfill / Replay
- Author runbooks for incident response, CDC window recovery, and replay / backfill procedures to meet SLOs without duplication.
Release Gating in CI / CD
- Implement validation gates in Git-based workflows (Azure DevOps / GitHub). Block nonconformant deployments and coordinate remediation with engineering.
Collaboration & Handoff
- With the DBA: Assert source trust (CDC / Change Tracking windows, anomalies). Block promotion when upstream data cannot be trusted; document exceptions and reopen gates only after resolution.
- With the Data Engineer: Integrate test hooks into pipelines / notebooks; agree on fix criteria and rerun policies; provide defect patterns to improve frameworks and performance.
Qualifications
To perform this job successfully, an individual must be able to perform each primary duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and / or ability required.
Experience (Required)
- 3+ years in Quality Assurance or data testing for pipelines and infrastructure
- Data Quality; Data Validation
- ELT (Extract, Load, Transform)
- Azure Data Factory (or Microsoft Fabric Data Factory)
- Great Expectations or pytest (or equivalent) for automated data quality assertions
- Advanced SQL (T-SQL) and Python
- CI / CD (Azure DevOps / GitHub Actions)
- Azure Monitor; Kusto Query Language (KQL, or equivalent log-query experience)
- Swagger / OpenAPI (API contract checks); Postman / Newman
Experience (Preferred)
- Apache Spark / PySpark (test hooks in notebooks)
- Microsoft Purview (lineage / labels integration with quality tests)
- Infrastructure as Code (Bicep / ARM / Terraform) to provision QA / monitoring resources: Key Vault, Managed Identity, ephemeral compute
- Power BI (semantic model spot checks, RLS alignment)
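As a sketch of the fail-fast schema-drift detection described in the responsibilities above (the contract, table, and column names here are hypothetical), the core comparison of a registered schema contract against the schema observed at load time might look like:

```python
# Hypothetical registered contract for a dataset: column name -> datatype.
EXPECTED = {"project_id": "INTEGER", "name": "TEXT", "budget": "REAL"}

def detect_drift(expected, observed):
    """Classify drift between a contract and an observed schema:
    columns added, columns removed, and columns whose type changed."""
    added = sorted(set(observed) - set(expected))
    removed = sorted(set(expected) - set(observed))
    retyped = sorted(
        col for col in set(expected) & set(observed)
        if expected[col] != observed[col]
    )
    return {"added": added, "removed": removed, "retyped": retyped}

def enforce(drift, allowed_exceptions=()):
    """Fail fast: block promotion unless every drifted column
    is covered by a documented exception."""
    drifted = [col for changes in drift.values() for col in changes]
    violations = [col for col in drifted if col not in allowed_exceptions]
    if violations:
        raise RuntimeError(f"Schema drift blocks promotion: {violations}")

# Schema observed at load time: one column added, one removed.
observed = {"project_id": "INTEGER", "name": "TEXT", "region": "TEXT"}
drift = detect_drift(EXPECTED, observed)
# drift == {"added": ["region"], "removed": ["budget"], "retyped": []}

# Promotion proceeds only because both changes have documented exceptions.
enforce(drift, allowed_exceptions=("region", "budget"))
```

Without the documented exceptions, `enforce` raises and the pipeline's release gate would block the deployment until the change is coordinated with the Data Engineer.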