Senior Data & Integration Engineer – Onsite | Denver, CO
$140,000–$180,000 + Bonus | Full-Time | Exempt

A fast-growing nationwide real estate organization is seeking a Senior Data & Integration Engineer to build and own the data infrastructure that powers its expanding technology ecosystem. This is a full-time, onsite role (Monday–Friday) working directly with the CTO in a high-velocity environment where every engineer leverages AI tools as part of their daily workflow.
We care about what you can do and what you've shipped, not where you learned to do it.
About the Role
This role is responsible for creating the centralized data foundation that connects internal systems, external vendors, financial models, and AI-driven document pipelines into a single source of truth. You’ll integrate operational systems, market data providers, and internal analytics to build a robust, scalable data layer that supports all applications across the organization.
You’ll build data pipelines across three levels of complexity:
1. Pre-Built Connectors (Low Complexity): Use platform connectors to integrate systems such as property management platforms, market data sources, and credit/risk data feeds. The work involves configuration, mapping, and validation rather than writing custom connectors.
2. Structured Uploads (Medium Complexity): Handle ingestion of financial models, deal trackers, and construction budgets via Excel/CSV. Define templates, manage SFTP delivery, and configure validation workflows (see the sketch after this list).
3. AI-Powered Document Extraction (High Complexity): Build pipelines for PDFs (offering memorandums, leases, rent rolls, insurance certificates) using AI-based extraction, validation, and ingestion into the unified data layer.
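To make the medium-complexity tier concrete, below is a minimal sketch of the kind of template validation an uploaded CSV might pass through before ingestion. The column names and checks are hypothetical illustrations, not the platform's actual template.

```python
# Minimal sketch of template-driven validation for a structured upload.
# REQUIRED_COLUMNS and the checks below are hypothetical, for illustration.
import pandas as pd

REQUIRED_COLUMNS = {"property_id", "period", "line_item", "amount"}

def validate_upload(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Upload rejected: missing columns {sorted(missing)}")
    # Coerce amounts to numeric and reject rows that fail, so bad data
    # never reaches the centralized data layer.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    bad = int(df["amount"].isna().sum())
    if bad:
        raise ValueError(f"Upload rejected: {bad} rows have non-numeric amounts")
    return df
```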
This is an infrastructure-first role, not a dashboarding position. Your work enables:
- Real-time data delivery to internal applications
- AI tools to generate investment analyses
- Deal scoring and market comparison tools
- Automated variance detection between underwriting assumptions and actual performance
You’ll collaborate across teams and have direct access to leadership, making communication skills and a team-first mentality essential.
What You’ll Do
- Learn the unified real estate data platform, its data model, GraphQL API, and integration modules
- Configure pipelines from operational systems, mapping source schemas into standardized property, lease, tenant, unit, and financial models
- Integrate underwriting and discounted cash flow (DCF) data; build variance comparisons between assumptions and actual performance (see the first sketch after this list)
- Deploy and manage AI-driven document extraction pipelines for complex PDF documents
- Onboard market data sources (comps, credit data, climate/hazard data) using platform connectors
- Build custom ingestion pipelines for semi-structured data via SFTP
- Redirect internal applications to read from the centralized data layer
- Set up monitoring, alerting, and automated data-quality checks (see the second sketch after this list)
- Extend the data model to support new business needs
- Lead the organization’s maturity from manual ingestion to fully automated workflows
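As a rough illustration of the variance-comparison work above, here is a small, self-contained sketch. The line items, threshold, and output format are invented for illustration; the platform's actual logic will differ.

```python
# Hypothetical sketch: flag line items whose actuals deviate from
# underwriting assumptions by more than a threshold.
def variance_report(assumed: dict[str, float], actual: dict[str, float],
                    threshold: float = 0.10) -> list[str]:
    flags = []
    for item, expected in assumed.items():
        observed = actual.get(item)
        if observed is None or expected == 0:
            continue  # no actuals yet, or nothing to compare against
        pct = (observed - expected) / abs(expected)
        if abs(pct) > threshold:
            flags.append(f"{item}: assumed {expected:,.0f}, "
                         f"actual {observed:,.0f} ({pct:+.1%})")
    return flags

print(variance_report({"noi": 1_200_000}, {"noi": 1_020_000}))
# ['noi: assumed 1,200,000, actual 1,020,000 (-15.0%)']
```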
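And as one example of an automated data-quality check, here is a hedged sketch of a freshness test against a PostgreSQL data layer. The driver choice, table, column, and alerting hook are assumptions for illustration.

```python
# Hedged sketch of a freshness check; assumes psycopg2 as the Postgres driver.
import psycopg2

def check_freshness(conn, table: str, column: str, max_hours: float = 24.0) -> bool:
    """Alert when the newest row in `table` is older than `max_hours`."""
    with conn.cursor() as cur:
        # table/column come from trusted pipeline config, not user input
        cur.execute(
            f"SELECT EXTRACT(EPOCH FROM now() - max({column})) / 3600 FROM {table}"
        )
        age_hours = cur.fetchone()[0]
    if age_hours is None or age_hours > max_hours:
        print(f"ALERT: {table}.{column} is stale")  # wire into real alerting
        return False
    return True

conn = psycopg2.connect("dbname=warehouse")  # hypothetical DSN
check_freshness(conn, "leases", "updated_at")
```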
Ongoing Responsibilities
- Monitor, debug, and maintain pipelines across all systems
- Handle schema changes from both source systems and the centralized data platform
- Onboard new data sources as the organization grows
- Manage and extend the API layer serving data to internal applications
- Implement caching and fallback logic for resilience
- Maintain documentation for pipelines, integration patterns, and data models
- Stand up and maintain vector databases for semantic search across documents and investment materials (a sketch follows this list)
- Improve sync frequency, monitoring coverage, and data quality across all sources
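To illustrate the vector-database item above, here is a hedged sketch of a semantic-search query using pgvector (one of the stores named under Bonus Points). The table, columns, and DSN are hypothetical, and embeddings are assumed to be computed upstream by a separate pipeline.

```python
# Hedged sketch: k-nearest-neighbor search over document chunks with pgvector.
import psycopg2

def search_documents(conn, query_embedding: list[float], k: int = 5):
    """Return the k document chunks nearest to the query embedding."""
    vec_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT doc_id, chunk_text
            FROM document_chunks               -- hypothetical table
            ORDER BY embedding <-> %s::vector  -- pgvector L2-distance operator
            LIMIT %s
            """,
            (vec_literal, k),
        )
        return cur.fetchall()
```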
You May Be a Good Fit If You…
- Have 3+ years of experience building production data pipelines integrating enterprise systems
- Are proficient in Python and SQL (PostgreSQL preferred)
- Have experience with REST and GraphQL APIs
- Have built ETL/ELT pipelines using Airflow, Dagster, Prefect, dbt, etc.
- Treat data quality, validation, and monitoring as core engineering responsibilities
- Use AI coding tools (Claude, Cursor, Copilot, etc.) as part of your daily workflow
- Can take ambiguous integration requirements and deliver reliable solutions
- Write clear documentation and communicate effectively with both technical and non-technical teams
- Bring a positive, collaborative attitude
Bonus Points
- Experience with real estate systems or market data providers
- Experience with unified data aggregation platforms
- Familiarity with vector databases (pgvector, Pinecone, Weaviate)
- Background in real estate, financial services, or proptech data engineering
- Experience with distributed compute (Spark, Flink) or cloud data warehouses (BigQuery, Snowflake, Redshift)
- Experience building document-extraction or OCR pipelines
- Experience designing GraphQL APIs and schemas, not just consuming them
What We Don’t Require
- FAANG experience
- Prior real estate knowledge (you’ll learn quickly; the data models are intuitive once you’ve seen them)
Applicants must be legally authorized to work in the United States. Sponsorship is not available for this position.