This role is an office position based in Louisville, Kentucky.
What the Role Is
The Data Engineer will build and manage data pipelines from primary sources, both external and internal, into central data repositories, and from there to usable, business-facing, enterprise-ready data sources. This position will work within a scrum team and will use Asana to plan and document work.
The Data Engineer is expected to understand large language models and have experience using them to drive data engineering solutions, including IDE-integrated and external LLM-based coding tools. This position may also design and support language-model-driven solutions for end-user reporting and analytical needs in partnership with our AI program.
How You Will Spend Your Time
- Build, maintain, and optimize commercial analytics data products and models that support FP&A, Sales, and other commercial teams.
- Collaborate with the Commercial scrum, IT, and business stakeholders to define requirements, prioritize work, and deliver high-quality data products.
- Enterprise-Ready Data Sources
- Curate enterprise-ready data sources for consumption in Power BI for business self-service analytics, as well as ingestion into data science workflows, web applications, and R Shiny or similar tools.
- Focus on commercial data products that integrate internal financial and sales data with external syndicated sources to support forecasting, performance measurement, and decision-making.
- Data Pipelines (ETL/ELT)
- Write data pipelines from external and internal commercial data sources into central data locations (data lake / Fabric lakehouse) using SQL, R, and/or Python.
- Transition legacy OLAP- and SSIS-based workflows into well-documented, transparent, and fully reproducible ETL/ELT environments likely driven by notebook (code + text) solutions within Fabric and related tooling.
- Write data pipelines from central repositories to enterprise-ready data sources for consumption by web apps, data scientists, analytics professionals, and Power BI semantic models.
- Work with transparency, best practices, and documentation as guiding principles.
- Leverage integrated large language models (LLMs) and AI-assisted tooling to accelerate coding, documentation, testing, and ideation of data and analytics solutions.
- Other duties as required to support changing business needs.
Who You Are…