At Trinity Industries, we don't just build railcars and deliver logistics; we shape the future of industrial transportation and infrastructure. As a Senior Forward Deployed Engineer, you'll be on the front lines deploying Palantir Foundry solutions directly into our operations, partnering with business leaders and frontline teams to transform complex requirements into intuitive, scalable solutions. Your work will streamline manufacturing, optimize supply chains, and enhance safety across our enterprise. This is more than a coding role; it's an opportunity to embed yourself in the heart of Trinity's mission, solving real-world challenges that keep goods and people moving across North America.
Join our team today and be a part of Delivering Goods for the Good of All!
What you'll do:
End-to-End Solution Delivery:
- Autonomously lead the design, development, and deployment of scalable data pipelines, full applications, and workflows in Palantir Foundry, integrating with cloud platforms (e.g., AWS, Azure, GCP) and external sources (e.g., Snowflake, Oracle, REST APIs). Ensure solutions are reliable, secure, and compliant with industry standards (e.g., GDPR, SOX), while handling ambiguity and delivering on-time results in high-stakes environments. Demonstrate deep expertise in Foundry's ecosystem to independently navigate and optimize complex builds.
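For illustration only, not an additional requirement: a minimal sketch of the kind of external-source integration this role involves, pulling JSON records from a hypothetical REST endpoint into a Spark DataFrame. The URL, the `results` response key, and the field names are assumptions; in practice this ingestion would typically run through Foundry's governed connection tooling rather than an ad hoc script.

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("external-ingest-sketch").getOrCreate()

# Hypothetical vendor endpoint; a real deployment would use a registered,
# credentialed source rather than a hard-coded URL.
API_URL = "https://api.example.com/v1/work-orders"

def fetch_work_orders(url: str) -> list:
    """Fetch one page of work-order records from the external API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # Assumes the API wraps its records in a top-level "results" key.
    return response.json()["results"]

records = fetch_work_orders(API_URL)
df = spark.createDataFrame(records)  # schema inferred from the JSON records
df.show(5)
```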
Full Application and Workflow Development:
Build comprehensive, end-to-end applications and automated workflows using Foundry modules such as Workshop, Slate, Quiver, Contour, and Pipeline Builder. Focus on creating intuitive, interactive user experiences that integrate front-end interfaces with robust back-end logic, enabling seamless operational tools like real-time supply chain monitoring systems or AI-driven decision workflows. Go beyond data models to deliver fully functional, scalable solutions.
Data Modeling and Transformation for Advanced Analytics:
Architect robust data models and ontologies in Foundry to standardize and integrate complex datasets from manufacturing and logistics sources. Develop reusable transformation logic using PySpark, SQL, and Foundry tools (e.g., Pipeline Builder, Code Repositories) to cleanse, enrich, and prepare data for advanced analytics, enabling predictive modeling, AI-driven insights, and operational optimizations such as cost reductions or efficiency gains. Focus on creating semantic integrity across domains to support proactive problem-solving and game-changing outcomes.
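For illustration only: a minimal sketch of the reusable transformation logic described above, written against Foundry's Python transforms API as used in Code Repositories. The dataset paths, column names, and join key are hypothetical placeholders.

```python
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


# Hypothetical dataset paths and schema, shown only to illustrate a
# cleanse-and-enrich transform; real paths and columns would differ.
@transform_df(
    Output("/Trinity/clean/railcar_shipments"),
    shipments=Input("/Trinity/raw/railcar_shipments"),
    plants=Input("/Trinity/reference/plants"),
)
def clean_shipments(shipments, plants):
    # Standardize types, drop invalid rows, then enrich with plant metadata.
    cleaned = (
        shipments
        .withColumn("ship_date", F.to_date("ship_date", "yyyy-MM-dd"))
        .withColumn("quantity", F.col("quantity").cast("int"))
        .filter(F.col("quantity") > 0)
    )
    return cleaned.join(plants, on="plant_id", how="left")
```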
Dashboard Development and Visualization:
Build interactive dashboards and applications using Foundry modules (e.g., Workshop, Slate, Quiver, Contour) to provide real-time KPIs, trends, and visualizations for business stakeholders. Leverage these tools to transform raw data into actionable insights, such as supply chain monitoring or performance analytics, enhancing decision-making and user adoption.
AI Integration and Impact:
Elevate business transformation by designing and implementing AIP pipelines and integrations that harness AI/ML for high-impact applications, such as predictive analytics in leasing and logistics, anomaly detection in manufacturing, or automated decision-making in supply chains. Drive transformative innovations through AIP's capabilities, integrating Large Language Models (LLMs), TensorFlow, PyTorch, or external APIs to deliver bottom-line results.
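For illustration only: a deliberately simple anomaly-detection sketch in PySpark, flagging readings more than three standard deviations from a station's mean. It stands in for the richer AIP- and model-driven detection described above; the station names and readings are made up.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("anomaly-sketch").getOrCreate()

# Hypothetical temperature readings from manufacturing stations.
readings = spark.createDataFrame(
    [("weld_cell_1", 71.2), ("weld_cell_1", 70.8), ("weld_cell_1", 94.5),
     ("weld_cell_2", 65.0), ("weld_cell_2", 64.7), ("weld_cell_2", 65.3)],
    ["station_id", "temperature_c"],
)

# Per-station mean and standard deviation, then a simple 3-sigma flag.
stats = readings.groupBy("station_id").agg(
    F.mean("temperature_c").alias("mean_t"),
    F.stddev("temperature_c").alias("std_t"),
)
flagged = readings.join(stats, "station_id").withColumn(
    "is_anomaly",
    F.abs(F.col("temperature_c") - F.col("mean_t")) > 3 * F.col("std_t"),
)
flagged.show()
```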
Leadership and Collaboration:
Serve as a lead FDE on the team, collaborating with team members through hands-on guidance, code reviews, workshops, and troubleshooting. Lead by example in fostering a culture of efficient Foundry building and knowledge-sharing to scale team capabilities.
Business Domain Strategy and Innovation:
Develop a deep understanding of Trinity's industrial domains (e.g., leasing financials, manufacturing processes, supply chain logistics) to anticipate stakeholder needs, often before stakeholders articulate them. Propose and implement disruptive solutions that drive long-term productivity, retention, and business transformation, incorporating interoperable cloud IDEs such as Databricks for complementary data processing and analytics workflows.
Collaboration and Stakeholder Engagement:
Work cross-functionally with senior leadership and teams to gather requirements, validate solutions, and ensure trustworthiness in high-stakes projects.
What you'll bring:
- Bachelor's degree in Computer Science, Engineering, Data Science, Financial Engineering, Econometrics, or a related field required (Master's preferred)
- 8+ years of hands-on experience in data engineering, with at least 4 years specializing in Palantir Foundry (e.g., Ontology, Pipelines, AIP, Workshop, Slate), demonstrating deep, autonomous proficiency in building full applications and workflows
- Proven expertise in Python, PySpark, SQL, and building scalable ETL workflows, with experience integrating with interoperable cloud IDEs such as Databricks
- Demonstrated ability to deliver end-to-end solutions independently, with strong evidence of quantifiable impacts (e.g., built a pipeline reducing cloud services expenditures by 30%)
- Strong business acumen in industrial domains like manufacturing, commercial leasing, supply chain, or logistics, with examples of proactive innovations
- Experience collaborating with team members and leadership in technical environments
- Excellent problem-solving skills, with a track record of handling ambiguity and driving results in fast-paced settings
Preferred Qualifications:
- Certifications in Palantir Foundry (e.g., Foundry Data Engineer, Application Developer)
- Experience with AI/ML integrations (e.g., TensorFlow, PyTorch, LLMs) within Foundry AIP for predictive analytics
- Familiarity with CI/CD tools and cloud services (e.g., AWS, Azure, Google Cloud)
- Strongly desired: hands-on experience with enterprise visualization platforms such as Qlik, Tableau, or Power BI to enhance dashboard development and analytics delivery (not required, but a significant plus for integrating with Foundry tools)