Overview
UW Credit Union is adding a talented Senior Data Engineer (internally known as a Data Engineer II) to our dynamic and developing Data Analytics team.
Are you a skilled and motivated Data Engineer with expert SQL skills, looking to make a significant impact?
As a Senior Data Engineer, you'll play a pivotal role in building and operationalizing data pipelines using DBT and Snowflake technologies within our cloud-native data platform. If you're passionate about working with cutting-edge technologies, collaborating with cross-functional teams, and driving data-driven initiatives from the ground up, this role could be your perfect opportunity.
Join UW Credit Union and be at the forefront of our developing Data Analytics team, driving the creation and operationalization of data pipelines necessary for our data and analytics initiatives. You'll work closely with our business units, data analysts and data scientists to design, develop, and optimize data pipelines, ensuring high-quality, reliable data for various analytics use cases. This role requires a mix of technical expertise, creative problem-solving, and collaboration.
Why Work for UW Credit Union?
Join one of Wisconsin’s premier financial institutions, a 2023 Top Workplaces USA and multi-year recipient of Madison Magazine’s Best Places to Work, Wisconsin State Journal’s Top Workplaces, and Milwaukee Journal Sentinel’s Top Workplaces, to receive:
- 21.5+ days of annual paid time off
- 2 weeks paid caregiver leave
- 2.5 weeks paid new child parental leave
- 2 days paid volunteer time
- 11 paid holidays (includes your birthday!)
- 401k company match of up to 5%, plus approximately 4% discretionary match
- Variable bonus reward
- Competitive Medical, Dental and Vision plans, including domestic partner eligibility
- Free bus pass and Bublr/BCycle membership
- Employee Assistance Program
- Hybrid work environment
- Salary: $105,000 - $125,000; midpoint $118,000
- And more!
Responsibilities
What You’ll Do
- Engage in scrum ceremonies, including Daily Standups, Sprint Planning, Retrospectives, and Demos.
- Collaborate daily with your development team, sharing progress and offering support to one another.
- Design, develop, and maintain highly scalable and extensible data pipelines from internal and external sources.
- Test and validate models developed by fellow team members to ensure solutions have high data quality and follow development standards.
- Collaborate with cross-functional teams to design, develop, and deploy data-driven applications and products.
- Optimize data queries, models, and storage formats to support common usage patterns.
- Work with the data science team to enhance machine learning algorithms and platforms.
- Define and automate data quality checks and develop data solutions, including data visualization tools.
- Contribute to the definition and management of data governance standards and policies.
- Participate in prototyping emerging technologies for data ingestion, transformation, and distributed file systems within a cloud-native big data platform.
- Stay up to date with industry trends and best practices in data engineering and cloud computing.
- Support nightly and weekend on-call activities.

Qualifications
What You’ll Need to Succeed
These skills and experiences are essential to your success:
- Bachelor’s degree in Information Systems, Computer Science, Data Science, Business, or a related field; relevant work experience may be considered in lieu of educational requirements
- 4+ years of relevant work experience in the data engineering space on a cloud data platform
- Demonstrated experience providing customer-driven solutions, support, and service
- In-depth knowledge of SQL and demonstrated ability to write SQL optimized for MPP systems
- Ability to employ design patterns and generalize code to address common use cases
- Capable of authoring robust, high-quality, reusable code and contributing to the division’s inventory of libraries
- Expertise and demonstrated experience developing distributed data processing solutions, including batch and streaming
- Knowledge of object-oriented programming languages and scripting languages
- Ability to work 3 days a week in office (Monday, Wednesday & Thursday)

These attributes and knowledge are preferred but not required:
- Experience working with the following tools: Snowflake, DBT, Azure (Data Factory, Storage Accounts), Python, Azure DevOps (Team Boards and Repos)
- Experience working in the financial services industry