This is a hybrid role based in Independence, OH, with one day per week onsite required.
Candidates willing to relocate are encouraged to apply, and relocation assistance is available.
Please note, we are unable to transfer or sponsor visas at this time.

KORE1, a nationwide provider of staffing and recruiting solutions, has an immediate opening for a
Lead Data Engineer: Big data (Hadoop), AWS & automated deployments

The Data Tech Lead is a member of the Insurance Data Engineering Team. They will play a critical role on the team, developing data solutions to solve business and technical problems.
What You'll Do
- Lead the development & delivery of technical solutions
- Architect and implement solutions using big data platforms (Hadoop, Hive, HBase, Impala, Sqoop) and cloud technologies (AWS: EC2, S3, Lambda, Glue, Redshift, API Gateway)
- Understand complex business problems to ensure projects leverage the appropriate technology and analytical tools in delivering a comprehensive solution
- Understand business goals and needs while using big data and cloud technology to ingest, process, and analyze large amounts of data
- Understand how to transform findings into actionable business opportunities
- Understand the changing business needs of the organization/projects and recommend viable strategies for the future
- Author and/or review architecture/design and other technical documents, ensuring high-quality deliverables and systems development across tech stacks and application teams
What You'll Need
- Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field (required)
- Be prepared to discuss your project experience & your architecture / design methodologies in interviews
- 5+ years of data engineering/architecture experience, with 3-5+ years in a technical lead or architect role
- Hands-on experience with big data platforms: Hadoop, Hive, Impala, HBase, Sqoop, NoSQL databases
- Experience developing data pipelines, ETL/ELT processes, and data transformations
- Proficiency with AWS cloud services: EC2, S3, Lambda, Glue, Redshift, API Gateway; experience with cloud-native architectures
- Experience with data modeling, business logic implementation, and optimizing large-scale datasets
- Strong understanding of data architecture principles and ability to design scalable solutions
- Highly proficient in SQL
- Data Mining/Data Warehousing/Business Intelligence experience is a plus (Tableau, R)
- Experience with automated deployments, CI/CD, and source code/configuration management tools (GitHub, Jenkins, Terraform, CloudFormation)
- Excellent communication skills, able to convey technical concepts to business stakeholders and cross-functional teams
- Ability to think strategically, prioritize tasks, and see the 'big picture' while managing day-to-day deliverables
- Strong organizational skills, attention to detail, and ability to work independently or as a lead on multi-disciplinary projects
Compensation depends on experience but is typically between $142K and $189K, plus a 15% bonus. However, the client typically doesn't hire above the 80th percentile of the range, to leave room for merit increases with tenure; 80% of the top of the range would be $151,360.