Principal Data Architect
A BIT ABOUT WAVICLE
Wavicle Data Solutions is a founder-led, high-growth consulting firm helping organizations unlock the full potential of cloud, data, and AI. We're known for delivering real business results through intelligent transformation: modernizing data platforms, enabling AI-driven decision-making, and accelerating time-to-value across industries.
At the heart of our approach is WIT, the Wavicle Intelligence Framework. WIT brings together our proprietary accelerators, delivery models, and partner expertise into one powerful engine for transformation. It's how we help clients move faster, reduce costs, and create lasting impact, and it's where your ideas, skills, and contributions can make a real difference.
Our work is deeply rooted in strong partnerships with AWS, Databricks, Google Cloud, and Azure, enabling us to deliver cutting-edge solutions built on the best technologies the industry has to offer.
With over 500 team members across 42 cities in the U.S., Canada, and India, Wavicle offers a flexible, digitally connected work environment built on collaboration and growth.
We invest in our people through:
- Competitive compensation and bonuses
- Unlimited paid time off
- Health, retirement, and life insurance plans
- Long-term incentive programs
- Meaningful work that blends innovation and purpose
If you're passionate about solving complex problems, exploring what's next in AI, and being part of a team that values delivery excellence and career development, you'll feel right at home here.
THE OPPORTUNITY
Wavicle Data Solutions, LLC seeks a Principal Data Architect in Oak Brook, IL.
WHAT YOU WILL GET TO DO
- Provide top-quality solution design and implementation for clients.
- Design and execute data abstractions and integration patterns (APIs) to support complex distributed computing problems.
- Ensure that data security, governance, and compliance best practices are embedded into all solutions.
- Support scope definition and estimation for proposed solutions.
- Translate business requirements into scalable, cost-effective technology solutions that optimize data availability, performance, and usability.
- Work with client and engagement leaders to understand their strategic business objectives and align data architecture solutions accordingly.
- Identify gaps in data infrastructure, governance, or integration, and work with clients to resolve them promptly.
- Architect and implement top-quality data solutions using cloud (AWS, GCP, Azure), big data frameworks (Apache Spark, Kafka, Databricks), and modern data platforms (Snowflake, BigQuery, Redshift).
- Stay up to date with emerging technology trends in cloud data platforms, AI-driven analytics, and data architecture best practices to make recommendations that align with client needs.
- Participate in pre-sales activities, including proposal development, RFI/RFP responses, solution presentations, and shaping a solution from the client's business problem.
- Act as a thought leader in the industry by creating written collateral (white papers and POVs), speaking at events, and creating and participating in internal and external events.
- Mentor and guide data engineers, analysts, and solution architects on data engineering best practices, architecture frameworks, and cloud infrastructure.
WHAT YOU BRING TO THE TEAM
Bachelor's Degree in Computer Science, Information Technology, Data Science, Engineering, or a related field, plus 5 years of experience in related occupations.

5+ years of experience in the following:
- Professional development experience architecting and implementing Big Data solutions
- Scripting languages: Java, Scala, Python, or Shell Scripting

4+ years of experience in the following:
- Cloud: AWS, Azure, or GCP
- One or more of the following ETL tools: Informatica, Talend, IBM DataStage, Azure Data Factory, AWS Glue
- At least 2 of the following big data tools and technologies: Linux, Hadoop, Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce
- Performing complex data migration to and from disparate data systems/platforms as well as to/from the cloud: AWS, Azure, or GCP

3+ years of experience in the following:
- Data visualization tools: Power BI, Tableau, Looker, or similar

Telecommuting is permitted. 40 hours/week. Must also have authority to work permanently in the U.S. Applicants who are interested in this position may apply at www.jobpostingtoday.com, Ref #98372, for consideration.

BENEFITS
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Employee Assistance Program
- Training & Development
- Work From Home
- Bonus Program

EQUAL OPPORTUNITY EMPLOYER
Wavicle is an Equal Opportunity Employer and committed to creating an inclusive environment for all employees. We welcome and encourage diversity in the workplace regardless of race, color, religion, national origin, gender, pregnancy, sexual orientation, gender identity, age, physical or mental disability, genetic information or veteran status.
$160,555 - $181,000 a year