Job Title
Big Data Engineer with GCP exposure
Relevant Experience (in yrs)
8 yrs
Work Location (State, City and Zip)
Phoenix
Technical/Functional Skills
Big Data with GCP
Roles & Responsibilities
Big Data Engineer with a MapReduce, Spark, Hive, and SQL skill set
Expert in SQL and data warehousing concepts.
Hands-on experience with a public cloud data warehouse (GCP, Azure, or AWS).
GCP certification is a strong plus.
MapR experience is a must.
Strong hands-on experience with one or more programming languages (Python or Java).
Hands-on expertise with application design and software development in Big Data (Spark (PySpark), Hive).
Experience with CI/CD pipelines, automated test frameworks, DevOps, and source code management tools (XLR, Jenkins, Git, Maven).
Strong communication and analytical skills, including effective presentation skills.
Familiarity with Agile and Scrum ceremonies.
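To illustrate the MapReduce skill set named above, here is a minimal in-memory word-count sketch in pure Python. It is illustrative only: the function names and the sample documents are hypothetical, and a production job would run on a distributed engine such as Spark or Hadoop rather than locally.

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map step: emit a (word, 1) pair for every word in the document.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle_phase(pairs):
    # Shuffle step: group values by key, as the framework would
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce step: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

# Hypothetical sample input, standing in for a distributed dataset.
docs = ["big data with GCP", "big data engineer"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle_phase(pairs))
# "big" and "data" each appear twice across the two documents.
```

In PySpark the same pattern would typically be expressed through DataFrame operations (e.g. `groupBy` and `count`) rather than hand-written phases; the sketch above only shows the underlying map/shuffle/reduce model.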