6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
4+ years of experience with Python, with working knowledge of Notebooks.
4+ years of experience designing and building scalable data pipelines covering extraction, transformation, and loading.
4+ years of experience with one of the leading public clouds, including 2+ years with GCP.
2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
2+ years of experience with Kafka, Pub/Sub, Docker, and Kubernetes.
2+ years of experience in architecture design and documentation.
Troubleshoot and optimize data platform capabilities.
Ability to work independently, solve problems, and keep stakeholders updated.
Analyze, design, develop, and deploy solutions per business requirements.
Strong understanding of relational and dimensional data modeling.
Experience in DevOps and CI/CD related technologies.
Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.