Now Hiring: Senior Data Engineer (GCP / Big Data / ETL)
Location: Richardson, TX (on-site mandatory)
Duration: 6 months (possible extension)
Job Summary
We're seeking an experienced Senior Data Engineer with deep expertise in Data Warehousing, ETL, Big Data, and modern GCP-based data pipelines. This role is ideal for someone who thrives in cross-functional environments and can architect, optimize, and scale enterprise-level data solutions in the cloud.
Must-Have Skills (Non-Negotiable)
- 9+ years in Data Engineering & Data Warehousing
- 9+ years hands-on ETL experience (Informatica, DataStage, etc.)
- 9+ years working with Teradata
- 3+ years hands-on GCP and BigQuery
- Experience with Dataflow, Pub/Sub, Cloud Storage, and modern GCP data pipelines
- Strong background in query optimization, data structures, metadata & workload management
- Experience delivering microservices-based data solutions
- Proficiency in Big Data & cloud architecture
- 3+ years with SQL & NoSQL
- 3+ years with Python or similar scripting languages
- 3+ years with Docker, Kubernetes, and CI/CD for data pipelines
- Expertise in deploying and scaling applications in containerized environments (Kubernetes)
- Strong communication, analytical thinking, and the ability to collaborate across technical and non-technical teams
- Familiarity with Agile/SDLC methodologies
Key Responsibilities
- Build, enhance, and optimize modern data pipelines on GCP
- Implement scalable ETL frameworks, data structures, and workflow dependency management
- Architect and tune BigQuery datasets, queries, and storage layers
- Collaborate with cross-functional teams to define data requirements and support business objectives
- Lead efforts in containerized deployments, CI/CD integrations, and performance optimization
- Drive clarity in project goals, timelines, and deliverables during Agile planning sessions
📩 Interested? Apply now or DM us to explore this opportunity! Share resumes at moin@wiseskulls.com or call us at +1 (669) 207-1376.