Design, develop, and maintain scalable, robust ETL/ELT processes and data pipelines using various tools and technologies.
Build and optimize data warehouses, data lakes, and other data storage solutions to support analytical and operational needs.
Implement data quality checks and monitoring to ensure the accuracy, completeness, and consistency of data.
Work with large datasets, performing data modeling, schema design, and performance tuning.
Create data models that are easy for BI tools to consume, and build dashboards.
Qualifications needed:
Proficiency in Python, SQL, and data engineering concepts.
Technologies used:
Google internal tools
Platform / Tools used:
GCP BigQuery.
Key Skills
SQL, Pentaho, PL/SQL, Microsoft SQL Server, SSIS, Informatica, Shell Scripting, T-SQL, Teradata, Data Modeling, Data Warehousing, Oracle
Employment Type: Full Time
Experience: years
Vacancy: 1
ETL Developer • San Jose, California, USA