Job Description
Job Title : Data Engineer
Location : Phoenix, AZ
Work Type : Hybrid (3 days onsite per week)
Employment Type : W2 only
Candidate Criteria :
- Only Female Candidates
- Experience : 6+ years
- Visa Status : USC / GC only
About the Role :
We are seeking an experienced Data Engineer to join an Agile, SDLC-based team supporting large-scale data initiatives at our firm. The ideal candidate will be a hands-on individual contributor with strong data engineering, analytics, and ETL development experience.
Key Responsibilities :
- Work as an individual contributor in an Agile SDLC environment
- Design, develop, and maintain scalable ETL pipelines using Python and PySpark
- Analyze, transform, and manage large datasets across data warehouses and data lakes
- Develop complex SQL queries across multiple RDBMS platforms
- Integrate data across systems using REST, SOAP, ETL, and SSIS
- Build and support reports and dashboards using BI tools
- Collaborate with cross-functional teams to deliver high-quality data solutions
- Write efficient, reusable, and maintainable code for data processing pipelines
- Learn and adapt to new cloud-based tools and platforms as required
Required Skills :
- Strong Data Analytics background
- Proficiency in MySQL
- Strong hands-on experience with Python and PySpark
- Expertise in ETL, Big Data, and Data Warehousing concepts
- Expert-level knowledge of at least one object-oriented programming language (C, C++, or Java)
- Experience with the full SDLC
- Advanced knowledge of SQL (SQL Server, DB2, Oracle)
- Experience with :
  - Stored procedures, triggers, DML packages, materialized views
  - Data modeling (schemas, entity relationships)
  - BI tools : Tableau, Power BI, or Qlik
  - System integrations using REST, SOAP, ETL, SSIS
  - Node.js and JSON
- Familiarity with Apache Airflow and DAG development
Preferred / Plus Skills :
- Experience with Apache Spark
- Exposure to cloud-based data platforms
- Willingness to upskill on emerging tools and technologies