Job Description
Job Title : Data Engineer
Location : Phoenix, AZ
Work Type : Hybrid (3 days onsite per week)
Employment Type : W2 only
Candidate Criteria :
Only Female Candidates
Experience : 6+ years
Visa Status : USC / GC only
About the Role :
We are seeking an experienced Data Engineer to join an Agile, SDLC-based team supporting large-scale data initiatives at our firm. The ideal candidate will be a hands-on individual contributor with strong data engineering, analytics, and ETL development experience.
Key Responsibilities :
- Work as an individual contributor in an Agile SDLC environment
- Design, develop, and maintain scalable ETL pipelines using Python and PySpark
- Analyze, transform, and manage large datasets across data warehouses and data lakes
- Develop complex SQL queries across multiple RDBMS platforms
- Integrate data across systems using REST, SOAP, ETL, and SSIS
- Build and support reports and dashboards using BI tools
- Collaborate with cross-functional teams to deliver high-quality data solutions
- Write efficient, reusable, and maintainable code for data processing pipelines
- Learn and adapt to new cloud-based tools and platforms as required
Required Skills :
- Strong Data Analytics background
- Proficiency in MySQL
- Strong hands-on experience with Python and PySpark
- Expertise in ETL, Big Data, and Data Warehousing concepts
- Expert-level knowledge of at least one object-oriented programming language such as C, C++, or Java
- Experience with the full SDLC
- Advanced knowledge of SQL (SQL Server, DB2, Oracle)
- Experience with stored procedures, triggers, DML packages, and materialized views
- Data modeling (schemas, entity relationships)
- BI tools : Tableau, Power BI, or Qlik
- System integrations using REST, SOAP, ETL, SSIS
- Node.js and JSON
- Familiarity with Apache Airflow and DAG development
Preferred / Plus Skills :
- Experience with Apache Spark
- Exposure to cloud-based data platforms
- Willingness to upskill on emerging tools and technologies