Job Description
Requirement :
- 4+ years of experience in Python, Databricks, and SQL for automating data workflows, ETL processes, and KPI generation using libraries like pandas, numpy, and pyspark (see the sketch after this list).
- 2+ years of experience developing interactive BI dashboards in Spotfire, Tableau, and Power BI with advanced customization using HTML, CSS, and JavaScript.
- 3+ years of experience working in Agile / Scrum projects using Jira or Azure DevOps for collaborative delivery and process efficiency.
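A minimal sketch of the kind of workflow the first requirement describes: a PySpark ETL step that cleans raw data and publishes an aggregated KPI table on a Databricks-style platform. The table name (raw_orders), column names, and output table are illustrative assumptions, not details from this posting.

```python
# Sketch only: ETL + KPI aggregation with PySpark; names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kpi_etl_sketch").getOrCreate()

# Extract: read raw data (e.g., a Delta table on Databricks or any supported source)
raw = spark.read.table("raw_orders")

# Transform: basic cleansing, then aggregate revenue per region as a KPI
clean = (
    raw.dropna(subset=["order_amount", "region"])
       .withColumn("order_amount", F.col("order_amount").cast("double"))
)
kpi = clean.groupBy("region").agg(F.sum("order_amount").alias("total_revenue"))

# Load: persist the KPI table for downstream BI dashboards
kpi.write.mode("overwrite").saveAsTable("kpi_revenue_by_region")
```

In practice a job like this would be scheduled (for example as a Databricks job) so the KPI table refreshes automatically for the dashboards described below.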
Key Skills :
Key Skills :
- Python, SQL, PySpark, Pandas, NumPy, SQLAlchemy, PyODBC (see the sketch after this list)
- Databricks, ETL workflow design, and data transformation
- BI tools : Tableau, Power BI, ThoughtSpot, Spotfire
- Web technologies : HTML, CSS, JavaScript for BI customization
- Data modeling, performance optimization, and KPI automation
- PostgreSQL and familiarity with data mesh platforms like Starburst
- Agile / Scrum methodology with tools like Jira and Azure DevOps
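A minimal sketch, assuming a PostgreSQL source reached through SQLAlchemy, of how several of these skills (SQL, pandas, SQLAlchemy, KPI automation) typically come together. The connection string, table name (daily_sales), and column names are placeholders, not details from this posting.

```python
# Sketch only: pull source data over SQLAlchemy, compute a KPI with pandas,
# and publish it back for BI tools to consume. All names are illustrative.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@host:5432/analytics")

# Extract a slice of source data into a DataFrame
sales = pd.read_sql("SELECT sale_date, product, amount FROM daily_sales", engine)

# Aggregate total sales per product as the KPI
kpi = (
    sales.groupby("product", as_index=False)["amount"]
         .sum()
         .rename(columns={"amount": "total_sales"})
)

# Load the KPI table so Tableau / Power BI / Spotfire can report on it
kpi.to_sql("kpi_total_sales_by_product", engine, if_exists="replace", index=False)
```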
Expected Outcome :
- Deliver automated, scalable, and optimized data workflows using Python and Databricks.
- Develop interactive and visually consistent BI dashboards with enhanced user experience.
- Improve reporting efficiency and accuracy through automated KPI generation and ETL optimization.
- Enable faster, data-driven decision-making by integrating advanced analytics and real-time insights.

Requirements :
Soft Skills :
Communication Skills :
Communicate effectively with internal and customer stakeholders
Communication approach : verbal, email, and instant messaging
Interpersonal Skills :
Strong interpersonal skills to build and maintain productive relationships with team members
Provide constructive feedback during code reviews and be open to receiving feedback on your own code.
Problem-Solving and Analytical Thinking :
Ability to troubleshoot and resolve issues efficiently.
Analytical mindset
Task / Work Updates
Prior experience working on Agile / Scrum projects with exposure to tools like Jira or Azure DevOps
Provides regular updates; proactive and diligent in carrying out responsibilities