Job Description
Title: Technical Data Analyst
Location: San Francisco, CA (Hybrid)
Contract
We are seeking a Technical Data Analyst with 10+ years of experience, strong expertise in SQL and Python, hands-on experience with modern data platforms, and the ability to translate complex data into trusted, actionable insights. This role will work closely with engineering, product, and business stakeholders to support analytics, reporting, and data-driven decision-making at scale.
Key Responsibilities
- Write complex, high-performance SQL queries to analyze large, structured and semi-structured datasets
- Bring 8-10 years of software development and deployment experience, including at least 5 years of hands-on experience with SQL, Databricks, ADF, DataStage (or another ETL tool), SSAS cubes, Cognos, Tableau, ThoughtSpot, and other BI tools
- Write SQL for processing raw data, Kafka ingestions, ADF pipelines, data validation, and QA
- Work with APIs to collect or ingest data
- Use Python for data analysis, automation, validation, and lightweight data engineering tasks
- Build, enhance, and maintain dashboards and reports using Tableau and ThoughtSpot
- Partner with data engineers to design, validate, and optimize ETL / ELT pipelines
- Work extensively with Databricks (Spark, notebooks, Delta tables) for data exploration and analytics
- Perform data quality checks, reconciliations, and root-cause analysis to ensure data accuracy and consistency
- Translate business requirements into technical data solutions and semantic layers
- Support self-service analytics by documenting datasets, metrics, and business definitions
- Collaborate across teams to troubleshoot data issues and improve reporting performance
Required Qualifications
- Strong proficiency in SQL, including complex joins, window functions, CTEs, and performance optimization
- Strong Python skills for data analysis and scripting (e.g., pandas, NumPy)
- Hands-on experience with Databricks and distributed data processing concepts
- Hands-on experience working with ETL tools and data pipelines (batch and/or streaming)
- Proficiency in reporting and visualization tools such as Tableau, ThoughtSpot, Cognos, and SSAS cubes
- Solid understanding of data warehousing concepts, data modeling, and analytics best practices
- Ability to analyze large datasets and communicate insights clearly to both technical and non-technical audiences
Preferred Qualifications
- Experience with cloud data platforms (AWS, Azure, or GCP)
- Familiarity with version control tools (Git) and CI/CD concepts for analytics workflows
- Exposure to data governance, metric standardization, and semantic layers
- Prior experience in enterprise-scale data platforms or COE environments
What Success Looks Like
- Trusted, accurate dashboards and datasets used across multiple business teams
- Efficient, well-documented SQL and Python code that scales with data growth
- Strong partnership with engineering and business stakeholders to deliver timely insights
- Proactive identification and resolution of data quality and performance issues