Job Description
- 5+ years of experience in administering, testing, and implementing enterprise data process automation and orchestration.
- Experience with data processing platforms and technologies such as Microsoft SSIS, Informatica, ActiveBatch, Power Apps, Apache Airflow, Apache NiFi, job schedulers, file transfer tools, etc.
- Knowledge of or experience with data virtualization technology
- Responsible for providing full lifecycle administration of data platform tools (patches / updates, AD security, account management, capacity management, documenting processes)
- Primary platform support centers on Power BI and user security, while also serving as a backup for other department software platforms (Denodo, Informatica, ActiveBatch, etc.)
- Data management & modeling: Connect and manage data pipeline tools with various data sources, including on-premises and cloud-based data sources.
- Implement and maintain semantic models to ensure data integrity and performance optimization. Perform data cleansing and transformation tasks (ETL) to prepare data for analysis.
- Technical support: Provide technical guidance and support to consumers of data services, ensuring effective adoption and utilization of enterprise data and the fabric / virtual layer.
- Performance monitoring: Monitor and optimize data pipeline (ETL) performance, including capacity planning and server performance.
- User management: Manage user access and permissions to enterprise data platforms and resources, ensuring compliance with security policies.
- Troubleshooting: Conduct thorough testing, debugging, and troubleshooting of data pipeline (ETL) tools and solutions.
- Governance: Maintain governance policies, best practices, and security standards for the enterprise data platforms.
- Training and knowledge sharing: Provide training and share knowledge with colleagues to enable the delivery of data for enterprise needs.
- Roadmap building and prioritization: Support the data architecture team with the data pipeline (ETL) roadmap, prioritizing initiatives based on business needs and strategic goals.
- Skilled in analyzing and automating manual processes to reduce hands-on intervention
- Experience with data virtualization / fabric platforms such as Denodo, CData, Talend, Data Virtuality
- Experience with development skills such as SQL, PL / SQL, T-SQL, and shell scripting (PowerShell, Unix shell, etc.)
- Able to analyze, troubleshoot and tune SQL queries and recommend enhancements.
- Analyze and monitor server resources and implement proactive alerts and notifications based on SLAs.
- Performance tuning and analysis of SQL code and logic in data transformations and queries.
- Relevant certifications related to data platforms and relevant technologies.
- Experience in the healthcare claims processing industry and understanding of associated data security and privacy concerns.
Work schedule is normal business days, 7am-5pm EST. The position will be required to assist existing staff with a rotating on-call production support shift at least 1x / month.
The candidate must be within 50 miles or a 1-hour commute of a Pulsepoint location:
- Indianapolis, IN
- Denison, TX
- Baltimore, MD
- Harrisburg, PA
- Syracuse, NY
- Portland, ME
- Hingham, MA