Note: For the safety of our employees and those considering employment with Cotiviti, we are currently conducting all interviews virtually. In addition, the majority of the Cotiviti team is currently working remotely, and we are onboarding new hires remotely as well. As we monitor the pandemic, these arrangements may change and we will update accordingly.
This role will be responsible for the analysis, design, development and implementation of data acquisition solutions and strategies to support enterprise-level reporting. The role will support and collaborate with the software engineering teams, business analysts, BI developers and other project stakeholders to deliver data pipeline architecture that meets or exceeds defined expectations.
- Build and operate stable, scalable data pipelines that cleanse, structure and integrate disparate big data sets into an accessible format to support enterprise-level reporting.
- Identify data requirements to integrate new data sources. Perform technical analysis to confirm that data integration meets or exceeds defined expectations.
- Provide proactive oversight of the Tableau Server-based data source repository used as the primary reporting layer for front-end developers.
- Solve complex technical problems and mentor other technical staff on data modeling and ETL-related issues.
- Participate in requirements-gathering efforts and write low-level and high-level specifications.
- Analyze existing procedures to identify system/process changes needed to meet requirements, including but not limited to streamlining processes and improving quality assurance.
- Ensure that consistent documentation is developed and actively maintained throughout all phases of work.
- Support the development of other analysts on the team by sharing best practices and providing training.
- Assist in the successful completion of deliverables and ensure all requirements are accurately met.
- 4+ years of experience with data warehouse technical architectures, ETL/ELT, reporting/analytic tools and scripting.
- 2+ years of experience supporting data warehouse operations at a mid- to large-sized organization.
- Advanced knowledge of and expertise with data modeling, data aggregation, standardization, linking, quality-check mechanisms and reporting.
- Experience in the analysis, design and development of solutions and strategies for extraction, transformation and loading (ETL) and real-time applications.
- Experience with RDBMS (e.g., SQL Server, Oracle) and with T-SQL or other data integration/ETL tools such as SSIS. Understanding of newer technologies like HDFS and NoSQL a plus.
- Experience implementing and supporting SQL Server auditing features (Change Tracking, Change Data Capture and SQL Server Auditing) to drive ETL operations in complex data warehouse environments.
- Strong working knowledge of semantic models associated with various BI platforms (Tableau, MicroStrategy, etc.).
- Experience with ticketing and documentation tools such as Jira.
- Strong knowledge of coding best practices while developing and deploying code.
- Must be highly analytical, well-organized and possess strong attention to detail.
- Bachelor’s degree in Computer Science, Engineering or a related field, or equivalent experience.