Quality Health First®

The goal of Quality Health First (QHF) was to reduce health care costs while at the same time improving the quality of health care for all patients. It involved collecting massive amounts of clinical and claims data for patients throughout central Indiana into a central repository. Various clinical quality measures were applied against that data and the results were aggregated, giving participating health care providers current, actionable reports on their patients for preventive screenings and management of chronic illnesses. The results were also used in conjunction with various pay-for-performance programs.


When I began in March 2006, the clinical repository (the Indiana Network for Patient Care) was already in place. I reviewed measure specifications and identified the data elements needed from the repository. That data was extracted into a separate system that I designed for measure calculations. Working with the clinical staff, I developed the procedures for calculating measures. The process was efficient enough that 20 measures could be run against 2.5 million patients, drawing on 35 million data points, in under 45 minutes.
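
As a minimal sketch of that set-based approach, assuming a hypothetical schema (none of the table or column names below come from the actual QHF system), a typical preventive-screening measure reduces to one aggregate query per provider rather than per-patient iteration:

    -- Hypothetical measure: eligible patients who received a qualifying
    -- screening within the lookback period (all names are illustrative).
    -- SQL Server date syntax shown; easily ported to other dialects.
    SELECT p.provider_id,
           COUNT(DISTINCT s.patient_id) AS numerator,    -- screened on time
           COUNT(DISTINCT p.patient_id) AS denominator   -- all eligible
    FROM   eligible_patient p
           LEFT JOIN screening_event s
                  ON s.patient_id = p.patient_id
                 AND s.event_date >= DATEADD(year, -2, GETDATE())
    GROUP BY p.provider_id;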

CoreANALYTICS™

CoreANALYTICS™ was conceived as a means of quickly implementing Meaningful Use functionality in a variety of environments. Intentionally isolated from source data, CoreANALYTICS could be dropped into any environment, perform the measure calculations for Meaningful Use (or any other program), and provide the results to a reporting layer. Based on my experience with QHF, I designed a flexible yet efficient measure calculation engine.


Using a purpose-built editor, non-programmers could create and edit measure specifications in a familiar manner. These specifications were then compiled into database stored procedures that performed the actual measure calculations. Measure definitions were built from smaller components, whose combinations and permutations allowed for great complexity and expressiveness in measures, as sketched below.
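
The actual XML measure schema was proprietary, so what follows is only a conceptual sketch with invented element and object names: a small specification, and the kind of SQL Server stored procedure a compiler might emit for it.

    <!-- Hypothetical measure specification; element names are invented -->
    <measure id="dm-hba1c-annual">
      <denominator>
        <criterion type="condition" code="DIABETES"/>
      </denominator>
      <numerator>
        <criterion type="observation" code="HBA1C" lookback="P1Y"/>
      </numerator>
    </measure>

    -- Sketch of the stored procedure such a compiler might emit
    CREATE PROCEDURE calc_dm_hba1c_annual AS
    BEGIN
        INSERT INTO measure_result (measure_id, patient_id, in_numerator)
        SELECT DISTINCT
               'dm-hba1c-annual',
               d.patient_id,
               CASE WHEN o.patient_id IS NOT NULL THEN 1 ELSE 0 END
        FROM   condition d
               LEFT JOIN observation o
                      ON o.patient_id = d.patient_id
                     AND o.code = 'HBA1C'
                     AND o.observed_date >= DATEADD(year, -1, GETDATE())
        WHERE  d.code = 'DIABETES';
    END;

Because each criterion compiles down to a join or a predicate, new measures could be assembled from existing components rather than from hand-written SQL.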


I designed the engine concept, the database schema, the XML measure schema, the measure compiler, and the measure editor. As this type of measure engine was a first for the industry, I applied for a patent on the entire concept. I provided guidance to the development team and verified that the results were as expected, and I also guided the clinical staff who created measure specifications, creating many of them myself. Although initially developed on SQL Server, the engine was later ported to IBM Netezza and could readily be ported to other environments (Teradata, Oracle, MySQL, etc.).


My expectations for this project go beyond Meaningful Use: besides providing data for value-based programs (Meaningful Use, PQRS, etc.), measures can be used to identify quality-of-care issues, performance issues, and other business and health quality issues, improving not only patient health but business health as well.

Healthcare Analytics

I was invited to serve as a Sr. BI Solutions Architect, designing high-level concepts for solutions intended to replace aging systems. Various departments with different focuses were performing similar operations; all were seeking replacements for their outmoded systems, yet each was heading in a different direction. After identifying the commonalities, I proposed a common solution that served all concerned equitably. As part of the process, I also worked with vendors to ensure that their offerings matched the requirements. Most projects represented multi-million-dollar investments requiring an eye toward cost management.


My major project involved identifying, preventing, and recovering from incorrect or fraudulent billing practices, including the consolidation of existing diverse operations and procedures across multiple business divisions. This included solution design in collaboration with business analysts, clients, and developers. As needed, I also assisted the development team by developing some of the more complex logic, of the kind sketched below.
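
As a hedged illustration of that kind of logic (the claim schema here is invented, not the client's), one of the simplest incorrect-billing checks flags the same service billed more than once:

    -- Flag potential duplicate billing: the same patient, provider,
    -- procedure, and service date appearing on multiple claims.
    SELECT patient_id,
           provider_id,
           procedure_code,
           service_date,
           COUNT(*)           AS claim_count,
           SUM(billed_amount) AS total_billed
    FROM   claim
    GROUP BY patient_id, provider_id, procedure_code, service_date
    HAVING COUNT(*) > 1;

Real detection logic layers many such rules, along with statistical outlier checks, on top of one another; this shows only the basic shape.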

Clinical Quality Measures for Registries

Due to my successful efforts with CoreANALYTICS, I was asked to return to Encore (after its owner, Quintiles, merged with IMS Health) to work on a similar project with a significant twist. The use of clinical data within registries (surgical, cancer, trauma, etc.) is often inconsistent with that of typical EHR systems, reflecting a unique perspective on data usage and collection. The challenge of performing clinical quality measures against registry data required a remodel of the foundation used by CoreANALYTICS. Additionally, I was tasked with furthering the abilities of CoreANALYTICS by incorporating lessons learned and best practices into the project.

HC Standard

The work that I performed in non-standard data environments led me to GER. The HC Standard product is primarily intended for incident and patient management and for data exchange among first responders, paramedics/EMS, fire and police departments, hospitals, health care systems, and other organizations that respond to mass gatherings, natural disasters, and casualty incidents. The system was designed to be flexible enough to meet the unique needs of each organization and situation by allowing changes to data tracking on the fly. Such flexibility comes at a cost in performance and interoperability. In response, my efforts focused on tuning the existing model as much as possible (sketched below); in most cases I reduced response times by 98%. I also designed an improved model that promised greater stability, better performance, and lower maintenance effort while offering even greater flexibility.
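
HC Standard's internals aren't public, so this is purely an illustrative sketch: change-on-the-fly data tracking is commonly built on an entity-attribute-value (EAV) layout, whose pivoting reads are expensive, and much of the tuning amounts to giving the optimizer a usable access path. All names below are invented.

    -- EAV-style layout: every tracked field is a row, so reads must pivot.
    SELECT t.entity_id,
           MAX(CASE WHEN t.attribute = 'triage_level' THEN t.value END) AS triage_level,
           MAX(CASE WHEN t.attribute = 'location'     THEN t.value END) AS location
    FROM   tracking_value t
    GROUP BY t.entity_id;

    -- A composite index matching the pivot's access pattern is often
    -- the single largest win for this shape of query.
    CREATE INDEX ix_tracking_attr ON tracking_value (attribute, entity_id);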

Operational Data Store (ODS)

With the previous success of my work at Anthem, I was called upon to help design an Operational Data Store to consolidate data from multiple disparate systems into a single AWS repository, along the lines sketched below. This simplified the various reporting needs and placed all data in one location for effective data mining and analytics.
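
As a minimal sketch of the consolidation pattern (the source systems and columns are invented for illustration), each source is conformed to a common shape within the ODS:

    -- Conform two hypothetical source systems into one ODS structure.
    CREATE VIEW ods_member AS
    SELECT 'SYS_A' AS source_system, a.member_id, a.member_name, a.effective_date
    FROM   sys_a_member a
    UNION ALL
    SELECT 'SYS_B', b.mbr_nbr, b.mbr_name, b.eff_dt
    FROM   sys_b_member b;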