CDC-ONC Industry Days

The CDC-ONC Industry Days event brought together key partners to inform non-government organizations about the Centers for Disease Control and Prevention’s (CDC) and the Office of the National Coordinator for Health Information Technology’s (ONC) plans for modernizing public health data and information systems. Building on the Industry Day Webinar that was hosted on January 12, 2023, this two-day event accomplished several goals, including:

  • Hearing from CDC and ONC leaders about the vision for public health data modernization and the value of multi-sector partnerships,
  • Informing industry about CDC's and ONC's strategic direction in the use of data and information systems, and about the highest-priority areas in which they are seeking support,
  • Providing forums for CDC and ONC staff and industry to discuss services and priorities,
  • Increasing opportunities for CDC and ONC to work with and learn from industry, and
  • Providing opportunities to hear from industry partners on their compatible capabilities, tools, and services.

Below you can find the video recordings and associated slides for each session.

Event 1: February 27 & 28, 2023

Full Playlist: Day 1 and Day 2 Presentations

Individual Session Recordings and Slides

Day 1:

CDC-ONC Industry Days Welcome and Opening Plenary - Day 1

Speakers: Judy Monroe, MD, President & CEO, CDC Foundation
Rochelle Walensky, MD, MPH, Director, Centers for Disease Control and Prevention

Watch the recording

Session Description: Introduction to the event by Judy Monroe, MD, President & CEO of the CDC Foundation, with a video welcome from Rochelle Walensky, MD, MPH, Director of the Centers for Disease Control and Prevention.

Office of Public Health Data, Surveillance, and Technology (OPHDST) Strategic Priorities and Needs

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Jennifer Layden, MD, PhD, Acting Director of Public Health Data, Surveillance, and Technology, CDC

Watch the recording

View/download slides [combined deck for OPHDST, OCIO and CFA speakers]

Session Description: Jennifer Layden, MD, PhD, discusses the strategic priorities of the Office of Public Health Data, Surveillance, and Technology (OPHDST) and how industry can support its mission.

Office of the Chief Information Officer (OCIO) Strategic Priorities and Needs from Industry

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Suzi Connor, Chief Information Officer (CIO) and Director, OCIO, CDC

Watch the recording

View/download slides [combined deck for OPHDST, OCIO and CFA speakers]

Session Description: Suzi Connor briefs on the Office of the Chief Information Officer's (OCIO) strategic priorities and the needs that industry can help meet.

Center for Forecasting and Outbreak Analytics (CFA) Strategic Priorities and Needs from Industry

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Dylan George, PhD, Director of CDC's Center for Forecasting & Outbreak Analytics

Watch the recording

View/download slides [combined deck for OPHDST, OCIO and CFA speakers]

Session Description: Dylan George, PhD, touches on what CDC needs from industry vendors with regard to the strategic priorities of the Center for Forecasting and Outbreak Analytics.

Improving Public Health Data Accessibility

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Honorable Tim Kaine, U.S. Senator from Virginia

Watch the recording

Office of the National Coordinator for Health Information Technology Strategic Priorities and Needs

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Micky Tripathi, PhD, MPP, National Coordinator for Health Information Technology, U.S. Department of Health and Human Services

Watch the recording

Announcements - Day 1 Morning

Speaker: Judy Monroe, MD, President & CEO, CDC Foundation

Watch the recording

Panel Discussion, CDC & ONC Speakers

Moderator: Judy Monroe, MD, President & CEO, CDC Foundation

Panelists: Ryan Argentieri, MBA, Deputy Director, Office of the National Coordinator for Health Information Technology (ONC)
Jennifer Layden, MD, PhD, Acting Director, Office of Public Health Data, Surveillance, and Technology, CDC
Suzi Connor, Chief Information Officer (CIO) and Director, OCIO, CDC
Dylan George, PhD, Director, Center for Forecasting and Outbreak Analytics (CFA), CDC

Watch the recording

Industry Leader Talk - Nirav Shah

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Nirav Shah, MD, MPH, Senior Scholar, Stanford University Clinical Excellence Research Center

Watch the recording

Industry Leader Talk - Angela Dunn

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Angela Dunn, MD, MPH, Health Officer/Executive Director, Salt Lake County Health Department

Watch the recording

View/download slides

Industry Leader Talk - Dale Sanders

Introduction: Judy Monroe, MD, President & CEO, CDC Foundation
Speaker: Dale Sanders, Chief Strategy Officer, Intelligent Medical Objects

Watch the recording

View/download slides

Panel Discussion, Industry Speakers

Industry Panelists: Nirav Shah, MD, MPH, Senior Scholar, Stanford University Clinical Excellence Research Center
Angela Dunn, MD, MPH, Health Officer/Executive Director, Salt Lake County Health Department
Dale Sanders, Chief Strategy Officer, Intelligent Medical Objects

Watch the recording

CDC-ONC Industry Days Closing - Day 1

Closing to Day 1 of the CDC-ONC Industry Days event by Judy Monroe, MD, President & CEO of the CDC Foundation.

Watch the recording

Day 2:

A Modern Approach to Standardizing and Enriching Public Health Data with Modular Building Blocks

Speakers: Daniel Paseltiner, Senior Data Engineer, Skylight
Brady Fausett, Staff Data Engineer, Skylight

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description:

Working with public health data is challenging. Data arrive in a variety of types and formats, and fields within the associated messages often lack standardization. This forces public health departments to implement bespoke processes, both automated and manual, throughout the data life cycle in order to make their data usable. Additionally, the underlying hardware and software used to process these data are commonly antiquated and inflexible, compounding the challenges already inherent in the data by making it difficult to manage surges in data volume and to quickly meet new analytical and reporting requirements as they arise. Through our work as part of CDC's Pandemic-Ready Interoperability Modernization Effort (PRIME), we seek to address these issues with modern public health data infrastructure.
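
As a rough illustration of the modular building-block idea (the function names and rules below are hypothetical, not the actual PRIME code), each block can be a small, independently testable transformation, and a pipeline is just a composition of blocks:

```python
def standardize_name(record: dict) -> dict:
    # Collapse whitespace and normalize casing on the name field.
    record["name"] = " ".join(record.get("name", "").split()).title()
    return record

def standardize_phone(record: dict) -> dict:
    # Keep the last 10 digits; discard values that are too short to be valid.
    digits = "".join(c for c in record.get("phone", "") if c.isdigit())
    record["phone"] = digits[-10:] if len(digits) >= 10 else None
    return record

def run_pipeline(record: dict, blocks) -> dict:
    for block in blocks:          # each block is independent and reusable
        record = block(record)
    return record

cleaned = run_pipeline(
    {"name": "  doe,  jane ", "phone": "(404) 555-0100"},
    [standardize_name, standardize_phone],
)
print(cleaned)   # {'name': 'Doe, Jane', 'phone': '4045550100'}
```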

How CDC Integrated Complex Data to Drive Vaccination Forecasting with Databricks

Speakers: Sheila Stewart, Senior Solutions Architect, Databricks
John Repko, Technical Program Manager, Peraton
Jim Fetters, Specialist, Microsoft

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: For successful analytics, scientists need to start with the right data, data that is both timely and high quality. In this session, we will discuss how Peraton helped CDC build an immunization data lake to track COVID-19 vaccination orders, delivery, inventory, and administration (shots in arms) with a Databricks Lakehouse analytics platform on Microsoft Azure. The session will highlight the challenges the team faced while constructing the Lakehouse and underscore the importance of reliable data when leveraging analytics to promote public health. A minimal sketch of the ingestion pattern appears after the list below.

Attendees will learn:

  • How the team collected complex COVID-19 data for analysis and forecasting (600 million records in 184 days)
  • Where challenges arose in creating an integrated data environment and how they were overcome
  • How healthcare data became actionable public health data
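
A minimal PySpark sketch of the ingestion pattern described above, assuming a Databricks/Delta Lake environment; the paths, schema, and column names are invented for illustration:

```python
# Land raw vaccination records, lightly clean them, and append to a curated
# Delta table that downstream analytics and forecasts can query.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iz-data-lake").getOrCreate()

raw = (spark.read.option("header", True)
       .csv("/mnt/raw/vaccinations/"))              # hypothetical landing zone

curated = (raw
    .withColumn("admin_date", F.to_date("admin_date"))
    .dropDuplicates(["patient_id", "admin_date", "cvx_code"]))

(curated.write.format("delta")
    .mode("append")
    .save("/mnt/curated/immunizations"))            # curated Delta table
```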

Democratizing Forecasting and Analytics - Amazon Web Services (AWS)

Speakers: Dan Sinnreich, Senior Product Manager, Amazon Web Services
Henrik Balle, Senior Solutions Architect, Amazon Web Services

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: In this session we will discuss how Amazon Web Services (AWS) can help public health agencies drive more informed and agile decision-making by applying machine learning (ML) to large-scale datasets. ML-based forecasting, classification, and regression have historically been accessible only to data science teams with specialized skills. However, data scientists are in high demand and often have to prioritize among a large number of competing requests. During a health crisis, the demand on these teams becomes even more acute as the need for rapid analysis grows under quickly changing conditions and data. AWS Low Code No Code machine learning (LCNC ML) tools allow non-data scientists (analysts) to take advantage of ML's power to quickly identify subtle patterns within data and make more informed decisions without having to understand machine learning or write code. Data science teams can support and improve on models created by analysts, while no longer being a choke point in delivering rapid decision-making insights to the organization.

Advancing Public Health Data Interoperability with Integration Accelerators - Mulesoft, Salesforce

Speakers: Avneet Bakshi, Principal Solution Engineer, Salesforce
Sarah Linden, Account Director, Salesforce

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: Post-pandemic digital modernization may be among the most impactful developments in public health in our lifetimes. But the challenge going forward extends beyond a lack of adequate funding and resources for ongoing digital transformation efforts. As public health struggles with ongoing data challenges and copes with the lasting impacts of the pandemic (e.g., the backlog of missed routine childhood vaccinations and the exacerbation of pre-existing health inequities), we're left exposed to the next health crisis.

Data exchange is potentially the most significant, but elusive, digital goal to achieve in this next chapter of preparedness and response. It is foundational to driving large-scale public health transformation, and critical for achieving required data quality, completeness, and timeliness. Most importantly, it turns the page in areas that have proven meaningful for informing the communication, guidance, and policies necessary for mounting effective public health countermeasures in real time. In the past, complete information to inform community sentiment and action (e.g., demographic data regarding marginalized and underserved communities) has been nearly impossible to collect, summarize, and share.

This session will highlight the benefits of MuleSoft's Accelerator for Healthcare to address public health data interoperability challenges, reinforcing principles emphasized by the Trusted Exchange Framework.

Leveraging FHIR for IIS Bulk Data and Modernization - HLN Consulting, LLC

Speaker: Michael Berry, Project Manager, HLN Consulting

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: Immunization Information Systems (IIS) are increasingly utilized to share data for clinical and public health decision making. Traditional methods of data exchange, such as HL7 version 2 Query/Response and flat file exchange, have proven to be inefficient and cumbersome for modern use cases.

FHIR bulk data offers a way for physicians and other providers to efficiently access up-to-date immunization data from IIS using modern tools and protocols. It is particularly useful for use cases in which providers need to be kept up-to-date on the immunization status of large groups of patients. The Rhode Island Child and Adult Immunization Registry (RICAIR) has embarked on a FHIR deployment that provides not only bulk data access, but also doubles as an application modernization strategy—replacing legacy web applications with modern front-ends that communicate with the FHIR back-end.

This talk will introduce the "FHIR Facade" architecture as it applies to IIS and other population-based registries, and describe how RICAIR built its FHIR platform on the open source HAPI FHIR server. As a participant in the Helios FHIR Accelerator for Public Health, RICAIR recently exchanged immunization data—including clinical decision support forecasts and evaluations—for groups of up to 100,000 patients at the HL7 FHIR Connectathon.
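
To make the bulk data flow concrete, here is a hedged Python sketch of a group-level export following the HL7 FHIR Bulk Data Access pattern the session describes; the endpoint and group ID are hypothetical, and a production IIS would also require SMART Backend Services authorization:

```python
import time
import requests

BASE = "https://iis.example.org/fhir"          # hypothetical IIS FHIR endpoint
HEADERS = {"Accept": "application/fhir+json", "Prefer": "respond-async"}

# Kick off an asynchronous export of Immunization resources for a patient group.
kickoff = requests.get(
    f"{BASE}/Group/example-panel/$export",
    params={"_type": "Immunization"},
    headers=HEADERS,
)
status_url = kickoff.headers["Content-Location"]   # URL to poll for progress

# Poll until the server responds 200 with a manifest of NDJSON files.
while True:
    poll = requests.get(status_url, headers={"Accept": "application/json"})
    if poll.status_code == 200:
        manifest = poll.json()
        break
    time.sleep(int(poll.headers.get("Retry-After", 5)))

for item in manifest["output"]:
    print(item["type"], item["url"])    # each URL is an NDJSON file to fetch
```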

Data Enrichment for Data Pipelines at Scale - Sophron Networks LLC

Speaker: Richard Murphy, Chief Technology Officer, Sophron Networks LLC

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description:

Introduction:

Over the last 18 months Sophron.io has been conducting R&D on cloud data pipelines and data linkage services as part of our ongoing efforts to inform and enhance our professional services and consulting engagements with CDC and other agencies. This research has focused on three areas: cloud data pipelines, data linkage, and GIS services related to data pipelines.

Research:

Initial R&D efforts ran in parallel, focusing on optimizing data pipelines and building user-friendly data linkage interfaces. Early pipeline testing with Azure Logic Apps and Azure Data Factory eventually led to a focus on Delta Lake in Azure Databricks as the platform of choice for further optimization testing, due to performance improvements and other factors. At the same time, the data linkage project produced SLINK—an R package that provides a graphical interface to the R fastLink data linking library—which was transformed from an internal project to an OSS project and released on CRAN and GitHub in late 2022.

The learnings from these projects have informed the current R&D effort, which focuses on creating data linkage services that can operate in a streaming data pipeline or on batches from a data lake. As part of optimizing the data linkage services and increasing data quality, probabilistic algorithms have been chosen to provide a wider range of accuracy and quality options. Pre-processing data with address normalization has also been shown to increase match rates and data quality. An additional benefit of the address normalization functionality is that geocoding services can be provided from the same software solution. Packaging these scalable services into a solution offering is underway.
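
A toy sketch of the probabilistic linkage idea, with address normalization as a pre-processing step; the field weights, threshold, and normalization rules are illustrative only, not Sophron's actual algorithms:

```python
FIELD_WEIGHTS = {"last_name": 4.0, "first_name": 2.5, "dob": 5.0, "address": 3.0}
THRESHOLD = 8.0                      # hypothetical match/non-match cutoff

def normalize_address(addr: str) -> str:
    # Stand-in for USPS-style standardization applied before comparison.
    return addr.upper().replace("STREET", "ST").replace("AVENUE", "AVE").strip()

def match_score(a: dict, b: dict) -> float:
    score = 0.0
    for field, weight in FIELD_WEIGHTS.items():
        va, vb = a.get(field, ""), b.get(field, "")
        if field == "address":
            va, vb = normalize_address(va), normalize_address(vb)
        if va and va == vb:
            score += weight          # each agreeing field adds evidence
    return score

a = {"last_name": "Diaz", "first_name": "Ana", "dob": "1990-03-07",
     "address": "12 Main Street"}
b = {"last_name": "Diaz", "first_name": "Ana", "dob": "1990-03-07",
     "address": "12 MAIN ST"}
print(match_score(a, b) >= THRESHOLD)   # True: records classified as a match
```

Note that without normalization the two addresses would disagree; with it they match, which illustrates the reported gains in match rate.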

Conclusions:

Pipelines for health record data can benefit from data enrichment services such as geocoding, address normalization, and data linkage. Building services designed specifically for high-volume data processing, and utilizing GIS software more traditionally used in mainframe environments and the financial services industry, has been shown to scale better for the increased volumes of health data. These approaches seem to fit well with the recently announced North Star Architecture and could provide multiple benefits to future implementations.

Researchers:

Richard Murphy, MBA Technology Management; CTO, Sophron.io
Beijun Desai, BS Computer Science, BS Applied Mathematics, University of Georgia, Senior
Brandon Yau, BS Computer Science, University of Georgia, 2022 graduate

Unlocking Potential of Nationwide Health Data: Secure Federated Learning with NVIDIA FLARE - Deloitte

Speakers: Juergen Klenk, PhD, Principal, Deloitte Consulting, LLC
Jesse Tetreault, MS, AI Solutions Architect, NVIDIA

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: The massive amount of data needed to train models that make predictions about complex systems with clinical-level accuracy in real time cannot be overstated. To train models on this much data, the typical approach for CDC and other organizations has been to first bring all training data together in one central location, such as a data lake, requiring data transfer from the original source. With this approach, CDC has experienced ongoing barriers due to HIPAA and privacy laws protecting healthcare data, and strict data use agreements regulating non-healthcare data collected by STLT partners. Attempts to mask, redact, or otherwise anonymize data before transmission are often imperfect solutions that can limit the value and utility of the remaining data.

We propose that federated learning will be a critical tool in enabling CDC/CFA to produce high-performance models for predicting outbreaks and other public health threats that are truly representative of the populations they are protecting. Using a Federated Learning (FL) approach, CFA can bring the training to the data instead of orchestrating data transfer to a central location. FL allows multiple entities to collaborate in training a centralized, global model without exchanging any sensitive training data; only model parameters move between each node and the global model. The global model is distributed from a central training server out to the various entities or "nodes" in the network. Each entity retrains the model using its own data, then sends the resulting updated model parameters back to the central server for aggregation and retraining of the global model.
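
A minimal sketch of that loop, assuming the common federated-averaging scheme; the linear model and gradient step below are placeholders for a real training job, not NVIDIA FLARE's API:

```python
import numpy as np

def local_update(global_w, X, y, lr=0.1, epochs=5):
    w = global_w.copy()
    for _ in range(epochs):                 # local training on the node's data
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w                                # only parameters leave the node

def federated_round(global_w, nodes):
    updates = [local_update(global_w, X, y) for X, y in nodes]
    return np.mean(updates, axis=0)         # server averages node weights

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
nodes = []
for _ in range(3):                          # three participating institutions
    X = rng.normal(size=(50, 2))
    nodes.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, nodes)
print(w)    # approaches true_w without any raw data being pooled
```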

FL can expand the scope of data available for the modeling efforts of CFA and its partners, while maintaining privacy and security for participating data providers. In our talk with CFA, Deloitte will describe the results of a benchmarking study to analyze how FL models perform in comparison to models that are trained on centralized (or siloed) datasets, as well as why we selected NVIDIA’s proven Federated Learning Application Runtime Environment (FLARE) for use in our medical image classification pilot project after evaluating over 10 open-source federated learning platforms and tools.

Our partner, NVIDIA, will be present to explain how FLARE—as a software development kit (SDK) rather than a platform—provides flexibility for developers and data scientists to build applications and platforms with much of the complexity of messaging and communication abstracted away. NVIDIA will also describe how FLARE allows for distributed, multi-party computation beyond machine learning which enables “federated analytics” and more general statistical applications.

NVIDIA will also highlight an exciting real-world example—along with measurable results—of an AI deployment with FLARE that involved training a model across 20 distinct international healthcare institutions to predict the future oxygen requirements of COVID-19 patients.

Leveraging Existing Post-Acute and Acute Data and Standards-Based Data Extraction Capabilities

Speakers: Benjamin Zaniello, MD, Infectious Disease Physician, PointClickCare
Keith Boone, MBI, Standards Guru, Audacious Inquiry

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: To fully capitalize on the Data Modernization Initiative's (DMI) effort to update public health systems in use today, there is an opportunity for CDC to tap into the existing real-time acute and post-acute care networks and datasets. The current active and passive surveillance systems rely heavily on providers, are not always representative of the patient population, and are not designed for performance, scalability, and high availability. Industry partners, such as PointClickCare (PCC), have existing post-acute and acute care networks and datasets that can play a pivotal role in supporting the aims of the DMI. PCC's comprehensive care collaboration network covers over 27,000 senior care facilities, 2,700 hospitals, 2,000 ambulatory facilities, and 75 state and government agencies, totaling more than 195 million subscriber lives and 1.7 million senior care residents across 46 states. Today, PCC supports CDC through a partnership that leverages our post-acute data set to provide deep insights into the impact of COVID-19 on long-term care facilities. PCC's platform also supports other public health use cases relying on real-time data, including more complete care coordination for maternal and infant health, multidrug-resistant organisms, Candida auris, and opioid use disorder. With decades of interoperability experience, vendors like PCC should assist in implementing standards for more streamlined public health exchange. Further opportunities exist to alleviate public health data issues today by tapping into the PCC networks and other networks already established within the healthcare data ecosystem.

Accurately combining data from disparate sources and enabling analytics and visualization of it for programmatic decision-making improves public health interventions at individual and population levels. However, data lives in a variety of formats, including HL7® V2, CDA, FHIR® and others.  Modern data lake technologies can collect data from many sources, but there are challenges in these datasets:

  • Getting data via disparate APIs in a common way.
  • Preserving privacy within the data model for anonymization/pseudonymization.
  • Linking data in multiple formats and sources to the same individual to create a longitudinal view.
  • Extracting and normalizing data for aggregation, analytics, and visualization (analysis).

We will show how existing FHIR-based Health IT standards and open-source software can solve these challenges by:

  • Querying for patient data using nationally recognized standards used by leading EHRs and HIEs.
  • Identifying fields containing PHI to support anonymization.
  • Linking data across sources by integrating with master data management and privacy preserving record linking services using standards.
  • Mapping and extracting data to normalized data models for analysis.

We will also show how standards in Certified EHRs, national networks, and HIEs can be used with existing enterprise-grade, cloud-agnostic, open-source tools and enhancements, supporting efficient, flexible models of extraction for analysis.

We will explain how protocols can be integrated using open-source, HL7 FHIR APIs as a common interface, integrating:

  • a data lake using open-source tools,
  • hospitals and long-term care facilities using FHIR, and
  • national and regional networks using current and future protocols.

We will show a fine-grained query API that can efficiently query data sources using a common syntax, enabling systems with varying capabilities to be accessed uniformly, improving data access efficiency across standards. We will identify key standards and open-source tools for data collection and enable the integration of MDM and record linking services. We will show how to represent a data model in FHIR, enabling mapping and extraction of data from any format into models suitable for analysis.
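
As an illustration of the "common syntax" point, a standard FHIR search takes the same shape against any conformant source; the endpoint and identifiers here are hypothetical:

```python
import requests

BASE = "https://hie.example.org/fhir"        # hypothetical FHIR server

resp = requests.get(
    f"{BASE}/Observation",
    params={
        "patient": "Patient/123",            # hypothetical patient reference
        "category": "laboratory",
        "date": "ge2023-01-01",              # standard FHIR search prefix
        "_sort": "-date",
        "_count": 50,
    },
    headers={"Accept": "application/fhir+json"},
)
bundle = resp.json()
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    print(obs["code"].get("text"), obs.get("effectiveDateTime"))
```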

Finally, we will describe an architecture using existing open-source tools and standards that can support the critical data collection needs of a modern clinical data platform that serves public health.

Federated Data Modernization through Standardization and Harmonization for Public Health Evidence

Speakers: Mui Van Zandt, VP/GM, Real World Data and Technology, IQVIA
Atif Adam, PhD, MD, MPH, Associate Director of Epidemiology, IQVIA

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: The COVID-19 pandemic has highlighted the importance of using advancements in interoperability to establish a solid foundation for effective health data exchange and to tackle the significant disparities in health outcomes within the country. The task now is to maintain that momentum and continue to prioritize health information exchange, in order to share data and build advanced, precise public health collaborations not just within individual counties but also across counties and states.

IQVIA proposes the development of a surveillance network of multi-level health data built around the OMOP common data model (CDM) to provide seamless and interoperable data from various sources and to ensure a comprehensive understanding of population health. Observational Health Data Sciences and Informatics (OHDSI) and the OMOP CDM standardize the data ingestion chain, harmonize health ontologies, and allow deployment of reproducible, accurate, and well-calibrated evidence-based analytics at scale. IQVIA has extensive experience with the implementation of the OMOP CDM and evidence generation utilizing this technology at scale.

IQVIA has helped push the adoption of the OMOP CDM globally, especially among government entities. This approach has been used in several government real-world evidence (RWE) generation initiatives, both within the US and with government entities globally. The FDA Biologics Effectiveness and Safety (BEST) program has, since 2017, supported surveillance activities looking at rare adverse events caused by CBER-regulated products such as vaccines, blood, blood products, and tissues. The European Medicines Agency's (EMA) Data Analysis and Real-World Interrogation Network, or DARWIN EU, established in 2022, will allow fast access and analysis across the network for conducting non-interventional pharmacovigilance studies. The European Health Data & Evidence Network (EHDEN), since 2018, has been providing a new paradigm for the discovery and analysis of health data in Europe by building a large-scale, federated network. Through these programs, IQVIA has provided support in developing response-ready surveillance systems that allow for secure and trusted health data exchange between healthcare policy makers, hospital providers, and academic research organizations to rapidly execute real-time, evidence-based studies.

By utilizing the OMOP CDM, we take an integrated approach to address the significant challenges in public health data modernization such as data silos, inadequate data privacy and security, and limited ability to perform response-ready analytics. Our network-based approach will foster agile decision-making practice at all levels to create targeted programs and innovative practices, resulting in better health and reducing health disparities across populations.
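
A hypothetical example of the kind of standardized analytic query the OMOP CDM enables: because every site shares the same schema and vocabularies, the same SQL runs unchanged across a federated network. The table and column names below are standard OMOP CDM; the database connection is a local stand-in:

```python
import sqlite3   # stand-in for each site's actual database driver

SQL = """
SELECT c.concept_name,
       COUNT(DISTINCT co.person_id) AS persons
FROM condition_occurrence AS co
JOIN concept AS c
  ON c.concept_id = co.condition_concept_id
WHERE co.condition_start_date >= '2023-01-01'
GROUP BY c.concept_name
ORDER BY persons DESC
LIMIT 10;
"""

conn = sqlite3.connect("omop_cdm.db")   # hypothetical local CDM instance
for name, persons in conn.execute(SQL):
    print(name, persons)                # top conditions by distinct persons
```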

Developing Configuration-Controlled Data Harmonization Service Building Blocks Using Commodity Code

Speaker: Dirk Lieske, Director of Sparklabs, SparkSoft Corporation

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: The presentation will detail how teams can develop reusable, configuration-controlled, data harmonization building blocks with simple commodity languages like SQL. During the presentation, topics covered will include:

Justification: Sparksoft understands that CDC-ONC has several technical decisions to make as part of their roadmap for modernizing data analytics. As detailed in CDC's Industry Day overview video, data harmonization building blocks will be needed to support the North Star Architecture. However, many questions remain, including what the building blocks should look like, how they should be built, who should maintain them, etc. Sparksoft has already successfully developed a series of data harmonization building blocks, which we call "Enhancements," along with a virtual harmonization approach using only scripting languages: SQL, KSH, and Python. These languages are specifically chosen for their broad industry adoption, with consideration for how language selection affects system security and transparency. Our goal is to help CDC-ONC and associated stakeholders understand and benefit from our approach as they look to modernize CDC-ONC's analytic architecture.

  • Topic 1 Defining a Building Block: Teams can choose to develop an infinite number of harmonization building blocks, so which data harmonization tasks should be performed as a building block, and where should the remaining logic live? User and stakeholder understanding is a key factor when developing building blocks. In the simplest of terms, you should be able to explain what a building block does in 30 seconds or less to a general user. A good example: "This building block ensures users have access to valid data types." In addition to suggesting building blocks, we will discuss how CDC's Shared Analytic Zone could support versioned virtual logic (Schema on Read and Views) to complete the harmonization journey.
  • Topic 2 Designing a Building Block: We will discuss building block standardization, and how building blocks should preserve the original data asset while working to extend, flag, or tag that data in a way that provides users with a better product.
  • Topic 3 Developing a Building Block: Configuration-controlled building blocks are essentially code generators. We will walk through a real-life example using dynamic SQL (i.e., SQL that generates SQL based on metadata and configurations); see the sketch after this list.
  • Topic 4 Controlling a Building Block: We will discuss the central role metadata plays in building blocks and how using standard methods to preserve and track building block execution configurations, along with associated logging and error messages, can establish a rich repository of consistent central system metadata.
  • Topic 5 Performance Considerations: We will discuss technical considerations regarding building block runtime performance.
  • Topic 6 Versioned Virtual Harmonization: We will discuss how complex business rules can be preserved and versioned as virtual transformations providing users direct access to historic harmonization logic. This approach provides harmonization developers an agile development approach for efficient delivery.
  • Wrap-up: We will demonstrate the building block process in real time, stepping through the loading and harmonizing of one of CDC's existing public data files, showing attendees building blocks in action.
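
A rough sketch of the code-generator pattern from Topic 3, written here as Python that emits SQL; the metadata, object names, and TRY_CAST dialect are illustrative and not Sparksoft's actual Enhancements:

```python
METADATA = [   # hypothetical configuration for one data asset
    {"column": "dob",      "target_type": "DATE",    "default": "NULL"},
    {"column": "zip_code", "target_type": "CHAR(5)", "default": "'00000'"},
]

def build_view_sql(source: str, view: str, metadata: list) -> str:
    casts = ",\n  ".join(
        f"COALESCE(TRY_CAST({m['column']} AS {m['target_type']}), "
        f"{m['default']}) AS {m['column']}"
        for m in metadata
    )
    # Schema-on-read: the base table is preserved; the view carries the logic
    # and can be versioned (cases_v1, cases_v2, ...) for virtual harmonization.
    return f"CREATE OR REPLACE VIEW {view} AS\nSELECT\n  {casts}\nFROM {source};"

print(build_view_sql("raw.cases", "harmonized.cases_v1", METADATA))
```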

Using Federated Enterprise Artificial Intelligence to Manage a Pandemic - T2S Solutions & Dell

Speaker: David Bauer, PhD, Chief of AI & ML, T2S Solutions, Inc.

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: Please join Dr. Bauer as he describes a recent project with Dell Technologies that leverages AI to understand and model the pandemic and its impact on the world as a whole. Dell Technologies partnered with Dr. Bauer and his team to implement the Avicenna platform for COVID-19 planning and modeling for the Joint Artificial Intelligence Center's project SALUS.

Avicenna's primary model is an epidemiological (SEIR) model, driving the transmission of disease between individuals as they interact at different locations over time. This modeling approach to disease spread has been implemented at small scale in agent-based models for decades. We have transformed the traditional SEIR differential equations into a discrete event model that runs at scale in ROSS. Avicenna is unique in that the fidelity of the human behavior scenarios and demographics that can be incorporated is nearly limitless, given the scalability of the underlying engine. Fidelity within the human component of transmission is what enables Avicenna to produce results that are unprecedented in the field of infectious disease spread. The modeling was verified to be ~93% accurate for predicting COVID-19 at a neighborhood level, which allowed for proper planning and a proactive, rather than reactive, approach to mitigating the impact of the pandemic.
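
For orientation, here is a plain discrete-time SEIR sketch with illustrative parameters; Avicenna's actual engine is a discrete-event, agent-level implementation in ROSS, which this does not reproduce:

```python
import numpy as np

def seir(days=180, N=1_000_000, beta=0.35, sigma=1/5.2, gamma=1/10):
    S, E, I, R = N - 10, 0.0, 10.0, 0.0     # seed with 10 infectious people
    infectious = []
    for _ in range(days):
        new_exposed    = beta * S * I / N   # S -> E (transmission)
        new_infectious = sigma * E          # E -> I (end of incubation)
        new_recovered  = gamma * I          # I -> R (recovery/removal)
        S -= new_exposed
        E += new_exposed - new_infectious
        I += new_infectious - new_recovered
        R += new_recovered
        infectious.append(I)
    return np.array(infectious)

curve = seir()
print(f"peak infectious: {curve.max():,.0f} on day {curve.argmax()}")
```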

Pennsylvania's Federated HIE Model Provides Faster Data Insights and Decision-making - Cognosante

Speakers: Michael Lundie, Vice President, Interoperability Solutions, Cognosante, LLC
Martin Ciccocioppo, Director, Pennsylvania eHealth Partnership Program, Pennsylvania Department of Human Services

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: The Pennsylvania Patient and Provider Network (P3N) is a statewide federated health information exchange (HIE) for participating health information organizations (HIOs). A main service of the Pennsylvania Department of Human Services (DHS) PA eHealth Partnership Program, the P3N facilitates the secure sharing of patient information and health records held by an HIO-connected provider with other providers connected to that same HIO, as well as enhanced provider reporting. Following a major system design, development, and implementation phase, a new P3N went live in August 2022 and includes: a Master Patient Identifier that leverages referential matching; statewide Query and Retrieve; and statewide and interstate alerting with real-time HL7 Admission, Discharge, and Transfer (ADT) messages. P3N maintains connections with five Certified Participating HIOs and one state agency EHR: the PA Department of Corrections. The program has established, or is in the process of signing, interstate agreements for sharing ADT messages with Delaware, Maryland, West Virginia, Connecticut, and the District of Columbia.

As part of their HIE ecosystem, Pennsylvania DHS implemented a Public Health Gateway with bidirectional interfaces for multiple state public health registries and databases. It allows for the seamless integration and aggregation of data from major health systems, hospitals, ambulatory care sites, laboratories, and public health organizations. Data reported and shared includes: lab results performed by inpatient and ambulatory clinical care providers; immunizations delivered; updates to the Pennsylvania cancer registries; and prescription drug monitoring. With more than 6 million messages per year, this data is available across the Commonwealth, removing data silos and increasing the speed and accuracy of information at the point of care. By utilizing these transactions, P3N can quickly identify and track patient movement within the Commonwealth's healthcare system, allowing for prompt identification of and response to outbreaks. Today P3N surveys ADT transactions for COVID-19 exposure and confirmed cases, providing a daily extract from the ADTs to epidemiologists working on COVID. This extract provides valuable patient demographic and diagnosis information (including age, race, and ethnicity) and allows DOH to target specific at-risk population groups.

Because P3N resides within the Department of Human Services, it enables close ties with the Department of Health and affords the Commonwealth enhanced collaboration within its health data ecosystem. The PA eHealth Partnership Program supports infectious and non-infectious disease programs. Today they are working on additional surveillance tools, such as surveillance for suspected or potential child abuse. The ability to quickly incorporate social determinants of health and other unique data sources significantly enhances patient-centered care and care coordination, and strengthens the Commonwealth's ability to be more proactive and data-driven in tackling emerging health threats.

In this presentation, we discuss the technical and logistical aspects of further enhancing the Public Health Gateway, along with data governance, privacy, and security considerations. We will share successes and learnings from implementing this critical public health infrastructure. In addition, we will show how it can be enriched with new data sources to demonstrate predictive modeling capabilities that forecast, inform, and provide faster data insights.

CDC-ONC Industry Days Opening Plenary - Day 2

Speaker: Umair A. Shah, MD, MPH, Washington State Secretary of Health, Washington State Department of Health

Watch the recording

View/download slides

Data Modernization Facilitates Advanced Analytics and Unlocking Health Insights - Guidehouse

Speakers: Anthony Cristillo, PhD, MBA, MS, Partner & Digital Health Lead, Guidehouse
Rod Fontecilla, PhD, MS, Chief Innovation Officer, Guidehouse

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: While every organization's digital transformation journey will be different, they all include three tenets for managing the expanding data landscape: i) data governance, ii) data access, and iii) data usage practices that support an ecosystem of continuous innovation and data interoperability (e.g., FHIR). As the Data Modernization team implements the DMI's North Star Architecture and its progressive governance strategy, CDC will be able to realize Director Walensky's vision of faster, actionable, data-driven decisions at all levels of public health.

Within the health industry, the use of Real-World Data (RWD) and Real-World Evidence (RWE) is becoming a critical component of CDC's data modernization goals. Connecting disparate data sources, integrating unstructured and structured data, and accelerating health insights are all achieved by harnessing advanced analytics and providing a scalable platform that incorporates government systems, health and medical industry data, and untapped patient and research repositories.

Instituting common data models and data standards sets the stage for Advanced Analytics (AI/ML/NLP) to extract valuable health insights from the RWD and RWE of patient-reported experiences, electronic health records (EHR), and other medical/clinical repositories that could previously be accessed only through hundreds of human hours.

Guidehouse recently developed a COVID-19 Vaccine Adverse Event Surveillance Portal demonstrating the power of this approach to i) integrate multiple adverse event data sets, ii) leverage Natural Language Processing (NLP) to extract RWD from adverse event reports, and iii) layer AI/ML tools to interrogate the data and unlock health insights and discovery in adverse events across different patient populations and co-morbidities. More specifically, this solution combines NLP tools with a strong semantic foundation that utilizes terminologies (e.g., MedDRA, SNOMED CT) to harmonize and link medical language. This enables the interpretation and isolation of critical components in the RWD/EHR and avoids data duplication, data gaps, and ambiguous results.

A cross-sectional data lake integrates complex sets of data ingested from the Vaccine Adverse Event Reporting System (VAERS) and combines them with a Tableau dashboard, facilitating trending and visualization. Information can easily be organized around questions such as "What were the most commonly reported adverse events in juveniles with diabetes?" or "What is the comorbidity of oncology patients and vaccine treatments?" Data can also be quickly sorted and filtered to detect anomalies (outliers) or clusters by region, product, or demographic information. Further, analytics of such RWD helped to distinguish on-label adverse events, noted in clinical trials, from off-label adverse events noted following immunization of more than a hundred million individuals across the United States. With machine learning, the service could proactively detect and alert healthcare professionals to emerging issues, elevated norms, and trends using risk factor/biomarker indicators and predicted adverse events. As CDC's data modernization capabilities mature, new services, predictive models, and internal systems can be institutionalized, achieving greater efficiencies and insights that transform patient care.
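
A toy illustration of the terminology-harmonization step described above; real pipelines map free text to MedDRA/SNOMED CT with NLP rather than the invented exact-match dictionary used here:

```python
TERM_MAP = {   # hypothetical surface-form -> standardized-term mapping
    "headache": "Headache",
    "head ache": "Headache",
    "cephalgia": "Headache",
    "fever": "Pyrexia",
    "high temperature": "Pyrexia",
}

def normalize_terms(report_text: str) -> set:
    text = report_text.lower()
    return {std for surface, std in TERM_MAP.items() if surface in text}

print(normalize_terms("Patient reported a head ache and high temperature."))
# {'Headache', 'Pyrexia'} -- varied phrasings aggregate under standard terms
```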

Building a Privacy-Sensitive, Decentralized, Demand-Driven Supply Chain (3DSC) for Pandemic Response

Speakers: Gary Jackson, Director, Secure Data Fabric, CGI Federal
Jason Porter, Vice President Emerging Technology Practice, CGI Federal

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description:

Problem Statement:

The media discourse around monkeypox (MPOX) has spread falsehoods faster than the virus itself. Health advocates fear that misinformation about how the virus spreads has created discrimination, specifically against members of the LGBTQ+ community. One such claim was that the LGBTQ+ community is at higher risk than other communities. This mirrors what happened in the early 1980s with the stigma of AIDS and HIV. If the media creates a stigma around a particular group affected by a virus or condition, it tends to turn them away from the healthcare that could prevent it. Worse yet, if revealed, the information meant to help those who are infected could be weaponized to destroy their lives instead of helping them heal.

Solution:

CGI Federal, in collaboration with the University of Tennessee Knoxville, created a decentralized solution to reach the populations most affected by MPOX without giving away personal information that could subject them to bias or prejudice. These populations faced barriers to awareness, education, and preventive measures like vaccination. CGI Federal worked first on offering a decentralized, collaborative digital exchange that prevents data movement and protects personally identifiable information (PII). Secondly, CGI Federal used assets developed by internal staff and University of Tennessee students and researchers to determine where MPOX cases were increasing, informing manufacturers of the JYNNEOS and ACAM2000 vaccines on where to distribute to clinics and commercial partners such as CVS, Walgreens, and others.

In this DeSci (decentralized science) marketplace, digital assets (datasets, databases) of confirmed cases were paired with algorithms to determine:

  • Inventory Days of Supply: helps identify how much inventory is needed to cover order fulfillment for a specific number of days.
  • Perfect Order Percentage: helps identify the percentage of orders that ship on time without any damage, delay, documentation error, etc.

Information was shared without revealing private details, using techniques such as Zero-Knowledge Proofs (ZKP) and Trusted Execution Environments (TEE).
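
Simple reference implementations of the two metrics above, using their conventional definitions; the figures are made up:

```python
def inventory_days_of_supply(on_hand_units: float, avg_daily_demand: float) -> float:
    # How many days the current inventory covers at the average demand rate.
    return on_hand_units / avg_daily_demand

def perfect_order_percentage(orders: list) -> float:
    # Share of orders shipped on time, undamaged, with correct documentation.
    perfect = sum(1 for o in orders
                  if o["on_time"] and not o["damaged"] and not o["doc_error"])
    return 100 * perfect / len(orders)

print(inventory_days_of_supply(on_hand_units=12_000, avg_daily_demand=800))  # 15.0
orders = [{"on_time": True,  "damaged": False, "doc_error": False},
          {"on_time": True,  "damaged": True,  "doc_error": False},
          {"on_time": False, "damaged": False, "doc_error": False}]
print(round(perfect_order_percentage(orders)))                               # 33
```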

Impact:

Using these technologies, CGI Federal was able to prevent ascertainment bias, which generally refers to situations in which the way data is collected makes some members of a population more likely to be included than others. In the end, it demonstrated supply chain management in response to demand signals without compromising the data that makes patients vulnerable.

Focusing on the outputs of this collaboration allowed for equitable vaccine distribution. It was also determined that members of the LGBTQ+ community have stronger relationships with their medical practitioners than heterosexual males do, meaning more cases of the outbreak were reported by the LGBTQ+ community. Outreach was therefore recommended to make vaccination easier for the heterosexual male community and to overcome the stigma.

Overall, psychology became the biggest factor to overcome in the pandemic response surrounding MPOX. It might be impossible to change media discourse, but trusted person-to-person communication is the best way to circumvent falsehoods. CGI Federal and the University of Tennessee Knoxville determined that protecting the privacy of one community protects all communities.

Analytical Tools for Health Equity Assessment - The Lewin Group, part of Optum Serve

Speaker: Suman Challagulla, Managing Consultant, The Lewin Group (Optum Serve)

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: COVID-19 highlighted the need to identify vulnerable populations and available resources to enable emergency preparedness and response and to advance health equity. To achieve this, Optum Serve used Tableau, SAS, and SQL Server, with automated data processes, to develop two analytical tools: the Health Equity Assessment Review Solution (HEARS) and the Priority Identification Vaccine Operating Tool (PIVOT). Optum Serve is working to deploy them in the cloud for extensibility and scalability.

HEARS, a suite of interactive visualizations using public data, shows metrics of healthcare access and quality of care for vulnerable populations. HEARS includes:

  • Health Risk and Disease Burden: Identifies vulnerable communities by the prevalence of select adverse health outcomes.  
  • Access to Care: Shows geographical views of the percentage of the population who do not have health insurance, medication for high blood pressure, routine checkups, and dental visits.
  • Quality of Care: Shows six quality measures that quantify the quality of care to help prioritize vulnerable communities.
  • Social Determinants of Health: Shows visualizations of the Area Deprivation Index to identify communities with social disadvantages and fewer resources.
  • Community Resources: Maps locations of healthcare resources to identify areas where resources are scarce or inaccessible, helping target healthcare education and intervention programs.

PIVOT, a business intelligence support tool for vaccine distribution, provides states with the analytics needed to optimize the planning, administration, and monitoring of vaccines. PIVOT includes:

  • Priority Population Mapping: Quantifies the size of priority populations at a ZIP code level.
  • Socio-economically Disadvantaged Communities Mapping: Visualizes the Area Deprivation Index at the census level, helping state vaccine programs target socio-economically disadvantaged areas.
  • Vaccine Administration Network Mapping: Maps the locations of hospitals, nursing facilities, healthcare centers, Federally Qualified Health Centers, pharmacies, schools, and major sports venues to plan vaccine administration networks.
  • Vaccine Network Catchment Area Analyzer: Helps bridge the target population with vaccine providers to inform the optimal use of resources.
  • Vaccine Monitoring: PIVOT can intake data from state vaccine registries to report on the administration of vaccines at a ZIP code level and estimate vaccine uptake for the target population.

Leveraging Public Private Academic Partnerships to Design and Implement Novel Exposure Notifications

Speakers: Umair Shah, MD, MPH, Washington State Secretary of Health, Washington State Department of Health
Bryant Thomas Karras, MD, Chief Medical Informatics Officer, Washington State Department of Health

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: Ongoing collaboration between public, private, and academic partners has been critical to the accelerated rollout and success of exposure notification (EN) technology in the U.S. It has increased credibility and trust in the technology, leading to higher rates of adoption and engagement. The partnerships have fostered continued EN technology development, including expanding EN to allow self-reported home test results and integrating EN backend operations with local, state, and national reporting streams. WA DOH is exploring future integration with NIH and CareEvolution's Make My Test Count portal.

From the beginning of the COVID-19 pandemic, Washington state has been at the forefront of EN development with a commitment to strengthening the public-private-academic partnerships and multi-state community of practice that have made this tool so successful.

WA DOH is committed to modernizing and integrating its public health infrastructure. EN has become an effective and increasingly integrated public health tool in Washington state and beyond, as a result of these partnerships.

A Partnership Among States and Industry to Advance Interoperability – Michigan Department of Health

Speaker: Jeffrey Duncan, PhD, MS, Director, Division for Vital Records and Health Statistics, Michigan Department of Health and Human Services

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: One of the many public health lessons learned through the COVID-19 pandemic was that the process of reporting, coding, and tabulating mortality information from death certificates was too slow.  As a result, in 2021 CDC awarded funding to all states and territories for National Vital Statistics System (NVSS) modernization through the Epidemiology and Laboratory Capacity for Prevention and Control of Emerging Infectious Diseases (ELC) Cooperative Agreement.  The primary purpose of the funding was to support the development and implementation of Fast Healthcare Interoperability Resources (FHIR)-based interoperability between state death registration systems and the National Center for Health Statistics (NCHS).  

State health departments have nearly a decade's worth of experience with HL7 version 2 and version 3 standards used to report immunizations, syndromic surveillance, and reportable diseases, but no experience with the new HL7 FHIR standard. NCHS built a Community of Practice (COP) among NVSS jurisdictions and their vendors to address this and other modernization issues.

The State of Michigan worked with its vendor, Altarum Institute, to design a FHIR solution using Lyniate's Rhapsody Interface Engine, a product that many state health departments already own and use for existing public health reporting use cases.  Altarum currently manages Michigan's Rhapsody instances and designs and operates interfaces with healthcare entities across Michigan.

When it was learned that multiple other states were interested in leveraging their existing Rhapsody capabilities to meet this new challenge, Michigan worked with NCHS to create a subset of the larger COP focused on Rhapsody integration issues, sharing solution requirements, design criteria, development and testing artifacts, and other useful information among states and their vendors. The current Rhapsody user group includes representatives from 10 states, vendors, NCHS, NAPHSIS, and MITRE.

This presentation will focus on the challenges states face in implementing FHIR-based interoperability; the NCHS Community of Practice approach and how it fostered an environment in which Michigan could identify other interested jurisdictions; and the Rhapsody FHIR user group, including its organization, scope, and benefits to participants. The presentation demonstrates how jurisdictions, vendors, and federal partners can collaborate and share experience, and even development artifacts, to expedite the adoption and implementation of new technology.

Best Practices for Modernizing Data Systems Across a Vast and Diverse Public Health Ecosystem

Speakers: Corrina Dan, RN, MPH, Vice President, Maximus
Mark Headd, MPA, Government Technology Strategist, Ad Hoc, LLC

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: As CDC seeks to modernize public health data systems at the federal level, a well-orchestrated collaboration with the many stakeholders will be needed. Today, the public health data ecosystem consists of a complex set of systems managed by local and state public health departments, tribes and territories, non-profit organizations such as APHL with the AIMS platform, and a diverse array of local clinical actors, facilities, and systems providing data on infectious and chronic diseases. The participating organizations vary widely in their technical skills and capabilities and will need significant support to modernize along with CDC on the 10-year roadmap. Emerging health information network advances and health data standards represent critical opportunities to increase the speed and quality of data as it begins to flow across the modernized ecosystem.

Best practices exist for modernizing large, complex health systems that can help CDC succeed. For example:

  • Workforce development through the establishment of learning communities across the ecosystem that provide training materials, up-to-date policies and programs, tools such as open-source software, and peer-based support technologies can facilitate rapid uptake and engagement across the various domains.
  • Developing API services that enable participating organizations to connect and leverage their existing systems more quickly and exchange data with new CDC systems will also be essential.
  • Establishing communication, coordination, and feedback mechanisms that support the all-of-agency approach at the Federal level and engage state and local stakeholders at appropriate levels across the ecosystem.

Maximus is a unique government partner with an established focus on the entire system: deep partnerships with CDC and other federal government partners like CMS and VA; broad partnerships with state and local government partners providing public health, Medicaid, and other human services programs; expansive partnerships with industry leaders like AWS, Genesys Cloud, and Salesforce; and deep expertise and a commitment to community in partnership with the National Network of Public Health Institutes (NNPHI), ASTHO, and the non-profit community. We recognize that it will take the sum of stakeholders engaging around shared goals to achieve public health data modernization.

This session will feature Maximus Public Health leadership and our partner Ad Hoc, a U.S. digital services company, providing an overview of our approach and examples of how we can support the public health ecosystem as we collectively reimagine and rebuild our public health data systems, using a multi-sectoral systems approach to lift all stakeholders and their communities toward improved population health outcomes through data modernization.

Advanced Disease Modelling to Inform Proactive Global Disease Management in Real Time - Airfinity Ltd

Speakers: Matt Linley, PhD, MEng, Analyst Director, Airfinity Ltd
Rasmus Bech Hansen, MPA, CEO, Airfinity

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: The COVID-19 pandemic has brought to light the critical need for efficient and effective responses to emerging infectious disease threats. As a leading expert in the tracking, assessment, simulation, and prediction of infectious diseases and selected non-communicable diseases worldwide, Airfinity offers advanced simulation capacity, deep domain expertise, and the ability to forecast and predict disease burden and outcomes under various scenarios. Here, we intend to present our approach to rapid disease tracking and modelling and outline how these analyses have been used by key decision makers to inform strategy and save lives.

Airfinity's unique approach utilizes a wide range of data sets from around the world to build country proxies and analogues, allowing for assessments and forecasts even in the absence of data or limited data. Our extensive data repository, built over several years, enables us to account for country-specific factors such as population composition, natural and vaccine-induced immunity, healthcare system resilience, ICU capacity, testing levels, treatment demand and availability, among others, to develop models and simulations with parameter sensitivity analysis to understand which factors have the greatest influence on outcomes.

Most recently, this approach has proven highly useful for understanding COVID-19 in China, but it can be applied to any other country and infectious disease. As alternative data becomes increasingly available globally and the likelihood of a comprehensive global monitoring system decreases, this method is likely to become the gold standard. The US government has a unique opportunity to be at the forefront of this pioneering work.

Concretely, Airfinity can offer unparalleled value to the Centers for Disease Control and Prevention (CDC) and the wider federal government by providing quick risk assessments with a 24-hour turnaround time and modeling potential risks under different scenarios. This capability has been demonstrated throughout the COVID-19 pandemic and with other disease outbreaks, for example:

By monitoring, tracking, and analyzing the emergence of the Omicron variant, with alarming data released on a Friday evening, we were able to brief government clients by Monday morning on spread, burden, and global impact, and to advise on country protection measures.

By predicting the hospital burden from the COVID/flu/RSV "tripledemic," we were able to alert governments to key health system risks 5-6 months in advance.

Within 24 hours of global news on the increased spread of MPOX, we were able to brief governments and help clients ensure sufficient supplies of vaccines.

Our swift analysis across infectious diseases has been crucial in understanding rapidly evolving threats and situations, which has aided our clients in their decision-making and undoubtedly helped save lives. As the leaders of the UK Vaccine Taskforce, Dame Kate Bingham and Mr Clive Dix, stated: "Airfinity data support has been instrumental in the UK government's COVID-19 response and has helped save lives."

Airfinity is well-established in the US with an office and a range of clients, and we believe we can be an important partner in the federal government's data-driven approach to reducing disease burdens.

Modern Data Platform | A Practical Example of Data Mesh in Practice - Skyward IT Solutions

Speaker: Vitaliy Baklikov, Vice President of Engineering, Skyward IT Solutions

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: In this talk, I will share an example of implementing Data Mesh architecture within the context of a large financial institution and how it applies to the public health domain. I will cover the challenges, tools, processes, and lessons learned on this digital transformation journey. The talk reflects on my time as an Enterprise Architect in charge of building and rolling out a data platform at one of the largest banks in the world. The audience will benefit by learning about Data Mesh principles and data lake strategies, and how to avoid pitfalls while modernizing data infrastructure at CDC. Note: I am not affiliated with any software or platform vendors, and the purpose of this talk is NOT to endorse any tools. The talk is mainly to share experience and offer advice to industry peers who are embarking on this journey. Target audience: data strategists, business leaders, enterprise architects, analytics directors, CDOs, CAOs, and IT departments.

Data Quality at Scale: Rapidly Evolving and Expanding our Master Patient Index with Large Datasets - CRISP Shared Services

Speakers: Andy Hanks, Senior Data Architect, CRISP Shared Services
Craig Behm, MBA, CEO, CRISP Shared Services

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: In March 2020, CRISP Shared Services (CSS) supported data in three states with a mature master patient index (MPI) system (Initiate) in a cloud environment, providing real-time organization of unique entities from the two million HL7 and CCD messages received per day. The MPI ensures that patient records and queries are matched to the correct person.

In March 2020, CSS began a project to build a cloud-based data lake to support COVID-19 case reporting; the system went live within six weeks for the first state. Almost immediately, we recognized that full integration with the MPI was required for rapid processing of datasets. Full integration enables daily delivery of downstream bulk data, such as COVID-19 cases matched with hospitalization status and enhanced race and ethnicity data. We built a pipeline that received COVID-19 cases and hospitalizations from admission, discharge, and transfer (ADT) feeds, processed the incoming data starting at 3:30 a.m., and delivered matched output to a contact tracing system by 6 a.m. Following the success of this effort, we began receiving NEDSS HL7 messages in a real-time feed from the state, and combined the daily official case file with the real-time new cases to send new cases to the contact tracing system hourly.

To efficiently continue this rapid, daily process, we recognized the need for a daily, updated copy of the MPI within the data lake. This decision made a large number of use cases possible and, at the same time, reduced the number of queries against the production MPI: we first check our copy (which is aligned with all the other data snapshots through midnight of the previous day) and only secondarily access the live MPI.
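
The pattern is roughly the following. This is a minimal sketch: the function names, exact-match key, and dictionary-shaped snapshot are illustrative simplifications, since production MPI matching (such as Initiate's) is probabilistic rather than an exact lookup.

```python
# Sketch of the "snapshot first, live MPI second" lookup described above.
# Names and structures are hypothetical; real MPI matching is probabilistic,
# not an exact dictionary lookup.
from typing import Callable, Optional

def resolve_patient_id(demographics: dict,
                       mpi_snapshot: dict,
                       query_live_mpi: Callable[[dict], Optional[str]]
                       ) -> Optional[str]:
    """Return an enterprise patient ID, preferring yesterday's snapshot."""
    key = (demographics["last_name"].upper(),
           demographics["first_name"].upper(),
           demographics["dob"])
    # 1. Cheap lookup against the daily MPI copy in the data lake, aligned
    #    with the other data snapshots through midnight of the previous day.
    if key in mpi_snapshot:
        return mpi_snapshot[key]
    # 2. Fall back to the production MPI only on a miss, keeping query
    #    load on the live system low.
    return query_live_mpi(demographics)
```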

As of January 2023, CSS serves six states, with five separate COVID-19 immunization incoming daily feeds and one all-immunization incoming daily feed. The MPI copy in the data lake has allowed us to align with all of the changes in state immunization systems, including adds, deletes, and merges of patients. We are able to identify cases, such as a patient who gets married or changes address, or an administrative error in entering the patient, that are not usually deduplicated by the immunization systems.

Key to our success is our partnership with the states that we serve. We can provide better clinical matching by using state immunization data to improve MPI data quality, send reports customized to the state team performing the merges, and flow HL7 messages back to the immunization systems to supply the latest phone numbers where one is missing.

For data quality to continuously improve at scale, health information exchanges (HIEs) and Health Data Utilities using the services of CSS must work hand-in-hand with our local health departments. By bringing together clinical and public health data, we can more effectively serve these different but equally important data streams.

Using IBM Analytics@Scale to Align Analytic Solutions to Public Health Leadership Priorities - IBM

Speakers: Mark Freeman, PhD, MPA, Chief Data Scientist, IBM
Suaina Menawat, MS, Associate Partner, Public Health, IBM Consulting

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: IBM's Analytics@Scale approach allows us to collaborate with public health agency leadership to re-imagine how data can enable the agency to improve the health of its residents. The approach focuses on starting where you are and co-developing the most promising solutions, such as a unified dashboard serving as a shared technology service across public health and sister agencies, an analytics tool incorporating NLP and machine learning to optimize the reach of public health communications, and machine learning algorithms embedded in data pipelines to identify hospitalizations caused by COVID-19 illness. We showcase examples from our public health agency clients where we continue to generate solutions through shared technical services serving production-grade data and analytics pipelines and workflows. This session focuses on an iterative, agile approach to prioritizing and delivering high-value analytics solutions for CDC and STLTs.

Experiences in the Data Commons: Bringing Together Data Community to Improve Patient Outcomes - RTI

Speakers: Rebecca Boyles, MSPH, Senior Director, Center for Data Modernization Solutions, RTI International
Rashonda Lewis, JD, MHA, Data Governance and Privacy Specialist, RTI

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: CDC is leading a national effort to create modern public health surveillance infrastructure. RTI has worked with the National Institutes of Health (NIH) to create efficient, modular, connected, and scalable data ecosystems that meet users' needs on demand, in real time. We discuss how our “Fit-for-Purpose” approach incorporates mission, resources, and the user community to inform the data ecosystem development process.

RTI’s approach to technical transformation can inform CDC’s efforts to modernize public health infrastructure. We lead data transformation as part of a larger community working toward a modular national ecosystem that allows participants to connect, collaborate, share data, and create insights that inform decisions.

RTI co-coordinates the NIH NHLBI BioData Catalyst® (BDC) [1,2]. In this community-driven ecosystem, researchers find, access, share, store, and compute on large-scale datasets. Secure, cloud-based workspaces, search tools and workflows, applications, and innovative features address community needs, including exploratory data analysis, genomic and imaging tools, tools for reproducibility, and adoption of open APIs to enable data exchange. As the HEAL Data Stewardship Group, RTI provides solutions for managing and coordinating diverse data across the 1,000+ studies in the Helping to End Addiction Long-term Initiative® (NIH HEAL). We collaborated with HEAL to define its data strategy and governance plan, developed a sustainability plan, and provided materials to upskill HEAL contributors and support translational science by researchers, healthcare providers, community leaders, and policymakers[3].

Identifying and piloting OpenAPI and other interoperability standards is critical to moving data across cloud platforms[4]. The new NIH Cloud Platform Interoperability effort, which RTI is coordinating, takes a use-case approach to developing and adopting foundational technical tools and approaches for an extensible, connected data ecosystem, ultimately leading to improved findability, accessibility, interoperability, and reproducibility of previously siloed data. We will also test and inform international standards in development by organizations like GA4GH and HL7, such as the Data Use Ontology[5–7].

Critical to these efforts is data governance. We integrate stakeholders into the data governance vision, aligning it with client culture and project aims. Our data governance approach includes five components that help us meet clients where they are: (1) understanding the client’s data culture; (2) involving the right people by creating a collaborative and inclusive operating framework of data management, decision, and change management accountabilities; (3) assessing and documenting in-scope data assets, interoperability standards, and technical requirements; (4) understanding the legal and policy landscape and the client’s operational processes for managing data; and (5) applying technical solutions that support client needs across the data lifecycle.

We present lessons learned working with clients and international stakeholders to make incremental and coordinated progress toward increasing the movement and utility of data, meeting users where they are, moving from use cases to scalable production, and integrating interoperability standards in a modular development model. Data transformation is a people-centric endeavor and demands a client-specific data governance structure. Our tailored approach builds a community around data ecosystem change and innovation, maximizes data access and compute capabilities for clients, and ensures compliance with data exchange requirements.

Real Time Health Care Facility Resource Information with IoT FHIR Devices - NAVOMI

Speakers: Vinod Koosam, Founder/CEO/Solutions Architect, NAVOMI INC
Veeresh Nama, MS, VP of Business Development, NAVOMI

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description:

SUBJECT:  

Real-time data sharing for emergency preparedness, response, and crisis management with automated, FHIR-capable smart IoT devices - a concept named FHIR Beacon

INTRODUCTION:

Today, situational awareness data on health care facility resource capacities, such as the availability of beds, PPE, and oxygen, is routinely provided by health care organizations to state and federal agencies using spreadsheets and PDFs.

Instead, with a smart IoT device attached to these resources, the same workflow can be accomplished automatically and in real time, with the data sent directly to the federal agency's endpoints (API front door) through FHIR-based data integrations.

METHOD:

The proposed FHIR Beacon is a low-footprint hardware/software IoT module that is aware of FHIR standards, purpose-built, and reusable, and that can be attached to and reconfigured for any healthcare facility resource.

The FHIR Beacon collects data about its attached resource and its environment using smart sensors, translates it to a FHIR-friendly standard JSON data format, and posts it to a target service such as a centralized database at CDC. The device can be managed and monitored remotely via an administration service and comes with functions such as:

  • Attach/detach/configure to and from new resources (beds, oxygen cylinders, PPE, ventilators, inventories of life-saving drugs and antibiotics, etc.)
  • Setting data collection intervals
  • Power levels, security, health checks, and error handling
  • Internet connectivity and GPS
FHIR Beacons are deployed around a facility and report resource utilization, and therefore available capacity, in near real time at scheduled intervals over WiFi.
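
As a rough sketch of what a single beacon report might look like on the wire, the snippet below posts a FHIR Observation to a hypothetical API front door. The endpoint URL, device identifier, and coding are illustrative assumptions, not a published specification.

```python
# Illustrative FHIR Beacon capacity report. The endpoint, device ID, and
# coding are hypothetical; a real deployment would follow the receiving
# agency's FHIR implementation guide and authentication requirements.
import json
import urllib.request

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "Available ICU beds"},        # illustrative coding only
    "device": {"display": "fhir-beacon-0042"},     # hypothetical device ID
    "effectiveDateTime": "2023-02-28T06:00:00Z",
    "valueQuantity": {"value": 12, "unit": "beds"},
}

request = urllib.request.Request(
    "https://example.gov/fhir/Observation",        # hypothetical endpoint
    data=json.dumps(observation).encode("utf-8"),
    headers={"Content-Type": "application/fhir+json"},
    method="POST",
)
# urllib.request.urlopen(request)  # enable against a real, authorized endpoint
```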

CONCLUSION:

Whether in times of preparedness or emergency response, automated, streamlined, standardized data gathering with FHIR Beacon aligns with Capability #6, Information Sharing, of "Public Health Emergency Preparedness and Response Capabilities: National Standards for State, Local, Tribal, and Territorial (STLT) Public Health" (https://www.cdc.gov/cpr/readiness/capabilities/index.htm).

Once standards around this mechanism are established, manufacturers of medical supplies and equipment can embed FHIR Beacon capabilities into future products. This will accelerate interoperability and the adoption of FHIR, and ultimately aid decision making, especially during emergencies.

Acronyms Used:

HL7: Health Level Seven, a set of international standards for transferring and sharing data between healthcare providers, created and maintained by Health Level Seven International (http://www.hl7.org/)
FHIR: Fast Healthcare Interoperability Resources, an HL7 standard for exchanging information about resources such as Patient, Measure, and Observation
JSON: JavaScript Object Notation is an industry standard and popular lightweight data interchange format
PDF: Portable Document Format
IoT: Internet of Things
PPE: Personal Protective Equipment
CDC: Centers for Disease Control and Prevention
GPS: Global Positioning System
WiFi: Wireless Fidelity
STLT: State, Tribal, Local, and Territorial

The Open Flexible Platform: Scale Resiliency and Availability for the Global Enterprise - Red Hat

Speaker: Wes Jackson, Senior Solution Architect, Red Hat

Topic: Public Health Data Modernization

Watch the recording

View/download slides

Session Description: Data is everywhere. To make timely and reliable predictions, we need to be able to deal with the high volume and variety of data where it is, and surface insights quickly. When the COVID-19 pandemic began, data was key to understanding where to focus resources and what mitigation strategies to put in place. The patchwork of data sources, legal boundaries and differing collection practices added to the difficulty of forming a high-quality global picture.

Using today’s technology, compute power at the edge can take whatever data is available, transform it into useful information, and provide forecasting tools to public health researchers in time to take meaningful action.

We will discuss technology trends that allow us to build a globally connected data platform supporting a system of predictive models, providing researchers and analysts with timely information, all while respecting data sovereignty and privacy rules and ensuring consistent access even with sparse connectivity.

Those trends involve combining available technologies to create a data science fabric through federated data models.  We can create standards around features, and then serialize and move data between models to form a more comprehensive picture. Data source variety no longer needs to be an impediment to insights.
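
One way to read "standards around features" is a shared, serializable feature schema that participating sites agree on before exchanging model inputs. The sketch below is a minimal illustration with hypothetical field names, not a Red Hat product API.

```python
# Minimal sketch of a shared feature record that distributed sites could
# agree on before exchanging model inputs; field names are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class CaseFeatures:
    region: str
    week: str                    # ISO week, e.g., "2023-W08"
    case_rate_per_100k: float
    test_positivity: float

record = CaseFeatures("region-a", "2023-W08", 41.5, 0.12)
payload = json.dumps(asdict(record))        # serialize for transport
restored = CaseFeatures(**json.loads(payload))
assert restored == record                   # round-trips losslessly
```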

Prognostic and Predictive Classification Approaches for Disease Prediction Modeling - Leidos

Speaker: Chetan Paul, Vice President, Leidos

Topic: Forecasting and Analytics

Watch the recording

View/download slides

Session Description: To develop an effective prediction model, it is crucial to understand risk factors, gain insight into patient-level data, and identify appropriate prognostic factors that support clinical decisions. To support this objective, we developed an innovative feature selection and modeling approach suited to disparate, healthcare-focused real-world big data. Our predictive modeling approach consists of two stages: the first generates a machine learning-ready (ML-ready) dataset, and the second uses that dataset for predictive modeling to identify cases with a high likelihood of developing the condition of interest.

Prognostic models aim to predict patients' future outcomes and use time-based analyses to make accurate predictions based on historical medical information. For this approach, each patient requires a time-to-event (Ti), censoring information, and a binary event indicator. Predictive classification models aim to provide time-invariant predictions of event likelihood. We developed and applied this approach to predicting Long COVID through an exhaustive and comprehensive feature selection analysis of datasets containing patients' medical histories, including condition occurrence, drug exposure, measurement, device exposure, and procedure occurrence tables with observations for individual patients. The performance analysis of the best predictive classification model showed an AUC of 0.93 and a balanced accuracy of 0.75.
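
For reference, the two reported metrics can be computed as in the minimal sketch below, which uses scikit-learn with synthetic labels and scores; it does not reproduce the study's data or model.

```python
# Computing the two metrics reported above (AUC and balanced accuracy).
# The labels and scores below are synthetic stand-ins, not study data.
from sklearn.metrics import roc_auc_score, balanced_accuracy_score

y_true = [0, 0, 0, 1, 1, 0, 1, 0, 1, 1]                        # outcomes
y_score = [0.1, 0.3, 0.2, 0.8, 0.4, 0.7, 0.9, 0.2, 0.6, 0.8]   # model scores

auc = roc_auc_score(y_true, y_score)
y_pred = [int(s >= 0.5) for s in y_score]   # assume a 0.5 decision threshold
bal_acc = balanced_accuracy_score(y_true, y_pred)
print(f"AUC = {auc:.2f}, balanced accuracy = {bal_acc:.2f}")
```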

Justification: This approach and methodology align with the objectives of the "Forecasting and Analytics" topic and are general enough to be applied to other domains of interest, such as infectious diseases and cancer. The explainability of model outcomes and the inherent incorporation of FAIR principles provide informed decision-making capabilities for disease modeling and the analytic capability to structure response and resources at the federal, state, and local levels.

Technology Building Blocks of Rapidly Deployed Public Health Projects - Scripps Health

Speaker: Ed Ramos, PhD, Director of Digital Clinical Trials, Scripps Health

Topic: Policy, Standards, and Technology

Watch the recording

View/download slides

Session Description: Public health initiatives need the capacity for rapid response and adaptation to change in order to meet the evolving needs of communities over time. Public health organizations can enhance their ability to respond both rapidly and adaptively by establishing a technical strategy founded on interoperability standards and reusable public health components within and across initiatives. This presentation will use real-world examples of how a reusable component strategy addressed the COVID-19 pandemic, from test validation (RADx Initiative) and early COVID detection (DETECT), through at-home and in-school COVID testing (SYCT and Test2Stay), to Paxlovid therapy in higher-risk communities (Plan4Health). Extending beyond COVID, the presentation will also demonstrate the application of these strategies to perinatal risk, especially in underserved populations. The presentation will demonstrate the benefits of the following reusable strategies:

  • A common approach to using standards and terminologies, whose adoption has grown significantly since the 2016 Cures Act and will increase further with the adoption of TEFCA and QHINs
  • Use of Claims, EMR, and Wearable Data to gain insight into the daily lives of people without needing direct contact with individuals
  • Reusable social media approaches to recruit participation from individuals
  • Digital Survey tools and Automated Eligibility verification to streamline participation and gathering participant provided information
  • Dashboards and reward systems to increase engagement and compliance
  • Kitting and fulfillment strategies to scale distribution of at home testing
  • Standard data models to streamline downstream data cleaning and analysis

Each of these strategies is a reusable building block that allows a public health department or academic medical center to rapidly assemble components and deploy them for a specific purpose. Based on the strategies discussed, public health initiatives can achieve shorter deployment times and rapidly adapt to challenges as they progress from initial data collection to targeted interventions, especially for specific subgroups such as underrepresented and underserved communities.

CDC-ONC Industry Days 2023 CDC Acquisitions - Day 2

Speaker: Arielle Douglas, Deputy Director, Office of Small and Disadvantaged Business Utilization, U.S. Department of Health and Human Services

Watch the recording

View/download slides

CDC-ONC Industry Days 2023 Closing - Day 2

Introduction: Judy Monroe, MD, President and CEO, CDC Foundation
Speaker: Bill Cassidy, MD, U.S. Senator from Louisiana

Session Description: Closing remarks and a video message from the Honorable Bill Cassidy.

Watch the recording


Get Updates on Data Modernization Initiatives and Events
Sign Up for Email Updates