Browsing by Author "Grannis, Shaun J"
Now showing 1 - 10 of 11
Clinical Versus Public Health Perceptions of Notifiable Disease Reporting Burden (2014)
Revere, Debra; Hills, Rebecca; Grannis, Shaun J; Dixon, Brian E.

Effect of Electronic Health Record Systems Access on Communicable Disease Report Completeness (2013)
Kirbiyik, Uzay; Dixon, Brian E.; Grannis, Shaun J

Electronic Laboratory Data Quality and the Value of a Health Information Exchange to Support Public Health Reporting Processes (2011-10)
Dixon, Brian E.; McGowan, Julie J; Grannis, Shaun J
There is increasing interest in leveraging electronic health data across disparate sources for a variety of uses. A fallacy often held by data consumers is that clinical data quality is homogeneous across sources. We examined one attribute of data quality, completeness, in the context of electronic laboratory reporting of notifiable disease information. We evaluated 7.5 million laboratory reports from clinical information systems for their completeness with respect to the data needed for public health reporting processes. We also examined the impact of health information exchange (HIE) enhancement methods that attempt to improve completeness. The laboratory data were heterogeneous in their completeness. Fields identifying the patient and test results were usually complete. Fields containing patient demographics, patient contact information, and provider contact information were suboptimal. Data processed by the HIE were often more complete, suggesting that HIEs can support improvements to existing public health reporting processes.

Estimating Increased Electronic Laboratory Reporting Volumes for Meaningful Use: Implications for the Public Health Workforce (2014-02)
Dixon, Brian E.; Gibson, P Joseph; Grannis, Shaun J
Objective: To provide formulas for estimating notifiable disease reporting volume from 'meaningful use' electronic laboratory reporting (ELR). Methods: We analyzed two years of comprehensive ELR reporting data from 15 metropolitan hospitals and laboratories. Report volumes were divided by population counts to derive generalizable estimators. Results: The observed volume of notifiable disease reports in a metropolitan area was more than twice national averages. ELR volumes varied by institution type, bed count, and the level of effort required of health department staff. Conclusions: Health departments may experience a significant increase in notifiable disease reporting following efforts to fulfill meaningful use requirements, resulting in increases in workload that may further strain public health resources. Volume estimators provide a method for predicting ELR transaction volumes, which may support administrative planning in health departments.

Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol (2013-10)
Dixon, Brian E.; Grannis, Shaun J; Revere, Debra
Background: Health information exchange (HIE) is the electronic sharing of data and information between clinical care and public health entities. Previous research has shown that using HIE to electronically report laboratory results to public health can improve surveillance practice, yet there has been little utilization of HIE for improving provider-based disease reporting. This article describes a study protocol that uses mixed methods to evaluate an intervention to electronically pre-populate provider-based notifiable disease case reporting forms with clinical, laboratory, and patient data available through an operational HIE. The evaluation seeks to: (1) identify barriers and facilitators to implementation, adoption, and utilization of the intervention; (2) measure impacts on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE.
Methods/Design: The intervention will be implemented over a staggered schedule in one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care. Evaluation will be conducted using a concurrent-design mixed-methods framework in which qualitative methods are embedded within the quantitative methods. Quantitative data will include reporting rates, timeliness, and burden, as well as report completeness and accuracy, analyzed using interrupted time-series and other pre-post comparisons. Qualitative data regarding pre-post provider perceptions of report completeness, accuracy, and timeliness, reporting burden, data quality, benefits, utility, adoption, utilization, and impact on reporting workflow will be collected using semi-structured interviews and open-ended survey items. Data will be triangulated to find convergence or agreement by cross-validating results to produce a contextualized portrayal of the facilitators and barriers to implementation and use of the intervention.
Discussion: By applying mixed research methods and measuring context, facilitators and barriers, and the individual, organizational, and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness, and captures a greater portion of communicable disease burden in the community.

State and Local Health Agency Engagement in HIE: A Cross-Sectional Survey (2012)
Dixon, Brian E.; Gamache, Roland E; Grannis, Shaun J

Towards Estimation of Electronic Laboratory Reporting Volumes in a Meaningful Use World (2012)
Dixon, Brian E.; Gamache, Roland E; Grannis, Shaun J

Using Information Entropy to Monitor Chief Complaint Characteristics and Quality (2013)
Grannis, Shaun J; Dixon, Brian E.; Xia, Yuni; Wu, Jianmin
As we enter the 'big medical data' era, a new core competency is to continuously monitor the quality of data collected from electronic sources, including population surveillance data sources. We describe how entropy, a fundamental information measure, can help monitor the characteristics of chief complaints in an operational surveillance system.

Variation in Information Needs and Quality: Implications for Public Health Surveillance and Biomedical Informatics (2013-11)
Dixon, Brian E.; Lai, Patrick T; Grannis, Shaun J
Understanding variation among users' information needs and the quality of information in an electronic system is important for informaticians to ensure data are fit-for-use in answering important questions in clinical and public health. To measure variation in satisfaction with currently reported data, as well as perceived importance and need with respect to completeness and timeliness, we surveyed epidemiologists and other public health professionals across multiple jurisdictions. We observed consensus for some data elements, such as county of residence, which respondents perceived as important and felt should always be reported. However, information needs differed for many data elements, especially when comparing notifiable diseases such as chlamydia with seasonal (influenza) and chronic (diabetes) diseases. Given the trend toward greater volume and variety of data as inputs to surveillance systems, variation in information needs affects system design and practice. Systems must be flexible and highly configurable to accommodate this variation, and informaticians must measure and improve systems and business processes to account for variation in both users and information.

A Vision for the Systematic Monitoring and Improvement of the Quality of Electronic Health Data (2013)
Dixon, Brian E.; Rosenman, Marc; Xia, Yuni; Grannis, Shaun J
In parallel with the implementation of information and communications systems, health care organizations are beginning to amass large-scale repositories of clinical and administrative data. Many nations seek to leverage so-called Big Data repositories to support improvements in health outcomes, drug safety, health surveillance, and care delivery processes. An unsupported assumption is that electronic health care data are of sufficient quality to enable the varied use cases envisioned by health ministries. The reality is that many electronic health data sources are of suboptimal quality and unfit for particular uses. To more systematically define, characterize, and improve electronic health data quality, we propose a novel framework for health data stewardship. The framework is adapted from prior data quality research outside of health, but it has been reshaped to apply a systems approach to data quality with an emphasis on health outcomes. The proposed framework is a beginning, not an end. We invite the biomedical informatics community to use and adapt the framework to improve health data quality and outcomes for populations in nations around the world.