Who said it? Establishing professional attribution among authors of Veterans' Electronic Health Records

AMIA Annu Symp Proc. 2012;2012:753-62. Epub 2012 Nov 3.

Abstract

Background: A practical data point for assessing information quality and value in the Electronic Health Record (EHR) is the professional category of the EHR author. We evaluated and compared free-form electronic signatures against LOINC note titles for categorizing the profession of EHR authors.

Methods: A random sample of 1,000 clinical documents was selected and divided into two 500-document sets for training and testing. The gold standard for provider classification was generated by dual clinician manual review, with disagreements resolved by a third reviewer. Text-matching algorithms using document titles and author electronic signatures for provider classification were developed on the training set.
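The abstract does not specify the matching rules, but a tiered text-matching classifier of the kind described (note title first, electronic signature as a fallback) might be sketched as follows. All keyword lists, category names, and the fallback ordering here are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of tiered text matching for provider classification.
# Keyword lists and categories are assumptions for illustration only.

TITLE_KEYWORDS = {
    "physician": ("physician", "attending", "resident"),
    "nurse": ("nursing", "rn note"),
    "pharmacist": ("pharmacy",),
}

SIGNATURE_KEYWORDS = {
    "physician": ("m.d.", "d.o."),
    "nurse": ("r.n.", "lpn"),
    "pharmacist": ("pharmd", "r.ph."),
}

def classify(title: str, signature: str) -> str:
    """Return a professional category from the note title if possible,
    otherwise fall back to the author's electronic signature."""
    t = title.lower()
    for category, keys in TITLE_KEYWORDS.items():
        if any(k in t for k in keys):
            return category
    s = signature.lower()
    for category, keys in SIGNATURE_KEYWORDS.items():
        if any(k in s for k in keys):
            return category
    return "unknown"
```

Naive substring matching like this is deliberately simple; a real implementation would need to handle abbreviation ambiguity and tokenization, which the study's evaluation against a clinician-reviewed gold standard would expose.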

Results: Overall, detection of professional classification by note titles alone resulted in 76.1% sensitivity and 69.4% specificity. The aggregate of note titles with electronic signatures resulted in 95.7% sensitivity and 98.5% specificity.

Conclusions: Note titles alone provided only fair professional classification. Including author electronic signatures substantially improved classification performance.

Keywords: EHR meta-data; Health-care profession; LOINC title; document quality; electronic signature.

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Algorithms*
  • Authorship*
  • Electronic Health Records*
  • Humans
  • Information Systems
  • Logical Observation Identifiers Names and Codes*
  • United States
  • United States Department of Veterans Affairs / organization & administration
  • Veterans