J Am Med Inform Assoc doi:10.1136/amiajnl-2011-000784
  • Review

Evaluating the state of the art in coreference resolution for electronic medical records

  Brett R South (3,4,5)
  1. Department of Information Studies, University at Albany, SUNY, Albany, New York, USA
  2. Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, Massachusetts, USA
  3. VA Salt Lake City Health Care System, Salt Lake City, Utah, USA
  4. Department of Internal Medicine, University of Utah, Salt Lake City, Utah, USA
  5. Department of Biomedical Informatics, University of Utah, Salt Lake City, Utah, USA
  6. Cincinnati Children's Hospital, University of Cincinnati Computational Medicine Center, Cincinnati, Ohio, USA
  • Correspondence to Dr Ozlem Uzuner, Department of Information Studies, College of Computing and Information, University at Albany, SUNY, Draper 114A, 135 Western Avenue, Albany, NY 12222, USA; ouzuner{at}
  • Contributors OU is the primary author and was instrumental in all aspects of the preparation and organization of the coreference resolution challenge, from data to workshop. AB helped organize the coreference challenge, led the data analysis, and co-wrote and edited the manuscript. BS was co-lead of the coreference resolution challenge and, along with SS and TF, managed the annotation and preparation of the i2b2/VA corpus for the coreference challenge. JP provided organizational insights and feedback throughout challenge organization.

  • Received 18 December 2011
  • Accepted 23 January 2012
  • Published Online First 24 February 2012


Background The fifth i2b2/VA Workshop on Natural Language Processing Challenges for Clinical Records conducted a systematic review on resolution of noun phrase coreference in medical records. Informatics for Integrating Biology and the Bedside (i2b2) and the Veterans Affairs (VA) Consortium for Healthcare Informatics Research (CHIR) partnered to organize the coreference challenge. They provided the research community with two corpora of medical records for the development and evaluation of coreference resolution systems. These corpora contained various record types (eg, discharge summaries, pathology reports) from multiple institutions.

Methods The coreference challenge provided the community with two annotated ground truth corpora and evaluated systems on coreference resolution in two ways: first, on their ability to identify mentions of concepts and to link together those mentions; second, on their ability to link together ground truth mentions that refer to the same entity. Twenty teams representing 29 organizations and nine countries participated in the coreference challenge.
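The second evaluation setting, linking provided ground truth mentions into chains, is typically scored with standard coreference metrics such as B-cubed (the abstract does not specify the challenge's exact scoring suite, so this is illustrative only). A minimal sketch, with a hypothetical `b_cubed` function over chains represented as sets of mention identifiers:

```python
def b_cubed(gold_chains, pred_chains):
    """Illustrative B-cubed precision/recall/F1 over coreference chains.

    Each chain is a set of mention identifiers; a mention appears in at
    most one chain per side. Mentions a system leaves unlinked are
    treated as singleton chains.
    """
    # Map each mention to the chain (as a frozenset) that contains it.
    gold = {m: frozenset(c) for c in gold_chains for m in c}
    pred = {m: frozenset(c) for c in pred_chains for m in c}
    single = lambda m: frozenset([m])

    # Recall: for each gold mention, fraction of its gold chain that the
    # system's chain for that mention also contains.
    recall = sum(len(gold[m] & pred.get(m, single(m))) / len(gold[m])
                 for m in gold) / len(gold)
    # Precision: for each predicted mention, fraction of its predicted
    # chain that is also in the gold chain for that mention.
    precision = sum(len(pred[m] & gold.get(m, single(m))) / len(pred[m])
                    for m in pred) / len(pred)
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

For example, if the gold chains are `[{'a','b','c'}, {'d'}]` and a system predicts `[{'a','b'}, {'c','d'}]`, the sketch yields precision 0.75 and recall 2/3, reflecting that the system both split a gold chain and merged two of them.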

Results The teams' system submissions showed that machine-learning and rule-based approaches worked best when augmented with external knowledge sources and coreference clues extracted from document structure. The systems performed better in coreference resolution when provided with ground truth mentions. Overall, the systems struggled to resolve coreference in cases that required domain knowledge.


  • Funding The 2011 i2b2/VA challenge and the workshop are funded in part by grant number 2U54LM008748 on Informatics for Integrating Biology and the Bedside from the National Library of Medicine. This challenge and workshop are also supported by resources and facilities of the VA Salt Lake City Health Care System with funding support from the Consortium for Healthcare Informatics Research (CHIR), VA HSR HIR 08-374 and the National Institutes of Health, National Library of Medicine under grant number R13LM010743-01.

  • Competing interests None.

  • Ethics approval This study was conducted with the approval of i2b2 and the VA.

  • Provenance and peer review Not commissioned; externally peer reviewed.
