J Am Med Inform Assoc 2011;18:827–834. doi:10.1136/amiajnl-2010-000076
  • Research and applications

The role of information technology in translating educational interventions into practice: an analysis using the PRECEDE/PROCEED model

  1. Mark A Supiano1,3

Author affiliations:

  1. Geriatric Research, Education, and Clinical Center (GRECC), George E Wahlen Department of Veterans Affairs Medical Center, Salt Lake City, Utah, USA
  2. Department of Biomedical Informatics, University of Utah School of Medicine, Salt Lake City, Utah, USA
  3. Department of Internal Medicine, Division of Geriatrics, University of Utah School of Medicine, Salt Lake City, Utah, USA
  4. Intermountain Healthcare, Salt Lake City, Utah, USA

Correspondence to Dr Charlene Weir, Geriatric Research, Education, and Clinical Center - GRECC (182), George E Wahlen Department of Veterans Affairs Medical Center, 500 Foothill Dr, Salt Lake City, UT 84148, USA; charlene.weir@hsc.utah.edu
  • Received 21 July 2010
  • Accepted 19 March 2011
  • Published Online First 12 May 2011

Abstract

Objective The evidence base for information technology (IT) has been criticized, especially with the current emphasis on translational science. The purpose of this paper is to present an analysis of the role of IT in the implementation of a geriatric education and quality improvement (QI) intervention.

Design A mixed-method three-group comparative design was used. The PRECEDE/PROCEED implementation model was used to qualitatively identify key factors in the implementation process. These results were further explored in a quantitative analysis.

Method Thirty-three primary care clinics at three institutions (Intermountain Healthcare, VA Salt Lake City Health Care System, and University of Utah) participated. The program consisted of an onsite didactic session, QI planning, and 6 months of intensive implementation support.

Results The completion rate was 82%, with an average improvement rate of 21%. Important predisposing factors for success included an established electronic record and a culture of quality. Reinforcing and enabling factors included free continuing medical education credits, feedback, IT access, and flexible support. The relationship between IT and QI emerged as a central factor. Quantitative analysis found significant differences between institutions for pre–post changes even after the number and category of implementation strategies had been controlled for.

Conclusions The analysis illustrates the complex dependence between IT interventions, institutional characteristics, and implementation practices. Access to IT tools and data by individual clinicians may be a key factor for the success of QI projects. Institutions vary widely in the degree of access to IT tools and support. This article suggests that more attention be paid to the QI and IT department relationship.

Introduction

Implementing change in practice patterns for the care of older adults is a translational challenge. Recent reviews on the quality of outpatient care for older adults have noted significant deficits, including low rates of general preventive care,1 improper prescribing,2 and fragmentation of care.3 Improving geriatric care in primary care settings is essential given that the majority of older adults receive their care from primary care physicians and that, by 2030, the number of individuals over 65 will have more than doubled.4

Systematic reviews of education programs have found significant variation in outcomes, with some studies finding positive results5 and others failing to find an effect.6 7 Successful programs appear to be multi-faceted, use information technology (IT) resources, provide feedback, are individually tailored, and offer external rewards.8–12 Programs using educational outreach are particularly successful in modifying clinicians' behavior,13–17 especially when tailored to providers' stage of change and/or individual characteristics.18 This inconsistency suggests a knowledge gap regarding what constitutes successful program components and implementation processes. In particular, the role of IT in the implementation of an educational program is often not fully examined, as most educational intervention studies focus on content. Examining the role of IT in the overall implementation process is a key contribution to the literature in both geriatric education and medical informatics.19 In other words, evidence-based knowledge regarding the science of implementation is an essential component of the translation process.20–28 Most importantly, such an analysis contributes to the growing debate on the mechanisms of success for informatics interventions.23 29–32

The purpose of this study is to conduct a formative evaluation of the implementation process for a community-based education program entitled ‘Advancing Geriatric Education through Quality Improvement’ (AGE QI). We used both qualitative and quantitative methods, beginning with a qualitative exploration of constructs from the PRECEDE/PROCEED implementation model and following with a quantitative exploration of the hypotheses that emerged from the initial qualitative analysis.33 The issues of IT, electronic records, and QI emerged from the qualitative work and became the focus of the quantitative hypotheses. The outcomes of the educational interventions were the compliance rates associated with the chosen QI projects. Additional measures were created as proxies for the intensity of the intervention and were treated as moderators.

The PRECEDE/PROCEED model

The PRECEDE (predisposing, reinforcing, and enabling constructs in educational diagnosis and evaluation) model was developed to support the design of effective healthcare interventions, as well as to provide a model to evaluate program outcomes.33 It is a motivational model that includes all levels of change recommended by Shortell34 and has been used extensively across a variety of domains, including healthcare and education.35–37 The model as adapted for our study is presented in figure 1.

Figure 1

Advancing Geriatric Education through Quality Improvement (AGE QI) PRECEDE/PROCEED model, adapted from Green and Kreuter.33 CD, compact disks; CME, continuing medical education; EMR, electronic medical record; IT, information technology; QI, quality improvement.

According to the model, change occurs as a result of the interaction between three categories of variables: ‘predisposing factors’, which lay the foundation for success (eg, electronic medical records or strong leadership); ‘reinforcing factors’, which follow and strengthen behavior (eg, incentives and feedback); and ‘enabling factors’, which activate and support the change process (eg, support, training, computerized reminders and templates, or exciting content).
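To make these categories concrete, the sketch below shows one way qualitative excerpts could be tagged against the three constructs, in the spirit of the consensus mapping described later in the Methods. It is a minimal, illustrative sketch only: the category names come from the model, but the example excerpts and all code names are hypothetical, not study data.

```python
from collections import Counter
from enum import Enum

class Factor(Enum):
    """The three PRECEDE factor categories."""
    PREDISPOSING = "lays the foundation for success"
    REINFORCING = "follows and strengthens behavior"
    ENABLING = "activates and supports the change process"

# Hypothetical interview excerpts tagged by consensus (not study data)
coded_excerpts = [
    ("The EMR was already in daily use before the project began", Factor.PREDISPOSING),
    ("Monthly feedback charts kept the staff engaged", Factor.REINFORCING),
    ("The note template was rebuilt for our clinic within a week", Factor.ENABLING),
]

# Tally how many excerpts map to each construct
counts = Counter(factor for _, factor in coded_excerpts)
for factor in Factor:
    print(f"{factor.name}: {counts[factor]} excerpt(s)")
```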

Methods

The University of Utah Institutional Review Board and the research oversight committees of all three institutions approved the AGE QI educational intervention. The methods section is divided into qualitative and quantitative components.

Description of the intervention

The AGE QI educational program includes five components: (1) a 35-page previsit syllabus and CD with geriatric didactic information; (2) three outreach visits, comprising an initial 1–2 h lecture on geriatric assessment essentials, a second onsite visit 1 month later focusing on QI project planning, and a final review visit; (3) customized design and implementation of computerized tools for alerting, documentation, tracking, and decision support; (4) intensive ongoing weekly support, including phone calls and e-mails, creation of clinic-specific educational materials, work-process re-engineering, and monthly performance feedback; and (5) 20 free continuing medical education (CME) credits. Table 1 provides the details.

Table 1

Topics and implementation activities by clinic and institution

Data sources and analysis

The study used three distinct data sources. The first was a detailed logbook or ‘diary’ kept by the geriatric nurse educator, which recorded specific dates and content, the mode of each interaction, and the daily thoughts and impressions of the implementation team; from it we used reports of substantive interactions with clinics, including phone calls, extra visits, and emails. The second was the formative interviews conducted by the evaluation team, which were tape recorded and analyzed thematically based on the PRECEDE/PROCEED constructs. The third was compliance reports collected either from the information system in each setting or by staff during the process of care.

Settings

Thirty-three clinics across three institutional settings were approached to participate in the educational intervention: (1) 10 University of Utah Health Care (UHC) Community Clinics; (2) 19 Intermountain Healthcare Medical Group clinics (IHC); and (3) four VA Salt Lake City Health Care System community-based outpatient clinics (VA). See table 1 for details. All sites used an electronic health record (EHR).

EHR usage

The VA has had mandated provider order entry with electronic text entry for over 12 years, using the Computerized Patient Record System (CPRS). The software comes with customizable fields and an alerting program that can use any orderable item to design a clinical reminder. All orders are entered electronically. UHC uses Epic® and includes all orders and electronic notes; usage is generally mandated. IHC was transitioning from HELP2 to a GE product, and usage was not mandated until 2010. About 70% of drugs were ordered electronically and about 90–99% of notes were entered electronically during the study period. Laboratory results, procedures, and consultations are not included in the ordering system (Len Bowes, MD, personal communication). Usage of specific tools varied across clinics.

Description of participants

Of the 33 clinics that agreed to participate in session I, six chose not to complete the full geriatric QI project. Reasons included: (1) lack of provider participation; (2) lack of interest in a geriatric QI project; and (3) a small geriatric population at the clinic.

Qualitative methods

The analysis was conducted iteratively, using the PRECEDE/PROCEED theory to guide inspection and discussion rather than a grounded theory approach, because the overarching purpose was to build on existing theoretical implementation constructs. The analysis was conducted by the research team and consisted of iterative discussions over a 6-month period. Data were integrated from the formative interviews, weekly meeting reports, and logbooks. Mapping the data onto the constructs of the PRECEDE/PROCEED framework was performed through discussion and consensus. Member checking was carried out with some of the clinic participants.

Quantitative methods

The quantitative analysis explored a key finding from the qualitative work: the relationship between the IT and QI departments appeared to differ between institutions and may have contributed to outcomes through the degree of support provided by the research team. Outcomes included pre–post changes, final compliance levels, duration of program implementation, and the CME/provider ratio. Analysis of covariance was used to separate the effects of supportive activities from differences between institutions.

Results

The results are divided into two sections. The first is the qualitative analysis of the implementation process using the PRECEDE/PROCEED model. The second reports on a quantitative analysis addressing the emergent hypothesis regarding the relationship between organizational structure, implementation processes, and outcomes.

Qualitative results of the PRECEDE/PROCEED model

The following analysis presents process evaluation results organized around the concepts identified in the PRECEDE/PROCEED model.

Predisposing or antecedent factors

Three categories of predisposing factors were identified: (1) IT department function; (2) institutional experience with QI; (3) characteristics of the implementation team.

IT department structure and function

The existence of an effective EHR in each setting was a key factor. All three institutions have some form of EHR. However, the degree of adoption varied substantially across institutions (table 2).

Table 2

Description of clinical settings

The ease with which IT resources could be accessed also emerged as central to the process. Data input screens had to be customized for most QI projects, and data retrieval mechanisms needed to be set up in order to provide monthly feedback to clinics. As a result, identifying the mechanism for ‘activating’ IT department support took significant time for the implementation team. At the VA, any clinician could initiate a direct request to the clinical application coordinator by email or phone. Data retrieval was directly under the control of clinic administrators through the setup of ‘clinical reminder’ reports, which did not require ongoing intervention by a data analyst.

IHC also had significant experience integrating IT and QI. However, creating and changing clinical reminders, alerts, and templates required high levels of approval and was not perceived to be achievable within the 6–9 month timeframe. As a result, few changes were made to assist the QI projects directly. Nevertheless, the system was sufficiently robust to support many (but not all) of the identified QI activities to some degree. Data extraction was difficult, requiring multiple interactions at the institutional level.

The university system required relatively high-level approval from several different administrative committees. The institution decided to support a single QI activity—fall screening and prevention—for all 10 clinics. The time to obtain approvals and build new fields for falls in Epic® was about 1.5 years. Screens and clinical reminders were built accordingly. Data retrieval requests at the institutional level required 3–6 months. However, mechanisms were in place for local clinics to retrieve some compliance data at the clinic level.

QI experience

All three institutions had substantial experience with QI activities, although there was substantial variation between them. IHC has a history of an exceptionally strong program and has embedded QI principles into everyday care for the last 12 years.38 It also gives clinicians incentives to conduct QI activities. The VA has implemented systematic performance measurement reviews covering most clinical care areas and has invested in significant QI training for clinicians, but clinicians are not, in general, given incentives to conduct QI activities. UHC has a well-established QI office whose main focus is meeting institutional requirements; it does not mandate QI training or provide individual incentives for QI activities.

Implementation team

The third predisposing factor was the composition of the study team and its relationship with the three institutions. At least one member of the research team worked at each institution. These personal connections provided ‘insider’ knowledge of how each system worked, where resources were located, and which individuals would make good change agents.

Reinforcing factors (incentives and disincentives)

Reinforcing factors serve to improve motivation, sustain interest, and focus attention, both individually and as a group. We identified three reinforcing factors: (1) CME credit; (2) anonymous feedback; (3) alignment with institutional goals.

CME program

Free CME credits were given for completion of the full program, with a possible total of 20 h per provider. This study was one of the first in the nation to use AMA PRA Category 1 credits for performance improvement in the outpatient setting. The program recently won the 2011 National Award for Outstanding CME Outcomes Assessment from the Alliance for Continuing Medical Education. Overall, 1085 credits have been awarded to 102 providers.

Feedback

The provision of performance feedback is a key evidence-based component of education and clinical guideline implementation. The implementation team attempted to provide performance feedback monthly. This feedback consisted of compliance with the chosen QI measure (eg, number screened/number eligible per month) and was presented as a control chart.
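The paper does not state which type of control chart was used; for a monthly compliance proportion, a p-chart is the conventional choice. The sketch below, using invented counts for a single hypothetical clinic, shows how such a chart's center line and 3-sigma limits could be computed.

```python
import math

# Hypothetical monthly counts for one clinic: (number screened, number eligible)
monthly = [(42, 80), (55, 78), (61, 85), (64, 82), (70, 84), (71, 81)]

# Center line: pooled compliance proportion across all months
p_bar = sum(s for s, _ in monthly) / sum(n for _, n in monthly)

for month, (screened, eligible) in enumerate(monthly, start=1):
    p = screened / eligible
    sigma = math.sqrt(p_bar * (1 - p_bar) / eligible)  # binomial SD at p_bar
    ucl = min(1.0, p_bar + 3 * sigma)                  # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)                  # lower control limit
    flag = " <- outside limits" if not (lcl <= p <= ucl) else ""
    print(f"month {month}: p={p:.2f} (LCL={lcl:.2f}, UCL={ucl:.2f}){flag}")
```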

Alignment with ongoing institutional goals

Alignment with larger institutional goals appeared to be particularly important, especially for prioritizing IT resources and time. Projects outside the institution's priorities suffered significant delays and required more effort. For example, the VA clinic that chose driving assessments for older patients required substantial new dialog language in the reminders, which were used only in that clinic. In contrast, because UHC chose fall prevention for all of its clinics in order to meet the Joint Commission's National Patient Safety Goal No 9 (reducing the risk of harm resulting from falls), the IT department and the QI department were able to work synergistically (the institutional solution improved access to IT resources). At IHC, marketing the program as a method to fulfill QI project mandates was useful for creating interest and was also aligned with IHC's overarching quality goals.

Enabling or barrier factors

Enabling factors energize and stabilize the intervention. We identified four key enabling factors: (1) flexibility; (2) EHR supportive activities; (3) implementation of supportive activities; and (4) redundant content.

Flexibility

We identified ‘flexibility’ as a key enabling factor; it included the willingness to adjust the timing, content, and resources of the program as needed at the clinic level. Session presentations were given anytime between 07:00 and 19:00. This flexibility, however, put extra demands on the IT department to vary content, timing, and level of support.

EHR supportive activities

Across all three institutions, the research staff assisted at all levels of EHR usage and design, but the type of support varied significantly across institutions. At the VA, data entry variables and note templates were created by the IT liaisons, taking a few weeks and not requiring higher-level approval. UHC required over a year of meetings to acquire the necessary institutional approvals and several months to develop and test new input templates. No attempts were made to change the input screens at IHC, but much more effort was put into training clinic staff to use the EHR more intensively.

Implementation of supportive activities

The research staff participated in work-process re-engineering, one-on-one training, and general consultation for all clinics. Most of the clinics required extra staff training and weekly phone calls until the QI project was more fully implemented. The need for idiosyncratic, clinic-specific, and provider-specific tools and support became increasingly obvious as we completed implementation at all 33 clinics.

Redundant content

Methods of content delivery included lectures incorporating an audience response system, written materials (PowerPoint slides and pocket cards), posters, handouts, computerized alerts, and ‘academic detailing’. Although there was significant emphasis on using computerized technology, computer tools were not sufficient to serve as reminders because substantial patient interaction occurred away from a computer. In other words, computerized interventions had to be supplemented by non-computerized components in order to deliver an effective program.

Summary

The qualitative analysis above identified key IT factors important for success that varied across the three organizations. The implementation team responded to these variations by adapting their strategies as needed (see the section on flexibility). Because of the variation in strategies across institutions, it is difficult to tease out the independent influence of organizational structure and implementation activities. The following analysis is an attempt to explore these questions quantitatively.

Quantitative analysis of clinic implementation metrics and QI outcomes

Analysis was performed at the clinic level only and comprised four components: (1) descriptive results; (2) differences in outcomes across institutions (pre–post changes, final compliance levels, duration of program, and the CME/provider ratio); (3) differences between institutions in the intensity of supportive activities; and (4) differences between institutions in outcomes after supportive activities had been controlled for, in order to tease out the independent effects of organizations. Table 3 presents the data aggregated by institution.

Table 3

Mean differences in study variables across institutions

Descriptive

Overall, 82% of the clinics completed the 6-month QI project (100% of UHC, 74% of IHC, and 75% of VA clinics), for a total of 128 providers and 252 staff. Of the programs that completed the project, 75% improved outcomes to some degree. The average percentage improvement (post–pre difference) ranged from −0.09% to 100%, with a mean of 21%. The average number of CME credits per provider (a variable indicating provider involvement) was 12.59, and the average time to implement (preparation time plus 6 months of QI implementation) was 14 months. A total of 131 visits were made, an average of four per clinic (range one to nine). In addition, over 134 additional contacts were made by phone or email, an average of 4.06 per clinic. Specific clinic-level implementation activities and outcomes are presented in table 3.
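As a minimal sketch of how such clinic-level aggregates could be computed, the following uses a handful of invented rows, with column names that are ours rather than the study's:

```python
import pandas as pd

# Hypothetical clinic-level rows; all values are invented for illustration
df = pd.DataFrame({
    "institution": ["UHC", "UHC", "IHC", "IHC", "VA", "VA"],
    "completed":   [True,  True,  True,  False, True,  True],
    "pre":         [0.20,  0.35,  0.40,  None,  0.10,  0.25],
    "post":        [0.45,  0.50,  0.55,  None,  0.80,  0.85],
})

df["improvement"] = df["post"] - df["pre"]  # post–pre change in compliance

summary = df.groupby("institution").agg(
    completion_rate=("completed", "mean"),     # fraction of clinics completing
    mean_improvement=("improvement", "mean"),  # NaNs (non-completers) are skipped
)
print(summary)
```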

Differences in outcomes across institutions

Pre–post changes

Pre–post changes in compliance with the QI outcomes differed significantly across institutions (F2,24=4.33; p=0.025). The VA had the largest increase at 66%, with IHC and UHC at 27% and 15%, respectively. Tests for heterogeneity of variance were non-significant. Table 3 shows the means.
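An analysis of this shape can be reproduced with standard tools. The sketch below uses invented clinic-level change scores (the study's raw data are not reproduced here) and adds Levene's test as one common check for heterogeneity of variance:

```python
from scipy import stats

# Hypothetical pre–post compliance changes by institution (fractions, invented)
va  = [0.55, 0.70, 0.73]
ihc = [0.10, 0.25, 0.30, 0.35, 0.40, 0.22]
uhc = [0.05, 0.12, 0.18, 0.20, 0.15, 0.10, 0.25]

f, p = stats.f_oneway(va, ihc, uhc)    # one-way ANOVA across the three groups
w, p_lev = stats.levene(va, ihc, uhc)  # heterogeneity-of-variance check
print(f"ANOVA: F={f:.2f}, p={p:.3f}; Levene: W={w:.2f}, p={p_lev:.3f}")
```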

Duration of implementation

Duration of implementation also differed significantly across institutions (F2,24=30.04; p<0.001), with the VA and IHC averaging 11 months and UHC 20 months. Tests for heterogeneity of variance were non-significant. Although we have only estimates of the cause of the delay, obtaining approval for EHR customization (templates, reminders, and fields) appeared to be the largest single reason, followed by the need for extra training to use the computer. In both cases, the VA had the shortest time, with no real need for institutional approvals and no required training. IHC did not carry out any clinic-level customization (so its time was shorter) but did require some extra training. UHC required a year and a half of meetings to obtain approvals for changes in Epic and to conduct the necessary training.

CME hours per provider

Finally, we found no significant differences across institutions in the number of CME hours earned per provider. This outcome variable was the least sensitive to IT issues, as providers had the most control over it (they could complete as much of the required work as they wanted).

Differences in supportive activities

We also found significant differences between institutions in the amount of supportive activity required. UHC received nearly twice as many extra visits on average, as well as extra patient education materials (the VA and IHC produced more of their own). However, the VA received significantly more contacts than UHC or IHC, mostly because the research team could more directly design the clinical reminders and note templates.

Differences in outcomes controlling for supportive activities

To examine the effect of institutional differences on outcomes independent of supportive activities, we conducted an analysis of covariance, which tests for the impact of a variable after controlling for covariates. We controlled for the number of providers (a surrogate for clinic size) and the overall number of contacts (a surrogate for intensity of support) in models of the two outcome measures (post–pre compliance and implementation time).
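For readers who want to see the shape of such a model, here is a minimal ANCOVA sketch in the statsmodels formula interface, assuming a table with one row per clinic; the data and column names are invented for illustration and do not reproduce the study's analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical clinic-level data (one row per clinic); values are invented
df = pd.DataFrame({
    "institution": ["VA"] * 3 + ["IHC"] * 6 + ["UHC"] * 6,
    "post_pre":    [0.55, 0.70, 0.73, 0.10, 0.25, 0.30, 0.35, 0.40, 0.22,
                    0.05, 0.12, 0.18, 0.20, 0.15, 0.10],
    "contacts":    [9, 7, 8, 3, 4, 2, 5, 3, 4, 6, 5, 7, 4, 6, 5],
    "providers":   [4, 6, 5, 3, 8, 6, 5, 4, 7, 5, 6, 4, 8, 7, 6],
})

# ANCOVA: institution effect on the post–pre change, with covariates held fixed
model = smf.ols("post_pre ~ C(institution) + contacts + providers", data=df).fit()
print(anova_lm(model, typ=2))  # type II sums of squares, one F test per term
```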

Significant differences between institutions remained for post–pre changes after the number of visits and the number of providers had been controlled for (F3,23=4.02; p=0.05). An even stronger effect was noted for implementation time, with the difference between systems being highly significant (F3,23=26.44; p<0.001). In both analyses, the numbers of contacts and providers were non-significant.

Duration of implementation

The significant differences between institutions in the average duration of completion remained after the number of visits had been controlled for (F3,23=11.56; p<0.001), although, as in the analyses above, the number of visits became non-significant in the overall model. Similarly, the comparison between institutions for the duration of implementation after the number of contacts had been controlled for was also significant (F3,23=28.28; p<0.001), with the number of contacts becoming non-significant.

Summary

In summary, the exploratory quantitative analysis presented above indicates that institutional characteristics had a strong relationship with outcomes. Because these differences persisted after supportive activities had been controlled for, we have tentative support for our general hypothesis that variation in institutional QI–IT relationships matters, disentangled from the research team's implementation process.

Discussion

This theory-based process analysis identified predisposing, reinforcing and enabling factors that are important to the success of a multi-component educational intervention. The IT role was integral in all three categories. Two general IT themes emerged from the qualitative analysis and were explored in the quantitative section. First, the use of IT in clinical change processes is both idiosyncratic and ubiquitous, across all organizational levels. Each clinic required a unique combination of support, re-engineering, education, and customization activities. Second, the qualitative results highlighted how each organization has a unique pattern of integrating IT with QI and clinical work processes, and this relationship may be a critical implementation factor to success.

The first theme, regarding the idiosyncratic requirements of each clinic and clinician, suggests that it may never be possible to ‘standardize’ or even directly predict the implementation process. Providing individualized IT tools at both the clinic and clinician level may be a critical component of success for any clinical intervention and should be part of any readiness assessment. The importance of systematically ‘diagnosing’ a clinical context has been promulgated by a number of health service researchers in the form of readiness assessments.39–41 This perspective is also congruent with recent work highlighting the impact of micro-systems42 and the complexity of workflow modeling.43 Future evolutions of EHRs and IT should make tools available to support clinicians' information management goals and educational learning needs.44

The finding that each organization has a unique pattern of integrating IT with QI and clinical work processes suggests that this relationship may provide a causal mechanism for implementation success that is often not considered.34 45–51 Easy access to IT departmental resources can significantly affect how well quality and educational programs are developed and implemented, a finding noted by others.52 We identified the concept of IT department ‘activation’, which refers to both the ease of access and the degree of customization available at the clinic level. Mechanisms for accessing IT resources and the IT/QI department relationship have not been explicitly explored and could be an area for future work. One way to identify mechanisms is to hypothesize them directly. In their model of Realistic Evaluation, Pawson and Tilley suggest that it is important to identify context–mechanism–outcome configurations.53 From this perspective, the IT/QI relationship may support implementation success through several causal processes. First, it may lead to greater congruence or alignment across the institution, from leadership to staff. Second, it may bring the deep history of change processes into the hands of IT support staff. Third, it may simply bring resources that are hard to come by in strapped institutions (eg, spreading the pain).

These findings do not diminish the role of understanding patterns of adoption at the individual level. Models such as the Technology Acceptance Model54 55 were developed from older value-expectancy motivational models, such as the Theory of Planned Behavior56 and the Theory of Reasoned Action,57 in which intentions are predicted from attitudes, social norms, values, and outcome expectancies. The Technology Acceptance Model specifically focuses on the usefulness and usability of the technology. Our intervention was not particularly novel and, in fact, the implementation team attempted to minimize any significant change in clinicians' electronic environment. Hence we would expect the variables measured in the Technology Acceptance Model to be less useful in predicting behavior for this intervention.

Scientific importance

Several authors have called for greater clarification of the phenomenon of information technology as a causal variable and for a more theoretical approach to design and implementation.58–61 For example, authors from the RAND Corporation recently published a meta-analysis of the costs and benefits of health information technology,61 noting: ‘In summary, we identified no study or collection of studies, outside of those from a handful of HIT leaders, that would allow a reader to make a determination about the generalizable knowledge of the system's reported benefit … This limitation in generalizable knowledge is not simply a matter of study design and internal validity.’ (p 4). This work is a call to increase the theoretical grounding of informatics. The strength of the present study is its integration of quantitative and qualitative findings. By extending the qualitative findings into exploratory quantitative analysis, the groundwork is laid for better, more theoretically based hypothesis testing in this field.

The idiosyncratic requirements of each clinic meant that the supportive activities were applied differently in each setting, making it impossible to separate the intensity of support from need. This dilemma is at the core of the tension between program evaluation and the need for scientific generalizability. It is also the point made in a recent systematic review of the impact of context on the success of QI activities.62 Although the results may not be entirely surprising, this study illustrates the great need to develop theoretical hypotheses about mechanisms of action before engaging in a study and to follow up with those measures as part of a systematic analysis. Most current theories of implementation are too broad to lead to testable questions. Identifying these questions more specifically as they arise from formative analyses and planning can enhance generalizability and the building of a science of informatics. These questions can include a focus on the conditions that might need to be triggered to create the needed mechanisms, the possible interaction of different factors that might be causally related to outcomes, and an examination across all levels, from individual providers to the organization.

The results of this study also provide practical advice to future designers of clinical interventions regarding the combination of institutional and implementation approaches that should be addressed.63 Recent reviews have noted the importance of an integrated educational model, and this study provides a model of the implementation process that integrates multiple levels of change.64 65

Finally, the new emphasis on comparative effectiveness research requires a deeper understanding of the underlying causal mechanisms of IT interventions.66 This understanding requires a stronger emphasis on theoretical perspectives if a generalizable science is to be developed.67

Limitations

This study has several limitations. Only three organizations participated, and not all clinics within each institution were approached. Those who participated might differ on some unmeasured dimension from those who did not. The quantitative analysis is essentially exploratory and should not be overinterpreted, as random assignment was not performed and sample size varied across groups, in one case being quite small. Because we did not quantify the institutional variables identified in the qualitative portion of our study (IT access and the IT/QI relationship), we cannot test their impact directly; as a result, our conclusions are only suggestive. The investigator team served as the implementation facilitators, and their role is inextricably intertwined with the outcomes. This paper was an attempt to measure those relationships directly. Finally, the three institutions are all in the same part of the country, resulting in unknown cultural and population bias.

Conclusions

This study highlights the complex role of IT in the design and implementation of educational and QI projects. Access to IT and the degree of integration between the institutional departments of IT and quality emerged as particularly salient in this study and should be investigated more systematically in future studies. Enhancing tools to bring control of IT interventions to the clinic and the individual clinician may be a necessary component of future success.

Footnotes

  • Funding Funded by a grant from the Donald W Reynolds Foundation to MS. Supported by the University of Utah Center for Clinical and Translational Science NIH grant #1U54RR023426-01A1.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.

References
