Testing e-mail content to encourage physicians to access an audit and feedback tool: a factorial randomized experiment

Special Article


G. Vaisson, BSc MSc*, H.O. Witteman, PhD*, S. Chipenda-Dansokho, PhD*, M. Saragosa, RN MN, Z. Bouck, MPH, C.A. Bravo, MSc, L. Desveaux, PhD, D. Llovet, MA PhD, J. Presseau, BA MRes PhD, M. Taljaard, PhD, S. Umar, BA MSc, J.M. Grimshaw, MBChB PhD, J. Tinmouth, MD PhD, N.M. Ivers, MD PhD

doi: http://dx.doi.org/10.3747/co.26.4829



In Ontario, an online audit and feedback tool that provides primary care physicians with detailed information about patients who are overdue for cancer screening is underused. In the present study, we aimed to examine the effect of messages operationalizing 3 behaviour change techniques on access to the audit and feedback tool and on cancer screening rates.


During May–September 2017, a pragmatic 2×2×2 factorial experiment tested 3 behaviour change techniques: anticipated regret, material incentive, and problem-solving. Outcomes were assessed using routinely collected administrative data. A qualitative process evaluation explored how and why the e-mail messages did or did not support Screening Activity Report access.


Of 5449 primary care physicians randomly allocated to 1 of 8 e-mail messages, fewer than half opened the messages and fewer than 1 in 10 clicked through the messages. Messages with problem-solving content were associated with a 12.9% relative reduction in access to the tool (risk ratio: 0.871; 95% confidence interval: 0.791 to 0.958; p = 0.005), but a 0.3% increase in cervical cancer screening (rate ratio: 1.003; 95% confidence interval: 1.001 to 1.006; p = 0.003). If true, that association would represent 7568 more patients being screened. No other significant effects were observed.


For audit and feedback to work, recipients must engage with the data; for e-mail messages to prompt activity, recipients must open and review the message content. This large factorial experiment demonstrated that small changes in the content of such e-mail messages might influence clinical behaviour. Future research should focus on strategies to make cancer screening more user-centred.

KEYWORDS: Cancer screening, primary health care, audit and feedback, e-mail, persuasive communication, behaviour change techniques, factorial experiments, process evaluations


Despite guidelines, many patients do not undergo recommended screening tests such as mammography (breast cancer), Pap tests (cervical cancer), and colonoscopy (colorectal cancer)1,2. For example, in Ontario during 2012–2014, 35% and 37% respectively of the eligible population did not receive recommended mammography or Pap test screening, and 40% were overdue for colorectal cancer screening3. Cancer screening uptake depends on multiple factors at the patient, provider, and organization levels4–7.

Cancer Care Ontario is the Ontario provincial agency responsible for organizing population-wide screening programs for breast, cervical, and colorectal cancers8. In many cases, eligible Ontarians must access those screening programs through a primary care provider. Recommendations, communication, and quality of the discussion between health care providers and their patients about cancer screening options are important determinants of the use of screening services6. Healthy People 2020, launched by the U.S. Department of Health and Human Services, set “increasing the proportion of adults who were counseled about cancer screening consistent with current guidelines” as one of its objectives9. A number of strategies, including electronic audit and feedback, have been attempted to encourage health professionals to change their behaviour so as to align their practice with evidence-based health directives, but so far, no optimal strategy has been identified for changing primary physician behaviour with respect to cancer screening10–13.

Audit and feedback tools provide health professionals with a summary of their practice performance over a given period of time, comparing their performance with the performance of other professionals or with provincial or national standards10. The effects of audit and feedback vary depending on the way in which the intervention is designed and delivered14,15. In fact, audit and feedback interventions can be considered a platform for delivering behaviour change techniques to support the delivery of guideline-concordant care16,17. The Screening Activity Report created and administered by Cancer Care Ontario is an online audit and feedback tool that is updated monthly. Eligible primary care physicians in Ontario can register to receive a Screening Activity Report that lists their patients who are overdue for screening or who have received an abnormal result and require follow-up. Registered physicians (8462 as of 1 May 2017) must work in a patient enrolment model—that is, having a known roster of patients for whom they are paid in part through capitation18. Registered physicians receive monthly e-mail messages from Cancer Care Ontario informing them that their report has been updated. Use of the Screening Activity Report is associated with slightly higher rates of cancer screening19, but only a small proportion of registered family physicians (38% in 2014) access their Screening Activity Report19.

In the present study, we aimed to examine the impact on Screening Activity Report access and cancer screening rates of 3 behaviour change techniques—anticipated regret, material incentive (behaviour), and problem-solving20—within the monthly e-mail messages sent by Cancer Care Ontario to primary care physicians. We hypothesized that the behaviour change techniques would increase Screening Activity Report access and cancer screening rates.


This pragmatic randomized factorial experiment and process evaluation used the Multiphase Optimization Strategy framework to optimize and evaluate multicomponent behavioural interventions21,22. The study relates to the Preparation and Optimization phases of the framework. Details of the methods and intervention development have previously been described23,24. An assessment of this pragmatic experiment scored 5 on a 5-point Likert scale for 7 of the 9 domains on the PRECIS-2 tool25 (supplemental Table 1).

Experimental Design, Setting, and Eligibility Criteria

Participants in this 2×2×2 randomized factorial experiment were primary care physicians (5449 in May 2017) who had agreed to receive routine e-mail messages about the Screening Activity Report. Nominated delegates (that is, people designated to access the Screening Activity Report on behalf of eligible physicians) were excluded because of uncertainties in attributing the intervention to Screening Activity Report access by a delegate (that is, risk of contamination). The Research Ethics Board at Women’s College Hospital approved the study, with a waiver of informed consent for physician participation, given that the study met all Article 3.7A criteria of the Tri-Council Policy Statement26,27. The study is registered at http://ClinicalTrials.gov/ as NCT03124316.


In alignment with the Preparation phase of the Multiphase Optimization Strategy framework, the research team (including specialists in behaviour change theories and 2 senior research experts in qualitative methods) selected and drafted e-mail content based on literature about physician behaviour change, informed by the behaviour change techniques taxonomy (version 1) set out by Michie and colleagues20,23. The first drafts of the e-mail message incorporated 6 behaviour change techniques (“active components of an intervention designed to change behavior”28): anticipated regret, information about others’ approval, material incentive (behaviour), problem-solving, salience of consequences, and credible source.

To refine the intervention content, we then conducted two 2-hour workshops and three 2-hour focus groups with Screening Activity Report adopters and non-adopters, taking a user-centred approach23. User-centred design is a highly iterative development process involving users or potential users early and often during the process to meet audience needs29. It aims to develop useful products or tools that are adapted to people rather than to make people use a pre-specified product or tool that might not suit them29.

During the first workshop, we asked the Screening Activity Report adopters to discuss their current practice for monitoring screening participation and talked about the benefits of the Screening Activity Report. We then asked them to write a letter to their colleagues that would convince those colleagues to use the Screening Activity Report. Finally, we showed them the pre-crafted messages containing the behaviour change techniques to get their initial reactions. Between workshops 1 and 2, we used their content and considered their reactions in developing an e-mail message that would be tested in workshop 2. During workshop 2, we received reactions to 2 e-mail messages with varied content. Between workshop 2 and the focus groups, we refined content. Finally, during the focus groups, we tested 2 e-mail messages with varied content and refined the content between the focus group sessions.

The final e-mail message operationalized 3 behaviour change techniques targeting different behavioural determinants28. Those techniques were anticipated regret (targeting beliefs about consequences, intention, and emotions: “How would you feel if a patient had a poor outcome because you missed an abnormal test result?”), material incentive (behaviour) (targeting reinforcement: “Logging into the [Screening Activity Report] can help you maximize your screening rates and save time when calculating your preventive care bonus.”), and problem-solving (targeting behavioural regulation and environmental context and resources, including “Email ONE ID [College of Physicians and Surgeons of Ontario, Toronto, ON] to register a delegate with eHealth Ontario so they can check your report”)28.

Testing the 3 behaviour change techniques in our factorial design resulted in 8 different versions of the e-mail message according to whether the message contained each of the 3 operationalized techniques [hereinafter called “factors” (supplemental Appendix 1)]: no factors; anticipated regret alone, material incentive (behaviour) alone, or problem-solving alone; anticipated regret plus material incentive (behaviour); anticipated regret plus problem-solving; material incentive (behaviour) plus problem-solving; or all 3 factors.

The e-mail messages were sent monthly by Cancer Care Ontario for 4 months between 10 May and 10 September 2017. Physicians received the same e-mail version every month.

Baseline Characteristics

Physician sex and years of practice [Corporate Provider Database (ICES, Toronto, ON)], history of Screening Activity Report use [previous use or no previous use between October 2014 and April 2017, Reports Accessed by User Report (Cancer Care Ontario, Toronto, ON)], screening rates for breast, cervical, and colon cancer as of March 2017 [Regional Primary Care Provider Report (Health Quality Ontario, Toronto, ON)], and general practice characteristics [practice size, size of group, and rurality, Client Agency Program Enrolment (ICES)] were obtained from routinely collected administrative data.


The primary outcome was Screening Activity Report access, defined as at least 1 log-in event during the 4-month experiment. Secondary outcomes were the number of different days in the 4-month period that participants logged in and the proportion of a physician’s enrolled eligible patients who were up-to-date for breast, cervical, and colorectal cancer screening at the end of the 4-month period. Data were not available for physicians who had fewer than 6 enrolled patients eligible for screening (data censored). We also tracked calls to Cancer Care Ontario and eHealth Ontario about the Screening Activity Report. The “open rate” of the e-mail message and the “click rate” (defined as the percentage of e-mail messages that attracted at least 1 click-through to the Screening Activity Report directly from the link in the message) were assessed to evaluate intervention fidelity.

Data Collection

Routinely collected administrative data (ICES) were used to obtain instances of Screening Activity Report access; up-to-date cancer screening status for breast, cervical, and colorectal cancer screening; baseline physician characteristics; and balance measures. Open and click rates were automatically collected by Mailchimp (https://mailchimp.com/; Rocket Science Group, Atlanta, GA, U.S.A.), the software automation platform used to send e-mail messages.


The sample size was the number of eligible physicians who were registered and appeared on the active user list of the Screening Activity Report at the time of randomization. After removal of duplicate e-mail addresses, the total number of unique eligible physicians was 5449. That sample size provides 94% power to detect an absolute difference of 4% in Screening Activity Report use (between participants with the factor present and those with the factor absent) using a 2-sided test at the 5% level of significance, assuming a control arm proportion of 0.20.
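As a rough check on the stated power, the normal-approximation formula for comparing two independent proportions can be sketched as follows. This is a sketch under stated assumptions (an even split of the 5449 physicians into factor-present and factor-absent groups, and a simple z-test without continuity correction); the study's own calculation may have used different software or corrections.

```python
from math import sqrt, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_proportions(p0, p1, n_per_group, z_alpha=1.959964):
    """Approximate power of a 2-sided z-test comparing two independent
    proportions, using the pooled variance under the null hypothesis."""
    pbar = (p0 + p1) / 2.0
    se_null = sqrt(2.0 * pbar * (1.0 - pbar) / n_per_group)
    se_alt = sqrt(p0 * (1.0 - p0) / n_per_group
                  + p1 * (1.0 - p1) / n_per_group)
    z_beta = (abs(p1 - p0) - z_alpha * se_null) / se_alt
    return norm_cdf(z_beta)

# Control proportion 0.20, absolute difference 0.04, ~2724 physicians
# per factor level (5449 split roughly in half by the factorial design)
power = power_two_proportions(0.20, 0.24, 5449 // 2)
```

With roughly 2724 physicians per factor level, this approximation yields power in the mid-90% range, consistent with the reported 94%.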

Randomization, Allocation Concealment, and Blinding

The allocation schedule was computer-generated by the study biostatistician (ZB) using simple unrestricted randomization. The allocation sequence was applied to a de-identified list of eligible physicians (May 2017) exported from Cancer Care Ontario. Because Cancer Care Ontario could distribute only one type of e-mail message at a time, the order in which the 8 message types were sent was determined randomly every month. The e-mail message versions were sent at 45-minute intervals starting at 09h00 on the 10th day of each month (or if that day fell on a weekend, on the next business day) between 10 May and 10 September 2017.

Randomization was conducted using the SAS software application (version 9.4: SAS Institute, Cary, NC, U.S.A.).
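Simple unrestricted randomization of the kind described can be sketched as follows. The study generated its schedule in SAS; this Python sketch, including the seed value, is purely illustrative. Because allocation is unrestricted, arm sizes are left to chance rather than forced to be equal.

```python
import random

def simple_unrestricted_randomization(n_participants, n_arms=8, seed=2017):
    """Assign each participant independently and uniformly to one of
    n_arms groups (simple randomization: no blocking or restriction,
    so group sizes vary by chance)."""
    rng = random.Random(seed)  # fixed seed for a reproducible schedule
    return [rng.randrange(1, n_arms + 1) for _ in range(n_participants)]

# One allocation schedule for the 5449 eligible physicians
arms = simple_unrestricted_randomization(5449)
```

In practice the schedule would be applied to the de-identified physician list before any e-mail is sent, which is what preserves allocation concealment.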

Statistical Analysis

The primary outcome was analyzed using robust Poisson regression analysis30 rather than the logistic regression analysis originally planned24, yielding estimates as relative risks (RRs) rather than as odds ratios. The secondary outcomes (number of times the Screening Activity Report was accessed and cancer screening rates) were analyzed using a log-Poisson regression yielding estimated rate ratios (RaRs). The unit of analysis was the individual physician. The 3 intervention components were entered as main effects using effects coding to express the marginal effect of each intervention component; that is, the estimated RR or RaR represents the average effect across conditions with the particular intervention component present (for example, the average effect of anticipated regret on primary care physicians allocated to study arms 2 + 5 + 6 + 8) compared with conditions in which the intervention component was absent (the average effect of anticipated regret on primary care physicians allocated to study arms 1 + 3 + 4 + 7). Secondary analyses examined 2-way interactions between the intervention components. Per the pre-specified protocol, the primary analysis was adjusted for history of Screening Activity Report use. Pre-specified subgroup analyses examined effect modification through stratification by physician history of report access (binary) and sex (men vs. women). We had planned to explore additional effect modifiers, but given the absence of significant findings for the main effects, we chose to forego such analyses to avoid unwarranted increases in the risk of type I error. Payment models could not be tested because, counter to our original analytic plan24, data were not available from Cancer Care Ontario.
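The effects-coding scheme for the three factors can be illustrated with a small sketch. The arm numbering follows the 8 e-mail versions described in Figure 1, and the ±1 coding is the standard effects-coding convention; the helper function itself is illustrative, not taken from the study's SAS code.

```python
# Factor presence by e-mail version (per Figure 1): 1 = no factors,
# 2 = AR, 3 = MI, 4 = PS, 5 = AR+MI, 6 = AR+PS, 7 = MI+PS, 8 = all three.
# AR = anticipated regret, MI = material incentive, PS = problem-solving.
ARMS = {
    1: (), 2: ("AR",), 3: ("MI",), 4: ("PS",),
    5: ("AR", "MI"), 6: ("AR", "PS"), 7: ("MI", "PS"),
    8: ("AR", "MI", "PS"),
}

def effects_code(arm, factor):
    """Effects coding: +1 when the factor is present in the arm, -1 when
    absent, so each main effect contrasts the average of the four arms
    with the factor against the average of the four arms without it."""
    return 1 if factor in ARMS[arm] else -1

# Arms contributing to the anticipated-regret main effect
ar_present = sorted(a for a in ARMS if effects_code(a, "AR") == 1)
ar_absent = sorted(a for a in ARMS if effects_code(a, "AR") == -1)
```

The resulting groupings reproduce the contrast described in the text: arms 2, 5, 6, and 8 (factor present) versus arms 1, 3, 4, and 7 (factor absent).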

The analyst (GV) was not blind to experimental arms. The threshold for statistical significance was 0.05, and all analyses were conducted in the SAS software application.

Process Evaluation

We explored how and why the e-mail messages did or did not prompt physicians to access the Screening Activity Report.

Primary care physicians were recruited by convenience sampling, through the personal network of the principal investigator (NMI) and through snowball sampling as a secondary strategy. Because of privacy regulations, specific contact with participating physicians in each study arm was not possible. Physicians were eligible if they were randomized to receive an intervention e-mail message as part of the study. Between August and September 2017, consenting physicians participated in a 30-minute semi-structured telephone interview and received an honorarium of CA$150 in appreciation of their time. Physicians were recruited until data saturation was reached—that is, when no new comments were heard during the interviews31.

A first interview guide was developed by GV, MS, LD, NMI, and HOW24 and was later adapted into 8 versions to match the e-mail versions received by the physicians (supplemental Appendix 2). Interview guide development was guided by the Theoretical Domains Framework. Questions therefore sought to prompt physicians about the determinants of behaviour relating to Screening Activity Report access and cancer screening guideline adherence. If physicians could not retrieve the version of the e-mail message they had received, the version containing the anticipated regret, material incentive, and problem-solving factors was sent before the interview. All interviews were audio-recorded and transcribed.

The initial deductive framework was derived from the 3 behaviour change techniques20. However, because most physicians had not opened the intervention e-mail messages and were therefore not exposed to the intervention, we used framework analysis32 to develop summary categories that captured the central themes of the interviews. The lead author (GV), together with an experienced mixed-methods researcher (SCD), generated a set of initial themes after coding 3 interviews. Then, GV and SCD refined and sought themes in an iterative manner until no new themes were identified. Supplemental Table 2 presents the final codes. Using the new coding structure, all data were coded by 2 independent analysts (GV, SCD) using the NVivo software application (version 11: QSR International, Melbourne, Australia).


Participant Flow

The study included 5449 physicians, 2606 of whom received e-mail messages targeting anticipated regret; 2744, material incentive; and 2776, problem-solving. The study flow diagram (Figure 1) details the number of physicians receiving each e-mail version. Each month, between 4.6% and 7.3% of the messages bounced back (supplemental Table 3). The final retention rate was 99.0% (5396/5449, Figure 1).



FIGURE 1 Participant flow. E-mail 1, no factors; e-mail 2, anticipated regret; e-mail 3, material incentive (behaviour); e-mail 4, problem-solving; e-mail 5, anticipated regret and material incentive (behaviour); e-mail 6, anticipated regret and problem-solving; e-mail 7, material incentive (behaviour) and problem-solving; e-mail 8, anticipated regret and material incentive (behaviour) and problem-solving. SAR = Screening Activity Report; drop-out = Screening Activity Report account no longer active (physician did not access the Screening Activity Report at least once in the preceding year); unsubscribing = physician no longer receiving the e-mail message from Cancer Care Ontario.

Baseline Characteristics

Baseline characteristics of the participants were similar for all 3 factors (Table I) and the 8 different e-mail versions (supplemental Table 4).

TABLE I Baseline physician characteristics according to each operationalized behaviour change technique


Screening Activity Report Access by Factor

Table II presents descriptive summaries of the primary and secondary outcomes. On average, physicians accessed their Screening Activity Report less than 1 time during the 4 months of follow-up regardless of factor or e-mail version (Table II and supplemental Table 5). The Screening Activity Report was accessed by 567 (21.8%), 558 (20.3%), and 540 (19.5%) of the physicians in the anticipated regret, material incentive, and problem-solving groups respectively. The primary outcome is presented in supplemental Table 5 by e-mail version. The e-mail messages containing only the material incentive factor had the highest access rate (24.2%); e-mail messages containing both the material incentive and problem-solving factors had the lowest rate (17.3%) (supplemental Table 5).

TABLE II Descriptive statistics for outcomes according to each operationalized behaviour change technique


Main Effects on Screening Activity Report Access

Table III presents the results from the multivariable Poisson regression analysis of the primary outcome. No factor was associated with a statistically significant increase in Screening Activity Report use. However, after adjustment for history of Screening Activity Report use, problem-solving e-mail messages were associated with a statistically significant 12.9% relative reduction in Screening Activity Report access [RR: 0.871; 95% confidence interval (CI): 0.791 to 0.958; p = 0.005]. The material incentive factor (RR: 0.954; 95% CI: 0.867 to 1.050; p = 0.34) was also associated with a reduction (but not a statistically significant one), and the anticipated regret factor was associated with a nonsignificant increase (RR: 1.072; 95% CI: 0.975 to 1.180; p = 0.15).

TABLE III Multivariable Poisson regression analysis of primary and secondary outcomes


Main Effects on the Number of Times the Screening Activity Report Was Accessed

Interpretation of results for the number of times that the Screening Activity Report was accessed (Table III) is similar to that for the primary outcome. After adjustment for history of Screening Activity Report use, the problem-solving factor was associated with a statistically significant decrease of 18.7% in the number of times the Screening Activity Report was accessed (RaR: 0.813; 95% CI: 0.708 to 0.934; p = 0.003). The material incentive factor was also associated with a decrease, but not a statistically significant one (RaR: 0.887; 95% CI: 0.773 to 1.018; p = 0.09). The association with anticipated regret was positive, but not statistically significant (RaR: 1.090; 95% CI: 0.950 to 1.251; p = 0.22).

E-Mail Message Open and Click Rates

In May, open rates varied from 50% to 57%, and click rates varied from 7% to 14% (supplemental Table 3). Open rates then steadily declined to 45%–55% in June, to 44%–47% in July, and to 40%–46% in August. Click rates decreased to 6%–11% in June, to 7%–10% in July, and to 4%–8% in August (supplemental Table 3).

Main Effects on Up-to-Date Breast, Cervical, and Colorectal Cancer Screening Status

No e-mail message was associated with a statistically significant effect on rates of breast cancer screening or colorectal cancer screening. However, after adjustment for baseline cervical cancer screening, a statistically significant positive effect on cervical cancer screening was observed for problem-solving e-mail messages (RaR: 1.003; 95% CI: 1.001 to 1.006; p = 0.003; Table III). With 2,365,062 patients eligible for cervical screening, that difference represents 7568 more patients being screened during the 4 months.

Ancillary Analyses

Exploratory Interactions and Modification Effects

Table IV presents results from the exploratory subgroup analyses stratified by sex and history of Screening Activity Report use. The pattern of results was similar for male and female physicians. In particular, for male and female physicians alike, Screening Activity Reports were accessed less frequently by those receiving the problem-solving e-mail messages than by those not receiving those messages (RR for men: 0.849; 95% CI: 0.740 to 0.975; p = 0.02; RR for women: 0.895; 95% CI: 0.783 to 1.022; p = 0.10).

TABLE IV Factors affecting the primary outcome, stratified by sex and history of Screening Activity Report (SAR) use


Subgroup differences were observed for physicians with and without a history of Screening Activity Report access. For physicians without previous access, the e-mail messages were not associated with statistically significant effects; however, for physicians with previous access to the Screening Activity Report, receipt of problem-solving e-mail messages was associated with a statistically significant 13.8% lower risk of Screening Activity Report access (RR: 0.862; 95% CI: 0.781 to 0.951; p = 0.003; Table IV).

Analyses examining two-way interactions between the factors found no statistically significant interactions (supplemental Tables 6 and 7).

Process Outcomes by Factor

The problem-solving factor was associated with an increase in calls to Cancer Care Ontario or eHealth Ontario (RaR: 2.226; 95% CI: 1.055 to 4.694; p = 0.04), but anticipated regret and material incentive were not (anticipated regret RaR: 1.164; 95% CI: 0.586 to 2.312; p = 0.67; material incentive RaR: 0.560; 95% CI: 0.274 to 1.142; p = 0.11; supplemental Table 8).

Drop-outs and unsubscribing were not associated with any factor (anticipated regret RR: 1.130; 95% CI: 0.661 to 1.933; p = 0.65; material incentive RR: 1.284; 95% CI: 0.748 to 2.206; p = 0.36; problem-solving RR: 0.997; 95% CI: 0.583 to 1.704; p = 0.99; supplemental Table 8).

Process Evaluation

Supplemental Table 9 describes the characteristics of the physicians interviewed (n = 11). All physicians used electronic medical records, with a median of 7 years of use (interquartile range: 5–8 years), and 55% accessed their Screening Activity Report during the study.

Physicians reported using the monthly e-mail messages from Cancer Care Ontario simply as a reminder to check their Screening Activity Report at a later time, regardless of the content of the message. Only 1 physician interviewed used the link in the message to access the Screening Activity Report. Most physicians reported that they opened and glanced at, but did not attentively read, the messages. The physicians who opened the messages with the anticipated regret content reported that it provoked an emotional reaction, but no follow-up action. The material incentive content led to a discussion of the value of incentives, and the problem-solving suggestions were perceived as interesting, but not actionable, even though the suggestion to delegate responsibility was appealing.

Physicians also offered a number of suggestions to improve the messages. They asked for a brief, personalized message including the specific results and peer comparisons, although such content raises privacy concerns23. One also asked for the message to be sent to a delegate (for example, the clinic nurse), something suggested in the problem-solving messages; however, other respondents did not find that solution actionable either because their practice was new or because no colleague in the clinic was perceived to be worthy of that trust. Another suggestion was to send the e-mail messages with different information each time, possibly supplemented with links to guidelines, or even to send it by postal mail. Finally, physicians mentioned factors determining or facilitating their adherence to cancer screening guidelines. Such factors included knowing or learning more about using their electronic medical record to track screening, receiving incentives (financial or continuing medical education credits), and obtaining the results of tests ordered by other physicians (for example, gynecologists).

Whether or not they opened the e-mail messages, physicians typically had other systems in place (for example, electronic medical records) to track the cancer screening needs of their patients. Physicians who had such systems found the Screening Activity Report comparatively time-consuming, unhelpful, and redundant. The most commonly identified added value of the Screening Activity Report occurred when physicians took on new patients (and therefore had no patient history or record) or when their patients underwent tests ordered by other physicians. The Screening Activity Report was also useful in instances in which physicians had to validate electronic medical record data. Physicians made a number of suggestions to improve the Screening Activity Report, including integration of the report data into their electronic medical record; facilitating access to the report; and providing more accurate and “actionable” data (that is, personalized information that could be acted on immediately). Table V presents illustrative quotes from participants.

TABLE V Qualitative process evaluation (11 respondents)



In this study, we used the Multiphase Optimization Strategy to examine the effect of component behaviour change techniques within a monthly e-mail message sent to primary care physicians encouraging them to access a cancer screening audit and feedback report. In an effort to inform ongoing initiatives, we were able to embed this pragmatic factorial experiment involving more than 5000 physicians within the routine operations of our health system partner33. For audit and feedback to work, recipients must engage with the data34; for an e-mail message to prompt activity, recipients must open and review the e-mail content.

During the study, only about half the e-mail messages were opened, and in any given month, fewer than 11% of recipients “clicked through” the message to the Screening Activity Report. Although those open and click rates are well above the mean open rate (21%) and click rate (2%) reported by Mailchimp for e-mail messages in the medical, dental, and health care industries, they remain modest overall and might reflect the challenges of using e-mail to reach busy health professionals. Our qualitative findings suggest that physicians might not read the Screening Activity Report e-mail message and that some leave it flagged in their inbox to deal with later. Coupled with the qualitative findings, the low open rates suggest that the program’s theory must be revisited35. The results suggest that the choice to open, read, and act on a reminder e-mail message prompting a physician to check the Screening Activity Report depends on the importance or pertinence that the physician gives to the Screening Activity Report in relation to other priorities and in relation to other strategies for achieving the goal of cancer screening.

Operationalization of anticipated regret and material incentive within monthly e-mail messages to physicians did not influence Screening Activity Report access. Physicians receiving a message containing the problem-solving behaviour change technique were 13% less likely to access the Screening Activity Report during the 4-month study period. We operationalized problem-solving as a bulleted list: “Three tips from other Ontario family doctors on how to fit using the Screening Activity Report into your schedule.” The tips were to delegate access to a staff member by sending a message to a provided e-mail address, to book time in their calendar to review the report, or to “tackle a few patients at a time.” The effect of those problem-solving tips demonstrates that specific content choices in physician communication can influence behaviour.

We speculate that the specific response could reflect one of two effects. First, physicians who received the recommendation to authorize a delegate to access the report on their behalf might have followed that recommendation and thus reduced their own report access. The increase in cervical cancer screening with the problem-solving factor also supports that hypothesis, although inferences about the small observed effect in that secondary outcome are at risk of type I error, especially given the large number of statistical tests conducted. Second, problem-solving content might not be helpful when motivation to complete a task is not already high. In such cases, it is possible that emphasizing the difficulties of staying on top of reports fails to help recipients overcome gaps between intention and behaviour36. Further data about delegates will be necessary before firm recommendations can be made.

In the participatory design process and in the qualitative interviews described here, physicians noted other changes at the system level that could make it easier for them to use the data in their Screening Activity Report. They emphasized a desire for such reports to be integrated into the electronic medical records that they already use or to receive actionable data directly by e-mail. Thus, the operationalization of problem-solving might not have adequately addressed key barriers to the desired action. Those findings suggest that individuals developing audit and feedback interventions should carefully consider how to align their intervention with the workflow of the people who will use it. When the goals of the organization and of the users do not align, the ability to bring about change could be limited, even when using best practices in design.

Our study had 3 methodologic strengths. First, we were able to test, at scale, interventions that had been carefully co-designed with users, in partnership with a provincial agency. Second, rates of bounce-back, loss to follow-up, and unsubscribing were low (<10%), suggesting that selection bias was likely minimal. Third, we conducted qualitative interviews to help explain the quantitative results; those interviews provided further insight into how to optimize cancer screening in primary care.

The study had several limitations. First, the temporal window for observing changes might not have been ideal: physicians reported typically consulting their Screening Activity Report before receiving preventive care bonuses in the fall and winter, and the summer period might also have meant fewer medical visits. It was also not possible to restrict analyses to patients who visited their physician during the 4 months of the study. Second, outcome measures were automatically generated and collected from Mailchimp and Cancer Care Ontario databases. That approach avoided the overestimation of cancer screening rates that can occur with physician self-report37, which could otherwise have led to non-differential information bias and underestimation of effects. However, Mailchimp data could not be merged with Cancer Care Ontario data, meaning that measuring effects among physicians who opened their e-mail messages was not possible. Third, randomization was done at the physician level rather than the practice level, meaning that contamination is plausible; however, the risk of cross-exposure seems low, given the low open rates. Fourth, for the process evaluation, we used a convenience sample recruited through the principal investigator of the study, a sampling technique that could have limited the diversity of the sample. Fifth, we selected and operationalized the behaviour change techniques that seemed most likely to be effective, but it is plausible that, despite our best efforts, we selected the wrong techniques or operationalized them inadequately. Finally, we did not test different e-mail subject lines across the experimental conditions or over time, which might have contributed to declining open rates; future studies of e-mail prompts to physicians should consider randomizing and testing a variety of subject lines.

The findings about effective communication techniques to prompt action by primary care physicians are broadly applicable to other physicians in Canada and abroad. Qualitative insights from our process evaluation indicate that Screening Activity Report use depends on multiple practical factors that limit its fit within existing workflows. As in many evaluations of e-health tool implementations specifically (or of practice-change initiatives more broadly), the innovation must align with the intra- and inter-organizational structures involved in a specific care process38–41. Before implementing any new tool related to cancer screening (or any other process) into practice, it is necessary to identify aspects of the existing local processes where capacity and desire for change are both present. Foundational work to illuminate the realities “on the ground”—for example, surveying, meeting, interviewing, or shadowing the target population during a typical day of practice—should inform implementation strategies to improve quality of care.

Reflections on the Value of Catalyst Project Funding

This pragmatic experiment, conducted in collaboration with Cancer Care Ontario, allowed for a large-scale evaluation of multiple interventions. The catalyst project was an ideal circumstance in which to apply the first steps of the Multiphase Optimization Strategy framework for optimizing multicomponent interventions. The partnered approach also enabled the team to understand the constraints of real-world implementation and to inform future work. The grant provided an opportunity to apply principles of behavioural science to try to improve cancer care. Thus, in addition to the scientific findings, we highlight the benefits gained through working with a health care organization and conducting a trial using real-world data.


In this large factorial experiment, we tested 3 behaviour change techniques embedded in the content of e-mail messages sent to more than 5000 primary care physicians to increase use of an audit and feedback tool for cancer screening. Despite participants’ limited engagement with the intervention, we found that small changes in the content of e-mail communications to physicians can lead to behaviour change, with the potential for change in clinical outcomes. The study demonstrated the potential benefits of rigorous partnered research with health system organizations to examine ways to optimize quality improvement interventions. This type of large-scale implementation research is crucial when working on population-level interventions in which very small effects could be worthwhile. The study also highlights that improving uptake of audit and feedback is a complex problem, unlikely to be satisfactorily addressed with single or simple interventions. Future work in the area should focus on the key goal of increasing cancer screening rather than increasing use of a particular tool, given that the principle of user-centred design is to make systems useful to people rather than to make people use specific systems.


We thank all the participating physicians for their insights. Special thanks go to Petra de Heer from Cancer Care Ontario. This study was conducted with the support of the Ontario Institute for Cancer Research and Cancer Care Ontario through funding provided by the Government of Ontario. Funding for GV’s doctoral work was provided by that grant and by a Canadian Institutes of Health Research (cihr) Foundation Grant (fdn 148426, principal investigator HOW) during the preparation of the manuscript. HOW is supported by a Fonds de recherche du Québec–Santé Research Scholar Junior 2 salary award. NMI was supported by a cihr New Investigator Award and a Clinician Scientist Award from the Department of Family and Community Medicine at the University of Toronto and by the Women’s College Hospital Chair in Implementation Science. JMG holds a Canada Research Chair in Health Knowledge Transfer and Uptake and a cihr Foundation Grant (fdn 143269).


We have read and understood Current Oncology’s policy on disclosing conflicts of interest, and we declare the following interests: During the course of the project, authors CAB, DL, SU, and JT were employed at Cancer Care Ontario. The remaining authors have no conflicts to disclose.


*Quebec: Office of Education and Professional Development, Faculty of Medicine, Laval University (Vaisson, Witteman, Chipenda-Dansokho), Research Centre of the CHU de Québec, Laval University (Vaisson, Witteman), Department of Family and Emergency Medicine, Laval University (Witteman), and Laval University Primary Care Research Centre, Laval University, Quebec City (Witteman),
Ontario: Family Practice Health Centre, Women’s College Hospital, Toronto (Saragosa, Desveaux, Ivers); Institute for Health Systems Solutions and Virtual Care, Women’s College Hospital, Toronto (Saragosa, Bouck, Desveaux, Ivers); Dalla Lana School of Public Health, University of Toronto, Toronto (Bouck); Prevention and Cancer Control, Cancer Care Ontario, Toronto (Bravo, Llovet, Umar, Tinmouth); Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto (Llovet); Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa (Presseau, Taljaard, Grimshaw); School of Epidemiology and Public Health, University of Ottawa, Ottawa (Presseau, Taljaard); School of Psychology, University of Ottawa, Ottawa (Presseau); Department of Medicine, University of Ottawa, Ottawa (Grimshaw); Institute for Clinical Evaluative Sciences, Toronto (Tinmouth); Department of Medicine, University of Toronto, Toronto (Tinmouth); and Department of Family and Community Medicine, University of Toronto, Toronto (Ivers).


1 Kerner J, Liu J, Wang K, et al. Canadian cancer screening disparities: a recent historical perspective. Curr Oncol 2015;22:156–63.

2 Anas R, Bell R, Brown A, Evans W, Sawka C. A ten-year history: the Cancer Quality Council of Ontario. Healthc Q 2012;15(spec no):24–7.

3 Cancer Care Ontario (cco). Ontario Cancer Screening Performance Report 2016. Toronto, ON: cco; 2016.

4 Wools A, Dapper EA, de Leeuw JRJ. Colorectal cancer screening participation: a systematic review. Eur J Public Health 2016;26:158–68.

5 De Klerk CM, Gupta S, Dekker E, Essink-Bot ML on behalf of the Expert Working Group “Coalition to reduce inequities in colorectal cancer screening” of the World Endoscopy Organization. Socioeconomic and ethnic inequities within organised colorectal cancer screening programmes world-wide. Gut 2018;67:679–87.

6 Peterson EB, Ostroff JS, DuHamel KN, et al. Impact of provider–patient communication on cancer screening adherence: a systematic review. Prev Med 2016;93:96–105.

7 Anhang Price R, Zapka J, Edwards H, Taplin SH. Organizational factors and the cancer screening process. J Natl Cancer Inst Monogr 2010;2010:38–57.

8 Cancer Care Ontario (cco). Who We Are [Web page]. Toronto, ON: cco; 2015. [Available at: https://www.ccohealth.ca/en/who-we-are; cited 30 May 2017]

9 United States, Department of Health and Human Services, Office of Disease Prevention and Health Promotion, Healthy People 2020. Cancer [Web page, Objectives tab]. [Available at: https://www.healthypeople.gov/2020/topics-objectives/topic/cancer/objectives; cited 3 April 2019]

10 Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev 2012;:CD000259.

11 Scott A, Sivey P, Ait Ouakrim D, et al. The effect of financial incentives on the quality of health care provided by primary care physicians. Cochrane Database Syst Rev 2011;:CD008451.

12 Arditi C, Rège-Walther M, Wyatt JC, Durieux P, Burnand B. Computer-generated reminders delivered on paper to healthcare professionals: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2012;12:CD001175.

13 Tuti T, Nzinga J, Njoroge M, et al. A systematic review of electronic audit and feedback: intervention effectiveness and use of behaviour change theory. Implement Sci 2017;12:61.

14 Ivers NM, Sales A, Colquhoun H, et al. No more “business as usual” with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci 2014;9:14.

15 Brehaut JC, Colquhoun HL, Eva KW, et al. Practice feedback interventions: 15 suggestions for optimizing effectiveness. Ann Intern Med 2016;164:435–41.

16 Colquhoun HL, Squires JE, Kolehmainen N, Fraser C, Grimshaw JM. Methods for designing interventions to change healthcare professionals’ behaviour: a systematic review. Implement Sci 2017;12:30.

17 Grimshaw J, Ivers N, Linklater S, et al. on behalf of the Audit and Feedback MetaLab. Reinvigorating stagnant science: implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. BMJ Qual Saf 2019;:[Epub ahead of print].

18 Glazier RH, Klein-Geltink J, Kopp A, Sibley LM. Capitation and enhanced fee-for-service models for primary care reform: a population-based evaluation. CMAJ 2009;180:E72–81.

19 Jonah L, Pefoyo AK, Lee A, et al. Evaluation of the effect of an audit and feedback reporting tool on screening participation: the Primary Care Screening Activity Report (pcsar). Prev Med 2017;96:135–43.

20 Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med 2013;46:81–95.

21 Collins LM, Kugler KC, Gwadz MV. Optimization of multicomponent behavioral and biobehavioral interventions for the prevention and treatment of hiv/aids. AIDS Behav 2016;20(suppl 1):S197–214.

22 Wyrick DL, Rulison KL, Fearnow-Kenney M, Milroy JJ, Collins LM. Moving beyond the treatment package approach to developing behavioral interventions: addressing questions that arose during an application of the Multiphase Optimization Strategy (most). Transl Behav Med 2014;4:252–9.

23 Bravo CA, Llovet D, Witteman HO, et al. Designing emails aimed at increasing family physicians’ use of a Web-based audit and feedback tool to improve cancer screening rates: cocreation process. JMIR Hum Factors 2018;5:e25.

24 Vaisson G, Witteman HO, Bouck Z, et al. Testing behavior change techniques to encourage primary care physicians to access cancer screening audit and feedback reports: protocol for a factorial randomized experiment of email content. JMIR Res Protoc 2018;7:e11.

25 Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe KE, Zwarenstein M. The precis-2 tool: designing trials that are fit for purpose. BMJ 2015;350:h2147.

26 Canadian Institutes of Health Research (cihr), Natural Sciences and Engineering Research Council of Canada (nserc), and Social Sciences and Humanities Research Council of Canada (sshrc). TCPS2. Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. Ottawa, ON: cihr, nserc, and sshrc; 2010. [Available online at: http://www.pre.ethics.gc.ca/pdf/eng/tcps2/tcps_2_final_web.pdf; cited 21 April 2019]

27 Canadian Institutes of Health Research (cihr), Natural Sciences and Engineering Research Council of Canada (nserc), and Social Sciences and Humanities Research Council of Canada (sshrc). TCPS2 2014. Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. Ch. 3. The Consent Process. Ottawa, ON: cihr, nserc, and sshrc; 2014. [Available online at: http://www.pre.ethics.gc.ca/pdf/eng/tcps2-2014/TCPS_2_FINAL_Web.pdf; cited 3 December 2018]

28 Michie S, Atkins L, West R. The Behaviour Change Wheel: A Guide to Designing Interventions. Surrey, UK: Silverback Publishing; 2014.

29 International Standards Organization (iso). ISO 9241-210. Ergonomics of Human–System Interaction. Part 210: Human-Centred Design for Interactive Systems. Geneva, Switzerland: iso; 2010. [Available online at: https://www.sis.se/api/document/preview/912053; cited 27 February 2019]

30 Zou G. A modified Poisson regression approach to prospective studies with binary data. Am J Epidemiol 2004;159:702–6.

31 Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant 2018;52:1893–907.

32 Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol 2013;13:117.

33 Ivers NM, Grimshaw JM. Reducing research waste with implementation laboratories. Lancet 2016;388:547–8.

34 Brown B, Gute WD, Blakeman T, et al. Clinical Performance Feedback Intervention Theory (cp-fit): a new theory for designing, implementing, and evaluating feedback in health care based on a systematic review and meta-synthesis of qualitative research. Implement Sci 2019;:[In press].

35 Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf 2015;24:228–38.

36 Sheeran P, Webb TL. The intention–behavior gap. Soc Personal Psychol Compass 2016;10:503–18.

37 Montano DE, Phillips WR. Cancer screening by primary care physicians: a comparison of rates obtained from physician self-report, patient survey, and chart audit. Am J Public Health 1995;85:795–800.

38 Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82:581–629.

39 Greenhalgh T, Wherton J, Papoutsi C, et al. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res 2017;19:e367.

40 Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.

41 Keith RE, Crosson JC, O’Malley AS, Cromp DA, Taylor EF. Using the Consolidated Framework for Implementation Research (cfir) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci 2017;12:1–12.

Correspondence to: Gratianne Vaisson, Office of Education and Professional Development, Faculty of Medicine, Research Centre of the CHU de Québec, Laval University, 1050 avenue de la Médecine, Québec (Québec) G1V 0A6. E-mail: gratianne.vaisson.1@ulaval.ca


Supplemental material available at http://www.current-oncology.com

Current Oncology, VOLUME 26, NUMBER 3, JUNE 2019

Copyright © 2019 Multimed Inc.
ISSN: 1198-0052 (Print) ISSN: 1718-7729 (Online)