Online Exclusive Article

Multiple Myeloma Education: Results From the ACE Program’s Digital, Serial Learning Approach

Beth Faiman
Sandra Kurtin
Jocelyn Timko
Linda Gracie-King
CJON 2018, 22(5), E120-E126 DOI: 10.1188/18.CJON.E120-E126

Background: Understanding aspects of multiple myeloma (MM), from drug delivery to side effect management and survivorship, is critical to patient management. The Advanced Clinical Educator (ACE) program in MM combined live and web-based activities to help nurses gain mastery of the content and achieve ACE status.

Objectives: The primary objectives were to improve ACE candidates’ practice skills and knowledge of MM and prepare them to educate others.

Methods: Twenty ACE candidates were paired with an advisor and educated through a structured learning program. The RealMeasure® methodology measures the effect of education on intended learner cohorts, analyzing pre- and post-assessment data, as well as follow-up data, with a multidimensional metric that serves as a surrogate marker for performance.

Findings: Learners from the ACE program and a national cohort improved substantially from baseline averages at pretest to high levels of proficiency at post-test. This curriculum model fosters subject matter expertise, leadership, professional networking, and peer-to-peer learning.

The diagnosis and management of multiple myeloma (MM) continue to undergo rapid evolution. Insight into the biologic underpinnings of the disease has led to numerous recent drug discoveries (U.S. Food and Drug Administration, 2018). Nurse navigators, advanced practice providers, and bedside nurses are charged with integrating these changes into the management of MM and play a critical role in educating patients and their caregivers. The first Multiple Myeloma Mentorship Program (MMMP), offered in 2009, was well received, and it and subsequent programs highlighted the continued need for structured nurse education in MM (Faiman, 2011; Faiman, Kurtin, Timko, & Gracie-King, 2017; Faiman, Miceli, Richards, & Tariman, 2012).

In partnership with AXIS Medical Education and RealCME, two nurses with expertise in MM developed a continuing education program to evaluate data from two curricula. The primary objectives of the program were to improve nursing knowledge, practice skills, and level of performance in the diagnosis, treatment, and management of patients with MM, and to build nurses’ skills as subject matter experts and speakers in this therapeutic area. Therefore, the program was targeted primarily toward nurses involved in the treatment and management of patients with MM. One curriculum, a three-activity serial learning program, was offered to a national audience (the national cohort group). The other included a select group of nurses invited to participate as Advanced Clinical Educator (ACE) program candidates in a mandatory eight-activity intensive with three modules. The ACE candidates’ components included a focused, interactive, live, peer-to-peer educational experience culminating in ACE status. ACE status denotes an individual’s mastery of the content provided in the ACE program, as well as demonstration of effective presentation of the content to peers. Content was presented in three modules (see Figure 1).

Education Groups

ACE Candidates

From March 2016 to July 2017, 20 ACE candidates were paired with one of five experienced MM advisors. The ACE candidates were identified based on geographic location and desire to participate. Advisors (also known as mentors) were defined as MM clinical experts who coach, teach, guide, and empower the ACE candidates to achieve educational milestones that can assist them in providing optimal management of patients with MM. ACE candidates were defined as oncology nurses, oncology nurse educators, or advanced practice nurses being advised through a collaborative education process to initiate, evaluate, and implement evidence-based nursing care for the highest quality outcomes. The program was conducted in collaboration with AXIS, an accredited continuing education provider, and RealCME, a technology and outcomes analytics partner.

Each ACE candidate received instruction concerning best practices in the diagnosis and management of MM within a structured serial learning curriculum. The curriculum included a test-and-teach self-assessment activity prior to each of three self-study sessions. ACE candidates then participated in three accompanying virtual summits (one following each self-study session) and subsequently presented the information learned to their peers in a live format. The 20 ACE candidates presented two MM-focused slide decks, created by the clinical experts, to 420 total learners nationwide. The target audience included oncology and hematology/oncology nurses and nurse practitioners, in addition to other healthcare professionals who treat patients with MM in various practice settings (e.g., inpatient, outpatient, community).

The advisors were required to attend their ACE candidates’ first live presentation, with the option to also attend the second. The advisors completed a feedback form and rated components of their ACE candidate's presentation (e.g., knowledge of material, comfort level, speaking volume, ability to answer audience questions), and discussed the feedback with each candidate.

ACE candidates then completed a final self-assessment to conclude their participation in the ACE program. A score of 90% or higher on the final post-test self-assessment was required to attain ACE status. All 20 candidates met or exceeded this critical milestone and achieved ACE status. A digital feedback survey was conducted at the end of the program to measure participant satisfaction.

National Cohort

The modular activities were also opened to a national cohort of healthcare professionals. Pretest and post-test results for the two learning domains (knowledge and competence, confidence and practice strategy) were compiled across all activities. Learner performance was evaluated with RealIndex®, a single multidimensional situation-based question that incorporates clinical application of the curriculum and represents a surrogate marker for learner performance. Because there were three modular activities, some individuals participated in one, but not all, of the learning modules, and others may not have completed the modules they started; individual scores for these learners were not available for analysis.

Methodology

RealCME is an educational technology and analytics company that specializes in instructional design and outcomes measurement. RealCME partnered with AXIS to provide technology/web publishing and robust outcomes assessment of all ACE programs. The methodology used by RealCME, known as RealMeasure®, uses a sophisticated approach to measure the effect of education on the intended learner cohorts, analyzing pre-, post-, and four-week follow-up learner data in concert with a curriculum-based, multidimensional, index-based metric that serves as a surrogate marker for performance (RealIndex). These analyses include paired sample t tests, correlations, nonparametric testing, and opportunities for advanced analytics.
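
As a concrete illustration of the paired pretest/post-test comparison described above, the following is a minimal sketch in Python using hypothetical scores and SciPy's paired-sample t test. The program's actual analyses were conducted in IBM SPSS Statistics through the RealCME platform, so this is not the authors' code.

```python
# Minimal sketch of a paired pretest/post-test comparison on hypothetical data;
# the ACE program's actual analyses were run in IBM SPSS Statistics, not this code.
import numpy as np
from scipy import stats

# Hypothetical percent-correct scores for the same 20 learners before and after the curriculum
pretest = np.array([56, 62, 70, 58, 65, 72, 60, 55, 68, 74,
                    59, 63, 71, 66, 57, 69, 61, 64, 73, 67], dtype=float)
posttest = np.array([88, 92, 95, 90, 94, 96, 89, 87, 93, 97,
                     91, 90, 95, 92, 88, 94, 90, 91, 96, 93], dtype=float)

# Paired-sample t test: is the mean pretest-to-post-test change different from zero?
t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"mean gain = {(posttest - pretest).mean():.1f} points, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```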

Moore, Green, and Gallis (2009) described seven levels for assessing continuing education and measuring medical education outcomes. The model by Moore et al. (2009) is recognized as an effective measure of participant learning by the American Academy of Continuing Medical Education. Of the seven levels, five were deemed appropriate and used for analysis in this learning activity: participation, satisfaction, learning, performance, and patient health.

Statistical Analysis

Pretest and post-test questions for each of the individual activities within the ACE program (self-study activities, virtual summits, and the live presentations) were analyzed through the RealCME platform. In addition, individual baseline self-assessments were conducted for all participants at the beginning of the ACE program, and a final self-assessment was conducted at its conclusion. Participant responses were collected at the beginning and end of each activity. All appropriate statistical analyses for each metric were conducted using IBM SPSS Statistics, version 19.0, through the RealCME platform.

Results

Level 1 (Participation)

A total of 443 learners started the virtual content activities included in this educational initiative. Of learners who began an activity, 87% (n = 372) completed the educational content and 72% (n = 320) claimed credit. These rates are above benchmark data for previous MM programs offered on the RealCME educational platform during the past several years (Faiman, 2011; Faiman et al., 2012, 2017), including average content completion rates of 50% and certification rates of 41%. The results are outlined in Table 1.

Level 2 (Satisfaction)

A digital five-point Likert-type feedback form, with ratings ascending from 1 (poor) to 5 (excellent), was completed by each ACE participant at the end of the program. Participants indicated a high level of overall satisfaction (4.74) with the ACE program. Self-ratings for the effect of the ACE program on knowledge of MM (4.79), clinical performance in managing MM (4.63), and confidence/comfort in managing MM (4.63) indicated satisfaction among the study participants. Learners also indicated that the learning objectives identified by the curriculum were reliably met through the instruction.

Levels 3–4 (Findings)

Statistically significant gains were measured in all learning domains (p ≤ 0.01) (see Table 2). Substantial improvements in knowledge were measured across the curriculum in all learning domains for both ACE participants (21% increase from baseline, p < 0.05) and the national cohort (47% increase from baseline, p ≤ 0.05). Similarly, improvements in competence, confidence, and practice strategy were observed for both the ACE participants and the national cohort. Across the curriculum, the ACE candidates demonstrated the greatest proficiency (mean post-test score = 93%) compared to the national cohort (mean post-test score = 86%). In addition, the national cohort improved from relatively low averages at pretest in knowledge (mean = 58%) and competence (mean = 60%), demonstrating improvement in both domains. When learners’ self-reported confidence and practice strategy ratings were evaluated, perceived proficiency for both learner cohorts was reflective of their demonstrated knowledge and competency.
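
The percentage increases reported above are most naturally read as relative gains from baseline, that is, the pretest-to-post-test change expressed as a proportion of the pretest mean. A minimal sketch with hypothetical means (the specific values below are illustrative, not the program's data):

```python
def relative_gain(pretest_mean: float, posttest_mean: float) -> float:
    """Pretest-to-post-test change expressed as a percentage of the pretest mean."""
    return (posttest_mean - pretest_mean) / pretest_mean * 100

# Hypothetical example: a cohort moving from a 58% pretest average to an 85% post-test
# average shows a relative gain of roughly 47% from baseline.
print(f"{relative_gain(58, 85):.0f}% increase from baseline")
```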

Knowledge and competence items mapped to curriculum learning objectives showed substantial improvements from pretest, ranging from 5%–58% for the ACE candidates and from 18%–115% for the national cohort. At pretest, performance for both cohorts was variable, with averages ranging from 56%–94% for ACE candidates and 38%–84% for the national cohort. However, by the conclusion of the curriculum, all learners attained proficiency, if not mastery (except the national cohort on learning objectives 2 and 7). Learners who participated in this education as ACE candidates remained more proficient across the curriculum, except for learning objective 5, for which national cohort learners demonstrated greater proficiency at post-test (see Table 3).

Level 5: Performance (the RealIndex)

Statistically significant (p ≤ 0.02) improvements of 22% and 15% were measured from baseline to the final RealIndex for ACE candidates and the national cohort, respectively; these increases equated to medium effect sizes (d = 0.746 and d = 0.32) and a high level of power (0.9). Taken together, these findings indicate that the education had a substantial effect on both groups, although the impact on ACE candidates was greater (see Table 4). The improvements for both cohorts on this applied performance metric exceeded historic RealCME improvement benchmarks for this domain (5%).
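
The d values reported above are Cohen's d effect sizes. The article does not state which formulation was used; for a paired pre/post design, one common approach divides the mean change by the standard deviation of the individual changes, as in this hypothetical sketch:

```python
import numpy as np

def paired_cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
    """Cohen's d for paired samples: mean change divided by the SD of the changes."""
    diff = post - pre
    return diff.mean() / diff.std(ddof=1)

# Hypothetical baseline and final RealIndex-style scores for eight learners
pre = np.array([54.0, 62.0, 50.0, 58.0, 47.0, 61.0, 55.0, 59.0])
post = np.array([66.0, 59.0, 58.0, 59.0, 57.0, 59.0, 61.0, 59.0])
print(f"d = {paired_cohens_d(pre, post):.2f}")  # a medium-sized effect on this toy data
```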

RealIndex Progression

The RealIndex performance metric is a serial measure that acts as a surrogate marker for clinical performance. Learners are presented with a case vignette and a series of clinical decisions that they sort as either consistent or inconsistent with their current practice approach. ACE candidates were required to complete all curriculum components and, as such, their RealIndex performance is represented. National cohort learners’ averages remained relatively low at all levels of curriculum engagement. However, gains were observed in all groups, regardless of engagement level. For those who took part in a single activity, scores increased from 51% to 55%. National cohort participants who took part in two activities saw their scores increase from a baseline of 51% to 63% after the first activity and 67% after the second. Those who completed all activities saw an increase from a baseline of 54% to 58%, 63%, and 66% for each activity, respectively.
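
To illustrate how the progression figures above could be tabulated, the sketch below groups hypothetical learners by the number of activities they completed and averages their RealIndex-style scores at each step. The data layout and field names are assumptions for illustration, not RealCME's actual data model.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-learner score trajectories: baseline followed by one score per completed activity
trajectories = {
    "learner_01": [51, 55],               # completed one activity
    "learner_02": [51, 62, 68],           # completed two activities
    "learner_03": [54, 58, 63, 66],       # completed all three activities
    "learner_04": [53, 57, 62, 67],
}

# Group learners by engagement level (number of activities completed),
# then average the cohort's score at baseline and after each activity.
by_engagement = defaultdict(list)
for scores in trajectories.values():
    by_engagement[len(scores) - 1].append(scores)

for n_activities, cohort in sorted(by_engagement.items()):
    progression = [round(mean(step), 1) for step in zip(*cohort)]
    print(f"{n_activities} activity(ies) completed: {progression}")
```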

Discussion

The activities in the ACE program were designed to construct a bridge between self-directed learning, didactic training, and independent clinical practice for oncology nurses. The program provided tools and resources that enabled oncology nurses to gain expertise and competence by applying new skills in clinical practice with the support and guidance of a more specialized and experienced MM advisor/clinical expert. As noted previously, advisors were MM clinical experts intended to coach, teach, guide, and empower the ACE participants to achieve educational milestones, and ACE candidates were oncology nurses, oncology nurse educators, or advanced practice nurses advised through a collaborative education process to initiate, evaluate, and implement evidence-based nursing care for the highest quality outcomes. The ACE program was implemented as a reciprocal and collaborative learning relationship designed to share mutual responsibility and accountability for helping ACE candidates work toward clear and mutually defined learning goals regarding the optimal treatment of patients with MM.

The objective of the ACE program was not only to improve ACE candidates’ knowledge, competency, and level of performance in the treatment and management of patients with MM, but also to build their skills as subject matter experts and speakers in this therapeutic area. After successfully achieving all of the milestones throughout the program, the ACE candidates attained advanced clinical educator status and were added to the AXIS database to present future live activities.

In this program, ACE candidates demonstrated varied performance at pretest. Then, at the conclusion of the ACE curriculum, learners demonstrated uniform proficiency, if not mastery, in both domains (knowledge and competence) across the curriculum. A comparative evaluation of performance among the ACE activities revealed that performance was relatively uniform across educational contexts. When confidence and practice strategy were evaluated, the increases in learner ratings were commensurate with their demonstrated improvements in proficiency.

In the national cohort, all learners who participated in the curriculum components available to a national audience improved substantially from baseline to high levels of proficiency by post-test, except in competence regarding symptom and side effect management, for which learners’ post-test average was the lowest observed. An analysis of confidence and practice strategy ratings revealed that this cohort of learners felt more able and empowered to implement this education in their practice, but did not overestimate their proficiency.

When ACE candidates’ performance was assessed relative to that of the national cohort, the learners who participated as ACE candidates were modestly more proficient by post-test in all domains across all activities compared to learners who participated in the national cohort components. Since interim reporting, learners in the national cohort components of the curriculum improved substantially, with post-test performance roughly comparable to that of the specialized ACE learners. These findings suggest that ACE candidates and national cohort participants alike attained proficiency and are ready for more challenging education.

Patient Reach

ACE participants and the national cohort of learners were asked to approximate the number of patients with MM that they see per week. Participants from the national cohort reported seeing an average of 1–9 patients with MM per week; ACE candidates reported seeing an average of 1–5 patients with MM per week. Based on the program’s results from 100 learners who completed the content of at least one activity, an anticipated 1,080 patients with MM per month may benefit from this program’s preparation of clinicians. In addition, this finding highlights the efficacy of this curriculum at targeting and engaging an audience of specialized and nonspecialized learners actively involved in the treatment and management of patients with MM.
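
The article does not detail how the monthly reach estimate was derived. One plausible calculation, sketched below, multiplies the number of completing learners by an assumed average weekly MM patient volume and an assumed four-week month; the 2.7 patients-per-week figure is an assumption chosen for illustration, not a value reported by the program.

```python
# Illustrative reach estimate only; the weekly average (2.7) and the 4-week month
# are assumptions for this sketch, not figures reported by the ACE program.
completing_learners = 100           # learners who completed at least one activity
assumed_patients_per_week = 2.7     # hypothetical average consistent with the reported weekly ranges
weeks_per_month = 4

estimated_reach = completing_learners * assumed_patients_per_week * weeks_per_month
print(f"~{estimated_reach:,.0f} patients with MM per month")
```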

Challenges and Sustainability

A minor challenge was keeping ACE candidates and advisors engaged throughout the year-long program. Methods for promoting adherence to the mandatory activities within the requested time frame included calendars, email reminders, and telephone calls from AXIS to the candidates. Mentors and advisors also reached out to ACE candidates at various intervals to keep lines of communication open and enhance learning. The AXIS team, mentors, advisors, and ACE candidates worked well together to complete the required activities and were able to complete the program as designed.

This project was funded by a grant and, therefore, resources were available to successfully carry out the educational activities. However, it is the opinion of the current authors that the educational model can be adopted for other cancer types and in other settings. Based on the successes of this program, AXIS has secured grant funding for mentorship programs in other tumor types (specifically, breast cancer and non-small cell lung cancer). This MM program was first developed in 2009, and plans to continue it are contingent on grant funding.

Implications for Nursing and Conclusion

The 2016/2017 ACE program was modeled after previous successful programs in MM. In partnership with AXIS, the program co-chairpersons developed specialized content to provide systematic education to nurses involved in the care of patients with MM. A national cohort was also evaluated to compare these learners with candidates invited to participate in the enhanced advanced clinical educator track. Using Moore et al.’s (2009) conceptual model and the RealIndex statistical analysis, improvements from pretest to post-test were demonstrated in all areas of this program. The ACE candidate cohort was more proficient than the national cohort, as demonstrated by comparatively higher average performance scores in all domains. These findings highlight the success of the ACE component on learner performance relative to the performance of the national cohort, which did not receive individual advisor/mentorship guidance. The gains measured across the curriculum in all learning domains reflect the need for specialized education among oncology nurses and support the creation of future educational activities featuring an individualized ACE component.

This integrated curriculum model fosters subject matter expertise, leadership, professional networking, and peer-to-peer learning. These findings are consistent with previous MM ACE programs and reinforce the idea that structured, virtual learning has consistently led to improvements in participants’ knowledge and practice skills. Therefore, this educational model should be considered for future educational programs.

About the Author(s)

Beth Faiman, PhD, MSN, APRN-BC, AOCN®, is a nurse practitioner (NP) at the Cleveland Clinic in Ohio; Sandra Kurtin, ANP-C, AOCN®, is an NP, clinical assistant professor of medicine, and adjunct clinical professor of nursing at the University of Arizona Cancer Center in Tucson; and Jocelyn Timko, BS, is a senior medical writer and Linda Gracie-King, MHS, is a managing partner, both at AXIS Medical Education in Fort Lauderdale, FL. The authors take full responsibility for this content. This research was supported by educational grants from the Celgene Corporation, Novartis Pharmaceuticals, and Takeda Oncology. Faiman has previously consulted for and received support from Celgene Corporation and Takeda Oncology. Kurtin has previously consulted for AbbVie, Amgen, Celgene Corporation, Genentech, Incyte Corporation, Novartis Pharmaceuticals, Pharmacyclics, and Takeda Oncology, and has received additional support from Agios Pharmaceuticals, Celgene Corporation, Novartis Pharmaceuticals, and Otsuka Pharmaceutical. The article has been reviewed by independent peer reviewers to ensure that it is objective and free from bias. Faiman can be reached at faimanb@ccf.org, with copy to CJONEditor@ons.org. (Submitted January 2018. Accepted March 18, 2018.)

 

References 

Faiman, B. (2011). Overview and experience of a nursing e-mentorship program. Clinical Journal of Oncology Nursing, 15, 418–423. https://doi.org/10.1188/11.CJON.418-423

Faiman, B., Kurtin, S.E., Timko, J., & Gracie-King, L. (2017). Multiple myeloma mentorship: Bridging communication and educational gaps. Clinical Journal of Oncology Nursing, 21, 99–103. https://doi.org/10.1188/17.CJON.99-103

Faiman, B., Miceli, T.S., Richards, T., & Tariman, J.D. (2012). Survey of experiences of an e-mentorship program: Part II. Clinical Journal of Oncology Nursing, 16, 50–54.

Moore, D.E., Jr., Green, J.S., & Gallis, H.A. (2009). Achieving desired results and improved outcomes: Integrating planning and assessment throughout learning activities. Journal of Continuing Education in the Health Professions, 29, 1–15. https://doi.org/10.1002/chp.20001

U.S. Food and Drug Administration. (2018). Drug approvals and databases. Retrieved from https://www.fda.gov/Drugs/InformationOnDrugs/default.htm