LCME Element 6.1: Program and Learning Objectives

June 13, 2023

The faculty of a medical school defines its medical education program objectives in outcome-based terms that allow the assessment of medical students’ progress in developing the competencies that the profession and the public expect of a physician. The medical school makes these medical education program objectives known to all medical students and faculty. In addition, the medical school ensures that the learning objectives for each required learning experience (e.g., course, clerkship) are made known to all medical students and those faculty, residents, and others with teaching and assessment responsibilities in those required experiences.

Hidden Curriculum

Element 6.1 is one of those foundational elements on which so many others depend, most especially Element 8.4: Evaluation of Educational Programs.  Element 6.1 demands more than the adoption of nicely worded competencies and program objectives.  It requires that every program objective be stated in clear, specific, and measurable terms.  Compliance with this element means a well-defined, evidence-supported connection between each program objective, its assessments, and the resulting outcomes.  Yes, every program objective requires an assessment that enables leadership to make outcome determinations at both the individual and cohort levels.

The dissemination of competencies and program objectives is another important piece of this element, one that can get overshadowed by the data-driven requirements.  Medical education is complex, fast-moving, and rigorous, and it requires a host of stakeholders to bring it to fruition, including the students themselves.  Medical students, faculty, residents, and any other professional with teaching and assessment responsibilities must have a documented understanding of the program objectives to guide their daily efforts (see also Element 9.1: Preparation of Resident and Non-Faculty Instructors).

Best Practices

Within the Data Collection Instrument (DCI), schools are asked to complete a chart that organizes the wealth of information associated with this element.  Consider completing this chart well before a site visit and using it as the canonical source to guide assessment development, data collection, and program evaluation schedules.  In fact, the chart can serve as an effective communication tool for the many parties that play a role in the medical education journey.
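To keep the chart actionable rather than treating it as a one-time accreditation exercise, some schools maintain the same information in a single machine-readable source of truth.  The sketch below is a hypothetical illustration in Python; the field names and example content are illustrative assumptions, not the DCI's required column headings.

    # A minimal, hypothetical sketch of keeping a DCI-style mapping of program
    # objectives, assessments, and benchmarks in one machine-readable place.
    # Field names and example content are illustrative, not LCME requirements.
    from dataclasses import dataclass, field

    @dataclass
    class ProgramObjectiveRow:
        objective: str                                        # outcome-based program objective
        competency: str                                       # competency domain it supports
        assessments: list[str] = field(default_factory=list)  # measures that generate outcome data
        benchmarks: list[str] = field(default_factory=list)   # standards used for outcome determinations
        review_cycle: str = "annual"                          # when the curriculum committee reviews it

    chart = [
        ProgramObjectiveRow(
            objective="Gather a complete and accurate patient history",
            competency="Patient Care",
            assessments=["Clerkship OSCE stations", "USMLE Step 2 CK"],
            benchmarks=["Internal OSCE pass standard", "National Step 2 CK mean"],
        ),
    ]

    # The same structure can feed assessment development, data-collection
    # calendars, and program-evaluation agendas without re-keying the chart.
    for row in chart:
        print(f"{row.objective} -> {', '.join(row.assessments)} ({row.review_cycle})")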

A measure without a benchmark becomes data without context.  Make your data collection efforts count by identifying standards against which to make outcome determinations.  Standardized exams, AAMC questionnaires, and residency match rates all lend themselves to benchmarking because they provide national data.  However, a program must put extra thought into benchmarking its home-grown assessments and course pass rates.  Standard-setting efforts and longitudinal data collection are key to establishing internal standards.  Invest in the systems and people to orchestrate and synthesize these data.
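As a hedged illustration of what an internal standard can look like in practice, the sketch below compares a current cohort's pass rate on a home-grown assessment against a benchmark built from prior years' results.  The numbers and the mean-minus-one-standard-deviation rule are assumptions for illustration, not a prescribed standard-setting method.

    # Illustrative only: benchmark a home-grown assessment against an internal
    # standard derived from longitudinal data. Values are made up.
    from statistics import mean, stdev

    prior_cohort_pass_rates = [0.91, 0.93, 0.90, 0.94, 0.92]  # five prior years
    current_cohort_pass_rate = 0.87

    # Internal benchmark: historical mean minus one standard deviation.
    benchmark = mean(prior_cohort_pass_rates) - stdev(prior_cohort_pass_rates)

    if current_cohort_pass_rate < benchmark:
        print(f"Below internal benchmark ({current_cohort_pass_rate:.2f} < {benchmark:.2f}): flag for review")
    else:
        print(f"Meets internal benchmark ({current_cohort_pass_rate:.2f} >= {benchmark:.2f})")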

The 2024-25 DCI now includes a chart to capture Independent Student Analysis (ISA) results on “satisfaction with the utility of the educational program objectives to support learning.”  While previous DCIs inquired about the process for disseminating program objectives, they did not dig into student perceptions of them.  The revised 2024-25 DCI not only asks for ISA data but also requires a narrative summary of students’ perceptions.  This is a new quantitative and qualitative dimension that schools must manage, and they should consider incorporating such questions into internal annual surveys.

Develop a dissemination plan that is executed through multiple modalities and captures annual attestations, particularly for residents and students.  Learning management systems for courses should clearly post competencies, program objectives, and assessments as a standard feature.  A school’s website should also publicize the information – a helpful resource for programs with distributed campuses.  Annual residency onboarding and retreats should incorporate digital acknowledgement of this information.  Moreover, departments should dedicate an annual meeting to reviewing objectives and aggregated data, allowing faculty members to see the connection between their teaching and educational outcomes.
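Capturing attestations is easier when the roster of everyone with teaching and assessment responsibilities lives in one place.  The snippet below is a minimal, hypothetical sketch of flagging who still owes an annual acknowledgement; in practice the roster and acknowledgement records would come from the LMS, residency onboarding system, or survey platform, and the names here are invented.

    # Hypothetical sketch: identify who has not yet attested to receiving the
    # competencies and program objectives this year.
    teaching_roster = {"dr_alvarez", "dr_chen", "resident_patel", "resident_okafor"}
    acknowledged_this_year = {"dr_alvarez", "resident_patel"}

    missing = teaching_roster - acknowledged_this_year
    if missing:
        print("Follow up on attestations for:", ", ".join(sorted(missing)))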

Continuous Quality Improvement

Element 8.4: Evaluation of Educational Programs is a best friend to Element 6.1.  Element 8.4 is the vehicle for continuous quality improvement (CQI) whereas Element 6.1 is the map.  CQI is designed to ask tough questions by examining data on a regular basis, and these elements are at the heart of it.  Faculty should be able to review data and answer questions like: Are students achieving our program objectives? Are we fulfilling our duty as faculty by adequately teaching and assessing students in preparation for a demanding career? How are we performing in comparison to national data or peer institutions? 

The more critical part of CQI is developing action plans when outcomes are determined to be less than desirable.  These elements are designed to prevent programs from becoming complacent and simply maintaining the status quo.  Investing in the resources, both software and staff, to orchestrate curriculum mapping and data collection aids in this process.  The cautionary tale is that data, on their own, change nothing.  Medical school leadership must be dedicated to acting on internal findings to ensure the medical education program continues to evolve and improve medical students’ educational experiences.
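As one illustrative way of closing that loop, the sketch below flags any program objective whose observed outcome falls short of its target and turns it into an action item for the curriculum committee.  The objectives, metrics, and targets shown are hypothetical examples, not a required reporting format.

    # Illustrative CQI loop: compare each objective's observed outcome to its
    # target and build an action list where performance lags.
    outcomes = {
        "Gather a complete and accurate patient history": (0.88, 0.90),          # (observed, target)
        "Apply evidence-based medicine to clinical decisions": (0.95, 0.90),
    }

    action_plan = [
        f"Review teaching and assessment for: {objective}"
        for objective, (observed, target) in outcomes.items()
        if observed < target
    ]
    print("\n".join(action_plan) or "All program objectives met their targets this cycle.")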
