Advancing Assessment and Evaluation Webinar Series

This four-part archived webinar series is based on the National Center’s Practical Guide No. 5: Incorporating IPCP Teamwork Assessment into Program Evaluation. Guide 5 builds on foundational principles of measurement and assessment described in Practical Guides 1–4, in previous National Center webinars on measurement, and in the measurement primer Evaluating Interprofessional Education and Collaborative Practice: What Should I Consider When Selecting a Measurement Tool? All of these resources are available here at nexusipe.org.

The webinars, each one hour long, are presented by the four authors of Guide 5. To receive maximum benefit from these webinars, we strongly encourage you to download and read Guide 5 before participating.


Archived Recording for Practical Guide 5 Launch – Incorporating IPCP Teamwork Assessment into Program Evaluation
Connie C. Schmitz, PhD, National Center Evaluation Consultant, Chair, National Center Measurement Collection Advisory Board
Barbara F. Brandt, PhD, Director, National Center for Interprofessional Practice and Education


Webinar 1: Designing the Evaluation: Frameworks, Principles, and Planning Tools

Connie C. Schmitz, PhD, National Center Evaluation Consultant, Chair, National Center Measurement Collection Advisory Board

Evaluation does not begin with an assessment tool. It begins with a well-thought-out plan for what needs to be evaluated, how, and why. In this webinar, we discuss critical things to consider and know before designing an evaluation, and outline the 15 components of a written evaluation plan. We present several evaluation frameworks and planning tools, and walk participants through the process of creating a logic model to guide their evaluation. By the end of the webinar, participants should be able to:

  1. Describe elements of their own evaluation context that impact evaluation planning
  2. Explain the if-then thinking (logic) behind their program
  3. Understand how guiding evaluation questions influence evaluation planning

Webinar 1 Evaluation Link

Webinar 2: Evaluation Planning in Action: One University's Story

Lauren Collins, MD, Thomas Jefferson University, Co-Director of the Jefferson Center for Interprofessional Practice and Education (JCIPE)

The principles of evaluation design and planning sound good in the abstract, but how do they work in practice, given the complexity of IPCP? In this webinar, we explain how the Jefferson Center for IPE used Guide 5 to develop an evaluation plan for its Interprofessional Student Hotspotting Learning Collaborative. The goal of hotspotting is to provide high-touch, interprofessional, team-based care for super-utilizers (i.e., medically and socially complex patients who incur a disproportionate number of ED visits and hospital admissions). Jefferson’s system-wide intervention has multiple potential impacts on student learning, patient care, and organizational practices. By the end of the webinar, participants should be able to:

  1. Understand the Jefferson Team’s context for evaluation
  2. Relate the Team’s guiding evaluation questions to their Hotspotting logic model
  3. Understand the difference between general and refined evaluation questions
  4. Track the relationship between refined questions, data sources and data collection methods

Webinar 2 Evaluation Link

Webinar 3: Using Qualitative Methods to Collect and Analyze Your Data

Barret Michalec, PhD, University of Delaware, Associate Dean of IPE

Collecting qualitative data and making sense of it is easy, right? Well, no, actually! Using qualitative methods requires as much rigor, attention to detail, and care to avoid bias as using quantitative methods does. In this webinar, we discuss the types of programs, evaluation purposes, and evaluation questions that benefit most from qualitative approaches. We provide a working overview of different qualitative methods for collecting and analyzing data, as well as pitfalls to avoid. By the end of the webinar, participants should be able to:

  1. Outline a qualitative approach for a given evaluation purpose and question
  2. Identify ways to ensure high quality data when using interviews, focus groups, case logs, and observation protocols
  3. Differentiate between inductive and deductive methods of analysis

Webinar 3 Evaluation Link

Webinar 4: Using Quantitative Methods to Collect and Analyze Your Data

Doug Archibald, PhD, University of Ottawa, Researcher in the C.T. Lamont Center for Research in Primary Care

Quantitative methods can be helpful for all types of programs and evaluations, but they are critical for answering summative evaluation questions concerning the efficacy and impact of IPECP interventions on Triple Aim outcomes. In this webinar, we review several quantitative designs along with their strengths and limitations for IPECP environments. We provide examples of these designs and walk through some of the principles that influence data collection and analysis. By the end of the webinar, participants should be able to:

  1. Outline a quantitative approach for a given evaluation purpose and question
  2. Identify ways to ensure high quality data when using surveys, observational ratings sheets, and data extraction protocols
  3. Differentiate between descriptive and inferential methods of analysis

Webinar 4 Evaluation Link