Backward Design Step 9: Evaluate Your Course

Getting started with hybrid instruction

This document is part of a larger collection of documents on hybrid instruction from the Center for Teaching, Learning and Mentoring's Instructional Resources KnowledgeBase. See more hybrid instruction documents.


Create an evaluation plan to measure the effectiveness of your course

STEP 9: EVALUATE.

As you design and deliver your course, you will be using new techniques and facilitating learning activities that may differ from what students are used to. These changes may temporarily lower your end-of-semester course evaluations. Blended course design should therefore include a plan to collect data on the positive outcomes of your new design, data that can be used alongside the traditional course evaluation process.

Developing an evaluation plan

The following are some suggestions to consider in identifying focus areas, choosing types of evidence, and developing your evaluation plan.

  • Talk with your department chair about your plans to blend your course. Ask what evidence they would want to see about the performance of your course.
  • If you are pre-tenured, engage in conversations with your chair and tenure committee about what evidence they would want to see in addition to formal end-of-semester evaluation results.
  • Review your course outcomes, unit objectives, and student assessments. Consider the kinds of data you can collect to show progress or improvement toward these designated levels.
  • Use the course syllabus to document student success in completing course outcomes.
  • Think about the kinds of qualitative data you can collect to measure the softer outcomes of your course, such as student engagement, quality of discourse, preparation for class, and/or class attendance.

Evaluation plan example: MUS STU 352

The following example is from MUS STU 352 — Visitor Experience and Design in Museums, taken from the pre-class case studies. Before Marcy teaches her course, she identifies her goals and those of other stakeholders. Based on conversations with her chair and others, Marcy identifies the following focus areas as measurements of success for her course design and builds a plan to address them:

  • Student professionalism in dealing with museum staff;
  • Quality of semester projects (using a rubric she develops);
  • Student compliance with learning pathway;
  • Quality and level of student contributions in class; and,
  • Level of preparation for future Museum Studies courses.
Evaluation Plan

Program Outcomes (PO) (indicates the program outcomes supported by this course)

  • PO1: Develop a high degree of professionalism and responsibility, skills in problem-solving, critical thinking and evaluation, writing, oral presentation, decision-making, and teamwork.
  • PO2: Apply professional museum standards and ethics and proper care and interpretation of collections.
  • PO3: Explore the role of museums in our complex and ever-changing society.
  • PO4: Develop skills around administration and organizational problems in modern museums, finances and budgets, legal-administrative problems, relations with support groups and volunteers, record keeping, and management of museum projects.
  • PO5: Develop skills around the collection, organization, storage, care, and scientific use of museum collections.
  • PO6: Develop and design museum visitor experiences.

Course outcomes (CO)

  • CO1: Analyze the elements of the visitors’ meaning-making: framing, resonating, channeling, and broadening.
  • CO2: Design visitor experiences that build connections between content and viewers in ways that foster specific kinds of encounters.
  • CO3: Design exhibit narratives for different levels of museum visitors.
  • CO4: Develop an engaging user experience to support the exhibit's narrative.
  • CO5: Value the importance and impact of a well-designed museum experience.
  • CO6: Use museum spaces to encourage reflection and dialogue.
  • CO7: Design museum exhibits that develop and demonstrate cultural understanding.
  • CO8: Develop competence, depth, and expertise in the design of museum visitor experiences.
  • CO9: Develop a high degree of professionalism and responsibility, skills in problem-solving, critical thinking and evaluation, writing, oral presentation, decision-making, and teamwork.
  • CO10: Develop research and interviewing skills to gain feedback on exhibits and further the field of Museum Studies.
Units and Unit Objectives (UO)

Envisaging the Discipline of Museum Design
  • UO1: Situate exhibition design within the field of Museum Studies.
  • UO2: Recognize the historical shifts in the discursive formation of exhibits.
  • UO3: Recognize the historical changes in the purposes of exhibition design.
User Experiences in Museums
  • UO1: Analyze the multiple ways in which experience has been conceptualized in museum studies.
  • UO2: Hypothesize on future trends in visitor experience design.
  • UO3: Develop strategies to respond to new trends in practical terms.
Deconstructing Visitor Experiences
  • UO1: Build a conceptual decision-making framework for designing visitor experiences.
  • UO2: Identify the ways an exhibit conceptualizes the subject and how they impact what visitors see through an exhibition.
  • UO3: Explore the ways an exhibit facilitates the investigation of the subject.
  • UO4: Develop a process for gathering feedback from visitors on their experiences with an exhibition.
Framing
  • UO1: Consider how the visitors' conceptualization of ‘museum’ and ‘exhibit’ guides their participation in and evaluation of their museum experience.
  • UO2: Situate the exhibition in ways that help visitors classify what they are experiencing.
  • UO3: Organize the exhibition into an appropriate framework to guide comprehension.
Resonating
  • UO1: Analyze the way visitors meet each item within the collection.
  • UO2: Organize items within the collection that connect to build understanding.
  • UO3: Analyze how the qualities of the visitor and the qualities of the museum display mesh in a transactional exchange.
Channeling
  • UO1: Identify the intended wayfinding of the user within the exhibit.
  • UO2: Design for conceptual, attentional, and perceptual elements of wayfinding.
  • UO3: Analyze the collection to identify its major elements.
  • UO4: Design wayfinding pathways (formal and informal) to highlight the collection.
Broadening
  • UO1: Gauge the various content-related meanings the visitor may take away from the exhibit.
  • UO2: Identify the meanings to be supported through the exhibit design.
  • UO3: Design visitor experiences that help visitors find themselves in relationship to the interpretive content of the exhibit.
Design for Exhibition Ecologies
  • UO1: Describe how the four processes (framing, resonating, channeling, and broadening) form a mutually influencing system.
  • UO2: Appreciate how design choices catalyze the visitor-in-exhibition experience.

Evidence to be collected

Marcy identifies the following kinds of qualitative and quantitative evidence to be collected. This data will help her make evidence-based changes to improve the effectiveness of her course design.

Qualitative evidence

Review of Museum Design projects 

In addition to the grade students receive, projects are analyzed for quality against the course outcomes and project rubrics. Improvements in these qualities provide qualitative evidence of whether the course design has had the desired impact.
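
If rubric scores are also recorded in a spreadsheet, a quick comparison across offerings can show whether project quality is shifting after the redesign. The sketch below is illustrative only and is not part of Marcy's plan; the file name, column names, and rubric scale are hypothetical.

```python
# Minimal sketch: compare project rubric scores across semesters.
# Assumes a CSV with one row per student per rubric criterion, e.g.:
#   semester,criterion,score
#   Fall 2023,Narrative design,3
# The file name, column names, and scale are hypothetical.
import pandas as pd

scores = pd.read_csv("project_rubric_scores.csv")

# Mean score per criterion for each semester, shown side by side.
summary = scores.pivot_table(
    index="criterion", columns="semester", values="score", aggfunc="mean"
).round(2)
print(summary)
```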

Instructor Reflection

After each class session, the quality of student engagement and contributions is recorded. After a full semester, it can be hard to recreate these impressions meaningfully, so this quick habit flags unmet expectations while they are still fresh. Marcy will focus on the following items in her reflection (a brief logging sketch follows the list).

  • PRE-CLASS SELF-ASSESSMENTS – Level of participation and quality of reflection.
  • IN-CLASS GROUP DISCUSSIONS – Level of preparation to contribute to the discussion, level of contribution, and kinds of gaps in learning displayed through student comments.
  • WORKLOAD – Amount of time to facilitate activity and provide feedback.
  • FEEDBACK – How long did it take to provide feedback to students?
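
A minimal sketch of how these reflections could be logged is shown below; it is purely illustrative, and the file name, field names, and rating values are hypothetical. Any spreadsheet or note-taking tool would serve the same purpose.

```python
# Minimal sketch: append one reflection entry to a CSV log after each session.
# The file name, field names, and rating values are hypothetical.
import csv
from datetime import date
from pathlib import Path

LOG = Path("teaching_reflections.csv")
FIELDS = ["date", "session", "pre_class_participation",
          "discussion_quality", "workload_minutes", "notes"]

def log_reflection(session, pre_class_participation, discussion_quality,
                   workload_minutes, notes=""):
    """Record impressions while they are still fresh."""
    write_header = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "session": session,
            "pre_class_participation": pre_class_participation,
            "discussion_quality": discussion_quality,
            "workload_minutes": workload_minutes,
            "notes": notes,
        })

log_reflection("Framing", pre_class_participation="high",
               discussion_quality="uneven", workload_minutes=45,
               notes="Several students skipped the pre-class self-assessment.")
```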

Stakeholder meeting

Marcy plans to meet with the instructors who teach the classes students take later in the course sequence. She will ask whether they notice changes in students' preparation.

Quantitative Evidence

Mid-semester surveys 

After the midterm, students will complete a survey addressing specific questions about the course (a brief analysis sketch follows the question list).

  • How well were unit objectives facilitated?
  • How well did the pre-class activities prepare you for in-class activities?
  • How useful was the feedback you received so far?
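
A minimal sketch of how Likert-scale responses from such a survey might be summarized is shown below. It assumes responses are exported as a CSV file; the file name, column names, and rating scale are hypothetical and should be adjusted to match the actual export from whichever survey tool is used.

```python
# Minimal sketch: summarize Likert-scale responses from a mid-semester survey
# exported as CSV. Column names and the 1-5 scale are hypothetical.
import pandas as pd

responses = pd.read_csv("midsemester_survey.csv")
likert_items = [
    "unit_objectives_facilitated",
    "preclass_prepared_for_inclass",
    "feedback_useful",
]

# Mean rating and number of responses for each item.
print(responses[likert_items].agg(["mean", "count"]).round(2))

# Full distribution of ratings for a single item.
print(responses["preclass_prepared_for_inclass"].value_counts().sort_index())
```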

Formal end-of-semester survey

The Museum Studies program has redesigned its end-of-semester surveys to address specific course and departmental questions.

  • How well did the instructor facilitate learning?
  • Were assignments presented and graded clearly and fairly?
  • Was the amount of feedback provided by the instructor sufficient?

Supplemental surveys

In addition to the formal end-of-semester survey, Marcy develops a supplemental survey to collect student feedback on course-specific questions.

  • How well did the pre-class activities prepare you for in-class activities?
  • What problems (if any) did you experience with the semester project?
  • What suggestions do you have for future improvements in the course?

What makes a good evaluation plan

DoIT Academic Technology's Evaluation Design & Analysis service has developed some principles of evaluation that can be helpful as you plan to evaluate your course. Good evaluation plans:

  • Are not an afterthought;
  • Have a defined and meaningful purpose;
  • Include asking the right people for feedback;
  • Are iterative and support your measurements of success;
  • Include multiple types and sources of feedback;
  • Are flexible and enable responsiveness;
  • Take advantage of existing evaluation tools; and,
  • Include a process for acting on feedback.

Collecting evidence of success

In addition to the end-of-semester data collected by your department, you can use the following tools to collect qualitative and quantitative data to measure the success of your course.

Qualtrics — http://survey.wisc.edu

The UW-Madison Qualtrics Survey Hosting Service allows users to easily create surveys, collect and store data, and produce reports. Users can access the tools using their UW-Madison NetID and password.

Support Site – https://www.qualtrics.com/support/

Info Site – https://it.wisc.edu/services/surveys-qualtrics/

Microsoft Forms — http://forms.office.com

Microsoft Forms and other tools like Outlook, Calendar, Word, PowerPoint, and Excel are part of the Microsoft 365 suite of tools on campus. Users can access the tools using their UW-Madison NetID and password.

Info Site – Microsoft 365 - Forms

Google Forms — http://forms.google.com

Google Forms is part of Google's online apps suite of tools. Users can access the tools using their UW-Madison NetID and password.

Info Site – https://it.wisc.edu/services/google-apps/



Keywords: evaluation, course, effectiveness, data collection, data informed design
Doc ID: 107400
Owner: Timmo D.
Group: Instructional Resources
Created: 2020-11-25 13:26:58
Updated: 2023-12-27 12:04:14
Sites: Center for Teaching, Learning & Mentoring