As you design and deliver your blended course, you will be using new techniques and facilitating learning activities that may be unfamiliar to students. These two factors may temporarily lower your end-of-semester course evaluations. Blended course design should therefore include a plan to collect data on the positive outcomes facilitated by your new design; this data can be used alongside the traditional course evaluation process.
DEVELOPING AN EVALUATION PLAN
The following are some suggestions to consider in identifying focus areas, choosing types of evidence, and developing your evaluation plan.
- Talk with your department chair about your plans to blend your course. Ask what evidence they would want to see around the performance of your course.
- If you are pre-tenure, engage in conversations with your chair and tenure committee about what evidence they would want to see in addition to formal end-of-semester evaluation results.
- Review your course outcomes, unit objectives, and student assessments. Think about the kinds of data you can collect to show progress or improvement in achieving these designated levels.
- Use the Course Syllabus (AEFIS) tool to document student success in completing course outcomes.
- Think about the kinds of qualitative data you can collect to measure the softer outcomes of your course such as student engagement, quality of discourse, preparation for class, and/or class attendance.
EVALUATION PLAN EXAMPLE: MUS STU 352
The following example is from MUS STU 352 — Visitor Experience and Design in Museums, taken from the pre-class case studies. Before Marcy teaches her course, she identifies her own goals as well as those of other stakeholders. Based on conversations with her chair and others, Marcy identifies the following focus areas as measurements of success for her course design and builds a plan to address them:
- Student professionalism in dealing with museum staff;
- Quality of semester projects (using a rubric she develops);
- Student compliance with the learning pathway;
- Quality and level of student contributions in class; and,
- Level of preparation for future Museum Studies courses.
Program Outcomes (PO) (indicates supported program outcomes):
- PO1: Develop a high degree of professionalism and responsibility, skills in problem-solving, critical thinking and evaluation, writing, oral presentation, decision-making, and teamwork.
- PO2: Apply professional museum standards and ethics and proper care and interpretation of collections.
- PO3: Explore the role of museums in our complex and ever-changing society.
- PO4: Develop skills around administration and organizational problems in modern museums, finances and budgets, legal-administrative problems, relations with support groups and volunteers, record keeping, and management of museum projects.
- PO5: Develop skills around the collection, organization, storage, care, and scientific use of museum collections.
- PO6: Develop and design museum visitor experiences.

Course Outcomes (CO):
- CO1: Analyze the elements of the visitors' meaning-making: framing, resonating, channeling, and broadening.
- CO2: Design visitor experiences that build connections between content and viewers in ways that foster specific kinds of encounters.
- CO3: Design exhibit narratives for different levels of museum visitors.
- CO4: Develop an engaging user experience to support the narrative of the exhibit.
- CO5: Value the importance and impact of a well-designed museum experience.
- CO6: Use museum spaces to encourage reflection and dialogue.
- CO7: Design museum exhibits that develop and demonstrate cultural understanding.
- CO8: Develop competence, depth, and expertise in the design of museum visitor experiences.
- CO9: Develop a high degree of professionalism and responsibility, skills in problem-solving, critical thinking and evaluation, writing, oral presentation, decision-making, and teamwork.
- CO10: Develop research and interviewing skills to gain feedback on exhibits and further the field of Museum Studies.

Course Units:
- Envisaging the Discipline of Museum Design
- User Experiences in Museums
- Deconstructing Visitor Experiences
- Design for Exhibition Ecologies
EVIDENCE TO BE COLLECTED
Marcy identifies the following kinds of qualitative and quantitative evidence to be collected. This data will help her make evidence-based changes to improve the effectiveness of her course design.
REVIEW OF MUSEUM DESIGN PROJECTS
In addition to the grade students receive, projects are analyzed for their quality based on the course outcomes and the project rubric. Improvements in these qualities provide qualitative data for determining whether the course design has yielded the desired impact.
POST-CLASS REFLECTIONS
After each class session, Marcy records the quality of student engagement and contributions. After a full semester, it can be hard to recreate these impressions in meaningful ways, so this approach is a quick and easy way to flag situations in which expectations were or were not met. Marcy will look at the following specific items in her reflection.
- PRE-CLASS SELF-ASSESSMENTS – Level of participation and quality of reflection.
- IN-CLASS GROUP DISCUSSIONS – Level of preparation to contribute to discussion, level of contribution, and kinds of gaps in learning displayed through student comments.
- WORKLOAD – Amount of time to facilitate activity and provide feedback.
- FEEDBACK – How long did it take to provide feedback to students?
CONVERSATIONS WITH FUTURE INSTRUCTORS
Marcy plans to meet with the instructors who teach the classes students take later in the course sequence. She will ask them whether they see any changes in students' preparation for their courses.
MIDTERM STUDENT SURVEY
After the midterm, students will complete a survey addressing specific questions about the course.
- How well were unit objectives facilitated?
- How well did pre-class activities prepare you for in-class activities?
- How useful was the feedback you received so far?
FORMAL END-OF-SEMESTER SURVEY
The Museum Studies program has redesigned its end-of-semester surveys using AEFIS to address specific course and departmental questions.
- How well did the instructor facilitate learning?
- How clearly were assignments presented and graded?
- Was the amount of feedback provided by the instructor sufficient?
In addition to the formal end-of-semester survey, Marcy develops a supplemental survey to collect student questions and feedback.
- How well did pre-class activities prepare you for in-class activities?
- What problems (if any) did you experience with the semester project?
- What suggestions do you have for future improvements in the course?
WHAT MAKES A GOOD EVALUATION PLAN
DoIT Academic Technology's Evaluation Design & Analysis service has developed some principles of evaluation that can be helpful as you plan to evaluate your course. Good evaluation plans:
- Are not an afterthought;
- Have a defined and meaningful purpose;
- Include asking the right people for feedback;
- Are iterative and support your measurements of success;
- Include multiple types and sources of feedback;
- Are flexible and enable responsiveness;
- Take advantage of existing evaluation tools; and,
- Include a process for acting on feedback.
COLLECTING EVIDENCE OF SUCCESS
In addition to the end-of-semester data collected by your department, you can use the following tools to collect qualitative and quantitative data to measure the success of your course.
QUALTRICS — http://survey.wisc.edu
The UW-Madison Qualtrics Survey Hosting Service allows users to easily create surveys, collect and store data, and produce reports. Users can access the tools using their UW-Madison NetID and password.
INFO SITE – https://it.wisc.edu/services/surveys-qualtrics/
MICROSOFT FORMS — http://forms.office.com
Microsoft Forms – along with other tools like Outlook, Calendar, Word, PowerPoint, and Excel – is part of the Microsoft Office 365 suite of tools on campus. Users can access the tools using their UW-Madison NetID and password.
GOOGLE FORMS — http://forms.google.com
Google Forms is part of Google's online apps suite of tools. Users can access the tools using their UW-Madison NetID and password.