
Peer Review of Teaching

Taken together with Reflective Teaching and Student Feedback, Peer Review of Teaching forms a key component of NYU’s Teaching Quality commitment: to define, support, measure, and honor teaching excellence at NYU, and to develop mechanisms that enhance teaching and improve student learning outcomes.

Peer review of teaching (PRT) brings the evidence, rigor, and community standards of scholarship to the work of teaching. It supports a multidimensional assessment of teaching by providing data points beyond those from sources such as student evaluations, thus mitigating overreliance on any one source of information. The practice may be formative as well as summative in nature (though not usually both at once): PRT can be used to collect developmental feedback and can be conducted as an exchange or dialogue between peers. Depending on its implementation, it can infuse the all-too-often solitary practice of teaching with a spirit of collaboration and mutual support.

Though sometimes equated with classroom observations, PRT benefits from multiple and longitudinal sources of evidence when possible, including syllabus, course site, and assignment review. One useful model is the Teaching Quality Framework (TQF) initiative at the University of Colorado Boulder, itself part of the multi-institution TEval project. TQF recommends an array of teaching artifacts for assessment as well as multiple lenses through which to assess those artifacts: the lenses of peer, self, and student. 

But no single approach to PRT works for everyone; what a successful approach looks like for your department or school will depend on your discipline, faculty community, and goals. Teaching Support at NYU can walk you through the process of designing and piloting PRT for a small faculty cohort. These pages will explain that process, show how it was implemented at NYU’s School of Nursing, and provide you with resources, models, and research for creating your own. 

Our Services and Tools

The Learning Experience Design Team offers a range of PRT support services to meet your needs, including: resource sharing and consulting; needs and goals assessment; design and development support; and facilitating small scale pilots. 

The peer review process we typically employ involves reciprocal reviews:

  1. Syllabus and course review
  2. Meeting to discuss syllabus and course review and to set goals and expectations for classroom observations
  3. Reciprocal classroom observations
  4. Meeting to exchange feedback and discuss classroom observations
  5. Self-reflection activities to identify takeaways from the process as well as next steps for modifying teaching

First Steps for Designing a PRT Process

To start your PRT pilot planning on the right track, consider the following questions: 

  • What is the purpose of peer review of teaching for your school or department?
    • You might consider whether the purpose is formative (developmental, supportive) or summative (evaluative), as well as more specific goals, such as supporting new faculty, informing personnel decisions, surfacing effective teaching practices, or standardizing elements of a curriculum, among other possibilities.
  • What outcomes do you want to achieve by creating a peer review of teaching pilot for your school or department? Examples include:
    • Creating a faculty community around peer review of teaching
    • Developing or selecting instruments and a process for peer review of teaching that are appropriate for the department in a larger-scale implementation
    • Training faculty in feedback and observation practices
    • Preparing faculty to train their colleagues in PRT
  • Have you identified faculty who are interested in participating? We recommend identifying 3-4 faculty leads and creating a small team of faculty champions to test-drive the PRT process.
  • How are faculty learning communities and faculty engagement with pedagogy resources currently promoted in your school or department? How can these practices be leveraged for this pilot?
  • Based on your goals for PRT, which faculty members would be reviewed and which faculty members would conduct the reviews? 
  • How will you create buy-in for peer review of teaching and incentivize the practice, both among leadership and faculty?

These questions may not be fully answered until a pilot team has been assembled, and answers to some of them may change over the course of the pilot.

Sample Pilot Timeline

Some departments and schools benefit from a single consultation regarding PRT as they develop their own process. Other times, these conversations may lead to further consultations or to one or more faculty workshops. An even more extensive engagement could include a PRT pilot over the course of a summer or semester. Such a pilot might involve: 

Stage            | Topic/Activity                                                                                                                      | Time
Goal-setting     | Share framework with school/program leadership and define goals                                                                     | 1 hour
Planning         | Flesh out roll-out strategy and plan; identify faculty champions for pilot                                                          | 1.5 hours
Faculty Workshop | Overview of peer review process: syllabus and course review, pre-observation, observation, and post-observation process and tools    | 2 hours
Implementation   | LED supports faculty teams in implementation: syllabus and course review (60 minutes); pre-observation meeting (45 minutes); observation (90 minutes); post-observation debrief (45 minutes) | 4 hours
Debrief          | Faculty debrief on engagement; plan course redesign                                                                                 | 1-2 hours

Keep in mind that our team will always tailor the format, scope, and time commitment according to the needs and availability of faculty participants.

Review Templates and Rubrics

Observation Process Guides

PRT Pilot with NYU Rory Meyers College of Nursing

From June through September 2023, the Learning Experience Design team, operating under NYU’s Office of the Provost, facilitated a Peer Review of Teaching (PRT) Pilot with 7 faculty champions from the Rory Meyers College of Nursing. (For more details on this initiative, read the full report.)

During two initial planning sessions, lead faculty and administrators identified faculty champions and set pilot goals:

  • Identify and compile evidence-based methods for developing an inclusive syllabus and course review processes for different course formats 
  • Design or adapt tools and instruments for peer teaching observations
  • Review, reflect on, and apply various evidence-based teaching strategies
  • Form a learning community that values peer-learning and review of teaching

Over the course of four workshops, the faculty champions were briefed on the different stages of the review process and debriefed on the stages that they had completed. Members of the LED team facilitated the planning meetings and workshops and consulted with faculty throughout the review process.

Benefits

When faculty considered the overall benefits that they experienced in reviewing and being reviewed by their peers, two major themes emerged regarding morale and awareness around teaching. The process was supportive and energizing; faculty fed off the excitement of their colleagues. This atmosphere fostered community and motivated faculty to try new teaching techniques to better support their students. It also inspired curiosity about the teaching methods of peers as well as reflection on their own practices. One faculty participant noted that the process nudged them “to look into my blindspots.”

Participants reflected on the benefits they experienced during the pilot. Their comments included:

I loved it. I had a focus, the observation yielded good information I can act on, including a variety of in-class approaches, and it also supported the useful techniques and approaches I currently use.

The process made me realize my teaching style and how I can make it more interesting for my students.

The feedback that resonated with me most was when observers shared that they saw what I was doing or trying to do, and that it was working and could even be brought to the next level—When observers saw my vision and gave me confidence by exploring ways that my vision could be taken to the next level.

This pilot has helped me a lot in reflecting on my own practice. . . I plan to implement more active learning strategies and formative assessments, and to be more forgiving when teaching does not go well. I feel that this project keeps me from letting my teaching go stale.

It provided me with new perspectives about how to achieve my goals in the classroom and to achieve clarity in the syllabus.

It was exciting to focus on an area of improvement to explore and get feedback on. Other benefits included connecting and sharing enthusiasm for teaching and learning with colleagues; being curious about others’ experience and perspectives; reflecting on the process and recognizing salient aspects of feedback. Improvement in teaching was brought to students. A new strategy that I tried in the classroom because of the process resulted in greater participation from previously hesitant students.


Resources, Models, and Research

Peer Review of Teaching at Other Institutions

Select Journal Articles and Handbooks

Tested Observation Protocols

A wide variety of teaching observation protocols are used within higher education, either as a standalone practice or as part of a multidimensional peer review of teaching. Some protocols are based on common practice, while others have been rigorously tested for validity and reliability. Below is a sampling of the latter.

Approaches to Teaching Inventory (ATI)

Classroom Observation Protocol for Undergraduate STEM (COPUS)

  • A tool for collecting information about the range and types of teaching practices used at the school and department levels.
  • “The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices” (2013): https://www.lifescied.org/doi/10.1187/cbe.13-08-0154

Stanford Faculty Development Program-26 Questionnaire (SFDP-26)

  • A longstanding questionnaire for assessing teaching in medical education; it has served as the basis for numerous more recent adaptations at other institutions.
  • “Measuring University Teachers’ Teaching Quality: A Rasch Modelling Approach.” (2021): https://link.springer.com/article/10.1007/s10984-020-09319-w 

Teaching Dimensions Observation Protocol (TDOP)

  • A customizable tool for observing the interactions among teachers, students, and technology in the classroom.

Active Learning Assessment Tools

Observation Protocol for Active Learning (OPAL)

  • A tool applicable across disciplines for observing the degree of actual active learning used in a class.

Practical Observation Rubric to Assess Active Learning (PORTAAL)

  • An observational tool for supporting the implementation of active learning in the classroom.
  • “PORTAAL: A Classroom Observation Tool Assessing Evidence-Based Teaching Practices for Active Learning in Large Science, Technology, Engineering, and Mathematics Classes” (2014): https://www.lifescied.org/doi/epdf/10.1187/cbe.14-06-0095

Technology-Based Observation Tools

Decibel Analysis for Research in Teaching (DART)

Real-time Instructor Observing Tool (RIOT)

Departmental and School Assessment

Partnership for Undergraduate Life Sciences Education Vision & Change Rubrics (PULSE)

  • A set of rubrics initially developed for assessing departmental implementation of the recommendations identified in the 2011 National Academy of Sciences report Vision and Change in Undergraduate Biology Education: A Call to Action (American Association for the Advancement of Science [AAAS], 2011).
  • “PULSE Vision and Change Rubrics” (2013): https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3846506/pdf/579.pdf

TEval: Transforming Higher Education – Multidimensional Evaluation of Teaching

  • A multi-institutional project to support and research multidimensional assessment of teaching aligned with evidence-based teaching practices.