
May 20, 2020 - Subject and Student Surveys for Spring 2020

Dear Colleagues,
I am writing on behalf of the Academic Policy and Regulations Team (APART) to inform you of the plan for obtaining student feedback on remote teaching and learning this semester. This plan was developed by APART in collaboration with members of the Teaching and Learning Laboratory (TLL), the Registrar’s Office, and MIT Institutional Research. This email is also being sent to all students.
As explained in detail below (see “The Decision to Change the Subject Evaluation System”), in April it became clear that the normal subject evaluation form-based system would not be adequate to provide the feedback we need concerning the success and limitations of remote teaching this semester. The two-survey approach outlined below was therefore developed.
Key Features of the Survey Plan
Remote Experience Survey
(1) A Qualtrics-based Institute-wide Remote Experience Survey was developed by MIT Institutional Research in consultation with APART and the TLL. The survey was distributed to students on Wednesday, May 13 (the day after the last day of classes), and the survey closes on May 31.  
(2) The purpose of this voluntary survey is to assess MIT students’ remote learning experience this semester and to help MIT better understand how the stresses and difficulties they encountered can be mitigated should remote learning be necessary in the future.
(3) The survey will not provide information on a subject-by-subject basis but rather aims to evaluate students’ overall experience with remote learning this spring and to understand the challenges they faced. 
(4) The aggregated survey results will be made available to the entire MIT community. Individual data will be treated as confidential.
Subject-Specific Survey
(1) To enable students to provide specific and useful feedback on how subjects might be improved in remote offerings in the future, a Subject-Specific Survey has also been developed. The Subject-Specific Survey opens at 9 AM on Thursday, May 21 (the day after the last day of final exams), and will close on May 31.
(2) The Subject-Specific Survey will be conducted using the same online subject evaluation system employed in prior semesters. Students will receive a kick-off announcement from the Registrar’s Office on the morning of May 21, including a link to the survey and instructions.
(3) The Subject-Specific Survey does not include any questions calling for numerical ratings. The survey includes Likert-type questions (e.g., strongly disagree, disagree, neither disagree nor agree, agree, strongly agree) but mainly consists of open-ended questions such as, "Please comment on the effectiveness of live (synchronous) lectures and/or class periods in engaging your interest and/or helping you learn. Please also describe any challenges or impediments you faced with this component of the subject."
(4) The survey does not include questions asking for numerical ratings of instructional staff, although students may use the open-ended questions to recognize the contributions of members of the instructional staff (see discussion below).
(5) The sharing of the subject-specific survey results will be strictly limited to the respective subject instructors and department leadership, and to a tightly restricted number of key individuals involved in supporting remote learning at the Institute.
(6) The open-ended questions provide an opportunity for students to recognize the performance of particular individuals, such as teaching assistants.
(7) Because of practical time constraints, it was not possible to allow departments and instructors to include customized, additional questions in the survey.
(8) Note that an additional survey is being developed for instructors to provide feedback on their experience with remote teaching. It will open in approximately one week.
With best wishes to all,
Rick Danheiser
A. C. Cope Professor and
Chair of the MIT Faculty
Massachusetts Institute of Technology
Department of Chemistry
Room 18-298
Cambridge, MA 02139
Tel 617 253 1842
Cell 617 480 3948
The Decision to Change the Subject Evaluation System
The plan for subject and student surveys was developed by the Academic Policy and Regulations Team (APART) in collaboration with members of the Teaching and Learning Laboratory (Janet Rankin and Dipa Shah), the Registrar’s Office (Mary Callahan, Piero Chacon, and Kathleen MacArthur), and MIT Institutional Research (Jon Daries and his team). APART was created in mid-April as a successor to the Emergency Academic Regulations (EARs) Team that developed the emergency regulations, including the alternate grading scheme in effect for the spring. I serve as chair of APART, whose membership includes students and the current and incoming chairs of key Faculty Governance committees concerned with the Institute’s educational mission.
Several principles and considerations guided APART’s deliberations concerning the subject surveys. A top priority was to design a system that would gather useful and actionable feedback on remote teaching this semester. We felt it was essential that instructors learn what worked well this spring and what needs improvement. Another concern was to ensure that students have an opportunity to provide input on their experience with remote classes. Student feedback via subject evaluations has always been extremely valuable in helping instructors improve their classes. Finally, APART was also sensitive to the fact that many instructors faced unusual challenges in launching remote versions of their subjects on very short notice this spring.
Why the subject-specific survey results will not be made public
While the full results of the Remote Experience Survey will be available to the community, the information gathered from the subject-specific surveys will be restricted (mainly) to instructors and departments. This is consistent with the primary purpose of the subject-specific survey—to provide the foundation for developing improved remote classes in the future. While APART appreciates that students often use subject evaluations for guidance in choosing classes, we recognized that instructors faced significant handicaps this semester in developing remote versions of classes “on the fly.” Consequently, we view this spring’s evaluations as having little relevance to future versions of the same classes.
Why numerical ratings of instructors are not included in the surveys
APART concluded that the usual numerical ratings would not be meaningful in view of the special challenges instructors faced. We felt that some of the same considerations that led to mandating PE/NE grading for students also apply here, where some instructors faced serious disadvantages in launching and teaching the remote versions of their subjects. Instructors were forced to develop remote classes with little prior notice, and many had negligible prior experience with remote teaching. We were also sensitive to the fact that no small number of instructors had to juggle teaching from home with caring for young children, providing home schooling, and coping with all the difficulties and stress engendered by the Covid-19 emergency.
How outstanding instructors and TAs can be recognized
Open-ended questions in the subject-specific surveys provide an opportunity for students to highlight exceptional contributions of instructors and teaching assistants. The surveys will be read not only by instructors but also by department leadership. Note also that MIT Open Learning sponsors the student-nominated Teaching with Digital Technology Awards, which this year received a record number of nominations.
Why subject-specific surveys are opening after finals
Normally the subject evaluations close prior to final exams and the assignment of grades to students. APART felt that delaying the closing date to after exams would allow students to provide feedback on remote final exams and other end-of-semester assessment methods. With PE/NE grading in effect, we did not feel that students’ feedback would be influenced by their assigned grades.