Appendix A — Sample Survey Instrument
This appendix presents a sample version of the Student Perceptions of Learning Experience (SPLE), including the recommended preamble and one or two items per aspect of class climate. The instrument uses a five-point ordered categorical scale (Strongly Agree, Agree, Neither Agree nor Disagree, Disagree, Strongly Disagree) plus a Not Applicable option for each item. The items presented here are illustrative: they demonstrate how the six aspects of class climate can be operationalized as experiential survey items, and they are not intended as the final instrument.
A.1 Preamble
The survey should open with the following informational preamble, consistent with the evidence on anti-bias framing discussed in Section 5.5 (Boring and Philippe, 2021).
Student Perceptions of Learning Experience
This brief survey asks about your experience in this course: the learning environment, your interactions with the instructor, and your perception of how the course was structured. It does not ask you to evaluate the instructor’s teaching ability or the course content.
Research shows that students’ responses to surveys like this can be influenced by characteristics of the instructor — such as gender, race, and accent — that are unrelated to the learning environment. Being aware of this tendency helps you provide more accurate feedback.
Your responses are anonymous and will not be shared with the instructor until after final grades have been submitted. Please respond thoughtfully and honestly.
A.2 Sample Items
All items use the following response scale:
Strongly Agree · Agree · Neither Agree nor Disagree · Disagree · Strongly Disagree · Not Applicable
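For illustration only, and not as part of the instrument itself, the following is a minimal sketch, assuming analysis in Python with pandas (the variable and function names here are hypothetical), of how responses on this scale could be encoded so that Not Applicable is treated as missing rather than as a sixth scale point, preserving the ordered structure of the five substantive levels.

```python
import pandas as pd

# The five substantive response levels, in order. "Not Applicable" is
# deliberately excluded so it becomes missing data rather than a sixth
# scale point or an artificial midpoint.
LEVELS = [
    "Strongly Disagree",
    "Disagree",
    "Neither Agree nor Disagree",
    "Agree",
    "Strongly Agree",
]

def encode_item(responses: pd.Series) -> pd.Series:
    """Encode raw responses to one item as an ordered categorical.

    Values outside LEVELS (including "Not Applicable") become NaN,
    so summaries run over substantive responses only.
    """
    return pd.Series(pd.Categorical(responses, categories=LEVELS, ordered=True))

# Hypothetical responses to a single SPLE item.
raw = pd.Series(["Agree", "Strongly Agree", "Not Applicable", "Disagree"])
encoded = encode_item(raw)
print(encoded.value_counts())      # distribution over the five scale points
print(int(encoded.isna().sum()))   # Not Applicable / missing count
```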
A.2.1 Regard for Students
“I felt the instructor engaged with students as individuals.”
A.2.2 Consistent Communication and Enforcement of Expectations
“I knew what was expected of me in this course.”
“I felt the instructor applied the same expectations and standards to all students.”
A.2.3 Access to Instructor and Instructor Resources
“I was able to get help from my instructor when I needed it (in office hours, after class, or by email).”
“I was able to access the course materials and resources I needed for this class.”
A.2.4 Perceived Course Coherence
“I could see how what was assessed related to what was covered in the course.”
“I could see how the different parts of this course fit together.”
A.2.5 Participatory Climate
“I felt there were ways for me to participate in the course.”
“I felt the instructor created opportunities for me to explore the ideas in the course.”
A.2.6 Responsive Learning Environment
“I felt the instructor created a learning environment that was responsive to all students.”
A.3 Sample Items if the Academic Senate Elects to Retain Open-Ended Questions
If open-ended questions are retained under the guardrails described in Section 3.3, the instrument would include one structured open-ended prompt on Perceived Course Coherence — the aspect where elaboration is most informative and least susceptible to bias. The prompt appears immediately after the Perceived Course Coherence Likert items and directs the student to describe their experience with course structure.
A.3.1 Perceived Course Coherence (with structured open-ended prompt)
“I could see how what was assessed related to what was covered in the course.”
“I could see how the different parts of this course fit together.”
“Please describe your experience with how the different parts of this course fit together — for example, how readings, class activities, assignments, and assessments related to each other. Focus on specific aspects of the course, not on personal characteristics of the instructor.”
All other items (Regard for Students, Consistent Communication and Enforcement of Expectations, Access to Instructor and Instructor Resources, Participatory Climate, Responsive Learning Environment) remain Likert-only.
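Purely for illustration, this structure could be captured in a hypothetical machine-readable sketch (the layout and the key names "items" and "open_ended" are assumptions, not a committee specification), with the open-ended prompt attached only to Perceived Course Coherence:

```python
# Illustrative sketch only. Two aspects are shown; the remaining four follow
# the same shape as "Regard for Students", i.e., Likert-only with no
# "open_ended" key. Item text is quoted from Sections A.2 and A.3.1.
INSTRUMENT = {
    "Perceived Course Coherence": {
        "items": [
            "I could see how what was assessed related to what was covered "
            "in the course.",
            "I could see how the different parts of this course fit together.",
        ],
        # Present only if the Academic Senate retains open-ended questions
        # under the Section 3.3 guardrails; prompt text abbreviated here.
        "open_ended": "Please describe your experience with how the "
                      "different parts of this course fit together ...",
    },
    "Regard for Students": {
        "items": ["I felt the instructor engaged with students as individuals."],
    },
}
```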
A.3.2 Why an open-ended question only on Perceived Course Coherence?
The committee considered attaching an open-ended prompt to each of the six aspects and concluded that Perceived Course Coherence is the only aspect where the benefit of elaboration clearly outweighs the risk of bias. The reasoning, aspect by aspect:
Regard for Students. An open-ended prompt here invites commentary on manner, demeanor, and personality — exactly the content that disproportionately targets women and faculty from marginalized groups (Mitchell and Martin, 2018). Highest risk, lowest benefit.
Consistent Communication and Enforcement of Expectations. An open-ended prompt here invites comments about grading, and such comments track students’ grade expectations rather than the actual consistency of standards. It also invites allegations of favoritism, which can be racialized (Chisadza, Nicholls, and Yitbarek, 2019). High risk.
Access to Instructor and Instructor Resources. An open-ended prompt here invites commentary on communication style, accent, and warmth, all of which are heavily gendered and racialized (Subtirelu, 2015; Miller and Chamberlin, 2000). High risk.
Responsive Learning Environment. An open-ended prompt here could elicit valuable information, but it could also produce comments about the instructor’s identity that are impossible to disentangle from bias. A student who doesn’t feel they belong might attribute it to the instructor’s demographics rather than to specific practices (Heffernan, 2023). Moderate-to-high risk.
Participatory Climate. An open-ended prompt here could produce useful structural feedback (e.g., “group work was dominated by two people,” “questions were welcomed but never answered”). But it readily invites evaluative commentary about the instructor’s teaching style, particularly judgments like “the lectures were boring” or “there was too much group work.” Research shows that students conflate instructor enthusiasm and charisma with teaching effectiveness, even though enthusiasm is not associated with learning (Feeley, 2002; Williams and Ceci, 1997). A comment like “boring” tells you about the student’s affective response (which may reflect the instructor’s gender, accent, or presentation style), not about whether the environment supported participation. This kind of feedback is valuable in the formative process, where the instructor can contextualize it; in the personnel file, it becomes indistinguishable from bias. Moderate risk.
Perceived Course Coherence. This is the safest choice. An open-ended prompt here channels comments toward course structure: readings, assignments, assessments, and the connections between topics. These are the most impersonal, practice-oriented comments a student can make. It is hard (but not impossible) to write something biased about whether the exam matched the lectures. And it is the aspect where elaboration is most useful to evaluators: a Likert response tells you the student didn’t see the connections; a structured comment tells you which connections were missing.
A.4 Relationship to Existing Cal Poly Maritime Academy Practices
Several of these aspects are already tracked in other CSU instruments. The Cal Poly Maritime Academy instrument, for example, includes items on Consistent Communication and Enforcement of Expectations (“The instructor attempted to be fair and unbiased in their interaction with students”), Responsive Learning Environment (“The instructor demonstrated awareness and consideration of the diversity of students in the class”), Access to Instructor and Instructor Resources (“The instructor was responsive when I had questions”), and Participatory Climate (“The instructor provided opportunities for class participation”). The SPLE items are compatible with this existing practice. The principal difference is one of framing: the SPLE items are worded as first-person experiential reports (“I felt…”) rather than third-person assessments of instructor behavior (“The instructor attempted…”), consistent with the evidence that experiential items are less susceptible to bias than evaluative ones.