
Voices of the Global Community


Kiana Y. Shiroma, Pre-Health/Pre-Law Advising Center, University of Hawaiʻi at Mānoa
Lauren Nakamine, Pre-Health/Pre-Law Advising Center, University of Hawaiʻi at Mānoa

Incorporating survey response data has become increasingly important for decision-making in higher education institutions. Colleges and universities often use results from instruments such as the National Survey of Student Engagement (NSSE) to assess student outcomes, advising effectiveness, and program success, and much of the research published in higher education journals is based on findings from survey data (NSSE, 2023). However, many offices and researchers have experienced declining survey response rates, particularly since the COVID-19 pandemic began (Krieger et al., 2023). The authors’ office, the Pre-Health/Pre-Law Advising Center (PAC) at the University of Hawaiʻi at Mānoa, faced the same issue and tried various methods to increase its number of evaluation form responses. After providing a brief overview of PAC, this article discusses what did not work, what worked, and future plans.

The Pre-Health/Pre-Law Advising Center is a small office that provides advising and resources for students and alumni applying to health professional and law schools; it serves an estimated 5,800 advisees. The center consists of one full-time faculty director, one full-time staff advisor hired six months ago, and six part-time peer advisors. Its services include individual advising appointments and group workshops. When the pandemic began, PAC's evaluation survey response rates dropped for both service types. To address this issue, the office began experimenting with free, straightforward ways of requesting and collecting student survey responses. The remainder of this article describes the methods used for one-on-one appointments and workshops and their effects on response rates.

Individual Advising Appointments

The primary service PAC provides is individual advising appointments. Advisees can meet with PAC in person, online, or over the phone. The evaluation form is available as a Google Form. Below is a description of what did not work and what did.

What Did Not Work

PAC tried the following two strategies to increase response rates, both unsuccessful:

  • Paste the evaluation link in the chat at the end of online appointments.
    • Before (November 2019 to October 2020)
      • 221 responses for 549 online appointments
      • Response rate = 40%
    • After (November 2020 to October 2021)
      • 209 responses for 1,047 online appointments
      • Response rate = 20%
    • Result: The response rate dropped sharply, possibly because of survey fatigue during the pandemic
  • Provide advisees with a laptop open to the online form and ask them to complete the evaluation before leaving the office for in-person appointments.
    • Before (May 2019 to February 2020)
      • 325 responses for 629 appointments
      • Response rate = 52%
    • After (July 2023 to April 2024)
      • 58 responses for 155 appointments
      • Response rate = 37%
    • Due to the pandemic, PAC did not offer in-person appointments from March 2020 to June 2023
    • Result: The response rate dropped sharply, possibly because of post-pandemic survey fatigue

What Worked

However, PAC was able to increase the number of responses received in two other ways:

  • Move the evaluation link in post-appointment emails from the bottom of the message to the top.
    • Before (October 2020 to September 2021)
      • 173 responses for 1,072 appointments
      • Response rate = 16%
    • After (October 2021 to September 2022)
      • 188 responses for 831 appointments
      • Response rate = 23%
    • Result: The response rate rose by seven percentage points
  • Send an email at the end of each month directly to advisees who had an appointment that month, requesting that they complete the evaluation form and explaining that the results will be used to ensure that PAC provides quality services.
    • Before (February 2022 to January 2023)
      • 150 responses for 936 appointments
      • Response rate = 16%
    • After (February 2023 to January 2024)
      • 285 responses for 980 appointments
      • Response rate = 29%
    • Result: The response rate nearly doubled
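The percentage-point arithmetic behind these figures can be checked with a short script. This is only an illustrative sketch using the numbers reported above; the helper function is the authors' paraphrase, not part of PAC's actual workflow:

```python
def response_rate(responses, appointments):
    """Return the response rate as a whole-number percentage."""
    return round(100 * responses / appointments)

# Strategy 1: move the evaluation link to the top of post-appointment emails.
before = response_rate(173, 1072)   # Oct 2020 - Sep 2021
after = response_rate(188, 831)     # Oct 2021 - Sep 2022
print(before, after, after - before)  # 16 23 7 -> a seven-point gain

# Strategy 2: monthly reminder email to that month's advisees.
before = response_rate(150, 936)    # Feb 2022 - Jan 2023
after = response_rate(285, 980)     # Feb 2023 - Jan 2024
print(before, after)                # 16 29 -> the rate nearly doubles
```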

Workshops

Another service PAC provides is workshops. The evaluation form for workshop attendees was also offered as a Google Form. Below is a description of the changes and their positive impacts on the survey response rate.

In-Person

PAC tried three variations of requesting survey responses at its in-person events:

  • Provide attendees with a handout with a QR code for the evaluation form.
    • Spring 2023: 17 responses from 24 attendees
    • Response rate: 71%
  • Provide attendees with a handout with a QR code for the evaluation form and time during the workshop to complete the form before dismissal.
    • Fall 2023: 25 responses from 26 attendees
    • Response rate: 96%
  • Provide attendees with a handout with a QR code for the evaluation form, time during the workshop to complete it before dismissal, and an explanation of how PAC will incorporate their feedback.
    • Spring 2024: 47 responses from 49 attendees
    • Response rate: 96%
  • Result: Together, these changes raised the response rate by roughly 25 percentage points.

Online

PAC also tried two slightly different ways of requesting event evaluations for its online workshops:

  • Send an email as soon as the event ends.
    • Fall 2023: 7 responses from 14 attendees
    • Response rate: 50%
  • Send an email as soon as the event ends with an explanation for why feedback is needed.
    • Spring 2024: 38 responses from 55 attendees
    • Response rate: 67%
  • Result: Adding the explanation raised the response rate by 17 percentage points.

Other Evaluation Form Updates

In addition to the methods shared above, PAC made several minor updates to encourage further survey responses.

Paper Evaluation Forms

PAC also uses two strategies with its paper forms:

  • Place evaluation forms and pens on the desks before attendees arrive.
  • During form collection, have an office representative gather completed forms before students exit instead of having a collection basket.

Online Evaluation Forms

PAC also tried other tactics to increase online evaluation form responses:

  • Keep questions short and specific, requiring only a few straightforward answers.
    • Most questions use a Likert scale.
    • Very few questions are open-ended.
  • Require responses to important questions to ensure at least a minimum amount of information is collected.
  • Include an optional open-ended question at the end of the survey where attendees can ask additional questions and offer ideas and feedback. These questions drew more responses than expected.
  • Use sections as needed.
    • Sections allow for separation between questions, thus making questions easily seen and answered.
    • Breaking the form into smaller sections may encourage students to complete the survey, as they may feel less overwhelmed by answering all of the questions.
  • Ensure that the form allows multiple submissions from the same individual so that advisees can respond for different events or appointments.
  • Make the form aesthetically appealing.
    • Google Forms allows visual customization of fonts, colors, banners, images, and more.
    • Changing the form aesthetics could make the survey more eye-catching and engaging, thus encouraging people to respond.

In retrospect, PAC's most impactful change was explaining why students' survey responses matter: response rates rose by 13 percentage points for advising evaluations and 17 for workshops. This finding is consistent with two large reviews covering nearly 9,000 studies that used online surveys, both of which found that contacting potential participants before sending a survey garnered more responses (Sammut et al., 2021; Wu et al., 2022). Explaining the value of feedback, combined with offering both paper and online response options at workshops, is likely why those response rates rose by roughly 25 percentage points; Wu et al. also found that giving respondents multiple ways to complete surveys further increased responses.

PAC will continue experimenting with ways to garner more evaluation responses in the upcoming academic year by providing multiple ways and opportunities for advisees to respond during workshops. The office plans to provide both a QR code for the evaluation survey and paper copies at in-person events; for online workshops, the QR code will appear on a slide along with a link in the chat. In both formats, attendees will be asked to complete the evaluation earlier in the session so they have more time to finish it.

Receiving more responses to its evaluation forms has been instrumental in demonstrating the quality of PAC's services and their effectiveness in addressing student learning outcomes, strengthening its justifications for faculty and staff funding, faculty applications for promotion and tenure, and end-of-year reports. This information also helps PAC make decisions about its services and events. Moreover, providing qualitative and quantitative student feedback was a major factor in PAC's success in securing $11,000 in funding for basic certification courses that help its pre-health students qualify for the health-related jobs and volunteer opportunities they need to be competitive for health professional schools. The university office that provides the funding weighs student feedback heavily when deciding whether to award future funding.

These lessons have also been applied to the director’s research surveys, with similar results. The authors hope readers can use these free and easy ways of increasing student survey responses to gather more data for better-informed decisions and to further demonstrate their advising offices’ significant roles in student success. The authors welcome readers to reach out and share other ways they have garnered more student survey responses.

References

Krieger, N., LeBlanc, M., Waterman, P. D., Reisner, S. L., Testa, C., & Chen, J. T. (2023). Decreasing survey response rates in the time of COVID-19: Implications for analyses of population health and health inequities. American Journal of Public Health, 113(6), 667–670. https://doi.org/10.2105/AJPH.2023.307267

National Survey of Student Engagement. (2023, May 17). NSSE overview. https://nsse.indiana.edu/nsse/reports-data/nsse-overview.html

Sammut, R., Griscti, O., & Norman, I. J. (2021). Strategies to improve response rates to web surveys: A literature review. International Journal of Nursing Studies, 123. https://doi.org/10.1016/j.ijnurstu.2021.104058

Wu, M. J., Zhao, K., & Fils-Aime, F. (2022). Response rates of online surveys in published research: A meta-analysis. Computers in Human Behavior Reports, 7. https://doi.org/10.1016/j.chbr.2022.100206

Academic Advising Today, a NACADA member benefit, is published four times annually by NACADA: The Global Community for Academic Advising. NACADA holds exclusive copyright for all Academic Advising Today articles and features. For complete copyright and fair use information, including terms for reproducing material and permissions requests, see Publication Guidelines.