
Voices of the Global Community


Vantage Point

Meg Wright Sidle and Megan Childress, University of Pikeville

Professional academic advisors wear many hats, from creating a fall schedule during summer orientation to personally walking a student to academic assistance after a referral by a faculty member. Advisors want to have a positive impact on the student behaviors that affect student success. They know it is critical to identify the students who are most at risk of not returning to college as soon as possible, ideally within the first two weeks of the first term.

When academic advisors collaborate with institutional research professionals on their campuses for such an endeavor, it is important to move beyond the data readily available to institutional researchers, such as composite ACT score, high school grade point average, first-generation status, and poverty status, because these data points often do not give academic advisors a sufficiently narrow list of students on which to focus their mentoring efforts.

The limitation of these measures is that they do not reveal how confident students are in their transition to college or how students approach the learning process. Many campus administrators already utilize various tools to understand and predict student persistence, for example, the Students' Attitudes Survey (Erdmann, 1990), the Myers-Briggs Type Indicator (Kalsbeek, 1987), the Student Adaptation to College Questionnaire (Krotseng, 1992), and self-developed instruments (Chen, Lacefield, & Lindstrom, 2018).

The University of Pikeville (KY) decided to use the College Student Inventory™ (CSI) by Ruffalo Noel Levitz, LLC, to identify at-risk students because the instrument had already proven helpful for one-on-one mentoring of students at the university since 2011 and had wide institutional acceptance as a source of reliable student information. This article describes the steps that the directors of the Office of Institutional Research and Effectiveness and the Center for Student Success took to identify which CSI-Form B non-cognitive indicators would successfully target at-risk, first-time first-year students, so that the institution’s student success professionals could intentionally intervene with those students early enough in the fall term to increase student retention and success.

Identifying Indicators with Largest Effect Sizes on Student Success

The data on academic success and retention for first-time, first-year students at a small, non-selective, private, four-year university in Central Appalachia for the 2011, 2013, 2014, and 2015 fall semesters were matched with the respective students’ CSI-Form B results (n=1,254), accounting for 82 percent of all students in those four cohorts (N=1,534). The sample was found to be generalizable to the full cohort population.

The four independent variables were nominal measures: academic standing at the end of the first term [0=probation/suspension, 1=good standing], student enrollment for the second term [0=did not return, 1=did return], academic standing at the end of the first year [0=probation/suspension, 1=good standing], and student enrollment for the following fall term [0=did not return, 1=did return]. The dependent variables were ratio measures: the 16 CSI-Form B motivational subscale percentile scores, the four CSI-Form B computed percentile components, and the 30 CSI-Form B self-reported student information items. Statistical analysis included a series of independent-samples t-tests to test for significant differences, followed by Cohen’s d to measure effect sizes.
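For institutional research colleagues who want to replicate this screening step, the sketch below shows one way the t-tests and effect sizes could be computed in Python. It is only an illustration under assumed conditions: the column names (e.g., good_standing_term1, dropout_proneness_pct) are hypothetical placeholders, not the actual CSI-Form B export fields.

```python
# Minimal sketch of the t-test / effect-size screening described above.
# Column names are hypothetical; an institution's CSI-Form B export will differ.
import pandas as pd
from scipy import stats


def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = (((n_a - 1) * group_a.var(ddof=1) +
                   (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
    return (group_a.mean() - group_b.mean()) / pooled_var ** 0.5


def screen_indicator(df, outcome_col, indicator_col):
    """Independent-samples t-test plus Cohen's d for one CSI indicator."""
    success = df.loc[df[outcome_col] == 1, indicator_col].dropna()
    at_risk = df.loc[df[outcome_col] == 0, indicator_col].dropna()
    t, p = stats.ttest_ind(success, at_risk)
    return {"t": t, "p": p, "d": cohens_d(success, at_risk)}


# Example: screen several candidate indicators against one outcome variable.
# results = {col: screen_indicator(df, "good_standing_term1", col)
#            for col in ["study_habits_pct", "dropout_proneness_pct",
#                        "predicted_academic_difficulty_pct"]}
```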

Ultimately, six indicators from the CSI-Form B were determined to be good candidates for identifying at-risk students at the University of Pikeville: study habits (percentile), family emotional support (percentile), dropout proneness (percentile), educational stress (percentile), predicted academic difficulty (percentile), and high school grades (percentile). While dropout proneness (percentile) was significantly related to all four independent variables, it did not have the largest effect sizes among the candidate indicators. High school grades (percentile) had the largest effect sizes, but it was dismissed from further consideration because it duplicated the students’ actual high school grades already collected during the admissions process.

Using Confidence Intervals to Determine Cut-off Scores

Once the directors decided that predicted academic difficulty (percentile) was the best predictor for identifying at-risk students at this institution, the next step was to determine a cut-off score to find the students entering the university in fall 2016 who were most at risk. For this indicator, the higher the score, the less likely the student is to succeed. The upper limit of the 95% confidence interval for the sample population on this variable was 51. However, a score of 62 was determined to be the better minimum cut-off because it was the average score of students placed on academic probation/suspension at the end of the first year and, hence, would identify only those students most likely to need extra attention from the academic advisors.
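The arithmetic behind that cut-off decision can be illustrated with a short sketch: compute the upper limit of the 95% confidence interval for the sample mean, then compare it with the average score of students who ended the year on probation/suspension. The column names below are hypothetical placeholders.

```python
# Sketch of the cut-off reasoning: 95% CI upper limit for the whole sample
# versus the mean score of the probation/suspension group.
import numpy as np
from scipy import stats


def ci_upper(scores, confidence=0.95):
    """Upper limit of the confidence interval for the mean."""
    scores = np.asarray(scores, dtype=float)
    margin = stats.sem(scores) * stats.t.ppf((1 + confidence) / 2, len(scores) - 1)
    return scores.mean() + margin


# upper_limit = ci_upper(df["predicted_academic_difficulty_pct"])      # ~51 in this article
# cutoff = df.loc[df["good_standing_year1"] == 0,
#                 "predicted_academic_difficulty_pct"].mean()          # ~62 in this article
```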

Early academic warning alerts were created in the institution’s retention alert system for the 100 new first-year students (30% of the fall 2016 cohort) who had predicted academic difficulty (percentile) scores of 62 or higher on the CSI-Form B. The advisors intentionally reached out to these students more frequently and through various forms of communication to maintain open lines of communication and to offer assistance throughout the fall 2016 term. It was essential that advisors be proactive in their mentoring of these students, rather than reactive to early alerts received from faculty.

Advisors sent emails to their students in advance of important dates on the academic calendar, such as the last date to drop or add a course, midterm exams, early registration, and final exams, to ensure their students were prepared for these dates and knew their advisor was available to answer questions. Advisors also used these touchpoints to schedule in-person check-ins with this cohort of students to ensure they were connected to the right campus and community resources at the right time.

The Need to Further Narrow the List of Identified Students

When the institution used predicted academic difficulty (percentile) alone to identify at-risk students, it flagged a high number of students (n=100) for this small institution, which made it difficult for the academic advisors to maintain sufficient contact with all of the students identified. In other words, the single indicator did not narrow the list of at-risk students enough to serve as an accurate predictor by itself.

While it did flag many students who needed additional assistance, it also flagged many who did fine without the supplementary contact and support. So, beginning with the 2017–2018 cohort, the director of the Center for Student Success wanted to focus efforts with more intentionality. When she and the academic advisors looked at the other possible indicators and considered the institution’s high proportion of first-generation students (43%), the family emotional support indicator was selected as a potentially strong indicator of students who may need additional support from someone in an advisory role.

Thus, in fall 2017, early academic warning alerts were created in the institution’s retention alert system for only 39 new first-year students (14% of the fall 2017 cohort), those for whom both conditions were present: predicted academic difficulty at the 62nd percentile or higher and family emotional support at the 36th percentile or lower. There was a noticeable increase in the fall 2017 targeted cohort’s first-term grade point average (M = 2.42, SD = 0.96) compared with the fall 2016 targeted cohort (M = 2.08). In addition, 79% of these students from fall 2017 ended the term in good academic standing, compared with 61% of the fall 2016 cohort.
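A minimal sketch of this combined flagging rule appears below. The thresholds come from this article; the column names are hypothetical placeholders for however an institution stores its CSI-Form B percentile scores before loading alerts into its retention system.

```python
# Sketch of the combined early-alert rule used from fall 2017 onward.
# Thresholds are from the article; column names are hypothetical.
import pandas as pd

DIFFICULTY_CUTOFF = 62   # predicted academic difficulty percentile (flag if >=)
SUPPORT_CUTOFF = 36      # family emotional support percentile (flag if <=)


def flag_at_risk(df: pd.DataFrame) -> pd.DataFrame:
    """Return the students who meet both early-alert conditions."""
    mask = ((df["predicted_academic_difficulty_pct"] >= DIFFICULTY_CUTOFF) &
            (df["family_emotional_support_pct"] <= SUPPORT_CUTOFF))
    return df.loc[mask]


# flagged = flag_at_risk(csi_results)   # feed these students into the retention alert system
```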

Further validation that the combination of these two indicators best identifies the most at-risk students at this institution is that, of the 25 targeted new first-year students from fall 2017 who returned for spring 2018, none received another retention alert during the fall term. Of the 14 who did not return after the fall term, 21% (3) received an additional retention alert during the fall term. While 79% of the fall 2017 students were in good academic standing at the end of the term, it is notable that the majority of those students struggled in particular with non-academic issues (e.g., roommate conflicts, difficulty navigating the financial aid process, personal issues) and often frequented the institution’s Center for Student Success for support.

Identification Does Not End Here

It is important to keep in perspective that using confidence intervals on assessments that help advisors understand how students are approaching college is only one way to identify, immediately, students who need special attention. It does not replace integrated database platforms that combine midterm grades, ACT scores, classroom attendance, and other data to identify additional at-risk students later in the academic year.

Meg Wright Sidle, Ph.D.
Director, Institutional Research and Effectiveness
    and NAIA Athletic Compliance Administrator
University of Pikeville
msidle@upike.edu

Megan Childress, M.A.
Director, Center for Student Success
University of Pikeville
msmith00@upike.edu

References

Chen, J., Lacefield, V., & Lindstrom, A. (2018, March). Using first-year students’ sense of belonging as a predictor of retention. Breakout session presented at the Kentucky Association for Institutional Research 2018 Annual Conference, Frankfort, Kentucky.

Erdmann, D. G. (1990). Maintaining enrollment stability: A new role for admissions officers. College Board Review, 155, 38–41, 48.

Kalsbeek, D. (1987). Campus retention: The MBTI in institutional self-studies. In J. A. Provost & S. Anchors (Eds.), Applications of the Myers-Briggs Type Indicator in higher education (pp. 30–63). Palo Alto, CA: Consulting Psychologists Press.

Krotseng, M. V. (1992). Predicting persistence from the Student Adaptation to College Questionnaire: Early warning or siren song? Research in Higher Education, 33(1), 99–111. https://doi.org/10.1007/BF00991974


Cite this article using APA style as: Wright Sidle, M., & Childress, M. (2019, September). Using students’ narratives and the College Student Inventory as an early alert tool for student success. Academic Advising Today, 42(3). Retrieved from [insert url here]
