Implications for Assessment
Authored By: Sharon Aiken-Wisniewski
To inform and support improvement, advisors and advising leadership undertake assessment, which allows them to gather evidence regarding the claims made about student learning as well as the process and delivery of academic advising (Campbell, 2008). This important process, comprising vision and mission as well as goals, objectives, student learning outcomes, and measurement tools, enables evaluation of advising (Aiken-Wisniewski et al., 2010; Campbell, 2008). The assessment components integrate into a cyclical process for evaluating academic advising as well as creating information for future change and enhancements (Maki, 2004).
Mission
The 94.7% response rate to the 2011 NACADA National Survey item regarding the existence of a mission statement indicates that the advising community is familiar with the term, even though a smaller percentage (60.3%) of respondents noted that their institution has a mission statement. Together, these findings suggest that the mission statement serves as a foundational component of, and a common starting point for, the assessment process. By understanding the documented connection between the mission statement and assessment, researchers can create items about this relationship for future inquiry. For example, are institutions developing a mission statement as a means to initiate assessment?
A review of institutional characteristics shows that large institutions and those with a mixed model of advising were more likely to have mission statements than small institutions or those with faculty-only advising. Based on this finding, further inquiry might focus on the resources (i.e., people and time) necessary to develop a statement.
NACADA should continue to increase awareness and facilitate development of mission statements by promoting activities that educate advisors and advising administrators on the concept. Specifically, its members must continue to reach out to the general advising population while also using these survey data to create tools that support institutions and programs less likely to have an advising mission statement. Through networking and surveying, the organization can identify small institutions, institutions using a faculty-only advising model, and proprietary schools that have created an advising mission statement. Leadership at these institutions and NACADA can then share the creation process with colleagues attempting to develop their own mission statements.
Assessment Efforts
In a striking result, 44.3% of survey participants did not respond to the item about assessment efforts, suggesting a limitation of the instrument. Because the institutions of participants who skipped this item shared common characteristics, important follow-up survey items include those on resources allocated for advising, training and development of advising personnel on assessment tools, and commitment to a systematic advising process proven effective in addressing the student experience.
That the satisfaction survey was the most common means of collecting data for the assessment process, especially for full-time advisors, was unsurprising. In my experience, full-time advisors seek out satisfaction information. However, the results also suggest that advisors are not focusing on student learning with the same level of interest as student satisfaction: less than 23% of respondents had established program goals, and only 17% had articulated student learning outcomes. Because they provide more than satisfaction information alone, clearly communicated program and student learning outcomes are essential for understanding the effect of advising within the teaching and learning process. These tools yield data advisors can use to promote learning through an advising curriculum. Advising is connected to the teaching mission, which requires a curriculum established with student learning outcomes (SLOs), program goals and objectives, and a delivery system (National Academic Advising Association, 2007).
The delivery of advising is often evaluated through job performance criteria, yet the survey indicates that this tool is in place at only one in five responding institutions. Many advising programs, therefore, are not evaluating the delivery of SLOs.
Overall, these data indicate that many advising programs lack key elements of an effective assessment of learning. Advising programs, structured to easily garner student satisfaction information, still must address student learning. The findings suggest that stakeholders need to identify and engage those best positioned to lead assessment initiatives in advising.
Using Data to Assess Advising Effectiveness
The results indicate that most assessment efforts involve a single activity rather than multiple efforts such as those named in the survey. Respondents selected the student satisfaction survey, measures of retention/persistence, and advisor job performance as more frequently used tools than measures of program goals/outcomes and SLOs. In other words, respondents indicated that their institutions do not employ key components for gathering evidence in an assessment plan, components important for evaluating effectiveness, especially when connecting advising to the teaching mission of the institution.
The low response to the survey item on advising effectiveness is a limitation. However, most of those who did respond indicated that their institutions do not typically evaluate effectiveness in all advising programs. In general, while institutional leaders may communicate the importance of advising to student learning, retention, and progress to graduation, they lack the assessment tools or fail to report assessment results. Accreditation agencies, professional organizations, and students expect institutions to address effectiveness continuously as a means of identifying best practices and opportunities to enhance programming.
Two other issues emerged from the responses to the survey item on assessment. First, the reliance on a satisfaction survey suggests neglect of the role of advising within the teaching and learning process. Satisfaction focuses on a student's liking or disliking of the services provided through advising, but it does not inform advisors and stakeholders whether students learned how to generate and read a degree audit report, identify resources to improve math performance, or value the general education program. Efforts and tools that measure learning will provide evidence of understanding (know), performance (do), and appreciation (value).
Second, retention and graduation data limit the interpretative scope of advising effectiveness to student behavior at a single institution. While these data points are relevant and important, they do not tell the entire story of effectiveness. Through the development of an advising relationship with a student, an advisor will focus on opportunities that result in student success, and some of those efforts may appear as negative outcomes when unqualified numbers are used for assessment. For example, a student receiving effective advising may transfer to another institution to best complete individual goals and thus contribute to attrition numbers that undermine the perceived effectiveness of advising in the context of retention at a specific institution.
Advising involves more than dispensing information at registration, and advisors embrace a developmental process through which they view students holistically. As stakeholders come to appreciate these goals for and efforts of advisors, they also expect institutions to build assessment, through appropriate, integrative evaluations, into the academic advising plan.
Summary
The mission statement and assessment items from the survey offer a glimpse of current practices in academic advising. Because the mission statement often positions the program for initial assessment, and because nearly 95% of respondents indicated understanding of mission, with 60% reportedly working at an institution with a mission statement, one can conclude that the field is moving toward greater clarification of the advising role, perhaps in part for assessment purposes. The items on assessment efforts in place and on determining effectiveness offered limited information due to low response rates, which may indicate relatively low respondent engagement in advising assessment.
The survey results showed that more institutions use the satisfaction survey than tools that complement SLOs and program goals/outcomes. A basic analysis of the findings from the three items on mission and assessment suggests that leadership is initiating advising assessment (i.e., through a mission statement), but these efforts are not completely integrated with key components into the advising plan. These data confirm that the assessment process is slow, but they do not address the reasons for its arduousness.
Also, efforts and tools that highlight the connection of advising to teaching are the least commonly used at the institutions of the survey respondents. SLOs and goals/outcomes are key components of assessment processes for advising plans that center on student learning.
Finally, the advising community must recognize the connection between best practice and assessment. To be promoted as a best practice, the effort must be proven effective. The evidence for this effectiveness emerges through multiple evaluation methods within an assessment process.
The findings lead to the following implications for advising practice:
- Advisors must continue to build, use, and enhance tools to gain momentum in assessment.
- Advisors need to identify opportunities to engage in training and development on the assessment process.
- Examples of assessment plans and tools for different types of advising models should be readily available and shared.
- Stakeholders must promote a culture of collaboration for easy access to assessment materials.
In addition to practice, further inquiry should yield greater description and understanding of the assessment process for advising based on institutional type, the presence of mandatory advising, and type of advisor. This research would increase awareness of useful tools and identify student needs.
Conclusion
Based on data from the 2011 NACADA National Survey, questions to ask about advising assessment emerge from two different perspectives: tools and the assessment cycle.
The following questions are useful for determining if assessment tools exist for your advising program:
- Does your advising program have a mission statement? Is it publicly displayed for advisors, students, and other members of the institution to view and understand?
- Does the advising program articulate clear goals and objectives to guide the development of SLOs?
- Does a developed list of SLOs represent the process of teaching and learning in advising? Has this list been discussed with the advising community for understanding as well as mapping SLOs within the advising process?
- What measurement tools (e.g., surveys, focus groups, data from other campus entities, rubrics) are used to create multiple data points in the assessment process?
The following questions help determine if the assessments are cycling in an ongoing process characteristic of a productive advising program:
- Who is taking responsibility for advising assessment? Does a person or a team represent the advising community? How are these staff supported to accomplish the process of assessment?
- Upon gathering data from measurement tools, did the analysis clarify best practices in advising for student learning on your campus? Did it identify areas for growth and development? What efforts will be undertaken based on this information?
- Did your advising assessment team develop a communication plan to share findings with stakeholders? Did this plan clarify resources needed to address change for the next cycle of assessment?
- Does the institution value and reward advising assessment through visible policy initiatives and resource allocation regardless of the advising model, size of institution, or type of institution?
- What advising activities and assessment components were changed and which ones were retained before the initiation of the next cycle of assessment?
- What is the trajectory of training and development activities for individuals and teams involved in advising assessment that will promote a productive and continuous process?
References
Aiken-Wisniewski, S. A., Campbell, S., Nutt, C., Robbins, R., Kirk-Kuwaye, M., & Higa, L. (2010). Guide to assessment in academic advising (2nd ed.) [CD]. Manhattan, KS: National Academic Advising Association.
Campbell, S. M. (2008). Vision, mission, goals, and programmatic objectives for academic advising programs. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 229–243). San Francisco, CA: Jossey-Bass.
Maki, P. (2004). Assessing for learning. Sterling, VA: Stylus.
National Academic Advising Association. (2007). Student learning outcomes: Evidence of the teaching and learning components of academic advising (Pocket Guide PG06). Manhattan, KS: Author.
Cite this resource using APA style as:
Aiken-Wisniewski, S. (2011). Implications for assessment. Retrieved from the NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Implications-for-assessment-2011-National-Survey.aspx