Intentional use of Technology for Academic Advising
Authored by: George E. Steele
Digging a hole is easier with a shovel than with a rake. Finding the best technology tools to achieve academic advising outcomes is more complicated. Those charged with selecting technology often lack an understanding of how different technologies align with the multiple goals of academic advising. This article presents a model to help advisors and college administrators better understand that relationship. In short, the Intentional use of Technology Model is designed to give those interested in using technologies in academic advising a tool that leads to greater intentionality.
The creation of the Intentional use of Technology Model came about after a study this author conducted that examined 25 years (1988–2012) of technology presentations at NACADA’s annual conferences. The analysis covered 485 conference sessions and 52 pre-conference workshops, representing 7.51% of all annual conference sessions and 7.3% of all pre-conference sessions over that 25-year span. These sessions were selected by examining the conference program for each of the 25 years and identifying sessions that 1) were listed in the “technology” program track; 2) used conference program codes for technology; or 3) were identified by this author through terms such as “technology,” “computers,” and the names of specific software. The last method was necessary for the years 1988, 1990, and 1992–1994, because the first two means of identifying program sessions did not yet exist. The results were presented at the 2013 NACADA Annual Conference in a session titled “Technology and Academic Advising: Where Have We Been These Past 25 Years?”
Once technology-focused presentations were identified, the conference program descriptions of the sessions, written by the presenters, were used to identify the types of technologies being addressed. Through this method, the primary technology tool discussed in each session was identified using the qualitative methodology suggested by Guba and Lincoln (1983). For example: Did the session predominantly discuss e-mail, student information systems, customer relationship management systems, or social media? Results were categorized based on patterns that emerged from the data set.
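The keyword-based identification method described above (the third selection criterion) can be illustrated with a short sketch. This is a hypothetical reconstruction, not the author's actual procedure; the session descriptions and the exact term list are invented for illustration.

```python
# Illustrative sketch of keyword-based identification of technology-focused
# sessions in early conference programs. The term list and session
# descriptions below are hypothetical examples, not NACADA data.
TECH_TERMS = {"technology", "computers", "e-mail", "software", "database"}

def is_technology_session(description: str) -> bool:
    """Flag a session whose program description mentions any technology term."""
    text = description.lower()
    return any(term in text for term in TECH_TERMS)

sessions = [
    "Using computers to track advising appointments",
    "Advising first-generation students",
]
flagged = [s for s in sessions if is_technology_session(s)]
print(flagged)  # only the computer-related session is flagged
```

A simple substring scan like this would only be a first pass; as the article notes, each flagged description was then read and coded qualitatively.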
Creation of the Model for Intentional Use of Technology
During the session categorization process, three main patterns emerged: communication tools, Web-based tools, and enterprise-level systems. The communication tools focused on interactions such as advisor to student, student to student, and institution to students. The Web-based tools category tended to focus on providing information in a variety of media formats. The enterprise system category reflected the use of system-wide computing power to provide services within the institution and to students. These three broad categories overlapped with one another, as represented in Image 1 below. The blue area in Image 1 represents the few sessions that did not fit into the three main pattern areas.
Image 1: Initial categorization of technologies
Using this general conceptual arrangement, the types of technology tools were coded so analysis could proceed. The arrangement is presented below (Image 2) along with examples of the types of technology tools assigned to each number code.
Image 2: Conceptual model for coding types of technology
The number code assigned to each technology tool is shown in Table 1 below, along with links to definitions.
Table 1: Coding of different types of technologies discussed in sessions at NACADA’s annual conferences (examples and definitions supplied by Wikipedia)

1. Student Information Systems (SIS)
2. Customer Relationship Management systems (CRM)
3. Video-conferencing, smart phones, etc.
4. E-mail, blogs, chat, and other social media (Facebook, Twitter, etc.)
5. Static Web sites (predominately text based)
6. Applied information tools – degree audits, appointment scheduling, etc.
7. Learning management systems (LMS), retention systems, e-portfolios, and student portals, all based on individual student accounts
8. Technologies that fall outside the identified areas, such as how to use specific software (e.g., Microsoft Office) or how to create a database
9. Sessions that suggested use of multiple tools
One observation immediately appeared from the model’s conceptual arrangement for coding: a rough line can be drawn on the right side of the model, called the FERPA/privacy line (Image 3). FERPA refers to the U.S. Family Educational Rights and Privacy Act of 1974, which protects students’ rights to privacy. For this model, the line marks an area where technology can be used to connect with students outside a formal advising session. Included here are social media and Web 2.0 tools, used outside institutionally endorsed and supported technologies, that engage students in personal interactions with advisors. These tools fall outside the institution’s boundary because they are often housed on external vendor systems. Because of this, institutions have developed protocols that clarify and limit the types of conversations members of the institution can have with students.
Image 3: Location of the FERPA line
Three Areas of Intentional Application of Technology to Advising Goals
From the coding model, three areas emerged that suggest a more intentional way of associating advising goals and related activities to specific types of technologies. The three areas are service, engagement, and learning (Image 4).
Image 4: Three areas of intentional application of technology to advising goals
The service area highlights tools that provide institutional services through personalized student accounts. Tools in this area (e.g., student information systems, degree audits, and appointment scheduling systems) share a common service orientation. They are similar to technologies employed in other large organizations and institutions; these system tools are not significantly different from the customer service tools one encounters when interacting with a bank or an airline. As such, some within the academy point to service technologies to support the notion that students are customers.
These service systems can also be viewed through the lens of a commonly used technology model known as DIKW. DIKW stands for Data, Information, Knowledge, and Wisdom (Ackoff, 1989) and is referred to as the “Knowledge Hierarchy” or “Knowledge Pyramid” in the information science literature. In short, the hierarchy suggests a relationship between the stages: data is less than information, and information is less than knowledge, yet one cannot achieve information without data, nor knowledge without information. Each stage in the model reflects a more powerful and sophisticated level of computing that has been expanded, integrated, and developed over time.
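The data-to-information-to-knowledge progression can be made concrete with a small, hypothetical advising example: raw course grades are data, a computed GPA is information, and a rule interpreting the GPA is a simple form of knowledge. The records, threshold, and standing labels below are illustrative assumptions, not drawn from any real system.

```python
# A minimal, hypothetical sketch of the DIKW progression using advising-style
# records: raw grade points (data), a computed GPA (information), and a rule
# that interprets the GPA (knowledge). Threshold and labels are illustrative.
grades = [("MATH 101", 4.0), ("ENG 102", 2.0), ("HIST 110", 3.0)]  # data

gpa = sum(points for _, points in grades) / len(grades)            # information

def academic_standing(gpa: float) -> str:
    """Knowledge: a rule that interprets the information in context."""
    return "good standing" if gpa >= 2.0 else "academic warning"

print(round(gpa, 2), academic_standing(gpa))  # 3.0 good standing
```

What the sketch cannot do is supply Wisdom: deciding what "good standing" should mean for a particular student's goals remains the metacognitive work that, as the next paragraph argues, belongs to advisors and students.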
In some regards, the DIKW model shares similarities with the human cognitive processing functions identified by Benjamin Bloom (1956). At one end of the spectrum, Data in the DIKW model is similar to Bloom’s knowledge level, in that both represent the existence of an item (Image 5). The theoretical level of Wisdom in the DIKW model is similar to Bloom’s level of evaluation, in that both share a metacognitive characteristic (Steele & Thurmond, 2009). For the purpose of the technology and advising model, the DIKW model provides insight into the increasing sophistication and complexity of the enterprise systems our campuses are adopting. However, the DIKW model also suggests that technology cannot function at the highest levels of human cognition, let alone address the affective needs of students. This is where the high touch of advising, combined with learning and communication tools, is critical. When students cannot understand what these systems provide, an advisable moment is present for advisors to help students understand what the cognitive processing done by technology means and how to use these systems more effectively in the future.
Image 5: Comparison of the DIKW model to Bloom’s Cognitive Taxonomy
The engagement area uses tools to inform students and others at the institution and to build communities with them. This community building rests on the belief (Nutt, 2003) that student involvement leads to higher student satisfaction and an increased commitment to completing educational goals. These tools fall on both sides of the FERPA/privacy line. Tools outside the FERPA/privacy line include standard (static) Web pages, social media, and other Web 2.0 applications (e.g., wikis and photo sharing platforms). While some of these tools may exist on the institutional side of the FERPA/privacy line (e.g., chat tools or wikis embedded within an LMS), usually they do not. On the institutional side of the line, tools such as interactive video and smartphones permit advisors and students to engage in face-to-face interactions. CRMs, which use institutional e-mail accounts, are a more common means of communicating within the FERPA framework.
Much has been written about advising as teaching (Hemwall & Trachte, 2003; Lowenstein, 2005). A key element of learning is that students are expected to show they have mastered some content, developed a skill, produced a project, created a plan, or demonstrated reflection on a topic or issue, and that their learning will be assessed. This assessment of learning is what distinguishes students from customers, as customers are not held to this level of accountability. As Martin (2007) stated, “Learning objectives answer the question: what should students learn through academic advising?” Publication of student learning objectives for academic advising programs within an advising syllabus is one creative delivery method institutions use (Trabant, 2006). In terms of the technology model, LMSs, e-portfolios, and retention systems are designed to assist in this learning effort. To varying degrees, these learning tools provide means to organize content or monitor student behaviors defined by specific goals. Typically, these technologies integrate both synchronous and asynchronous communication and assessment tools to support and monitor students during the learning process. This important function is reflected by the placement of the learning tools in the middle of the model.
What cannot be overlooked is the important role these tools play in program assessment. As Robbins and Zarges (2011) pointed out, “…assessment is concerned with the academic advising program and services overall, primarily the achievement of student learning outcomes.” In short, the greater the use of technology tools to capture the results of students’ learning, the greater the production of student learning outcome data that can be aggregated to assist with program assessment.
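The aggregation step described above can be sketched briefly. The records, outcome names, and reporting format are hypothetical; the point is only that individual outcome results captured by learning tools (e.g., in an LMS or e-portfolio) roll up naturally into program-level percentages.

```python
# Hypothetical sketch: rolling individual learning-outcome results up into
# program-level assessment data. Records and outcome names are invented.
from collections import Counter

# Each record: (student_id, learning_outcome, met?) -- illustrative only.
records = [
    ("s1", "creates degree plan", True),
    ("s2", "creates degree plan", False),
    ("s1", "interprets degree audit", True),
    ("s2", "interprets degree audit", True),
]

met_counts = Counter(outcome for _, outcome, met in records if met)
totals = Counter(outcome for _, outcome, _ in records)
for outcome in totals:
    pct = 100 * met_counts[outcome] / totals[outcome]
    print(f"{outcome}: {pct:.0f}% of students met the outcome")
```

In practice the capture side is the hard part; the more consistently advising technologies record outcome results, the more such summaries become possible without extra data entry.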
Implications for Advising
The configuration of the technology model helps us consider intentionality in several ways. First, the model draws our attention to the fact that technologies are tools designed to perform specific functions. The closer those functions align with our goals for advising, the more effective our delivery of advising becomes. Furthermore, better alignment between the function of technologies and our advising goals helps us produce more useful data for the evaluation of student learning and program assessment.
Second, the technology model reinforces that advising is a multi-faceted endeavor (Pasquini, 2011). Arguments over whether those attending our campuses are students or customers miss the reality of today’s technology-rich institutional environments. When technologies are properly aligned with advising goals as presented here, the “customer” and “student” distinctions become clearer. When students use the technologies designated as “service” in this model, such as accessing their personal accounts, it makes sense for them to have customer expectations and for institutions to treat them as customers. On the other hand, when using technologies designated as “engagement” and “learning” in this model, these same individuals are “students.” While this debate will continue, it must be noted that even banks and airlines do not let their customers misuse access to personal accounts. There are always rules.
Finally, the technology model helps us focus on where and how we can improve our practices using technology tools. In the service area, “best practices” from business and non-profit organizations can be very helpful as we develop these tools. Likewise, with tools of engagement, much can be learned from businesses, politics, and non-profit organizations in terms of how they develop communities and keep members informed. The growth of distance and blended learning means we must expand our knowledge and improve our practice by becoming more aware of how technology is successfully used in the learning/teaching process. Perhaps using a shovel to dig a hole is not the best metaphor after all. What we need is a crane to help us build better advising programs, improve advising practice, and achieve our valued goals.
George E. Steele
Curator NACADA Clearinghouse
Ackoff, R. L. (1989). From data to wisdom. Journal of Applied Systems Analysis, 16, 3-9.
Bloom, B. S. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain. New York: David McKay Co., Inc.
Guba, E. G., & Lincoln, Y. S. (1983). Effective evaluation: Improving the usefulness of evaluation results through responsive and naturalistic approaches. San Francisco, CA: Jossey-Bass.
Hemwall, M. K., & Trachte, K. C. (2003). Academic advising and a learning paradigm. In M. K. Hemwall & K. C. Trachte (Eds.), Advising and learning: Academic advising from the perspective of small colleges and universities (NACADA Monograph No. 8, pp. 13–20). Manhattan, KS: National Academic Advising Association.
Lowenstein, M. (2005). If advising is teaching, what do advisors teach? NACADA Journal, 25(2), 65-73.
Martin, H. (2007). Constructing learning objectives for academic advising. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Constructing-student-learning-outcomes.aspx
Nutt, C. L. (2003). Academic advising and student retention and persistence. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site: http://www.nacada.ksu.edu/tabid/3318/articleType/ArticleView/articleId/636/article.aspx
Pasquini, L. (2011). Implications for use of technology in academic advising. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Implications-for-use-of-technology-in-advising-2011-National-Survey.aspx
Robbins, R., & Zarges, K. M. (2011). Assessment of academic advising: A summary of the process. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Assessment-of-academic-advising.aspx
Steele, G. E., & Thurmond, K. (2009). Academic advising in a virtual university. In New Directions for Higher Education: Lessons from virtual universities, 146, 85-95.
Trabant, T.D. (2006). Advising Syllabus 101. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site:
Cite this resource using APA style as:
Steele, G. (2014). Intentional use of technology for academic advising. NACADA Clearinghouse Resource Web Site: