During my 20+ years in higher education, and specifically in my more than 15 years working in institutional effectiveness and assessment, I’ve witnessed many changes in how my institutions process information and data. Most of these changes have been driven by the adoption of technology. From whiteboards to Learning Management Systems (LMS), and from spreadsheets to enterprise systems, how higher education manages data and records has changed dramatically. In my experience, most of these technology innovations have been wonderful additions to my work and have provided the opportunity to collect and share data more efficiently and effectively with a wider constituency.
However, there can be challenges in integrating new technology into existing campus assessment processes. In my work as a peer reviewer and occasional consultant at colleges and universities, I have observed firsthand the complications of a poorly defined or poorly implemented technology solution. The most common mistake I see occurs when colleges assume that adopting a technology or software solution is simply an administrative task. When using technology in support of assessment initiatives, it is paramount to include faculty in the equation, since assessment belongs to the faculty. The focus of good assessment is the improvement of student learning in the classroom and in the program. It is essential, then, that any technology solution adopted keep that focus on faculty use of the data and on improving student learning.
Some of the challenges I’ve seen include:
• Technology that is focused on compliance with an external agency. While we need to be aware of the broader constituents we serve, the primary audience for assessment efforts is internal and their needs should be the primary concern.
• Technology that is cumbersome to use. As I noted, faculty are the owners of assessment, since the primary role of the data is to inform curriculum changes. If a system is difficult to use, or requires juggling multiple “modules” or “components,” faculty often will not have the time or patience to enter the results.
• Technology that seeks to “dictate” the processes. It takes time to develop a culture of assessment on campus, and overlaying a technology solution that does not align with assessment processes can have a negative impact on that culture.
• Technology that is overly “permission” oriented. This challenge also speaks to the cultivation of an assessment culture at an institution. Since the ownership of assessment belongs to the faculty, a technology solution that requires multiple levels of submissions and approvals can stifle innovation as well as the assessment culture on campus.
I was delighted to find that the new LMS my institution recently selected allows for the collection of assessment data that can be aggregated at the department, division, and institutional levels! Helping faculty understand how to use the data and generate useable results has been an area of focus for my office for the last several months.
We are also in the throes of updating our Program Review software to better integrate the reporting of assessment results. Admittedly, both endeavors have been very challenging. But throughout these shifts, my office has stayed focused on serving our primary audience – faculty. In doing so, our goal is to encourage wider adoption of the technology for its various assessment uses. This will help faculty gather assessment results that improve student learning, and it will also support the investment the college has made in adopting the solution.
Technology is a wonderful thing, and the ways we can capture, manage, and share data have come a long way. Just remember to keep the focus on your primary audience!