As higher education evolves, so too does the importance of assessing learning. New regulations, financial constraints, and accrediting agencies are pressing colleges and universities to strengthen assessment organizationally. Yet when assessment is discussed in large faculty forums, the concept often, strangely, seems foreign to those present. This is where understanding and employing the process of research can be very helpful in completing such tasks.
Historically, the process of research has been associated with tenure, publications, and the doctoral process. Over time, research has become a balanced undertaking between gaining content knowledge and mastering the process of creating new knowledge. In breaking the traditional mold, it appears that understanding the process of research may also help institutions in another area. This article examines the parallels between the research process and the assessment process, and the usefulness of applying the former to institutional and academic assessment.
Every institution of higher education has a mission that all the institutional-level objectives should reflect. It does not stop there. Each program, as we know, has its own set of program-level objectives that should reflect the institutional-level objectives. Moreover, at the very bottom level are the course-level objectives that need to be met by students upon completing each course. Program chairs and directors are tasked with making sure every full-time, adjunct, and contracted instructor of that program is performing assessments of classes and linking them to the program objectives.
This task is easier said than done for two reasons. First, the number of adjuncts instructing in any one program can be so large as to create a governance nightmare. Second, only recently has faculty development in assessment become a prevalent offering. Unlike K-12 teachers, who are often directly educated and certified in pedagogy and assessment, the large majority of college professors are not.
What then can institutions lean on if professional development is not working or not an option? One consistent characteristic among collegiate faculty is that they have, at some point, navigated the process of research. Those who survived and thrived through that process should have learned something beyond the mere content knowledge of their degree field: each person went through the process of creating knowledge. Is this not what we seek when assessing program objectives? After some reflection, the many parallels between the process of research and the assessment process become apparent. Table 1 illustrates the comparison.
| Process of Inquiry | Process of Research | Course Assessment |
| --- | --- | --- |
| Why | The overall purpose of the study and connection to the larger field | Overall goal (course description) students should obtain from the course and the connection to the program as a whole |
| What | The research questions designed to investigate the why | Course objectives used to meet the overall goal |
| Who | Audience benefiting from the study | Students participating in the course |
| How | Methodologies (quantitative, qualitative, or mixed methods) used to answer the research questions (surveys, interviews, historical analysis) | Types of assignments or activities that will measure whether objectives are being met (e.g., research papers, exam questions, discussion boards, pretests and posttests, and both direct and indirect measures) |
| Conclusion | Use and analysis of results to support or refute the research goal and provide insight into future research | Use and analysis of results to support or refute that learning occurred and objectives have been met, as well as to indicate possible improvements to the program in the future |

Table 1: Process comparison
It is noteworthy that both processes are investigations at their core. Assessment investigates whether learning has occurred and to what extent; the process of research investigates to create new knowledge or add support to an existing topic. Each process seeks to answer specific questions, and the proper method must be employed to answer them. Finally, in both processes the results must be interpreted and conclusions drawn from the findings.
Ultimately, both processes cycle back and start over again, providing a constant learning mechanism. One might object that the assessment completed for a course searches for a definitive answer, while the dissertation accepts whatever outcome is achieved. While this objection has some validity, consider a perspective that brings the two into closer alignment. Assessment is completed in a course to see whether learning has occurred; prior to the assessment, we only hypothesize that it has. This is very similar to the hypothesizing that occurs during the process of research. The focus then becomes not so much the outcome but the process and why we are going through it. Assessment expert Linda Suskie (2009) states, “If faculty and staff find it hard to make the leap from articulating processes to articulating outcomes, encourage them to ask ‘why?’” The question of why resonates in the dissertation process as well. I can remember my committee members telling me, “You need to focus and answer the ‘why.’”
It is important for curricula to move away from simply memorizing numbers or regurgitating terms. Kelting-Gibson (2013), discussing the work of David Perkins, echoes these sentiments: “It is important for students to develop understanding and not just memorize facts and figures.” As our curricula continue to evolve, so too should our program objectives and the assessments that measure them, in order to create deeper learning. The one constant we can rely on is the process of research to assist us with this evolving change. Doing so may ease the angst when program reviews and reaccreditation time roll around.
Patrick J. Hughes, PhD is program chair and assistant professor of organizational studies at Saint Louis University.
Reprinted from Academic Leader, 31.12 (2015): 5, 7. © Magna Publications. All rights reserved.