Putting Assessment in Its Place
What can you do with four minutes?
You can close the report and check the clock, update your to-do list, sort through your mail, or respond to a minor e-mail query. There are many important tasks you can do in four minutes. And if you don’t do them now, you’ll just have to find another four minutes later. Of course, none of this matters if you have plenty of time and too little to do, but most institutions have finite resources and must be deliberate in how they use them. Program assessment presents a special challenge to resource allocation, requiring a similarly deliberate approach.
How Much Does Instruction in Your Program Cost?
To manage resources effectively, it’s important to know how much it costs to teach students in your programs. Instructional costs vary from program to program based on class size, faculty salaries, equipment, and technology. And not all programs will generate enough revenue to cover costs. That’s OK as long as those high-cost programs are balanced with “cash cows,” programs that generate more revenue than expenses. Instructional cost data can play an important role in strategic planning.
An Intellectual Property Policy for Online Education
Does your institution have an intellectual property policy specific to online courses and course materials? If so, do you know and understand it? Do faculty know it? If faculty receive financial payment or release time to develop online course materials, does that change who owns the rights to those materials? As the number of online courses and degree programs offered at institutions of higher education continues to expand, intellectual property rights will continue to garner increased attention.
Creating a Strategic Plan for Global Engagement: Ideas for Academic Leaders
“Our goal was to make sure students on campus understood that they would be working in a global environment,” says Karen Kashmanian Oates, Peterson Family Dean of Arts and Sciences at Worcester Polytechnic Institute. As scientists and engineers, WPI students are poised to enter a truly global marketplace, one in which employers are headquartered outside the U.S. or require employees to work in or with international locations.
Constructing a strategic plan for global engagement that will benefit both students and the institution, however, requires more than a simple study-abroad program. In a presentation for the American Conference of Academic Deans (ACAD), Oates and her co-presenters explained the situation thus: “Without a strategic plan for global engagement in place, most interactions have been reactive instead of proactive, with engagements neither initiated nor coordinated with central university aspirations.” By constructing such a plan, an institution can make its global engagement proactive and align it with broader institutional goals.
For a little more than a decade, the STEM disciplines (science, technology, engineering, and mathematics) have enjoyed something of a privileged status at American colleges and universities. While enrollments in some other areas are stagnant or declining, enrollments in many STEM courses have been rising steadily. In state systems, investment in faculty, equipment, and facilities often focuses on STEM while other fields go begging. Public figures call for more students to become interested in STEM, often at the same time as they denigrate such disciplines as anthropology, art history, and philosophy.
What accounts for all the positive attention the STEM disciplines have been receiving? The answers are many. First, the severity of the economic recession has caused many students, parents, and politicians to focus on the immediate employability of college graduates. Even if a classicist is as likely as an accountant to find suitable employment within six months of graduation, it is easier for many people to see the connection of business programs to jobs than it is to make that same leap for the liberal arts. “A college of engineering produces engineers,” some may think. “A college of humanities produces . . . what exactly? Secular humanists? Is that a good thing?”
Seven Important Factors in Program Assessment
“No one should be surprised to learn that faculty (in general) have not enthusiastically embraced the opportunity to see if their students measure up to those at other universities or to the expectations of their professors,” writes Diane Halpern in a “personalized review” of assessment programs in general and in her field of psychology (p. 358). Faculty who believed assessment was another of those “trendy things” destined to pass once something else new came along have been proven wrong. The assessment movement is now close to 30 years old and still very much a part of the higher education scene. Institutions found it hard to ignore once assessment became a condition for receiving federal funds and a review criterion used by the national accrediting associations and various professional program reviewing agencies.
Reviewing and updating some of her previous writings, Halpern suggests that the list of factors important in program assessment has not changed but merits regular review. Here’s a summary of those seven factors, drawn from the more detailed discussion of them in the article referenced below:
The Advantages of an Annual Review of Departmental Data
Many academic departments now engage in annual cycles of assessment of student learning as well as departmental services. Best practices in higher education, reinforced by regional accrediting bodies, among others, dictate that only when departments assess student achievement and departmental initiatives, integrate those assessments meaningfully, and link them to resource allocation (as applicable) can they truly move down a path of continuous improvement. Yet can those assessments alone, important as they are, answer all the questions that departmental faculty and administrators pose about students, faculty, resources, and services? As a supplement to those assessment data, a set of pre-established, mission-centered metrics provides a barometer of the department’s health and vitality while informing timely decision making in a rapidly changing environment both inside and outside academia.
In “Getting SMART with Assessment: ACTION Steps to Institutional Effectiveness” (Assessment Update, 24:1), Sandra Jordan and I briefly mention these supplementary data as one of three components of a fully integrated annual program review, which we define as an annual cycle of institutional effectiveness that combines the assessment of student learning with the assessment of departmental operations and often includes other departmental data. Whereas that article primarily explores strategies for promoting, clarifying, and supporting effective assessment, in this article I discuss an annual departmental data review—its process, advantages, and management—as a separate component of institutional effectiveness. Used effectively, an annual departmental data review ultimately intersects with and supports other planning and assessment documents to advance departmental decisions.
Moving Beyond Majors
As I sat looking at data for the newly enrolled students in our incoming class, comparing them with institutional and national SAT data, I wondered, is the concept of a major becoming obsolete? Our colleges and universities are built around majors. For generations, faculty have trained in a single discipline with a distinct identity. Curricula have been designed to make the student’s major the most prominent piece of his or her educational pathway. Even on the admissions side, the first question we ask in a typical interaction is, “What major do you want to study?”