July 10th, 2017

Using Formal Program Review for Continuous Improvement


Drexel University uses a Program Alignment and Review (PAR) process to help ensure relevance, quality, and measurable achievement of its academic programs. It’s a formal review process that includes a self-study, external review, and action plan.

In an interview with Academic Leader, Janice Biros, senior vice provost for budget, planning, and administration, and Stephen DiPietro, associate vice provost for university assessment operations, discussed the process and outcomes of this effort, now in its third year.

Biros was an early advocate of a formal program review process. She felt there was a need to examine how the university was allocating its resources and supporting its priorities. Rather than framing the process around resource allocation (although that would remain an important component), the provost opted for a more positive approach centered on continuous quality improvement.

From the start, Biros and DiPietro provided support rather than driving this initiative. A steering committee came up with a plan, timeline, guidelines, and tools. “It was all driven by the deans. They would identify which programs they wanted reviewed and when. They would identify the self-study groups and keep the process moving,” Biros says. “I think it’s unique in that it probably is a more formalized approach than many schools are taking. It has presidential and senior-level support, and I think it’s beginning to take hold.”

Self-study

PAR is a one-year process that begins with a self-study. A self-study committee of three to five faculty members, selected by the dean and directed by the program coordinator or department chair, conducts the self-study, which is based on the following data:

  • Program course catalog data—required courses, electives, core major requirements
  • Student activity data—10-year reports for enrollment, student credit-hour production, graduation rates, retention and persistence rates, GPA, and student learning assessment plan
  • Faculty profile—10-year report of all employees by type, one-year faculty instructional workload report, 10-year report of external funding awards and applications, and CVs for each faculty member
  • Budget and finances—10-year report of year-to-date expenses
  • Other resources—list of library resources, report on facilities and space, the college strategic plan, the college organizational chart, the program strategic plan, the program organizational chart, experiential learning opportunities, and senior exit survey.

This data is benchmarked against the college and the university to provide the self-study committees with some perspective on the data and how it compares to that of other programs.

The self-study report includes the following elements: an executive summary; program description; background and history; enrollment and student profile; faculty profile; curriculum and instruction; quality of program outcomes and learning assessment; research, scholarship, and creative activity; advising; finances; analysis of resources; facilities and space; technology; strategic alignment; and conclusion and action plan.

External review

To get a broader perspective on the program, the PAR process calls for an external review by up to three outside scholars, who conduct a site visit and produce a report addressing many of the same questions covered in the self-study.

Action plan

Based on the findings of the self-study and recommendations from the external review, the program creates an action plan that provides direction for the next seven years, which is when the next PAR takes place.

Results

Thus far, 23 programs have participated in the PAR process, and 13 are currently engaged in the process. The process has gotten faculty and academic leaders to consider their programs in a broader context. “I had a meeting with chemical engineering and one of the professors told me that he never really understood the curriculum until the PAR process. It provides an opportunity to see it from beginning to end,” DiPietro says. “We’re finding that PAR is facilitating discussions that have not occurred in the past.”

PAR findings inform decisions, some of which can be implemented at the program level and some that require broader participation and resources. “I think one of the things that’s kind of interesting is that we ask [academic programs] to make recommendations for changes, and at the beginning everybody was saying, ‘Where are the resources for all of this?’ There is really not a budget for PAR. We’re trying to integrate changes into other initiatives. For example, faculty hiring recommendations from PAR will be integrated into our overall university faculty hiring plan. And renovations and additional space requests are being integrated into our master planning and renovation process,” Biros says.

A PAR task force on computing found very similar computing courses being offered across five colleges and schools. A group of faculty studied how the university taught computing and recommended creating a College of Computing and Informatics and bringing those courses together in one place. “It’s a much more efficient and effective way to approach it,” Biros says.

The PAR process does not drive decisions. PAR results inform decisions, and Biros and DiPietro are careful in the way they explain its intent and process. The goal is for the institution to make more informed decisions that drive academic quality. Thus far, changes have not resulted in major spending reductions “but, I believe, a more rational allocation of resources,” Biros says.

The implementation of PAR comes at a time when the university is moving from traditional budgeting to responsibility-centered management (RCM), which gives deans more authority in managing their budgets. “The whole culture is changing,” Biros says. “People are looking at information differently. People who didn’t think about budgets before now are. And PAR is just another component of examining what we’re doing. PAR is going to help deans as they manage their budgets. While RCM is not going down to the program level, clearly, when deans have to make decisions about their resources and they’re going to budget, they’ll have information from PAR to look at to prompt or support decisions they might make.”

Rob Kelly is the former editor of Academic Leader.

Reprinted from “Using Formal Program Review for Continuous Improvement,” Academic Leader, 14.2 (2014): 5–6. © Magna Publications. All rights reserved.