A comprehensive guide to conducting academic program reviews in higher education

May 6, 2024 Watermark Insights

Higher education is in a unique position today. Although enrollment numbers have improved recently, rising apathy among traditional student populations, driven by various social and economic factors, makes engaging students more challenging than ever.

In this environment, your institution needs to distinguish itself from the rest to stay competitive. Focusing on building high-quality academic programs is one of the best ways to do so.

A well-designed academic program review process can help your institution achieve substantial improvements in academic rigor and course quality — and provide the evidence you need to better engage both current and prospective students.

What is an academic program review?

An academic program review is a periodic evaluation process that assesses how well a given academic program accomplishes its stated purpose. It also identifies opportunities for improvement. 

It involves several phases in which stakeholders gather and examine hard data to:

  • Determine whether the program aligns with the institution's overarching mission and strategic direction
  • Evaluate whether the program delivers its expected learning outcomes
  • Identify the appropriate actions to take for future program success

The three phases of a program review are as follows.

1. Self-study


A program review self-study is similar to an accreditation self-study in that the process involves a program evaluating itself and identifying potential areas for improvement. 

An effective self-study focuses on more than making sure the program looks good for the review team. Its goal is to identify and address the program's strengths and weaknesses in order to optimize student outcomes.

The core areas to address in your self-study include:

  • Program mission: Does the program's stated mission align with the institution's overarching mission and strategic direction? How well does the program accomplish its mission?
  • Program quality: Is the program relevant to students' current needs? Is it rigorous enough to meet the institution's academic standards? 
  • Assessment standards: Do you have evidence that the program and the division it belongs to regularly assess their curricula, instruction quality, and support services? 
  • Future direction: What resources and actions are required to ensure the program is sustainable moving forward? What data will inform the decisions made for future improvements?
  • Decision making: What data are program faculty using to make decisions regarding their teaching and assessment methodologies?

The committee will create a report summarizing their findings and submit it to the external review board upon completion of the self-study. 

Typically, the administration will check in on the self-study committee at least once during the process to gauge their progress. Depending on how far they have come, the committee may have a report ready by this time.

Although institutions tend to vary in their individual approaches to self-study reporting, all rely on historical, current, and projected data to power their insights. Gathering and analyzing all that data can take anywhere from six months to a full year, so it's best to start the process early to ensure you meet your deadline.

2. External review

After the self-study committee wraps up its report, a team of experts from outside the university will visit for an onsite evaluation. During this time, the external review team will consult with program faculty, staff, students, and other important stakeholders to get a comprehensive idea of the program's performance. 

They will also meet with department leadership, the deans of the corresponding college, and institutional administrators after completing their review to discuss their findings. 

This phase typically lasts two to three days, though it may take longer to properly review larger, more complex programs. Additionally, effective activity scheduling is essential for ensuring review committee members can access everything and everyone they need for a comprehensive evaluation.

3. Next steps

The goal of this final phase is to determine the program's future direction — specifically, which actions the institution should take to enhance the program. 

Once the external review has been completed, the program leads will conduct a comprehensive evaluation of the information from both the self-study and the external review reports. 

Using these findings, they will draft an action plan that includes the steps they plan to take to improve the program. They will also establish a timeline for implementing these changes, which may span several years.

Other key program review deliverables include:

  • Detailed explanation of budgeting and planning processes.
  • Issues raised during the process and clear steps to address them.
  • Plans for continuous improvement initiatives.
  • Explanations of how the institution will collect, analyze, and use program data moving forward.
  • A long-range plan for the next review cycle, including names of involved personnel.

These documents should be easily accessible to all administrators and program faculty, as they will include valuable information for implementing changes over the coming years.

The importance of program reviews

Regular program reviews are essential for sustained institutional growth and success, especially now that many prospective students are questioning the value of higher education. 

The benefits of academic program reviews include but are not limited to:

  • More engaged students: Finding ways to improve program relevance and rigor can help students stay engaged with their studies, leading to higher student retention rates.
  • Increased enrollment: Continuously improving your programs can help you differentiate your offerings from those of competing institutions, drawing more students each year.
  • Better student outcomes: Better programs mean students are better prepared to take on advanced courses, pursue further education, and perform well at jobs in their chosen field.
  • Accreditation: Regular program review helps institutions adapt to changing standards within specific disciplines, making it easier to maintain their accreditation status.

Who is involved in an academic program review?

Most academic program reviews involve several groups, one for each phase of the process. Each group comprises members from different parts of the institution.

Internal self-study committee

A typical self-study committee will include:

  • Department chair
  • Program directors
  • Program faculty
  • Staff administrators

The committee may also include faculty from other programs to help guide the process along, hold program faculty accountable, and add fresh perspectives to the discussion.

In addition to completing the tasks involved in a typical self-study, this committee will be responsible for appointing and reaching out to external reviewers. 

External review committee

Similar to bringing in institutional faculty from outside the program, the purpose of the external review committee is to broaden your perspective and hold the program accountable for its work. 

This committee's key responsibilities include:

  • Reviewing the self-study reports.
  • Performing an onsite evaluation of the program.
  • Delivering a final report summarizing their findings and recommendations.

Members of the external review committee should be reputable peers with demonstrated experience and proficiency in the program's areas of specialization. These individuals may be from your state, or they may come from other states and territories to prevent conflicts of interest from affecting the review process.

Additionally, the size of this committee will depend on the type of program you are reviewing. At a minimum, you should find two reviewers for non-degree programs and three for degree programs.

Other key stakeholders

Including other institutional stakeholders in the review process helps your self-study committee build a more substantial report. For example, input from current students and alumni can reveal areas for improvement that the program faculty may not have considered otherwise.

Some important academic program review stakeholders to include are:

  • Faculty members
  • Students
  • Alumni
  • Industry professionals
  • Institutional administrators 
  • Donors and trustees

Online surveys distributed via email are an effective way to source feedback from stakeholders. It's also important to have a plan for how you will store and use the data you gain, as your chosen system will have a direct impact on the quality of the insights you'll be able to generate.

5 important components of academic program reviews

As with any assessment, academic program reviews have a list of key parts that your team should address. Using the right approach can help you ensure you're covering all your bases and conducting a complete evaluation.

The following are some of the most effective academic program review methodologies for all institutions.

1. Curriculum mapping

Mapping courses within the program to learning outcomes helps demonstrate relevance, effectiveness, and overall coherence. 

Some of the program insights curriculum mapping can provide include:

  • Course sequence: Curriculum mapping reveals whether the order of courses in a program makes sense for achieving the desired learning outcomes. For example, do students master foundational concepts before moving on to the advanced courses that build on them? 
  • Learning gaps: There may be gaps in the course sequence that leave certain desired skills and concepts behind. Visually mapping outcomes to courses helps identify where these gaps are and how to resolve them.
  • Mission alignment: When you can connect each course to specific learning outcomes, you can determine whether the program achieves its mission. 

A curriculum management software solution streamlines the curriculum mapping process by providing a centralized hub for materials like lesson plans and educational resources. 

You should be able to easily pull these materials and evaluate them without having to switch back and forth between systems — this lack of friction will help you save time and prevent errors from making their way into your final review documents.
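For teams that keep their curriculum map as structured data rather than a spreadsheet, the learning-gap check described above can even be automated. The minimal Python sketch below is an illustration only, not part of any particular curriculum tool; every course and outcome name in it is hypothetical.

```python
# A curriculum map as structured data: each course is mapped to the
# program learning outcomes it addresses. All names are hypothetical.
curriculum_map = {
    "101 Intro": {"critical_thinking", "written_communication"},
    "201 Methods": {"research_skills", "critical_thinking"},
    "301 Capstone": {"research_skills", "oral_communication"},
}

program_outcomes = {
    "critical_thinking",
    "written_communication",
    "oral_communication",
    "research_skills",
    "quantitative_reasoning",  # intentionally left unmapped to show a gap
}

def find_learning_gaps(curriculum, outcomes):
    """Return the program outcomes not covered by any course in the map."""
    covered = set().union(*curriculum.values())
    return outcomes - covered

# Outcomes that no course currently addresses, i.e. learning gaps
gaps = find_learning_gaps(curriculum_map, program_outcomes)
```

A report built from `gaps` would flag any stated outcome the current course sequence never touches, which is exactly the kind of gap the mapping exercise is meant to surface.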

2. Conducting stakeholder surveys

One of the best sources of information regarding program quality is the people directly involved in it. Surveys are a simple yet incredibly valuable method for gathering data from all program stakeholders, such as:

  • Students: End-of-course surveys provide valuable feedback on the effectiveness of each course currently required for program completion. 
  • Alumni: Surveying recent graduates about their experience reveals how well the program prepared them for their next steps, such as pursuing a higher credential or securing a well-paying job within their field. 
  • Faculty: Faculty surveys can provide valuable insider information about the program, such as resource availability and assessment effectiveness. Instructor input can also reveal whether there is a disconnect between faculty members and students regarding student engagement.
  • Employers: You may also consider surveying local employers to understand whether the program's reputation affects their hiring decisions. For example, if a company has hired program alumni, were they satisfied with their performance?

A digital survey tool that integrates into a holistic data collection and analytics solution helps you maximize your survey results by allowing you to run surveys, analyze your results, and generate effective reports all from one unified place. Plus, by consolidating your institution's tech stack, you can maximize efficiency and save valuable time and money.
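As a rough illustration of what that analysis might look like once responses are exported, here is a minimal Python sketch that aggregates end-of-course ratings by course. The course names, rating scale, and record format are all hypothetical, not the schema of any specific survey tool.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical exported survey records: (course, 1-5 rating, free-text comment)
responses = [
    ("BIO 101", 4, "Great labs"),
    ("BIO 101", 5, "Engaging lectures"),
    ("BIO 210", 2, "Pacing felt rushed"),
    ("BIO 210", 3, "More worked examples would help"),
]

def summarize_by_course(records):
    """Group ratings by course and report the mean rating and response count."""
    by_course = defaultdict(list)
    for course, rating, _comment in records:
        by_course[course].append(rating)
    return {
        course: {"mean_rating": mean(ratings), "n": len(ratings)}
        for course, ratings in by_course.items()
    }

summary = summarize_by_course(responses)
```

Even a simple summary like this makes it easy to spot which required courses are drawing consistently low feedback and deserve a closer look during the self-study.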

3. Reallocating and redistributing resources

One of the biggest challenges in academic program reviews — and higher ed in general — is balancing institutional costs with educational quality. 

Many institutions have begun cutting less popular programs as a way to save funds, but taking this action often puts affected students in a tough position. Instead, it can help to approach the program review process with the intention of determining where and how to most effectively distribute fiscal resources among departments.

Some examples of strategic reallocation decisions that a program review could prompt include:

  • Turning underperforming majors into minors or non-degree programs.
  • Combining two similar programs to create one that is more comprehensive.
  • Reducing the number of sections for courses that lack sufficient enrollment.

Approaching your program review with institutional finances in mind will help you come up with effective solutions for preserving underperforming programs. 

4. Evaluating faculty performance

A program's instructors are just as important as the concepts each course covers. That's why you should include faculty evaluations as part of your program review process.

Understanding how an instructor's teaching methods and ability to build rapport with students impact learning outcomes adds another dimension to your program review that can help you determine the best course of action to take.

Items to assess in a faculty evaluation include:

  1. Does the instructor hold their students to high standards?
  2. What kinds of resources does the instructor use to create lesson plans and monitor student success?
  3. How well does the instructor adapt their approach to engage more students?
  4. Does the instructor reinforce the importance of achieving behavioral outcomes like regular attendance, collaboration, and self-sufficiency?
  5. Does the instructor incorporate civic-mindedness and diversity into their lesson plans and teaching methodology when appropriate? 
  6. To what extent does the instructor work together with their colleagues and administrators to improve student outcomes?

All of this information will serve as important evidence in creating your action plan.

5. Interpreting labor market data

Landing a good career within their field is one of the top reasons students choose to pursue higher education. 

Labor market information (LMI) is an excellent tool review committees can use to inform their evaluations, as it can reveal whether programs are meeting the needs of today's students. 

Here are some ways review teams can incorporate LMI into an academic program review:

  • Skills mapping: Similar to curriculum mapping, skills mapping connects a program's courses and experiential learning opportunities to the skills employers are looking for. After the review period, program leaders can use this skills map to determine how best to close any skills gaps that may have appeared.
  • Strategic planning: With access to the most current labor market trends, program leads can more accurately devise an action plan that addresses specific deficiencies and strengths within their programs.
  • Adjusting to meet demand: LMI can reveal which career paths are growing fastest, which is a great indicator of which programs will need to grow or downsize moving forward. As mentioned previously, these insights are helpful for determining the best ways to allocate and distribute resources among academic units.

Using review findings to improve effectiveness and success

After the assessment is complete and your administrators wrap up their final review, it's time to implement your action plan for the program.

The goal here is to use evaluation data to close the assessment loop — to take the correct actions based on the insights you gained from your review. That's where the concept of data-driven decision-making comes in. The more evidence you have to back up your actions, the more effective your decisions are likely to be. And that evidence can only come from a strong analysis. 

Integrated planning and self-study software tools help by:

  • Pulling necessary data from various sources within your institution's tech stack.
  • Storing review data in a centralized hub for easy access and analysis.
  • Enhancing collaboration between self-study committee members. 
  • Making insights more accessible with interactive dashboards and rich visualizations.
  • Putting your data in context for better, more accurate insights.

Ideally, the solution you use will be user-friendly and intuitive. Easily navigable software reduces the time it will take for your faculty to learn how to use it effectively — and the amount of time you'll need to spend refreshing their memory during the next review cycle. 

Facilitating continuous improvement of academic programs

It's important to remember that academic program review is a cyclical process. Most institutions follow a five-, seven-, or 10-year cycle, but your institution may need different intervals based on its specific requirements.

It is also an agile process that allows for adjustments in real time. Each review is iterative, improving on the changes made in previous cycles — treating the process as a one-and-done endeavor defeats the purpose of academic program review. 

Smaller program evaluations should take place each year until the next review cycle begins. That's why monitoring key performance indicators (KPIs) between review cycles is so important. 

Important KPIs to track between review years

Sustainability and agility are essential for navigating the current higher ed landscape. Tracking KPIs on a quarterly or yearly basis provides concrete evidence of your action plan's effectiveness, so you can make adjustments that will enable sustained progress. 

Some helpful metrics to monitor include:

  • Graduation rates
  • Attendance
  • Completion rates
  • Achievement of learning outcomes
  • Student retention
  • Program enrollment
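If your institution exports these KPIs on a regular schedule, even a short script can flag programs drifting off track between review cycles. The sketch below is a minimal illustration under assumed data: the yearly snapshots and the decline threshold are hypothetical, not a prescribed methodology.

```python
# Hypothetical yearly KPI snapshots for one program between review cycles.
kpi_history = {
    2022: {"retention_rate": 0.85, "enrollment": 240, "graduation_rate": 0.62},
    2023: {"retention_rate": 0.82, "enrollment": 255, "graduation_rate": 0.64},
    2024: {"retention_rate": 0.78, "enrollment": 260, "graduation_rate": 0.65},
}

def flag_declines(history, metric, threshold=0.05):
    """Return the years in which `metric` fell more than `threshold`
    (in absolute terms) below its value in the earliest recorded year."""
    years = sorted(history)
    baseline = history[years[0]][metric]
    return [y for y in years[1:] if baseline - history[y][metric] > threshold]

# Years where retention slipped noticeably from the 2022 baseline
flagged = flag_declines(kpi_history, "retention_rate")
```

A check like this doesn't replace the full review; it simply gives the committee early, concrete evidence of where the action plan is and isn't working.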

Building a culture of continuous improvement can help make it easier to implement action plans and adapt to new challenges in the program when they arise. 

Faculty and administrators should be open to the possibility of change and hold themselves accountable for their current performance. Additionally, everyone within the institution should be encouraged to contribute suggestions and honest feedback to help identify areas for improvement at the program level.

Simplify academic program reviews with Watermark Planning & Self-Study

The proper tools help make program reviews a breeze. Leveraging emerging technologies like those in the Watermark Educational Impact Suite (EIS) simplifies academic program reviews by streamlining analytics and data storage.

Our award-winning Planning & Self-Study solution enables you to complete your program reviews on time and easily share your findings with key stakeholders.

Plus, complete integration with all other Watermark products means you can seamlessly pull data on demand from anywhere for powerful, actionable insights. Our recent strategic partnership with Lightcast, for example, gives users access to the most current labor market data available so departments can see how their courses connect to the demands of the modern workforce.

Request your live demo today to see how easy it is to complete each step of the program review process with our centralized solution.

