AMS: Assessment & Beyond (Not your mama’s accreditation tracker!)

September 18, 2023 Jessica Egbert

When I first began using Taskstream’s Accountability Management System (AMS) in 2010, I believed the implications for our developing assessment and continuous improvement processes would be magical: assessment genies would load data, obtain constituent buy-in, and ensure ongoing engagement. While I’m still waiting for the genies to arrive (note: they don’t exist; a community of engaged participants is essential!), we’ve discovered a variety of ways to utilize the system to support programmatic assessment, functional assessment, strategic planning, accreditation, and continuous improvement. We have also built virtual exhibit rooms, a repository for policies and handbooks, and an area for the board of trustees. We do not have the Learning Achievement Tools companion system, but I’ll talk to the genies about that later.

For those who are unfamiliar with AMS, we manage all of these processes using ‘workspaces’. Workspaces are essentially the backbone of AMS, providing dedicated spaces to establish a common framework, specify requirements, and collaborate with individuals across the institution. We can customize each workspace template for a specific need (e.g., programmatic assessment, strategic planning). I’ll share more details and examples of these later.

Our most recent experiment with AMS is tracking and reporting compliance for the Northwest Commission on Colleges and Universities (NWCCU). The NWCCU recently revised its regional accreditation guidelines around the concept of “Core Themes.” Core Themes are manifestations of the institution’s mission, and the fulfillment of the objectives, outcomes, indicators, and targets associated with them ultimately demonstrates mission fulfillment. Piece of cake, right?!

Once we defined our four Core Themes (Developing Evidence-Based Practitioners, Elevating Clinical Inquiry Proficiency, Ensuring Educational Quality, and Nurturing Student Success) and the 70 associated indicators, it wasn’t long before we opted to utilize AMS to demonstrate fulfillment over time. We replicated a simple workspace template we currently use for strategic plan tracking, which focuses on operational outcomes rather than the cyclical assessment process. Coupled with our existing workspaces for programmatic and functional area assessment and continuous improvement, this new area provides a central, summative repository for our Core Themes efforts.

The differences between workspace configurations are demonstrated below, where you can see the simplicity of the Core Themes Tracking workspace compared with an academic program assessment and continuous improvement process. Note that the latter encompasses the flow of a typical assessment process (e.g., assessment plan, findings, continuous improvement plan, results), while the former is simply an area to “prove it” in one workspace. We anticipate that for our April 2015 NWCCU site visit, we’ll simply give our evaluation team access to the Core Themes Tracking workspace as a virtual exhibit room. (They can have an account with read-only access to any area of AMS.)

Sample Core Theme workspace structure
Sample Academic Program Assessment & Continuous Improvement workspace structure

Within the Core Themes Tracking workspace, we’ve identified the Core Themes, objectives, and outcomes as a “standing item.” “Developing Evidence-Based Practitioners” is our first Core Theme; its objectives are included in the description, with the outcomes immediately following. The Provost, who is our Accreditation Liaison Officer, primarily manages this workspace. Participants engaged in data collection include a diversity of individuals with assessment responsibilities, including program directors, faculty, staff, and the University Administration.

Sample Core Theme, objectives, and outcomes

Our 70 indicators each have one or more targets. Indicators include items such as student learning outcome achievement (with accompanying rubrics), survey data, publication data, and professional licensure pass rates.

I have included a simple example below related to this first Core Theme, in which Exit Interview Survey Data is the indicator and the target level is 80% agreement regarding the students’ enhanced ability to synthesize evidence-based principles into realistic practice settings. We opted to break this down by individual programs (when there were at least three respondents) as well as provide an institutional mean. We also attached the institutional survey summary as an exhibit. Because we were in compliance for 2014, no follow-through action is required beyond continued monitoring. However, should the score fall below 80%, a specific action would take its place and indicate the program or functional area accountable for that improvement. That program or functional area would then include this within its own assessment and continuous improvement workspace.

Sample Core Theme, action item and status update
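For readers who like to see the arithmetic, here is a minimal sketch in Python of how per-program agreement rates, the three-respondent breakout threshold, and the institutional mean could be computed against the 80% target. The programs, responses, and variable names are purely illustrative stand-ins; AMS itself is not scripted this way, and these are not our actual data.

```python
MIN_RESPONDENTS = 3   # programs with fewer respondents get no individual breakout
TARGET = 0.80         # 80% agreement target for this indicator

# Hypothetical exit-interview responses: program -> agree (True) / disagree (False)
responses = {
    "Program A": [True, True, True, False, True],  # 4/5 = 80%
    "Program B": [True, True, False],              # 2/3 = 67%
    "Program C": [True, True],                     # under threshold, institutional only
}

all_flags = []
for program, flags in responses.items():
    all_flags.extend(flags)
    if len(flags) >= MIN_RESPONDENTS:
        rate = sum(flags) / len(flags)
        status = "met" if rate >= TARGET else "action required"
        print(f"{program}: {rate:.0%} agreement ({status})")

# Institutional mean across every respondent, including small programs
institutional = sum(all_flags) / len(all_flags)
status = "met" if institutional >= TARGET else "action required"
print(f"Institutional mean: {institutional:.0%} ({status})")
```

The same threshold check mirrors the decision described above: at or above 80%, continued monitoring suffices; below it, a follow-through action is assigned to the accountable program or functional area.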

I anticipate that as we add additional years of data, we will need to convert the narrative to an attachment and include only the institutional mean in the narrative. For now, though, this is more convenient for readers, who can explore the data without clicking through additional attachments.

We continue to find new uses for AMS in managing our data, processes, and the communication thereof. A willingness to experiment with the system and make changes has improved our overall effectiveness in achieving and demonstrating mission fulfillment. In that regard, we’ve had plenty of “learning experiences” along the way! Below are just a few we recognized from this project.

Lessons Learned

  • Don’t forget mapping! AMS has some great reports based upon mapping. If you are diligent with mapping, you’ll easily be able to answer questions such as “In what areas are you measuring clinical inquiry?” You’ll be able to enthusiastically reply, “Let me show you!” and crank out a quick report showing not only that clinical inquiry is mapped across all degree programs, but that there are also associated measurements to prove it! (This is called “Assessment Nerd Heaven.”)
  • AMS has character limitations for some narrative areas. When utilizing the same workspaces over a long period, be prepared to transition narratives into attachments. Ideally, a narrative will include the date of entry and a few summary sentences that reference an attachment or other evidence.
  • As with any new process, buy-in and results take time. Thirty-two academic measures and 70 total measures (many of which are new) add to workloads in terms of learning a new process and collecting the appropriate data. Occasionally, findings will reveal the measurement or target level is not quite right and needs to be modified; fortunately, that’s why we go through these processes! For example, we realized it was not only unrealistic but inappropriate to expect 90% of students to complete a course deliverable at 85% or higher on the course rubric. This was both contrary to minimum pass rates and excluded any deliverable that might be graded on a unique scale. By changing this expectation to “80% or equivalent on a scoring rubric,” we provided the flexibility to support broader performance standards and grading scales (e.g., a pass/fail scale for a reflective practicum) while still supporting the high academic performance we expect of our graduate students. Demonstrating flexibility both helps obtain buy-in and supports accreditation efforts toward continuous improvement.
  • Consistent coding is essential. Our coding scheme for the Core Themes workspace is lengthy, but it makes sense to us. Ensure all your items follow a coding scheme so you can better track and communicate your findings. For example, you might number each of your primary items (e.g., outcomes or Core Themes) as 1, 2, etc. The next layer would include 1.1, 1.2, 2.1, 2.2, 2.3, and so forth for each assessment item or measure underneath the respective outcome. If you need additional layers, consider appending letters (e.g., 1.1A, 1.1B) to demonstrate the relationship; see the sketch after this list.
  • Think about your readers. Consider who will be viewing your workspace. While we might have a tendency to use acronyms for efficiency, you don’t want to frustrate an accreditor who needs an “acronymdex” to understand what you’re referring to. Especially consider spelling out degree program names. I once spoke with a colleague at another institution who kept referring to their “med” program. “Med? I didn’t know you had a medical school.” (It turned out to be a Master of Education (MEd).)
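If you want to sanity-check a layered coding scheme before committing to it, a few lines of code can generate the labels for you. The Python sketch below is purely illustrative: the theme, indicator, and target names are made-up placeholders, and AMS does not generate these codes; the point is simply that the 1 / 1.1 / 1.1A pattern nests cleanly.

```python
from string import ascii_uppercase

# Hypothetical hierarchy: Core Theme -> indicators -> targets
core_themes = {
    "Developing Evidence-Based Practitioners": {
        "Exit Interview Survey Data": [
            "80% agreement on synthesis of evidence-based principles",
        ],
        "Professional Licensure Pass Rates": [
            "First-time pass rate at or above the national mean",
        ],
    },
}

# Number themes 1, 2, ...; indicators 1.1, 1.2, ...; targets 1.1A, 1.1B, ...
for i, (theme, indicators) in enumerate(core_themes.items(), start=1):
    print(f"{i} {theme}")
    for j, (indicator, targets) in enumerate(indicators.items(), start=1):
        print(f"  {i}.{j} {indicator}")
        for k, target in enumerate(targets):
            print(f"    {i}.{j}{ascii_uppercase[k]} {target}")
```

However you produce them, the payoff is the same: every finding can be traced back through its code to the indicator and Core Theme it supports.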

About The Author
Dr. Jessica Egbert is Vice President of Institutional Effectiveness and Community Engagement with Rocky Mountain University of Health Professions, a healthcare graduate institution in Provo. In her role, Dr. Egbert directs systematic assessment and continuous improvement processes to enhance institutional effectiveness via achievement of performance-driven outcomes. She also supports the development and documentation of the University’s comprehensive strategic plan, accreditation-related activities, and other strategic initiatives. Dr. Egbert serves as a University representative to the community, supporting community, strategic, and service-related efforts.

She has over 20 years of expertise in informatics, planning and assessment, and project management in healthcare and education. Dr. Egbert holds a baccalaureate degree in psychology, a master’s degree in education with an instructional technology emphasis, and a Doctor of Philosophy in educational leadership. Her current research interests are in non-cognitive factors of hybrid doctoral education, on which she has recently published in the International Journal of Web-Based Learning and Teaching Technologies.

Dr. Egbert has presented nationally on data-driven assessment, planning, institutional effectiveness, and accreditation. She is a community advocate and served on the Utah Valley Chamber of Commerce Women’s Business Network Board of Directors for five years, including serving as Chair for two years, and has served on the Utah Valley Chamber of Commerce Board for several years, including as Vice Chair of the Education Committee and presently serving as Chair of the Strategic Planning Committee. She also serves as the Co-Chair of the Provo City Strategic Plan Steering Committee. She has served on the Board of Directors of the Central Utah Chapter of the American Red Cross for two-and-a-half years and currently serves as Chapter Board Chair. Additionally, Dr. Egbert is a Board member with the Utah Women in Higher Education Network.

In 2014, she was included as one of the “Fab 40” in Utah Valley Magazine and is the recipient of numerous performance and service awards. An extrovert by day and introvert by night, her favorite things are her amazing hubby, her snuggly dogs, inspiring powerchicks, excessive ice cream, and the occasional shark encounter.

