Greetings from NYC, where we can see the light at the end of a weird, warm winter in which almost all of our snow fell in one near-record snowstorm. It’s hard to believe it has been only a few months since we launched Outcomes Assessment Projects (formerly Aqua). With more and more institutions coming on board, we are getting great feedback and insight into how they are using it to advance their assessment efforts. It has been especially enlightening to see how current AMS and LAT clients are fitting Outcomes Assessment Projects into their workflows as we explore ways to make improvements across the product suite.
We just released some updates to Outcomes Assessment Projects that I’m excited to share here.
It’s even easier to score!
At the heart of Outcomes Assessment Projects is an evaluator experience that makes the task of scoring student work – lots of work – easy and engaging. Scorers are presented with a randomized queue of work to score that respects the rules for which work they can score. It eliminates the stress of organizing, distributing, and tracking who is scoring what.
With this last release, we deployed new technology that makes it even easier for us to rapidly improve the user interface. We used it to introduce some nice improvements to the experience for evaluators who are scoring work and need to view multiple rubrics and, possibly, multiple artifacts for that student.
We created alert icons for the instructions and the number of artifacts to be read and we moved navigation between multiple rubrics into tabs at the bottom of the screen. The instructions and the available artifacts fly out from the side menu, but the rubrics are always present at the bottom. It’s a really nice improvement that gives evaluators greater command over the information and actions they need to take. Credit for the creativity in the design goes to our amazing designers, Montse and Scarlett, but our engineering team made it all possible.
Submitting work from the Learning Management System (LMS)
One thing we consistently hear from people who see or start using Outcomes Assessment Projects is that they love that it does not require any involvement from the student. Assessment coordinators tasked with collecting work from across a number of courses feel like we’ve built a product to make their lives easier.
But asking the student to submit a clean (or second) copy of an assignment for assessment can be a huge timesaver for schools that have not already collected the work they want to assess, and could help them avoid a big redaction process. Since it would involve the student, it could also be an opportunity to engage students with the assignment and help them understand some of the student learning outcomes (SLOs) being assessed.
So we built an integration that allows students to submit work from within their LMS. When enabled (at the project level), students can submit work directly to Outcomes Assessment Projects for an assignment by following a link in their LMS course. It is a very simple interface that appears embedded inside the course in the LMS. The integration is based on the IMS LTI standard, so your LMS administrator can easily enable it.
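For the curious, an LTI 1.1 “basic launch” like this is, under the hood, a signed form POST from the LMS to the tool. The sketch below (with made-up URLs, keys, and parameter values — your actual endpoint and credentials come from the tool-provider setup) illustrates how the OAuth 1.0 HMAC-SHA1 signature on such a launch is computed:

```python
# Illustrative sketch of signing an LTI 1.1 basic launch request.
# All URLs, keys, and ids here are hypothetical placeholders.
import base64
import hashlib
import hmac
import urllib.parse


def percent_encode(value: str) -> str:
    # OAuth 1.0 requires RFC 3986 percent-encoding with no "safe" characters.
    return urllib.parse.quote(str(value), safe="")


def sign_lti_launch(url: str, params: dict, consumer_secret: str) -> str:
    """Compute the HMAC-SHA1 oauth_signature for an LTI 1.1 launch POST."""
    # 1. Encode each name/value pair, then sort them.
    pairs = sorted((percent_encode(k), percent_encode(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # 2. Build the signature base string: METHOD & URL & params, each encoded.
    base = "&".join(["POST", percent_encode(url), percent_encode(param_str)])
    # 3. The key is the consumer secret plus "&" (no token secret on a basic launch).
    digest = hmac.new(f"{consumer_secret}&".encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Hypothetical launch parameters an LMS would send for one assignment link.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",   # made-up assignment link id
    "oauth_consumer_key": "example-key",   # issued by the tool provider
    "oauth_nonce": "abc123",               # normally random per request
    "oauth_timestamp": "1456789000",       # normally the current Unix time
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
signature = sign_lti_launch("https://tool.example.com/launch", launch_params, "example-secret")
```

The resulting signature is sent along as one more form field (`oauth_signature`), and the tool recomputes it with the shared secret to verify the launch — which is why, in practice, enabling an LTI tool is just a matter of the LMS administrator entering a launch URL, key, and secret.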
Keeping track of your project
Supporting Assessment Coordinators and Directors, who are often shouldering heroic burdens running initiatives on campus, remains a priority for us. With this release of Outcomes Assessment Projects, we built a new set of tracking reports that helps those users understand how the project is progressing, with a specific view into ongoing project activity. These reports include the following:
- Assignment Activity summarizes available enrollments, submissions, and evaluations for each assignment in the project, providing a view into collection and scoring activity by assignment.
- Submission Activity goes further, and shows, for each submission, whether it has been scored yet, and by whom!
- Evaluator Activity tells you how busy (or not) evaluators have been.
- Evaluator Reliability tells you whether evaluators are agreeing on scores (when two scores have been requested).
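To give a feel for what a reliability report like the last one measures, here is a minimal sketch of one possible agreement metric — the exact-agreement rate between two evaluators who scored the same submissions. (This is illustrative only, with made-up scores; it is not necessarily the methodology the report uses.)

```python
# Illustrative sketch: exact-agreement rate between two evaluators'
# rubric scores on the same set of double-scored submissions.

def exact_agreement_rate(scores_a: list, scores_b: list) -> float:
    """Fraction of submissions on which both evaluators gave the same score."""
    if len(scores_a) != len(scores_b) or not scores_a:
        raise ValueError("need two equal-length, non-empty score lists")
    matches = sum(1 for a, b in zip(scores_a, scores_b) if a == b)
    return matches / len(scores_a)


# Made-up rubric scores from two evaluators on the same five submissions.
evaluator_1 = [3, 2, 4, 3, 1]
evaluator_2 = [3, 2, 3, 3, 1]
print(exact_agreement_rate(evaluator_1, evaluator_2))  # → 0.8
```

In practice, assessment teams often go beyond raw agreement to chance-corrected statistics such as Cohen’s kappa, but the raw rate is the easiest number to reason about at a glance.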
We have a few other goodies in the release, like a new way to quickly select courses for your project, but these are the highlights. Looking ahead to the rest of 2016, we are excited about what we are going to be able to deliver across all our products for our clients. You can expect to see and hear more from us soon!