Land-grant institutions have a specific mission to share the fruits of research with the citizens of their state through extension, bringing vital, practical information to agricultural producers, small business owners, consumers, families, and young people. One major stakeholder in the work of land-grant institutions is the National Institute of Food and Agriculture (NIFA), part of the United States Department of Agriculture's (USDA) Research, Education, and Economics mission area. Because NIFA invests in and supports Cooperative Extension Service initiatives, it looks to the institutions whose research and outreach it funds, such as land-grant universities, to report on their programs and impacts. We reached out to Purdue University’s Alee Gunderson, Associate Data Analyst at the Purdue College of Agriculture & Purdue Polytechnic, to hear about her experience collaborating with Faculty Success (formerly Digital Measures) by Watermark to configure Purdue’s system to meet NIFA reporting challenges.
The Challenge: Complex Reporting Requirements
The most pressing need at Purdue’s College of Agriculture, which includes the Cooperative Extension Service, was to create central data collection and reporting to meet its federal reporting obligations. “One of the biggest challenges for our reporting is that three populations come into it: faculty, extension educators, and extension specialists, who fall somewhere between academic and extension educators,” Gunderson explained. “In addition, some of them are reporting on the academic calendar, and some on the federal calendar. We needed screens for different user needs, but also needed to bring data from those screens together into one number.”
Prior to implementing Faculty Success (formerly Digital Measures), information came from a system that produced reports but didn’t expose the underlying data. The team ran reports from the system, but had no way to find incorrectly recorded data or to verify that the system was pulling all of the correct data. “The system wasn’t transparent. We worried about copying and pasting and referencing across a bunch of paths. It took a lot of labor, and we couldn’t always be sure of accuracy,” Gunderson shared. “With Faculty Success (formerly Digital Measures), we can run ad hoc reports to see the data with more granularity and review report templates to see where the data is coming from.”
The Importance of Configuration
Many of the College of Agriculture’s reporting requirements could technically be handled as ad hoc or self-service reports, especially once the data fields were in place on user screens. In fact, during their first year with Faculty Success (formerly Digital Measures), Gunderson’s team tried to create their NIFA reports by aggregating data from several Faculty Success (formerly Digital Measures) ad hoc reports, which proved complex and time-consuming. “For reports that are used regularly, like Purdue’s NIFA reporting, the configuration is critical. We use our expertise gained helping hundreds of clients to provide robust reports that don’t require rethinking each time they’re used. That saves clients many hours of unnecessary work, and provides more reliable and accurate output,” said Andrew Wiech, Faculty Success (formerly Digital Measures) Client Success Manager for Purdue.
Gunderson concurred. “Our data was everywhere, and it was too hard to put things together. With the NIFA reports configured in Faculty Success (formerly Digital Measures), we can run them and look for data inaccuracies, and when numbers don’t add up, we have time to email faculty so they can correct it,” she said. “Now, we can verify data, and do it at a measured pace. It has reduced a lot of errors and made our NIFA reporting much more manageable.”
For NIFA reporting, there are many different ways to report on data to tell the story of how an institution is fulfilling its plan of work, both qualitatively and quantitatively, Wiech noted. Numbers come in for direct stakeholders, but impact statements are more anecdotal. “Knowing what you need your data to tell you is the starting point for successful faculty activity reporting,” Wiech said. “Start with outcomes.”
Collaboration Is Key
“Clients know their data, and what they need from it, and we know the tool, so building database fields and screens is a consultative process,” said Daniel Farmer, Faculty Success (formerly Digital Measures) Developer. “Alee is very collaborative, which really helps. She comes to us with a vision of what she’s trying to accomplish, and is receptive to feedback and ideas from the Client Success and Development teams that take her vision to the next level.”
“Andrew and Daniel have both been so helpful. I’d send Andrew an email saying, we’re about to do this, is that okay? And Andrew can point out things that might happen that we hadn’t considered when we planned the change,” Gunderson said. “And the developers are creative, thoughtful people. They often know a better way to accomplish our end goal, or can outline the consequences of a proposed change. Daniel’s really good about saying, if we do this, X and Y will happen.”
“I sent the dean a report requested by the provost, and she was thrilled. She said she’d been up all weekend, entering and formatting information, ‘and you just sent it to me from Faculty Success (formerly Digital Measures)!’” Gunderson shared.
How did the first NIFA reporting season go? “We are running the custom reports for NIFA now and the team can clearly see the data and who we need to ask for clarification on their entries,” Gunderson said. “Will there be refinement, some tweaking? Sure. But we have an entirely new level of confidence in our NIFA reporting.”