Blog: Idea Exchange

How Ithaca College is Assessing General Education Outcomes with e-Portfolios and Engaging Faculty in the Process

Courtney Peagler April 9, 2015

Last month, I moderated a webinar we hosted, “Engaging Faculty across the Institution in General Education Portfolio Assessment,” featuring Danette Ifert Johnson, PhD, Vice Provost of Academic Programs at Ithaca College.

The webinar was brimming with great ideas for using e-portfolios for general education outcomes assessment and engaging faculty in the work. If you have an hour, I highly recommend watching the recording. But if you don’t have the time right now, here is a synopsis of some of the key points from the webinar.

Institutional Context
Ithaca College

• Private, Master’s institution located in Ithaca, NY
• Approximately 6,200 undergraduate and 400 graduate students across five schools
• Regional accreditation: Middle States Commission on Higher Education
• Launched first campus-wide general education program in 2013

Accomplishments to Date

• Completed evaluations for 7 of 11 general education program components
• Over 3,000 students have begun developing general education learning e-portfolios
• Over 50 faculty members participating in portfolio evaluations, analysis, and action plan development, including faculty who teach in the general education program and discipline faculty who do not

e-Portfolios Encourage Integrative Learning for Students and Capture Evidence of Learning

• e-Portfolios enable students to demonstrate, and the institution to evaluate, integrative thinking
• e-Portfolios provide a starting point for professional portfolios that students can build over time
• Ithaca’s homegrown e-portfolio platform couldn’t support the types of assessment the college wanted to do
• Started with a pilot semester before rolling out the e-portfolio to all incoming students
• Students are required to complete the portfolio as a graduation requirement


Students choose the artifacts they submit to demonstrate achievement of general education learning outcomes. The e-portfolio is organized by learning outcomes rather than courses, with each outcome represented in the portfolio’s left-hand navigation. Faculty evaluate each section of the portfolio; holistic portfolio evaluations are planned for the future.

“We thought about assessment from the beginning.”

• General education courses align with outcomes and must include opportunities for students to produce work that demonstrates their achievement of the aligned outcomes
• Developed rubrics for each general education outcome
• Adapted existing rubrics (e.g., AAC&U VALUE rubrics) when appropriate
• Faculty evaluate each component of the e-portfolio – plan to add holistic evaluation later
• Use random and stratified random sampling to select artifacts for assessment; sample size depends on several factors, but target at least 10% of the population for evaluation each cycle
• Evaluations take place over two days for each component, with the first half day spent on norming activities using a common artifact
• Collect both quantitative and qualitative data through rubric scores and comments


Ithaca developed rubrics for each learning outcome, leveraging existing work, such as the AAC&U VALUE Rubrics, when appropriate. Faculty contributed to the rubric development. Evaluators are encouraged to use the comment boxes to contribute qualitative data in addition to numeric rubric scores.

Ten Practices Ithaca Uses to Engage Faculty

1. Make sure the technology works for student and faculty needs before vetting it with IT
2. Structure the e-portfolio to reflect outcomes rather than courses – this encourages faculty engagement institution-wide, not just within general education
3. Solicit faculty volunteers when designing rubrics
4. Keep evaluation focused on student learning, not on judging a particular course or instructor
5. Offer stipends and food (coffee!) to recruit faculty evaluators for the two-day scoring sessions
6. Encourage faculty to add comments to rubric evaluations – contributing qualitative feedback helps them feel they are doing more than generating numbers
7. Use intact groups where possible (e.g., existing committees)
8. Set clear expectations and a finite timeline – ask faculty to do one concrete activity rather than several
9. Go beyond the “usual suspects” by mixing general calls for volunteers with targeted outreach to specific individuals
10. Share the assessment results and remind everyone regularly where they can easily access them

Again, if you have the time, I highly recommend watching the webinar recording for Danette’s full presentation. She did a tremendous job sharing the work Ithaca is doing, and it is great to hear the story directly from her.

Bonus: Follow-up Answers to Questions Raised During the Webinar

We didn’t have time in the webinar to respond to all of the questions from the audience. However, we were able to follow up with Danette for more answers. Below are her responses to some of those questions.

Engaging Faculty and Students

Q: What is encouraging faculty to go through the process of making their course part of the e-portfolio?
A: Because the electronic learning portfolio is part of our general education program, all courses within the general education program participate in the portfolio. Some faculty participate because they are interested in doing so and see the new general education program as a way to develop new courses of interest to them or to expand the reach of existing courses, some simply expect to be part of general education because of their disciplines, and some participate to enhance enrollments in courses they teach.

Q: How are students motivated to use the portfolio as you want them to? What is the value to them?
A: Students are required to complete the portfolio as a graduation requirement. We are also working on additional ways of engaging students with the portfolios (e.g., awards for best portfolio, considering a pilot of alternate portfolio formats so students can select the format that works best for them). The portfolios are valuable to students after graduation for a couple of reasons. First, by engaging in the integration and reflection required during the capstone experience, students can articulate how their educational experience fits together and the full set of knowledge, skills, and competencies they have developed – a benefit when talking with prospective employers or graduate schools. Second, the learning portfolio provides a starting point for an ongoing professional portfolio that students can modify over the course of their lives.

Q: How are students integrating their learning ACROSS learning outcomes? Are there assignments designed for this purpose?
A: There are a couple of different ways that students are integrating across learning outcomes. One is via a capstone experience (which is being piloted this spring) where students explicitly draw together their learning. A second is through the reflective component of the portfolio, where they are asked at multiple points to discuss how what they have learned in a specific course links to what they’ve learned in other courses. Because we’re in year two of the curriculum implementation, we have not yet evaluated these components but have a plan in place to do so as we move forward.

Sampling

Q: What is your sample size? Are ALL students uploading their work?
A: All students are supposed to be uploading their work. Right now, about 80% of sophomores have submitted work from the required first-year course. Sample size for each evaluation session differs depending on the number of evaluators and how many student artifacts they can reasonably evaluate in two days (e.g., humanities artifacts tend to be longer papers, so fewer of those can be evaluated), but we target at least 10% of the population for evaluation in each cycle.

Q: What is your process for sampling artifacts to be assessed?
A: We use random and stratified random sampling to select artifacts. The first selection is random and then the sample demographics are compared to the overall demographics of the population. If the sample demographics are closely representative, no additional sampling is necessary. If the sample demographics are substantially off from the population demographics (e.g., if our percentage of transfer students were 9% in the population but 2% or 16% in the sample), we would move to stratified random sampling to get closer to the population demographics.
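For readers who want to try this at home, the procedure Danette describes can be sketched in a few lines of Python. This is a hypothetical illustration, not Ithaca’s actual implementation: the artifact fields, the demographic grouping function, and the 5-percentage-point “substantially off” tolerance are all assumptions.

```python
import random
from collections import Counter

def sample_artifacts(population, n, strata_key, tolerance=0.05, seed=None):
    """Draw n artifacts for evaluation: simple random sampling first,
    falling back to stratified random sampling if the sample's
    demographics drift too far from the population's.

    `strata_key` maps an artifact to its demographic group (e.g.,
    transfer vs. non-transfer). The 0.05 tolerance is an assumed
    threshold, not Ithaca's actual cutoff."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)

    def shares(group):
        # Proportion of the group belonging to each demographic stratum.
        counts = Counter(strata_key(a) for a in group)
        return {k: v / len(group) for k, v in counts.items()}

    pop_shares = shares(population)
    samp_shares = shares(sample)

    # If every stratum's share in the sample is within tolerance of its
    # share in the population, keep the simple random sample.
    if all(abs(samp_shares.get(k, 0.0) - p) <= tolerance
           for k, p in pop_shares.items()):
        return sample

    # Otherwise, stratify: draw from each stratum in proportion to its
    # share of the population.
    by_stratum = {}
    for a in population:
        by_stratum.setdefault(strata_key(a), []).append(a)
    stratified = []
    for k, members in by_stratum.items():
        take = round(pop_shares[k] * n)
        stratified.extend(rng.sample(members, min(take, len(members))))
    return stratified
```

To hit the 10% target mentioned above, one might call this with `n = max(1, len(artifacts) // 10)`.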

Q: How do you determine which artifacts are scored more than once?
A: Once the evaluation sample is set, a subset of those artifacts is randomly selected to be evaluated by multiple evaluators.
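That selection step is simple enough to sketch as well. Again, this is an illustrative snippet under assumed parameters – the 20% double-scoring fraction is my placeholder, not a figure Ithaca reported.

```python
import random

def double_score_subset(sample, fraction=0.2, seed=None):
    """Randomly pick a subset of the evaluation sample to be scored by
    multiple evaluators (e.g., for inter-rater reliability checks).
    The default 20% fraction is an assumption, not Ithaca's figure."""
    rng = random.Random(seed)
    k = max(1, round(len(sample) * fraction))
    return rng.sample(sample, k)
```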

As the Vice President of Strategic and Business Development at Taskstream, Courtney Peagler works with educators and policymakers to understand the assessment, accountability, and e-portfolio needs in Higher Education and K-12 and how those needs could be and are currently addressed by the company. Prior to joining Taskstream in 2007, Courtney’s experience in educational technology included e-learning instructional design, the evaluation of emerging technologies for education, and assisting educators in their use of technology. Courtney earned her M.A. in Educational Communication and Technology from New York University’s Steinhardt School of Education and her B.A. in Psychology magna cum laude from Harvard.

Author
Courtney Peagler
Watermark