January–February 2022
Volume 34, Number 1
Progress, Trends, and Practices in Higher Education
Capella University’s Journey Toward Sustained Excellence in Assessment
Jaclyn Zacharias, Nancy Ackerman, and Christine S. Yates

Institutional Context
Capella University’s mission is to extend access to high-quality bachelor’s, master’s, specialist, doctoral, and certificate programs for adults who seek to maximize their personal and professional potential. This mission is fulfilled through innovative programs that are responsive to the needs of adult learners and involve active, engaging, challenging, and relevant learning experiences offered in a variety of delivery modes.
All Capella curricula are competency-based, dened by scholar-practitioner fac-
ulty, and aligned to the expectations and standards of professional associations and
accreditors, state licensing boards, and respected employers. Capella offers academic
programming in the elds of business; information technology; education; nursing and
health sciences; public administration; counseling and therapy, human services and
social work, and psychology.
Integration of Curriculum and Assessment
Capella University’s assessment system is modeled on the assessment triangle (cog-
nition, observation, and interpretation) and is operationalized through a fully embed-
ded assessment model (FEAM). Assessment is one component in our academic qual-
ity framework, a logic model articulating the elements of the educational ecosystem
in terms of input, output, outcome, and impact. The assessment system is founded
on clearly stated and dened outcomes at the university, offering, and course level;
consistent development of and use of scoring guides; and a tight alignment structure
between all the curricular elements.
Outcomes. University outcomes clearly state what learners at all degree levels are ex-
pected to demonstrate during their education. Program- and specialization-level learn-
ing outcomes express the expectations of the discipline and measurable performance
standards. Competencies articulate the knowledge, skills, and dispositions learners are
expected to demonstrate and are assessed in multiple assignments in courses.
CONTENTS

ARTICLES
Capella University’s Journey Toward Sustained Excellence in Assessment, Jaclyn Zacharias, Nancy Ackerman, and Christine S. Yates (page 1)
Editors’ Notes, Stephen P. Hundley and Caleb J. Keith (page 3)
Reflections on California State University East Bay’s Excellence in Assessment Designation through the Lens of Student Learning and Success, Kevin W. Kaatz and associates (page 4)
Nurturing a Culture of Continuous Improvement: Sustaining Excellence in Assessment at the Community College of Baltimore County, Jennifer Kilbourne and associates (page 6)
Broadening and Deepening a Campus Culture of Assessment at Cameron University, Karla J. Oty (page 8)
Transformative Principles Contributing to Whatcom Community College’s Assessment Progress, Anne Marie Karlberg, Tresha Dutton, Peter Horne, and Ed Harri (page 10)
Sustaining a Culture of Assessment Excellence at IUPUI, Susan Kahn and Stephen P. Hundley (page 12)

COLUMN
NILOA Perspectives, Gianina Baker and Kate Drezek McConnell (page 14)

© 2022 Wiley Periodicals LLC • All rights reserved

Scoring Guides. The purpose of a scoring guide is to provide clear, measurable criteria and grading information for each assignment. Faculty use these criterion-referenced scoring guides to directly assess the demonstration of each competency in every assignment. The use of standard scoring guides in all sections helps ensure grading consistency across faculty and over time. Faculty judge competency demonstration as reflecting one of four performance levels: nonperformance, basic, proficient, and distinguished.
Alignment Leading to Measurement of Learning. Capella’s FEAM embeds curricular
expectations—in the form of course competencies aligned to program outcomes—into
every graded assignment and assessment instrument. Scoring guide criteria are aligned
to competencies and are, therefore, used as measures of outcome and competency dem-
onstration. As a result, faculty judgments can be aggregated over time as a measure of
learning at the unit, course, and program levels, building a rich context for interpreting
data to improve the learning experience.
Using Measurement of Learning to Evaluate Program Health
Capella conducts regularly scheduled reviews at different frequencies and levels of
detail that include data trend analysis, interpretation, and action plan development as
needed. The two primary reviews are Academic Program Review (APR) and Learner
Performance Review (LPR). Additionally, Capella uses measurement of learning to fa-
cilitate transparency to both internal and external audiences.
Academic Program Review. APR is a holistic analysis of the health of a particular
academic offering using the Academic Quality Framework. APRs are conducted every
five years by faculty in collaboration with assessment specialists, accreditation special-
ists, product managers, institutional effectiveness analysts, and others as needed. This
team analyzes aggregated competency demonstration and learning outcomes assessment
data; alumni survey results; end-of-course evaluation (EOCE) results; a quarterly experi-
ence survey that measures learner satisfaction with overall experience at Capella; faculty
course evaluation results; faculty performance, enrollment, persistence, and completion
rates; and additional measures as appropriate. They also conduct an in-depth review of
curriculum elements, course sequencing, and external standards and monitor the success
of previously made improvements against goals. The team develops recommendations
to address any issues identied and presents these to the dean for resource prioritization.
Learner Performance Review. Assessment specialists conduct LPRs for all academic
offerings; these involve an in-depth review of each course in a program or specialization
every two to three years, or more frequently if needed for specialized accreditations. The
assessment specialists produce a report showing aggregated performance data for each
learning outcome and each course competency in every course; they include additional
review as needed of assignment-level performance, course revision history, curricular-
assessment alignments, and EOCE data. The report includes analysis of data and recom-
mendations for revisions to the overall offering, specic courses, and/or specic course
activities as appropriate. A team including the assessment specialist, program director,
faculty, academic associates, and accreditation specialist discusses the report and recommendations and determines action plans for moving forward.
Transparency. Capella has a public-facing website (CapellaResults.com) that displays aggregated learning and career outcomes data for Capella’s academic offerings. Additionally, every learner at Capella has a personalized competency map that shows their academic progress on each course competency, so they can build on their strengths and improve in needed areas. Learners can share competency maps on social media
(continued on page 9)
Assessment Update
Progress, Trends, and Practices
in Higher Education
ASSESSMENT UPDATE (Print ISSN: 1041-6099; Online ISSN: 1536-
0725) is published bimonthly by Wiley Periodicals LLC, 111 River St.,
Hoboken, NJ 07030-5774 USA.
Executive Editor: Stephen P. Hundley. Associate Editor: Caleb J. Keith.
Assistant Editors: A. Katherine Busby and Shirley J. Yorger. Publishing
Director: Lisa Dionne Lento. Production Editor: Mary Jean Jones.
Editorial Correspondence: Contact via email: [email protected].
For submission instructions, subscription, and all other information,
visit: www.wileyonlinelibrary.com/journal/au.
Printed in the USA by The Sheridan Group.
Editors’ Notes
Peer Review in Assessment and Improvement:
An Overview of Five Principles to Promote Effective Practice
Stephen P. Hundley and Caleb J. Keith
In this issue of Assessment Update, we profile select recipients
of the 2021 Excellence in Assess-
ment (EIA) Designation, a national-level
recognition co-sponsored by VSA Ana-
lytics, the National Institute for Learn-
ing Outcomes Assessment (NILOA),
and the American Association of Col-
leges and Universities (AAC&U). As
described on NILOA’s website (https://
www.learningoutcomesassessment.org/
eia/), the EIA Designation recognizes
institutions that successfully integrate as-
sessment practices throughout the institu-
tion, provide evidence of student learning
outcomes, and use assessment results to
guide institutional decision-making and
improve student performance. This issue
features articles from some of the institu-
tions receiving the 2021 honor, including
IUPUI, our home institution.
As many readers know, IUPUI also
hosts the Assessment Institute in Indian-
apolis, the oldest and largest U.S. higher
education event focused on assessment
and improvement. Following two years
offering a virtual engagement to accom-
modate disruptions associated with the
COVID-19 pandemic, we will resume
our in-person Assessment Institute,
held October 9–11, 2022, at the Indi-
anapolis Marriott Downtown Hotel.
We look forward to presenting an array
of workshops, keynote presentations,
concurrent and poster sessions, and net-
working opportunities at the 2022 As-
sessment Institute. Learn more about
this year’s event, including registration
details, schedule overview, and program
tracks and topics, at our website, https://
assessmentinstitute.iupui.edu/index.html.
The theme of our Editors’ Notes for
2022 is “Peer Review in Assessment and
Improvement: Five Principles to Promote
Effective Practice.” Peer review has long
been used in the higher education sector
to serve a variety of purposes and meet
the needs of several audiences. Activities
supportive of assessment and improve-
ment also increasingly rely on peers to
offer credible subject matter expertise in
respective contexts, provide judgments,
develop recommendations for enhanced
performance, and make contributions to
creating and sustaining a culture of con-
tinuous improvement and innovation. The
five principles to promote effective prac-
tice in peer review for assessment and im-
provement are:
1. Recognize the purpose of the peer re-
view process in higher education as-
sessment and improvement.
2. Value the multitude of perspectives,
contexts, and methods related to as-
sessment and improvement.
3. Adopt a consultative approach to the
peer review process.
4. Make effective judgements using in-
clusive sources and credible evidence.
5. Provide relevant feedback to
stakeholders.
Principle #1: Recognize the purpose of the peer review process in higher education assessment and improvement
One enduring feature of the higher
education sector is its use of peers in pro-
cesses to generate, evaluate, disseminate,
and curate knowledge for a variety of
purposes and audiences. Peers are often
individuals who are regarded as subject
matter experts in a particular domain, and
they usually have educational and profes-
sional preparation and experiences com-
parable to those desirous of and reliant
on the peer’s perspectives, judgment, and
feedback. Depending on the purpose of
the peer review process, peers may be lo-
cal in nature (e.g., within the institution),
represent a valued external constituency
(e.g., community members, employers,
or alumni), or have an “arms-length” dis-
tance from the activity under review (e.g.,
colleagues from the discipline or profes-
sion working in other institutional set-
tings). The type of review informs which
peer(s) are appropriate to engage. Indeed,
peers have the potential to contribute to
a variety of worthwhile activities, in-
cluding reviews of teaching; evaluations
of academicians for tenure and promo-
tion purposes; making judgements about
the signicance and quality of scholarly
contributions; as part of periodic, inter-
nally oriented program review processes;
as colleagues serving on accreditation
teams; and, increasingly, as part of assess-
ment and improvement activities taking
(continued on page 15)
Reections on California State University
East Bay’s Excellence in Assessment
Designation through the Lens of Student
Learning and Success
Kevin W. Kaatz, Ana Almeida, Sarah Aubert, Paul Carpenter, Caron Inouye,
Danika LeDuc, Balaraman Rajan, Julie Stein, and Fanny Yeung
Institutional Context
Introduction to California State University East Bay (CSUEB). Part of the
23-campus California State Univer-
sity system, CSUEB is a designated His-
panic-, Asian-American-Native-American-Pacific-Islander-, and Minority-Serv-
ing Institution, serving a high proportion
of commuter, transfer, rst-generation,
international, and non-traditional stu-
dents. With one of the most diverse stu-
dent populations in the United States, the
university has a strong commitment to
building a culture of equitable and inclu-
sive teaching, learning, and assessment.
Summary of ILO Assessment Integra-
tion into Existing University Assessment
Processes. Institutional Learning Out-
comes (ILOs) were developed campus-
wide with faculty, staff, and students
and approved by the Academic Senate in
2012. Cross-disciplinary faculty teams
collaborated to build an ILO assessment
infrastructure that integrated with exist-
ing academic, general education, gradu-
ate, and co-curricular assessment. As part
of that process, multi-disciplinary faculty
developed, piloted, and implemented as-
sessment rubrics for every ILO following
a university-wide plan. Today, the cam-
pus has a robust campus-wide assessment
process including closing-the-loop activi-
ties supported by multiple faculty com-
mittees and campus leadership.
Focus of Article. CSUEB’s award
of the Excellence in Assessment (EIA)
designation is a credit to the multidis-
ciplinary collaboration of faculty who
shepherded our assessment processes and
resulting institutional improvements and
growth. Looking ahead, our direction is to
increase active learning, become a more
unied campus around student success
approaches, include more student voices
in the assessment process, and build in
more assessment of student learning ex-
periences outside the classroom that align
to our ILOs.
Top EIA Assessment Achievements
Were Pointed Toward Student
Success
University-wide Engagement. Hun-
dreds of faculty members from the univer-
sity’s four colleges and university libraries
have been involved in all aspects of ILO
assessment, including rubric development,
collection, and evaluation. Faculty from
every college also developed ILO assign-
ment guides for every ILO, which were
shared with the campus community.
Peer-to-peer Discussions of Effec-
tive Assignment Design. University-wide
engagement extended to funded profes-
sional development opportunities in the
form of assignment design workshops
providing faculty with the time, space,
structure, and funding to create tangible
and concrete assignments that aligned
with the ILOs and deepened student
learning. Faculty coaches reviewed as-
signments, explained learning outcomes,
and brainstormed creative options to help
their faculty peers meet the ILO in a par-
ticular discipline. Faculty implemented
different strategies to elicit student work
that clearly demonstrated their learning
of that outcome. These facilitated peer-to-
peer coaching sessions improving assign-
ment design proved to be a critical part of
the assessment process.
Closing the Loop. Closing-the-loop
discussions took place in all colleges fol-
lowing the aggregation and reporting of
assessment data. Faculty discussed im-
proving pedagogical practices, assign-
ment design, and the format of assign-
ments (e.g., formative vs. summative,
scaffolding, self and peer assessment).
The results provided a starting point for
rich conversations among faculty as to
how they dene, understand, and use the
ILO rubrics to improve engagement and
demonstrate student learning.
University-wide Growth. It is clear
from this university-wide work that fac-
ulty have a passion for student learning
and a desire to dedicate more time and
energy to best support students’ learning.
Campus-wide activities and discussions
provided forums to remind students that
they are doing good, albeit challenging,
work. They showed that faculty are not alone in
helping students attain these learning out-
comes. This work also allowed faculty the
chance to learn more about their own in-
stitution and to realize that they are work-
ing collectively toward similar goals. It is
also clear that the university as a whole
has matured in its collective decision-
making ability. We now have a range of
faculty and staff highly experienced in
assessment who also operate very ef-
fectively as a group to make meaningful
improvements.
Top Future Directions Are Toward
Strengthening Student Success
Assessment Practices that Differen-
tiate Between Teaching and Learning.
Engaging students in evidence-based ac-
tive learning experiences is a priority at
CSUEB. Providing students multiple
ways to externalize their thinking, with
more emphasis on no- or low-stakes as-
sessments, allows us greater insight into
student learning and the ability to adapt
as needed. We are shifting the metrics of
our teaching from “content coverage” to
“student mastery.” By improving assign-
ment design to be transparent, anti-racist,
and equitable, we better serve our diverse
students by giving them the opportunity
to be more reective on their learning,
which more authentically represents their
learning. We are also moving toward the
idea that assessment is “student-centered
and instructor-supporting.” It is our ex-
pectation and assumption that improv-
ing assignments for assessment purposes
will allow for a better understanding of
student learning while also pointing out
areas for improvement. As part of this,
we plan to continue collecting compara-
tive data to show improvement of student
learning over time.
Integration of Curriculum and As-
sessment Processes. We will continue
to integrate our curricular and assess-
ment processes in our ongoing work to
strengthen student success. This process
has intentionally brought together fac-
ulty from all over the university at every
stage of the process, on every ILO. It is
apparent we have a collective responsi-
bility around shaping and developing a
student’s progress toward any given out-
come. This also highlights the need to
continue to work together to help better
understand the interconnectedness of cur-
riculum and assessment and to provide
the most cohesive student experience
possible, particularly around the core
competencies. Institutional and pedagogi-
cal goals will also continue to focus on
alignment and supporting student learn-
ing and success. As part of this effort, we
are evaluating technologies that support a
more integrated curriculum development
and assessment infrastructure. By better
integrating processes into all aspects of
curriculum and assessment, faculty have
a better understanding of student learning
while simultaneously providing CSUEB
with the necessary data for accreditation
and institutional improvement.
Increased Inclusion of Student Voices
in ILO Assessment. Another future direc-
tion is to give students more opportuni-
ties to engage with the ILO process and
listen to their feedback on assessed as-
signments. As we prepare our students to
be life-long learners, a more intentional
and structured process for them to reect
on their own development across all the
learning outcomes over time will prove
useful.
Increased Integration of Experiences
Outside of the Classroom. We recognize
students’ growth over the course of their
college experience extends beyond the
classroom to their involvement in ac-
tivities such as internships and clubs.
This engagement contributes to student
learning and supports mastery of skills
as they apply them in “real-world” set-
tings. We plan to increase assessment
across various student experiences over
time to gather more on the progression of
core competency development. Aligning
and assessing students’ experiences out-
side the classroom will also broaden the
campus conversation about institutional
effectiveness.
CSUEB’s President on Assessment
Journey. CSUEB President Cathy Sand-
een effectively summarized our assess-
ment journey in the August 2021 award
announcement. “We know that a Cal
State East Bay education is transforming
for our students’ lives. This honor recog-
nizes how our faculty and staff connect
the dots throughout an entire degree pro-
gram, including our beyond-classroom
experiences. The signicant collabora-
tions that occur to ensure these learning
outcomes are unparalleled and make stu-
dents’ learning equitable, accessible and
useful.”
Kevin W. Kaatz is an associate professor
of history, Ana Almeida is an assistant
professor and associate director of the
Green Biome Institute, Sarah Aubert is
the curriculum services project manager,
Paul Carpenter is a professor and chair of
kinesiology, Caron Inouye is the director of
general education and a professor, Danika
LeDuc is the associate dean in the College
of Science, Balaraman Rajan is an associ-
ate professor of management, Julie Stein
is the educational effectiveness project
manager, and Fanny Yeung is the director
of Institutional Effectiveness and Research
at California State University East Bay.
Nurturing a Culture of Continuous
Improvement: Sustaining Excellence in
Assessment at the Community College of
Baltimore County
Jennifer Kilbourne, Amy Roberts Wilson, Glenda Breaux, and Joaquin G.
Martinez
Since 1999, the Community College of Baltimore County (CCBC) has had an assessment system in place
had an assessment system in place
that draws upon the expertise of assess-
ment professionals and faculty stakehold-
ers to conduct purposeful, systematic, and
collaborative assessment at the course,
program, and institutional levels. Apply-
ing for the renewal of the Sustained Ex-
cellence in Assessment award allowed us
to re-examine our assessment initiatives
through an external lens with the help of
the Excellence in Assessment (EIA) des-
ignation rubric, and the process revealed
areas for improvement. We are honored
to have now won the sustained desig-
nation in both 2016 and 2021, and we
move forward committed to continuous
improvement.
Institutional Context
CCBC is a large, multi-campus in-
stitution serving more than 45,000 stu-
dents each year. CCBC offers associate
degrees and certicate programs in both
career and transfer curricula that comple-
ment workforce demand in the Baltimore
Region.
The philosophy that all members of
the institution share responsibility for
improvement of student learning as a
collective enterprise guides assessment
at CCBC. Collaborative internal reviews
and advisory boards work with assess-
ment leaders to monitor best practices,
develop assessment resources, and sup-
port policies and procedures related to
scheduled stages of the assessment loop.
While these entities guide the assessment
stages, discipline-faculty teams develop
course and program assessment tools
and interventions, which are embedded
authentically within the classroom. That
faculty drive tool development and im-
plementation ensures authentic use and
investment in assessment measures. The
Ofce of Planning, Research, and Evalu-
ation (PRE) provides the infrastructure
to analyze data used to develop interven-
tion strategies to address learning de-
cits. These coordinated efforts embed as-
sessment in students’ CCBC experience
and sustain excellence in assessment at
CCBC.
Assessment Strategies Embedded
in the CCBC Experience
CCBC’s mission and strategic priori-
ties guide institutional assessment. Quali-
tative and quantitative data are analyzed
through the collection of internal survey
data, student success data, learning out-
comes assessment data, general educa-
tion outcomes assessment data, and pro-
gram assessment data. Results determine
intervention strategies in the classroom,
guide program improvement, and inform
professional development opportunities
for faculty and staff. Validation, both in-
ternal and external, of assessment instru-
ments ensures the use of high-quality as-
sessment tools and procedures.
CCBC has four primary sets of stu-
dent learning outcomes that guide all pro-
grams of study: core competencies, gen-
eral education (GE) outcomes, program
outcomes, and institutional benchmarks.
Within each degree, students take both
GE and program requirements, where
core competencies are fostered. Institu-
tional benchmarks, including course suc-
cess, retention, completion, and transfer
rates, are tracked and reported annually.
Combined, these outcomes prepare stu-
dents with 21st century skills for lifelong
learning.
Goals for communication, problem
solving, global perspective and social
responsibility, and independent learning
and personal management are reected in
course-level learning outcomes, general
education outcomes, and program-level
outcomes. Co-curricular activities and
High Impact Practice infusion projects
facilitate the development of these core
competencies. These shared learning
outcomes form scaffolded infrastructure
throughout students’ academic and co-
curricular experiences.
CCBC intentionally embeds program
outcomes in program requirements to en-
sure students master learning outcomes
upon degree completion. All GE courses
utilize a common graded assignment that
maps both GE and course-level learning
outcomes and serves as an authentic as-
sessment tool. On a three-year cycle, stu-
dent artifacts are scored by trained faculty
teams, PRE analyzes and reports data, and
faculty teams develop interventions based
on the data and plan the next assessment.
Furthermore, multiple targeted as-
sessments work together to capture stu-
dents’ mastery of course, program, and
co-curricular shared learning outcomes.
Highly enrolled courses assess learning
outcomes using faculty-developed and
externally validated instruments in Learn-
ing Outcomes Assessment projects. Since
1999, CCBC has assessed 50 high-impact
(highly enrolled, multi-section) courses
through these projects. As a result, inter-
vention strategies have been implemented
to improve the learning experience for
over 75,000 students.
In addition, academic programs are
evaluated through a committee-driven
program review process in a ve-year cy-
cle. Program review includes curriculum
assessment as well as market feasibility
analyses. CCBC’s program review pro-
cess uses curriculum mapping to iden-
tify and assess program outcomes within
program-required courses. Program out-
comes assessment projects ensure gradu-
ates are meeting outcomes required for
entry into the workforce or transfer.
High Impact Practices (HIPs) infu-
sion projects, designed to boost student
engagement, success, and retention, pro-
vide broad assessment of shared learning
outcomes through a data dashboard that
allows course, program, and co-curricular
stakeholders to disaggregate success and
retention rates. While achievement gaps
persist at CCBC, they were signicantly
decreased in courses such as Biology I:
Molecules and Cell (–20%), Fundamen-
tals of Communication (–7%), and Intro-
duction to Psychology (–11%) when HIPs
were infused.
Lessons Learned
CCBC benets from years of thought-
ful engagement with assessment prac-
tices and policies. As we went through
the Excellence in Assessment applica-
tion process, we were able to examine
our assessment infrastructure and pro-
cesses through an external lens. The EIA
scoring rubric was particularly helpful in
framing not only the processes involved in assessment excellence but also the engagement of the people behind them. During the process
we learned we have a robust, multi-level
system that is inclusive of many stake-
holders, but that some groups do not yet
have a seat at the table and others are
not included in information sharing. In
particular, we learned students have not
had a voice in the development and im-
plementation of assessment practices.
As a result, student representatives will
begin to serve on the General Educa-
tion Review Board. We also learned that
greater integration between Instruction
and Student Support Services is required
to reap maximal benet from assessment
activities. A reorganization of the college
will begin to break down these silos as
both areas will now serve under CCBC’s
Provost, bringing quality learning to the
forefront of our mission. CCBC recog-
nizes the central, instrumental role that
assessment plays in the life cycle of the
academic enterprise. Therefore, we re-
main attentive to the iterative nature of
our work on behalf of our students and
appreciate the opportunity to provide
an update of the comprehensive institu-
tional efforts as part of this designation.
Recommendations
CCBC’s application for the renewal of
the Sustained Excellence in Assessment
award embraced the need for continuous
improvement by examining our assess-
ment initiatives. CCBC’s data-informed
decision-making is prevalent; however,
institutions can become complacent with-
out continual evaluation. This reapplica-
tion exercise pushed us to examine all
aspects of our assessment model.
When applying for the EIA desig-
nation, think critically about the crite-
ria and develop a plan to address each
component of the application from the
institution’s perspective. Approaches to assessment differ by institution, and that variety is exciting.
The EIA application is a place to high-
light strengths and recognize opportuni-
ties for improvement. It is also essential
to establish a team of assessment experts
throughout the college as contributors.
CCBC’s Learning Outcomes Assessment
Advisory Board, composed of faculty,
staff, and administrators, guided our pro-
cess of self-reection.
While CCBC has been recognized
by this esteemed designation, the jour-
ney of continuous improvement never
ceases. We look forward to engaging
faculty, staff, and students with these
sustained efforts for an enriched student
experience.
Jennifer Kilbourne is the dean of curricu-
lum and assessment, Amy Roberts Wilson
is the Honors Program director and an
associate professor of English and women’s
studies, Glenda Breaux is the senior direc-
tor of Planning, Research, and Evaluation,
and Joaquin G. Martinez is the provost and
vice president of instruction at the Com-
munity College of Baltimore County.
Have an idea for an article or special issue
of Assessment Update? Let us know.
Email the executive editor at [email protected]
www.wileyonlinelibrary.com/journal/au
or http://assessmentinstitute.iupui.edu/
Broadening and Deepening a Campus Culture
of Assessment at Cameron University
Karla J. Oty
Institutional Context
Cameron University (CU) is a regional public university located in southwest Oklahoma. CU is an
open admission institution at the Associ-
ate in Applied Science level with mini-
mal admissions requirements at the As-
sociate in Arts, Associate in Science, and
bachelor’s degree levels. CU also offers
a limited number of master’s degrees in
the professional areas of teaching, busi-
ness, and psychology. In Fall 2020, CU’s
enrollment was 3,471 undergraduate stu-
dents and 300 graduate students.
Assessment activities at CU began in academic year (AY) 1992–1993 for academic
programs and in AY 2008–2009 for units
in Student Services. Units have been added
to the assessment process annually since
AY 2011–2012. In AY 2020–2021, all ac-
ademic programs and all but eight non-ac-
ademic units participated in the formal as-
sessment process. This sustained practice
of assessment helped CU earn selection to the inaugural class of Excellence in Assessment (EIA) designees and, in 2021, recognition as a Sustained Excellence designee.
Assessment Strategies
At CU, the primary purpose of assess-
ment is to use data to determine whether student learning, engagement, and satisfaction are at the desired levels and, if not, to develop
action items to address shortfalls. CU’s
comprehensive system of assessment is
managed by the Ofce of Institutional Re-
search, Assessment, and Accountability
(IRAA) and the Institutional Assessment
Committee (IAC) and is overseen by the
Executive Council (EC) of the University.
IAC comprises 26 faculty and staff
members including the vice president for
academic affairs, the IRAA director, the
chair of the General Education Commit-
tee, the chair of the Developmental Edu-
cation Assessment Committee, a Faculty
Senate representative, and other members
representing both programs and units.
Programs leading to a degree or certi-
cate, general education, developmental ed-
ucation, and non-degree granting academic
and non-academic units participate in the
assessment process each year. Information
for programs and units is entered into a
software package to create an assessment
report. Two members of IAC acting as peer
reviewers and the appropriate supervisor
use a PDF form to indicate elements that
are addressed, provide written comments,
and make a recommendation as to whether
the program or unit would benet from a
roundtable discussion. The appropriate EC
member determines whether the program
or unit will participate in a roundtable; all
programs and units participate in a round-
table at least once every three years. Each
roundtable begins with the program or unit
members answering questions about what
they have learned from their assessment
data and how they have used what they
have learned to make improvements. The
rest of the allotted time is spent discussing
feedback, results of the assessment pro-
cess, and suggestions for improvements.
The exibility of the scheduling of the
roundtables ensures programs struggling
with a particular part of the assessment
process can participate in conversations
each year to help them make progress,
while programs with a well-developed
assessment process participate in round-
tables once every three years.
Lessons Learned in CU’s
Assessment Journey
In 2010 the assessment process was
changed in an attempt to shift the mind-
set away from compliance and toward
one that would result in improvements of
student learning, engagement, and satis-
faction. The implementation of roundta-
bles was intended to ensure there were
substantive conversations about what
was working well and what could be im-
proved. The discussions were also meant
to emphasize the role of IAC as peer re-
viewers providing advice and guidance,
and not as individuals who were trying to
find something wrong in the assessment
report. There was also an added empha-
sis in encouraging faculty and staff to
critically examine the data collected and
to use the data to identify strengths and
weaknesses. It was, and is, sometimes
difcult to admit students may not be
doing as well as would be desired on a
particular measure, and the roundtable
conversations provide a vehicle in which
weaknesses in student learning, engage-
ment, or satisfaction can be discussed in
a non-threatening manner. The IAC peer
reviewers share examples of approaches
that they have tried in their program or
that they have seen in other programs
they have reviewed, which can lead to
Programs leading to a degree or certicate, general education,
developmental education, and non-degree granting academic and non-
academic units participate in the assessment process each year.
Assessment Update • January–February 2022 • Volume 34, Number 1 • © 2022 Wiley Periodicals LLC 9
a rich conversation on how to make im-
provements based on data. Although the
process is not perfect, and there have been
adjustments made along the way, the as-
sessment culture at CU has changed dra-
matically over the last 10 years.
Other key lessons learned were that
training, resources, and timely responses
to questions were needed to help faculty
and staff. The IRAA director conducts
training sessions on the assessment soft-
ware used for the assessment reports,
provides an updated User Document each
year, and hosts ofce hours. Each year
IAC members undergo peer reviewer
training before beginning their reviews,
which helps the reviewers provide consist-
ent, clear, and constructive feedback to the
programs. Faculty and staff can send an
email to [email protected] with
any questions or concerns they have relat-
ing to assessment; someone in the IRAA
ofce responds within one business day.
CU also provides resources to assist
programs and units participating in the as-
sessment process to ensure the processes
and methodologies for assessment reect
good practice. The chair of IAC conducts
an annual campus-wide meeting to update
and inform faculty and staff on the assess-
ment process. Additionally, IAC hosts
workshops to provide support for assess-
ment topics. Workshop topics over the last 10 years have included developing SLOs, rubrics,
curriculum mapping, inter-rater reliabil-
ity and content validity, online assessment
strategies, data tables, and co-curricular
assessment. In 2021–2022, members of
IAC are making short videos on key as-
sessment topics that will be available on
demand. Over the last 10 years, additional
funding has been available to offset costs
associated with assessment, including
external reviews of locally developed
measures and rubrics for content validity,
external experts to conduct workshops on
assessment-related topics, and opportuni-
ties for faculty to participate in confer-
ences related to assessment. The IRAA
Ofce provides funds to purchase stand-
ardized assessment exams.
Recommendations to Other
Institutions as They Consider
the EIA Designation Application
Process
Although the writing of the EIA nar-
rative for the application process can
involve signicant time and effort, we
found the process to be meaningful and
useful. The application process allowed us
to reect and articulate the improvements
we have made as a campus and to help
focus the discussion on what areas still
need improvement. The provided rubric
is especially valuable in preparing a suc-
cessful application. We formed a smaller
sub-committee to write the application
narrative. This draft was then shared with
the relevant committees and leadership to
garner feedback and strengthen the appli-
cation. One of the most challenging parts
of the application was to include as much
information as possible while still coming
in under the application’s word limit. Our most recent application was one
word short of the limit! The feedback re-
ceived from CU’s successful Excellence
in Assessment application in 2015–2016
helped guide improvements in the assess-
ment process, and CU looks forward to
using the feedback received this Fall from
EIA to guide improvements over the next
five years.
Karla J. Oty is the director of Institutional
Research, Assessment, and Accountability
at Cameron University.
Capella University’s Journey Toward Sustained Excellence in Assessment
(continued from page 2)
and on resumes to show employers their
achievements.
Lessons Learned
Reecting on challenges and imple-
menting continuous improvements with
our own assessment strategies is no different from the process we follow on a regular
basis. Here are a few of the lessons we
have learned and are actively targeting for
improvement:
1. Closing the Loop: One of our biggest
challenges was “closing the loop” on
our quality-improvement efforts. Each
of our assessment strategies had well-
considered, detailed processes laid
out. However, we continued to strug-
gle with the collection and tracking
of the nal decisions made toward
our continuous quality efforts. To ad-
dress this challenge, we created best
practices and expectations for docu-
mentation of improvement initiatives
and a standardized tracking document.
While still in the early stages, we have
seen an increase in follow-through of
action plans.
2. Improved Communication and Vis-
ibility: Many times, assessment work
had been unknowingly duplicated or
repeated due to lack of visibility and
communication. In response, regular
quarterly check-in meetings were es-
tablished with all stakeholders of an
academic program to ensure aware-
ness of assessment efforts. Addition-
ally, a centralized SharePoint site was
created to house assessment efforts
and is widely available to stakehold-
ers. This has helped increase visibility
of current assessment work and allows
for previous efforts to be reviewed to
help minimize redundancies.
3. Evidence of Faculty Involvement: Fac-
ulty involvement, both leading and
contributing to our assessment sys-
tems, is part of our regular processes.
However, we recognized there was
not always clear tangible evidence of
their involvement. As a result, we have
prioritized taking thorough minutes in
committee meetings and saving email
correspondence as direct evidence of
faculty involvement.
Recommendations to Future
Applicants
When considering the EIA designa-
tion application process, we recommend
applicants:
(continued on page 16)
Transformative Principles Contributing to
Whatcom Community College’s Assessment
Progress
Anne Marie Karlberg, Tresha Dutton, Peter Horne, and Ed Harri
Institutional Context
Whatcom Community College (WCC), in Bellingham, Washington, served 7,400 stu-
dents in 2020–21. WCC offers transfer
degrees, professional-technical degrees
and certicates, Bachelor of Applied Sci-
ence degrees, basic education, and com-
munity and continuing education courses.
In February 2008, the Northwest
Commission on Colleges and Univer-
sities expressed “grave concern” that
WCC did not have meaningful assess-
ment processes linking data, analysis,
and planning. With new leadership and
faculty, WCC made critical and mean-
ingful changes to its processes, devel-
oping sustainable assessment processes
to support student learning and inform
college planning. In 2007, WCC created
a faculty outcomes assessment coordi-
nator position with release time and, in
2008, hired a director for assessment and
institutional research (AIR) to build an
AIR ofce. These two individuals have
worked together for the past 13 years.
In its 2019 accreditation visit, WCC re-
ceived no recommendations and three
commendations, including recogni-
tion of “its widespread and systematic
use of data for decision-making and
improvement.”
WCC’s Guiding Principles for
Outcomes Assessment
The following principles are indis-
pensable in guiding the outcomes assess-
ment work at WCC to inspire and support
conversations among faculty, staff, and
students and to improve student learning.
WCC lives and breathes these principles
that have transformed assessment.
1. Creating sustainable processes
a. Create simple, meaningful, and
sustainable reports and processes:
WCC limits report content to the
most essential elements that en-
courage reection and result in ac-
tion (preferably 1–2 pages).
b. Start with invested faculty and staff:
When WCC rst developed assess-
ment processes, it started with a
group of faculty volunteers who
saw the long-term vision. WCC fo-
cuses on faculty who are motivated
to create meaningful teaching and
learning experiences and promotes
opportunities for other faculty to
join when they begin to hear the
passionate conversations among
their peers.
c. Frame outcomes assessment re-
sults as a resource for continuous
improvement and support: WCC
makes it clear to faculty and staff
that assessment results will not be
used punitively.
d. Start with a proposal and request
feedback when creating new as-
sessment-related processes: Start-
ing conversations with the best
available ideas, while encouraging
critique and the emergence of new
ideas, often results in more mean-
ingful and substantial feedback and
products. For example, when rst
developing rubrics, consider start-
ing with the NILOA rubrics, rather
than creating rubrics from scratch.
e. Develop a culture of institutional
learning through action: WCC
gathers enough information—rath-
er than an exhaustive amount—at
an appropriate level, to be able to
take a logical next step and, then,
learns from the experience. For
example, WCC piloted a new core
learning ability report with summer
faculty to get a sense of whether the college
was moving in the right direction in
revising the process. This informa-
tion was then immediately used in
the fall by the outcomes assessment
committee to guide their next steps.
f. Create support and incentives for
faculty to participate: In addition to
the faculty educational workshop
incentives noted in the paragraph
below, faculty professional reports,
including course outcome reports,
are contractually required annually
of full-time and adjunct faculty.
Embedding this work in the review
cycle has created a systematized
and valued element of the faculty
review process.
2. Engaging in collaborative learning
a. Engage faculty and staff in peer-
driven professional development
opportunities to advance assess-
ment work: WCC is committed to
instituting transformational change
by encouraging faculty and staff
to use innovative, equity-driven
strategies for student learning and
assessment. To advance this effort,
WCC engages faculty and staff in
peer-driven professional develop-
ment. Since 2011, WCC has of-
fered faculty education workshops
(FEWs) focused on teaching,
learning, and assessment practice.
Through these FEWs, which are
15-hour mini-courses, WCC es-
tablished and institutionalized the
foundational work of outcomes
assessment and data-driven reec-
tion. Initial FEWs were designed
by AIR and focused on topics such
as writing course outcomes; creat-
ing meaningful rubrics; and align-
ing course outcomes, teaching
strategies, and assessments. Over
time, an increasing number of fac-
ulty and staff developed FEWs, and
WCC now offers about 13 FEWs
annually, focusing on assessment,
equity, student-centered pedagogy,
and using student success data for
improvement. As part of WCC’s
commitment to assessment and
equity-driven pedagogy, full-time
faculty completing FEWs receive
permanent salary increases, and ad-
junct faculty are paid stipends. This
investment has provided a huge
incentive for participation in out-
comes assessment and data-driven
equity work and has reinforced the value faculty place on this work.
b. Routinely request feedback on
processes and reports: For exam-
ple, WCC includes a space on its
reports for faculty and staff to sug-
gest improvements to its reports
and processes. The College then
tries to integrate substantive sug-
gestions. For instance, the faculty
reports Canvas page noted below
resulted from a faculty suggestion
on a program outcome report.
3. Providing transparent communication
a. Make outcomes assessment data
and resources accessible on a pub-
lic website: Since 2009, WCC has
maintained a comprehensive pub-
lic AIR website, which includes a
wealth of outcomes assessment,
student success data, and educa-
tional materials and serves as a re-
source for faculty, staff, students,
and the public. Maintaining re-
sources in an easily accessible cen-
tral location broadens engagement
and participation in assessment ef-
forts. Providing access to resourc-
es and transparency is central to
WCC’s assessment work.
b. Create a central place for faculty to
submit reports: WCC has a faculty
reports Canvas page centralizing
all faculty assessment resources,
reporting, and tracking.
c. Provide timely feedback to each
faculty and staff member who sub-
mits an assessment report: Faculty
and staff receive feedback on all
reports submitted, to acknowledge
the value of their work and appreci-
ation for the time dedicated to cre-
ating the reports. The “next steps”
identied by faculty in their course
and program outcome reports are
emailed to faculty during the quar-
ter in which the information is rel-
evant, reminding them of the great
ideas generated when the initial re-
ports were submitted.
d. Communicate assessment-related
information in multiple forums and
encourage conversations: In addi-
tion to communicating informa-
tion via the website, professional
development day, and other work-
shops, AIR staff routinely meet
one-on-one and in small groups
with faculty and staff to discuss as-
sessment information. WCC tries
to create spaces—at workshops, in
meetings, or one-on-one—where
faculty and staff can reect about
outcomes assessment informa-
tion and, together, consider possi-
ble next steps. Also, in 2018, AIR
began sharing recent assessment
results in short, catchy monthly or
quarterly emails to employees ti-
tled “What’s in the AIR?” In spring
2020, AIR also began sending reg-
ular “Assess-Minute” emails to fac-
ulty communicating brief, relevant,
and timely assessment, teaching,
and learning resources.
e. Invest in relationships across cam-
pus to build trust, solicit input and
feedback, improve relevance and
responsiveness, offer support, and
increase receptivity. When possi-
ble, respond to individuals request-
ing assistance through a phone call,
in person, or Zoom, rather than
through email.
Next Steps in WCC’s Assessment
Journey
Embracing its guiding principles of
creating sustainable processes, engaging
in collaborative learning, and providing
transparent communication, WCC is fo-
cusing on two major initiatives this aca-
demic year:
1. WCC’s core learning ability process,
which has been in place for the past
eight years, is being revised by its
outcomes assessment committee to in-
crease the meaningfulness of the data
and simplify the process.
2. WCC will be more proactive in en-
gaging students in all assessment
processes by forming an AIR student
advisory group (with paid students),
which will take WCC’s assessment
work to the next level, providing
more systematic student input and
feedback.
Anne Marie Karlberg is the director for as-
sessment and institutional research, Tresha
Dutton is a professor of communication
studies and faculty outcomes assess-
ment coordinator, Peter Horne is a senior
research analyst, and Ed Harri is the vice
president for instruction at Whatcom Com-
munity College.
Sustaining a Culture of Assessment
Excellence at IUPUI
Susan Kahn and Stephen P. Hundley
Our Context
With a broad array of programs (including undergraduate, graduate, and professional), more than 4,000 faculty members,
and a budget of some $1.7 billion, Indi-
ana University-Purdue University Indian-
apolis (IUPUI) represents a unique and
enduring partnership between Indiana’s
two major public universities. Formed in
1969 from an array of graduate and pro-
fessional programs offered by Purdue and
IU in Indianapolis, IUPUI built its under-
graduate core within a non-traditional, de-
centralized framework. Degree programs
are connected to either IU or Purdue and
students receive degrees from one or the
other, depending on the program. In ad-
dition, IU, our managing partner, was an
early adopter of Responsibility-Centered
Management, a budget model that sup-
ports the scal decentralization and in-
dependence of each budget unit. Given
this history and context, academic cul-
tures across the campus vary widely and
top-down academic mandates are typi-
cally difcult to implement. For us, then,
“excellence in assessment” arises from a
culture of evidence-based teaching and
learning informed by strategic and dis-
tributed leadership that vests assessment
primarily at the program level.
In 2016, we were recognized as one of
the inaugural recipients of the Sustained
Excellence in Assessment designation,
and we were pleased to receive this honor
once again in 2021. IUPUI has created an
enduring and pervasive culture of assess-
ment and improvement for more than 30
years. Supporting this culture are an ar-
ray of campus-wide resources, including
a robust data infrastructure, a rich variety
of professional development opportuni-
ties, and distributed assessment expertise.
Assessment and improvement are also
strengthened by effective leadership and
governance at the campus, school, and de-
partment levels.
At the undergraduate level, we envi-
sion students’ educational experience as
a coherent learning pathway, beginning
with the First-Year Experience; advancing
through general education core and elec-
tive courses, the major, and at least four
validated “engaged learning” experiences;
and culminating in a senior capstone expe-
rience. Ideally, these experiences combine
and cohere, as students gain progressive
mastery of both their chosen discipline and
our institutional learning outcomes, known
as the Proles of Learning for Undergradu-
ate Success (“the Proles”), which incorpo-
rate both intellectual and personal growth.
Ongoing assessment, improvement,
and evidence-based decision-making are
essential to realizing this vision for all
students. Thus, to qualify for inclusion in
the general education program, courses
must present evidence of student learning
of one or more of the Proles. All under-
graduate degree programs have integrated
the Proles into degree-level learning out-
comes, introductory courses, key courses
in the major, and capstone experiences.
Many have gone beyond this minimum
requirement and aligned all courses, and
even assignments, with the Proles.
Because of IUPUI’s decentralized
structure and culture, we do not mandate
specic assessment strategies, methods,
or instruments; rather, we build capacity
for assessment and improvement through
professional development, collaborative
campus-wide initiatives, and committees
that engage in ongoing assessment work,
discussion, and exchange. Annual unit
assessment reports are reviewed through
our campus-wide Program Review and
Assessment Committee (PRAC), and,
collectively, enable us to gauge progress
in student achievement of campus-wide
learning outcomes.
IUPUI’s Conceptual Learning
Framework
To encapsulate the myriad instructional
and assessment activities taking place
across IUPUI, our conceptual learning
framework depicts how we intentionally
develop, implement, and align campus-
wide efforts to promote student learning. At
IUPUI, we prepare graduates for a variety
of post-degree roles and contexts. Some of
these broad outcomes include demonstrat-
ing civicmindedness, nding employment,
engaging in lifelong learning, pursuing
graduate and professional education,
thriving in a diverse and global world, and
remaining connected to us as alumni. Stu-
dents participate in a variety of purposeful
learning experiences on their pathway to
graduation to prepare them for a dynamic,
meaningful, and resilient future.
As noted, the Proles are IUPUI’s
institutional-level student learning out-
comes (SLOs). All our learning activities
intentionally prepare students to be com-
municators, problem solvers, innovators,
and community contributors—learning
outcomes we desire of all our graduates,
regardless of major. They are cascaded
and aligned throughout IUPUI. Program-
level SLOs represent specic learning
achievement required of graduates in indi-
vidual degree programs. These reect the
various disciplinary ways of advancing
our broader institutional SLOs. Course-
and activity-level SLOs are the individual
contexts in which learning occurs. These
include academic courses, along with ex-
periential, community, global, and cocur-
ricular learning opportunities involving
on- and off-campus partners. The Proles
and program-level SLOs get introduced
and/or reinforced in these learning expe-
riences. Finally, assignment-level SLOs
include specic interventions and assess-
ments designed to implement course- and
activity-level goals for learning. These
also give students plentiful opportunities
to demonstrate competence related to the
Proles and program-level learning goals.
Undergirding these activities are IU-
PUI’s mission, vision, values, and stra-
tegic plan. Our #1 strategic plan goal is
to promote undergraduate student learn-
ing and success. In addition to academic
affairs, student affairs, and the academic
units, a host of ofces and committees
engage in distributed leadership to sup-
port our efforts, including: PRAC; Center
for Teaching and Learning; Planning and
Institutional Improvement; Institute for
Engaged Learning (IEL); Student Expe-
rience Council; Institutional Research &
Decision Support (IRDS); Undergraduate
Affairs Committee; Division of Under-
graduate Education; and Ofce of Com-
munity Engagement. Several processes
and tools enable faculty, staff, students,
and other stakeholders to facilitate and
document student learning and assure
our ongoing commitment to quality. Pro-
cesses include degree proposals, periodic
general education and program reviews,
strategic plan and PRAC reports, and ac-
creditation activities. Tools such as our
Learning Management System, The Re-
cord (IUPUI’s version of a Comprehen-
sive Learner Record), ePortfolios, degree
maps and audits, and transcripts all sup-
port and encapsulate student achievement
of learning at IUPUI.
Recent and Ongoing Priorities
Since 2016, IUPUI has focused
evidence-informed improvements in five
high-priority areas that advance our
strategic plan goal of promoting
undergraduate learning and success:
1. We developed capacity to disaggregate
data to highlight needs and uncover
equity gaps among various popula-
tions. Robust data infrastructure in
IRDS enables campus-, unit-, and
program-level decision-makers to access
context-specific student data to
identify and address equity gaps. We
also provide professional development
to enhance decision-makers’ ability to
understand and respond appropriately
to assessment data.
2. We focused on holistic learner support.
Enhanced collaboration and coordina-
tion across campus-level units enabled
implementation of needed improve-
ments: expanding Bridge and the other
programs above; increasing capacity
in Counseling and Psychological Ser-
vices; expanding resources to address
student nancial, housing, and food
insecurity; and organizing scattered
services into a comprehensive Center
for Transfer and Adult Students.
3. We reorganized our work on High Im-
pact Practices (HIPs). The creation of
IEL brought campus ofces leading
HIPs under one organizational um-
brella. This realignment enables us to
promote delity, equity, and scalabil-
ity of HIPs and to integrate HIPs en-
gagement into guided, coherent educa-
tional pathways.
4. We pursued a strategic equity agenda.
These efforts have included a holistic,
test-optional admissions process de-
signed to broaden access for histori-
cally underserved students; updated
promotion and tenure guidelines that
recognize and reward faculty achieve-
ments that enhance equity and inclu-
sion; and professional development on
culturally responsive instructional and
assessment approaches.
5. We disseminated our assessment work,
strengthened our reputation for assessment
leadership, and advanced assessment
as a field. Along with organizing
the Assessment Institute in Indianapolis
and producing Assessment Update—both
sites for disseminating IUPUI assessment
work—campus assessment leaders
implemented the podcast series Leading
Improvements in Higher Education and
produced a volume, Trends in Assessment,
that draws on Institute tracks to discern
new and emerging assessment trends
and issues.
Reection and Conclusion
Three decades of sustained effort have
enabled IUPUI to establish and maintain
a ourishing assessment culture. Special
strengths include:
Abundant opportunities for profes-
sional development in assessment.
Widespread understanding of the im-
portance of ongoing assessment and
improvement.
Knowledgeable leadership for
assessment.
Distributed assessment expertise at the
institution, school, and program levels.
Varied approaches to assessment
among our diverse academic units,
enabling assessment to reect unit
missions and disciplinary standards of
evidence, thus supporting sustainabil-
ity of assessment.
A reward structure that recognizes as-
sessment achievements and leadership.
Signicant contributions to the devel-
opment of the assessment as a eld.
Finally, our goals for the future
include:
• Continue implementing the Profiles
and promoting “whole student” development
through use of authentic assessment
approaches.
• Expand the number of experiences
included in The Record, based on evidence
of student learning of the Profiles.
• Implement a campus-based assessment
award to complement the Trudy
W. Banta Lifetime Achievement in
Assessment Award conferred at the
Assessment Institute.
• Continue to broaden stakeholder
engagement in assessment and
improvement.
• Continue promoting increased campus
assessment capacity through professional
development.
• Enhance IUPUI's national leadership
role in developing the assessment field.
• Continue to sustain excellence in
assessment.
Susan Kahn is director of planning and in-
stitutional improvement initiatives (retired)
and Stephen P. Hundley is senior advisor to
the chancellor and executive editor of As-
sessment Update, both from IUPUI.
NILOA Perspectives
What’s Next for the Excellence in Assessment
Designation
Gianina Baker and Kate Drezek McConnell
Context
The Excellence in Assessment
(EIA) Designation was introduced
at the Association of
Public and Land Grant Universities in
2016 under Peter McPherson’s leader-
ship and Teri Hinds’ direction. Jointly
sponsored by VSA Analytics, the As-
sociation of American Colleges & Uni-
versities (AAC&U), and the National
Institute for Learning Outcomes Assess-
ment (NILOA), the EIA Designation has
evolved since its original release. Each
year, we, the co-sponsoring organiza-
tions, get a chance to take stock of what
we’ve learned from the current class of
designees, hear feedback on the process
from our expert reviewers as well as
prior designees, and discuss the future of
the Designation. In this NILOA Perspec-
tives column, we celebrate our 2021 EIA
Designation class, reect on the past ve
years, and share an announcement with
readers.
2021 Excellence in Assessment
Class
First, congratulations to our 2021
EIA Designees! The 2021 class includes
five Sustained Excellence in Assessment
Designees—Cameron University,
Capella University, the Community Col-
lege of Baltimore County, IUPUI, and
Rose-Hulman Institute of Technology—
all of which earned either Excellence or
Sustained Excellence Designations in
2016. We are also excited to welcome
two new Excellence in Assessment De-
signees—California State University-
East Bay and Whatcom Community
College—which now increases our total
to 41 EIA Designees!
Particular to this year's class is its
collective commitment to transparency
in the assessment of student learning.
From implementation of solid, mature
assessment infrastructures to intentional
involvement of internal and external
stakeholders to evidence of thoughtful,
reflective data discussions, these institutions'
processes affirm that there is no
one right way to do assessment, as they
were able to write balanced, organized,
and cogent narratives. These institutions,
along with previous designees, continue
to serve as models of scaled, aligned as-
sessment practice.
This year's class weathered the
storm that is COVID-19. We, the co-sponsoring
organizations, questioned whether
the institutions would be able to write a
cohesive narrative and sustain active
participation and support from staff without
diverting from their priorities during the
pandemic. Designee institutions overcame
challenges in a time of rapid change and
inequity during the pandemic, demonstrating
they are true exemplars in higher
education assessment practice.
EIA Designation Changes
Through the Years
Just as institutions practice continu-
ous improvement with their institutional
processes for assessment, we too engage
in a robust evaluation of the process
through which we bestow institutions
the EIA Designation. Over the years,
we made several improvements to the
application process, the application itself,
and the evaluation rubric. For
example, we experimented with using
weighting multipliers after the rst year
of the EIA Designation, but ultimately
decided to remove them based on several
conversations with reviewers, applicants,
and designees. We worked diligently to
clarify the language of the application
and added additional detail to our scor-
ing instructions so reviewers could more
easily score applications. Most importantly,
we worked to ensure the application
and scoring criteria truly reflected
our collective sense of excellent praxis
in assessment, including expanding
key stakeholders to include community
partners and community-based organi-
zations; emphasizing the importance
of educating the whole student by add-
ing co-curricular and unit-level assess-
ment to the application and the rubric;
and nally, providing institutions ample
space within the application to address
how they were connecting assessment of
student learning and success and equity.
Collectively, the co-sponsoring or-
ganizations worked to steady the EIA
Designation process, despite staff turno-
ver at each of our respective organiza-
tions. We have worked to promote the
process and Designees through our vari-
ous channels, provided opportunities to
celebrate each Designee, and we con-
tinue to answer questions as institutions
seek the Designation.
Most recently, through a variety of
conversations and deep discussions about
what the EIA Designation is and what it
means to applicants, previous designees,
reviewers, co-sponsors, endorsers, and
the assessment community writ large,
we recognize that to ensure relevance the
EIA Designation needs to evolve.
What’s Next?
NILOA is now going through its
own transitions, and we, too, need to
acknowledge change. In response to
these changes, AAC&U, a long-time
collaborator on the EIA, will take on
stewardship of the award beginning in
January 2022. Under the leadership of
Kate Drezek McConnell, Vice President
for Curricular and Pedagogical Innova-
tion and Executive Director of VALUE,
AAC&U will engage the broader
assessment community in a robust and
reflective evaluation of the EIA Designation's
mission, processes, and outcomes in or-
der to identify areas of excellence within
the current protocols as well as opportu-
nities for change, growth, and enhance-
ment. Drawing on its history of com-
munity engagement and crowdsourcing
within the higher education community,
AAC&U is excited to embark on this
important work. You can expect a call to
engage in this process in the near future.
At AAC&U, we believe now is the
right moment for further reection, re-
view, and possible enhancement of the
EIA Designation. As we work to ensure
the long-term viability and relevance of
the EIA Designation, there are questions
we will continue to ask of ourselves
and of the Designation, from deciding
whether there should be additional
levels (e.g., a “Rising to Excellence” or
“Honorable Mention” for those institutions
that were told to resubmit their
application) to addressing the dynamic
tension between articulated and enacted
assessment practices, along with demon-
strated results of these processes. These
questions and more will assist in our
evaluation of the EIA process and help
decide the path forward.
Final Thoughts from NILOA
NILOA has shepherded the EIA pro-
cess through its many changes, and while
it is tough to let go of the reins, we are
excited for the EIA's future. We know and
trust it is in the good hands of AAC&U,
and we look forward to assisting where
needed. We want to recognize the hard
work of our expert assessment reviewers
in providing feedback to institutions that
applied each year.
And most of all, we want to thank
the 41+ institutions that have applied
and made the EIA Designation what it
is today. As we continue to highlight and
celebrate your excellence in assessment,
we hope to have earned your trust and
friendship. For those of you with whom
we have interacted over the past few
years discussing and lifting up your ex-
cellent assessment practices, it was our
pleasure.
Gianina R. Baker is the acting director of
the National Institute for Learning Out-
comes Assessment and the associate direc-
tor for evaluation, learning, and equitable
assessment in the Ofce of Community
College Research & Leadership at the
University of Illinois Urbana-Champaign;
and Kate Drezek McConnell is the vice
president for curricular and pedagogi-
cal innovation and executive director of
VALUE at AAC&U.
place within learning experiences at the
course, program, and institutional levels.
We will discuss Principle #1 in greater
detail in Volume 34, Number 2.
Principle #2: Value the multitude
of perspectives, contexts, and
methods related to assessment
and improvement
Peer review processes require an un-
derstanding of how perspectives, con-
texts, and methods support assessment
and improvement activities. Perspectives
in peer review include those of review-
ers, stakeholders, and decision-makers.
The value of peer review is often maxi-
mized by leveraging and incorporating
feedback from multiple peer reviewers,
including internal colleagues, external
subject matter experts, community mem-
bers, and other important constituents of
the activity undergoing review. Stake-
holders include administrators, who may
sponsor the peer review process; faculty
and staff of the activities involved in the
peer review process; students and alumni
who are often direct beneciaries of
learning interventions; and partners, in-
cluding those on-campus or elsewhere,
who make specic learning contribu-
tions. Decision-makers are individuals at
various levels who lead and champion the
work being peer reviewed and are often
able to effect change as an outcome of
feedback received from reviewers. Con-
texts for peer review in assessment and
improvement include both the type and
scope of activity undergoing peer review
and its placement in the activity lifecycle,
along with the institutional culture for as-
sessment and improvement, the motiva-
tions for peer review, and how outcomes
from peer review processes are used. Fi-
nally, methods employed in the peer re-
view process are often informed by the
goals and scope of the activities being
reviewed. Such methods may include a
blend of direct, indirect, quantitative, and
qualitative approaches to data gathering;
use in-person, virtual, hybrid, or inde-
pendent review of artifacts; involve ob-
servations, interviews, focus groups, and
document analysis; rely on individual or
team judgements; and range from highly
prescribed/structured to highly emergent/
semi-structured review processes. We
will discuss Principle #2 in greater detail
in Volume 34, Number 3.
Peer Review in Assessment and Improvement:
An Overview of Five Principles to Promote
Effective Practice
Principle #3: Adopt a consultative
approach to the peer review
process
Effective peer reviewers often adopt a
consultative approach to the peer review
process, which involves reviewing infor-
mation, querying stakeholders, evaluat-
ing evidence, making judgements, and
generating recommendations. Such a
consultative approach entails having the
peer reviewer serve as a “critical friend”
to the program, entity, or context under-
going review, along with understanding
desired roles, behaviors, and expectations
of a consultant. The consultative process
in which peer reviewers participate
includes phases such as preparation, initial
entry, engagement, analysis, judgment,
feedback, clarication, and exit, with spe-
cic stakeholder relationships unfolding
in each phase. There are numerous other
considerations involved in the
consultative approach, including using specific
tools and resources to engage in peer
review; adopting an appreciative inquiry
perspective to the work; placing the re-
view of an activity in its broader context,
such as institutionally, disciplinarily, or
nationally; navigating ambiguity, com-
plexity, and interpersonal or political dy-
namics; and fostering an environment that
allows for candid exchange of ideas and
experiences. We will discuss this princi-
ple in Volume 34, Number 4.
Principle #4: Make effective
judgements using inclusive
sources and credible evidence
One principal role of peer reviewers in
their assessment and improvement work
is to make effective judgements using
inclusive sources and credible evidence.
This entails determining who the
“inclusive sources” are and what counts
as “credible evidence” in reviewing the activity. It
also relies on peer reviewers ensuring that
all necessary stakeholder perspectives
are included in the process; such stake-
holders often include students, alumni,
faculty, staff, administrators, colleagues
elsewhere in the institution supporting or
interacting with the activity undergoing
review, and external partners. The goal
is to invite and promote a multiplicity of
sources to inform emerging themes. As
peer reviewers engage in their analysis
of feedback from stakeholders, it is necessary for
them to identify isolated incidents, pat-
terns of behavior, and systemic issues, all
of which should yield information about
what is working well, what are areas for
improvement, and what are specic rec-
ommendations or observations. As peer
reviewers make effective judgements,
they will need to recognize the broader
environmental considerations; this entails
placing the activity in its proper
comparison context. Often this involves an
understanding of satisficing vs. maximizing
performance or outcome of the activity
being reviewed, with an appreciation
of the activity’s resources, contexts, and
priorities. Finally, peer reviewers need to
always keep in mind the scope of the
review and remind themselves—and others
involved in or benefitting from the peer
review process—of the type of information
the reviewer is being asked to provide. We
will discuss this principle in Volume 34,
Number 5.
Principle #5: Provide relevant
feedback to stakeholders
Ultimately, effective peer review pro-
cesses yield outcomes that can make
a positive difference to enhancing the
performance of individuals, learning en-
vironments, programs, and institutions.
This requires peer reviewers to provide
relevant feedback to stakeholders. There
should be distinctions made between
evaluative and improvement-oriented
feedback, along with an understanding of
the format in which feedback is expected
and the intended audiences and uses for
feedback. The timing and nature of feed-
back—formative, to make improvements
vs. summative, to provide evaluations—
also needs to be claried as part of expec-
tation setting for peer review processes.
Often feedback from peers involves
sharing of recommendations; thus, care and
attention are necessary to prioritize
recommendations, including identifying
sequential or interdependent actions and the time
or cost horizons associated with recom-
mendations. In some instances, it may be
necessary for the recipients of feedback to
grapple with differing perspectives held
by multiple peer reviewers—either from
reviewers as part of a multi-reviewer team
or from feedback received by multiple in-
dividual reviewers. Finally, responding to
feedback, socializing the feedback with
stakeholders, adopting recommendations,
and institutionalizing components of the
peer review process are all vital compo-
nents to ensuring feedback from peers is
used effectively by stakeholders. We will
discuss this principle in Volume 34, Num-
ber 6.
There are plentiful opportunities and
contexts for using peer review to support
assessment and improvement in higher
education. Thus, we look forward to
focusing more fully on each of these five
principles in Editors' Notes throughout
the remainder of 2022. Thank you for
reading Assessment Update.
start early and identify required sections
within the application. It's especially
helpful to begin with the end in mind:
• start your project with a clear vision of
your desired direction and destination;
• identify subject matter experts who
can contribute knowledge to the plans,
strategies, and use of data that will be
showcased in your application;
• create a project management timeline
with agreed-upon deadlines; and
• get an editor involved to review your
application to ensure flow, accuracy,
clarity, and lack of errors in your final
product.
Jaclyn Zacharias, Nancy Ackerman, and
Christine S. Yates are assessment special-
ists at Capella University.
Capella University’s Journey Toward
Sustained Excellence in Assessment