Assessing Curricular and Co-Curricular Student Learning
Guided by the understanding that outcome-based student learning assessment provides the tools to communicate about, understand, and advance student learning in curricular and co-curricular areas, UNE’s University Assessment Committee (UAC) has compiled the following assessment resources.
The materials reflect the UAC’s mission of enhancing, facilitating, and making transparent a university-wide student learning assessment system, and its vision of expanding, promoting, and facilitating equity-based and equity-driven assessment practices to advance student learning in UNE’s curricular and co-curricular areas.
Key Steps of an Assessment Process
Assessing curricular and co-curricular programs’ educational effectiveness entails several standard steps. Wiggins and McTighe’s (2005) backward design serves as a model for establishing those steps and making outcomes-based student learning assessment practices central to advancing student learning.
Unlike the traditional, frontloaded framework, which designs curricular and co-curricular offerings around content and materials, backward design begins by establishing the student learning outcomes, then determining the assessment measures, and finally planning the learning materials and instruction around those outcomes.
As the assessment wheel illustrates, once curricular and co-curricular programs establish their learning outcomes and measures, they then continue to follow the standard assessment steps of identifying each outcome’s benchmarks or target goals, collecting and analyzing the data, and making decisions based on the findings.
Following the contributions of Montenegro (2020) and Lundquist and Heiser (2020), this assessment wheel adds another essential component to the steps: students and other stakeholders. To create an inclusive assessment process, these authors advise involving students and stakeholders at every step by including their voices and input and by considering positionality, intentionality, and power. In short, they advise us to keep the human at the center of assessment design and practice.
Sources
Lundquist, A. E., & Heiser, C. A. (2020, August 21). Practicing equity-centered assessment. Campus Labs. https://www.anthology.com/blog/practicing-equity-centered-assessment
Montenegro, E. (2020). Focus on students and equity in assessment to improve learning. In N. A. Jankowski, G. R. Baker, K. Brown-Tess, & E. Montenegro (Eds.), Student-focused learning and assessment: Involving students in the learning process in higher education (pp. 187-209). Peter Lang.
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Association for Supervision and Curriculum Development.
Student Learning Outcomes
Student learning outcomes serve as the basis for program and course curricula and, through assessment, lead to an understanding of the degree to which students achieve those outcomes.
Direct and Indirect Measures
Direct and indirect measures provide the tools to gather quantitative and qualitative data that can demonstrate the degree to which students achieve the learning outcomes.
After creating measurable student learning outcomes (SLOs), the next step is to develop direct and/or indirect measures that align with those SLOs and will provide evidence of the level of student learning the course/program has achieved. Consider using both direct and indirect measures to acquire more comprehensive evidence of student learning, which can lead to more informed curricular and instructional changes.
Using direct measures, qualified evaluators directly observe, assess, and provide tangible evidence of students’ skills, knowledge, and performance of the SLOs. Accompany direct measures with a clearly defined rubric, checklist, or exam blueprint that establishes the standards of the SLOs and allows for the systematic collection of evidence of student learning.
Indirect measures provide data on students’ self-reported thoughts, attitudes, beliefs, and values regarding their learning, as well as data on the educational environment where that learning takes place. Indirect measures are often used to interpret and support evidence of student learning collected from direct measures.
Examples of Direct Measures
- Scores and pass rates on standardized, licensure, or certification exams
- Capstone projects (e.g., research essays, theses, dissertations, presentations, oral defenses, exhibitions)
- Written work (e.g. minute papers, short answers, essays, scaffolded writing assignments)
- Annotated bibliography
- Presentations (e.g., PowerPoint)
- Poster boards
- Performances
- Portfolios of student work
- Case studies
- Roleplay
- Simulations
- Locally designed exams (e.g., final exams in key courses or qualifying exams)
- Journals / double-entry journals
- Team or group projects or presentations
- “Think-alouds”
- Knowledge maps
- Classroom response systems (clickers)
- Service-learning projects or experiences
- Online asynchronous student discussion threads
- Wikis or blogs
- Observations of student behavior (e.g., in presentations and group discussions)
- Debates
- Pre- and post-test or essay scores (and score gains that illustrate the value added to student learning)
- Field supervisor ratings of student skills in internships, clinical experiences, practica, student teaching, or other professional and content-related experiences
Examples of Indirect Measures
- Employer ratings of graduates’ performance in the workplace
- Course grades, or students’ average grades across several different essays and assignments
- Retention rates
- Graduation rates
- Admission rates (such as those into other four-year colleges or graduate programs) and graduation rates from those programs
- Scores on tests required for further study, such as the Graduate Record Examinations (GRE) or the Medical College Admission Test (MCAT), that evaluate skills students have learned over a lifetime
- Quality and reputation of four-year and graduate programs where alumni have earned acceptance
- Employment of alumni in appropriate career positions and starting salary
- Alumni surveys of their career responsibilities and career satisfaction
- Student reflective essays or evaluations of their acquired skills and knowledge
- Other student surveys, questionnaires, exit interviews, or focus-group reports
- Employer surveys
- Student-earned honors, awards, or scholarships
- Rate of student involvement in faculty research, collaborative publications, or service-learning projects
- Voluntary notes or gifts from students or alumni
Adapted from Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Jossey-Bass.
Resources for Completing the Annual Assessment Report
UNE’s academic and co-curricular units annually submit an assessment report to the head of their college or division, the Office of the Provost, and the University Assessment Committee. For guidance on completing the report, refer to the following documents:
Internal and External Support
Internal Support
- University Assessment Committee
- UNE’s Center for Excellence in Teaching and Learning
- UNE’s Office of Institutional Research and Data Analytics
Local and National Assessment Resources, Professional Organizations, and Listservs
- AAC&U’s Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics
- American Evaluation Association
- ASSESS listserv
- Assessment Commons
- Association for Institutional Research (AIR)
- Association for the Assessment of Learning in Higher Education (AALHE)
- IUPUI Assessment Institute
- Learning Improvement Community
- National Institute for Learning Outcomes Assessment (NILOA)
- New England Educational Assessment Network (NEean)
- North East Association for Institutional Research (NEAIR)
Co-curricular Assessment Resources, Professional Organizations, and Listservs
- American College Personnel Association-College Student Educators International (ACPA)
- ACPA’s Commission for Assessment and Evaluation
- Assessment and Research in Career Services (ARCS)
- Council for the Advancement of Standards in Higher Education (CAS)
- National Academic Advising Association’s (NACADA) Assessment Resources
- National Association of Student Personnel Administrators (NASPA)
- Student Affairs Assessment Leaders (SAAL)
Assessment Journals
- Assessment and Evaluation in Higher Education
- Assessment in Education: Principles, Policy, and Practice
- Assessment Update
- Educational Assessment
- Educational Assessment, Evaluation, and Accountability
- Intersection: A Journal at the Intersection of Assessment and Learning
- Journal of Assessment and Institutional Effectiveness
- Journal of Student Affairs
- Journal of Student Affairs Inquiry
- Journal of Student Affairs Research and Practice
- New Directions for Teaching and Learning
- Research and Practice in Assessment