
Use of Student Learning Evidence

Visual depiction of the NILOA Transparency Framework, Use of Student Learning Evidence component.

Assessment results inform continual reflection and discussion of teaching and learning; they contribute to decision making that ensures effective teaching and learning. Decisions can include making changes to a program, continuing current effective practices, or building on strengths.

Below are some examples of how student learning evidence in the context of UCORE contributes to decision making intended to support student learning and quality education.

Many Uses, Many Levels

Using assessment results to inform decisions and improvements is the most complicated “step” in the assessment cycle, since it includes multiple levels of use by faculty, program leadership, and university administration. 

Selected Examples of Uses of UCORE Student Learning Evidence


Below are some examples of how student learning evidence from Key Assessments for UCORE contributes to decision making intended to support student learning and quality education. Click on the links for more information about the many ways that assessment results are being used to advance and improve the student experience.


UCORE Capstone [CAPS] Course Assessment

The UCORE committee and its subcommittee for assessment review the UCORE Capstone [CAPS] Course Assessment results each year and suggest actions to improve assessment or to use results to inform decision-making, a practice begun in 2015. As part of UCORE Capstone [CAPS] Course Assessment reporting, [CAPS] instructors also indicate whether they have made changes to their [CAPS] courses based on previous assessments, or whether they plan to make changes in the future.

Below are some recent examples of how student learning evidence from UCORE Capstone [CAPS] Course Assessment contributes to decision making intended to support student learning and quality education. (For additional current and past examples of how student learning evidence from UCORE Capstone [CAPS] Course Assessment contributes to decision making, see [CAPS] Assessment Use.)

  • Using Assessment to Inform Decision-making in AY 2019-20 UCORE Capstone [CAPS] Courses

    As part of AY 2019-20 [CAPS] Assessment Reporting for UCORE, instructors were asked if they planned to make any changes in future semesters based on assessments from the current semester. Overall, 58% of instructors indicated that they planned to make a change to their course based on assessment.

    Read Story
  • Using Assessment to Inform Decision-making in AY 2017-18 UCORE Capstone [CAPS] Courses

    As part of AY 2017-18 [CAPS] Course Assessment reporting, instructors who indicated they had taught their [CAPS] course before were asked to report if they had made any changes to their courses based on assessments from previous semesters. Additionally, instructors indicating that they would be teaching their [CAPS] course again in the future were asked if they planned to make any changes based on assessments from this semester. Overall, 68% of instructors indicated that they had made or planned to make a change to their course based on assessment.

    Read Story
  • Using Assessment to Inform Decision-making in AY 2016-17 UCORE Capstone [CAPS] Courses

    As part of AY 2016-17 [CAPS] Course Assessment reporting, instructors who indicated they had taught their [CAPS] course before were asked to report if they had made any changes to their course in fall 2016 and spring 2017 based on assessments from previous semesters. Additionally, instructors teaching their [CAPS] course for the first time were asked if they planned to make any changes in future semesters based on assessments from this semester. Overall, 62% of instructors indicated that they had made or planned to make a change to their course based on assessment.

    Read Story

See Key Assessments for additional information about UCORE Capstone [CAPS] Course Assessment.


Roots of Contemporary Issues [ROOT] Assessment

The Roots of Contemporary Issues Steering Committee and faculty review assessment results each year and suggest actions to improve assessment or to use results to inform decision-making, a practice begun in 2012-13.

Below are some examples of how student learning evidence from Roots of Contemporary Issues Assessment contributes to decision making intended to support student learning and quality education.

See Key Assessments for additional information about Roots of Contemporary Issues [ROOT] Assessment.


National Survey of Student Engagement (NSSE)

The UCORE subcommittee for assessment and university leadership review NSSE results following each administration. NSSE results are also disaggregated for undergraduate academic degree programs, colleges, and campuses to provide information about the student perspective to help continually improve the learning experience for students. NSSE reports provide participating institutions with results that compare their students’ responses with those of students at self-selected groups of comparison institutions. Universities can use their NSSE results to identify aspects of the undergraduate experience that can be improved through changes in policy and practice. 

Below are some examples of how student learning evidence from NSSE (the survey was updated in 2013) contributes to decision making intended to support student learning and quality education.

  • Using National Survey of Student Engagement (NSSE) Results at Multiple Levels to Improve the Student Experience

    The National Survey of Student Engagement (NSSE) provides valuable data at WSU for undergraduate academic degree programs, departments, colleges, campuses and the university. NSSE results provide degree programs, departments and colleges with information about the student perspective to help continually improve the learning experience for students. WSU students’ NSSE responses have also helped the university understand what is going well in terms of student engagement, and areas that could improve. 

    Read Story

See Key Assessments for additional information about the National Survey of Student Engagement.

Below are some examples of how student learning evidence on WSU’s Learning Goals contributes to institutional-level decision making intended to support student learning and quality education.

  • Use as a dashboard to monitor the effectiveness of the UCORE curriculum
  • Learning outcomes assessment data, complemented by centrally collected indicators (such as EAB analytics and course pass and DFW rates), have helped guide university-level initiatives and decisions that support teaching, learning, and quality education, including:
      • General education revision, adjustments to WSU’s Seven Learning Goals, the UCORE handbook and policy, and renewal of UCORE courses
      • Recommendations from the UCORE Assessment Committee to improve assessment processes

Below are some additional examples of how student learning evidence contributes to decision making intended to support student learning and quality education in the context of the UCORE curriculum. Click on the links for more information about the many ways that assessment results are being used to advance and improve the student experience.

  • Embedded Assessment Results Influence Teaching and Build Shared Expectations of Student Achievement in English 101 [WRTG]

    WSU’s Pullman English Composition Program has used English 101 (College Composition) assessment results to guide professional development for instructors and to start conversations across campuses to increase the shared understanding of instructors about the learning outcomes and expectations for student achievement.

    Read Story
  • Research Services Librarians Conduct and Use Information Literacy Assessment

    Information Literacy, one of WSU’s Seven Learning Goals of Undergraduate Education, refers to effectively identifying, locating, evaluating, responsibly using, and sharing information for the problem at hand. The nearly thirty WSU research/public services librarians, across all campuses (including fully online WSU Global), provide library instruction to faculty, students, and staff. Subject specialist librarians and departmental liaisons offer information literacy instruction in their respective disciplines and subject areas. Many of the public services librarians are part of the Library Instruction Team, which focuses on collaborative instructional work with academic and co-curricular programs.

    Read Story
  • Using Science Literacy Concept Inventory (SLCI) Results to Improve UCORE Courses

    Science Literacy Concept Inventory (SLCI) results provide faculty with information about students’ grasp of science literacy. Wrong answers on the SLCI reveal students’ common misconceptions about science, information that can be used to direct and improve teaching related to science literacy. For example, SLCI responses from students in a particular course may reveal that many students hold the misconception that human perceptions alter physical laws. Armed with this knowledge, faculty could aim instruction at improving students’ understanding of how science rests on physical laws that are unchanged by public opinion or perception.

    Read Story
  • Using Science Literacy Concept Inventory (SLCI) Results to Inform Decision-making at Multiple-levels

    Faculty have used results from the Science Literacy Concept Inventory (SLCI) in a variety of ways to inform and improve their science literacy instruction including adapting assignments and more explicitly addressing science literacy concepts in instruction. Academic degree programs have used the data to compare students at different academic levels. At the institutional level, SLCI data has been used to indicate progress toward institutional learning goals. 

    Read Story

While difficult to capture, some impacts are not immediately visible; they accumulate over time and contribute to promoting student learning at WSU:

Increasing shared faculty understanding of the curriculum

Assessment activities and discussion of results offer ways for faculty to think about student learning in the curriculum and how to support it most effectively in their own classes, increasing shared faculty understanding of the curriculum.  For example, norming on a rubric can deepen a common understanding of particular learning outcomes among faculty, and, over time, can help focus instruction and improve communication and feedback to students.

Sowing seeds that promote faculty learning

As avenues of organizational change, dialog and collaboration within the faculty collective on assessment-related activities can stimulate faculty learning and development — which over time feeds into continual improvement of teaching and learning. Though not immediately visible, influences may include: personal transformation in thinking about a particular aspect of teaching or learning, or how learning occurs; changes to faculty motivation or attitudes; disruptions to conventional wisdom which allow faculty to re-examine an issue in the future; or building communities of practice within key courses, programs or a department.

Reference: Jonson, Guetterman, & Thompson (2014). An Integrated Model of Influence: Use of Assessment Data in Higher Education. Research and Practice in Assessment, Volume 9, Summer 2014.