Program Assessment
What is program assessment and why is it important?
Program assessment is an ongoing process designed to monitor and improve student learning. As part of the university's accreditation process, UTEP must describe how it measures the extent to which it reaches educational objectives that are consistent with its mission and appropriate for the degrees it offers. Under the leadership of the Associate Provost for Institutional Effectiveness, the Office of the Provost oversees curriculum management, assessment activities, accreditation processes, and faculty activity reporting to measure and continuously improve academic programs on campus.
What does a program assessment plan entail; where do I begin?
An assessment plan is part of a continuous cycle in which assessment results are used to support sustainable improvements. An assessment plan helps programs make intentional connections between expectations for learning, course pedagogy, and curriculum development.
Assessment Plan Toolkit
This toolkit will provide your department with best practices, examples, and templates to build the elements of an assessment plan.
[Visual model of the institution's approach to meaningful academic outcomes]
Writing Student Learning Outcomes
Student learning outcomes are the specific knowledge, skills, abilities, or attitudes that students are expected to attain by the end of a learning experience or program of study. Good learning outcomes clarify what you want students to learn and indicate what type of student work might demonstrate that learning. Student learning outcomes should be meaningful, measurable, and manageable.
Student Learning Outcomes Examples
Learning outcomes are often mistaken for program goals. The examples below illustrate the difference; note the measurable verbs that demonstrate critical thinking, in contrast to the ambiguous, unmeasurable verbs typical of program goals.
| Student Learning Outcomes | Measurable Verbs that Demonstrate Critical Thinking | Program Goals | Ambiguous/Unmeasurable Verbs |
|---|---|---|---|
| Students will write persuasive papers using standard written English | Write | Students will communicate effectively | Communicate |
| Students will articulate the value of considering diverse perspectives | Articulate | Students will appreciate diversity | Appreciate |
| Students will apply technological tools to solve problems | Apply | Students will understand technology | Understand |
Exercise for Writing Learning Outcomes
1. Who is expected to learn? Undergraduate students
2. What learning is expected? Identify environmental problems, evaluate problem-solving strategies, and develop science-based solutions.
3. When/where is learning expected? After completing prerequisite courses.
4. Why is learning expected? Students need to meet program requirements to graduate.
Together, the responses above help to create the following student learning outcome:
Undergraduate students will be able to identify environmental problems, evaluate problem-solving strategies, and develop science-based solutions to meet program requirements after completing prerequisite courses.
Exercises for Writing Measurable Verbs
Benjamin Bloom created a taxonomy of measurable verbs to help us describe and classify observable knowledge, skills, attitudes, and behaviors.
Creating a Curriculum Map
Curriculum mapping is the process of describing where (within the degree plan) student learning outcomes live, and how and where the student learning outcomes are developed. Curriculum mapping helps identify and address academic gaps, redundancies, and misalignments for purposes of improving the overall coherence of a course of study and, by extension, its effectiveness.
Curriculum Map Elements
- Student Learning Outcomes for your Program
- Required Courses in your Program
- Required Courses contributing to your Program
- Elective Courses in your Program
- Co-curricular requirements, if any
I = Objective is Introduced
P = Objective is Practiced
R = Objective is Reinforced
| Course | Objective 1 | Objective 2 | Objective 3 | Objective 4 |
|---|---|---|---|---|
| GEOL 1313 | I-P | I | | |
| GEOL 1314 | P | I | | |
| GEOL 2411 | I-P | I | I | I-P |
| GEOL 3412 | P | I | I-P | |
| GEOL 3315 | R | I-P | P-R | |
| GEOL 3420 | R | P | P-R | |
| GEOL 3423 | R | P | P | P |
| GEOL 3425 | R | P | R | P |
| GEOL 4375-4376 | R | P-R | P-R | P-R |
Objective 1: A general knowledge of physical and historical geology and of the interrelations between surface and interior earth processes.
Objective 2: The ability to solve geological problems, to propose multiple working hypotheses, to observe and map surface geology, and to deduce subsurface structure.
Objective 3: The ability to communicate geologic information in oral or written form.
Objective 4: The professional attitude required to conduct geological investigations as a graduate student or as an employee in industry or government.
I = Outcome is Introduced
D = Outcome is Developed and practiced with feedback
M = Outcome is Mastered at the level appropriate for graduation
| Course | Outcome 1 | Outcome 2 | Outcome 3 | Outcome 4 | Outcome 5 |
|---|---|---|---|---|---|
| “AHF” | I | I | I | | |
| “AHM/C” | I, D | I, D | I, D | I, D | I |
| “AHUD” | D | D | D | D | I, D |
| “AHLA” | D | D | D | D | I, D |
| “AHA/A” | D | D | D | D | I, D |
| “AHELECT” | M | M | M | M | M |
| “AHCPSTN” | M | M | M | M | M |
Art History Foundation “AHF”: (ARTH 1305, ARTH 1306)
Introductory Modern/Contemporary level Art History course options: “AHM/C” (ARTH 2303, ARTH 2313)
Upper-Division Art History elective options: “AHUD” (ARTH 3305, ARTH 3310, ARTH 3315, ARTH 3340, ARTH 3385, ARTH 3393, ARTH 3395, ARTH 3399)
Upper-Division Art History Latin American course options: “AHLA” (ARTH 3353, ARTH 3355, ARTH 3357, ARTH 3359)
Upper-Division Art History Asian/African course options: “AHA/A” (ARTH 3361, ARTH 3364, ARTH 3366)
Art History Electives course options: “AHELECT” (AHM/C, AHUD, AHLA, AHA/A)
Art History Capstone course options: “AHCPSTN” (ARTH 4383)
I = Outcome is Introduced
D = Outcome is Developed with practice
M = Outcome is Mastered
| Student Learning Outcomes | Core Courses (12 SCH) TED6300, 6301, 6302, 6310 | Research Methods (12 SCH) TED6396, 6320, 6322, and one of TED6321, 6323, 6319 | Specialization Courses (15 SCH) Strand Specific | Milestone 1: Portfolio Presentation (3 SCH) TED6394 | Milestone 2: Proposal Hearing (3 SCH) TED6397 | Milestone 3: Dissertation Defense (6 SCH) TED6398, 6399 |
|---|---|---|---|---|---|---|
| 1. Students will be able to conduct research using appropriate methodologies to study curriculum and instruction. | I | I | D | D | M | M |
| 2. Students will expand on the existing pedagogical knowledge base about learners from linguistically and culturally diverse backgrounds by examining and critically evaluating extant theories and practices. | I | I | D | D | M | M |
| 3. Students will design innovative instructional strategies to promote the cognitive and social development of diverse learners. | I | I | D | D | M | M |
| 4. Students will provide significant contributions to the research literature on educational reform through participation in research colloquia and publications. | I | I | I | D | D | M |
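A curriculum map can also be treated as a simple data structure, which makes gap-checking easy to automate. Below is a minimal, illustrative Python sketch; the course numbers, outcomes, and level assignments are hypothetical, following the I (Introduced) / D (Developed) / M (Mastered) convention used above.

```python
# Illustrative sketch only: a curriculum map as a dictionary, using the
# I (Introduced) / D (Developed) / M (Mastered) convention from the
# tables above. Course numbers and level assignments are hypothetical.

curriculum_map = {
    "COURSE 1301": {"Outcome 1": "I", "Outcome 2": "I", "Outcome 3": "I"},
    "COURSE 2302": {"Outcome 1": "D", "Outcome 2": "D", "Outcome 3": "D"},
    "COURSE 4390": {"Outcome 1": "M", "Outcome 2": "M"},  # capstone
}

outcomes = ["Outcome 1", "Outcome 2", "Outcome 3"]

for outcome in outcomes:
    # Gather every level at which this outcome appears across the map.
    levels = {
        course_levels[outcome]
        for course_levels in curriculum_map.values()
        if outcome in course_levels
    }
    if "I" not in levels:
        print(f"{outcome}: never introduced -- possible gap")
    if "M" not in levels:
        print(f"{outcome}: never reaches mastery -- possible gap")
```

In this made-up map, Outcome 3 is introduced and developed but never mastered, which is exactly the kind of gap or misalignment a curriculum map is meant to surface.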
Selecting Assessment Methods
Various direct and indirect instruments can be used to measure learning outcomes. Direct methods examine actual student performance on a learning task; indirect methods ask students to reflect on their own learning.
| Direct Methods Scenario | Indirect Methods Scenario |
|---|---|
| Did the students who completed the midterm project actually show improvement in their final project? Direct methods to use include rubric-scored student work and observational checklists (see below). | Did the students who completed the midterm project feel more confident in their abilities? Indirect methods to use include surveys, interviews, and focus groups (see below). |
More Assessment Methods
Rubrics are helpful for addressing complex, compound outcomes with multiple dimensions. Scores can be collected on each dimension separately, and an overall score can be calculated or derived. The Association of American Colleges & Universities offers 16 VALUE rubric samples and templates at no cost; visit https://www.aacu.org/value-rubrics.
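Because each dimension is scored separately, an overall score can be a simple weighted combination. Here is a minimal sketch; the dimension names, weights, and 1-4 scale are hypothetical, not a prescribed rubric.

```python
# Illustrative sketch only: deriving an overall score from separate
# rubric-dimension scores. The dimensions, weights, and 1-4 scale are
# hypothetical examples, not a prescribed rubric.

scores = {"organization": 3, "evidence": 4, "style": 2}    # one student
weights = {"organization": 0.4, "evidence": 0.4, "style": 0.2}

overall = sum(scores[dim] * weights[dim] for dim in scores)
print(f"Overall weighted score: {overall:.2f}")  # 3.20 on the 1-4 scale
```

Keeping the per-dimension scores alongside the derived overall score lets the program report on each dimension separately later.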
Observational checklists are used in situations where student performance can be directly observed as it happens, such as clinical fieldwork, laboratory skills courses, mock job interviews, or creative performances.
Surveys are helpful to measure attitudes, opinions, feelings, and impressions. These indirect methods can also measure past behaviors that could not have been observed, intentions, future directions, or plans for behavior.
Interviews are similar to surveys but allow for a wider range of answers, and they make it easier to follow up on interesting directions and explore responses more deeply. Interviews often require complex coding for later analysis, but they are a valuable measurement tool.
Focus groups are best used when you need insight into the reasons for participants’ beliefs, attitudes, and experiences. Participants have the opportunity to react to each other’s ideas, which can provide some consensus about the degree to which the outcomes were met. Focus groups can help explore multiple issues and programs.
Designing an Assessment Plan
Once the curriculum map is in place, decide what will be assessed and when. It is not necessary to assess every outcome every year. Consider the following checklist when designing your assessment plan for the academic year; a sketch of a multi-year rotation follows the checklist.
Assessment Plan Checklist
✔ Determine dates when data is collected and reviewed
✔ Is the data collected meaningful?
✔ Is the data collected measurable?
✔ Is the data collected manageable?
✔ Select appropriate sample sizes; do not try to assess every student for every outcome
✔ Use students’ best work
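One way to operationalize "not every outcome, every year" is a simple rotation schedule. The sketch below is illustrative only; the outcome names and three-year cycle are hypothetical.

```python
# Illustrative sketch only: a three-year rotation so that every outcome
# is assessed at least once per cycle without assessing everything every
# year. The outcome names and years are hypothetical.

rotation = {
    2024: ["Outcome 1", "Outcome 2"],
    2025: ["Outcome 3", "Outcome 4"],
    2026: ["Outcome 5", "Outcome 1"],  # revisit Outcome 1 after changes
}

all_outcomes = {f"Outcome {i}" for i in range(1, 6)}

# Sanity check: confirm every outcome appears somewhere in the cycle.
covered = {o for year_outcomes in rotation.values() for o in year_outcomes}
missing = all_outcomes - covered
print("Not scheduled this cycle:", sorted(missing) or "none")
```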
The following plans have been made available to help you see samples of a finished assessment plan.
Assessment Plans
BBA in General Business
PhD Geological Sciences
Master of Business Administration (MBA)
MS in Software Engineering
Learning Outcomes
BBA in General Business
PhD Geological Sciences
Master of Business Administration (MBA)
MS in Software Engineering
Gathering Evidence
The process of collecting evidence should be clearly defined and shared among all stakeholders to ensure it is embedded in recurring processes. Communication is key to determining who is responsible for administering the assessment method and who is responsible for gathering results and storing evidence in a shared folder for continuity. The assessment timeline defines where and when evidence will be gathered, but an action plan must be in place for these data collection points to actually happen. Consider the following tips for gathering evidence.
Tips for gathering evidence
✔ For course-embedded assessments, incorporate the assessment method into the course syllabus schedule to ensure it always happens.
✔ Utilize survey software such as QuestionPro to assess learning outcomes and make survey results easily accessible.
✔ Create a shared folder accessible to all stakeholders to ensure data is stored and shared. (Microsoft OneDrive for Business is UTEP’s supported cloud storage system.)
✔ Make use of data you already have.
Sampling Techniques
Collected evidence does not need to include, for example, all capstone papers from a class of 150 students; a sufficiently large sample of capstone papers can be gathered to represent the whole. Sampling allows the program to focus on a portion of the evidence and present findings that apply to all. Consider the following tips for selecting a sample (a brief sampling sketch follows the list):
- Remove any identifiers from evidence to avoid linking data to public records.
- Be sure the sample is large enough to be representative.
- Use stratified random sampling to reduce bias.
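As one illustration of the stratified approach, the following sketch draws a proportional random sample from each stratum; the strata (class levels) and counts are hypothetical.

```python
import random

# Illustrative sketch only: proportional stratified random sampling of
# student work. The strata (class levels) and counts are hypothetical.

papers_by_stratum = {
    "junior": [f"junior_paper_{i}" for i in range(90)],
    "senior": [f"senior_paper_{i}" for i in range(60)],
}

total = sum(len(p) for p in papers_by_stratum.values())  # 150 papers
sample_size = 30                                         # a 20% sample

sample = []
for stratum, papers in papers_by_stratum.items():
    # Each stratum contributes in proportion to its share of the whole.
    k = round(sample_size * len(papers) / total)
    sample.extend(random.sample(papers, k))

print(f"Selected {len(sample)} of {total} papers")
```

Sampling each stratum proportionally (here, 18 junior and 12 senior papers) keeps the sample's composition close to the population's, which reduces bias relative to a single undifferentiated draw.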
Analyzing and Discussing Evidence
Presenting evidence in quantitative and qualitative terms will ensure a descriptive and meaningful analysis of results.
| Quantitative | Qualitative |
|---|---|
| Demographic information (e.g., what groups were included) | Common themes mentioned in a survey or interview |
| Frequencies (e.g., how many performed at each level) | Shared comments mentioned in focus groups |
| Central tendencies (e.g., means or medians) | Explanatory quotes that effectively illustrate praises or concerns |
| Differences over time or between groups | Definitions of terms |
| Percentages (e.g., increases, decreases, rates, etc.) | Descriptions of values (to emphasize alignment with mission) |
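As a concrete illustration of the quantitative column (and of the target comparison discussed under Targets/Benchmarks below), the following sketch summarizes ten hypothetical rubric scores.

```python
from collections import Counter
from statistics import mean, median

# Illustrative sketch only: summarizing hypothetical rubric results.
# Scores use a made-up 1-4 scale, where 3 means "acceptable".

scores = [4, 3, 2, 3, 4, 1, 3, 2, 3, 4]

freq = Counter(scores)  # frequencies: how many performed at each level
pct_at_target = 100 * sum(s >= 3 for s in scores) / len(scores)

print("Frequencies:", dict(sorted(freq.items())))
print(f"Mean: {mean(scores):.1f}, Median: {median(scores)}")
print(f"{pct_at_target:.0f}% scored at the 'acceptable' level or above")
```

Here 70% of the hypothetical sample meets the "acceptable" level; that is the kind of figure a program compares against its target, as described next.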
Targets/Benchmarks
A target is a realistic level of performance that the program would like to achieve. Explain your data in relation to the program’s target. Sample explanation: "Our target was for 75% of students in our sample to score at the “acceptable” level or above on the rubric. Data show that 70% of students reached this level."
Types of Visuals to Present Evidence
| Tables | Graphs |
|---|---|
| Organize evidence | Pie chart (part of a whole or distribution) |
| Highlight trends | Line or bar chart (historical trends) |
| Comparisons | Bar chart (categorical groups) |
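As a hedged illustration, the sketch below renders one of these chart types with matplotlib: a bar chart of the hypothetical rubric-level counts from the summary example above.

```python
import matplotlib.pyplot as plt

# Illustrative sketch only: a bar chart of categorical groups (rubric
# levels). Counts are hypothetical, matching the summary example above.

levels = ["Beginning", "Developing", "Acceptable", "Exemplary"]
counts = [1, 2, 4, 3]  # number of students scoring at each level

plt.bar(levels, counts)
plt.ylabel("Number of students")
plt.title("Student performance by rubric level")
plt.tight_layout()
plt.savefig("rubric_levels.png")  # or plt.show() in an interactive session
```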
Tips for Analyzing and Discussing Evidence
✔ Involve various stakeholders to bring in different perspectives, such as faculty who teach the same course, advisory board members, and program alumni.
✔ Build the process of analyzing data into recurring faculty meetings/retreats to ensure it gets done.
Upload to the Planning Module
Updated assessment plans (for the upcoming year) should be uploaded to the Planning Module by March 1 of the current year. To log in to the Planning Module, go to My UTEP and log in with your UTEP credentials. The Planning Module App is located on your dashboard. A full tutorial on uploading your assessment plan is available for download (see the Planning Module Tutorial below).
Responding to Findings
Collaboration and consensus among multiple stakeholders are essential for responding to findings and justifying decisions concerning changes in instruction and/or curriculum.
The following questions will allow the program to reflect on findings and possible action plan(s):
- What do these findings mean for your program?
- Did your assessment reveal something that helped your program make a change?
- Did your results show that your intervention worked?
- Can a policy decision be made based on the data provided?
- Are the assessment methods effective in measuring the specific student learning outcome?
How to Write an Effective Action Plan
- Set SMART goals (Specific, Measurable, Attainable, Relevant, and Time-based)
- Create a list of actions
- Set a timeline
- Designate resources
- Monitor the progress
Following up on last year’s action plan and longer-term follow-up
- Always close the cycle and report on changes that were implemented.
- Report on successes.
- Report on unintended consequences.
- Report on ongoing efforts.
Each program should provide an updated assessment plan for the upcoming year by March 1 of the current year. For example, assessment plans for the 2020-2021 academic year would be due by March 1, 2020.
Within each assessment plan, there is a timeline box in which the program identifies its own deadline for providing the assessment report on the previous year’s data. Typically, a program reports its data for the last academic year by late fall of the subsequent year, but these deadlines may be affected by program-level accreditation reports or college-level deadlines.
A Planning Module Tutorial is available for download. Please contact assessment@utep.edu for further assistance.
Your assessment plan belongs to you. You may access your assessment plan and update it at any time, and you may create a brand-new assessment plan with a new timeline at any time. Please document any changes in your outcomes, measures, timeline, or curriculum map in your assessment plan.
The assessment plan process should involve all faculty from your department. In particular, those who are involved in the development and review of your curriculum should be involved in creating the curriculum map and identifying appropriate measures. Those involved in delivering your curriculum should be involved in the sampling and review of student work. All faculty should be involved in interpreting the findings and determining what responses or actions might be necessary.
Course grades nearly always reflect the evaluation of many different course objectives and outcomes for one student. Assessments evaluate only one outcome at a time, across many students. A student may be an excellent writer but not quite so excellent at empirical and quantitative reasoning, and receive a B as a result. That B does not reveal that the student was exemplary in writing but below average in quantitative reasoning. The goal of assessment is not to look at how one particular student is doing on many things, but to look at how an entire sample of students is doing on ONE thing.
Assignment grades may be perfectly aligned with outcomes and perfectly appropriate for assessment; if so, they may be used, especially if objectively scored. Best practice suggests having multiple reviewers in your program evaluate a set of student assignments with a shared understanding (through a rubric or checklist, for example), as this generally leads to a clearer picture of how students are doing on a particular outcome. Reviewing together can also build clarity on expectations among faculty across a program.
Programs that hold disciplinary (specialized) accreditation still need a separate assessment plan and report for SACSCOC. Accrediting bodies are not necessarily the same, nor do they follow the same guidelines. Certainly, you may use the same data and planning processes, but the reporting needs to align with the rest of the university. We are happy to help programs translate their regular processes and reports into the summary reports needed for SACSCOC.
Karla Iscapa, Director of Assessment and Evaluation, will provide custom workshops for you and/or your department. Please contact assessment@utep.edu to request a workshop.