
Collecting Information

 
  In this stage of the assessment process, information is gathered to address the assessment questions. Although the assessment questions drive the selection of data collection techniques, there are a number of factors to consider.
  • What information needs to be collected?
    • The information that needs to be collected is delineated by the assessment questions.
  • What are the information sources? (target audiences)
    • People (e.g., student participants – current, past, withdrawn, prospective; support staff – tutors, counselors; program staff; faculty; parents; administrators)
    • Documents
    • Records
    • Observations
  • How much information should be collected?
    • Entire population
    • Sample of the population (a simple random-sampling sketch follows this list)
  • How should the information be collected? (methodology)
    • Survey (paper, web-based, scan form)
    • Focus group
    • Interviews (face-to-face, telephone)
    • Observations (e.g., events, behaviors, level of engagement)
    • Document analysis (e.g., program documents, activity logs, student work)
    • Record analysis (e.g., university student record system, attendance records)
    • Testing (pre-test, post-test)
    • Literature review
    • Other existing data sources (e.g., retention data, institutional survey data)
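
When only a sample of the population will be surveyed, a simple random draw from a roster is one common starting point, as noted in the list above. The short Python sketch below illustrates the idea; the roster size, sample size, and seed are illustrative assumptions rather than recommendations.

    import random

    # Hypothetical sketch: draw a simple random sample of students to survey
    # when resources do not allow collecting information from the entire
    # population. The roster and sample size are invented for illustration.
    population = [f"student_{i:04d}" for i in range(1, 1201)]   # e.g., 1,200 participants

    random.seed(42)                             # fixed seed so the draw can be documented
    sample = random.sample(population, k=120)   # 10% simple random sample

    print(f"Population size: {len(population)}")
    print(f"Sample size:     {len(sample)}")
    print("First five sampled IDs:", sample[:5])

If particular subgroups (e.g., withdrawn students) must be represented, a stratified draw that samples within each subgroup may be more appropriate than a single random draw.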

Some assessment questions are best addressed with a variety of data collection techniques. For example, a survey may be administered to gather information from a large number of people, with follow-up interviews or focus groups conducted with selected respondents to obtain more in-depth information. Drawing on several different information sources also helps substantiate the findings. For instance, when assessing a tutoring program, the data collection strategy might include a survey and/or focus groups with students, a survey and/or interviews with tutors, and an analysis of students' attendance records. Triangulation, that is, using multiple data-gathering strategies and several sources, helps explore the assessment questions more completely.
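
As a concrete illustration of triangulation for the tutoring-program example above, the hypothetical Python (pandas) sketch below joins two sources, student survey responses and attendance records, by student ID and flags cases where self-reported use and logged visits diverge; the column names and values are invented for the example.

    import pandas as pd

    # Hypothetical survey responses (self-reported tutoring use, 1-5 satisfaction).
    survey = pd.DataFrame({
        "student_id": [101, 102, 103, 104],
        "reported_visits_per_month": [4, 1, 6, 2],
        "satisfaction": [5, 3, 4, 4],
    })

    # Hypothetical attendance records pulled from a record system.
    attendance = pd.DataFrame({
        "student_id": [101, 102, 103, 104],
        "logged_visits_per_month": [3, 1, 7, 0],
    })

    # Join the two sources on student ID so self-reports can be checked
    # against logged attendance.
    merged = survey.merge(attendance, on="student_id", how="inner")

    # Flag large gaps between what students report and what the records show;
    # such cases might be followed up in interviews or focus groups.
    merged["report_gap"] = (merged["reported_visits_per_month"]
                            - merged["logged_visits_per_month"]).abs()
    print(merged[merged["report_gap"] >= 2])

Where the two sources agree, the finding is more credible; where they diverge, the discrepancy itself becomes something to explore with a follow-up method.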

Although the assessment questions drive the methodology, the practicality of the approach, particularly its time, cost, and scope, must also be weighed. The amount of time needed to develop the data collection instruments (e.g., survey, focus group protocol, document analysis guidelines), gather the information (e.g., distribute the survey, conduct the focus groups, review documents), and analyze the data must realistically fit the timetable of the program/project staff and administrators. Budgetary resources must be compatible with the cost of the assessment. The scope or magnitude of the assessment often depends on time and budget. For example, if a methodology calls for interviewing twenty participants but financial resources are limited and the timeline is short, the practicality of the approach comes into question.
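
A quick back-of-envelope calculation can make this practicality check concrete. The Python sketch below estimates the staff hours and cost of the twenty-interview example; every figure (hours per interview, transcription and analysis time, hourly rate, available budget and weeks) is an assumption for illustration only.

    # Hypothetical feasibility check for interviewing twenty participants.
    participants = 20
    hours_interviewing = 1.0    # conducting one interview (assumed)
    hours_transcribing = 3.0    # transcribing one recorded interview (assumed)
    hours_analyzing = 2.0       # coding and analyzing one transcript (assumed)
    hourly_rate = 30.00         # staff cost per hour (assumed)

    total_hours = participants * (hours_interviewing + hours_transcribing + hours_analyzing)
    total_cost = total_hours * hourly_rate

    available_budget = 2500.00  # assumed budget for this part of the assessment
    available_hours = 4 * 20    # assumed: 4 weeks at 20 staff hours per week

    print(f"Estimated effort: {total_hours:.0f} hours, about ${total_cost:,.0f}")
    if total_cost > available_budget or total_hours > available_hours:
        print("The interview plan may need to be scaled back or the timeline extended.")

If either the estimated hours or the estimated cost exceeds what is available, the number of interviews, the depth of analysis, or the timeline should be revisited before committing to the methodology.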

 

 
       
Office of Institutional Research and Assessment
400 Ostrom Avenue • Syracuse, NY 13244-3250
Phone: (315) 443-8700 • Fax: (315) 443-1524 • E-mail: oira@syr.edu • Web: http://oira.syr.edu