
Higher Education Research Institute College Student Survey (CSS)


Introduction

The College Student Survey (CSS), created in 1993 by the Higher Education Research Institute (HERI) at the University of California, Los Angeles (UCLA), is administered primarily to graduating or continuing seniors but may also be administered to a sample of undergraduate students.  The CSS allows colleges and universities to conduct follow-up studies of students’ undergraduate experiences.  The survey provides institutions with an overall profile of the senior class, an overview of student experiences at the institution, and information about students’ future aspirations and career goals.  Administered at more than 800 institutions to more than 270,000 students, the instrument informs understanding of the values, attitudes, and goals of senior students and provides information about the quality and effectiveness of various aspects of college life and education at a particular institution (3).

The following paragraphs provide a review of the CSS.  The review begins with an overview of the participation and administration guidelines, including costs of the survey.  This is followed by an introduction to the many ways in which the survey is used by colleges and universities.  Reflections on the connection between student inputs, the educational environment, and learning outcomes are then shared.  Finally, the merit of the CSS for measuring liberal arts educational outcomes is addressed. 

Participation and Administration

During the fall semester of each academic year, all colleges and universities are invited to participate in the CSS, which is administered throughout the year in two cycles: the first runs from January to June and the second from July to December.  Surveys may be administered on paper, on the web, or in a combination of the two forms.  The way in which the CSS is distributed is determined by individual institutions.  Sample size and type, as well as collection and return of the surveys to HERI at the end of the data collection process, are also the responsibility of the sponsoring college or university.  Two processing deadlines allow for year-round administration (2).  Depending on the cycle chosen, results are forwarded to participating institutions in mid-August, early October, or mid-February. 

The survey is a four-page, pre-coded instrument that requires a completion time of approximately 45 minutes (2).  In order to provide institutions of higher education with opportunities to learn more about specific aspects of student life and learning on their campuses, both the web-based and paper forms of the CSS allow institutions to add up to 30 additional questions.  Responses to those questions are then included in the aforementioned reports (2).  The addition of optional questions adds to the anticipated completion time.  Each participating institution is charged an initial fee of $450 plus $2.00 per returned survey.  All costs for data collection, processing, and preparation of data reports are included in these fees.  Supplementary data services, such as consortium reports comparing outcomes across five or more similar institutions, are available at additional cost.
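The fee schedule above (a flat base fee plus a per-returned-survey charge) makes an institution's total cost easy to estimate.  A minimal sketch, using the figures stated above; the function name is illustrative:

```python
def css_total_cost(returned_surveys: int,
                   base_fee: float = 450.00,
                   per_survey: float = 2.00) -> float:
    """Estimate the total CSS fee: a flat initial fee plus a
    per-survey charge for each survey actually returned to HERI."""
    return base_fee + per_survey * returned_surveys

# e.g., a senior class returning 500 completed surveys:
print(css_total_cost(500))  # 450 + 2.00 * 500 = 1450.0
```

Note that the charge applies to returned surveys, so the final cost depends on the response rate rather than on the number of surveys distributed.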

Two primary types of reports are provided to institutions using the CSS.  First, the CSS Campus Profile Report compares student responses from one institution to those at similar institutions, with results also organized by gender.  Second, the CSS Longitudinal Report compares respondents’ CSS answers with their responses on a previously administered CIRP Freshman Survey (2).  Individual question responses to the CSS, and to the CIRP Freshman Survey when available, are provided on diskette for each CSS respondent at an institution.  Each report in an institution’s profile may be transferred to an Excel or Lotus spreadsheet, which may then be used to compare responses, create reports about specific groups of data, develop graphical representations of the data, and perform additional analysis.  Special Comparison Reports match an institution’s responses to two of the seven available comparison groups.  Consortium Reports, which show overall profiles based on responses from five or more institutions, are also available.  Finally, Special Breakout Reports of up to 190 sub-reports for separate sub-groups of students may be created to specification (2). 

About the CSS

The instrument includes fifteen self-assessed measures focusing on academic, social, intellectual, and emotional capabilities, coupled with conventional measures of academic achievement such as self-reported grade point average and Graduate Record Examination scores.  A number of outcomes may be measured by the instrument, including satisfaction with the college experience; involvement; cognitive and affective development; values, attitudes, and goals; and degree and career aspirations.  The CSS is often used to measure a specific group (e.g., seniors) at a particular point in time; when paired with the Cooperative Institutional Research Program (CIRP) Freshman Survey, it may be used to assess cognitive and affective development in college and university students over time (2).

Results of the CSS may be used in a variety of ways.  One means of using the data is to evaluate student satisfaction.  Using the instrument, students rate overall satisfaction with 28 aspects of the college experience including, but not limited to, the following: coursework, interactions with faculty and staff, and campus life.  For example, students are asked, "Please rate your satisfaction with your current (or most recent) college in each area: general education or core curriculum courses, science and mathematics courses, relevance of coursework to everyday life, overall quality of instruction, academic advising, student housing, job placement services for students, etc." (2).  The CSS may also be used to collect information on student involvement in both academic and co-curricular opportunities.  Students completing the instrument are asked questions regarding the quantity of time spent in academic and social activities as well as about participation in recreational activities.  Questions such as, "During the past year, how much time did you spend during a typical week doing the following activities: studying/homework, attending classes/labs, exercising/sports, partying, volunteer work, etc.?" provide insight into the aforementioned areas (2).  Retention may also be measured using CSS questions about leaves of absence, withdrawal, or transfer activity.  The instrument provides information about these activities among students at a particular institution while also providing comparison information with other similar institutions of higher learning (2). 

Additionally, students’ values, attitudes, and goals may be assessed using the CSS.  Materialism, altruism, the need for recognition, interest in social change, and community service are but a few of the personal goals and values explored by the CSS.  For example, students are asked to indicate whether they agree strongly, agree somewhat, disagree somewhat, or disagree strongly with a number of value statements including the following:  "The death penalty should be abolished; Marijuana should be legalized; and, Colleges should prohibit racist/sexist speech on campus."  Finally, the opportunity to add 30 additional questions designed to address local campus issues allows for learning about specific campus matters and questions that are not otherwise addressed (2).

Learning Outcomes in Relation to Inputs

When administered in conjunction with the CIRP Freshman Survey, the CSS may be used as a way of assessing student learning outcomes while taking into account the input variables of entering students.  These variables include student attitudes, values, and expectations.  According to HERI (2003), it is important to consider not only inputs and outcomes but also what takes place between the two points, in order to determine the extent to which environmental factors affect learning and development in students.  This type of assessment over time yields information about the relationship between particular educational practices and learning outcomes. 

Using this notion, the CIRP Freshman Survey acts as a pre-test, highlighting the input variables students bring with them at the onset of the college experience.  The institution of higher education then contributes the environmental factors that impact learning.  The resulting outcomes of the interaction between the inputs and environment may be measured using the CSS.  By using both instruments, institutions obtain valuable information about their students and the impact of college or university environments on overall development and learning (2).  Data may be used in a myriad of ways that include, but are not limited to, informing accreditation processes, providing a framework for evaluating programs and services, meeting standards, and assessing institutional impact on student life. 
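The pretest/environment/outcome logic described above is, in analytic terms, a matter of estimating the environment's effect net of what students brought with them.  A hypothetical sketch of such an analysis follows; the variable names, measures, and synthetic data are invented for illustration and are not HERI's actual analysis method:

```python
import numpy as np

# Illustrative input-environment-outcome analysis: regress a senior-year
# outcome (e.g., a CSS self-rating) on a freshman-year input (the matched
# CIRP pretest) and a college-environment measure (e.g., hours per week of
# faculty interaction).  Synthetic data stand in for matched survey records.
rng = np.random.default_rng(0)
n = 200
pretest = rng.normal(50, 10, n)        # freshman input (CIRP)
environment = rng.normal(5, 2, n)      # environment measure during college
outcome = 0.8 * pretest + 1.5 * environment + rng.normal(0, 3, n)

# Design matrix: intercept, input, environment.
X = np.column_stack([np.ones(n), pretest, environment])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# coef[2] estimates the environment effect *after* controlling for inputs,
# which is the quantity the I-E-O framework is after.
print(f"environment effect, net of inputs: {coef[2]:.2f}")
```

Without the pretest term, the environment coefficient would absorb whatever differences students brought with them at entry, which is precisely the confound the paired CIRP/CSS administration is designed to remove.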

According to HERI (2), one Midwest institution of higher education uses the combination of the CIRP Freshman Survey and CSS to assess differences among student experiences based on major and race.  By utilizing both instruments, and controlling for student background inputs, members of the university or college community are able to determine how the environment created by the institution impacts student learning and development (2).  Moreover, a college in the South assesses the utility of service learning programs by researching changes in attitudes toward volunteerism from students’ freshman year, as indicated by the CIRP Freshman Survey, to the students’ senior year, as indicated by the CSS.  Finally, an institution in the East noted that although students’ values did not change significantly during college, their actual experiences in college differed extensively from those articulated in their pre-college expectations.  After collecting this information through the use of the CIRP Freshman Survey and CSS, members of the institution were able to develop more informed and meaningful enrollment management and strategic planning processes (2).

Liberal Arts Education Assessment

According to Pascarella and Terenzini (1991), a significant difference exists between change and development.  Change involves alterations over time in internal affective or cognitive characteristics.  Development, on the other hand, goes a step further than change by implying "that growth is valued and pursued as a desirable psychological and education end" (5).  The goal of a liberal arts education is to promote not simply change, but rather development, in college and university students.  The goal of this type of higher education is to instill in students the ability and desire to think critically, to integrate ideas, and to create an informed and intentional means of addressing social, political, and personal issues (4).  The CSS is a beginning step in the assessment process necessary to understand the impact of a liberal arts education on student development.

With regard to the qualities of a liberal arts education put forth by Pascarella et al. (2004), the following data may be collected using the standard CSS instrument: enrollment in college full time, effectiveness of teaching, high levels of interaction with faculty and peers both in and out of the classroom, high levels of academic expectation and encouragement, integrated learning, and courses that emphasize study of the liberal arts rather than vocational/technical areas.  The standard CSS instrument measures both higher-order mental processes, such as development in students’ abilities to think critically, and affective outcomes, such as attitudes, values, and self-concept (1).  Additional liberal arts outcomes that may be measured by the standard instrument, or by adding individualized questions at the end of the instrument, include self-reflection, citizenship, discernment, service, leadership, and moral development. 

While using the CSS alone provides some insight into the aforementioned outcomes, the information garnered does not give a clear indication of the true impact of a liberal arts education because student input variables are not known.  Using the CSS in conjunction with the CIRP Freshman Survey, however, allows institutions to control for input variables, such as academic preparation in high school or parental degree attainment, and thereby to ascertain more meaningful information regarding the effects of liberal arts education on student learning and development in college.  Moreover, the combined administration of the CIRP and CSS provides additional insight into the interaction between student inputs and the college environment and into what outcomes may or may not occur as a result of that interaction.  The CIRP Freshman Survey inquires about pre-college expectations and preparation, while the CSS provides follow-up self-assessment of development and growth.  Students, for example, share information about the frequency of interaction with faculty members during class or office hours and outside of the traditional classroom setting.  Such information may yield insight about quality of teaching and intentional interactions that contribute to overall learning. 

Additionally, students identify the personal importance of "developing a meaningful philosophy of life" (2).  Such a philosophy is enhanced and informed by courses that emphasize the study of the liberal arts and the ability to integrate ideas, as opposed to classroom experiences that merely emphasize skill development or lack encouragement in self-reflection (4).  The level of importance senior students place on such outcomes may be attributed to the influence of liberal arts education at the college or university.  This is because the development of a "meaningful life philosophy" is not generally emphasized in vocational or technical education which focuses primarily on the mastery of particular skill sets.

The combination of the two instruments, CIRP and CSS, also yields information regarding potential student changes in the promotion of diversity and racial understanding, the importance of critical and creative contributions to science, writing and art, and interest in becoming active in community leadership.  Each of these outcomes may be considered indicative of a liberal arts education that challenges students to actively reflect on and voice personal opinions in an informed manner (6).

Moreover, the CSS provides information about faculty emphasis on, and encouragement of, the pursuit of graduate study, intellectual stimulation and challenge, and involvement in academic work outside of the traditional classroom setting.  Each of these outcomes, when considered together, may highlight institutional emphasis on, and student recognition of, the expectation to produce quality work.  According to Wiggins (1994), under the auspices of a liberal arts education, institutions should create and maintain high expectations for the work produced by students both inside and outside of the classroom.  This type of expectation is strengthened by the quality of faculty teaching, the quality of the classroom environment, and the quality of encouragement and challenge given to students.  The value and frequency of each of these factors in a college environment may be quantified using the CSS and may be further understood using the CIRP Freshman Survey to measure input variables.

The CSS also provides students with an opportunity for self-assessment, a quality highlighted by Wiggins (1994) as important in a true liberal education environment.  By first assessing inputs and then reflecting on outcomes, students and other members of the campus community may gain more in-depth understanding about the impact the liberal arts learning environment has on overall affective and cognitive development.  The results of the CSS based on student reflection about the importance of political action, community leadership, personal philosophical development, and intellectual challenge and stimulation, may provide members of campus communities with more clarity about the extent to which "intellectual voice" is developed and "owned" by the students at the culmination of their college experiences (6). 

Concluding Remarks

The CSS may be used to assess a myriad of liberal arts outcomes.  When used in conjunction with the CIRP Freshman Survey, more in-depth information regarding the interaction between inputs and the liberal arts learning environment may be gathered in order to inform understanding of outcomes.  The instrument is flexible, allowing for the assessment of a number of pre-determined outcomes or the addition of outcomes specific to a particular institution, and may be used for specific-group or long-term assessment.  Because the CSS was not designed specifically to assess liberal arts educational outcomes, additional forms of assessment may be necessary in order to truly learn about the interaction between a liberal arts educational environment and student development.  Although the CSS may not address all liberal arts educational outcomes, the instrument provides an effective starting point for more in-depth assessment and research.


References

  1. Astin, A. (1993). What matters in college:  Four critical years revisited.  San Francisco, CA:  Jossey-Bass Inc.

  2. Higher Education Research Institute (HERI).  College Student Survey.  http://www.gseis.ucla.edu/heri/css.html

  3. Higher Education Research Institute (HERI)(2003).  Designing a student assessment study: The CIRP surveys and the Input-Environment-Outcome Model.  (Available from the Higher Education Research Institute, UCLA, Graduate School of Education and Information Studies, R. 3005 Moore Hall, Box 951521, Los Angeles, CA  90095-1925.)

  4. Pascarella, E., Blaich, C., Wolniak, G., & Seifert, T. (2004, March).  A liberal arts education changes lives: why everyone else can and should have this experience. 

  5. Pascarella, E., & Terenzini, P. (1991).  How college affects students.  San Francisco, CA: Jossey-Bass Inc.

  6. Wiggins, G. (1994).  The truth may make you free, but the test may keep you imprisoned:  Toward assessment worthy of the liberal arts.  In J. Stark & A. Thomas (Eds.), Assessment Program Evaluation, ASHE Reader Series.  Boston, MA:  Pearson Custom Publishing.