The College Student Expectations Questionnaire (CSXQ) and The College Student Experiences Questionnaire (CSEQ)

by John H. Pryor, Dartmouth College

Summer 2005


OVERVIEW

Highlights of the CSXQ/CSEQ:

  • Straightforward surveys that look at how engaged college students expect to be (CSXQ) or how engaged students feel they are (CSEQ). 
  • The CSXQ takes about 15 minutes to complete (100 fill-in-the-blank questions); the CSEQ about 30-40 minutes (87 of the same questions as appear on the CSXQ, plus additional questions asking students to report their growth in college).

Uses of the CSXQ/CSEQ:

  • The surveys can be administered individually to gain an understanding of what students expect to encounter during college (CSXQ), or the activities students are engaged in and their self-reported growth after completion of some amount of college (CSEQ).
  • The CSXQ can be administered before college starts or at the beginning of students’ first year, and the CSEQ can be administered at the end of the first or second year of college; student expectations can be compared with their actual experiences and inconsistencies can be examined.
  • The results of either survey can be linked to demographic information (some of which is asked for in the surveys themselves) or to student responses on other surveys that look at particular skills/outcomes (e.g., the Socially Responsible Leadership Scale (SRLS) for leadership or the California Critical Thinking Disposition Inventory (CCTDI) for critical thinking) to examine the relationships between student demographics/characteristics and expectations and experiences.
  • To the extent that student self-reported gains can be trusted, the CSEQ can be used to look at how student development in areas like thinking analytically and pursuing ideas might reflect broader liberal arts outcomes like effective reasoning, problem solving, or lifelong learning.

Jill Cellars Rogers
Center of Inquiry in the Liberal Arts at Wabash College



Introduction

The idea behind examining student experiences is simple: students who are highly engaged in college life get the most out of college. This line of research began with Bob Pace, who created the College Student Experiences Questionnaire, and continues under the direction of George Kuh at Indiana University. Both have published widely on student involvement, or engagement, in college. This essay will describe a pair of instruments that can be used to study self-assessed engagement and gains in college: the College Student Expectations Questionnaire (CSXQ) and the College Student Experiences Questionnaire (CSEQ).

The CSEQ examines how involved students are with various aspects of college, such as interaction with faculty, behavior in the classroom, interaction with other students, use of facilities, and involvement in student organizations. The CSEQ also contains a section in which students can estimate the gains they have made while in college. The instrument is usually administered towards the end of the academic year to students at any level (i.e., first-years, sophomores, juniors, or seniors). The CSEQ has been used at hundreds of colleges and universities since 1979 and is now in its fourth edition. The currently popular National Survey of Student Engagement (NSSE) has its roots in the CSEQ.

The College Student Expectations Questionnaire, or CSXQ, was developed in the 1990s as a companion instrument to the CSEQ. The CSXQ is administered to first-year students either before the start of the school year or shortly after arrival. While the CSEQ asks students about the experiences they have had since arriving at college, the CSXQ asks students how often they expect to engage in these same behaviors during their first year of college. The two instruments can be administered as a pair to paint a picture of how fully student expectations of college have been met. Students taking the CSXQ or CSEQ can receive feedback in an individualized report comparing their expectations (or experiences) about college with other students at their school.

Institutions planning to assess student involvement with the CSEQ should consider using the CSXQ as a "pre-test." This pairing of the instruments is an excellent way to examine the gap between what is expected and what actually happens during the first year in college. The findings can be used to either adjust programs and policies at the institution to more accurately fulfill expectations, or to realistically adjust student expectations to avoid disappointment and discouragement.


Participation and Administration

Both instruments are available in either a scannable paper format or a web-based online form. In either format, students will need about 15 minutes to complete the CSXQ and 30–40 minutes for the CSEQ.

The CSXQ contains 100 fill-in-the-blank questions, 87 of which are repeated in the CSEQ. Institutions may choose to craft questions specific to organizational needs that can be added to the CSEQ (there are different options for either paper or online forms, explained below). Participating schools will receive a report on the findings from their institutions as well as the electronic data for analyses.

Both ways of administering the survey allow institutions to track individuals by assigning them unique code numbers associated with their questionnaires. Code numbers are useful in helping schools determine who has completed the questionnaire. For students who don’t return their surveys, institutions may choose to send a second questionnaire. Also, schools might choose to offer incentives to survey participants (some evidence suggests that awarding survey incentives helps increase response rates, although this is not a universal finding). Finally, institutions might choose to connect other information to student survey responses. For instance, institutions might want to merge data from their student information system, such as the classes particular students are signed up for, with test results.

While the CSEQ/CSXQ website has a great deal of information, one aspect can be confusing. Since the CSEQ predates the CSXQ by many years, the researchers at Indiana University who administer both instruments refer to themselves as the "CSEQ staff" throughout their documents, even in materials concerning the CSXQ. Users should be cautious, as it is easy to mix up CSEQ and CSXQ when reading or in conversation.


Paper Format

The paper version of the CSXQ is a two-sheet booklet (four total pages); the paper version of the CSEQ is a booklet of eight total pages. Institutions order as many copies as necessary and administer the surveys themselves. Once completed, schools return the surveys to the CSEQ staff, who will scan the responses and report results within 4-6 weeks. Institutions administering the survey via mail should order enough duplicate copies to send follow-up requests to non-respondents (schools will need to estimate response rates for the successive mailings). Each questionnaire comes with a unique six-digit code number printed on the last page that can be used to track respondents.


Electronic Format

The electronic versions of these instruments are accessed via the web and completed online. This is often more convenient for students and takes less time to administer than a paper survey, but schools must have a complete set of email addresses available for participating students, and of course, students must have ready access to computers with internet access. Institutions provide student emails to CSEQ staff, who then send each prospective participant an email invitation to take the survey. Each invitation contains a unique code number specific to that student, similar to the printed number on the paper survey. Like many such web survey operations, CSEQ staff has a policy that student emails be used for the survey only.

Schools can administer the online CSXQ/CSEQ in one of two ways. The "full-service" administration option, which is more expensive, requires participating institutions to provide CSEQ staff with student names and emails. CSEQ staff will, in turn, send email requests to the students and keep track of sending reminders to those who have not yet responded. The "self-service" option requires the institution to send out the emails to its own students. Schools choosing the self-service option should be prepared to customize emails to each student with the unique code number that identifies him or her and allows access to the questionnaire. The self-service option, therefore, requires using a mail merge program in most cases, and many institutions would gladly have the CSEQ staff take care of this detail for a small additional fee (as of summer 2005, self-service price for 1,000 students is $2,950, compared to a full-service price of $3,500, a difference of $550). Furthermore, with the full-service mailing, the CSEQ staff will craft emails to look as though they come from the participating school. This is a nice touch that can help increase response rates, as students are often more likely to read and respond to an internal request as opposed to something received from outside the institution. As with the paper version, schools should receive their report and raw data within 4–6 weeks after the survey cut-off date. This author strongly recommends the full-service option, especially since the additional cost is so small.
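For schools weighing the self-service option, the per-student customization amounts to a simple mail merge: each invitation is the same template with the student's name and unique access code filled in. A minimal sketch in Python, where all names, addresses, codes, and the survey URL are hypothetical placeholders (the actual invitation text and access link come from the CSEQ staff):

```python
# Build a personalized survey invitation for each student.
# All student records and the access URL below are invented placeholders.
students = [
    {"name": "Pat Jones", "email": "pjones@example.edu", "code": "483201"},
    {"name": "Sam Lee", "email": "slee@example.edu", "code": "112907"},
]

TEMPLATE = (
    "Dear {name},\n\n"
    "Please complete the College Student Expectations Questionnaire at\n"
    "https://survey.example.edu/csxq?code={code}\n\n"
    "Your unique access code is {code}.\n"
)

def build_invitations(records):
    """Return (email address, message body) pairs, one per student."""
    return [(s["email"], TEMPLATE.format(**s)) for s in records]

for addr, body in build_invitations(students):
    print("To:", addr)
    print(body)
```

The pairs produced this way would then be handed to whatever bulk-mail tool the institution already uses; the point is simply that each message must carry the code that unlocks that one student's questionnaire.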

An additional feature of the online version is that it checks for missing data and prompts the student to complete any incomplete questions. Students are given this opportunity once, and if they still choose not to complete the question (as should be their prerogative via most human subjects/institutional review board policies), the question is marked as incomplete and they are allowed to continue. There is some evidence that online forms, therefore, will give you a more complete dataset (see Experiences with the CSXQ below).


Additional Institution-specific Questions

One important difference between the paper version and the online version of the questionnaires is that institutions have more flexibility with the additional questions in the online version. At the end of the paper form, there is space for students to mark answers to 20 additional questions, with the caveat that the responses need to conform to a five-item (or fewer) response scale (i.e., to answer, a student will need to decide between as many as five fill-in-the-bubble responses). Institutions will need to provide students completing the paper version of the survey with a paper list of the additional questions and responses listed, and instruct them to mark their responses on the survey booklet.

Online, additional questions are also limited to 20, but these can include open-ended questions (for which students compose a written answer instead of filling in bubbles next to pre-existing responses). There is also greater flexibility in the closed-ended responses, such as the ability to use six responses instead of five. While additional questions on the online version allow schools more flexibility, they also create additional costs ($40 per question, as of summer 2005). No added costs are incurred for the additional questions on the paper version.


About the College Student Experiences/Expectations Questionnaires

History

There is a vast body of literature that examines the importance of student involvement in college and its impact on student outcomes. [3, for example] The CSEQ has been cited in over 250 scholarly reports. [2] Over the years, the CSEQ has changed with the times, as evidenced by the more modern inclusion of technology and methods of electronic communication (e.g., "How often do you use email to communicate with an instructor or classmates?"). The CSXQ was developed to examine what students expect to do in their first year of college. Now in its second edition, the CSXQ has the same strong psychometric qualities as its parent instrument, the CSEQ.

Instruments

The CSXQ is divided into three major sections: College Activities, The College Environment, and Background Information. The CSEQ contains the same categories (slightly expanded) and adds a section called "Estimates of Gains."

College Activities Section

The following table breaks down the number of questions on the CSXQ and CSEQ focusing on various college activities. A dash indicates that the subsection does not appear on that instrument; note that some "Art, Music, Theater" items on the CSEQ are listed under "Campus Facilities" on the CSXQ.

  Activity Subsection                           CSXQ   CSEQ
  Library and Information Technology               9     --
  Library                                         --      8
  Computer and Information Technology             --      9
  Experiences with Faculty Members                 7     10
  Course Learning Activities                       9     11
  Writing Experiences                              5      7
  Campus Facilities                                9      8
  Art, Music, Theater                             --      7
  Clubs, Organizations, and Service Projects       5      5
  Personal Experiences                            --      8
  Student Acquaintances                            7     10
  Scientific and Quantitative Experiences          5     10
  Topics of Conversation                          10     10
  Information in Conversations                     6      6

For each question, students are asked, "How often do you expect to . . .?" with one of four possible responses, as in the example below from the Experiences with Faculty Members subsection:

Ask your instructor for comments and criticisms about your academic performance? 

  • Very often
  • Often
  • Occasionally
  • Never

For each of these categories, the responses are combined to obtain an overall expectation score for that area, assigning a value of 1 (never), 2 (occasionally), 3 (often), and 4 (very often) for each item and then adding the values. A higher score indicates higher expectations/more experience.
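The scoring rule just described is easy to express in code. A minimal sketch with invented responses (the value mapping follows the text above; the seven answers are hypothetical responses to the seven CSXQ faculty items):

```python
# Score one activity subsection per the rule described above:
# 1 = never, 2 = occasionally, 3 = often, 4 = very often, summed over items.
VALUES = {"never": 1, "occasionally": 2, "often": 3, "very often": 4}

def subsection_score(responses):
    """Sum the item values for one activity subsection."""
    return sum(VALUES[r] for r in responses)

# Hypothetical answers to the 7 "Experiences with Faculty Members" items.
faculty_items = ["often", "occasionally", "never", "very often",
                 "often", "occasionally", "often"]
print(subsection_score(faculty_items))  # a value between 7 and 28
```

For a seven-item subsection the score therefore ranges from 7 (all "never") to 28 (all "very often"), with higher scores indicating higher expectations or more experience.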

The College Environment Section

This section (in both instruments) contains seven questions in which students are asked to rate, using a scale of 1 (weak) to 7 (strong), how much emphasis their institution places on various aspects of a college environment. Students estimate whether the institution will be scholarly and intellectual, will work toward understanding diversity, or will emphasize vocational studies, among other qualities. Students are also asked how supportive they think other students, faculty, and administrators will be.

Gains in College

The CSEQ contains a section that is not mirrored in the CSXQ, where respondents are asked to estimate the extent to which they have made progress on 25 goals. These goals include items such as "learning on your own, pursuing ideas, and finding information you need" and "thinking analytically and logically." The response scale for these items is "very much," "quite a bit," "some," and "very little." The degree to which a student feels he or she has achieved a goal (or goals) can be compared with how engaged that student feels he or she is. This allows institutional researchers to study relationships between the level of student engagement and the degree to which students achieve outcomes. For example, researchers could examine whether students who are more engaged with faculty actually report developing analytical thinking skills.

Background Information

In this section, basic demographic information such as age, sex, race, residence, and parental education levels is requested. Students are also asked their expected major, grades, and anticipated time spent working on classes and at jobs.


Reports

Standard Package

The basic report that schools receive as part of the survey package contains tables for each question and summaries of the student responses, given in both actual numbers of responses and in percentages. In addition to the total number of students responding, the report also breaks down responses between men, women, and those who did not answer the "sex" demographic question. When appropriate, the report includes means, standard deviations, and the standard error of the mean. Computer files with the data are also provided, allowing schools to do their own analyses.

Comparative Data

Institutions might wish to compare their institutional results with those obtained at other institutions. As part of the standard reporting package, both CSXQ and CSEQ reports include norms broken out by Carnegie Classification.

Additional Analyses

For $150 an hour, the CSEQ staff will perform additional analyses, which might include breakouts by racial groups, proposed major, or special demographic queries asked in additional questions. 

Student Advising Report

This reporting option provides individual students with a report that compares their responses to the average response of their classmates. This option is available for both the CSXQ and the CSEQ.


Using the CSEQ and the CSXQ Together

Generally, students come to college with high expectations. [4] Combining the CSXQ at the beginning of the year with the CSEQ at the end of the year, therefore, is much more valuable than administering the CSXQ as a stand-alone instrument. The CSXQ describes the first-year student before he or she is exposed to the college environment. The CSEQ contains many items that measure involvement in that environment, plus a section on self-described goals. In short, the combined pre- and post-test format allows researchers to compare expectations with experiences, and look for relationships between goals and student characteristics, expectations, or demographics.

Why might institutions want to know about student expectations? For one thing, knowing the expectations that students carry with them to college helps in understanding the mindset of entering students. Institutions will know if students have high expectations for diverse interactions, or if they are more focused on being active in the classroom or on involvement in student organizations. Results of the CSXQ can provide evidence and information for aligning student expectations with institutional goals.

If student expectations are in line with the institutional focus, then so much the better, and institutions can then assess how expectations compare to experiences. However, this is often not the case. If, for instance, CSXQ results indicate that 85% of an institution's incoming class think they will be engaged in faculty research their first year, but CSEQ results suggest that only 20% of first-year students have actually performed such research, a significant discrepancy between expectations and reality exists, and institutions may choose to re-evaluate their priorities. An institution may either choose to increase opportunities for first-year students to perform research, or they might work proactively (during the admissions process, perhaps) to change the expectations of the students. Not surprisingly, evidence suggests that students with unmet expectations are more likely to contemplate leaving the institution. [1] Admitting a student with unrealistic expectations does neither the student nor the institution any good, if the unfulfilled student transfers to another school or drops out of college. Matching expectations and experiences can, therefore, help retention.
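Once an institution has tabulated, for each activity, the share of incoming students expecting it (CSXQ) and the share of first-years reporting it (CSEQ), flagging the largest expectation-experience gaps is straightforward. A sketch with invented percentages (only the "faculty research" figures echo the hypothetical example above; the other items and the threshold are assumptions):

```python
# Flag items where expectations (CSXQ) outrun reported experiences (CSEQ)
# by more than a chosen threshold. All percentages here are invented.
expected = {"faculty research": 85, "club meetings": 60, "library use": 70}
reported = {"faculty research": 20, "club meetings": 55, "library use": 72}

def expectation_gaps(expect, report, threshold=25):
    """Return items whose expectation exceeds experience by > threshold points."""
    return {item: expect[item] - report[item]
            for item in expect
            if expect[item] - report[item] > threshold}

print(expectation_gaps(expected, reported))  # only "faculty research" qualifies
```

Items surfacing from such a screen are candidates either for new programming (create more of the experience) or for recalibrated messaging during admissions (temper the expectation).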


Limitations

There is one hole in this argument, however: it assumes that a student whose actual experiences diverge from his or her incoming expectations still held those expectations at the end of the year. Perhaps another explanation accounts for the difference.

Consider an incoming student who has played soccer for ten years and enjoys the sport immensely. She fills out the CSXQ, reporting that she will "very often" play a team sport. While attending college, her roommate drags her to the debate club meetings, and she finds a passion for debating that did not exist when she completed the CSXQ. In high school, she spent so much time in soccer practice that she never participated in organizations like the debating team, so when asked on the CSXQ if she would attend a "meeting of a campus club, organization, or student government group," she checked "never." Now she is involved in debating, and finds that her schedule does not allow for both soccer and debate team. She has played soccer for years and wants to try something new, so she weighs the options and chooses to drop soccer in favor of the debate team. Looking at her CSXQ/CSEQ responses for team sports, one might categorize this situation as a disappointment for this student, when actually the student has simply had a change in expectations. Researchers should expect that students without experience in a college setting will, once they have some experience, readjust their expectations. This means looking more carefully at the survey results to determine whether student efforts are shifting, rather than being curtailed.

Another small concern I have is the age choices in the demographics section. The lowest age-group choice is "19 and younger" and the next choice is "20-23." I suggest that an 18-year-old has different expectations from a 19-year-old first-year student, and that there is even more difference between a 20- and a 23-year-old first-year student, yet the instrument does not allow you to examine these differences.

Institutions might consider having several students take a "trial run" of the CSXQ/CSEQ to identify any points that might be confusing for the intended test takers. For example, the questionnaire gathers information on course load in the first term by asking for "credit hours." Some institutions don’t have a "credit hour" system, so this question could confuse first-year students. In this case, students could be told to disregard the question on credit hours.


Experiences with the CSXQ

There are some readily available web-based reports on experiences with the CSXQ that might help institutions decide whether to administer the survey, and how to do so.

One institution in the Minnesota State University system used CSXQ results to show how much time students thought they would spend studying in college, compared to the number of credit hours in which they were enrolled (see report). This indicated a lack of understanding among students of how much studying was required per credit hour. It exemplifies how student expectations, if understood through assessment, can be addressed early on by communicating how much time should actually be spent studying.

Appalachian State University compared the results obtained via paper to those obtained from web-based administration. While they do not report the response rates for the two conditions, they do report that the individuals assigned to the online version were more likely to answer every question on the form than those assigned to the paper version (98% versus 50%). They also found that expectations were higher among students completing the online version. Without information on relative response rates, however, it is difficult to interpret the possible reasons and implications of the finding. (See report.)


Experiences with the CSEQ

A great deal of published research that uses the CSEQ exists. Of particular interest are those studies looking at how institutions use the survey results. Like the CSXQ, numerous reports are available on institutional websites.

At Appalachian State University, one can see an example of a short report that compares expectations from the CSXQ with experiences from the CSEQ. A similar use of the instruments is illustrated by another short report from Arizona State.

Some reports illustrate selective uses of the data; for example, Wake Forest University used CSEQ results to examine the impact of a computing initiative. Others use CSEQ data as part of the re-accreditation self-studies (see a report from Shippensburg University).


Conclusion

The CSEQ and CSXQ are excellent tools for examining the engagement and expectations of engagement of your incoming students. While both can be used as stand-alone surveys, this author particularly recommends using the CSEQ in combination with the CSXQ to best see how student expectations have been realized throughout the institution.  The CSEQ/CSXQ surveys can be of use in looking at liberal arts outcomes when examining levels of student engagement in "best practices," but other instruments and in-class assessment that look more specifically at desired liberal arts outcomes (the CCTDI, for example, to examine critical thinking) should be considered, along with qualitative approaches to capture the overall institutional culture.


References

  1. Braxton, J.M., Vesper, N., & Hossler, D. (1995). Expectations for college and student persistence. Research in Higher Education, 36, 595–612.

  2. Gonyea, R.M., Kish, K.A., Kuh, G.D., Muthiah, R.N., & Thomas, A.D. (2003). College student experiences questionnaire: Norms for the fourth edition. Bloomington, IN: Indiana University Center for Postsecondary Research, Policy, and Planning. Retrieved August 17, 2005, from http://www.iub.edu/~cseq/pdf/intro_CSEQ_4th_Ed_Norms.pdf

  3. Kuh, G. D., Schuh, J. S., Whitt, E. J., & Associates (1991). Involving colleges: Successful approaches to fostering student learning and personal development outside the classroom. San Francisco: Jossey-Bass.

  4. Olsen, D., Kuh, G.D., et al. (1998, November). Great expectations: What students expect from college and what they get. Paper presented at the annual meeting of the Association for the Study of Higher Education, Miami.