Introduction
The idea behind examining student experiences is simple: students who are highly engaged in aspects of college life get the most out of college. This line of research began with Bob Pace, who created the College Student Experiences Questionnaire, and continues under the direction of George Kuh at Indiana University. Both have published widely on student involvement, or engagement, in college. This essay will describe a pair of instruments that can be used to study self-assessed engagement and gains in college: the College Student Expectations Questionnaire (CSXQ) and the College Student Experiences Questionnaire (CSEQ).
The CSEQ examines how involved students are with various aspects of college, such as interaction with faculty, behavior in the classroom, interaction with other students, use of facilities, involvement in student organizations, and other areas. The CSEQ also contains a section in which students can estimate the gains they have made while in college. The instrument is usually administered toward the end of the academic year to students at any level (i.e., first-years, sophomores, juniors, or seniors). The CSEQ has been used at hundreds of colleges and universities since 1979, and is now in its fourth edition. The currently popular National Survey of Student Engagement (NSSE) has its roots in the CSEQ.
The College Student Expectations Questionnaire, or CSXQ, was developed in the 1990s as a companion instrument to the CSEQ. The CSXQ is administered to first-year students either before the start of the school year or shortly after arrival. While the CSEQ asks students about the experiences they have had since arriving at college, the CSXQ asks students how often they expect to engage in these same behaviors during their first year of college. The two instruments can be administered as a pair to paint a picture of how fully student expectations of college have been met. Responses from groups of other institutions (also known as "norms") are available so that institutions can understand how they compare to others. Students taking the CSXQ or CSEQ can receive feedback in an individualized report comparing their expectations (or experiences) about college with other students at their school.
Institutions planning to assess student involvement with the CSEQ should consider using the CSXQ as a "pre-test." This pairing of the instruments is an excellent way to examine the gap between what is expected and what actually happens during the first year in college. The findings can be used to either adjust programs and policies at the institution to more accurately fulfill expectations, or to realistically adjust student expectations to avoid disappointment and discouragement.
Participation and Administration
Both instruments are available in either a scannable paper format or as a web-based online form. In either format, students will need about 15 minutes to complete the CSXQ and 30–40 minutes for the CSEQ.
The CSXQ contains 100 fill-in-the-bubble questions, 87 of which are repeated in the post-test version, the CSEQ. Institutions may choose to craft additional questions specific to organizational needs, which can be added to the questionnaire (there are different options for paper and online forms, explained below). Participating schools will receive a report on the findings from their institution as well as the electronic data for analyses.
Both ways of administering the survey allow institutions to track individuals by assigning them unique code-numbers associated with their questionnaires. Code numbers are useful in helping schools determine who has completed the questionnaire. For students who don’t return their surveys, institutions may choose to send a second questionnaire. Also, schools might choose to offer incentives to survey participants (some evidence suggests that awarding survey incentives helps increase response rates, although this is not a universal finding). Finally, institutions might choose to connect other information to student survey responses. For instance, if schools administer the CSEQ and CSXQ, test administrators will want to see how the expectations of students compare to their actual experiences. Institutions might also want to merge data from their student information system, such as the classes particular students are signed up for, with test results. Researchers can only accomplish these goals if they have a way of identifying respondents with a code number.
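As a hedged illustration of this kind of linkage, the sketch below pairs hypothetical CSXQ and CSEQ records by code number and computes an expectation-minus-experience gap for each matched student. All code numbers, items, and response values here are invented for the example, not drawn from the actual instruments.

```python
# Hypothetical sketch: linking CSXQ and CSEQ responses by code number.
# Values are coded 1 (never) through 4 (very often) for a single item.

csxq = {  # code number -> expected frequency of an activity
    "001234": 4,
    "005678": 3,
}
cseq = {  # code number -> reported frequency after the first year
    "001234": 2,
    "005678": 3,
    "009999": 1,  # no matching CSXQ record; cannot be paired
}

def paired_gaps(expect, experience):
    """Return expectation-minus-experience gaps for students in both surveys."""
    return {
        code: expect[code] - experience[code]
        for code in expect
        if code in experience
    }

gaps = paired_gaps(csxq, cseq)
# Positive gap = expectation not met; zero = met; negative = exceeded.
```

The same join-by-code-number approach extends to merging student information system data (majors, course enrollments) with survey responses.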
While the CSEQ/CSXQ website has a great deal of wonderful information, there is one aspect that can be confusing. Since the CSEQ predates the CSXQ by many years, the researchers at Indiana University who administer both instruments refer to themselves as the "CSEQ staff" in many of their documents, even those concerning the CSXQ. Users should be cautious, as it is easy to mix up CSEQ and CSXQ when reading or in conversation.
Paper Format
The paper version of the CSXQ is a two-page booklet (four total printed pages). The paper version of the CSEQ is a booklet with eight total pages. Institutions order as many copies as necessary from the CSEQ staff and administer the surveys themselves. Once completed, schools return the surveys to the CSEQ staff, who will scan the responses and report results within 4–6 weeks. Institutions administering the survey via mail should order enough duplicate copies to send follow-up requests to non-respondents (schools will need to estimate response rates for the various mailings). Each questionnaire comes with a unique six-digit code number printed on the last page that can be used to track respondents.
Electronic Format
The online versions are accessed via the web and completed online. This is often more convenient for students and takes less time to administer than a paper survey, but schools must have a complete set of email addresses available for participating students and, of course, students must have ready access to computers with internet access. Institutions provide student emails to CSEQ staff, who then send each prospective participant an email invitation to take the survey. Each invitation contains a unique code number specific to that student, similar to the printed number on the paper survey. Like many such web survey operations, CSEQ staff have a policy to allow the use of student emails for the survey only.
Schools can administer the online CSXQ/CSEQ in one of two ways. The "full-service" administration option, which is more expensive, requires participating institutions to provide CSEQ staff with student names and emails. CSEQ staff will, in turn, send email requests to the students and keep track of sending reminders to those who have not yet responded. The "self-service" option requires the institution to send out the emails to its own students. Schools choosing the self-service option should be prepared to customize emails to each student with the unique code number that identifies him or her and allows access to the questionnaire. The self-service option, therefore, requires using a mail merge program in most cases, and many institutions would gladly have the CSEQ staff take care of this detail for the small additional fee (as of summer 2005, self-service price for 1,000 students is $2,950, compared to a full-service price of $3,500, a difference of $550). Furthermore, with the full-service mailing, the CSEQ staff will craft emails to look as though they came from the participating school. This is a nice touch that can help increase response rates, as students are often more likely to read and respond to an internal request as opposed to something received from outside the institution. As with the paper version, schools should receive their report and raw data within 4–6 weeks after the survey cut-off date. This author strongly recommends the full-service option, especially since the additional cost is so small.
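For schools weighing the self-service option, the mail merge it requires can be as simple as the sketch below, which fills each student's unique access code into an invitation template. The names, addresses, URL, and message wording are all hypothetical placeholders, not the actual CSEQ staff materials.

```python
from string import Template

# Hypothetical sketch of the "self-service" mail merge: each student
# receives an invitation containing his or her unique access code.

invitation = Template(
    "Dear $name,\n"
    "Please complete the survey at https://example.edu/survey\n"
    "Your personal access code is $code.\n"
)

students = [
    {"name": "Pat", "email": "pat@example.edu", "code": "001234"},
    {"name": "Sam", "email": "sam@example.edu", "code": "005678"},
]

# Build one personalized message body per student, keyed by email address.
messages = {s["email"]: invitation.substitute(name=s["name"], code=s["code"])
            for s in students}
# Each body would then be handed to the campus mail system for delivery
# (e.g., via the standard library's smtplib).
```

Even with a sketch this small in hand, many institutions may still prefer to pay the modest full-service premium and avoid maintaining the merge themselves.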
An additional feature of the online version is that it checks for missing data and prompts the student to answer any skipped questions. Students are given this opportunity once, and if they still choose not to answer a question (as should be their prerogative under most human subjects/institutional review board policies), the question is marked as incomplete and they are allowed to continue. There is some evidence that online forms, therefore, will give you a more complete dataset (see Experiences with the CSXQ, below).
Additional Institution-specific Questions
One important difference between the paper version and the online version of the questionnaires is that institutions have more flexibility with the additional questions in the online version. At the end of the paper form, there is space for students to mark answers to 20 additional questions, with the caveat that the responses must conform to a five-item (or fewer) response scale (i.e., to answer, a student will choose among as many as five fill-in-the-bubble responses). Institutions will need to provide students completing the paper version with a printed list of the additional questions and their response options, and instruct them to mark their answers in the survey booklet.
Online, additional questions are also limited to 20, but these can include open-ended questions (for which students compose a written answer instead of filling in bubbles next to pre-existing responses). There is also greater flexibility in the closed-ended responses, such as the ability to use six responses instead of five. While additional questions on the online version allow schools more flexibility, they also create additional costs ($40 per question, as of summer 2005). No added costs are incurred for the additional questions on the paper version.
Reports
Standard Package
The basic report that schools receive as part of the survey package contains tables for each question and summaries of the student responses, given in both actual numbers of responses and in percentages. In addition to the total number of students responding, the report also breaks down responses between men, women, and those who did not answer the "sex" demographic question. When appropriate, the report includes means, standard deviations, and the standard error of the mean. Computer files with the data are also provided, allowing schools to do their own analyses.
Comparative Data
Institutions might wish to compare their institutional results with those obtained at other institutions. As part of the standard reporting package, both CSXQ and CSEQ reports include norms broken out by Carnegie Classification.
Additional Analyses
For $150 an hour, the CSEQ staff will perform additional analyses, which might include breakouts by racial groups, proposed major, or special demographic queries asked in additional questions.
Student Advising Report
This reporting option provides individual students with a report that compares their responses to the average response of their classmates. This option is available for both the CSXQ and the CSEQ.
About the College Student Experiences/Expectations Questionnaires
History
There is a vast body of literature that examines the importance of student involvement in college and its impact on student outcomes. [3, for example] The CSEQ has been cited in over 250 scholarly reports. [2] Over the years, the CSEQ has changed with the times, as evidenced by the more modern inclusion of technology and methods of electronic communication (i.e., "How often do you use email to communicate with an instructor or classmates?"). The CSXQ was developed to examine what students expect to do in their first year of college. Now in its second edition, the CSXQ has the same strong psychometric qualities as its parent instrument, the CSEQ.
Instruments
The CSXQ is divided into three major sections (pdf): College Activities, The College Environment, and Background Information. The CSEQ contains the same categories (slightly expanded) and adds a section called "Estimates of Gains."
College Activities Section
The following chart breaks down the number of questions on the CSXQ and CSEQ focusing on various college activities:
Activity Subsection | CSXQ | CSEQ
Library and Information Technology | 9 | —
Library | — | 8
Computer and Information Technology | — | 9
Experiences with Faculty Members | 7 | 10
Course Learning Activities | 9 | 11
Writing Experiences | 5 | 7
Campus Facilities* | 9 | 8
Art, Music, Theater | — | 7
Clubs, Organizations, and Service Projects | 5 | 5
Personal Experiences | — | 8
Student Acquaintances | 7 | 10
Scientific and Quantitative Experiences | 5 | 10
Topics of Conversation | 10 | 10
Information in Conversations | 6 | 6
*Some "Art, Music, Theater" items in the CSEQ are listed under "Campus Facilities" in the CSXQ.
For each question, students are asked, "How often do you expect to . . .?" with one of four possible responses, as in the example below from the Experiences with Faculty section:
Ask your instructor for comments and criticisms about your academic performance?
For each of these categories, the responses are combined to obtain an overall expectation score for that area, assigning a value of 1 (never), 2 (occasionally), 3 (often), and 4 (very often) for each item and then adding the values. A higher score indicates higher expectations/more experience.
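A minimal sketch of this scoring rule, using the four response labels given above (the individual item responses below are invented for illustration):

```python
# Responses are coded 1 (never) through 4 (very often) and summed
# within each activity subsection, per the scoring described above.
SCALE = {"never": 1, "occasionally": 2, "often": 3, "very often": 4}

def subscale_score(responses):
    """Sum coded values for one student's answers in one subsection."""
    return sum(SCALE[r] for r in responses)

# Illustrative answers to the 7 CSXQ faculty-experience items.
faculty_items = ["often", "occasionally", "very often", "never",
                 "occasionally", "often", "often"]
score = subscale_score(faculty_items)
# For a 7-item subsection, scores range from 7 (all "never")
# to 28 (all "very often").
```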
The College Environment Section
This section (in both instruments) contains seven questions in which students are asked to rate, using a scale of 1 (weak) to 7 (strong), how much emphasis their institution places on various aspects of a college environment. Students estimate whether the institution will be scholarly and intellectual, will work toward understanding diversity, or will emphasize vocational studies, among other qualities. Students are also asked how supportive they think other students, faculty, and administrators will be.
Gains in College
The CSEQ contains a section that is not mirrored in the CSXQ, in which respondents are asked to estimate the extent to which they have made progress on 25 goals. These goals include such items as "learning on your own, pursuing ideas, and finding information you need" and "thinking analytically and logically." The response scale for these items is "very much," "quite a bit," "some," and "very little." The degree to which a student feels he or she has achieved a goal can then be compared with how engaged that student is. This allows institutional researchers to study relationships between the level of student engagement and the degree to which students achieve outcomes. For example, researchers could examine whether students who are more engaged with faculty actually report developing analytical thinking skills.
Background Information
In this section, basic demographic information is requested, such as age, sex, race, residence, and parental education levels. Students are also asked their expected major, grades, and anticipated time spent working on classes and at jobs.
Liberal Arts Outcomes and the CSEQ/CSXQ
The CSXQ and CSEQ are instruments with a clear and well-aimed purpose. They are higher education assessment surveys focusing on curricular and co-curricular experiences that promote student learning, and are not designed to assess outcomes, per se. However, the surveys include indirect measures that assess the extent to which students engage in educational practices associated with high levels of learning and development, such as frequent interaction with faculty, challenging coursework, and supportive relationships with faculty, administrators, and peers.
Research done by the Center of Inquiry in the Liberal Arts suggests that the degree to which students experience a combination of these practices during their college career correlates positively to liberal arts outcomes. For example, a student who experiences high levels of faculty interaction; supportive relationships with faculty, administrators, and peers; and challenging coursework in college will likely demonstrate growth in areas like openness to diversity and critical thinking, among others. This same research suggests that these gains will be more pronounced for women, high-risk students, and minority students.
Connecting educational practices with liberal arts outcomes is compelling. Data from the CSEQ/CSXQ surveys are intended to be interpreted at an institutional level. Survey results provide institutions with information about the quality of the undergraduate student experience/expectations and can suggest the presence or absence of a liberal arts environment. However, capturing the broader intellectual character of a school can be challenging when using aggregate survey data gathered from student perceptions of curricular and co-curricular practices. Such information is certainly useful, but additional data points and qualitative approaches should be used to provide a more in-depth and rich representation of institutional character.
Using the CSEQ and the CSXQ Together
Generally, students come to college with high expectations. [4] Combining the CSXQ at the beginning of the year with the CSEQ at the end of the year, therefore, is much more valuable than administering the CSXQ as a stand-alone instrument. The CSXQ describes the first-year student before he or she is exposed to the college environment. The CSEQ contains many items that measure involvement in that environment, plus a section on self-described goals. In short, the pre- and post-test format allows researchers to compare expectations with experiences, and look for relationships between goals and student characteristics, expectations, or demographics.
Why might you want to know about student expectations?
For one thing, knowing the expectations that students carry with them to college helps to understand the mindset of entering students. Institutions will know if students have high expectations for diverse interactions, if they are focused on being active in the classroom, or more focused on involvement in student organizations. Results of the CSXQ can provide evidence and information for aligning student expectations with institutional goals.
If student expectations are in line with the institutional focus, then so much the better, and institutions can then assess how expectations compare to experiences. However, this is often not the case. If, for instance, CSXQ results indicate that 85% of an institution's incoming class think they will be engaged in faculty research their first year, but CSEQ results suggest that only 20% of first-year students have actually performed such research, a significant discrepancy between expectations and reality exists, and the institution may choose to re-evaluate its priorities. It may either increase opportunities for first-year students to perform research, or work proactively (during the admissions process, perhaps) to change the expectations of the students. Not surprisingly, evidence suggests that students with unmet expectations are more likely to contemplate leaving the institution. [1] Admitting students with unrealistic expectations does neither the student nor the institution any good if the unfulfilled student transfers to another school or drops out of college. Matching expectations and experiences can, therefore, help retention.
Caveats
There is one hole in this argument, however: the assumption that a discrepancy between the expectations of an incoming student and the experiences of that student after a full year of college is a problem, because that student still expects, at the end of the year, to have achieved (or at least progressed toward) the earlier expectations. Perhaps another explanation accounts for the difference. Suppose an incoming student has played soccer for ten years and enjoys the sport immensely. She fills out the CSXQ, reporting that she will "very often" play a team sport. Once at college, her roommate drags her to debate club meetings, and she finds a passion for debating that did not exist when she completed the CSXQ. In high school, she spent so much time in soccer practice that she never participated in organizations like the debating team, so she checked "never" when asked on the CSXQ if she would attend a "meeting of a campus club, organization, or student government group." Now she is involved in debating, and finds that her schedule does not allow for both soccer and the debate team. She has played soccer for years and wants to try something new, so she weighs the options and chooses to drop soccer in favor of debate. Looking at her CSXQ/CSEQ responses for team sports, one might categorize this situation as a disappointment for this student, when actually the student has simply had a change in expectations. Researchers should expect that students without experience in a college setting will, once they have some experience, readjust their expectations. This means looking more carefully at the survey results to determine whether student efforts are shifting, rather than being curtailed.
A small concern in the demographics section is that the age categories are coarse for a group of first-year students: the lowest group is "19 and younger" and the next is "20-23." An 18-year-old likely has different expectations than a 19-year-old, and there is even more difference between a 20-year-old and a 23-year-old first-year student, yet the instrument does not allow these distinctions to be examined.
Also, institutions might consider having several students take a "trial run" of the CSXQ/CSEQ to identify any points that might be confusing for the intended test takers. For example, the questionnaire gathers information on course load in the first term by asking for "credit hours." Some institutions don’t have a "credit hour" system, so this question could confuse first-year students. In this case, students could be told to disregard the question on credit hours.
Experiences with the CSXQ
There are some readily available web-based reports on experiences with the CSXQ that might help institutions decide whether to administer the survey, and how to do so.
Appalachian State University compared the results obtained via paper to those obtained from web-based administration. While they do not report the response rates for the two conditions, they do report that the individuals assigned to the online version were more likely to answer every question on the form than those assigned to the paper version (98% versus 50%). They also found that expectations were higher among students completing the online version. Without information on relative response rates, however, it is difficult to interpret the possible reasons and implications of the finding.
One institution in the Minnesota State University system used CSXQ results to show how much time students thought they would spend studying in college, compared to the number of credit hours in which they were enrolled (link to the PowerPoint Slide Show). This indicated a lack of understanding among students of how much studying was required per credit hour. It exemplifies how student expectations, if understood through assessment, can be addressed early on by communicating how much time should actually be spent studying.
Experiences with the CSEQ
A great deal of published research using the CSEQ exists. Of particular interest are studies looking at how institutions use the survey results. As with the CSXQ, numerous reports are available on institutional websites.
Returning to Appalachian State University, one can see an example of a short report (pdf) that compares expectations from the CSXQ with experiences from the CSEQ. A similar use of the instruments is illustrated by another short report (pdf) from Arizona State.
Some reports illustrate selective uses of the data, such as how Wake Forest University used CSEQ results to examine the impact of a computing initiative (link to the PowerPoint Slide Show). Others use CSEQ data as part of the re-accreditation self-studies, such as in this report from Shippensburg University.
Conclusion
The CSEQ and CSXQ are excellent tools for understanding students' engagement and their expectations of engagement. While both can be used as stand-alone surveys, this author particularly recommends using the CSEQ in combination with the CSXQ to best see how student expectations have been realized throughout the institution. Using the CSEQ/CSXQ surveys to look at liberal arts outcomes can be of use when examining levels of student engagement in "best practices," but other instruments looking more specifically at desired liberal arts outcomes (the CCTST/CCTDI, for example, to look at critical thinking) should be considered, along with qualitative approaches to capture the overall institutional culture.
References