
Assessing the Use of the Your First College Year Instrument for Liberal Arts Related Outcomes


Introduction


The survey instrument Your First College Year (YFCY) was created for higher education researchers and practitioners to gain perspective on the personal growth and academic development of freshman college students. The YFCY collects data on both cognitive (thinking/learning) and affective (feeling/emotions) measures, and it provides comprehensive institutional and comparative data for analysis in three areas: 1) student persistence from year to year, 2) adjustment to college life, and 3) other first-year learning outcomes (Higher Education Research Institute, 2003). In evaluating whether the YFCY is a useful tool for assessing the outcomes of a liberal arts education, it is important to consider its history, the overall goal of the survey instrument and its key characteristics, its validity and reliability, and the particular areas of the instrument that focus on liberal arts related values and outcomes.

History of YFCY and its use

Seeing a need for comprehensive, longitudinal data (information gathered over time) on the first-year experience, UCLA's Higher Education Research Institute (HERI) and the Policy Center on the First Year of College at Brevard College developed the YFCY. It is the only national survey that specifically targets and assesses the academic and personal development of first-year college students. The survey instrument, initiated in 2000, allows institutions to identify features of the first year that encourage learning, campus involvement, retention and academic success, and overall satisfaction. Data from this instrument are especially useful for campuses seeking to enhance first-year programs and retention strategies and can be used to compare institutions and students, describe students and their characteristics, create long-range studies, and analyze trends in student attitudes, behaviors, and other areas.

For comparative purposes, students' responses can be compared to national and institutional peer groups as a whole. This allows participating institutions to see where their first-year students stand in relation to other first-year students. Responses can also be disaggregated in order to compare outcomes (e.g., adjustment or retention) for different groups of students, such as students in learning communities or specific programs, as well as by gender, race, or campus residence. The YFCY also offers space for institutions to add their own supplementary questions in order to conduct a within-institution analysis. For example, if a particular campus would like to know more about higher-order learning outcomes such as the ability to think critically and synthesize the work of others, it can create its own survey questions that measure these items.
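As one illustration of the kind of subgroup comparison described above, the sketch below disaggregates a hypothetical export of YFCY responses. The file name, column names, and scoring are invented for illustration and are not drawn from the actual YFCY data dictionary.

    import pandas as pd

    # Hypothetical CSV export of YFCY responses; column names are invented.
    responses = pd.read_csv("yfcy_responses.csv")

    # Compare mean scores on an academic-adjustment item for students in
    # learning communities versus all other first-year students.
    by_program = responses.groupby("learning_community")["adjust_academic"].agg(["mean", "count"])
    print(by_program)

    # The same pattern supports disaggregation by gender, race, or campus residence.
    by_residence = responses.groupby("campus_residence")["adjust_academic"].agg(["mean", "count"])
    print(by_residence)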

Descriptive analyses provide information on cognitive (thinking) and affective (feeling) measures for single or multiple institutions. The instrument answers questions related to students' academic experiences, plans to return the following year, first-year adjustment issues, time spent in different activities, and the larger issues of student values, goals, and attitudes.

Designed as a follow-up to the Cooperative Institutional Research Program (CIRP) Freshman Survey, the YFCY also allows for long-range research in evaluating academic and personal development. Additionally, it considers the impact of institutional programs, policies, and practices on student outcomes and experiences. These data can be used along with local, campus-based assessment to develop a broader picture of the first-year experience. For instance, campuses can use this survey instrument to measure whether their freshman students are adjusting to college life and whether particular first-year programs are achieving specific outcomes, such as developing attitudes of service to others and the community.

The YFCY is also able to analyze trends over time because it repeats survey items from previous years. Individual campuses are then able to assess trends in such areas as first-year student characteristics, behaviors, values, satisfaction, and adjustment to college life.
   
Strengths and weaknesses of the YFCY: key characteristics

There are several positive aspects of this instrument that make it a useful tool for understanding students' first-year experiences. First, it asks a broad range of questions in order to provide comprehensive data on the freshman experience. Second, administration of the tool is flexible: it can be done via the web, on paper, or through a combination of both. Schools have latitude in how they administer the YFCY, so they can choose their own method of sampling and design their own marketing strategies. However, HERI staff indicate that the best response rate is achieved when the survey is administered in a proctored setting such as a class. Schools have also reported success administering the survey in campus dorms, where incentives play a role in motivating students to return completed surveys.

Third, the turnaround time for administering the survey and receiving campus reports is reasonable. Campuses must register by August, and the survey is released in late February; schools can administer the survey any time in the spring but must return all surveys by June. Surveys are then sent to an outside contractor, where the statistical data are processed, and campuses usually receive their reports by late August. Campus reports include the YFCY Institutional Profile, a Means Profile of the institution's responses compared to respondents in other groups, and a Factor Analysis of National Aggregate Data that clusters survey items under broader topic headings.

A fourth benefit of the survey is the provision of data that can be merged with other campus-specific data (such as registrar information). The YFCY can be used as a stand-alone instrument, though HERI staff indicate it is more effective when used as a follow-up to the CIRP survey. Fifth, the cost to administer the YFCY, including campus reports, is modest: the basic participation fee is $450.00, plus $2.00 for each returned survey. Finally, one of the greatest strengths of the survey is the section allocated for campuses to design their own survey items. This is particularly useful for individual institutions seeking to measure campus-specific outcomes related to the liberal arts, such as higher-order thinking skills.
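For a sense of scale, the following back-of-the-envelope calculation applies the fee structure stated above to a hypothetical number of returned surveys; the return count is purely illustrative.

    # Estimated participation cost: $450 base fee plus $2.00 per returned survey.
    BASE_FEE = 450.00
    PER_RETURNED_SURVEY = 2.00

    returned_surveys = 600  # hypothetical number of completed surveys
    total_cost = BASE_FEE + PER_RETURNED_SURVEY * returned_surveys
    print(f"Estimated YFCY cost for {returned_surveys} returns: ${total_cost:,.2f}")
    # Prints: Estimated YFCY cost for 600 returns: $1,650.00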

The general weaknesses, or challenges, of the first-year survey are threefold. First, it is a relatively new instrument, initiated in 2000, so there is limited information in terms of empirical testing. Second, while it can be administered as a stand-alone instrument, it was designed as a follow-up to the CIRP survey; thus, an institution that would like to administer both surveys may face budget constraints that make the combined cost prohibitive. Third, because it only measures first-year outcomes, the data are not as rich as they would be if the instrument measured liberal arts outcomes after four years of college experience. It should also be noted that, while the overall survey does measure what it claims to measure, it was not solely designed to measure the outcomes of liberal arts education.

Validity and reliability of the YFCY
 
Psychometric testing by HERI indicates that the YFCY is moderately reliable, with reliability coefficients ranging from .10 to .97, meaning that the instrument is reasonably consistent and dependable. Instrument validity was considered in terms of both construct validity and content validity. Construct validity assesses whether an instrument actually measures the theoretical construct it is intended to measure (3). The YFCY uses Astin's (1) theoretical model of assessment and appears to measure what it claims to measure; this is explained further in a later section of this review. Feedback from student focus groups indicated a high degree of content validity: students reported and reflected on the clarity and relevance of each question and explained their understanding of each response option.
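To make the reliability discussion concrete, the sketch below computes Cronbach's alpha, one commonly used internal-consistency coefficient, for a small cluster of Likert-type items. The item responses are invented, and there is no claim that this is the particular coefficient or procedure HERI used; it simply illustrates what a reliability coefficient in the .10 to .97 range summarizes.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for a respondents-by-items matrix of scored responses."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                            # number of items in the cluster
        item_var = items.var(axis=0, ddof=1).sum()    # sum of individual item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale scores
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Five hypothetical respondents answering three related Likert items.
    sample = np.array([
        [4, 4, 5],
        [3, 3, 3],
        [5, 4, 5],
        [2, 2, 3],
        [4, 5, 4],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(sample):.2f}")  # about 0.91 for this sample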

Assessing the YFCY: Its use in relation to assessing liberal arts outcomes

The real question to ask is, "How does this instrument contribute to our understanding of liberal arts education?" To answer this question, it is important first to define, in general, what is meant by a liberal arts education. This is often difficult because various campuses may define it differently. For instance, the Center of Inquiry in the Liberal Arts at Wabash College has developed a "provisional" theory that hypothesizes three conditions must exist in order to support liberal arts education: 1) institutional traditions must place a higher value on developing a set of intellectual skills than on developing vocational or professional skills, 2) curricular and environmental structures must exist in combination in order to create cohesiveness and integrity in students' intellectual experiences, and 3) institutional traditions must place a high value on interactions outside the classroom, including student/peer interactions and student/faculty interactions.

In conjunction with this provisional theory, it is useful to consider the work of Pascarella, Blaich, Wolniak, and Seifert (4). They approached their data by looking at what they considered "uniquely 'liberal arts' about students' educational experiences in college" (p. 2, handout from a workshop at the Center of Inquiry in the Liberal Arts, Wabash College). In their view, a liberal arts education means that students "go to college full-time, experience effective teaching, have high levels of interaction with faculty and peers in non-classroom and classroom settings, are encouraged to high levels of academic expectation and effort, learn in a setting that focuses on the integration of ideas, and take courses that emphasize study of the liberal arts rather than vocational/technical areas" (p. 2). While this approach shares components with the Wabash College definition, there are notable differences, in particular the focus on full-time college attendance, effective teaching practices, and encouragement of high levels of academic expectation and effort. While these components may be assumed in the Wabash College provisional theory, they were not explicitly stated.

The work of Pascarella, Blaich, Wolniak, and Seifert (4) revealed that campuses with a strong liberal arts emphasis had positive effects on student learning outcomes in five areas: reading comprehension, critical thinking, science reasoning, writing skills, and openness to diversity. Our discussion at the Wabash College Reviewers' Workshop (Spring 2004) also led to considering student outcomes in terms of lower-order and higher-order outcomes, a distinction that proved useful when assessing liberal arts outcomes. Basic writing, reading, and math skills were classified as lower-order skills. Higher-order skills included critical thinking, responsible citizenship, the ability to self-reflect, wisdom, analytical ability, living humanely, effective use of judgment, the ability to substantiate an argument, the ability to synthesize others' work, the ability to critique the thoughts of others, and the ability to interact effectively and meaningfully with other cultures.

In terms of the above frameworks for assessing liberal arts education, it seems that in several respects the YFCY does indeed measure some of these outcomes. The survey is designed to show self-reported change over the freshman year and includes measures of analytical and problem-solving skills, critical thinking skills, knowledge of other cultures, ability to get along with others, and understanding of local, national, and global issues. Other items in the survey that appear to assess liberal arts related outcomes (perhaps more indirectly) ask students to rate the personal importance of goals such as influencing the political structure, influencing social values, helping others, contributing to society through writing or other artistic work, becoming involved in environmental issues, developing a meaningful philosophy of life, and promoting racial understanding.

The YFCY as a model of assessment

The YFCY is methodologically grounded in aspects of Astin’s (1) model of assessment. Astin (1) claims that campus assessment should identify three factors that contribute to students’ educational development. First, colleges need to know something about students before they enroll in college, such as demographic information. Secondly, campuses need to know something about students’ experiences while they are in college, such as experiences related to faculty interactions. Finally, campuses need to know how experiences on campus affected student outcomes. For instance, did regular meetings with faculty members outside of the classroom increase student satisfaction? Or, did classroom activities foster student integration of knowledge across disciplines?
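To make this input-environment-outcome logic concrete, the sketch below shows one common way such questions are examined: regressing a first-year outcome on an environment measure while controlling for a pre-college input. The merged dataset, the column names, and the choice of ordinary least squares are hypothetical illustrations, not HERI's own analysis procedure.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical merged file of pre-college (input), campus experience
    # (environment), and first-year outcome variables.
    data = pd.read_csv("yfcy_merged.csv")

    # Outcome: first-year satisfaction. Input control: high school GPA.
    # Environment: frequency of out-of-class interaction with faculty.
    model = smf.ols("satisfaction ~ hs_gpa + faculty_interaction", data=data).fit()
    print(model.summary())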

The YFCY does ask for demographic information, but primarily focuses on assessing the campus environment and campus outcomes. Assessment of the campus environment is related to questions that directly address students’ experiences on campus, such as time spent with faculty, use of campus facilities, or frequency of behaviors such as skipping class or using tutoring services. Assessment of outcomes is related to questions that address student learning, student adjustment, intent to reenroll, and overall satisfaction.

Concluding comments about use of the YFCY survey instrument

While there are aspects of the YFCY survey that would measure liberal arts outcomes, such as growth in analytical ability and critical thinking, increased understanding of people from different cultures, and involvement in civic programs (to name a few), the YFCY was not designed with the intent to assess liberal arts outcomes. Rather, the survey was designed to measure student adjustment, satisfaction, and intent to reenroll. However, a positive aspect of the YFCY is that it enables campuses to design 30 survey items of their own that can further assess the outcomes of a liberal arts education.

References

 

  1. Astin, A. W. (1991). Assessment for excellence. Phoenix, AZ: Oryx Press.

  2. Blaich, C., Bost, A., Chan, E., & Lynch, R. (2004). Executive summary: Defining liberal arts education. Retrieved March 12, 2004, from http://liberalarts.wabash.edu/cila/home.cfm?news_id=1400

  3. Light, R. J., Singer, J. D., & Willett, J. B. (1990). By design: Planning research on higher education. Cambridge, MA: Harvard University Press.

  4. Pascarella, E. T., Blaich, C. F., Wolniak, G. C., & Seifert, T. A. (2004). Handout from a workshop at the Center of Inquiry in the Liberal Arts, Wabash College.

  5. Your First College Year (2004). Retrieved March 3, 2004, from http://www.gseis.ucla.edu/heri/yfcy