Understanding Survey Results
Excluded Respondents
Respondents Included in Raw Data File
Benchmark Reports
Means and Frequency Reports
Standards for Interpreting Mean Differences
CCSSE Sampling, Weighting, and Local Student Characteristics
Student Level Breakout Definitions
Student Identifier Data
Recorded webinar: CCSSE Reporting Overview
Excluded Respondents
The total count of respondents in an institution's raw data file will differ from the numbers reported in institutional reports because certain surveys are intentionally excluded in the online reporting system. Respondents are excluded from the institutional reports for the following reasons.
- The respondent did not indicate full-time or part-time enrollment at the institution. (Variable: ENRLMENT is blank)
- The survey is invalid. A survey is invalid if a student does not answer any of the 19 sub-items in item 4, answers “very often” to all 19 sub-items, or answers “never” to all 19 sub-items (see the sketch after this list). (Variables: CLQUEST, CLPRESEN, REWROPAP, INTEGRAT, CLUNPREP, CLASSGRP, OCCGRP, TUTOR, PARTICCBP, EMAIL, FACGRADE, FACPLANS, FACIDEAS, FACFEED, WORKHARD, FACOTH, OOCIDEAS, CONVSTUDIFF, SKIPCLAS)
- Respondents indicated their age as under 18. Note: Data for these students are not returned in the raw data file or institutional reports. (Variable: AGENEW is 1. Responses where age was not marked are included in the analysis)
- Respondents indicated that they had taken the survey in a previous class or they left item 3 blank. (Variable: TAKEB4 is 1 or blank)
- Oversampled respondents are not included in online reports because they are selected outside of CCSSE's primary sampling procedures. (Variable: PSAMPLE is 0) *Not applicable to the online administration
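For colleges that want to reproduce the invalid-survey check on their own raw data file, the sketch below shows one way to flag such records with pandas. It assumes the 19 item 4 sub-items are coded numerically (here, 1 = “never” and 4 = “very often”) and that unanswered sub-items are blank; confirm the actual response codes against your codebook.

```python
import pandas as pd

# The 19 item 4 sub-item variables listed above
ITEM4_VARS = [
    "CLQUEST", "CLPRESEN", "REWROPAP", "INTEGRAT", "CLUNPREP", "CLASSGRP",
    "OCCGRP", "TUTOR", "PARTICCBP", "EMAIL", "FACGRADE", "FACPLANS",
    "FACIDEAS", "FACFEED", "WORKHARD", "FACOTH", "OOCIDEAS", "CONVSTUDIFF",
    "SKIPCLAS",
]

# Assumed response coding; confirm against your codebook
NEVER, VERY_OFTEN = 1, 4

def flag_invalid(df: pd.DataFrame) -> pd.Series:
    """Return True for rows meeting the invalid-survey definition above."""
    items = df[ITEM4_VARS]
    all_blank = items.isna().all(axis=1)                # no sub-items answered
    all_never = items.eq(NEVER).all(axis=1)             # "never" on every sub-item
    all_very_often = items.eq(VERY_OFTEN).all(axis=1)   # "very often" on every sub-item
    return all_blank | all_never | all_very_often

# Example: valid_df = df[~flag_invalid(df)]
```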
Respondents Included in Raw Data File
Raw data files contain responses from all students who completed CCSSE, with the exception of invalid surveys and those completed by students under the age of 18. When working with your data file, note that excluded respondents do not have a value for the IWEIGHT variable. Therefore, to run analyses without excluded respondents, simply remove any observations where IWEIGHT is missing. This ensures that the analysis includes only primary sample respondents who do not meet any of the exclusion criteria.
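A minimal sketch of this filtering step, assuming the raw data file has been exported to CSV (the file name below is hypothetical):

```python
import pandas as pd

# Hypothetical file name; substitute the raw data file downloaded for your institution
df = pd.read_csv("ccsse_raw_data.csv")

# Excluded respondents have no IWEIGHT value, so keeping rows where IWEIGHT is
# present leaves only the respondents used in institutional reports
analysis_df = df[df["IWEIGHT"].notna()].copy()

print(f"Rows in file: {len(df)}  Rows retained for analysis: {len(analysis_df)}")
```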
For the online administration, the number of respondents contained in the raw data file may differ from the data displayed in the Responder Tool during the administration. The Responder Tool displays only the number of respondents who reach the end of the survey, whereas the data file contains data from all responders except those under the age of 18, those who did not report enrollment status, and those who provided invalid responses on the 19 sub-items in item 4.
Benchmark Reports
CCSSE benchmarks are groups of conceptually related survey items that focus on institutional practices and student behaviors that promote student engagement—and that are positively related to student learning and persistence. Benchmarks are used to compare each institution's performance to that of similar institutions and to the CCSSE Cohort. The five benchmarks of effective educational practice in community colleges are active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners.
Benchmark reports consist of tables showing the college's scores on each benchmark, followed by means and frequency tables of the items in each benchmark. While the benchmark scores provide an overview of how the college is doing in particular areas, colleges should also examine the results for the individual survey items that compose each benchmark.
The benchmark reports also include a graph that displays raw benchmark scores over the college's most recent administrations, up to the last four. The raw benchmarks over time graph can be found on the third tab (Raw Benchmarks & Graphs) of the Benchmark Excel files in Standard Reports.
Means and Frequency Reports
Responses to individual CCSSE survey items are summarized in two formats—means and frequencies.
Means reports present an average for each survey item that has scaled responses (e.g., strongly agree to strongly disagree) and compare average item responses between member colleges and various groups (e.g., similarly sized colleges), or between subgroups within a college (e.g., men & women). Means are not run on dichotomous items, i.e., those with only two response options (e.g., yes/no; enrolled/not enrolled). These items are summarized in the frequency reports.
Frequency reports present the observed frequencies of occurrence (counts and percentages) of the values for each survey item, excluding demographic survey items. These reports are useful for understanding how data are distributed across response categories. Please note that counts and percentages on frequency reports are subject to rounding.
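For colleges reproducing a frequency table from the raw data file, the sketch below shows counts and percentages for a single item. The file name is hypothetical, FACFEED (an item 4 variable) is used only as an example, and numeric response codes are assumed.

```python
import pandas as pd

# Hypothetical file name; substitute your institution's raw data file
df = pd.read_csv("ccsse_raw_data.csv")

counts = df["FACFEED"].value_counts(dropna=True).sort_index()
percents = (counts / counts.sum() * 100).round(1)  # percentages are rounded, as noted above

print(pd.DataFrame({"Count": counts, "Percent": percents}))
```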
Standards for Interpreting Mean Differences
When interpreting mean differences across comparison groups, CCCSE uses a combination of two measures: (1) a t-test with a very conservative alpha level of .001 or less is used to determine if the difference between two means is significant and not likely due to chance, and (2) an effect size of .20 (absolute value) or more using Cohen's d is used to show the magnitude of difference between the two means. If a comparison is significant at an alpha level of .001 or less and has an effect size of .20 or greater, then it is considered to be a statistically significant difference worthy of further investigation. Comparisons that meet these criteria are marked with a double-asterisk (**). For internal analysis of small groups, it may make sense for colleges to use a larger alpha level but typically not a larger effect size.
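The sketch below illustrates the two criteria on unweighted data, using an independent-samples t-test and a pooled-standard-deviation Cohen's d. It is a simplified illustration only; CCCSE's exact computation (for example, how survey weights are handled) may differ.

```python
import numpy as np
from scipy import stats

def flag_notable_difference(group_a, group_b, alpha=0.001, min_effect=0.20):
    """Apply the two criteria described above to two sets of item scores.

    Returns (p_value, cohens_d, flagged), where flagged corresponds to the
    double-asterisk (**) marking in the reports.
    """
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)

    # (1) Independent-samples t-test against a conservative alpha of .001
    t_stat, p_value = stats.ttest_ind(a, b)

    # (2) Cohen's d using the pooled standard deviation
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    cohens_d = (a.mean() - b.mean()) / pooled_sd

    flagged = (p_value <= alpha) and (abs(cohens_d) >= min_effect)
    return p_value, cohens_d, flagged
```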
CCSSE Sampling, Weighting, and Local Student Characteristics
Paper-and-Pencil Administration
In CCSSE sampling procedures, students are sampled at the classroom level. As a result, full-time students, who by definition are enrolled in more classes than part-time students, are more likely to be sampled. This introduces a sampling bias resulting in disproportionately more full-time students completing the survey. Statistical weighting, discussed in detail below, is used to adjust for this bias.
Online Administration
In order to boost responses to the online survey, sampling is not employed; the survey invitation is sent to all eligible students. Research shows that full-time students and women are more likely to complete online surveys than part-time students and men. As a result, full-time students and women are over-represented in the final sample. Statistical weighting, discussed in detail below, is used to adjust for this bias.
Statistical Weighting
Prior to CCSSE 2022 reporting, weights were based solely on enrollment status to address bias introduced by the sampling process, i.e., full-time students, who by definition are enrolled in more classes than part-time students, are more likely to be sampled. However, respondents to online surveys tend to be disproportionately full-time and women. Therefore, for CCSSE and SENSE reporting moving forward, we are introducing new post-stratification weights based on both enrollment status and gender identity. Weights are calculated for each institution and are based on the most recent publicly available IPEDS enrollment figures. These new weights will be applied to both the paper-and-pencil and online survey results. The use of the new weights will not negatively impact the analysis of the paper-and-pencil surveys. For example, if the proportion of men and women among your survey respondents exactly matches the proportion in your population, results based on the new weighting would be the same as results based on enrollment status alone.
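The general idea behind post-stratification weighting is sketched below: within each enrollment-by-gender cell, a respondent's weight is the cell's population share divided by its sample share. The IPEDS proportions and column names shown are hypothetical placeholders, and the sketch is illustrative only; it is not CCCSE's exact procedure.

```python
import pandas as pd

# Hypothetical IPEDS-based population shares for each enrollment-by-gender cell;
# replace with your institution's most recent published figures (shares sum to 1)
POPULATION_SHARES = {
    ("Full-time", "Man"): 0.18, ("Full-time", "Woman"): 0.22,
    ("Part-time", "Man"): 0.25, ("Part-time", "Woman"): 0.35,
}

def post_stratification_weights(df, enroll_col="ENRL_LABEL", gender_col="GENDER_LABEL"):
    """Weight each respondent by (population share of cell) / (sample share of cell)."""
    sample_shares = df.groupby([enroll_col, gender_col]).size() / len(df)
    cell_weights = {cell: pop / sample_shares[cell]
                    for cell, pop in POPULATION_SHARES.items() if cell in sample_shares}
    return df.apply(lambda r: cell_weights.get((r[enroll_col], r[gender_col])), axis=1)

# Example: df["WEIGHT"] = post_stratification_weights(df)
```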
Deactivating Weights
Under certain circumstances, deactivating weights may be a more informative way to examine institutional CCSSE data. Even the most recent IPEDS data are approximately two years old and may not always accurately represent a college's current student population. For example, if a college has experienced a significant change in enrollment characteristics during the two years prior to administering CCSSE, its institutional research department may want to consider whether weights based on the IPEDS numbers are appropriate.
For example, a college where the vast majority of students share one enrollment status (e.g., 92% full-time) may want to look at the unweighted results for that majority group of students to guide campus discussions.
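One way to gauge whether the weights matter for a given item is to compare weighted and unweighted means side by side, as in the sketch below. The file name is hypothetical, FACFEED is used only as an example item, and numeric response codes are assumed.

```python
import numpy as np
import pandas as pd

# Hypothetical file name; substitute your institution's raw data file
df = pd.read_csv("ccsse_raw_data.csv").dropna(subset=["IWEIGHT", "FACFEED"])

weighted_mean = np.average(df["FACFEED"], weights=df["IWEIGHT"])
unweighted_mean = df["FACFEED"].mean()

print(f"Weighted mean: {weighted_mean:.2f}  Unweighted mean: {unweighted_mean:.2f}")
```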
Student Level Breakout Definitions
Breakout reports (benchmarks, means, and frequencies) are available for each of the areas below. Each category is based on student responses to specific survey items.
Full-Time & Part-Time (Enrollment Status)
Item 2: “Thinking about this current academic term, how would you characterize your enrollment at this college?”
Developmental & Non-Developmental
Three sub-items in Item 8: “Which of the following have you done, or are you currently doing at this college?”
8c. Developmental/remedial reading course (also referred to as Basic Skills, College Prep, etc.)
8d. Developmental/remedial writing course (also referred to as Basic Skills, College Prep, etc.)
8e. Developmental/remedial math course (also referred to as Basic Skills, College Prep, etc.)
If a student responded that they have taken or are currently taking any one or more of these three types of courses, they are classified as Developmental; if a student responded that they have neither taken nor are currently taking any of these three types of courses, they are classified as Non-Developmental. In addition, to be classified as either Developmental or Non-Developmental, a student must have responded to all three items.
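A sketch of this classification logic is shown below. The variable names for items 8c-8e and the response coding (1 = have taken / currently taking) are assumptions; confirm both against your codebook.

```python
import pandas as pd

# Hypothetical variable names for items 8c-8e
DEV_ITEMS = ["DEVREAD", "DEVWRITE", "DEVMATH"]

def developmental_status(row: pd.Series):
    responses = row[DEV_ITEMS]
    if responses.isna().any():      # must respond to all three items to be classified
        return None
    if (responses == 1).any():      # took (or is taking) at least one developmental course
        return "Developmental"
    return "Non-Developmental"

# Example: df["DEV_STATUS"] = df.apply(developmental_status, axis=1)
```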
Traditional & Nontraditional-Age
Item 38: “Mark your age group.”
Respondents under age 18 are excluded from all data sets. Respondents marking age groups 18-19, 20-21, and 22-24 are classified as Traditional-Age and those marking age groups 25-29, 30-39, 40-49, 50-64, or 65+ are classified as Nontraditional-Age.
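A minimal sketch of this classification, using the age-group labels directly (how these groups are coded in AGENEW should be confirmed against your codebook; respondents under 18 are already excluded):

```python
# Age-group labels from item 38
TRADITIONAL = {"18-19", "20-21", "22-24"}
NONTRADITIONAL = {"25-29", "30-39", "40-49", "50-64", "65+"}

def age_breakout(age_group):
    if age_group in TRADITIONAL:
        return "Traditional-Age"
    if age_group in NONTRADITIONAL:
        return "Nontraditional-Age"
    return None  # item 38 not answered
```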
First-Generation & Not First-Generation
Item 47: “Who in your family has attended at least some college? (Mark all that apply)”
If respondents indicated that their mother or father had attended at least some college, then those students are classified as Not First-Generation. If respondents did not indicate that their mother or father attended at least some college, but selected any other response option (including “no one”), they are classified as First-Generation. If respondents did not select any response options, their first-generation status is classified as missing.
Note: Prior to the 2023 administration, if respondents indicated that their mother or father had attended at least some college, then those students were classified as Not First-Generation; otherwise, students were classified as First-Generation.
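A sketch of the current (2023 and later) classification logic is shown below. The mark-all-that-apply indicator variables for item 47 are hypothetical names, and the coding (1 = option selected, blank = not selected) is an assumption; confirm both against your codebook.

```python
import pandas as pd

# Hypothetical indicator variables for the item 47 response options
PARENT_COLS = ["FAMCOLL_MOTHER", "FAMCOLL_FATHER"]
OTHER_COLS = ["FAMCOLL_SIBLING", "FAMCOLL_CHILD", "FAMCOLL_SPOUSE", "FAMCOLL_NOONE"]

def first_generation_status(row: pd.Series):
    if (row[PARENT_COLS] == 1).any():
        return "Not First-Generation"
    if (row[OTHER_COLS] == 1).any():   # selected another option, including "no one"
        return "First-Generation"
    return None                        # no options selected: status is missing
```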
Gender Identity
Item 39: “Your gender identity.”
Race/Ethnicity
Item 45: “What is your racial or ethnic identification? (Mark all that apply)”
Race/Ethnicity is coded as "I prefer not to respond" if a student selected this response option, regardless of whether another response was also marked.
If a student selected more than one race/ethnicity response option, the race/ethnicity variable is set to "Two or More Races."
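A sketch of this coding logic, using hypothetical indicator variables for the item 45 response options (1 = selected); confirm the actual names, including the "prefer not to respond" variable, against your codebook.

```python
import pandas as pd

# Hypothetical indicator variables for the item 45 response options
RACE_COLS = ["RE_AMERIND", "RE_ASIAN", "RE_BLACK", "RE_HISPANIC",
             "RE_PACIFIC", "RE_WHITE", "RE_OTHER"]
PREFER_NOT = "RE_PREFERNOT"

def race_ethnicity(row: pd.Series):
    if row[PREFER_NOT] == 1:
        return "I prefer not to respond"   # takes precedence over any other selection
    selected = (row[RACE_COLS] == 1)
    if selected.sum() >= 2:
        return "Two or More Races"
    if selected.sum() == 1:
        return selected.idxmax()           # single selection; map to its label via the codebook
    return None                            # nothing selected
```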
0 to 29 Credits & 30+ Credits
Item 33: “How many total credit hours have you earned at this college, not counting the courses you are currently taking this academic term?”
Not Online-Only & Online-Only Students
Item 32: “During the current academic term, how many classes are you taking...” (a) Face-to-face, (b) Online, (c) Hybrid
If a student took the survey on paper (in a classroom) or indicated taking at least one face-to-face or hybrid class, they are categorized as “Not Online-Only.”
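A sketch of this classification is shown below. The variable names for survey mode and the item 32 class counts are hypothetical, and assigning students who report only online classes to “Online-Only” is an assumption based on the category names; confirm both against your codebook.

```python
import pandas as pd

# Hypothetical variable names for survey mode and the item 32 class counts
def _count(row: pd.Series, col: str):
    val = row.get(col)
    return 0 if pd.isna(val) else val

def online_only_status(row: pd.Series):
    if row.get("SURVEYMODE") == "paper":                             # took the survey in a classroom
        return "Not Online-Only"
    if _count(row, "NUMF2F") > 0 or _count(row, "NUMHYBRID") > 0:    # items 32a and 32c
        return "Not Online-Only"
    if _count(row, "NUMONLINE") > 0:                                 # item 32b
        return "Online-Only"
    return None
```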
How Calculated Variables are Created
Student Identifier Data
In accordance with Texas state law and The University of Texas at Austin's policies, CCCSE does not provide student-identifier data in the institution's raw data file available for download via the CCSSE online reporting system. To request a securely transmitted data file with student identifiers, please contact your CCSSE liaison or surveyops@cccse.org.