Center for Community College Student Engagement (CCSSE)
The Center for Community College Student Engagement has designed several surveys that address different aspects of engagement, an important component of learning and a measure of the College's success. Pima Community College has participated in three surveys:
- Community College Survey of Student Engagement (CCSSE)
- Survey of Entering Student Engagement (SENSE)
- Community College Faculty Survey of Student Engagement (CCFSSE)
You can access results from the three surveys below; new results will be posted as they become available. Some reports provide summary information; others provide detailed tables of results.
- CCSSE Key Findings: 2011 | 2014
- SENSE Key Findings: 2011 | 2013 | 2017
- CCFSSE Summary Report: 2014
- CCFSSE Summary Tables: 2011 | 2014
- CCFSSE Promising Practices: 2011 | 2014
- Comparison of findings from CCFSSE and CCSSE: 2011 | 2014
Survey of Entering Student Engagement (SENSE)
The Survey of Entering Student Engagement (SENSE), a product and service of the Center for Community College Student Engagement, helps community colleges discover why some entering students persist and succeed and others do not.
Administered during the 4th and 5th weeks of the fall academic term, SENSE asks students to reflect on their earliest experiences (academic and services-related) with the college. SENSE serves as a complementary piece to the Community College Survey of Student Engagement (CCSSE), with a narrower focus on early student experiences.
The SENSE instrument includes items that elicit information from students about their first impressions of the college; intake processes such as admissions, registration, assessment, placement, orientation, and financial aid; how they spend their time as they begin college; how they assess their earliest relationships and interactions with instructors, advisors, and other students; what kinds of work they are challenged to do; how the college supports their learning in the first few weeks; and so on.
The Center offers four tutorials to help member colleges navigate and understand the various features of the SENSE online reporting system.
The SENSE 2017 (2015-2017) cohort includes 266 institutions in 41 states and the District of Columbia. Of the 2017 cohort colleges, 119 are classified as small (fewer than 4,500 credit students), 57 as medium (4,500-7,999), 61 as large (8,000-14,999), and 29 as extra-large (15,000 or more). PCC is categorized as an extra-large institution.
Key Findings Report
The SENSE Key Findings report provides college-specific data in an easy-to-read, easy-to-share format, including: benchmark comparisons between your college, top-performing colleges, and the SENSE cohort; the aspects of highest and lowest student engagement at your college; and selected results related to academic advising and planning.
Entering & Returning Students at Pima Community College
Identifying Entering and Returning Students: SENSE sampling targets developmental courses and gateway English and math courses because entering students are most likely to be enrolled in them. However, returning students also enroll in these courses, so the raw data file includes both entering and returning students. All reporting on this website except the “Entering & Returning Students” report includes only entering student data. To distinguish between entering and returning students, Center staff use the response to item 6 (“How many semesters/quarters have you been enrolled at this college?” [variable: TERMSENR]). Students who respond “This is my first semester/quarter” (TERMSENR = 1) are classified as entering students; all others who answered the item are coded as returning students.
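The classification rule above can be sketched as a small filter. The record layout here is an assumption for illustration only; the actual SENSE data file format is not described on this page.

```python
# Sketch of the entering/returning split described above.
# Assumes each record is a dict with a TERMSENR response:
#   1 = "This is my first semester/quarter"; other codes = later terms;
#   None = item not answered. The field layout is illustrative only.

def classify(record):
    """Return 'entering', 'returning', or None (item unanswered)."""
    terms_enrolled = record.get("TERMSENR")
    if terms_enrolled is None:
        return None  # cannot classify without a response to item 6
    return "entering" if terms_enrolled == 1 else "returning"

raw_file = [
    {"id": "A", "TERMSENR": 1},     # first term  -> entering
    {"id": "B", "TERMSENR": 3},     # third term  -> returning
    {"id": "C", "TERMSENR": None},  # skipped the item -> excluded
]

# Most reports keep only the entering students.
entering = [r for r in raw_file if classify(r) == "entering"]
```

Following the rule on this page, only record "A" would appear in entering-student reports; "B" would appear only in the "Entering & Returning Students" report, and "C" would be excluded from the split.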
SENSE benchmarks are groups of conceptually related survey items that focus on institutional practices and behaviors that promote engagement among entering students. This report compares PCC results to those of other extra-large colleges and to the survey cohort as a whole. The six SENSE benchmarks are: early connections, high expectations and aspirations, clear academic plan and pathway, effective track to college readiness, engaged learning, and academic and social support network. Here is an explanation of how benchmarks are calculated.
Means and Frequency Reports
Responses to individual SENSE survey items are summarized in two formats: means and frequencies.
Means reports present an average for each survey item that has scaled responses (e.g., strongly agree to strongly disagree) and compare average item responses between member colleges and various groups, or between student subgroups within a college. Means are not run on dichotomous items (those with only two response options). These items are summarized in frequency reports.
Frequency reports show the counts and percentages for each item in the survey (excluding demographic survey items). These reports are useful for understanding how data are distributed across response categories, for example, how many students answer “rarely” or “very often” to a survey item.
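The distinction between the two report types can be illustrated with a short stdlib-only sketch. The response data and item wording here are hypothetical.

```python
# Illustrative means vs. frequency summaries (hypothetical data).
from collections import Counter
from statistics import mean

# Scaled item: 1 = "strongly disagree" ... 5 = "strongly agree".
# Scaled items get an average in the means report.
scaled_responses = [4, 5, 3, 4, 2, 5]
item_mean = mean(scaled_responses)

# Dichotomous item (only two response options): no mean is computed;
# it is summarized in a frequency report instead.
dichotomous_responses = ["yes", "no", "yes", "yes"]
counts = Counter(dichotomous_responses)
percentages = {k: 100 * v / len(dichotomous_responses) for k, v in counts.items()}
```

A means report would carry `item_mean` for the scaled item, while the dichotomous item would appear only as counts and percentages across its two response categories.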
These reports provide benchmark scores for each of the six SENSE benchmarks as well as means and frequency reports for benchmark items. They show the college’s scores compared to the extra-large college comparison group and the 2017 cohort of all colleges.
All Entering Students
Part-Time & Full-Time (Enrollment Status) Entering Students
These reports break out results by enrollment status, which is self-reported on the survey. Because part-time students are assumed to be underrepresented in the sample, their responses are weighted accordingly.
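The weighting idea can be illustrated with a simple post-stratification sketch. The Center's actual weighting procedure is not described on this page, so the formula below (each group's population share divided by its sample share) and all of the numbers are generic assumptions, not the survey's method.

```python
# Generic post-stratification sketch (NOT the Center's actual procedure):
# a group's weight = its share of the population / its share of the sample,
# so underrepresented part-time students count more in weighted results.

population_share = {"part-time": 0.60, "full-time": 0.40}  # hypothetical
sample_share     = {"part-time": 0.30, "full-time": 0.70}  # hypothetical

weights = {g: population_share[g] / sample_share[g] for g in population_share}
# Here part-time responses are up-weighted (2.0) and full-time
# responses are down-weighted (about 0.57).
```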
In addition to the SENSE questions, students were asked to answer two additional sets of items called Special-Focus Modules:
- The Financial Assistance module is designed to help institutions understand students' experiences with the FAFSA and potential reasons for not completing it.
- The Promising Practices for Community College Student Success module is designed to help your institution explore students' participation in promising practices—such as first-year experience, early alert, and accelerated developmental education—for which there is evidence of effectiveness in strengthening student learning, persistence, and attainment.
Comparison Groups provides alphabetical listings of the complete 2017 cohort (SENSE participants from 2015 through 2017), as well as group lists for comparisons based on size, location, state, accreditation region, and consortium.
Table 1: Respondents to Underlying Populations Comparisons details respondent characteristics (including sex, race or ethnicity, age, and enrollment status) from PCC as well as population data for 1) PCC, 2) extra-large 2017 SENSE Cohort colleges, and 3) all 2017 SENSE Cohort colleges.
Table 2: Percent of Target provides data on survey completion counts and rates for 2017 SENSE Cohort colleges as well as breakouts for colleges in each size category.
Table 3: Respondents to Underlying Population by College Size Report highlights data about respondent and population characteristics by institution size and at the cohort level.
Table 4: Underlying Population Percentages by Gender, Race/Ethnicity, and Enrollment Status examines college-level population characteristics (sex, race/ethnicity, and enrollment status) by institution size.
Table 5: Underlying Population Percentages by Age shows the percentage of students by age group for each participating college.
Table 6: Survey Completion Rates provides overall completion rates, within-class completion rates, and the percentage of sampled classes surveyed for all participating institutions. Overall completion rates are calculated by dividing the number of surveys returned by the number of surveys sent to the college. Within-class completion rates are calculated by averaging the per-class completion rate across all surveyed classes. The percentage of sampled classes surveyed is calculated by dividing the number of surveyed classes by the number of sampled classes.
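The three rates defined above can be written out directly. All of the counts below are hypothetical and serve only to show the arithmetic.

```python
# The three completion metrics from Table 6, on hypothetical counts.

# Overall completion rate: surveys returned / surveys sent.
surveys_returned = 850
surveys_sent = 1000
overall_rate = surveys_returned / surveys_sent

# Within-class completion rate: average of per-class rates
# across all surveyed classes (hypothetical per-class rates).
per_class_rates = [0.90, 0.80, 0.70]
within_class_rate = sum(per_class_rates) / len(per_class_rates)

# Percentage of sampled classes surveyed:
# classes actually surveyed / classes sampled.
classes_surveyed = 18
classes_sampled = 20
pct_classes_surveyed = classes_surveyed / classes_sampled
```

With these example counts, the overall rate is 0.85, the within-class rate is 0.80, and 90% of sampled classes were surveyed.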
For more information, contact Institutional Research, Planning and Effectiveness.