This survey, which was conducted in the spring of 2003, examines the experiences of a representative sample of UC Berkeley undergraduates on a variety of curricular, administrative, and student service issues.

Notes: Counts are shown in bold and precede the items they count. Missing values are not included in the counts. Percentages for multiple-selection items are based on the total number of active cases.

Number of cases: 10,171


Methodology

The primary objective of this survey was to collect a wide range of information about UC Berkeley undergraduates: how students spend their time, which goals are important to them, their engagement with and access to faculty and teaching assistants, course- and study-related behavior, their use of various student services, time management, and selected background characteristics.

There were several secondary objectives. One was to begin studying follow-up procedures, with an eye toward the extent to which they could increase the survey response rate. Another was to coordinate the Berkeley survey with the random-sample Systemwide Undergraduate Experience Survey (UCUES) being conducted across all of the undergraduate UC campuses.

To maximize coordination between the Berkeley UCUES survey and the Systemwide UCUES survey, many questions were shared between the two instruments. Berkeley had to distribute its survey before the Systemwide survey because Berkeley is the only UC campus on the semester system, and its semester ends before the quarters end on the other campuses. For this reason, Berkeley's questionnaire differed slightly from the questionnaire used on the other UC campuses.

This survey was conducted over the Internet. Students were emailed a message that described the survey and the prizes offered and included a URL linking to the survey. To take the survey, a student had to click on the link and enter his or her SID. Only registered undergraduates could access the survey.
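The report does not describe the access-control code itself, but the gatekeeping step can be thought of as a simple membership check of the entered SID against the list of registered undergraduates. The sketch below is purely illustrative; the names (REGISTERED_SIDS, can_take_survey) and the sample SIDs are hypothetical.

```python
# Hypothetical sketch of the SID check described above: a student is admitted
# to the survey only if the SID they enter appears in the list of registered
# undergraduates loaded into the survey access area.

REGISTERED_SIDS = {"12345678", "87654321"}  # stand-in for the real SID list

def can_take_survey(entered_sid: str) -> bool:
    """Return True only if the entered SID belongs to a registered undergraduate."""
    return entered_sid.strip() in REGISTERED_SIDS

print(can_take_survey("12345678"))  # True  -> survey opens
print(can_take_survey("00000000"))  # False -> access denied
```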

The initial survey population was defined as those undergraduates who were enrolled at the point of the Spring Census, a total of 22,969 students. However, nearly a thousand students were excluded because they had not yet responded to the random-sample Systemwide Cost of Attendance Survey (COAS), which was launched several weeks before UCUES; students who had already responded to the COAS were included in the UCUES survey. Because it was essential to achieve as high a response rate as possible on the Cost of Attendance Survey, we did not want to lessen the likelihood of responding to COAS by simultaneously asking for a response to UCUES. Students who did not appear on the Registration file at the time of the survey, despite being on the Census file, were also excluded. The final survey population therefore consisted of 21,911 undergraduates, and the SIDs of these students were entered into the survey access area. Of these 21,911 undergraduates, 21,825, or 99.6%, had email addresses, and the initial survey invitation email was sent to this group.
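The arithmetic behind these population counts can be checked directly. The short sketch below (variable names are ours) reproduces the number of excluded students and the 99.6% email-coverage figure from the counts quoted above.

```python
# Reconstructing the population counts quoted above.
census_enrollment = 22_969   # undergraduates enrolled at the Spring Census
survey_population = 21_911   # after excluding COAS non-respondents and students
                             # missing from the Registration file
with_email        = 21_825   # survey-population members with an email address

excluded = census_enrollment - survey_population
email_coverage = with_email / survey_population

print(f"Excluded from the survey population: {excluded:,}")   # 1,058
print(f"Email coverage: {email_coverage:.1%}")                # 99.6%
```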

Prizes were offered to encourage students to respond to the survey. Twenty-two cash prizes were awarded to survey participants: one first-place prize of $500, twenty prizes of $50, and one special prize of $250.

The primary survey procedure consisted of emailing all students at the survey launch and then emailing non-respondents thereafter. The following table shows, for each email, the date, the number of days elapsed since the survey launch, the cumulative number of replies, and the cumulative response rate.

Table 1. Cumulative Response Rate by Days Elapsed

Date      Days Elapsed   Number of Replies   Response Rate   Event
4/10/03   0              --                  --              1st Email: Launch
4/15/03   5              3,567               16.3%           2nd Email
4/25/03   15             6,252               28.7%           3rd Email
5/08/03   28             7,341               33.6%           4th Email
6/02/03   53             8,737               40.0%           5th Email
6/04/03   55             9,457               43.3%           6th Email (Special Groups)
6/11/03   62             9,707               44.5%           7th Email
6/17/03   68             10,171              46.4%           Survey Ends
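The cumulative response rates in the table can be approximated by dividing each cumulative reply count by the 21,911-student survey population, as in the sketch below. The final figure reproduces the overall 46.4% rate exactly; a few intermediate figures differ from the published percentages by a fraction of a point, presumably because of rounding or a slightly different denominator at the time.

```python
# Cumulative response rate = cumulative replies / survey population of 21,911.
survey_population = 21_911

cumulative_replies = [
    ("4/15/03", 3_567),
    ("4/25/03", 6_252),
    ("5/08/03", 7_341),
    ("6/02/03", 8_737),
    ("6/04/03", 9_457),
    ("6/11/03", 9_707),
    ("6/17/03", 10_171),
]

for date, replies in cumulative_replies:
    print(f"{date}: {replies / survey_population:.1%}")
# The last line, 10,171 / 21,911, reproduces the final 46.4% response rate.
```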

Each email had a pronounced effect. On the day an email was sent and the day after, a large number of replies arrived. Some replies were still coming in three days later, particularly for reminders sent shortly after the survey launch, but for the later emails replies fell off rapidly after a few days.

Even the 4th, 5th, and 6th emails produced noticeable increases in the number of respondents. Instruction ended May 13, finals began on June 16, and finals ended on June 24. It is of interest that a considerable share of replies, approximately 12.8%, were received after instruction ended and before the survey closed.

The primary follow-up procedure for target populations consisted of telephoning selected groups of non-respondents and sending them reminder postcards. The main group of non-respondents called was a random sample of 1,995 undergraduates. The telephone procedure consisted of calling and, if the student was not home, leaving a message that briefly described the survey and encouraged participation; if the student had questions, they were addressed. Students in majors whose departments were scheduled for a program review within the next academic year were also called: English, history, nutritional science and toxicology, and sociology. In addition, students in the following ethnic groups received follow-up telephone calls: African American, Chicano, and Latino.

Other follow-up procedures consisted of designing, printing, and distributing posters, bookmarks, and circulars describing the survey and encouraging undergraduate participation. The posters were distributed to academic departments, residence halls, and student affairs offices by student affairs officers and six work-study students. Also, at the behest of Vice Provost for Undergraduate Education Christina Maslach, deans of schools and colleges were encouraged to ask that Student Affairs Officers in each department email their majors urging completion of the survey.

The impact of the follow-up emails is apparent: there is a marked increase in response rates after each email, although the magnitude of the increase declines with each successive email. The telephone calls and postcards directed at the random sample group also had an impact. While the overall response rate was 46.4%, the response rate for the random sample was 52.4%; for the non-random-sample students, i.e., the remainder, it was 45.8%.
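These three figures are mutually consistent: the overall rate is approximately the size-weighted average of the random-sample and remainder rates, with the group sizes taken from the text (1,995 in the random sample, and the rest of the 21,911-student survey population in the remainder). A brief sketch:

```python
# Overall response rate as the size-weighted average of the two follow-up groups.
survey_population = 21_911
random_sample     = 1_995                                # called and sent postcards
remainder         = survey_population - random_sample    # 19,916

rate_random    = 0.524   # response rate in the random sample
rate_remainder = 0.458   # response rate among the remaining students

overall = (random_sample * rate_random
           + remainder * rate_remainder) / survey_population
print(f"{overall:.1%}")  # ~46.4%, matching the overall rate reported above
```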

It is more difficult to assess the effect of the posters and bookmarks on the response rate, since they were ongoing and not tied to any specific date. Attempting to contact students through media other than email, e.g., postcards and posters, helps compensate for inaccurate email address information. A few students did email us, mentioning either the phone call or a postcard, and inquired about taking the survey.

One issue with sending so many emails is whether students feel they are being spammed. Complaints about the invitations to participate in the survey were received from five students. The email address of either the survey manager or the project manager was displayed in the second, third, fourth, and fifth emails, so it would have been relatively easy for students to contact us. Students who asked to be excluded from further emails did not receive additional emails.

Relatively few students contacted us to report difficulty accessing the survey, and exactly why they could not access it was difficult to determine with certainty. The server was down for a short period, i.e., less than a few hours, but it seems unlikely that this accounts for all of the problems. It is possible that some of the handful of students who could not access the survey were not entering their SIDs correctly.

By emailing students the survey URL and using multiple email reminders, it is possible to conduct a large-scale survey at minimal expense while producing a reasonable response rate. Follow-up telephone calls and postcards appeared to increase the response rate by as much as six percentage points.
