1998 UCUES: Results and Summary

Methodology

In the spring of 1998, a survey designed to obtain information on a wide variety of undergraduate issues was mailed to a sample of 4000 undergraduates. The issues investigated include progress in various student skill areas, satisfaction with services provided, and career plans.

The results summarize the replies of the 1063 undergraduates who responded to the 73 multiple response items in the questionnaire. The results from the open-ended questions (#3, 14, 15, 17, 18) are not reported here.

View the 1998 survey results.

The survey was distributed in Spring 1998 to a sample of 4000 undergraduates. The sample was constructed so that 2000 persons came either from low-income families or from families with low parental educational levels. Sampling was done to ensure representation by ethnic group, gender, and class level.
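
The report does not spell out the sampling mechanics. As an illustration only, a stratified draw of this kind might be sketched as follows in Python; the data file, column names, and threshold values are hypothetical, not the actual Office of Student Research records.

    import pandas as pd

    # Hypothetical roster of enrolled undergraduates; the file and column names
    # are illustrative only.
    roster = pd.read_csv("undergraduate_roster.csv")

    # Flag the low-income / low-parental-education subpopulation (the cutoffs are
    # placeholders; the survey's actual definitions are not documented here).
    low_ses = (roster["family_income"] < 35000) | (roster["parent_education_years"] <= 12)

    def stratified_sample(frame, n_total, strata):
        """Draw about n_total cases, allocated proportionally within each stratum."""
        grouped = frame.groupby(strata, group_keys=False)
        return grouped.apply(
            lambda g: g.sample(max(1, round(n_total * len(g) / len(frame))), random_state=1998)
        )

    strata = ["ethnic_group", "gender", "class_level"]
    sample = pd.concat([
        stratified_sample(roster[low_ses], 2000, strata),    # low income / low parental education
        stratified_sample(roster[~low_ses], 2000, strata),   # all other undergraduates
    ])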

The questionnaire was in the form of a self-mailer. Unfolded, the questionnaire was a sheet of paper 17 inches wide by 11 inches high. It was folded in half and then into thirds for mailing. One panel carried the student's address for the outgoing mailing, and another panel carried the address of the Office of Student Research with postage prepaid. Survey recipients were expected to open their mailed packets, which measured 3 5/8" x 8.5", complete the survey, and refold it so that their own addresses were inside and the Office of Student Research address was outside. Students complied with these instructions.

Several steps were taken to produce as large a response rate as possible. The questionnaire was shortened from the 1997 version, and a prize of $500 was offered to the winner of a random drawing among the respondents. In addition, an extensive follow-up procedure was implemented.

Those surveyed were divided into three groups for follow-up purposes: email, postcard, and null. The email group consisted of the 1508 persons who had email addresses and had given permission for their use; they received an email follow-up. Of that number, 482 (31.9%) returned a questionnaire. Another 1000 persons received a follow-up postcard; of that number, 224 (22.4%) returned questionnaires. The remaining group, which served as a control group, consisted of 1492 persons who received neither an email nor a postcard; of that number, 312 (20.9%) returned questionnaires. The Innovative Procedures section contains additional details on the email procedure.

Despite these efforts, the results were disappointing. Only 1063 replies were received out of an adjusted total of 3860, for a response rate of 28%, which is considerably lower than the response rate for the prior year.
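
For reference, the rates quoted above follow directly from the counts reported. The short Python sketch below recomputes them; the figures come from this report, and the adjusted total of 3860 reflects the 140 questionnaires returned by the Postal Service as non-deliverable (see the survey of non-respondents below).

    # Counts (replies, mailed) as reported in the text above.
    groups = {"email": (482, 1508), "postcard": (224, 1000), "no reminder": (312, 1492)}

    for name, (replies, mailed) in groups.items():
        print(f"{name:12s} {replies / mailed:.1%}")   # compare with the rates quoted above

    # Overall: 1063 replies out of 4000 mailed, less 140 returned as non-deliverable.
    print(f"{'overall':12s} {1063 / (4000 - 140):.1%}")   # about 27.5%, reported as 28%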

Examining response rates across several variables provides further information on how different groups responded. Women, as is almost always the case, had a considerably higher response rate than men (30% versus 22%). The response rates for the main ethnic groups are: African American (17%), Hispanic (22%), Asian (28%), and White (29%). The response rates by class level are: freshmen (26%), sophomores (27%), juniors (24%), and seniors (26%). The response rates by father's educational level are: some high school (27%), high school graduate (24%), some college (27%), two-year college graduate (24%), four-year college graduate (26%), and postgraduate degree or study (28%).

The response rates by college are: Business Administration (27%), Chemistry (31%), Engineering (26%), Environmental Design (31%), Letters and Science (25%), and Natural Resources (29%). The response rates by SAT Verbal interval are: scores from 200 to 500 (22%), scores from 500 to 600 (28%), scores from 600 to 700 (28%), scores from 700 to 800 (34%).

It is possible that there are various interactions that will show greater differentiation of response rates. The following table compares response rates by entering status (new freshmen versus new transfers) by gender.

Response Rates by Gender by Entry Status

          Freshmen   Transfers   Total
Females      .31        .26       .30
Males        .22        .17       .21
Total        .27        .21       .26

The table shows that freshmen have higher response rates than transfers, .27 versus .21, and that females have higher response rates than males, .30 versus .21. Within the table, freshman females have the highest response rate and male transfers have the lowest response rate.

However, given the consistently low response rates across demographic categories, it is likely that "global" factors depress the response rate. In the case of this survey, a prime candidate is the time of year at which the survey is given. It is possible that undergraduates in general, regardless of class level, are not as receptive to completing a survey late in the spring semester as they might be at other times during the year. One reason late spring may be disadvantageous is that some undergraduates are making plans to leave UC Berkeley and are more focused on what the world outside of Berkeley holds than on completing optional tasks, such as surveys, at Berkeley. Seniors may be graduating, some students may be focusing on their summer jobs, and some may be planning summer vacations. In the fall semester, these concerns probably do not occupy as much thought as they do toward the end of the spring semester. For these reasons, it may be more advantageous to give the survey in the fall semester rather than in the spring.

Innovative Procedures

This survey experimented with innovative procedures in these areas: automation of the questionnaire distribution process; use of email for the first reminder notice; survey of non-respondents; and key-entry of the open-ended replies by a key-entry service. Each innovation will be discussed in turn.

Automation of the Questionnaire Distribution Process

The usual procedure for distributing the paper questionnaires is to have Campus Copy make and fold the copies of the questionnaire. A self-adhesive address label is then attached to each folded questionnaire, along with a self-adhesive tab to close it. Attaching the addresses and tabs traditionally had been performed either by OSR staff or by a hired temporary worker.

The new procedure this spring consisted of having Campus Copy make copies and fold the questionnaire. A list of addresses was generated and emailed to Mailing Services. Mailing Services then printed the addresses on the questionnaires, attached a tab to close the questionnaires, and then mailed them.
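
The format of the address list sent to Mailing Services is not documented. As a rough illustration only, it might be produced along these lines; the file and field names here are hypothetical.

    import csv

    # Hypothetical hand-off file for Mailing Services: one row per sampled student.
    with open("ucues_1998_sample.csv", newline="") as src, \
         open("ucues_1998_addresses.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        writer.writerow(["name", "street", "city", "state", "zip"])
        for row in reader:
            writer.writerow([row["name"], row["street"], row["city"], row["state"], row["zip"]])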

The new procedure saved considerable staff time and was also implemented in a shorter time frame. The automation procedure used was definitely worthwhile.

Email for the First Reminder Notice

Traditionally, Student Research had follow-up postcards printed. Addresses would be attached by hand and the postcards mailed out about 10 days after the first mailing arrived.

This spring, email addresses were obtained for 1508 students, and an email script was devised to send out the reminder notices. The intention of this procedure was to save the money spent on postage and postcards and to save staff labor.
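
The script itself is not preserved in this report. A minimal sketch of a reminder mailer of this kind, assuming a campus SMTP host, a sender address, and message wording that are all hypothetical, might look like this:

    import smtplib
    from email.message import EmailMessage

    # Hypothetical settings; the actual 1998 script, host, and wording are not documented here.
    SMTP_HOST = "smtp.berkeley.edu"
    FROM_ADDR = "student-research@berkeley.edu"

    def send_reminder(to_addr):
        msg = EmailMessage()
        msg["Subject"] = "Reminder: Undergraduate Experience Survey, Spring 1998"
        msg["From"] = FROM_ADDR
        msg["To"] = to_addr
        msg.set_content(
            "A survey was recently mailed to you. If you have not yet returned it, "
            "please do so; if you did not receive it, reply with your current mailing address."
        )
        with smtplib.SMTP(SMTP_HOST) as server:
            server.send_message(msg)

    with open("email_group.txt") as f:       # one address per line (1508 in the email group)
        for line in f:
            send_reminder(line.strip())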

These objectives were met. In addition, the use of email produced an unintended benefit. About 100 students replied to the email stating that they had not received a questionnaire. Examination of their addresses showed that in most cases they had failed to submit complete addresses. Students who replied that they had not received a questionnaire and who supplied an address in their email message were mailed a questionnaire at that address. Students who replied but did not supply a corrected address were simply mailed another survey. It was subsequently noticed that the students who did not supply a new address probably did not receive the additional questionnaires either.

Too many persons responded to the email messages for the correspondence to be handled informally. Consequently, considerable additional effort was needed to develop a real-time tracking system. This system noted when each questionnaire was mailed, the address used, when email messages were sent, when additional copies of the questionnaire were sent, and when replies were received. Since students may not realize that they submitted incorrect mailing addresses, it was necessary to track where the questionnaires were sent. A limitation of this system is that persons whose mailing addresses are incorrect, whose questionnaires are not returned by the Postal Service, and who do not respond to the email message still end up not participating in the survey.
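
The tracking system is described only in outline. A minimal sketch of the kind of per-student record it implies, with assumed field names, might look like this:

    import csv
    from dataclasses import dataclass

    # Hypothetical record layout; the actual OSR tracking system is described only in outline above.
    @dataclass
    class TrackingRecord:
        student_id: str
        address_used: str
        mailed_on: str = ""      # date the questionnaire was mailed
        emailed_on: str = ""     # date the email reminder was sent
        remailed_on: str = ""    # date an additional copy was mailed, if any
        returned_on: str = ""    # date a completed questionnaire was received

    records = {}                 # keyed by student ID

    def log_return(student_id, date_received):
        """Mark a questionnaire as received for the given student."""
        records[student_id].returned_on = date_received

    def write_status(path):
        """Dump the tracking table so staff can see which cases are still outstanding."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["student_id", "address_used", "mailed_on",
                             "emailed_on", "remailed_on", "returned_on"])
            for r in records.values():
                writer.writerow([r.student_id, r.address_used, r.mailed_on,
                                 r.emailed_on, r.remailed_on, r.returned_on])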

The use of email for the follow-up is a worthwhile procedure, but it does require considerable additional effort on a daily basis. In addition, efforts should be made to track the status of questionnaires more easily. Perhaps a scanner that could automate entry of the returned questionnaires would help.

Survey of Non-Respondents

Some students who had not responded to either the first or second mailouts were contacted by telephone to determine the reasons for the non-response to the paper questionnaire.

The telephone survey was conducted on May 22, which was one day before the end of spring finals. The approximately 60 persons telephoned were drawn from a list of persons who had been sent questionnaires and whose questionnaires were not returned by the Postal Service. In addition, most had valid campus email addresses.

The questions asked were:

  1. Did you receive our survey, "Undergraduate Experience Survey, Spring 1998?"
  2. Did you respond?
  3. Could we have persuaded you to respond?
  4. On a scale of 1 through 5, with 1 being the best and 5 the lowest, how would you rate:
    a. Faculty instruction
    b. Availability of courses in your major
    c. Overall academic experience
    d. Overall quality of student life on campus
    e. UC Berkeley overall

Twenty-eight persons were reached. Of that number, 13 (46%) said they had received the survey, 8 (29%) said they had not, and 7 (25%) could not remember. If we re-percentage on just those who remembered, then 8 of 21 (38%) claim not to have received the survey. It is possible that some of those who claimed not to have received the survey were casting their failure to reply in a socially acceptable light; we have no way of determining whether that is the case. Nonetheless, of the adjusted mailout count of 3860, 140 (3.6%) were returned as non-deliverable by the Postal Service.

A possibly reasonable approach would be to assume that as many questionnaires were undeliverable but never returned as were undeliverable and returned. The 38% non-receipt rate by itself seems too high. A guesstimate would be that perhaps twice the 3.6%, or about 7%, of the questionnaires were not delivered.
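
The arithmetic behind that guesstimate, using the figures reported above:

    # Figures as reported above.
    adjusted_mailout = 3860
    returned_undeliverable = 140
    returned_rate = returned_undeliverable / adjusted_mailout   # about 3.6%

    # Assume roughly as many questionnaires were undeliverable but never returned
    # as were undeliverable and returned by the Postal Service.
    estimated_undelivered_rate = 2 * returned_rate               # about 7%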

Most of those who did not respond claimed that nothing could be done to persuade them to respond. When asked if additional money would persuade them to respond, none replied yes. When asked if a shorter questionnaire would have persuaded them to respond, 11% replied yes. About 18% said that they would be persuaded to respond to an electronic version of the questionnaire.

Anecdotal reports from other UC campuses indicate that many non-respondents do not respond because they are under time pressure. That is, they may have fallen behind in their coursework or perhaps they perceive themselves to have fallen behind, and do not want to take the time to complete a questionnaire.

An additional question asked whether those contacted reply to surveys in general; only 46% indicated that they do, suggesting that these non-respondents tend not to reply to surveys in general.

Although this survey does not provide substantive information about techniques for increasing the response rate, it still seems worth considering response rate enhancement techniques. For example, shifting distribution of the questionnaire from the spring to the fall has had good results on some college campuses.

Another possible technique is to move to saturation surveying. This refers to using a minimal temporal separation between the survey instrument and the reminder notices. The concept behind saturation surveying is that of increasing the importance of the questionnaire by what is perceived to be frequent reminder notices. In this survey, there was about a ten day gap between the delivery of the survey instrument and the follow-up instrument, i.e., either a post card or electronic mail. With saturation surveying, the follow-up instrument would be received within 4 days of the initial instrument.

An informal tally of the replies to Question 4 roughly corresponded with the responses to the corresponding parts of Questions 4 and 5 in the paper questionnaire. Too few replies were received to make a more precise comparison.

The survey of non-respondents, because of the additional effort required, probably should not be done for every survey.

Key Entry of Open Ended Question Replies

The traditional procedure was to have personnel in the Office of Student Research manually type in the replies to the open-ended questions. This was a labor-intensive procedure since it was an add-on task for those entering the replies.

The experimental procedure consisted of employing a key-entry firm to enter all of the open-ended question replies. Generally, the replies were short. Because analysis, and probably publication, of at least some open-ended replies will occur at some point, the few replies containing profanity were edited to remove it.

One of the open ended questions asked those surveyed to list the author and title of the most recent book read for pleasure in this academic year. Although the replies are of general interest, it was not possible to tally the replies because of a lack of uniformity in reporting authors’ names and book titles.
