Innovative Procedures

This survey experimented with innovative procedures in four areas: automation of the questionnaire distribution process; use of email for the first reminder notice; a survey of non-respondents; and key entry of the open-ended replies by an outside key-entry firm. Each innovation is discussed in turn.

Automation of the Questionnaire Distribution Process

The usual procedure for distributing the paper questionnaire is to have Campus Copy make and fold the copies. A self-adhesive address label is then attached to each folded questionnaire, and a self-adhesive tab is attached to close it. Attaching the address labels and tabs had traditionally been done either by OSR staff or by a hired temporary worker.

The new procedure this spring again had Campus Copy make and fold the copies. A list of addresses was generated and emailed to Mailing Services, which printed the addresses on the questionnaires, attached the closing tabs, and mailed them.
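A rough sketch of the address-list step is given below. It assumes a comma-delimited extract of the sample containing name and address fields; the file names and field layout are illustrative assumptions, not the actual files used this spring.

    # Illustrative sketch only: build the address list sent to Mailing
    # Services from a comma-delimited extract of the sample. The file
    # names and field layout are assumptions made for this example.
    import csv

    def build_address_list(sample_file="spring_sample.csv",
                           output_file="mailing_addresses.csv"):
        """Write one name-and-address row per sampled student."""
        with open(sample_file, newline="") as src, \
             open(output_file, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.writer(dst)
            writer.writerow(["name", "street", "city", "state", "zip"])
            for row in reader:
                # Skip records with no street address; they cannot be mailed.
                if not row.get("street", "").strip():
                    continue
                writer.writerow([row["name"], row["street"], row["city"],
                                 row["state"], row["zip"]])

    if __name__ == "__main__":
        build_address_list()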

The new procedure saved considerable staff time and was completed in a shorter time frame. The automation was definitely worthwhile.

Email for the First Reminder Notice

Traditionally, the Office of Student Research had follow-up postcards printed; address labels were attached by hand and the postcards were mailed about 10 days after the first mailing arrived.

This spring, email addresses were obtained for 1,508 students, and a script was written to send the reminder notices by email. The intention was to save the cost of postage and postcards and to reduce staff labor.
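A minimal sketch of such a reminder script appears below. The SMTP host, sender address, address-file format, and message wording are assumptions made for illustration; this is not the script actually used.

    # Illustrative sketch of an email reminder script; the SMTP host,
    # sender address, address file, and message text are assumptions.
    import smtplib
    from email.mime.text import MIMEText

    SMTP_HOST = "smtp.example.edu"    # placeholder campus mail server
    FROM_ADDR = "survey@example.edu"  # placeholder survey office address

    REMINDER = """A copy of the Undergraduate Experience Survey was mailed
    to you about ten days ago. If you have not yet returned it, please do so.
    If you did not receive a questionnaire, please reply to this message with
    your current mailing address and another copy will be sent."""

    def send_reminders(address_file="email_addresses.txt"):
        """Send one reminder message to each address listed in the file."""
        with open(address_file) as f:
            recipients = [line.strip() for line in f if line.strip()]
        with smtplib.SMTP(SMTP_HOST) as server:
            for addr in recipients:
                msg = MIMEText(REMINDER)
                msg["Subject"] = "Reminder: Undergraduate Experience Survey"
                msg["From"] = FROM_ADDR
                msg["To"] = addr
                server.send_message(msg)

    if __name__ == "__main__":
        send_reminders()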

These objectives were met. In addition, the use of email produced an unintended benefit. About 100 students replied to the email stating that they had not received a questionnaire; examination of their records showed that in most cases they had failed to submit complete mailing addresses. Students who reported not receiving a questionnaire and who supplied an address in their reply had a questionnaire mailed to the address they supplied. Students who replied but did not supply a corrected address were mailed another copy at the address on file; it later became apparent that these students probably did not receive the additional questionnaires either.

Too many persons responded to the email messages to handle the correspondence informally. Consequently, considerable additional effort was needed to develop a real-time tracking system. This system recorded when questionnaires were mailed, the addresses used, when email messages were sent, when additional copies of the questionnaire were sent, and when replies were received. Because students may not realize that they submitted incorrect mailing addresses, it was necessary to track where each questionnaire was sent. A remaining problem is that persons whose mailing addresses are incorrect, whose questionnaires are not returned by the Postal Service, and who do not respond to the email message still do not participate in the survey.
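A minimal version of such a tracking log could be kept as a single table with one row per student, as in the sketch below. The field names and comma-delimited storage are assumptions made for illustration, not a description of the system actually built.

    # Illustrative sketch of the tracking log described above: one row per
    # sampled student, recording the mailing date, the address used, the
    # reminder email date, any second copy mailed, and the date a completed
    # questionnaire was received. Field names and storage are assumptions.
    import csv
    from datetime import date

    FIELDS = ["student_id", "address_used", "mailed_on", "email_reminder_on",
              "second_copy_mailed_on", "reply_received_on"]

    def log_event(tracking_file, student_id, field, value=None):
        """Record the date (or other value) of one event for one student."""
        value = value or date.today().isoformat()
        try:
            with open(tracking_file, newline="") as f:
                rows = list(csv.DictReader(f))
        except FileNotFoundError:
            rows = []
        for row in rows:
            if row["student_id"] == student_id:
                row[field] = value
                break
        else:
            rows.append({**{k: "" for k in FIELDS},
                         "student_id": student_id, field: value})
        with open(tracking_file, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            writer.writeheader()
            writer.writerows(rows)

    # Example: note that a second copy was mailed to one student today.
    # log_event("tracking.csv", "12345", "second_copy_mailed_on")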

The use of email for the follow-up is worthwhile, but it requires considerable additional effort on a daily basis. In addition, the status of questionnaires should be made easier to track; a scanner that could automate the logging of returned questionnaires might help.

Survey of Non-Respondents

Some students who had not responded to either the first or the second mailing were contacted by telephone to determine their reasons for not responding to the paper questionnaire.

The telephone survey was conducted on May 22, one day before the end of spring finals. The approximately 60 persons telephoned were drawn from a list of persons who had been sent questionnaires that were not returned by the Postal Service; most also had valid campus email addresses.

The questions asked were:

  1. Did you receive our survey, "Undergraduate Experience Survey, Spring 1998?"
  2. Did you respond?
  3. Could we have persuaded you to respond?
  4. On a scale of 1 through 5, with 1 being the highest rating and 5 the lowest, how would you rate:
    1. Faculty instruction
    2. Availability of courses in your major
    3. Overall academic experience
    4. Overall quality of student life on campus
    5. UC Berkeley overall

Twenty-eight persons were reached. Of that number, 13 (46%) said they had received the survey, 8 (29%) said they had not, and 7 (25%) could not remember. Recomputing the percentages using only those who remembered, 8 of 21 (38%) claimed not to have received the survey. It is possible that some of those who claimed not to have received the survey were presenting their failure to reply in a socially acceptable way; we have no way of determining whether that is the case. Nonetheless, of the adjusted mailout count of 3,860, 140 (3.6%) were returned as non-deliverable by the Postal Service.

A possibly reasonable assumption is that as many questionnaires were undeliverable but not returned as were undeliverable and returned. The 38% non-receipt rate by itself seems too high; a guesstimate is that perhaps twice the 3.6%, or about 7%, of the questionnaires were not delivered.
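For reference, the figures quoted above can be recomputed directly from the raw counts; the short sketch below simply repeats the arithmetic.

    # Recompute the percentages quoted above from the raw counts.
    reached = 28
    received, not_received, unsure = 13, 8, 7

    print(f"received the survey:   {received / reached:.0%}")      # 46%
    print(f"did not receive it:    {not_received / reached:.0%}")  # 29%
    print(f"could not remember:    {unsure / reached:.0%}")        # 25%

    # Restricting to the 21 who remembered one way or the other:
    remembered = received + not_received
    print(f"non-receipt among those who remembered: "
          f"{not_received / remembered:.0%}")                      # 38%

    # Postal non-delivery rate and the doubled guesstimate discussed above.
    mailed, returned_undeliverable = 3860, 140
    rate = returned_undeliverable / mailed
    print(f"returned as non-deliverable: {rate:.1%}")              # 3.6%
    print(f"guesstimated total undelivered: {2 * rate:.0%}")       # about 7%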

Most of those who did not respond said that nothing could have persuaded them to do so. None said that additional money would have persuaded them; 11% said a shorter questionnaire would have; and about 18% said they would have responded to an electronic version of the questionnaire.

Anecdotal reports from other UC campuses indicate that many non-respondents do not respond because they are under time pressure. That is, they may have fallen behind in their coursework or perhaps they perceive themselves to have fallen behind, and do not want to take the time to complete a questionnaire.

An additional question asked whether they replied to surveys in general; 46% indicated that they do not.

Although this survey does not provide substantive information about techniques for increasing the response rate, such techniques still seem worth considering. For example, shifting distribution of the questionnaire from spring to fall has produced good results on some college campuses.

Another possible technique is saturation surveying, which uses a minimal temporal separation between the survey instrument and the reminder notices. The concept is that frequent reminder notices increase the perceived importance of the questionnaire. In this survey there was about a ten-day gap between delivery of the survey instrument and the follow-up (either a postcard or electronic mail); with saturation surveying, the follow-up would be received within four days of the initial instrument.

An informal tally of the replies to Question 4 roughly corresponded to the responses to the corresponding parts of Questions 4 and 5 on the paper questionnaire. Too few replies were received to make a more precise comparison.

The survey of non-respondents, because of the additional effort required, probably should not be done for every survey.

Key Entry of Open Ended Question Replies

The traditional procedure was to have Office of Student Research personnel manually type in the replies to the open-ended questions. This was labor intensive, since it was an add-on task for those entering the replies.

The experimental procedure was to employ a key-entry firm to enter all of the open-ended replies, which were generally short. Because some of the open-ended replies will eventually be analyzed and probably published, the few replies containing profanity were edited to remove it.

One of the open-ended questions asked those surveyed to list the author and title of the most recent book read for pleasure during the academic year. Although the replies are of general interest, it was not possible to tally them because of a lack of uniformity in reporting authors’ names and book titles.