Chewing Gum and Duct Tape:
Creating an Online Survey
Application from Available Parts
Patrick Burke, MA, Sandy Cook, Ph.D., and Kevin Tomczyk
Slice of Life 2003
Background/Problem:
o Online administration was desired by faculty, staff, and students.
o Commercial survey products did not meet the requirements or were prohibitively expensive.
o Information systems programmers had a two-year project backlog for developing new systems.
o Paper forms, administered during exams, averaged > 75% return rates over the last two years (courses without exams averaged < 45%).
o Needed a system that was as effective as paper, easy to administer, did not require computer programming, and was cost effective.
Major System Requirements:
Controlling access was the major system barrier:
o Did not want to set up and maintain our own login system that duplicated other campus logins.
o Did not want to give students yet another login and password.
o Needed a system with minimal support overhead.
o Needed to maintain student anonymity while still being able to track responses.
o The existing courseware program was inadequate to meet these needs.
Method and Tools:
o Combined a web-based courseware system (Blackboard), used for authentication and for creating simple assessments, with a one-question assessment ("I submitted my rating form" – Yes or No).
o Linked to web-based PDF files (Teleform by Cardiff.com, a product that creates scannable and online forms).
o Leveraged existing technologies to create an evaluation system that met the primary requirements.
o Piloted the hybrid system with two spring quarter courses that did not have in-class exams and had often had poor return rates.
o The forms were available during the last week of class and finals week; students were informed of the assessment via class announcements and multiple email reminders.
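The anonymity-with-tracking design described above can be modeled as two deliberately unlinked stores: the courseware side records only who confirmed submitting a form, while the form side records only the anonymous responses. The following Python sketch is purely illustrative (the class, method names, and sample data are hypothetical and not part of the actual Blackboard/Teleform setup); it shows why neither store alone, nor both together, can tie a student to their answers.

```python
# Hypothetical model of the decoupled design: completion tracking
# (Blackboard side) is kept separate from anonymous responses
# (scannable-PDF side), so responses cannot be linked to students.

class HybridEvaluation:
    def __init__(self):
        self.completions = set()  # student IDs who answered "Yes, I submitted"
        self.responses = []       # anonymous form data, no identity attached

    def submit_form(self, ratings):
        """Record an anonymous response (the PDF-form step)."""
        self.responses.append(ratings)

    def confirm_submission(self, student_id):
        """Record only the fact that this student reported submitting
        (the one-question Blackboard assessment step)."""
        self.completions.add(student_id)

    def return_rate(self, enrolled):
        """Return rate is computed from confirmations, not responses."""
        return len(self.completions) / enrolled


ev = HybridEvaluation()
ev.submit_form({"overall_rating": 4, "comments": "More case examples, please."})
ev.confirm_submission("student-001")
print(ev.return_rate(enrolled=2))  # prints 0.5
```

Because the response list carries no identifiers and the completion set carries no answers, administrators can chase non-responders and compute return rates without ever compromising anonymity.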
Results:
o Following what was considered a successful pilot (56% average return), we used this method for all first- and second-year courses beginning in Fall Quarter 2002.
o The rate of written comments increased from 43% last year to 73% (autumn and winter quarters).
o The comprehensiveness of the comments increased.
o Students indicated they liked the opportunity to fill out the forms away from the pressures of exams.
o Computer crashes, the main obstacle, were easily resolved using Blackboard's assessment reset feature.
Unanticipated consequences:
o Without guidelines, some students used the online system as a means of expressing unprofessional responses under the guise of anonymity. A follow-up was needed to discuss the purpose of comments and professional behavior.
o Faculty began requesting more and more forms for students to complete. Potential oversaturation of online rating forms could affect overall response rates. Follow-up with faculty on judicious posting of forms was needed.
Conclusion:
The combination of the two programs fulfilled all of our stated needs:
o Flexibility: by using existing PDF assessment forms, the system was flexible and not limited by HTML or commercial products.
o Authentication with anonymity: it effectively limited access to authorized users yet was not linked to the submitted data.
o Resources: it significantly decreased post-submission data processing, and no new program needed to be purchased.
o Response Rate: with multiple reminders, rates were better than non-exam administrations and close to paper administrations. We expect that, as students become more facile with the system, the rates will improve. In addition, students provided more comprehensive comments.
Novelty/Discussion:
o This project represents a significant paradigm shift.
o Before this project, the question was always, "Build or buy?"
o This project effectively combined two existing, deployed, robust technologies to do what neither could do independently.
o In the future, the success of this small project means the question will change from "Build or buy?" to "Can we do it faster, more reliably, and at lower cost with what we already have?"