Looking back on the course pilot itself, what worked best?
Our opinion surveys and focus groups indicate that students adapted well to the redesigned course after an initial adjustment period. A pre-semester training program for the peer tutors appears to have reduced the occasional but persistent complaints about the tutors' effectiveness.
What worked least well?
Perhaps the greatest implementation problem was the unreliability of Emporium hardware and course software (including the database system) in the first year. This distracted Emporium helpers from their main job of assisting with learning and contributed to general student annoyance with the novel arrangement of doing coursework in a computer lab. These problems have now been overcome; the presentations and quizzes run smoothly, and the Math Emporium has blended into the teaching scene (and is, in fact, a preferred place to work for many students).
A Sun 450 Enterprise server, a large budget item for the grant, went online this fall. Delays in getting it ready caused some short-term reliability problems at the beginning of the semester. The software has been upgraded to Authorware 5, enabling us to improve the interactivity of the presentations. For the quizzes, the upgrade improves database connectivity and reduces administrative overhead for students with special needs.
At this stage the quizzes perform well technically, but to avoid significant proctoring we keep them fairly low in both difficulty and weight in the course grade. Experience has shown that students will not take the quizzes seriously unless they carry at least a minimal amount of grade credit.
We originally planned to link exercises using MATLAB directly to the course delivery system. Prototypes were developed, but they did not appear to add enough value to be worth pursuing. At this point there are no plans to add this kind of module to the system.
What are the biggest challenges you face in moving from the course pilot to the project's next phase?
On the development side, the main focus of activity is the online testing engine and a database of questions for it. The testing engine has been in use for traditional multiple-choice quizzes in three math courses, and both midterm tests will move online next semester. By the spring of 2001, we will have in place a test recovery capability, allowing students to continue a partially completed test interrupted by network or machine failures with a minimum of information loss and inconvenience. We will also provide instructors with tools for easily configuring and previewing tests before they are administered. A grade reporting/test assessment tool will also be available to instructors; it will allow them to access details of a student's performance on a test and examine statistics aggregated class-wide or across the question database as a whole.
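The test recovery capability described above amounts to checkpointing each answer to persistent storage as it is submitted, so a failed session can be resumed with at most the on-screen question lost. The sketch below is a minimal, hypothetical illustration of that idea; the function names, JSON checkpoint format, and file layout are our own assumptions, not the project's actual design.

```python
import json
from pathlib import Path

def save_answer(checkpoint: Path, question_id: str, answer: str) -> None:
    """Record an answer as soon as it is submitted, so a crash loses
    at most the question currently on screen. (Hypothetical sketch.)"""
    state = json.loads(checkpoint.read_text()) if checkpoint.exists() else {}
    state[question_id] = answer
    # Write to a temporary file and rename it into place, so a failure
    # mid-write never corrupts the existing checkpoint.
    tmp = checkpoint.with_suffix(".tmp")
    tmp.write_text(json.dumps(state))
    tmp.replace(checkpoint)

def resume_test(checkpoint: Path, question_ids: list[str]) -> list[str]:
    """Return the questions not yet answered, in their original order,
    so an interrupted test can pick up where it left off."""
    state = json.loads(checkpoint.read_text()) if checkpoint.exists() else {}
    return [q for q in question_ids if q not in state]
```

The write-then-rename step is the key design choice: the checkpoint on disk is always a complete, valid snapshot, even if the machine fails partway through a save.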