Evaluation of the Course Revision

To evaluate the efficacy of the revised pilot section of CMS 3270 Micro-Based Software in the Fall of 2001, we developed and administered a Student Satisfaction Survey. We ourselves were satisfied with our revision plans, but what did the students think of the class? Micro-Based Software had always been a popular elective, and we wanted it to remain so. The 10 students in the pilot class had not experienced the previous curriculum, but they did know they were signing up for a pilot class that was a significant revision of what they might have heard 3270 was all about. The pilot section was added to the schedule after the initial classroom section had been offered. Half of the students were Aviation Management majors from the School of Professional Studies; the other five were CMS majors from the School of Business, taking the course as an elective. All were juniors or seniors at Metro. The instrument was designed to measure the degree to which students agreed with our main objectives, not to compare the two curricula.

Students in the pilot class in the Fall of 2001 were generally pleased with the revised curriculum, and their grades were consistent with grades in the traditional section of Micro-Based Software. The following semester, we incorporated the changes into all sections of the course, including the Web-offered section. While there are noteworthy differences between a course taken online and one taken in the classroom, the problem-solving scenarios and the assessment of student learning remain the same. Students in the classroom section become very close, and the experience is truly rewarding for the instructor.

The first online offering of the course presented a few problems in the newly developed problem-solving area. We divided the students into six groups of five students each. Our online system has an area called Profile where students enter their names, phone numbers, and any other information they would like to share. We told the students to be sure to enter the phone number at which they wanted to be contacted and their e-mail address there. We posted the list of group assignments in the e-mail area called the Forum and told the students that the first person on each list was responsible for setting up meeting times and allocating tasks. The first problem was posted in a bulletin-board area of the Forum. The students in the online class were told what was expected of them. However, there was no way for them to ask the instructor questions as a group. Some asked questions by e-mail, and others just worked things out for themselves. The result was not always as satisfactory as in the classroom environment, where there was more give-and-take and more discussion of what was expected of them. Another major problem is that students can decide to drop the course and stop participating; an official drop slip does not have to be processed until about six weeks into the semester. Thus, teammates (and instructors) do not know a student has dropped, and someone given an assignment may simply not do it without ever telling the team.

Table 1: Student Satisfaction Survey, n = 8
(Scale: 0 = Disagree to 5 = Agree; a dash indicates no responses)

Question                                        0    1    2    3    4    5
Confident of ability to solve problems          -    -    -    2    5    1
Proficient in advanced WP skills                -    -    -    -    7    1
Proficient in advanced SS skills                -    -    -    3    5    0
Proficient in introductory DB skills            -    -    -    3    3    2
Proficient in Office 2000                       -    -    -    -    5    3
Improved analytical reasoning ability           -    -    -    -    5    3
Class syllabus and class notes were helpful     -    1    0    2    4    1
Ability to self-learn new applications          -    -    -    2    2    4
Learned from classmates                         -    -    -    5    3    0
Course content too much work                    1    1    1    4    0    1
Course content too little work                  1    1    3    3    0    0
Expected grade in course (A=5)                  -    -    -    2    3    3

Note: Two students from the original 10, both Aviation Management majors, dropped the course after the September 11 World Trade Center attacks.
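
As a quick illustration (ours, not part of the original analysis), a weighted mean rating can be computed for each item directly from the counts in Table 1. The minimal Python sketch below shows the calculation for three of the items; the counts are transcribed from the table, and blank cells are assumed to represent zero responses.

    # Illustrative calculation (not from the original study): weighted mean
    # ratings on the 0-5 scale, using response counts transcribed from
    # Table 1. Blank cells are assumed to mean zero responses.
    survey = {
        "Confident of ability to solve problems":      [0, 0, 0, 2, 5, 1],
        "Proficient in advanced WP skills":            [0, 0, 0, 0, 7, 1],
        "Class syllabus and class notes were helpful": [0, 1, 0, 2, 4, 1],
    }

    for item, counts in survey.items():
        n = sum(counts)  # each row in Table 1 sums to the 8 respondents
        mean = sum(rating * count for rating, count in enumerate(counts)) / n
        print(f"{item}: mean = {mean:.2f} (n = {n})")

    # Output:
    # Confident of ability to solve problems: mean = 3.88 (n = 8)
    # Proficient in advanced WP skills: mean = 4.12 (n = 8)
    # Class syllabus and class notes were helpful: mean = 3.50 (n = 8)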

Students from the online course were asked to answer, in anecdotal fashion, three questions: (1) what they liked about the course, (2) what they did not like about the course, and (3) what they thought could be done to improve it. Generally, they wrote very favorable comments. They really liked the software tutorial portion of the course and the opportunity to become more proficient in Office. Of course, as in all online courses, they liked the ability to do the majority of the work at home. (Two quizzes must be taken at the school's testing center, and several students objected to that. However, our department has adopted a policy that tests cannot be taken online, because we want some assurance that the person signed up for the course is the one doing the work.)

Seventeen of the 26 students answered the survey, and six of them objected to the group problems, mostly because of the difficulty of getting together with their teammates. They also felt that group problem solving was already incorporated in enough CMS courses, and they just wanted to learn Office. However, they really liked taking the Microsoft Office User Specialist (MOUS) tests and learning the latest Office software package.

The problem-solving portion may take more organization on the instructor's part, but it can be duplicated online, which was one of our major considerations. We do not offer Web courses in CMS that cannot closely approximate the classroom environment, since official transcripts do not distinguish between a course taken online and one taken in the classroom. Both environments have the same curriculum and nearly the same delivery.


