DREW, CAN WE USE YOUR ONLINE COURSE AS A GUINEA PIG?
by Andrew Habermacher
In February 2004, members of the FIPSE “Quality Matters” project management team asked if I would permit my online cultural anthropology course to be used as a test case in the training workshop. They needed an example of an actual online course to train peer reviewers in the use of the newly developed quality evaluation rubric. After thinking it over, I decided it was a good opportunity to identify the course’s strengths and weaknesses and to learn how to improve it. So, I agreed to guinea pig the course at the March 12, 2004, FIPSE Quality Matters training workshop held at Community College of Baltimore County, Catonsville, MD.
I felt the online anthropology course was pretty good since it had been through a continuous process of change and revision for nearly four years, and during that time it had been routinely offered at Prince George’s Community College and occasionally through the Maryland Consortium. Though I was sure the course could be improved, I had done pretty much everything I knew to do to make the course effective. I thought putting the course through the peer review process would be helpful to find blind spots and issues I had not thought to address. I hoped submitting the course for review by experienced teachers/developers of online courses from various institutions around Maryland would provide useful recommendations for improvement, and perhaps some validation for those parts of the course which were well conceived.
Though hopeful the peer reviewers would find the online anthropology course acceptable, I really was not sure how it would fare in their hands. When I developed the course, I did not have the rubric questions to be used by the peer reviewers as a guide. However, I knew the rubric questions were based on published research on standards of good practice, and I had considered some of those in designing and revising the course. So, I was very curious to see what the rubric-based peer review of the course would reveal.
I was pleasantly surprised to learn that, using the Quality Matters rubric, the 27 peer reviewers gave the anthropology course a passing grade (75 points out of 81 or 92.6%). They felt the course was well organized, clearly presented, and had a variety of appropriate assignment types that related to the course objectives. They also thought the course had plenty of interactive components (notably, graded discussion board conferences).
The peer review of the course found four areas that needed attention. I have listed them below and followed each with a comment of my own.
1. Peer Reviewers’ Comment: Minimum technology requirements and prerequisite knowledge are not stated within the course.
My Response: This is an accurate criticism. These are, however, clearly stated on the college’s distance learning Web page, so I would quibble about the need to also include them in the course itself.
2. Peer Reviewers’ Comment: Though clearly stated in the syllabus, the course learning objectives are not articulated and specified on the module/unit level.
My Response: This is certainly true and can be changed, since each lesson does relate to one or more of the learning objectives. However, I question whether this level of redundancy is really necessary in the course. I am not really convinced it would make the course better for students, but it would certainly make it easier for peer reviewers to know if all the units relate to one or more objectives.
3. Peer Reviewers’ Comment: Course instructions do not explain how students may obtain the college's student academic support services.
My Response: The college support service of primary utility to students taking this course is the Writing Center. The Analytical Essay Guide in the Course Documents area of the course does explain how students in need of this service may use it. I suppose it could also be put in the syllabus.
4. Peer Reviewers’ Comment: Web pages do not provide equivalent alternatives to auditory and visual content.
My Response: This issue is in the Americans with Disabilities Act (ADA) portion of the rubric, and I must confess that in developing the course I did not consider such issues. I would need some guidance on how to do this since I am not knowledgeable about the ins and outs of the ADA requirements.
The peer review of my online course was a positive and helpful experience. Other online instructors may also wish to participate in the course peer review process. Reviewers’ comments can help you know if your course meets the quality standards for online teaching in the areas covered by the newly developed Quality Matters rubric. Online faculty can discover what is strong about their courses, and also what the rough areas may be and how to fix them.
This guinea pig survived; humans should, as well.
The Instructional Area Newsletter, Volume 19, No. 3