Abstract
1. Background

Students who take digital media courses are interested in learning how to do photographic processing, audio/video production, and multimedia programming with the tools and languages of the day. For practical reasons, a digital media curriculum needs to include specific application programs -- for example, Photoshop, Premiere, and Flash. However, focusing too narrowly on specific application programs has its drawbacks. One program works differently from another, even within the same medium (e.g., digital imaging or digital audio). In addition, versions of these programs change quickly, sometimes within a matter of months.

Our goal is to create material that explains fundamental concepts of lasting utility, and to explain these concepts in a way that helps students work in current application programs and adapt to new applications when they encounter them. For the primer and the art module -- which are aimed at students who may have a non-technical or non-mathematical background -- this entails using simple analogies from everyday life and showing, in a generic way, how the concepts matter to their use of digital media tools. For the computer science module -- which explains the mathematics and algorithms behind the scenes of digital media -- this entails identifying the material that is important to the students' advanced use of their tools. In all three modules, it is important to identify the concepts that students most often misunderstand and to find better ways to explain them.

An additional component of our digital media curriculum development project is an assessment of the pedagogical value of the material. Meaningful assessment is never easy in educational research, and in some ways we have found this part of our work to be the most difficult. The digital media course we teach is offered only once a year and typically enrolls between 10 and 30 students -- a small population on which to base a significant statistical analysis. The small size of our classes also precludes comparative studies in which one group uses our curriculum material while another uses other sources. In any case, such comparative studies entail treating groups of students unequally and risk disadvantaging one group in favor of another. Comparing student performance from one year to the next is another approach, but it is slow-going, and results are difficult to interpret when the comparison groups have different backgrounds and skill levels from the outset.
In developing an assessment plan, we are dealing with these problems in two ways. First, we are working with another university with large classes, where comparative studies or pre/post testing might yield statistically significant results. Second, we are trying to devise a common-sense, practical approach to assessing our own students' performance based on observing what they do and do not understand. This paper describes an experiment recently conducted to help us develop our "in-house" assessment strategy.
3. Assessment of Student Learning and Student Reaction to Curriculum Material

The component most often left incomplete was the MATLAB exercise on Non-Linear Companding. In general, the students found the class lectures most helpful, with the text-based material a close second. With regard to their preference for the text-based material over the on-line material, we find it difficult to draw any conclusions. The on-line demos for the audio unit (Sound Fundamentals and Audio Dithering) are not the best ones among our curriculum material, in the sense that they are fairly short and not as interactive as they could be. The exercise on Non-Linear Companding -- also an interactive computer-based exercise -- was carefully constructed, and we expected it to be helpful in clarifying the concepts of µ-law encoding both graphically and mathematically. However, half the class failed to complete this exercise, probably because it was the last assignment handed out prior to the closed lab, and also because the exercise required MATLAB, a program that some of the students had not used before. When pressed for time, students skipped this exercise, since they were not required to turn in its answer sheet.
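Although the exercise itself is not reproduced here, the relationship it explores is the standard µ-law companding function. The MATLAB fragment below is a minimal sketch of that function and its inverse (illustrative only; the variable names, the plotting commands, and the choice µ = 255 are our assumptions, not taken from the exercise):

    % Minimal sketch of mu-law companding (not the course exercise itself).
    mu = 255;
    x = linspace(-1, 1, 1001);                            % normalized input samples
    y = sign(x) .* log(1 + mu .* abs(x)) ./ log(1 + mu);  % compress (mu-law encode)
    x_back = sign(y) .* ((1 + mu) .^ abs(y) - 1) ./ mu;   % expand (decode)

    % The transfer curve shows graphically why low-amplitude samples receive
    % proportionally more quantization levels than high-amplitude ones.
    plot(x, y), grid on
    xlabel('input sample'), ylabel('companded value')
    title('mu-law companding, mu = 255')

Plotting y against x reproduces the characteristic companding curve, which is the graphical view of µ-law encoding that the exercise was intended to convey.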
Table 1. Students' rating of how much they learned from each learning unit.
Table 2. Assessment of students' knowledge and reasoning as evidenced by the closed lab worksheet.
4. Conclusions Related to Future Plans for Assessment
5. References

6. Acknowledgments