Recognizing a Surface from Different Viewpoints:
Results from a Questionnaire Item and Task-Based Interviews

Teri J. Murphy, University of Oklahoma

Abstract: This paper discusses a task-based questionnaire item involving visualization skills that produced noticeable performance differences between students who used computers and students who did not. It also discusses results from task-based interviews used to seek a deeper understanding of how students approached the problem.

Technology offers a compelling opportunity to enhance students' intuition about mathematical objects and processes by way of well-crafted visual representations. As visualization tools gain popularity, however, the mathematics community is in need of a deeper understanding of relevant issues: how visualization complements traditional approaches, how it contributes to learning, how to facilitate visualization that contributes to learning, and how such issues affect the inclusion of students whose approaches to mathematics do not conform to traditionally rewarded strategies.

The multivariable calculus course at the University of Oklahoma (OU), as at most other institutions of higher education, is the culmination of the calculus sequence in mathematics. In the prerequisite calculus courses, students study ways to analyze functions of one variable, that is, 2D objects that they can easily draw on paper. Multivariable calculus extends the analytic tools from the preceding courses to functions of two or more variables, that is, objects that live in 3-space. Understanding this content requires students to visualize the 3D objects being studied -- an ability that eludes many otherwise successful students (Zimmermann & Cunningham, 1991). As we began the project, we presumed that the use of technology would be beneficial somehow, but not (necessarily) to enhance traditional skills. Rather, we concentrated our efforts on employing Mathematica's powerful graphing capabilities where they are most immediately needed: with the highly visual content of multivariable calculus.

We used computers in three ways:

  1. We produced graphics and animations for display during class.
  2. We assigned CAS-based problems for the students to do.
  3. We created a web site with the intent that all students -- not just those designated to participate in the project -- could make use of the resources there.

A primary resource we offer is sets of graphics in sequence meant to supplement typical textbook illustrations, which tend to be too busy for students to decode. We hope that by "breaking down" the illustrations we can help students understand and internalize certain mathematical processes (e.g., construction of a tangent plane or approximating a volume with boxes) and also enhance visualization of the objects being studied. For examples of the graphics and animations that we designed, see the Calculus at OU website; see also Murphy, Goodman, & White (in press) and Murphy et al. (1999) for further discussions of how we employed computers in our multivariable calculus classes.

As part of an effort to assess our activities, we administered a questionnaire to students in all six sections of multivariable calculus in Spring 1999 -- three project sections and three non-project sections. It was administered after the bulk of the "visualization" part of the course (2 of 3 chapters) and included demographic and background items, a computation item (compute a second partial derivative), a "visualization" item (select from a set of graphs the ones that represent a given surface viewed from a different perspective), and items about the usefulness of course components.

Results from ongoing analysis include:

  1. Students in the project sections said that they use calculators, a computer algebra system, other software, and the internet. Students in the other sections said they did not -- except for limited use of the internet (Murphy, Goodman, & White, in press).
  2. Students in the project sections said that every component of the course helped them learn the material. This result suggests that students want all the help they can get (Murphy, Goodman, & White, in press).
  3. Class means on the computation item (below) were not statistically significantly different. This result suggests that the students who used the CAS did not lose computation skills.

    Calculate fxx (i.e., ∂²f/∂x², the second-order partial derivative of f with
    respect to x both times) for the given function. Show your work.
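The kind of computation this item tests can be sanity-checked numerically. The sketch below uses a hypothetical stand-in function (the actual function from the questionnaire is not reproduced in this paper) and approximates fxx with a central second difference:

```python
# Numerical check of a second partial derivative, f_xx.
# NOTE: f below is a hypothetical example; the actual questionnaire
# function is not preserved in this paper.

def f(x, y):
    return x**3 * y + y**2  # analytically, f_xx = 6xy

def fxx_numeric(g, x, y, h=1e-4):
    """Central second difference in x: (g(x+h,y) - 2g(x,y) + g(x-h,y)) / h^2."""
    return (g(x + h, y) - 2.0 * g(x, y) + g(x - h, y)) / (h * h)

if __name__ == "__main__":
    # At (x, y) = (2, 3) the analytic value is 6 * 2 * 3 = 36.
    print(fxx_numeric(f, 2.0, 3.0))
```

A student's symbolic answer can thus be spot-checked at a few points without a CAS.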

  4. For the visualization item (below), the proportions of students responding correctly were not significantly different between project and non-project sections on five of the six choices (proportions are shown in the figure below). However, on the remaining distractor (choice (c)), the success rates were significantly different (t(120) = 2.6, p < .005).

Consider the graph at the right. Identify which of the graphs below depict the same surface but from a different viewpoint. Circle all that apply. Note: "x" marks the positive x-axis, "y" marks the positive y-axis, and "z" marks the positive z-axis. The same domain was used for all graphs.

The proportions of students responding correctly to each choice are shown below (non-project vs. project):

(a) does not match: 88% vs. 84%

(b) matches: 86% vs. 91%

(c) does not match: 65% vs. 84%

(d) does not match: 91% vs. 93%

(e) matches: 86% vs. 85%

(f) does not match: 91% vs. 95%

While this last item is not conclusive, the results for choice (c) indicate that something interesting is happening with this item: significantly fewer project students than non-project students -- by nearly 20 percentage points -- were drawn to this distractor. Combined with the negligible differences on the other five choices, this result deserves special attention in the form of additional data collection.
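The choice (c) comparison can be read as a two-proportion test. The sketch below assumes equal group sizes of 61 (the reported df of 120 implies roughly 122 students in total; the actual group sizes are an assumption here) and computes a pooled two-proportion z statistic for the reported 65% vs. 84% success rates:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    return (p2 - p1) / se

def two_sided_p(z):
    """Two-sided normal p-value via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2.0))

if __name__ == "__main__":
    # Choice (c): 65% (non-project) vs. 84% (project); n = 61 per group assumed.
    z = two_prop_z(0.65, 61, 0.84, 61)
    print(z, two_sided_p(z))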

To this end, we have conducted task-based interviews with seven students. Ongoing analysis indicates that successful students seek specific characteristics of the original surface to use as a means of eliminating choices. Three choices were eliminated early based on one characteristic. Two choices match the original graph, and all seven students chose these two as matches after being unable to eliminate them. Six of the seven students eliminated the remaining distractor as their last step in solving the problem. All seven students sought a second characteristic of the original graph to use in the elimination process. Since all seven students ultimately answered the question correctly (as expected based on how the students were selected), we cannot say for certain how the students who answered incorrectly on the questionnaire (particularly those in the non-project sections) went about solving the problem. We speculate, however, that these students neglected to seek additional characteristics of the original graph and accepted the choice as a match based on the initial characteristic they were using.

Ultimately, instructors involved in the project were not asking whether to use technology but how to use it in effective ways to facilitate student learning. This question can only be answered by understanding how students learn. In this case, we seek a deeper understanding of how students learn to visualize and to work with the 3D objects in multivariable calculus. To this end, we will continue to collect and analyze questionnaire and interview data.


Murphy, T. J., Goodman, R. E., & White, J. J. (in press). Using the WWW in multivariable calculus to enhance visualization. International Journal for Engineering Education.

Murphy, T. J., White, J. J., Kline, B. J., Black, E., Goodman, R., & Hofer, M. (1999). Using Mathematica with multivariable calculus. Proceedings of the 1999 American Society for Engineering Education Annual Conference and Exposition. Charlotte, NC.

Zimmermann, W., & Cunningham, S. (Eds.). (1991). Visualization in Teaching and Learning Mathematics. (MAA Notes No. 19). Washington, DC: Mathematical Association of America.