Publications

(Last modified 06 January 2008)


Journal Articles

Book Chapters

Conference Papers

Workshop Papers

Technical Reports and Unpublished Writings

 

Journal Articles

  • Hundhausen, C.D., Farley, S.F., & Brown, J.L. (Under review). Can Direct Manipulation Lower the Barriers to Computer Programming and Promote Transfer of Training? An Experimental Study. Submitted to ACM Transactions on Computer-Human Interaction on 26 December 2007.

    Presents an expanded version of our VL/HCC 2006 Best Paper Award winner, which experimentally compared direct manipulation and textual programming interfaces for novices. This expanded article adds a detailed post hoc video analysis of participants' programming processes, which sheds further light on the experimental results.

  • Hundhausen, C.D., & Brown, J.L. (2007). An Experimental Study of the Impact of Feedback Self-Selection on Novice Programming. Journal of Visual Languages and Computing 18(1), pp. 537-559.

    Presents an experimental evaluation of the impact of one dimension of live editing—feedback self-selection—on novice programming performance. Results indicate that, as long as feedback is delivered without delay, there is no difference between receiving feedback automatically and receiving it only on request.

  • Hundhausen, C.D., & Brown, J.L. (2008). Designing, Visualizing, and Discussing Algorithms within a CS 1 Studio Experience: An Empirical Study. Computers & Education 50(1), pp. 301-326.

    Presents an empirical comparison of art supplies and the ALVIS Live! algorithm visualization software within the context of a "studio experience"—a novel CS 1 pedagogical activity in which student pairs develop solutions to algorithm design problems, create accompanying visual representations, and finally present their visual solutions to the class for feedback and discussion. The centerpiece of the article is a series of post-hoc content analyses of the presentation sessions. These analyses highlight not only the pedagogical benefits of visualization-mediated discussions, but also the pedagogical tradeoffs of art supplies and ALVIS Live! in this context.

  • Hundhausen, C.D., & Brown, J.L. (2007). What You See Is What You Code: A 'Live' Algorithm Development and Visualization Environment for Novice Learners. Journal of Visual Languages and Computing 18(1), pp. 22-47.

    An extended version of our VL-HCC '05 paper, this article presents and evaluates a "live" editing model to support novice programming and visualization.

  • Hundhausen, C.D. (2005). Using End User Visualization Environments to Mediate Conversations: A ‘Communicative Dimensions’ Framework. Journal of Visual Languages and Computing 16(3), pp. 153-185.

    Drawing on both a theory of communication and empirical studies in which end user environments were enlisted to support human communication, this article proposes a provisional framework of six 'Communicative Dimensions' of end user visualization environments. By characterizing those aspects of end user visualization environments that impact social interaction, our framework provides an important extension to Green and Petre's 'Cognitive Dimensions' framework.

  • Suthers, D., Hundhausen, C.D., & Girardeau, L. (2003). Comparing the roles of representations in face-to-face and online computer supported collaborative learning. Computers & Education 41(4), pp. 335-351.

    Describes an empirical comparison of face-to-face and distal (in separate rooms, communicating via textual chat) collaborative scientific inquiry. Participants in each condition used the Belvedere knowledge-mapping software to represent data items, hypotheses, and evidential relations as they worked through a science challenge problem. Dependent measures included post-test performance and counts and percentages of verbal and representational acts dedicated to various types of discourse. While no significant differences in learning outcomes were found, significant differences between the two groups' discourse and activity were found.

    Presents two ethnographic field studies of a junior-level algorithms course that included exercises in which students constructed and presented their own visualizations of algorithms under study. Note that this is a journal article version of Chapter 4 of my dissertation.

  • Naps, T., Roessling, G., Almstrum, V., Dann, W., Fleischer, R., Hundhausen, C., Korhonen, A., Malmi, L., McNally, M., Rodger, S., & Velázquez-Iturbide, J.A. (2003). Exploring the role of visualization and engagement in computer science education (Report of the ITiCSE 2002 Working Group on "Improving the Educational Impact of Algorithm Visualization"). SIGCSE Bulletin 35(2), 131-152.

    Reports three surveys of computer science faculty (n ~= 150 for all studies combined) that focused on their opinions and use of algorithm visualization technology in undergraduate computer science education. Proposes a framework for studying the impact of student engagement with algorithm visualization technology on learning.

  • Suthers, D., & Hundhausen, C. (2003). An experimental study of the effects of representational guidance on collaborative learning processes. Journal of the Learning Sciences 12(2), 183-219.

    Presents an empirical comparison of three representational environments (plain text, graph, and matrix) for recording data and hypotheses within the context of a scientific challenge problem. Performs various qualitative and quantitative post-hoc analyses of participants' talk (content analysis) and representational artifacts to draw conclusions regarding the differences in the representational environments.

  • Hundhausen, C.D., Douglas, S.A., & Stasko, J.T. (2002). A Meta-Study of Algorithm Visualization Effectiveness. Journal of Visual Languages and Computing 13(3), 259-290.

    A greatly revised and refocused version of my original Ph.D. comprehensive exam paper "A Meta-Study of Software Visualization Effectiveness" (see below), this paper performs a systematic meta-analysis of 24 experimental studies of algorithm visualization effectiveness.

  • Hundhausen, C.D. & Douglas, S.A. (2002). Low Fidelity Algorithm Visualization. Journal of Visual Languages and Computing 13(5), 449-470.

    A journal article version of the empirically and theoretically driven algorithm visualization system design aspects of my dissertation.


 

Book Chapters


 

Conference Papers

    Presents a provisional framework of communicative dimensions, derived from my ethnographic fieldwork in an undergraduate algorithms course (see article below), that describe aspects of end-user environments that impact human social interaction.

    Draws on ethnographic studies conducted as part of my dissertation research to motivate and define the requirements for a new breed of "low fidelity" algorithm visualization technology to be used as part of an alternative teaching approach in which students construct their own visualizations, and then present those visualizations to their instructor and peers for feedback and discussion. Presents SALSA (Spatial ALgorithmic Language for StoryboArding), a high-level, interpreted language for programming low fidelity visualizations, along with ALVIS (ALgorithm VIsualization Storyboarder), a graphical environment for constructing SALSA scripts by direct-manipulation. SALSA and ALVIS pioneer a novel spatial relations-based method for defining algorithm visualizations, along with a novel visualization presentation interface that supports reverse execution, and dynamic mark-up and modification.

  • Hundhausen, C.D. & Douglas, S.A. (2000). Using Visualizations to Learn Algorithms: Should Students Construct Their Own, or View an Expert's? In 2000 IEEE Symposium on Visual Languages (pp. 21-28). Los Alamitos, CA: IEEE Computer Society Press.

    Motivates and presents a controlled experiment that tested the hypothesis that students who construct their own algorithm visualizations learn an algorithm better than students who view a visualization constructed by an expert. This experiment is also presented in Chapter 6 of my dissertation.

    Illustrates the manner in which a research method called visualization storyboarding, together with a semantic-level analytical framework, can be used to derive an empirically-based, semantic-level software visualization (SV) language for the bubblesort algorithm. Demonstrates how the semantic-level language can be used as a framework for evaluating the usability of existing computer-based SV systems.

  • Naps, T.L, & Hundhausen, C.D. (1991). The evolution of an algorithm visualization system. Proc. 24th Annual Small College Computing Symposium (Morris, MN), 252-257.

    Describes the four-year evolution of the GAIGS (Generalized Algorithm Illustration through Graphical Software) algorithm visualization system, which has been used as the basis for the laboratory component of Lawrence University's computer science courses since 1988. PC- and Windows-based versions of the system, complete with extensive documentation and supporting software, are available via anonymous ftp from Lawrence's ftp site in the /anonymous/math directory.


 

Workshop Papers

  • Crescenzi, P., Hundhausen, C., Stasko, J., Faltin, N., Naeher, S., Fleischer, R., Roessling, G., & Sutinen, E. (2002). The Algorithm Animation Repository. Paper presented at the Second Program Visualization Workshop, Hornstrup Centret, Denmark, June 27-28.

  • Hundhausen, C.D. (2002). The Algorithms Studio Project. Paper presented at the Second Program Visualization Workshop, Hornstrup Centret, Denmark, June 27-28.

  • Hundhausen, C., & Douglas, S. (2000). Low fidelity algorithm visualization. Paper presented at The Visual End User Workshop, 2000 IEEE Symposium on Visual Languages, Seattle, WA, September 10.

  • Hundhausen, C.D. (1999). Using Representations to Assess Level of Membership in a Community of Practice. Paper accepted for presentation at the CSCL '99 Workshop "Collaborative Use of Representations: Analyzing Learning Interactions."

    Develops an empirical method, rooted in Cultural Consensus Theory, that uses the way in which an individual constructs and interprets external representations as a basis for quantitatively assessing that individual's level of membership in a community of practice.

  • Hundhausen, C.D. (1994). Toward the development of highly interactive software visualization systems: A user-centered approach. Paper presented at the International Workshop on Software Visualization, SIGCHI '94 (Boston, MA).

    Outlines the human-process centered approach to interactive software visualization that has formed the foundation of my research, and describes my forays into exploring two of those processes using a research technique called constructive interaction.


 

Technical Reports and Unpublished Writings

    Presents a critique of software visualization technology with respect to its effectiveness in the tasks for which it is designed, drawing on published empirical research. I aim to submit a revised version of this paper to the Journal of Visual Languages and Computing.

  • Douglas, S.A., Hundhausen, C.D., & McKeown, D. (1995). Toward empirically-based software visualization languages. Technical Report CIS-TR-95-12, Department of Computer & Information Science, University of Oregon, Eugene.

    Presents a polemic against using factors analysis as a means of assessing the effects of algorithm visualization on learning, drawing extensively on the first empirical study to consider the value of algorithm visualization as a learning aid (viz., J. Stasko, A. Badre, & C. Lewis, "Do algorithm animations assist learning? An empirical study and analysis," in Proc. INTERCHI '93 (Amsterdam, The Netherlands), pp. 61-66, 1993).

    Critically assesses the claims that computer scientists have made about algorithm visualization systems by attempting to quantify and qualify their benefits. The analysis draws on around 50 published sources spanning several disciplines, including cognitive psychology, cognitive science, and computer science.
