CS295J/Literature to read for week 4.11: Difference between revisions

Revision as of 18:00, 28 September 2011

Again, discusses a way to evaluate a user's cognitive load. This is relevant to the reviewers' comments that the proposal lacked a plan to evaluate how well the visualizations would jibe with cognitive principles. If we are using them properly, cognitive load should be low. (Owner: Jenna Zeigen, Discussant: Clara Kliman-Silver, Discussant: ?)
[http://dl.acm.org.revproxy.brown.edu/citation.cfm?id=1357101&CFID=49369132&CFTOKEN=47488919 Integrating statistics and visualization: case studies of gaining clarity during exploratory data analysis] Perer-2008-ISV (Owner: Diem Tran, Discussant: Jenna Zeigen, Discussant: ?)
This paper attempts to integrate statistical methodology and visualization in interactive exploratory tools. The authors conduct long-term case studies to evaluate this approach, which yield positive results about its effectiveness.
The paper is an example of how an exploratory tool can be evaluated over the long term.
Claims that the design of visualizations can be informed by cognitive principles and, based on those principles, be automated. Their research has implications for any domain in which visualization is useful, including our proposed project. This is relevant to Jenna's response to reviewer #4, showing that cognitive principles can be, and have been, applied in the design of visualizations. (Owner: Michael, Discussant: Jenna Zeigen, Discussant: ?)
This work proposes a visual variability analysis technique that analyzes the correlation between functional goal models and softgoals (requirements variants) and tries to find the most satisfactory goal models and requirements. This can be used to illustrate how cognitive models like goal maintenance and multitasking can be designed for and achieved in our project. The authors also implemented a tool to support the technique, which could give us some insight into how to put cognitive models into practice. (Owner: Chen, Discussant: ?, Discussant: ?)
This article reviews modern eye-tracking techniques as they aim to monitor different aspects of cognitive load. While the techniques employed are aimed at instruction, they are relevant to the project in that users will, presumably, have to learn to use the tools we build. It reviews different methodologies currently employed in eye-tracking and cognitive-load research. Cognitive load plays a significant role in any sort of data visualization, and therefore information from this article should inform many parts of the proposal. (Owner: Clara, Discussant: ?, Discussant: ?)
A reviewer worries, "The primary risk in this proposal is that the cognitive models which can be created are too weak to support the design process." We can provide evidence that this risky component is feasible by pointing to the findings cited in this article (and in others from the issue the title refers to) of successful uses of cognitive models in developing and designing interfaces.
(Owner: Nathan, Discussant: ?, Discussant: ?)
The paper presents a taxonomy of ten primitive analysis task types for information visualization, providing a language and vocabulary for discussing and evaluating these tools. The taxonomy was created in part by analyzing student questions about data visualizations; the task types are motivated by real analysis needs. Using a taxonomy like this, or building one, would be important for the proposed "Task Analysis" section Caroline mentioned in her response to the reviews. We need to be more explicit about the task types we can model, both at the specific level of brain-diagram tasks and in a more generalized form; a rough sketch of what that could look like follows below. (Owner: Steven Gomez)
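Purely as an illustration of the kind of explicitness the task-analysis point calls for (this is not the paper's taxonomy and not part of the proposal), the sketch below encodes a handful of generic low-level task types and maps some hypothetical brain-diagram tasks onto them. All type names, example tasks, and function names are assumptions made up for the example.

<syntaxhighlight lang="python">
# Minimal sketch: a generic task-type vocabulary plus hypothetical
# brain-diagram tasks tagged with the types they exercise.
# All names below are illustrative assumptions, not the paper's taxonomy.
from enum import Enum, auto


class AnalysisTask(Enum):
    RETRIEVE_VALUE = auto()
    FILTER = auto()
    FIND_EXTREMUM = auto()
    CHARACTERIZE_DISTRIBUTION = auto()
    FIND_ANOMALIES = auto()
    CORRELATE = auto()


# Hypothetical brain-diagram tasks, each tagged with the generic types it exercises.
brain_diagram_tasks = {
    "look up the value attached to a single labeled region": {AnalysisTask.RETRIEVE_VALUE},
    "show only regions connected to a selected region": {AnalysisTask.FILTER},
    "find the region with the highest value in a diagram": {AnalysisTask.FIND_EXTREMUM},
    "flag regions whose values depart from the rest of the group": {AnalysisTask.FIND_ANOMALIES},
    "relate the values of two regions across subjects": {AnalysisTask.CORRELATE},
}


def uncovered_task_types(tasks):
    """Return the generic task types not yet covered by any concrete task."""
    covered = set()
    for types in tasks.values():
        covered |= types
    return set(AnalysisTask) - covered


if __name__ == "__main__":
    missing = sorted(t.name for t in uncovered_task_types(brain_diagram_tasks))
    print("Task types with no concrete brain-diagram task yet:", missing)
</syntaxhighlight>

Writing tasks down against a shared vocabulary like this makes coverage gaps explicit (here, CHARACTERIZE_DISTRIBUTION has no concrete task yet), which is the kind of specificity the Task Analysis section would need.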
Provides several business models for sustainable open-source development, with a case study for each of the proposed models. The Macro R&D model and its case study are the most relevant for our purposes. This matters if the researchers are looking to create long-term tools that must be supported long after the research funding ends, and it answers the specific critique that the proposal did not adequately address the issue of sustainability.
This is a study of visual analysis behavior that uses an unstructured environment to study how analysts deal with data outside of the constraints of a specific visualization system. Analysts and pairs of analysts were given a collection of physical cards containing information on an infectious disease outbreak, and were allowed to arrange, organize, and take notes on this information as they saw fit. Their behavior is analyzed in terms of the visual metaphors and organization strategies they employ. While this specific model might make more sense for a visual analytics application than for scientific visualization, I think that creating a situation in which you can study users' behavior outside the constraints of software is a good strategy for understanding a task at a high level.
(Owner: Caroline, Discussant: ?, Discussant: ?)
This paper serves as a good introductory reading for cognitive task analysis, giving definitions and introducing and categorizing existing cognitive task analysis methodologies. The paper also presents a human-centered information-processing model for cognitive task performance. This model focuses on identifying all cognitive aspects of human performance in technical work, and it may serve as a reference if we need to break down the brain-diagram exploration process into sub-tasks. (Owner: Hua, Discussant: ?, Discussant: ?)