Cognitive Modeling and Eye-tracking

Proposals

We have submitted several proposals to NSF related to cognitive modeling.

  • September 2015 proposal
  • Proposals prior to 2015 that were not funded

Resources

Eye-tracking resources:

  • Cogain.org --- gaze-tracking community site: seems geared toward tool-building for the mobility-impaired and similar users
  • Tobii client info --- Andrew Duchowski's Tobii client library at Clemson
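
For concreteness, a minimal sketch of capturing gaze samples in Python. It assumes Tobii's tobii_research package (the Tobii Pro SDK bindings, a different client library than Duchowski's, and newer than this page) plus a connected Tobii device; treat it as illustrative, not a recipe:

  # Minimal gaze-capture sketch using the Tobii Pro SDK's Python
  # bindings (tobii_research). Assumes a Tobii tracker is connected.
  import time
  import tobii_research as tr

  def gaze_callback(gaze_data):
      # Normalized (0..1) display coordinates for the left eye.
      x, y = gaze_data['left_gaze_point_on_display_area']
      print("gaze at (%.3f, %.3f)" % (x, y))

  tracker = tr.find_all_eyetrackers()[0]          # first detected device
  tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA,
                       gaze_callback, as_dictionary=True)
  time.sleep(5)                                   # stream ~5 s of samples
  tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_callback)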

Modeling resources:

  • CogTool --- visual cognitive modeling tool for UI prototyping from Bonnie John's group at CMU
  • ACT-R overview --- tutorial and links from the "Theories in HCI" course at Maryland
  • ACT-R --- CMU ACT-R research group, with links to the Lisp architecture
  • Python ACT-R --- Python alternative to CMU's Common Lisp ACT-R environment
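
To make the modeling idea concrete: ACT-R treats cognition as production rules firing against buffer contents, with each firing charged about 50 ms of simulated time, which is what lets a model predict task completion times. A toy loop in that spirit (plain Python; this is not the CMU ACT-R or Python ACT-R API, and all names are illustrative):

  # Toy production-system loop in the spirit of ACT-R (illustrative only).
  # Each cycle, the first matching rule fires and is charged ACT-R's
  # default 50 ms production-firing time.
  CYCLE_TIME = 0.050  # seconds per firing (ACT-R default)

  def run(goal, productions, max_cycles=100):
      elapsed = 0.0
      for _ in range(max_cycles):
          rule = next((act for match, act in productions if match(goal)), None)
          if rule is None:       # no production matches: the model halts
              break
          rule(goal)             # fire: update the goal buffer
          elapsed += CYCLE_TIME
      return goal, elapsed

  # Example: "read" three display items, one production firing each.
  goal = {'state': 'reading', 'items_left': 3}
  productions = [
      (lambda g: g['state'] == 'reading' and g['items_left'] > 0,
       lambda g: g.update(items_left=g['items_left'] - 1)),
      (lambda g: g['state'] == 'reading' and g['items_left'] == 0,
       lambda g: g.update(state='done')),
  ]
  final, t = run(goal, productions)
  print(final, "predicted time: %.0f ms" % (t * 1000))  # done after 200 ms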

Reading:

Device Manufacturers

  • Tobii (tobii.com) --- demos: glasses demo, usability testing with Tobii, gaze interaction. Products include glasses, a sensor integrated into 17" displays, and independent sensors. They actually have an app store for disseminating projects built with the Tobii API.
  • Mirametrix (mirametrix.com) --- demos: set-up, overview + web-browsing demo. Webcam-style sensor.
  • SensoMotoric Instruments (SMI) (smivision.com) --- demos: head-mounted, multi-user heatmaps, device set-up. Head-mounted and webcam-style tracking devices.
  • LC Technologies (eyegaze.com) --- demos: overview. Not clear to me whether this is a combined sensor-display.

Other:

  • List of eye-trackers compiled at COGAIN (cogain.org/wiki/Eye_Trackers)

Agenda Items

Things to decide:

  1. Enthusiasm for investing in eye-tracking...
    • Can we enumerate some experiment ideas? Are they useful and novel?
    • David's enthusiasm is high, but we all know how much he'll do...
      • Proposed experiments by Radu (nudging) and Caroline (multiview) might produce new hypotheses faster and better if the collected data includes eye focus (see the fixation-detection sketch after this list)
      • Cognitive modeling to estimate performance with brain-diagram use will likely have more to work with if the acquired data includes eye tracking
  2. Head-mounted system, or a device that will sit in front of the display?
    • Will we use this in the CAVE or for mobile devices, or just desktops?
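
A cheap way to see why eye focus helps here: raw gaze samples only become analyzable "focus" data after fixation detection. Below is a minimal dispersion-threshold (I-DT) sketch in Python; the (t, x, y) sample format and the 100 ms / 30 px thresholds are illustrative assumptions, not any vendor's defaults:

  # Minimal dispersion-threshold (I-DT) fixation detector (sketch).
  # samples: list of (t_seconds, x_px, y_px); returns fixations as
  # (t_start, t_end, centroid_x, centroid_y).
  def detect_fixations(samples, min_dur=0.100, max_disp=30.0):
      fixations, i = [], 0
      while i < len(samples):
          j = i
          # Grow the window while x- plus y-dispersion stays small.
          while j + 1 < len(samples):
              xs = [s[1] for s in samples[i:j + 2]]
              ys = [s[2] for s in samples[i:j + 2]]
              if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_disp:
                  break
              j += 1
          if samples[j][0] - samples[i][0] >= min_dur:
              win = samples[i:j + 1]
              cx = sum(s[1] for s in win) / len(win)
              cy = sum(s[2] for s in win) / len(win)
              fixations.append((samples[i][0], samples[j][0], cx, cy))
              i = j + 1          # jump past the fixation
          else:
              i += 1             # saccade or noise: slide forward
      return fixations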