Cognitive Modeling and Eye-tracking
== Proposals ==
We have submitted several proposals to NSF related to cognitive modeling.
* [[/NSF_Proposal|September 2015 proposal]]
* [[/NSF_Proposal#Past_Proposals_to_NSF|Proposals prior to 2015 that were not funded]]
== Resources ==
Eye-tracking resources:
* [http://www.cogain.org Cogain.org] --- gaze-tracking community site; seems geared toward tool-building for mobility-impaired users, etc.
* [http://andrewd.ces.clemson.edu/tobii Tobii client info] --- Andrew Duchowski's Tobii client library at Clemson (a toy fixation-grouping sketch follows this list)
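For reference when we evaluate a client library, here is a minimal sketch of what we would do with the raw data a tracker client hands back: group (timestamp, x, y) gaze samples into fixations with a simple dispersion threshold. This is not the Clemson/Tobii client API; the sample format, thresholds, and every name below are assumptions for illustration only.

<syntaxhighlight lang="python">
# Hypothetical sketch: grouping raw gaze samples into fixations with a
# dispersion threshold. Not the Tobii/Clemson client API; a real client
# would supply the (timestamp_ms, x, y) samples instead of the fake data.
from dataclasses import dataclass

@dataclass
class Fixation:
    start_ms: float   # timestamp of first sample in the fixation
    end_ms: float     # timestamp of last sample
    x: float          # mean horizontal gaze position (pixels)
    y: float          # mean vertical gaze position (pixels)

def detect_fixations(samples, max_dispersion=25.0, min_duration_ms=100.0):
    """samples: list of (timestamp_ms, x, y); returns a list of Fixation."""
    fixations, window = [], []

    def emit(win):
        # Record the window as a fixation if it lasted long enough.
        if win and win[-1][0] - win[0][0] >= min_duration_ms:
            fixations.append(Fixation(win[0][0], win[-1][0],
                                      sum(s[1] for s in win) / len(win),
                                      sum(s[2] for s in win) / len(win)))

    for sample in samples:
        window.append(sample)
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # Window just became too spread out: emit what came before
            # and restart the window at the current sample.
            emit(window[:-1])
            window = [sample]
    emit(window)
    return fixations

if __name__ == "__main__":
    # Synthetic 60 Hz samples: a steady fixation followed by a saccade.
    fake = [(i * 16.7, 400 + (i % 3), 300 + (i % 2)) for i in range(30)]
    fake += [(500 + i * 16.7, 800, 600) for i in range(30)]
    for f in detect_fixations(fake):
        print(f)
</syntaxhighlight>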
Modeling resources:
* [http://cogtool.hcii.cmu.edu CogTool] --- visual cognitive modeling tool for UI prototyping from Bonnie John's group at CMU
* [http://www.cs.umd.edu/class/fall2002/cmsc838s/tichi/actr.html ACT-R overview] --- tutorial and links from the "Theories in HCI" course at Maryland
* [http://act-r.psy.cmu.edu/actr6 ACT-R] --- CMU ACT-R research group, with links to the Lisp architecture
* [http://sites.google.com/site/pythonactr Python ACT-R] --- Python alternative to CMU's Common Lisp ACT-R environment (see the toy production-cycle sketch after this list)
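To make the ACT-R links above more concrete, here is a toy sketch of the core loop these architectures share: productions whose conditions are matched against buffer contents, with at most one production firing per cycle. It does not use the CMU Lisp or Python ACT-R APIs; every name and data structure below is invented for illustration.

<syntaxhighlight lang="python">
# Toy illustration of a production-system cycle in the spirit of ACT-R.
# Not the CMU ACT-R or Python ACT-R API; names and structure are invented.

# Buffers hold the model's current "working memory" contents.
buffers = {"goal": {"state": "start"}, "visual": {"object": "button"}}

# Each production is (name, condition, action). A condition is a dict of
# buffer slot values that must all match; an action updates buffer slots.
productions = [
    ("attend-button",
     {"goal": {"state": "start"}, "visual": {"object": "button"}},
     {"goal": {"state": "attending"}}),
    ("click-button",
     {"goal": {"state": "attending"}},
     {"goal": {"state": "done"}}),
]

def matches(condition, buffers):
    """True if every requested buffer slot holds the requested value."""
    return all(buffers.get(buf, {}).get(slot) == value
               for buf, slots in condition.items()
               for slot, value in slots.items())

def run(buffers, productions, max_cycles=10):
    for cycle in range(max_cycles):
        fired = next((p for p in productions if matches(p[1], buffers)), None)
        if fired is None:
            break  # no production matches: the model halts
        name, _, action = fired
        print(f"cycle {cycle}: firing {name}")
        for buf, slots in action.items():
            buffers.setdefault(buf, {}).update(slots)
    return buffers

run(buffers, productions)
# cycle 0: firing attend-button
# cycle 1: firing click-button
</syntaxhighlight>

In real ACT-R each cycle additionally carries a time cost (roughly 50 ms per production firing), which is what makes these models usable for the kind of performance estimates discussed under Agenda Items.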
Reading:
* [http://act-r.psy.cmu.edu/publications/index.php CMU CogModeling Reading List] --- starting point for a collection of modeling papers; scroll down for the list of PDFs
* [http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1221372&isnumber=27434&tag=1 Real Time Eye Tracking for Human-Computer Interfaces] --- Amarnag et al., 2003
* [http://www.computer.org/portal/web/csdl/doi/10.1109/IV.2003.1218045 CAEVA: Cognitive Architecture to Evaluate Visualization Applications] --- Juarez-Espinosa, 2003
== Device Manufacturers ==
{| class="wikitable"
! Manufacturer !! URLs !! Notes
|-
| Tobii
| [http://www.tobii.com tobii.com]
demos:
[http://www.tobiiglasses.com/scientificresearch glasses demo],
[http://www.youtube.com/watch?v=tpLUkKN3AWE usability testing with Tobii],
[http://www.youtube.com/watch?v=NBIjWA8CHls gaze interaction]
| products include: glasses, a sensor integrated into 17" displays, and independent sensors.
They actually have an [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers app store] for disseminating projects built with the Tobii API.
|- | |||
| Mirametrix | |||
| [http://www.mirametrix.com mirametrix.com] | |||
demos: | |||
[http://www.youtube.com/watch?v=iZbwunb1NQg set-up], | |||
[http://www.mirametrix.com/s1-eye-tracker.html overview+web browsing demo] | |||
| webcam-style sensor | |||
|- | |||
| SensoMotoric Instruments (SMI) | |||
| [http://www.smivision.com/en/gaze-and-eye-tracking-systems/ smivision.com] | |||
demos: | |||
[http://www.youtube.com/watch?v=gxuQzfx-H0E Head-mounted], | |||
[http://www.youtube.com/watch?v=0hGAUCGmPzE&feature=related multi-user heatmaps], | |||
[http://www.youtube.com/watch?v=Tr3Nh7ApRb8&NR=1 devices set-up] | |||
| head-mounted and webcam-style tracking devices | |||
|-
| LC Technologies
| [http://www.eyegaze.com eyegaze.com]
demos:
[http://www.youtube.com/watch?v=SCBvAfCUTyk overview]
| not clear to me whether this is a combined sensor-and-display unit
|}
Other:
* [http://www.cogain.org/wiki/Eye_Trackers List of eye-trackers compiled at COGAIN]
== Agenda Items ==
Things to decide:
# How much enthusiasm is there for investing in eye-tracking?
#* Can we enumerate some experiment ideas? Are they useful and novel?
#* David's is high, but we all know how much he'll do...
#** Radu's proposed experiment (nudging) and Caroline's (multiview) might produce new hypotheses faster and more reliably if the collected data includes eye focus (see the logging sketch below this list)
#** Cognitive modeling to estimate performance in brain-diagram use would likely have more to work with if the acquired data includes eye tracking
# Head-mounted system, or a device that sits in front of the display?
#* Will we use this in the CAVE or with mobile devices, or just on desktops?
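One practical note on the experiment ideas above: gaze data only helps Radu's and Caroline's studies if fixations can be aligned with the interaction log afterwards, which means both streams need to share one clock. Below is a minimal sketch of such a combined log; the field names are hypothetical and a real tracker SDK would supply the fixation events.

<syntaxhighlight lang="python">
# Sketch of a combined experiment log: interaction events and gaze
# fixations share one monotonic clock so they can be aligned in analysis.
# Field names are hypothetical; the tracker SDK would supply the fixations.
import csv
import time

LOG_PATH = "session_log.csv"
FIELDS = ["t_ms", "source", "kind", "x", "y", "detail"]

def now_ms(start=time.monotonic()):
    """Milliseconds since the session started, from one monotonic clock."""
    return (time.monotonic() - start) * 1000.0

def log_row(writer, source, kind, x=None, y=None, detail=""):
    writer.writerow({"t_ms": round(now_ms(), 1), "source": source,
                     "kind": kind, "x": x, "y": y, "detail": detail})

with open(LOG_PATH, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # UI code would call this from its event handlers...
    log_row(writer, "ui", "click", x=412, y=288, detail="nudge-accept")
    # ...and the eye-tracker callback would add fixations on the same clock.
    log_row(writer, "eye", "fixation", x=405, y=290, detail="dur=180ms")
</syntaxhighlight>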
[[Category:Projects]]