CRTypist UI
CRTypist is a computational-rationality model that simulates how people type on touchscreens, capturing the trade-off between speed and accuracy. The model predicts individual typing behavior and supports the optimization of layout designs. This demo visualizes the model's simulated typing.
Link to code: https://osf.io/qphs7/overview
Eyeformer UI
Eyeformer is a Transformer-based model for predicting personalized eye scanpaths. It learns user-specific gaze patterns via reinforcement learning, improving scanpath prediction for chart- and GUI-perception tasks. This demo generates predicted eye scanpaths over images.
Link to code: https://github.com/YueJiang-nj/EyeFormer-UIST2024
WigglyEyes UI
WigglyEyes is a method for inferring eye-movement patterns from typing behavior, using keypress timing and sequence data. It enables eye-tracking inference without cameras, bridging typing and visual-attention analysis. This demo shows inferred eye-movement patterns under varying typing speed and noise.
Link to code: https://github.com/quintus0505/wigglyeyes
Chartist UI
Chartist is a computational model of eye-movement control during chart reading. The model predicts fixations and saccades based on visual salience and task goals, simulating how users extract information from bar, line, and scatter plots. This demo shows the predicted eye-movement scanpath over a chart for a given task.
Link to code: https://osf.io/bas39/
Pedestrian crossing
Pedestrian crossing presents a model of pedestrian crossing decisions based on the theory of computational rationality. Crossing decisions are assumed to be boundedly optimal, with bounds on optimality arising from human cognitive constraints.
