Modelling attention levels using microsaccade rates in response to vibrations in the peripheral field of vision
Keywords:
Peripheral vision field, Microsaccade, Visual attention, Latent resource, Bayesian model
Abstract
Viewers’ eye movements and behavioural responses were analysed to determine the relationship between selective perception and visual attention during a dual detection task in the central and peripheral fields of vision, with the aim of designing better-functioning information displays. Changes in visual attention levels were evaluated using temporal microsaccade rates. The response accuracy of stimulus detection and the microsaccade rate were analysed using a hierarchical Bayesian modelling technique. In the results, the dominance of responses to stimuli in the peripheral field of vision was confirmed as deviations in the estimated parameters. Chronological changes in levels of attention, and the contribution of these changes to behavioural responses, were also examined. The relationships between behavioural responses, the microsaccade rate, and the directional dominance of certain viewing areas in the peripheral field of vision were discussed in order to evaluate viewers’ levels of visual attention.
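To illustrate the kind of analysis described above, the following is a minimal, hypothetical sketch, not the authors’ actual model, of a hierarchical Bayesian model in which a latent per-participant attention level jointly drives a Poisson-distributed microsaccade count and a Bernoulli-distributed detection response. It assumes the PyMC library; all variable names, priors, and the placeholder data are illustrative assumptions.

```python
# Illustrative sketch only: a hierarchical Bayesian model linking a latent
# attention level to microsaccade counts and detection accuracy.
# Priors, names, and data below are assumptions, not the paper's model.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_subj, n_trial = 12, 40
subj = np.repeat(np.arange(n_subj), n_trial)          # participant index per trial

# Placeholder data: microsaccade counts per trial window and hit/miss responses.
counts = rng.poisson(3, size=n_subj * n_trial)
hits = rng.integers(0, 2, size=n_subj * n_trial)

with pm.Model() as model:
    # Group-level (hyper) parameters for the latent attention level.
    mu_att = pm.Normal("mu_att", 0.0, 1.0)
    sd_att = pm.HalfNormal("sd_att", 1.0)
    # Per-participant latent attention level.
    att = pm.Normal("att", mu_att, sd_att, shape=n_subj)

    # Microsaccade rate: log-link from the latent level.
    alpha_ms = pm.Normal("alpha_ms", 1.0, 1.0)         # baseline log rate
    beta_ms = pm.Normal("beta_ms", 0.0, 1.0)
    lam = pm.math.exp(alpha_ms + beta_ms * att[subj])
    pm.Poisson("ms_count", mu=lam, observed=counts)

    # Detection accuracy: logit-link from the same latent level.
    alpha_acc = pm.Normal("alpha_acc", 0.0, 1.0)       # baseline log-odds of a hit
    beta_acc = pm.Normal("beta_acc", 0.0, 1.0)
    p_hit = pm.math.sigmoid(alpha_acc + beta_acc * att[subj])
    pm.Bernoulli("hit", p=p_hit, observed=hits)

    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9)
```

In a sketch of this kind, posterior deviations of the coupling parameters (here beta_ms and beta_acc) would indicate how strongly the shared latent level is expressed in each observed measure.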
License
Copyright (c) 2024 Minoru Nakayama, Takahiro Ueno

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium so long as the original work is properly cited.