The Science Behind ULTIMEYES®

ULTIMEYES® optimizes visual processing to reduce blurring. Proprietary algorithms monitor your performance and adapt to it, creating a customized session to ensure optimal progress.
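The proprietary adaptation algorithm is not published, but performance-adaptive difficulty in perceptual-learning research is commonly implemented as a staircase procedure. The sketch below shows a generic 3-down-1-up contrast staircase; the class name, step sizes, and starting values are illustrative assumptions, not ULTIMEYES® internals.

```python
class Staircase:
    """Generic 3-down-1-up staircase (illustrative only): lower the
    stimulus contrast (harder) after 3 consecutive correct responses,
    raise it (easier) after any incorrect response."""

    def __init__(self, level=1.0, step=0.1, floor=0.05):
        self.level = level        # stimulus contrast, 0..1
        self.step = step          # adjustment per rule trigger
        self.floor = floor        # minimum presentable contrast
        self._correct_run = 0     # consecutive correct responses

    def update(self, correct):
        """Record one trial outcome and return the next contrast level."""
        if correct:
            self._correct_run += 1
            if self._correct_run == 3:    # 3-down: make the task harder
                self.level = max(self.floor, self.level - self.step)
                self._correct_run = 0
        else:                             # 1-up: make the task easier
            self.level = min(1.0, self.level + self.step)
            self._correct_run = 0
        return self.level
```

A 3-down-1-up rule converges on the contrast at which the player answers roughly 79% of trials correctly, keeping the task challenging without becoming discouraging.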

Numerous scientific studies conducted over more than a decade support the principles upon which ULTIMEYES® was created.

ULTIMEYES® is the result of a collaboration between vision science and entertainment software development to improve how you see. ULTIMEYES® tailors itself to your unique abilities and is designed to improve visual acuity, contrast sensitivity, and attention, yielding an overall improvement in your vision. The patent-pending methods of perceptual learning established by Dr. Aaron Seitz, a renowned expert in the field, combined with interactive gaming dynamics proven to engage players, produce high levels of sustained focus and, in turn, results.

How It Works

ULTIMEYES® strengthens how the brain processes the visual input from the eyes. Patent-pending neuroplasticity technology synchronizes task reinforcement with the appropriate stimuli to improve brain plasticity and vision. ULTIMEYES® pairs this breakthrough science with popular game dynamics that heighten engagement and provide the positive reinforcement required to drive progress. In addition, combined audio and visual stimuli ensure that brain plasticity is maximized.

ULTIMEYES® is designed from the ground up to incorporate theory-driven, empirically supported approaches to vision training into an entertaining video game, building on already-proven components and adding:

  • alerting and orienting cues (sounds spatially co-located with visual targets)
  • training of executive attention (distractors progressively become more similar to task targets)
  • tasks designed to help with sustained attention (exercises become progressively longer with time)
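The last two progression rules can be sketched as a simple session schedule: distractor similarity ramps up to exercise executive attention, and exercise length ramps up to train sustained attention. The function name, parameter names, ranges, and rates below are assumptions for illustration; they are not taken from ULTIMEYES® itself.

```python
def session_params(session_index, max_sessions=25):
    """Map a session number to hypothetical difficulty settings,
    ramping both progression rules linearly across the program."""
    progress = min(session_index / max_sessions, 1.0)
    return {
        # 0.0 = distractors obviously different from targets,
        # 0.9 = nearly identical (hardest discrimination)
        "distractor_similarity": 0.2 + 0.7 * progress,
        # exercise length grows from 4 to 10 minutes
        "exercise_minutes": 4 + round(6 * progress),
    }

# Early, middle, and final sessions of the assumed 25-session program:
for i in (0, 12, 25):
    print(i, session_params(i))
```

Tying both ramps to a single progress variable keeps the two attention demands increasing together, so no one session jumps in difficulty on more than one axis at once.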

These approaches, combining multisensory stimuli, motivating tasks, and consistent reinforcement of the training stimuli as found in a well-designed video game, are our way of creating a positive outcome for the user.

References

  1. Deveau, J., Lovcik, G. & Seitz, A. R. Broad-based visual benefits from training with an integrated perceptual-learning video game. Vision Res, doi: 10.1016/j.visres.2013.12.015, (2014).
  2. Seitz, A. R. & Dinse, H. R. A common framework for perceptual learning. Curr Opin Neurobiol 17, 148-153, (2007).
  3. Levi, D. M. & Li, R. W. Perceptual learning as a potential treatment for amblyopia: a mini-review. Vision Res 49, 2535-2549, (2009).
  4. Seitz, A. R., Kim, R. & Shams, L. Sound facilitates visual learning. Curr Biol 16, 1422-1427 (2006).
  5. Polat, U. Making perceptual learning practical to improve visual functions. Vision Res 49, 2566-2573, (2009).
  6. Seitz, A. R. & Watanabe, T. The phenomenon of task-irrelevant perceptual learning. Vision Res 49, 2604-2610, (2009).
  7. Beste, C., Wascher, E., Gunturkun, O. & Dinse, H. R. Improvement and impairment of visually guided behavior through LTP- and LTD-like exposure-based visual learning. Curr Biol 21, 876-882, (2011).
  8. Xiao, L. Q. et al. Complete Transfer of Perceptual Learning across Retinal Locations Enabled by Double Training. Curr Biol 18, 1922-1926, (2008).
  9. Green, C. S. & Bavelier, D. Action video game modifies visual selective attention. Nature 423, 534-537 (2003).
  10. Polat, U. et al. Training the brain to overcome the effect of aging on the human eye. Scientific reports 2, 278, (2012).
  11. Baker, C. I., Peli, E., Knouf, N. & Kanwisher, N. G. Reorganization of visual processing in macular degeneration. J Neurosci 25, 614-618 (2005).
  12. Huxlin, K. R. et al. Perceptual relearning of complex visual motion after V1 damage in humans. J Neurosci 29, 3981-3991, (2009).
  13. Vaina, L. M. & Gross, C. G. Perceptual deficits in patients with impaired recognition of biological motion after temporal lobe lesions. Proc Natl Acad Sci U S A 101, 16947-16951, (2004).
  14. Ostrovsky, Y., Andalman, A. & Sinha, P. Vision following extended congenital blindness. Psychol Sci 17, 1009-1014, (2006).
  15. Li, R. W., Klein, S. A. & Levi, D. M. Prolonged perceptual learning of positional acuity in adult amblyopia: perceptual template retuning dynamics. J Neurosci 28, 14223-14229, (2008).
  16. Seitz, A. R., Kim, D. & Watanabe, T. Rewards evoke learning of unconsciously processed visual stimuli in adult humans. Neuron 61, 700-707, (2009).
  17. Shams, L. & Seitz, A. R. Benefits of multisensory learning. Trends Cogn Sci, (2008).
  18. Kim, R. S., Seitz, A. R. & Shams, L. Benefits of stimulus congruency for multisensory facilitation of visual learning. PLoS ONE 3, e1532 (2008).
  19. Zhang, J. Y. et al. Stimulus coding rules for perceptual learning. PLoS Biol 6, e197 (2008).