
Mohan Ji

Credentials: Ph.D. Student, Psychology

LUCID Research Project: Visual Perception

Research Advisor: Bas Rokers

 

I am interested in the neural mechanisms of visual perception, especially 3D motion perception; that is, how our brains process motion information when objects move in 3-dimensional space. Research investigating motion perception in humans is usually conducted using 2D motion as a template: objects moving on a flat surface. However, in the real world, objects usually move in 3-dimensional space. We believe the way the brain processes 3D motion information differs from the way it processes 2D motion information. This gap between 2D and 3D motion hinders our ability to understand how we perceive object motion in the real world. I aim to close this gap by investigating 3D motion perception using a combination of behavioral (psychophysics) and neuroimaging experiments.

The accurate perception of object motion is critical to survival. Different visual motion signals are available in the real world, and these signals are integrated to form the full percept. However, which signals are used, and how they are combined, is still poorly understood. A current project I am working on investigates how monocular cues (information available to each eye alone) and binocular cues (the combination of retinal signals from the two eyes) contribute to the perception of 3D motion. The results will help us better understand how different visual motion cues are integrated to form the full 3D motion percept and provide the opportunity to investigate the neural processes underlying 3D motion.
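One common way the cue-combination literature formalizes this kind of integration is reliability-weighted (maximum-likelihood) averaging, in which each cue is weighted by its inverse variance. The sketch below illustrates that general idea only; the function names and numbers are hypothetical and are not taken from this project.

```python
def combine_cues(est_monocular, var_monocular, est_binocular, var_binocular):
    """Reliability-weighted combination of two cue estimates.

    Each cue's weight is proportional to its reliability (inverse variance),
    so the more reliable cue dominates the combined percept.
    """
    w_mono = (1.0 / var_monocular) / (1.0 / var_monocular + 1.0 / var_binocular)
    w_bino = 1.0 - w_mono
    combined_estimate = w_mono * est_monocular + w_bino * est_binocular
    # The combined estimate is less variable than either cue alone.
    combined_variance = 1.0 / (1.0 / var_monocular + 1.0 / var_binocular)
    return combined_estimate, combined_variance

# Illustrative numbers: a noisy monocular estimate of approach speed (10 cm/s)
# and a more reliable binocular estimate (12 cm/s).
speed, variance = combine_cues(est_monocular=10.0, var_monocular=4.0,
                               est_binocular=12.0, var_binocular=1.0)
print(f"combined estimate: {speed:.1f} cm/s, variance: {variance:.2f}")
```

Under this model, the combined estimate (11.6 cm/s here) is pulled toward the more reliable cue, which is one behavioral signature psychophysics experiments can test for.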

I am also interested in how advances in technology improve our ability to present visual stimuli that better reflect reality. Virtual Reality (VR) technologies have made it possible to present 3D visual stimuli in a more immersive and realistic way. Conventional behavioral studies investigating 3D motion perception typically use stereoscopes or 3D shutter glasses to simulate depth on a computer monitor. The problem with these methods is that observers are still viewing stimuli on a flat screen, and it is reasonable to think they expect the stimuli to be moving on a flat surface. VR displays, on the other hand, address this problem by placing observers in a far more immersive environment. They also provide additional cues, such as motion parallax, by updating the display based on head motion. In a current project, we aim to uncover how different visual motion signals are integrated in a VR environment and what benefits the additional cues offered by VR technology provide.
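The motion parallax cue mentioned above can be illustrated with a minimal pinhole-projection sketch: when the head translates, nearby points shift farther across the image than distant points, and a head-tracked VR display supplies exactly this depth-dependent shift by re-rendering the scene for the new head position. The function names and distances here are illustrative assumptions, not part of any specific experiment.

```python
def projected_x(point_x, point_z, eye_x, focal_length=1.0):
    """Horizontal image position of a point under a simple pinhole projection.

    point_z is the point's distance from the observer along the line of sight,
    eye_x is the observer's lateral head position.
    """
    return focal_length * (point_x - eye_x) / point_z

def parallax_shift(point_x, point_z, head_translation):
    """Image shift produced when the head translates sideways.

    Nearby points shift more than distant ones; this is the motion parallax
    cue that head-tracked VR adds on top of binocular disparity.
    """
    return projected_x(point_x, point_z, head_translation) - projected_x(point_x, point_z, 0.0)

# A near object (0.5 m away) versus a far object (5 m away),
# with a 5 cm sideways head movement.
print(parallax_shift(point_x=0.0, point_z=0.5, head_translation=0.05))  # large image shift
print(parallax_shift(point_x=0.0, point_z=5.0, head_translation=0.05))  # ~10x smaller shift
```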

Conference Proceedings

Ji M., Thompson L.W., Rosenberg A., Rokers B. (2017). Elucidating Neural Computations Underlying 3D Motion Perception. eLUCID8. Madison, WI.

Thompson L.W., Ji M., Rosenberg A., Rokers B. (2017). The contributions of monocular and binocular cues to the perception of 3D motion. Vision Sciences Society. St. Petersburg, FL.

Katyal S., He S., Ji M., He B., Peterson G., Engel S.A. (2016). A population of neurons that signal interocular conflict in human visual cortex [abstract]. Program No. 360.06. Neuroscience Meeting Planner. San Diego, CA: Society for Neuroscience. Online.