Mingli Liang Abstracts

Mingli Liang

Ph.D. Candidate

Cognitive Sciences

 

59th Society for Psychophysiology Research Annual Meeting

Washington, D.C.

September 25-29th, 2019

 

Most approaches to understanding human spatial navigation involve navigation with desktop computers, which provide an incomplete sampling of the rich set of body-based cues involved in navigation. In addition, many such studies involve invasive recordings from the human brain, which are limited to clinical situations. Here, we present a novel experimental set-up combining wireless non-invasive neural recordings and virtual reality with free ambulation to investigate the neural correlates of spatial and temporal information during navigation. In our first study, we found that frontal-midline delta-theta oscillations increased during movement compared to standing-still periods, suggesting a role for cortical low-frequency oscillations in navigation. In a follow-up study, we aim to identify the precise drivers of such low-frequency oscillations. Participants navigate a virtual plus maze with four possible targets. Before searching for a target, they enter a teleporter inside the maze, and the spatial and temporal characteristics of the teleporter cue participants about where to go next. Analyses will test whether frontal low-frequency oscillations code spatial and temporal information during navigation. Together, our novel approach provides an unprecedented opportunity to record from the healthy human brain during navigation that approximates real-world experience, offering insight into how low-frequency oscillations code aspects of space and time during navigation.

 

Abstract for Lay Audience

Neurons are the basic computational units of the brain, and they communicate by releasing neurotransmitters (chemical substances). The summed activity of large populations of neurons produces an electrical signal (a voltage). Traditionally, this signal is detected by placing sensors inside the brain (usually in epilepsy patients, for medical reasons). But the signal is sometimes powerful enough to be picked up even after traveling through the skull and scalp. Such electrical signals can be interpreted as the input and output of neural computation.


How our brain supports spatial navigation is still a matter of heated debate (for example, how do we robustly estimate distance, time, and speed, and integrate them into a coherent whole?). Past investigations have successfully linked the coding of speed and distance to fluctuations in cellular voltages. However, those investigations required placing sensors (electrodes) beneath the skull (electrocorticography, ECoG) or inside deep brain structures, which inevitably causes damage and restricts such recordings to clinical settings.


Electroencephalography (EEG) is a tool that measures the summed electrical signal at the human scalp, using a net of sensors distributed over the scalp. It is non-invasive, safe, and inexpensive compared with ECoG. In the current investigation, we developed a virtual reality environment that selectively emphasizes distance or time information during navigation. We asked 19 healthy college students to navigate a large-scale virtual city while we wirelessly monitored their EEG. The virtual environment contains teleporters that transport people a certain distance ahead. We manipulated two characteristics of the teleporters: how far participants were teleported, and how long they stayed inside the teleporter. This way, we can conceptually disentangle time and space information. We plan to analyze the characteristics of the EEG voltage fluctuations (considered as a neural pattern) and test whether a neural pattern is consistently correlated with distance coding or temporal coding. The current study will therefore shed light on the neural mechanisms of spatial navigation.


The current study is innovative in three aspects. 1) We combine the latest virtual reality technology with wireless EEG monitoring, making possible what was previously impossible. Recent developments in virtual reality allow users to virtually “explore” a large-scale environment by walking on a stationary treadmill, and new wireless EEG recording technology frees the participant from the wires and desk-mounted amplifiers that conventional EEG requires. 2) We investigate the role of time processing in human spatial navigation, whose importance has long been underappreciated. Research on spatial navigation has emphasized the role of distance/metric information; but just as in memory, we may need to know both “where” and “when” to explore a new environment! 3) The project might inspire new interventions for disorientation during navigation. The study develops a novel method of detecting brain patterns related to information coding while navigating a virtual environment. One future possibility is to investigate whether such brain patterns predict how well people learn and remember a new space! Best of all, the wireless EEG recordings pose no harm to users, so the study could inspire industrial applications for studying the brain and space.

Last updated 24 Feb 2020