TITLE: Signal Processing and Analysis of Noisy Eye Position Sensor Data
INVESTIGATORS:
Jennifer Raymond (1)
Brian Angeles (1)
Sriram Jayabal (1)
(1) Department of Neurobiology
DATE: Wednesday, 17 May 2023
TIME: 1:30–3:00 PM
LOCATION: Conference Room X399, Medical School Office Building, 1265 Welch Road, Stanford, CA
ABSTRACT
The Data Studio Workshop brings together a biomedical investigator with a group of experts for an in-depth session to solicit advice about statistical and study design issues that arise while planning or conducting a research project. This week, the investigator(s) will discuss the following project with the group.
INTRODUCTION
Our lab measures eye velocity responses to visual and vestibular stimuli, how those responses are modified by learning, and the neural underpinnings of that learning.
BACKGROUND
A key function of the brain is to learn about the statistical relationships between events in the world. A mechanism of this learning is associative neural plasticity, controlled by the timing between neural events. Here, we show that experience can dramatically alter the timing rules governing associative plasticity to match the constraints of a particular circuit and behavior, thereby improving learning. In normal mice, the timing requirements for associative plasticity in the oculomotor cerebellum are precisely matched to the 120 ms delay for visual feedback about behavioral errors. This task-specific specialization of the timing rules for plasticity is acquired through experience; in dark-reared mice that had never experienced visual feedback about oculomotor errors, plasticity defaulted to a coincidence-based rule. Computational modeling suggests two broad strategies for implementing this Adaptive Tuning of the Timing Rules for Associative Plasticity (ATTRAP), which tune plasticity to different features of the statistics of neural activity. The modeling predicts a critical role of this process in optimizing the accuracy of temporal credit assignment during learning; consistent with this, behavioral experiments revealed a delay in the timing of learned eye movements in mice lacking experience-dependent tuning of the timing rules for plasticity. ATTRAP provides a powerful mechanism for matching the timing contingencies for associative plasticity to the functional requirements of a particular circuit and learning task, thereby providing a candidate neural mechanism for meta-learning.
METHODOLOGY
We have previously collected eye position data at a sampling rate of 1 kHz. The data correspond to the eye movements of an animal being rotated sinusoidally, 180 degrees clockwise and counterclockwise, at a rate of 1 Hz. Training protocols are conducted to either increase or decrease the magnitude of the eye's sinusoidal motion. We then differentiate the eye position signals to extract the corresponding velocity traces and, using the stimulus signal as a reference, we would like to compute the average eye velocity trace over a single period of the 1 Hz stimulus oscillation. A minimal sketch of this cycle-averaging step appears below.
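As a concrete illustration of the cycle-averaging step, here is a minimal Python/NumPy sketch. The variable names and the use of upward zero-crossings of the stimulus to mark cycle boundaries are our assumptions for illustration, not the lab's actual pipeline.

    import numpy as np

    FS = 1000          # sampling rate, Hz (1 kHz, as in the recordings)
    STIM_FREQ = 1.0    # stimulus frequency, Hz
    PERIOD = int(FS / STIM_FREQ)   # samples per stimulus cycle (1000)

    def cycle_average(eye_velocity, stimulus):
        """Average an eye velocity trace over repetitions of the 1 Hz stimulus.

        eye_velocity : 1-D array of eye velocity, sampled at FS
        stimulus     : 1-D array of the stimulus signal, same length,
                       used as the phase reference for cycle boundaries
        """
        # Upward zero-crossings of the stimulus mark the start of each cycle.
        signs = np.sign(stimulus)
        starts = np.where((signs[:-1] <= 0) & (signs[1:] > 0))[0] + 1

        # Stack all complete cycles and average across them.
        cycles = [eye_velocity[s:s + PERIOD]
                  for s in starts if s + PERIOD <= len(eye_velocity)]
        return np.mean(cycles, axis=0)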
STATISTICAL ISSUES
- Is there a better, more principled way to characterize the timing of the eye movement response to sinusoidal stimuli? In much of our previous work, we have fit the eye velocity responses with a sinusoid and reported the amplitude and phase of the fit (a least-squares sketch of such a fit follows this list). But we are now seeing interesting timing effects that are not captured by the sinusoidal fits. In the bottom panel of Fig. 2H of the bioRxiv preprint, we simply plotted the time (ms) of the absolute peak of the learned eye movement trace for each mouse (calculated as the average eye velocity response across ~40 stimulus repetitions post-training minus the average pre-training eye velocity response).
- Unfortunately, our processed and filtered data are still quite noisy, and differentiating noisy data only makes matters worse. We typically apply a lowpass Butterworth filter to the position data, use a windowed Savitzky-Golay filter to obtain the corresponding velocity trace, and then apply a custom saccade detection/removal algorithm (a sketch of such a pipeline also follows this list). We would now like to explore other methods for obtaining a cleaner velocity trace from our noisy position data, with minimal effect on the temporal and amplitude information in the data.
- As time allows, we would also appreciate advice about several aspects of the pre-processing steps used to compute eye velocity:
- methods for digital differentiation and filtering of raw eye position-related signals to obtain eye velocity
- identification of eye saccades (brief, discrete high velocity/acceleration eye movement events, which we exclude from the analysis) vs. lower frequency continuous “smooth” eye movements and noise
- elimination of very high frequency noise, which appears in the raw data as a single, occasional wayward 1 ms sample in an otherwise smooth raw trace of eye position as a function of time (a despiking and saccade-masking sketch appears after this list)
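On the first question, a useful baseline is the sinusoidal fit itself: with the frequency fixed at the 1 Hz stimulus frequency, amplitude and phase can be recovered by linear least squares. The sketch below is a generic formulation under that assumption, not the lab's actual fitting code.

    import numpy as np

    def fit_sinusoid(velocity, fs=1000, freq=1.0):
        """Least-squares fit of A*sin(2*pi*freq*t + phi) + offset.

        With the frequency fixed, the model a*sin + b*cos + c is linear
        in (a, b, c), so no iterative optimizer is needed. Returns
        (amplitude, phase in radians, offset); the phase converts to a
        time lag in ms as phase / (2*pi*freq) * 1000.
        """
        t = np.arange(len(velocity)) / fs
        w = 2 * np.pi * freq
        design = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
        (a, b, c), *_ = np.linalg.lstsq(design, velocity, rcond=None)
        return np.hypot(a, b), np.arctan2(b, a), c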
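For the filtering and differentiation steps, the following sketch strings together SciPy's standard Butterworth and Savitzky-Golay implementations in the spirit of the pipeline described above. The cutoff, window, and polynomial order are illustrative placeholders to be tuned, and filtfilt is used so that filtering adds no phase shift that would distort timing.

    import numpy as np
    from scipy.signal import butter, filtfilt, savgol_filter

    FS = 1000  # sampling rate, Hz

    def position_to_velocity(position, cutoff_hz=50, sg_window=21, sg_order=3):
        """Lowpass-filter eye position, then differentiate with Savitzky-Golay.

        cutoff_hz, sg_window, and sg_order are illustrative defaults; in
        practice they should be tuned so the passband preserves the eye
        movement dynamics of interest.
        """
        # Zero-phase Butterworth lowpass: filtfilt runs the filter forward
        # and backward, cancelling any phase (timing) shift.
        b, a = butter(4, cutoff_hz / (FS / 2), btype="low")
        pos_filt = filtfilt(b, a, position)

        # Savitzky-Golay with deriv=1 fits a local polynomial in each window
        # and returns its first derivative, i.e. smoothed velocity
        # (delta=1/FS puts it in position units per second).
        return savgol_filter(pos_filt, sg_window, sg_order,
                             deriv=1, delta=1.0 / FS)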
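For the last two items, one common approach is a short median filter to remove isolated single-sample outliers, followed by a velocity threshold with padding to mark saccades for exclusion. The threshold and padding values below are illustrative assumptions, not the lab's settings.

    import numpy as np
    from scipy.signal import medfilt
    from scipy.ndimage import binary_dilation

    FS = 1000  # sampling rate, Hz

    def despike(position, kernel=3):
        """Remove isolated single-sample outliers with a short median filter.

        A 3-sample median replaces a lone wayward 1 ms sample with the
        median of its neighbors while leaving smooth segments nearly intact.
        """
        return medfilt(position, kernel_size=kernel)

    def saccade_mask(velocity, threshold=100.0, pad_ms=20):
        """Boolean mask of samples belonging to putative saccades.

        threshold is in velocity units (e.g., deg/s) and pad_ms widens the
        mask around each detection to cover saccade onset and offset; both
        are placeholders to be tuned. Masked samples can then be excluded
        from the cycle average or interpolated over.
        """
        above = np.abs(velocity) > threshold
        pad = int(pad_ms * FS / 1000)
        return binary_dilation(above, structure=np.ones(2 * pad + 1, dtype=bool))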
ZOOM MEETING INFORMATION
Join from PC, Mac, Linux, iOS or Android:
https://stanford.zoom.us/j/91706399349?pwd=UXFlclNkakpmZC9WVWwrK244T2FwUT09
Password: 130209
Or iPhone one-tap (US Toll):
+18333021536,,91706399349# or
+16507249799,,91706399349#
Or Telephone:
Dial: +1 650 724 9799 (US, Canada, Caribbean Toll) or
+1 833 302 1536 (US, Canada, Caribbean Toll Free)
Meeting ID: 917 0639 9349
Password: 130209
International numbers available: https://stanford.zoom.us/u/abKRNREFBK