May 31, 2023


TITLE: Signal Processing and Analysis of Noisy Eye Position Sensor Data

INVESTIGATORS:

Jennifer Raymond (1)

Brian Angeles (1)

Sriram Jayabal (1)

Department of Neurobiology

TIME: 1:30–3:00 PM

LOCATION: Conference Room X399, Medical School Office Building, 1265 Welch Road, Stanford, CA

ABSTRACT

The Data Studio Workshop brings together a biomedical investigator with a group of experts for an in-depth session to solicit advice about statistical and study design issues that arise while planning or conducting a research project. This week, the investigator(s) will discuss the following project with the group.

INTRODUCTION

Our lab measures eye velocity responses to visual and vestibular stimuli, how those responses are modified by training and learning, and their neural underpinnings. The previous workshop generated great ideas about methods for characterizing the timing of our eye velocity cycle averages. Unfortunately, time constraints prevented us from presenting our questions and challenges regarding pre-processing of the eye position data: handling noise and artifacts in the eye position recordings and differentiating to obtain eye velocity.

BACKGROUND

Our lab is interested in understanding the algorithms that the brain uses to learn. To do so, we use oculomotor learning (learned changes in the eye movement responses to visual and vestibular sensory stimuli) as an experimental behavioral model, owing to its simplicity and its experimental and analytical tractability. We collect eye position data from mice using a magnetic sensing method developed in the lab, as they track a moving visual stimulus or counter-rotate their eyes during head rotation (a vestibular stimulus). We can train the mice to alter the amplitude or timing of the eye movement responses. We would like to optimize the methods we use to pre-process the raw eye position data and to quantify the amplitude and timing of the eye movement responses.

METHODOLOGY

Horizontal eye position time-series data are acquired from magnetic sensors at a sampling rate of 1000 Hz and then undergo multiple processing steps:

  1. From the raw position data, 1 ms (single sample) transient noise artifact spikes are removed by applying Laplace smoothing (i.e., replacing the center point of each spike with the average of its two nearest neighbors).
  2. A 9th-order, zero-phase lowpass Butterworth filter is applied with a cutoff frequency between 15 and 30 Hz, chosen on a mouse-by-mouse basis.
  3. The corresponding eye velocity trace (first derivative) of each block is approximated using a Savitzky-Golay filter over a 30-ms (i.e., 30-sample) window.
  4. Saccades (brief, discrete, high-velocity/acceleration eye movement events, which we exclude from the analysis) and other unwanted artifacts in the eye position recordings (caused by electrical noise or body movements/vibrations) are removed by velocity thresholding, which involves computing the squared difference between the velocity trace and its corresponding 1 Hz sinusoidal fit, and removing the sample points where the squared difference exceeds a set threshold value.
  5. Velocity cycle averages are then computed over a single sinusoidal stimulus cycle.
  6. We typically average the eye velocity responses across stimulus repetitions and then calculate the difference between the post- and pre-training velocity averages to quantify the learned change in eye movement behavior in each session/mouse. We then conduct statistical tests comparing different populations of mice and/or different kinds of training.
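Steps 1–5 above can be sketched roughly as follows. This is an illustrative reimplementation in SciPy, not the lab's actual code: the 20 Hz cutoff, the spike and residual thresholds, and the use of a 1 Hz stimulus frequency are assumed placeholder values (the announcement only gives a 15–30 Hz cutoff range and a 1 Hz sinusoidal fit), and `savgol_filter` requires an odd window, so 31 samples stands in for the 30-ms window.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, savgol_filter

FS = 1000      # sampling rate (Hz)
F_STIM = 1.0   # stimulus frequency (Hz); assumed from the 1 Hz sinusoidal fit

def remove_spikes(pos, thresh):
    """Step 1: replace single-sample transients with the mean of their two neighbors."""
    out = pos.copy()
    neighbor_mean = 0.5 * (pos[:-2] + pos[2:])
    spikes = np.where(np.abs(pos[1:-1] - neighbor_mean) > thresh)[0] + 1
    out[spikes] = neighbor_mean[spikes - 1]
    return out

def lowpass_position(pos, cutoff_hz=20.0, order=9):
    """Step 2: zero-phase Butterworth lowpass (second-order sections for stability)."""
    sos = butter(order, cutoff_hz, btype="low", fs=FS, output="sos")
    return sosfiltfilt(sos, pos)

def velocity(pos, window=31, polyorder=3):
    """Step 3: Savitzky-Golay differentiation; 31 samples ~ 30 ms at 1 kHz."""
    return savgol_filter(pos, window, polyorder, deriv=1, delta=1.0 / FS)

def desaccade(vel, t, resid_thresh):
    """Step 4: mask samples whose squared residual from a 1 Hz sinusoidal fit is large."""
    X = np.column_stack([np.sin(2 * np.pi * F_STIM * t),
                         np.cos(2 * np.pi * F_STIM * t),
                         np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, vel, rcond=None)
    resid2 = (vel - X @ coef) ** 2
    clean = vel.astype(float).copy()
    clean[resid2 > resid_thresh] = np.nan   # excluded from the cycle average
    return clean

def cycle_average(vel):
    """Step 5: average across whole stimulus cycles, ignoring masked samples."""
    n_cycle = int(FS / F_STIM)
    n = (len(vel) // n_cycle) * n_cycle
    return np.nanmean(vel[:n].reshape(-1, n_cycle), axis=0)
```

On synthetic data (a 10-degree, 1 Hz sinusoidal eye position with an injected spike), this pipeline recovers a cycle-averaged velocity whose peak is close to the analytic value of 2π·10 ≈ 62.8 deg/s.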

STATISTICAL ISSUES

We would like advice regarding several aspects of the pre-processing steps used to compute eye velocity from noisy positional data.

  1. Recommendations for eliminating the 1 ms high-frequency transient noise found in our raw position data, which we currently remove via interpolation.
  2. Importance of the order of pre-processing steps (e.g., application of a lowpass filter on the raw position signal before or after its differentiation).
  3. Methods for filtering and differentiation of raw eye position-related signals to remove the noise without affecting the eye movement signal.
  4. Best approaches for the detection and removal of eye saccades and unwanted motion artifacts from the eye velocity data.
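On issue 2: lowpass filtering and differentiation are both linear, time-invariant operations, so in the ideal case their order does not matter away from the record edges; the order becomes consequential only once nonlinear steps (spike removal, de-saccading) are interleaved. A small numerical check of this, under the same assumed filter parameters (9th-order zero-phase Butterworth, 20 Hz cutoff; a simple central-difference derivative stands in for the Savitzky-Golay step):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 1000  # sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(6 * FS) / FS
# Synthetic noisy position: 1 Hz tracking signal plus broadband sensor noise
pos = 10.0 * np.sin(2 * np.pi * t) + 0.5 * rng.standard_normal(t.size)

# Assumed filter: 9th-order zero-phase Butterworth, 20 Hz cutoff
sos = butter(9, 20.0, btype="low", fs=FS, output="sos")

v_filter_first = np.gradient(sosfiltfilt(sos, pos), 1.0 / FS)  # filter, then differentiate
v_diff_first = sosfiltfilt(sos, np.gradient(pos, 1.0 / FS))    # differentiate, then filter

# Away from the record edges the two orderings agree to high precision
core = slice(1000, -1000)
max_err = np.max(np.abs(v_filter_first[core] - v_diff_first[core]))
```

The residual disagreement is confined to the edges (from `np.gradient`'s one-sided endpoint differences and `sosfiltfilt`'s padding), which suggests the practically important question is where the nonlinear artifact-removal steps sit relative to the linear ones, not the ordering of the linear steps themselves.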

ZOOM MEETING INFORMATION

Join from PC, Mac, Linux, iOS or Android:

https://stanford.zoom.us/j/91706399349?pwd=UXFlclNkakpmZC9WVWwrK244T2FwUT09

Password: 130209

Or iPhone one-tap (US Toll):

+18333021536,,91706399349# or

+16507249799,,91706399349#

Or Telephone:

Dial:  +1 650 724 9799 (US, Canada, Caribbean Toll) or

+1 833 302 1536 (US, Canada, Caribbean Toll Free)

 

Meeting ID: 917 0639 9349

Password: 130209

International numbers available: https://stanford.zoom.us/u/abKRNREFBK


SIP: 91706399349@zoomcrc.com

Password: 130209