
ENH: basic support for eye tracker data #10751

Closed
dominikwelke opened this issue Jun 13, 2022 · 2 comments

@dominikwelke
Contributor

dominikwelke commented Jun 13, 2022

Describe the new feature or enhancement

hi all,

for the 2022 sprint I suggested adding basic support for eye tracking data to MNE.
these data could be useful for several purposes, from improving data cleaning/IC labelling to all kinds of new analyses such as saccade-related potentials.

@drammock suggested posting this issue ahead of time to discuss the topic.
I'd be happy to hear your feedback!

Describe your proposed implementation

there are several steps that I think would be necessary for a minimal implementation.

  • deciding how to store these data in the MNE ecosystem
    e.g. simply as a misc channel type alongside the other sensor data?
  • handling of different sampling frequencies of the devices and missing data
    (as recording periods might only imperfectly overlap)
  • basic I/O tools
    loading raw eye tracking data and/or annotations (blinks, saccades) from different file formats (at least from text/csv files)
  • alignment methods
    e.g. based on common annotations/trigger
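to make the I/O bullet concrete, here is a minimal, hypothetical sketch of loading gaze samples from a text/csv file. the column names (`time`, `x`, `y`, `pupil`) are made up for illustration; real tracker exports (EyeLink, Tobii, ...) use different layouts. missing samples (e.g. during blinks) are returned as NaN so downstream code can mask them:

```python
import csv
import io
import math

def read_gaze_csv(file_obj, missing_value=""):
    """Parse a simple gaze recording with columns time,x,y,pupil.

    Hypothetical example format, not a real tracker's export layout.
    Missing samples (e.g. during blinks) become NaN.
    """
    times, gaze = [], []
    for row in csv.DictReader(file_obj):
        times.append(float(row["time"]))
        sample = []
        for col in ("x", "y", "pupil"):
            val = row[col]
            sample.append(float(val) if val != missing_value else math.nan)
        gaze.append(sample)
    return times, gaze

# example usage with an in-memory file; the middle row is a blink
demo = io.StringIO(
    "time,x,y,pupil\n"
    "0.000,512.3,384.1,1200\n"
    "0.002,,,\n"
    "0.004,514.0,385.2,1210\n"
)
times, gaze = read_gaze_csv(demo)
```

from here the samples could be wrapped into an MNE Raw object with whatever channel type is decided on above.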

a fuller (but still basic) version could also do (parts of) the following:

  • allow identifying events from the raw eye track
    most importantly blinks and saccades (e.g. via the Engbert & Mergenthaler algorithm)
  • basic visualization tools
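for illustration, the velocity-threshold idea behind Engbert & Kliegl / Engbert & Mergenthaler style saccade detection could look roughly like this. this is a simplified pure-Python sketch (central-difference velocities, a median-based robust SD estimate, an elliptic threshold at `lam` SDs), not a faithful implementation of the published algorithm:

```python
import math

def detect_saccades(x, y, sfreq, lam=6.0, min_samples=3):
    """Velocity-threshold saccade detection, simplified sketch."""
    n = len(x)
    vx, vy = [0.0] * n, [0.0] * n
    for i in range(1, n - 1):
        # central-difference velocity estimate in samples/s units
        vx[i] = (x[i + 1] - x[i - 1]) * sfreq / 2.0
        vy[i] = (y[i + 1] - y[i - 1]) * sfreq / 2.0

    def msd(v):
        # median-based SD estimator, robust to the saccades themselves
        med = sorted(v)[len(v) // 2]
        med_sq = sorted(u * u for u in v)[len(v) // 2]
        return math.sqrt(max(med_sq - med * med, 1e-12))

    sx, sy = lam * msd(vx), lam * msd(vy)
    above = [(vx[i] / sx) ** 2 + (vy[i] / sy) ** 2 > 1.0 for i in range(n)]

    # group consecutive supra-threshold samples into candidate saccades
    saccades, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                saccades.append((start, i))
            start = None
    if start is not None and n - start >= min_samples:
        saccades.append((start, n))
    return saccades

# synthetic trace at 1000 Hz: fixation, 10-sample saccade, fixation
x = [0.0] * 50 + [float(i) for i in range(1, 11)] + [10.0] * 40
y = [0.0] * 100
saccade_spans = detect_saccades(x, y, sfreq=1000.0)
```

the detected spans could then be attached to the recording as MNE Annotations.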

...and then of course there would be a host of specific tools the gaze information might be used for.
this might be beyond the scope of this issue, though.

Describe possible alternatives

...not even opening this box ;)

Additional comments

@larsoner
Member

> deciding how to store these data in the MNE ecosystem e.g. simply as a misc channel type alongside the other sensor data?

+1 for a new channel type in the FIF standard

> handling of different sampling frequencies of the devices and missing data (as recording periods might only imperfectly overlap) ... alignment methods e.g. based on common annotations/trigger

Already handled by https://mne.tools/stable/generated/mne.preprocessing.realign_raw.html
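for context, the core idea such realignment builds on (estimating clock drift and offset between two devices from shared trigger times via least squares, then resampling accordingly) can be illustrated like this. this is a hypothetical sketch of the concept, not MNE's actual `realign_raw` code:

```python
def estimate_clock_mapping(t_raw, t_other):
    """Least-squares fit t_other ~ slope * t_raw + offset from shared
    trigger times; slope != 1 indicates clock drift between devices.
    Illustration only, not MNE's implementation.
    """
    n = len(t_raw)
    mean_r = sum(t_raw) / n
    mean_o = sum(t_other) / n
    cov = sum((r - mean_r) * (o - mean_o) for r, o in zip(t_raw, t_other))
    var = sum((r - mean_r) ** 2 for r in t_raw)
    slope = cov / var
    offset = mean_o - slope * mean_r
    return slope, offset

# e.g. the eye tracker clock runs 0.1% fast and starts 2 s later
t_eeg = [1.0, 5.0, 9.0, 13.0]
t_eye = [2.0 + 1.001 * t for t in t_eeg]
slope, offset = estimate_clock_mapping(t_eeg, t_eye)
```

with slope and offset in hand, one stream can be resampled and shifted onto the other's clock, which is what `realign_raw` does for Raw objects.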

> basic I/O tools loading raw eye tracking data and/or annotations (blinks, saccades) from different file formats (at least from text/csv files)

I would maybe start by adapting the I/O from https://github.com/pyeparse/pyeparse, which isn't really maintained anymore (and should be deprecated in favor of proper MNE-Python support).

@larsoner
Member

larsoner commented Oct 1, 2023

I think we can close this as we have some basic support now, feel free to open new issues for additional functionality!

@larsoner larsoner closed this as completed Oct 1, 2023