Describe the new feature or enhancement

Hi all,

for the 2022 sprint I suggested adding basic support for eye-tracking data to MNE. These data could be useful for several purposes, from improving data cleaning and IC labelling to all kinds of new analyses such as saccade-related potentials. @drammock suggested posting this issue ahead of time to discuss the topic. I'd be happy to hear your feedback!
Describe your proposed implementation
There are several steps that I think would be necessary for a minimal implementation:

- deciding how to store these data in the MNE ecosystem, e.g. simply as a misc channel type alongside the other sensor data
- handling different sampling frequencies across devices and missing data (as recording periods might only imperfectly overlap)
- basic I/O tools: loading raw eye-tracking data and/or annotations (blinks, saccades) from different file formats (at least from text/csv files)
- alignment methods, e.g. based on common annotations/triggers
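To make the resampling/alignment point concrete, here is a minimal sketch in plain NumPy (all function and parameter names here are hypothetical, not an existing MNE API) of shifting the eye-tracker clock onto the EEG clock via a shared trigger and interpolating onto the EEG timebase, with non-overlapping periods marked as missing:

```python
import numpy as np

def align_eyetrack_to_eeg(et_times, et_gaze, eeg_times, et_trigger_t, eeg_trigger_t):
    """Align eye-tracking samples to an EEG timebase (a sketch).

    et_times : (n,) eye-tracker sample times, in seconds
    et_gaze  : (n,) gaze values (e.g. horizontal position)
    eeg_times: (m,) EEG sample times
    et_trigger_t, eeg_trigger_t : time of a shared trigger in each clock
    """
    # Shift the eye-tracker clock so the shared trigger coincides
    shifted = et_times - et_trigger_t + eeg_trigger_t
    # Linearly interpolate onto the EEG timebase; mark periods with no
    # eye-tracking coverage as missing (NaN) instead of extrapolating
    out = np.interp(eeg_times, shifted, et_gaze)
    out[(eeg_times < shifted[0]) | (eeg_times > shifted[-1])] = np.nan
    return out

# Example: 250 Hz eye tracker vs. 1000 Hz EEG, shared trigger at
# different times in the two clocks
et_t = np.arange(0, 2, 1 / 250.0)
eeg_t = np.arange(0, 3, 1 / 1000.0)
gaze_x = np.sin(2 * np.pi * et_t)
aligned = align_eyetrack_to_eeg(et_t, gaze_x, eeg_t,
                                et_trigger_t=0.0, eeg_trigger_t=0.5)
```

Here the EEG recording starts before and ends after the eye-tracking period, so the first and last stretches of `aligned` come out as NaN rather than extrapolated values.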
A fuller (but still basic) version could also do (parts of) the following:

- identify events from the raw eye track, most importantly blinks and saccades (e.g. via the Engbert & Mergenthaler algorithm)
- basic visualization tools
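As a rough illustration of the detection step, below is a simplified velocity-threshold sketch loosely following the Engbert & Mergenthaler approach (5-point velocity estimate, median-based robust threshold, elliptic criterion). This is an unvalidated sketch under my own assumptions, not a reference implementation:

```python
import numpy as np

def detect_saccades(x, y, sfreq, lam=6.0, min_samples=3):
    """Velocity-threshold saccade detection, loosely following
    Engbert & Mergenthaler.  Returns (start, stop) sample indices."""
    dt = 1.0 / sfreq

    def velocity(s):
        # 5-point moving-average velocity estimate; edges left at zero
        v = np.zeros_like(s)
        v[2:-2] = (s[4:] + s[3:-1] - s[1:-3] - s[:-4]) / (6 * dt)
        return v

    def msd(v):
        # Median-based (robust) estimate of the velocity spread
        return np.sqrt(np.median(v ** 2) - np.median(v) ** 2)

    vx, vy = velocity(x), velocity(y)
    eta_x, eta_y = lam * msd(vx), lam * msd(vy)
    # Samples outside the elliptic threshold count as saccadic
    crit = (vx / eta_x) ** 2 + (vy / eta_y) ** 2 > 1.0
    # Group consecutive above-threshold samples into events
    events, start = [], None
    for i, c in enumerate(crit):
        if c and start is None:
            start = i
        elif not c and start is not None:
            if i - start >= min_samples:
                events.append((start, i - 1))
            start = None
    if start is not None and len(crit) - start >= min_samples:
        events.append((start, len(crit) - 1))
    return events

# Synthetic trace: fixation, a fast 20 ms gaze shift, fixation again
rng = np.random.default_rng(0)
sfreq = 500.0
fix1 = rng.normal(0, 0.01, 250)
ramp = np.linspace(0, 5, 10) + rng.normal(0, 0.01, 10)
fix2 = 5 + rng.normal(0, 0.01, 250)
x = np.concatenate([fix1, ramp, fix2])
y = rng.normal(0, 0.01, x.size)
events = detect_saccades(x, y, sfreq)
```

With these (made-up) parameters the synthetic gaze shift around sample 250 is picked up as a saccade while the noisy fixation periods are not.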
... and then of course there would be a host of specific tools the gaze information might be used for. That might be beyond the scope of this issue, though.
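For the annotation-loading part of the I/O step, a first cut could be as simple as parsing a CSV of labelled events into onset/duration/label lists (the file format and function name here are made up for illustration); the resulting lists map directly onto `mne.Annotations`:

```python
import csv
import io

# A hypothetical exported event file: one row per detected event
csv_text = """onset,duration,label
12.500,0.120,blink
13.010,0.040,saccade
15.320,0.110,blink
"""

def read_eyetrack_events(fileobj, keep=("blink", "saccade")):
    """Parse (onset, duration, label) rows from a CSV of eye events."""
    onsets, durations, labels = [], [], []
    for row in csv.DictReader(fileobj):
        if row["label"] in keep:
            onsets.append(float(row["onset"]))
            durations.append(float(row["duration"]))
            labels.append(row["label"])
    return onsets, durations, labels

onsets, durations, labels = read_eyetrack_events(io.StringIO(csv_text))
# These three lists could then be passed to
# mne.Annotations(onset=onsets, duration=durations, description=labels)
```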
Describe possible alternatives

.. not even opening this box ;)


> deciding how to store these data in the MNE ecosystem, e.g. simply as a misc channel type alongside the other sensor data?

+1 for a new channel type in the FIF standard

> handling different sampling frequencies across devices and missing data (as recording periods might only imperfectly overlap) ... alignment methods, e.g. based on common annotations/triggers

> basic I/O tools: loading raw eye-tracking data and/or annotations (blinks, saccades) from different file formats (at least from text/csv files)

I would maybe start by adapting the I/O from https://github.com/pyeparse/pyeparse, which isn't really maintained anymore (and should be deprecated in favor of proper MNE-Python support).