AI Deep Dive: Training a Transformer Model on EEGs
Sep. 29, 2022—On Friday, September 30 at 1:00pm, the Data Science Institute will host Prof. Sasha Key for a discussion of applying transformer deep learning models to the analysis of multichannel EEG recorded under multiple stimulus/recording conditions (e.g., faces vs. objects, speech vs. nonspeech, attend vs. ignore). Transformers are powerful sequence learners, and can be...
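As a rough illustration of why transformers suit this setting (a sketch under assumed shapes, not the speaker's actual method): an EEG epoch is a sequence of per-time-step channel vectors, and self-attention lets every time step weight every other time step in the epoch. A minimal single-head scaled dot-product attention pass in NumPy, with hypothetical dimensions:

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over one sequence.
    x: (T, C) — T time steps, C channels (e.g., EEG samples x electrodes)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])           # (T, T) pairwise scores
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)         # softmax over time steps
    return weights @ v                                # (T, d) context vectors

rng = np.random.default_rng(0)
T, C, d = 128, 32, 16             # hypothetical: 128 samples, 32 electrodes
x = rng.standard_normal((T, C))   # one simulated EEG epoch
wq, wk, wv = (rng.standard_normal((C, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)                  # (128, 16)
```

In a full model, stacks of such attention layers (plus feed-forward layers and positional encodings) feed a classification head that separates the stimulus conditions.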
Understanding “Authorship” of the Torah (05/06/22)
Apr. 5, 2022—About: Dr. Phil Lieberman, Jewish Studies. Research Context: The Torah was transcribed in medieval times, when scholars recorded only the consonants. This can produce ambiguity; consider the New York Times example — if you read the letters "sh rd ths," they could resolve to many different phrases once the vowels are added....
Multimodal Neuroimaging Data (04/15/22)
Apr. 5, 2022—About: The goal of this project is to apply deep learning to paired EEG-MRI data in order to make MRI predictions from EEG alone. The project currently has ~40 subjects with paired MRI-EEG data (collected separately but with the same task design), which will grow to ~250 subjects over the next several years.
Student Teacher Interaction Analytics (04/01/22)
Apr. 5, 2022—About: The Education and Brain Science Research Lab is beginning the Student Teacher Interaction Analytics (STiA) project to determine the relationship between executive-function language used by teachers during reading instruction and students' reading outcomes. Executive function (EF) is a set of cognitive controls that supports planning, monitoring, and executing our behaviors and actions...