Fundamentals of EEG based Brain-Computer Interfaces
Formal Metadata
Title: Fundamentals of EEG based Brain-Computer Interfaces
Number of Parts: 254
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/53044 (DOI)
Transcript: English (auto-generated)
00:20
Okay, so
00:22
Maybe you have always wondered how you could do Jedi mind tricks with a computer, and that's exactly why we are here now. So Gnudi is going to tell you about the fundamentals of EEG-based brain-computer interfaces. He's always been fascinated with the human brain, and
00:43
he is a researcher in that scope. I give the stage to you. Hello, the reason why I'm giving this talk is that recently there have been developments in electroencephalography.
01:01
EEG was developed about 100 years ago and has been used in research and in medicine as well, but we now have consumer-grade EEG headsets as well as some open hardware projects
01:20
aiming to develop EEG headsets. There I have a picture of the Emotiv EPOC, which I think was the first consumer-grade EEG headset. And actually, I think the aim of the OpenBCI project is to
01:40
get cheap research-grade hardware. I'm not going to explain too much about the devices; I want to talk a bit about how we can use EEG readings to
02:01
build a brain-computer interface. A brain-computer interface typically consists of a user having a task. The task can be thinking, for example, to provide some input, or, if
02:22
it's used to drive an electric wheelchair, for example, it could be the thought to go forward. The EEG signal has to be acquired; I'm not going to talk about that. I'm focusing more on the preprocessing of the data and the feature extraction.
02:41
Classification can generally be done with all kinds of classifiers; popular are support vector machines. But good feature extraction is essential. We cannot really do machine learning approaches where we learn the features, because we typically do not have very much training data.
03:04
Doing EEG experiments with human subjects takes a lot of time, and also the data might contain private information, so often the data sets are not made public.
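As a rough illustration of this handcrafted-feature approach, band power per frequency band and channel could be fed to a support vector machine. This is a minimal sketch assuming SciPy and scikit-learn, a 128 Hz sampling rate, and trial arrays shaped (trials, channels, samples); none of these details are specified in the talk:

```python
# Sketch: hand-crafted band-power features plus an SVM classifier
# (assumes trials shaped (n_trials, n_channels, n_samples))
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128                                   # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(trials):
    """Average spectral power per band and channel -> one feature vector per trial."""
    feats = []
    for trial in trials:                               # trial: (channels, samples)
        freqs, psd = welch(trial, fs=FS, nperseg=FS)   # psd: (channels, n_freqs)
        feats.append(np.concatenate([
            psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
            for lo, hi in BANDS.values()
        ]))
    return np.array(feats)

def make_bci_classifier():
    """SVM on standardized band-power features, trained on the few labelled trials available."""
    return make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Usage (hypothetical data):
#   clf = make_bci_classifier()
#   clf.fit(band_power_features(train_trials), train_labels)   # e.g. "rest" vs "move forward"
```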
03:24
So, yeah, that's what I'm going to talk about mainly. But generally, after classification we have an output translation. It can be a virtual keyboard or something, and there can be, optionally, a feedback that allows the user to train the brain-computer interface.
03:43
Here I'm showing you the timeline of an EEG signal. This was a resting-state experiment, so the subject was just resting, doing nothing, with eyes open. You can't see very much in the timeline; it
04:01
looks quite random: random oscillations, quite low. Generally the signals are in the microvolt range. So one of the first steps that you can do is a time-frequency analysis. What we have here is 14 seconds of
04:21
EEG and 14 electrodes, the 14 channels that the Emotiv EPOC EEG headset has. And here we also have 14 seconds, and this is basically computing
04:43
spectra for different time slots, so you see the development of the spectrum over time. And what you see here is one of the things that makes it difficult: most of the signal power is in the range below 5 Hertz.
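A minimal sketch of such a time-frequency analysis, assuming SciPy's spectrogram with one-second windows on a single channel (the exact windowing used for the plot is not stated in the talk):

```python
# Sketch: spectrogram of one EEG channel (e.g. 14 s of data at an assumed 128 Hz)
import numpy as np
from scipy.signal import spectrogram

FS = 128                       # sampling rate in Hz (assumed)

def time_frequency(channel, fs=FS):
    """channel: 1-D array of raw samples for one electrode.
    Returns power spectra for overlapping 1-second windows."""
    freqs, times, power = spectrogram(channel, fs=fs,
                                      nperseg=fs, noverlap=fs // 2)
    return freqs, times, power   # power has shape (n_freqs, n_windows)
```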
05:07
The different frequency ranges are typically associated with, for example, some states of mind, like sleep states. Actually, it's quite easy, even with the timeline, to
05:22
see which sleep state someone is in. Important is the alpha band. That's something that should be in the plot that I showed before as well.
05:41
We couldn't really see that in the timeline. If I had actually instructed the subject to have the eyes closed instead of open, we would have seen oscillations in that range on the electrodes that are above the visual cortex, because it would go to an idle state, as if nothing is seen, and we would have more power in there.
06:09
If we are doing EEG experiments, we typically look into changes, because we have this huge random noise signal where we have basically no idea what it means.
06:24
We typically have experiments where we look at at least two different states. So we define something as a baseline, and then we typically look at what changes. For example, we have a resting-state task for the subject, and then the task is to think,
06:46
maybe, the thought, the command to move the wheelchair. What we see here is again from the same EEG recording. What I did was, I used
07:03
two seconds before, which are not plotted here but look about the same, computed the average, and divided it all by that. We call that a baseline correction. So now
07:20
it's easier to see changes here in the other areas. So having a baseline is something that you normally do in EEG experiments. We also have some other problems besides the general background noise that we have:
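A minimal sketch of that baseline correction, assuming spectrogram arrays as produced in the previous sketch and a separately computed baseline segment:

```python
# Sketch: baseline correction of a spectrogram by the average power of a baseline segment
import numpy as np

def baseline_correct(power, baseline_power):
    """power, baseline_power: (n_freqs, n_windows) arrays from the spectrogram step.
    Divide each frequency bin by its average power during the baseline period."""
    baseline = baseline_power.mean(axis=1, keepdims=True)   # (n_freqs, 1)
    return power / baseline     # values > 1 mean more power than during the baseline
```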
07:44
it's artifacts. Here is a similar timeline; the difference is that the subject was instructed to blink in intervals. You see that there are some huge peaks, especially on the lower and the higher lines. That's because,
08:03
according to the 10-20 system, the electrodes are numbered around the head, so on the top and on the bottom it's basically the electrodes on the forehead. So what we see there is eye artifacts. Actually, to see artifacts,
08:23
that's quite easy in the EEG timeline. The problem is getting rid of them if we don't want to have them. So we typically start by instructing subjects not to blink and not to move. Also a problem can be having, for example, the 50 Hertz power grid,
08:47
which can also be an artifact in the signal; therefore we typically have filters at that frequency. There are different approaches to get rid of the artifacts. The simplest one is cutting out of the data
09:08
the parts with artifacts, but this basically means we have to repeat the experiment, to have it several times. But we are doing that anyway.
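A minimal sketch of these two cleanup steps: a 50 Hz notch filter and a simple peak-to-peak amplitude threshold for throwing away blink-contaminated segments. The threshold value and array shapes are assumptions, not values from the talk:

```python
# Sketch: 50 Hz notch filter plus amplitude-based rejection of artifact segments
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 128            # sampling rate in Hz (assumed)

def remove_line_noise(eeg, fs=FS, line_freq=50.0, quality=30.0):
    """eeg: (channels, samples). Notch out the power-grid frequency."""
    b, a = iirnotch(line_freq, quality, fs=fs)
    return filtfilt(b, a, eeg, axis=1)

def reject_artifacts(epochs, threshold_uv=100.0):
    """epochs: (n_epochs, channels, samples) in microvolts.
    Drop epochs whose peak-to-peak amplitude exceeds the threshold (e.g. blinks)."""
    ptp = epochs.max(axis=2) - epochs.min(axis=2)        # (n_epochs, channels)
    keep = (ptp < threshold_uv).all(axis=1)
    return epochs[keep]
```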
09:23
One approach for brain-computer interfaces is event-related potentials. If we have an event, which could be that a subject is shown a picture or any other stimulus,
09:45
and we repeat that, and then average over all the repetitions of showing this image, then all the random noise will cancel out, and what we are left with is
10:02
the EEG component that actually depends on the processing of this image, and that we call an event-related potential. This is just an example; typically, it doesn't look that nice.
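A minimal sketch of this epoching and averaging, assuming continuous EEG shaped (channels, samples) and known stimulus-onset sample indices:

```python
# Sketch: event-related potential (ERP) by averaging epochs time-locked to a stimulus
import numpy as np

def extract_epochs(eeg, event_samples, fs=128, tmin=-0.2, tmax=0.8):
    """Cut fixed-length windows around each event onset from continuous EEG (channels, samples)."""
    start, stop = int(tmin * fs), int(tmax * fs)
    return np.stack([eeg[:, s + start : s + stop] for s in event_samples])

def compute_erp(epochs):
    """epochs: (n_repetitions, channels, samples).
    Averaging cancels the random background EEG; the stimulus-locked component remains."""
    return epochs.mean(axis=0)          # (channels, samples)
```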
10:21
We typically count the peaks, so we have three positive peaks here; we call them P1, P2, P3. And the P3 is also called P300, because it's about 300 milliseconds after the stimulus. That is something that we use for brain-computer interfaces,
10:43
because there's the so-called oddball paradigm: the P300 is only there if something is relevant to your task and it's not happening very often. So the P300 speller, which is something like a virtual keyboard,
11:01
unfortunately, I don't have an animation, but the rows and columns would light up at some speed. If you want to type a letter, you would focus on that letter, and when it lights up, that is a rare and relevant event to you,
11:22
because you want to type it. So exactly in that case you will have the P300 in your ERP, and of course the system that is providing the stimuli is recording the timing and then knows which
11:41
letter was actually lit up when this appeared. So this way you can type things, but again, you would need to stare at one letter for a while, because it has to be repeated several times.
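A minimal sketch of how a speller could pick the attended letter once a P300 classifier exists: accumulate a score per row and per column and take the cell where the best row and best column intersect. The scoring function and grid layout here are hypothetical, not from the talk:

```python
# Sketch: pick the attended letter as the row/column whose flashes score highest for P300
import numpy as np

def decode_letter(flash_epochs, flash_ids, p300_score, grid):
    """flash_epochs[i]: EEG epoch for flash i; flash_ids[i]: ("row"|"col", index).
    p300_score(epoch) -> float, e.g. a classifier's decision value.
    grid: list of rows, each a list of letters."""
    row_scores = np.zeros(len(grid))
    col_scores = np.zeros(len(grid[0]))
    for epoch, (kind, idx) in zip(flash_epochs, flash_ids):
        if kind == "row":
            row_scores[idx] += p300_score(epoch)
        else:
            col_scores[idx] += p300_score(epoch)
    return grid[int(row_scores.argmax())][int(col_scores.argmax())]
```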
12:01
So I want to present a more specialized P300-based brain-computer interface, namely an authentication scheme. The idea is we have 100 normal photos, they can be anything, and we select some of them, which are our password.
12:25
So this is the example, and it is actually a set that we used for the experiments: five very different photos. So now I'm doing a small experiment with you. Try to
12:42
remember those photos and try to see them and count them in this video stream.
13:13
So, I'm not asking you to raise your arms, because I can't see you anyway. I guess some of you might have seen all five pictures; some might have counted fewer.
13:25
For the first time, the task is not really easy, but generally, when you counted something in this experiment, you will probably have had this P300 in your brain.
13:43
Those are the results of the experiment. I hope you can read it; it might be a bit too dark. As we have 95 non-target images and only five percent, so five,
14:00
target images, we use the F1 score to evaluate the classifier. We did cross-validation on the data; that's here, where we have the best score. We also trained the classifier, which was simply a linear discriminant analysis,
14:26
just on the experiment done by one person, and then we also tried a general classifier that we trained with other people's data. For that we used the best data sets that we have, the ones that had the highest
14:42
score in the cross-validation. It still works. The interesting thing here is that the classification of the EEG data is possible without tuning the classifier to the user, making the system non-biometric.
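A minimal sketch of this evaluation, assuming scikit-learn's linear discriminant analysis on flattened epochs and cross-validated F1 scoring (the actual features used in the study are not described in the talk):

```python
# Sketch: LDA classifier for target vs non-target epochs, evaluated with cross-validated F1
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def evaluate_p300_classifier(X_epochs, y, folds=5):
    """X_epochs: (n_epochs, channels, samples); y: 1 for target (password) images, 0 otherwise."""
    X = X_epochs.reshape(len(X_epochs), -1)          # flatten each epoch to a feature vector
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    scores = cross_val_score(clf, X, y, cv=folds, scoring="f1")
    return scores.mean()

# A "general" classifier would instead be fit on other subjects' epochs
# and then applied to the current user's data without retraining.
```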
15:04
This is the number of trials that we averaged. So we actually showed 50 of those bursts that you have seen just before; that takes about 20 minutes, and no one wants to use an authentication system where one login takes 20 minutes.
15:22
So we looked at how many bursts we actually need, but it seems like up to 50 it still increases, and there is a huge difference between different sets. The highest line is the top-rated subject, and the others are much lower. So it depends a lot on the subject,
15:43
maybe also on how well the EEG headset fit, and maybe on how well they focused on the task. Yeah, but the biggest effect that we found was the following.
16:03
For an authentication system we want to have permanence, so if we want to log in again after a few months, it should still work. So we had three sessions with some months in between, and actually we got better scores over time. We feared that they might degrade, but it seems there's a training effect.
16:25
Yeah, and the signal is permanent enough. So here is the final score. That's a plot because, even though it's no real biometric system, we are measuring biosignals, and that can go wrong.
16:41
Therefore we have false acceptance rates and false rejection rates, and we come to an equal error rate of about 10%, which is of course too bad, especially when you see that one authentication round takes about 20 minutes.
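As a minimal sketch of how such an equal error rate can be computed from classifier scores, assuming separate arrays of genuine and impostor scores (this is the standard biometric evaluation, not code from the talk):

```python
# Sketch: equal error rate (EER) from genuine and impostor match scores
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep the decision threshold; the EER is where FAR (impostors accepted)
    and FRR (genuine users rejected) are equal."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    far = np.array([(impostor_scores >= t).mean() for t in thresholds])
    frr = np.array([(genuine_scores < t).mean() for t in thresholds])
    idx = np.argmin(np.abs(far - frr))
    return (far[idx] + frr[idx]) / 2.0
```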
17:02
Yeah, so the idea was that, since showing the images is very fast, we could have a very short login time, but it actually didn't really work well enough. And, yeah. There are three minutes left; are there questions?
17:30
Ah, I found my voice again. So indeed we have a bit of time. There are two microphones and the internet, and there's a question at that microphone.
17:42
Hello there. Um, I was wondering, have you heard of this being used to detect terrorists? There was an experiment done where they showed images of things that you're not supposed to know as a regular citizen. So they would show all these innocent images, and then there would also be a blasting cap or the magazine of an AK-47,
18:07
or stuff that you're only supposed to know if you went to a terrorist training camp, and they could do the exact same P300 thing. I thought that was interesting, and of course, if you read about training camps and terrorists, you would fail the test, which would be interesting. Yeah. I
18:26
haven't heard about the application to terrorism, but there is a very similar application in criminal investigations, as a lie detector based on the P300. So basically the same: you are showing some pictures that
18:42
yeah, that only the one who's guilty knows. But there are also some papers about, if you are attacked by this P300-based guilty knowledge test, how to prevent being detected.
19:03
Okay, another question at that microphone. Two small questions: one is, which was the EEG headset used in the image-based authentication, was it the OpenBCI? And then also, is the P300 individualized, like does it need a lot of calibration, or is it something that you can just detect straight away?
19:23
So the EEG headset used for the experiments was the Emotiv EPOC. And the P300 seems to be a bit individual; there are approaches doing biometrics by looking at
19:40
the P300, but our approach was to have a general P300 detector that would work on anyone. That was the difference between the IC, the individual classifier, and the GC, the general classifier. You can see the score difference here, the middle and the right one, and the left one was the cross-validation.
20:02
Okay, is there a question from the internet? I'm looking... no question from the internet. So unfortunately the time is running out, and so I have to ask you to approach him directly. I don't know, you're here, you're at the Congress? Yes, he can be contacted in some way, I guess. Okay, then I would say thanks again for his talk.