OpenBCI Primer
Formal Metadata

Title: OpenBCI Primer
Title of Series: 2015 Spring NuPIC Hackathon
Number of Parts: 19
Author: Conor Russomanno
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/18059 (DOI)
Language: English
2015 Spring NuPIC Hackathon, Part 11 of 19
Transcript: English (auto-generated)
00:12
Conor Russomanno, he's the CEO and co-founder of OpenBCI, which manufactures high-tech jewelry. As you can see, Conor is demonstrating here.
00:22
More of a fashion statement than anything else. Thanks, man. So thanks, guys, for all taking the time and watching me debug the system live. So this is OpenBCI, and this is our really cool logo. So OpenBCI was started by me and my business partner, Joel
00:42
Murphy. So Joel was my physical computing teacher at Parsons. From him I learned how to use an Arduino and learned introductory electronics, and a lot of the coding I know came from working with him. He had started his stint in open source hardware with the Pulse Sensor, which you can see on the upper right.
01:01
It's an optical heart rate monitor for Arduino. And then he was contacted by somebody on an SBIR DARPA grant. And the DARPA grant was to create low-cost EEG for nontraditional users. And then he knew that I had spent my whole time in my master's program doing EEG-related art projects.
01:22
So he reached out to me, and he was like, hey, you want to work with me on this? And I said, absolutely. So then after about three months, we took our first prototype to Maker Faire, and everyone at Maker Faire was like, this is unbelievable. We have to have this. And so we decided to kickstart. So Joel and I, that's the two of us with 3D-printed headsets.
01:42
So what we set out to do is essentially build this piece of electronics, this prototyping tool that could be used by a variety of different people, so artists, researchers, game developers, to extract biosignals from the body and allow you to do anything that you
02:01
want with those signals. And a lot of our, at least a lot of my inspiration came from LEGO, because who here has played with LEGO at some point in their life, and who here actually thinks it shaped the way that they think quite substantially? So LEGO has this amazing model where they give you
02:20
an instructional guide on how to build the castle or how to build the pirate ship, and they give you all these pieces. And then when you're done, you're like, oh, I'm bored with my pirate ship, and you can do whatever you want with it. And so what we decided is we're gonna apply the same approach to turning people into cyborgs. So essentially figure out what pieces you want
02:42
and then stick them together. And so Arduino did a really amazing job. Who here has worked with Arduino before? So for those of you that don't know, Arduino is an electronics prototyping tool, essentially. So it's really great for beginners, because you can hook up an LED and a push button,
03:00
and you can write a few lines of code to push the button to make the LED turn on and off. But then a lot of experienced electrical engineers actually use it as their initial prototyping tool. So they'll build shields for the Arduino that do very powerful things. For instance, our first OpenBCI board was an Arduino shield, and then we took the components that were working and the ones that we wanted,
03:20
and we manufactured our own PCB based on the shield combined with the Arduino. And the great thing about Arduino is that it's programmable, so you can apply code to the microcontroller and have everything operating on hardware and never actually doing anything on the computer. So we wanted to combine all of these aspects
03:41
with biosensing, as you can see. So we ran a Kickstarter in December of 2013, and we were successfully funded. We more than doubled our goal, which was awesome. We weren't sure if we were gonna hit it, but it turns out there are a lot of people that are interested in doing the same thing that we were doing.
04:01
And this is the board that you see in my hand. This is a diagram of the various components. So there are eight bio-potential inputs, eight channels on which you can listen to EEG, ECG, or EMG, so any electrical potential coming out of your body. There's a high-powered analog-to-digital converter,
04:21
so this is 24-bit resolution, with varied sampling rates. The theoretical sampling rate is up to 16 kilohertz, but for EEG, you don't really need more than 250 hertz. For muscle, you generally want a little bit higher, maybe 500 hertz or one kilohertz. There's an accelerometer.
04:41
There's the same microprocessor that the Arduino has, which is the ATmega328P, and then we have a different variation of the board, which is a 32-bit PIC32 microprocessor. And then there's wireless communication and a local SD card. So one of the big issues about sending EEG data over the air is that you run into a bottleneck,
05:00
a data bottleneck, in your wireless communication most of the time. And so we put the local SD card there so that you could write high-resolution data at high sample rates to the SD card and then downsample to get it over the air. That way you can visualize your data live in a visualizer, but you also have all of that data for post-processing at a much higher sample rate.
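To make that concrete, here's a minimal sketch of the downsampling step in Python with scipy. The 1 kHz and 250 Hz rates echo the numbers above; the signal itself is a synthetic stand-in:

```python
import numpy as np
from scipy.signal import decimate

# Stand-in for a high-rate recording written to the SD card:
# 4 seconds of a 10 Hz sine sampled at 1 kHz.
fs_card = 1000
t = np.arange(0, 4, 1 / fs_card)
sd_card_data = np.sin(2 * np.pi * 10 * t)

# Decimate 4x (1 kHz -> 250 Hz), with scipy's built-in anti-aliasing
# filter, to mimic the lower-rate stream sent over the wireless link.
wireless_stream = decimate(sd_card_data, q=4)

print(len(sd_card_data), len(wireless_stream))  # 4000 1000
```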
05:21
So yeah, this is kind of a quick summary of EEG. Your brain and your body give off a lot of analog electricity all the time, and what we do is we take that analog signal and we convert it into a digital signal. And I'm sure many of you are familiar with this, but the higher the resolution of your data packet,
05:42
the closer your digital signal ends up looking to your analog signal. And so we have really high-resolution data packets: 24-bit resolution.
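For a rough sense of what those 24-bit samples mean numerically, here's a sketch of converting raw signed 24-bit counts to microvolts. It assumes an ADS1299-style analog front end with a 4.5 V reference and a gain of 24; treat the constants as illustrative, not as the board's spec:

```python
# Convert signed 24-bit ADC counts to microvolts at the electrode.
V_REF = 4.5              # ADC reference voltage in volts (assumed)
GAIN = 24                # programmable amplifier gain (assumed)
FULL_SCALE = 2**23 - 1   # largest positive signed 24-bit value

def counts_to_microvolts(counts: int) -> float:
    return counts * (V_REF / GAIN / FULL_SCALE) * 1e6

print(counts_to_microvolts(1))      # ~0.022 uV: one count is tiny
print(counts_to_microvolts(44739))  # ~1000 uV, i.e. a 1 mV swing
```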
06:01
So here's a little demo. Your brain is like an ocean of electricity, and when you're working with EEG, you're standing on the beach, essentially, watching waves crash on the beach and trying to make inferences about what's going on inside of the ocean. And so obviously the more beaches that you can stand on at the same time, the better inferences you can make, but ultimately you're watching waves crash on the shore. And so what you have to do is you have to look for outliers.
06:21
You have to look for things that stand out, maybe a succession of high-amplitude waves, and then you try and attribute those to some other contextual information. So if you put a multimeter on a standard battery, over time you'll see a steady voltage. As the battery starts to lose life, the voltage will gradually drop. With EEG, it's very similar to touching the multimeter
06:42
to both ends of the battery, except you're touching the plus side to a point on your noggin, on your head, and the minus side, your reference, to a place that's usually electrically neutral, like your earlobe or your mastoid, which is the bone behind your ear. And then what you'll see is a variable voltage over time. So you'll see this fluctuation, this waveform.
07:02
That's a raw EEG wave on the bottom. So this is an example of wave addition. You see A plus B equals C. With EEG, what we're trying to do is we're trying to look at the raw wave and we're trying to extract frequencies from that raw wave. So when you hear the term brainwave, I'm sure we've all heard the term brainwave,
07:20
but what it's literally referring to is prevalent frequencies, or predetermined bins of frequencies, that we're extracting from a raw signal. So when I close my eyes, and I'll demo this later, my visual cortex will produce an alpha frequency. And you'll see that waveform in the raw signal very pronounced.
07:40
And this is actually what it looks like. So that's one of the early versions of our GUI. If you look at the time domain here and count the little waves per second, you'll notice that there are about 14 high-amplitude waves dominating the signal. And then here's the FFT graph, and at about 14 hertz, you see a spike.
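What the GUI's FFT view is doing can be sketched in a few lines of numpy. The 250 Hz rate and the ~14 Hz alpha peak are the numbers from the talk; the synthetic signal is a stand-in for real EEG:

```python
import numpy as np

fs = 250                          # samples per second
t = np.arange(0, 4, 1 / fs)       # a 4-second window of "EEG"
raw = 20 * np.sin(2 * np.pi * 14 * t) + 5 * np.random.randn(t.size)

# Magnitude spectrum of the raw wave, as in the GUI's FFT graph.
spectrum = np.abs(np.fft.rfft(raw))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

peak_hz = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
print(f"dominant frequency: {peak_hz:.1f} Hz")  # ~14.0
```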
08:02
And the majority of that is in the back of the head. You see the high opacity nodes there. And it's because your visual cortex produces very strong alpha when you close your eyes. So here's a better visualization. This is a newer version of our GUI. So here you see channel two is connected to my forehead,
08:22
and I'm gonna demo this in a second. Actually, whatever, I'll just demo this in a second. So these are our products. We have an eight-bit board, a 32-bit board, and then a 16-channel R&D kit. And then what we've been doing, we put a lot of effort into the hardware and making the hardware extremely modular.
08:42
But we started building SDKs and drivers for openFrameworks, Processing, and a lot of existing signal processing tools for biosensing, so BioExplorer, BCI2000, BrainBay. There's a plethora of these out there, and a lot of our community members have actually already started building these bridges.
09:02
So this is a collage of different projects that people have started implementing. On the upper left is a motor cortex classification. A PhD student in France is trying to classify motor cortex signals and then map them to a simple game where he basically has a binary input
09:20
and he can have a character jump up over, almost like Mario, he just has to avoid the gaps. And his character constantly runs forward and he has to jump over them. Here we have five different people plugged into the same open BCI board, controlling a floating shark flying through a room. And this is a diagram of how it works. So what we do is we have kind of a commander
09:42
who's dictating to the rest of his team which action the shark needs to perform. And by closing your eyes, you produce an alpha. So each person's alpha is mapped to a different trigger of the shark. And so the commander will say forward and then the forward guy will close his eyes and his brainwave will produce the alpha to move forward
10:00
and then they rotate through their respective jobs. But there's other ways of doing this with a single user. So one technique is called SSVEP, which is a Steadied State Visually Evoked Potential. So if you look at flashing frequencies of light, your visual cortex will actually start mimicking those frequencies. And so you can have a screen or blinking LEDs
10:23
at different frequencies and you can have each frequency mapped to a different controller output. So if you look at eight hertz, that's mapped to left. If you look at 12 hertz, that's mapped to right. You close your eyes and produce alpha, it moves the robot forward. And so we've done that as well.
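A toy version of that SSVEP scheme: score each flicker frequency by its spectral power and pick the strongest. The 8 Hz/left and 12 Hz/right mapping comes from the talk; the sampling rate, band width, and test signal are illustrative:

```python
import numpy as np

fs = 250
COMMANDS = {8.0: "left", 12.0: "right"}  # flicker frequency -> action

def classify_ssvep(window: np.ndarray) -> str:
    """Return the command whose flicker frequency dominates the window."""
    freqs = np.fft.rfftfreq(window.size, d=1 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    scores = {
        cmd: power[(freqs > f - 0.5) & (freqs < f + 0.5)].sum()
        for f, cmd in COMMANDS.items()
    }
    return max(scores, key=scores.get)

# Two seconds of visual cortex entrained to the 12 Hz stimulus.
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(t.size)
print(classify_ssvep(eeg))  # "right"
```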
10:41
And then we've also worked on controlling 3D printed prosthetics with EMG signals from the hand. And then 3D printing electrodes. So there are a lot of new materials out now, conductive materials, conductive ABS and PLA, so we're looking at 3D printing the entire EEG headset. These are different prototypes that we've been working on, and here,
11:02
I'll actually turn this on. This is the Ultracortex, which is, you know, one step beyond the neocortex. And so the Ultracortex has a Twitter account, by the way, you guys should all follow it. And we're probably gonna launch a Kickstarter campaign later this summer
11:21
to print and distribute EEG headsets that work with the OpenBCI board. We're also looking at 3D printing everything, every component of the headset, so working with companies like Voxel8, who are looking at metallic inks in addition to new types of filaments, to print electrodes and components
11:42
and then just have the actual EEG hardware, the amplifier, plug into the back, as you see here. So why do we do it? For people like you. And actually, you know, we're big fans of what Numenta is doing, because everything that we do is open source and it's very community driven.
12:01
So since we're an open source company, we basically live or die depending on how well our community supports what we do. And at the same time, we believe that as we move forward as a species, as humanity, we need to make sure that the technology that we build that's gonna be most instrumental in the future of who we are and what we do is transparent and understood by everyone.
12:22
And so that's a lot of what we stand for behind OpenBCI. So that's that, thank you. I'll actually demo the technology. So we have an open source Processing GUI. Processing is a creative coding framework
12:41
based on Java. It's not the fastest framework, but it works well, especially with graphics, and there are a lot of really cool graphics libraries that are native to the system. And so without further ado,
13:01
if you wanna look at this code, it's all online on our GitHub. So here you can select your board, essentially. You can name a data file, so anytime you're logging,
13:20
anytime you're recording data live from the board, it's storing it in a CSV that you can then access for post-processing. You can have an eight-channel board or a 16-channel board. Here you can play back previously recorded files or, if you want to, generate synthetic data.
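Post-processing one of those logged files might look like the sketch below. The '%'-prefixed header lines, the filename, and the column layout (sample index, then eight channel values) are assumptions about the file format, so check your own recording's header:

```python
import csv

samples = []
with open("OpenBCI-RAW-recording.txt") as f:  # hypothetical filename
    for row in csv.reader(f):
        if not row or row[0].startswith("%"):
            continue  # skip metadata/comment lines
        # Assumed layout: row[0] sample index, row[1:9] channel values.
        samples.append([float(v) for v in row[1:9]])

print(f"loaded {len(samples)} samples of 8-channel data")
```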
13:46
If this is all connected correctly and the power's on, cool, that's what you wanna see. So now I'm going in. This is not the entire screen because my screen resized.
14:03
So here, this doesn't make any sense right now because I'm actually connected, uh-oh, yeah. So since I've connected my head, I'm going to demo all three biosignals at the same time.
14:23
So I've got EEG, which is my brain activity. I've got EMG, which is my muscle activity, and ECG, which is my heart activity. So what I have to do, and this is a really cool feature of OpenBCI, is that while the board is actually streaming data, I can reprogram the various channels
14:42
to optimize them for different things. So channel four is listening to a bio-potential measurement from my right forearm to my left wrist, and therefore, since it's sampled across my chest, you see a big ECG spike. I'm not sure why this is cutting out.
15:01
It's probably because I'm touching more pins. Refrain from touching more pins. Hopefully this fixes it. Should be okay.
15:22
So what I'm gonna do is I'm gonna remove channel four from the reference that is being used for my EEG channels, and I'm also gonna remove it from the bias. So the bias is a ground, it's establishing a common ground between the board and my body,
15:41
and in doing so it acts as kind of a souped-up ground: it senses the common-mode noise on all the channels, takes that common-mode noise waveform, inverts it, and sends it back into your body to create destructive interference for that signal.
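The bias drive does that cancellation in hardware. A related software cleanup step (a different technique, not what the bias pin does) is to notch out mains interference, sketched here with scipy:

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 250
b, a = iirnotch(w0=60.0, Q=30, fs=fs)  # 60 Hz mains; use 50 Hz elsewhere

# 10 Hz "EEG" contaminated with a strong 60 Hz hum.
t = np.arange(0, 2, 1 / fs)
noisy = np.sin(2 * np.pi * 10 * t) + 2 * np.sin(2 * np.pi * 60 * t)
clean = filtfilt(b, a, noisy)  # zero-phase filtering, so no time shift

print(noisy.std(), clean.std())  # variance drops once the hum is removed
```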
16:02
Okay, so I'm not gonna force you guys to do that again, but essentially what I'll do is go back to the demonstration here. I wanted to demo this live, but if you guys wanna stick around and watch it afterwards, I think there's just a lot of moving pieces right now.
16:22
So this is what I was going to demo to you, which is the exact setup. Here is heartbeat in channel four, boom boom. Here you see in channel two this huge EMG spike that's a result of me closing my eyes,
16:40
and so when you close your eyes, your visual cortex will produce this alpha waveform, and here you can see it in the red channel around 12 hertz; most people's alpha falls somewhere between nine and 12 hertz. And the various regions of your neocortex behave this way. Your visual cortex
17:02
is a very big part of your brain; your occipital lobe is very big. And so what happens is, when your eyes are open and your visual cortex is processing tons of patterns, decoding and looking at and observing things in the environment,
17:21
your visual cortex basically looks like static, it looks like noise, because everything is doing its own job. But then when you close your eyes, your neurons don't really know what to do, they're kinda just like, oh, we're in chill mode now, and as a result, they don't stop firing, they still need to fire, and so they end up finding each other and firing in synchronization,
17:42
and that starts producing this alpha frequency. And other regions of your brain behave similarly: if you relax every muscle in your body, your motor cortex will begin to produce a mu frequency. And so one technique of applying EEG for interaction
18:02
is to have someone sit very still and then tell them to think about moving some limb or piece of their body. So if they engage their right hand, they don't even actually have to move it, they can just think about moving their right hand; the entire motor cortex will be in a mu frequency, and then that region that's responsible for activating the right hand
18:21
will start looking like static, and so you can build classifiers around that, where you look at right hand versus left hand versus feet. That's a common technique for building interactive tools using EEG.
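A bare-bones version of that classifier idea: compare mu-band power over the two hemispheres, and call the imagined hand from whichever side desynchronizes. Channel placement, band edges, and the decision rule are all illustrative:

```python
import numpy as np

fs = 250

def mu_power(window: np.ndarray) -> float:
    """Power in the ~8-12 Hz mu band of one channel's window."""
    freqs = np.fft.rfftfreq(window.size, d=1 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    return power[(freqs >= 8) & (freqs <= 12)].sum()

def classify_imagery(left_cortex: np.ndarray, right_cortex: np.ndarray) -> str:
    # Imagined right-hand movement suppresses mu over the LEFT motor
    # cortex (it "turns to static"), and vice versa.
    if mu_power(left_cortex) < mu_power(right_cortex):
        return "right hand"
    return "left hand"

# Toy usage: mu suppressed on the left channel -> imagined right hand.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / fs)
left = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
right = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
print(classify_imagery(left, right))  # "right hand"
```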
18:42
But interaction, I would argue, is not the best thing to do with EEG. I think that there are many better ways of building interactive tools, for instance EMG. EMG signals are very easy to produce, even quadriplegics can blink their eyes and grit their teeth, and we've trained ourselves over our lives to have very strong control over these signals. So a lot of the novelty of BCI and EEG,
19:04
you realize, wears off very quickly. And the media does a really bad job for the industry, saying mind control this, mind control that, when really that's not the most practical application of EEG. We're so far away from hitting the plateau
19:22
of non-invasive BCI and understanding what it's capable of, but I think those practical applications are not gonna be interaction, I think it's gonna be understanding brain activity in the context of environmental stimuli and looking at diagnoses for medical diseases
19:42
and things like that, and combining EEG with neurostimulation, like transcranial direct current stimulation, for closed-loop systems. But yeah, I'm bummed that I couldn't get the device working on this whole thing, but if you guys wanna stick around, I'll set it up on my computer and I can show you. And if you have questions, I'll answer them.
20:14
If you close your eyes, you actually see a different type of signal, but if you start thinking about seeing the same thing
20:20
you saw with your eyes open, does the brain start firing the exact same way as when you were actually watching somebody run with your eyes open versus when they're closed? Yes, to a certain degree. It won't be as strong of a signal, but imagined movements will activate the same regions of the motor cortex as actual movements,
20:40
and I'm not a neuroscientist, but I think that is true, that those regions of the brain will be active. So when you initially close your eye, that's just a temporary signal that you're getting because you're closing your eye and resetting, but if you start thinking about the same thing
21:01
that you were seeing, then you may go back to what it used to be before you closed your eye. Hypothetically. Hypothetically, but a construct like that, something that's so specific, is very difficult to detect with EEG. So for instance, a person riding a bike would be almost impossible to pick up with EEG,
21:22
whereas eyes open versus eyes closed is possible, or alertness versus mind-wandering states. So more, I guess you could say, emotions or higher-level mental states, as opposed to very granular patterns like that, which are very difficult. It's kind of like,
21:41
going back to the ocean analogy, it's difficult to look at that tectonic plate at the bottom of the ocean, but you can see big wind patterns influencing waves. But what about athletes that use mental imagery, you know, they see themselves running over and over and over again,
22:00
and then they see their times improving, would that have some kind of effect, would that be able to assist athletes, for example, or people that are dealing with injury? So Red Bull is actually doing a really interesting project right now. It's called Project Endurance,
22:20
and so they've actually been rigging athletes up to EEG, EMG, ECG, and all of these sensors, and then trying to optimize athlete performance, both physically and psychologically, using neurofeedback techniques. So I would look into that project, it's a really cool project. Do you have a sensor somewhere near the motor cortex,
22:46
can you pick up the mu band? Yeah, I mean, I just stuck these electrodes wherever I wanted to, but you could array them all over your head, or you could target the motor cortex specifically. So that's one advantage to our system,
23:02
is that you can essentially build your own straps, or build your own headwear, to target very specific locations. To the previous comment, closing the eyes is only going to affect the alpha, it's only the visual cortex, and the motor cortex is a separate game. But I had another question, which is,
23:22
do you imagine that this would be able to detect an emotional response? In other words, if you ask somebody a question on which they're likely to have a strong opinion, do you think you could infer from the signal whether they have a strong positive response versus a strong negative response?
23:43
There are people claiming to be doing this already with EEG. There are software companies that are developing proprietary software for emotion detection. I think there was a brain-tech company in Israel that was doing this, running people
24:04
through ad campaigns for the election, the recent election in Israel, and they were classifying emotional states and emotional responses to different political ads. So it is happening, and I do think it's possible. It's not easy, though. Even with this level, as opposed to
24:23
classical EEG with whatever the 17 sensors are? Yeah, so I mean, EEG is EEG: if you have a high sample rate, and you're in a good environment with low noise, if you can build a Faraday cage to prevent alternating current and light from influencing the signal, you'll get a good signal
24:42
if you have a high sample rate and good connection. And I think OpenBCI is overpowered for a lot of the applications that it's being used for. Got it, thank you. Sure.
25:04
I read somewhere about some kind of a helmet that the pilot wears that maybe incorporates some of this stuff. Do you know anything about that? Yeah, actually there was a DARPA solicitation, or DOD solicitation, I think, recently about the Air Force wanting to embed EEG
25:22
and maybe even neurostimulation, like TDCS techniques, and have closed-loop systems inside of a fighter pilot helmet. I don't know much more than that, though, but I think it is being researched, so, yeah. Yeah, just to follow up on that,
25:41
I mean, a lot of the research that they're funding is based on the ability to detect when pilots or soldiers fatigue or don't fatigue, the idea being at what point they can really push their soldiers to the limit, at what point can we say, okay, these soldiers or these pilots can make really good decisions despite their sleep debt or fatigue.
26:01
So that's, I think, the bulk of why they funded it and where it's going. Yeah. Yeah, I know that with TDCS, that's a big one. I think there was a study done with drone operators. Who's familiar with TDCS? Transcranial direct current stimulation. So there's a whole field of T-blank-S,
26:22
which is transcranial this or that stimulation. So they're trying out direct current, alternating current, magnetic, random noise, and they're basically just sending electricity into the head and trying to influence brain activity and see what happens. And it's a pretty widespread thing now. There's a whole DIY TDCS subreddit,
26:42
and it's a very active community. There are people that are sticking nine-volt batteries to their head and trying different things out. But one study that was done and apparently found statistically valid results was TDCS used for improved or enhanced performance through essentially closed loop systems to optimize alertness.
27:01
And so they tested TDCS against Ritalin, NuVigil, caffeine, and a control group, and they found TDCS to work more consistently and for longer durations of time than any of the drugs, which was pretty crazy. So yeah, any more questions?
27:28
Sorry? So we don't do any TDCS, and I haven't done much research,
27:40
but I pay attention because it's interesting and terrifying at the same time. But I think the standard acceptable range is two to 20 milliamps. I think most people are not sending more than 20 milliamps, and low-voltage signals, through the brain.
28:01
So your input on the OpenBCI can handle that kind of abuse? We don't do any of this. I mean, oh, you're saying, like, if you were doing the TDCS; yeah, I don't think it would be a problem. Otherwise you'll have to buy two boards, so.
28:24
Cool. Thank you, Conor. Really appreciate it. Thank you, Matt. Cheers. Thanks, guys. Thank you. Thank you.