
Ruby On Robots Using Artoo


Formal Metadata

Title
Ruby On Robots Using Artoo
Number of Parts
50
License
CC Attribution - ShareAlike 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Production Place
Miami Beach, Florida

Content Metadata

Abstract
The robotics revolution has already begun. You can buy drones and robotic devices at local retail stores. Unfortunately, it's hard to develop code for robots, and nearly impossible to create solutions that integrate multiple different kinds of devices. Introducing Artoo, a new robotics framework written in Ruby. Artoo can communicate with many different kinds of hardware devices, and integrate them together. With surprisingly few lines of code, you can write interesting applications that tie together Arduinos, ARDrones, Spheros, and more... even all at the same time! Artoo is based on Celluloid, giving it the ability to support the levels of concurrency that are required for industrial-strength robotic applications. Artoo has a built-in API server, a debugging console, and even shows you how to do Test-Driven Robotics (TDR). The time has come for Ruby-based robotics, and Artoo can help lead the way!
Transcript: English (auto-generated)
Good afternoon, everybody. This is RubyConf 2013. Yeah.
So before we get started, we just want to say a very special thank you to the organizers of Ruby Central and to all the conference staff, the sponsors, and to all of you for being here. Thank you so very much. Let's give a big round of applause for everybody.
So I am deadprogram, aka Ron Evans in the real world. I'm the ringleader of The Hybrid Group. This other guy over here is adzankich, aka Adrian Zankich. He's the serious programming guy at The Hybrid Group. So he does all the work and
I take all the credit. Yeah. I love how that gets applause. So we're with the Hybrid Group. We are a software development consultancy based in sunny Los Angeles, California. And, among other things, we are the creators of KidsRuby. How did you guys like my
new boss this morning? She's awesome, right? The funny part is, you think I'm kidding. Yeah. So, here today, we are here to talk to you about Ruby on robots. Bzzzz. This robot is not with us today.
So let me ask you, is innovation dead? I mean, William Gibson said the future is already here. It's just not very evenly distributed. Isn't that really true? I mean, many of us have been doing web development for years, and yet we've been seeing a very interesting thing happening as we've been creating all these different technologies. We've discovered that innovating is
really hard. I mean, doing something genuinely different. And in fact, it's especially hard when you're dealing with hardware. So about six years ago, my younger brother Damon and I started building unmanned aerial vehicles using Ruby. A number of people remember that. And we had to source parts from literally all over the globe, aka China,
and they would ship us these really amazing microcontrollers, and we would put them in blimps, and they would burn up, and we would order more. Nowadays, though, you can go to the Apple store and buy several different kinds of robots. I mean, the robot revolution is already here. So we're here to introduce to you Artoo, which is Ruby on robots. It is a Ruby framework for
robotics and physical computing. It supports many different hardware devices, and multiple hardware devices at the same time. In Ruby? I mean, are we serious? Yes! We are extremely serious. And the reason for that is a remarkable piece of technology called Celluloid. Tony, are
you here? By any chance? You bailed on my talk? What's up with that? So anyway, a bunch of the committers from Celluloid are here. And actually, this is probably one of the most important technologies to occur in the entire Ruby community in years, and if you're not paying attention to it, you need to be. In fact, you're probably using it right now if you use Sidekiq, which is another great project.
So it runs on the MRI Ruby, of course. It runs far better, however, on JRuby, thanks to the concurrency of the JVM. Excellent piece of software. You probably saw Charles and Tom's talk earlier. Great work. But we're gonna be showing most of our demos today on Rubinius, the Ruby of the future. If you love any of these projects, please
go help them. Brian Shirai is here. Brian, are you, are you here? He's probably with his daughter. Where are all my friends? Anyway, these, aww. Giant hugs. Channeling my inner tenderlove. So anyway, this is an amazing project. It just reached the 2.0 release, and Rubinius X has
been announced. There are really exciting things happening with it, and it's an important part of, really a pillar of, the future of Ruby. Anyway, back to Artoo. So Artoo is to robotics what Rails is to web development. I'm gonna say that again because it's really, really important. Artoo is to robotics what Rails is to web development. Actually, it might be a little bit more like Sinatra, as you can tell from this code example. So we see, first of all, we require Artoo. Then we're going to declare a connection to an Arduino that's going to use an adapter called Firmata, which is a serial protocol that you can use to communicate with various Arduinos and Arduino-compatible microcontrollers, and we're gonna connect on a particular port. Then we're gonna declare a device. This device is an LED; it's gonna use the LED driver and be connected to pin 13. Then the work that we're going to do is: every one second, we're going to call led.toggle, which is going to turn the LED either on or off. So this is the canonical "make an LED blink" program, which we'll show you in a minute.
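For reference, that blink program looks roughly like this (the serial port path is just an example and will vary by machine):

    require 'artoo'

    # Serial port path will vary by machine.
    connection :arduino, adaptor: :firmata, port: '/dev/ttyACM0'
    device :led, driver: :led, pin: 13

    work do
      # Toggle the LED once per second: on, off, on, off...
      every(1.second) { led.toggle }
    end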
So, Artoo's architecture. We have a couple of very important design patterns that we're utilizing within Artoo; here's a little bit of an outline. The main entity in Artoo is, of course, the robot, and we have two things underneath that: we have the connections, as you saw before, and we have the devices. Now, we are using the adapter pattern in both
of these cases. So connections use an adapter. Similar to the way ActiveRecord or other ORMs work, we can use these different adapters to talk to different kinds of hardware. So where connections control how we actually communicate with the device, whatever the protocol, the devices control behaviors. LEDs know how to
blink. Drones know how to fly. Et cetera. And then we are also using the publish-and-subscribe pattern via events: devices, via their drivers, can detect events and then tell the robot about them. Artoo also has an API built in. I mean, what good is a robot unless you can control it via an API across the intertubes, right?
So here's an example of both a RESTful API and a WebSockets API that could be used by two different applications to talk to the MCP, or master control program, which will then control all of the different robots. And there you have it.
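As a rough sketch, enabling that API amounts to one extra declaration in the robot script; the host, port, and example route below are assumptions rather than confirmed defaults:

    require 'artoo'

    # Start Artoo's built-in API server alongside the robot.
    # Host, port, and the example route are assumptions; check your version's docs.
    api host: '127.0.0.1', port: '4321'

    connection :sphero, adaptor: :sphero, port: '127.0.0.1:4560'
    device :sphero, driver: :sphero

    work do
      # The master control program now exposes its robots over HTTP,
      # e.g. something like GET http://127.0.0.1:4321/api/robots
    end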
Of course, test-driven robotics is very, very important. I mean, we are Rubyists and we test first, right? Well, traditionally in robotics, the way you do testing is you turn on the robot and jump back really fast. I have scars. However, this is Ruby, and we can do a little better. Here's an example of TDR, or test-driven robotics, as we call it. In this case, we're actually using MiniTest, we're using Mocha, and we are using Timecop. So let's take a quick look. First we're going to declare start as right now. The robot is going to be the main robot, which is, as you'll remember, very similar to the Sinatra syntax. Then, before the test, we're going to Timecop-travel to the start. Then we start our robot's work. It must roll every three seconds, so we travel to three seconds after the start. We're going to expect a roll command, and then we process messages to give Celluloid's mailboxes a chance to catch up with their actors. This way we do not have to wait just a little over three seconds to test something that takes three seconds. Because otherwise, if we wanted to test something like "turn the sprinklers on once a week," we'd have to wait a week. That's not good. You think your CI is bad? Try it with robots.
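A sketch of what such a spec might look like, using MiniTest spec syntax with Mocha and Timecop; the robot class, the device lookup, and the message-pumping call below are stand-ins for whatever your Artoo script actually defines:

    require 'minitest/autorun'
    require 'mocha/setup'
    require 'timecop'
    require 'artoo'

    describe 'rolling robot' do
      let(:start) { Time.now }
      let(:robot) { MainRobot.new }   # stand-in for the robot class under test

      before do
        Timecop.travel(start)         # pin "now" to the start of the test
        robot.work                    # kick off the robot's work block
      end

      after { Timecop.return }

      it 'must roll every three seconds' do
        robot.device(:sphero).expects(:roll)   # device lookup shown is illustrative
        Timecop.travel(start + 3)     # jump ahead three seconds instead of sleeping
        robot.process_messages        # stand-in: let Celluloid's mailboxes catch up
      end
    end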
So of course Artoo also has a command-line interface, because one of the important patterns that we've discovered, which we call robot ops, is that you definitely want to use commands to connect to all of the devices. You do not want to do these sorts of things manually. Anyway, I've done enough talking. How about a
demo? You guys want to see a demo? All right. So, the first thing we're gonna take a look at is the DigiSpark microcontroller. So the DigiSpark is what we might call the minimum viable microcontroller. Oh yeah, please. It's very small. We have to get very close. Oh, we hope, if
you have the camera. All right. So this is it. It's extremely small. Oh, can't see it. I can see it. One of the keys. Hey, there it is.
So this is it. It's a rather small device, as you can tell. Let's take the shield off. This is the DigiSpark itself. It's a very small, whoop, thank you. It is a very small, ATtiny-powered microcontroller that actually uses another protocol called LittleWire, similar to Firmata, but it runs on even smaller microcontrollers. Are we
good? All right. We're in focus. And we're going to use this LED shield that plugs into it, ish. It's better when it's over here. All right. So the program that we're gonna run, nope, that's not it. That's definitely not it. OK, so
the program we're gonna run here is this: the first thing we do is connect to the DigiSpark using the LittleWire adapter, with the vendor and product IDs, since it's a USB device. We're gonna connect to the board to retrieve its device info, which we're gonna display on this screen that you won't be able to see. Then the LED device we're going to, again, toggle every second.
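In sketch form, with the USB vendor and product IDs and the pin number as placeholders for your own device's values:

    require 'artoo'

    # Vendor/product IDs and pin are placeholders; use your own device's values.
    connection :digispark, adaptor: :littlewire, vendor: '0x1781', product: '0x0c9f'
    device :led, driver: :led, pin: 1

    work do
      every(1.second) { led.toggle }
    end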
So you see, it's exactly the same code as we were using with the Arduino. See a pattern forming? All right, let's run this. Oh, right, the video. Executing the code. So it should start to flash. A three
thousand dollar lamp! All right. And you're applauding. That should be Apple. All right. So, moving on. What do you do with a flashing LED? Well, we are software developers, and of course what we do is we check our Travis build status notifications. Yes, the build notifier is
to physical computing what the blog engine was to website development. That's the thing you do. All right. So let's take a look at some code real fast. In this case, we're going to require Artoo, and we're gonna require Travis. We're gonna connect to the DigiSpark, and there are the different LEDs. Then we're gonna connect to a broken repo that we've called broken arrow, in the tradition of flying things that don't necessarily work. We're gonna connect to the Travis repo, and then every ten seconds we're gonna check it. If the repo is green, we're gonna turn on the green LED. It's going to turn blue while the build is running or we're checking the Travis status. And then, if the build fails, we're gonna turn red. And last, we have a couple of helper functions: one turns on one particular LED, and the other turns them all off.
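Put together, the build notifier looks roughly like this; the repository path, the pin assignments, and the exact Travis client calls are assumptions:

    require 'artoo'
    require 'travis'

    # Vendor/product IDs and pins are placeholders; repo path is illustrative.
    connection :digispark, adaptor: :littlewire, vendor: '0x1781', product: '0x0c9f'
    device :red,   driver: :led, pin: 0
    device :green, driver: :led, pin: 1
    device :blue,  driver: :led, pin: 2

    work do
      every(10.seconds) do
        # Blue while we check the build status.
        [red, green].each(&:off)
        blue.on

        state = Travis::Repository.find('deadprogram/broken_arrow').last_build.state
        blue.off
        state == 'passed' ? green.on : red.on
      end
    end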
All right. So, if this actually works across the internets, it will turn blue while we're checking the Travis build status, and then it will turn red, since broken arrow is a broken build. Working. Working. Working. Fail. Now, we could go in and fix the build, but in the interest of time, let's just
move on to the next thing. All right. So what is the next thing? Ah, yes. So, one of the greatest things on the internet are cats. And the only thing better than cats are internet-enabled cats. For example, internet-enabled cat
toys. So in this case, we have a cool little device that we've made, kind of home brew, but we like it. It's got two servos and it plugs into the DigiSpark, and then it's connected to this fun little toy. That's the right angle. Can you guys see this OK? We
don't have a cat. They wouldn't let us bring one in. We have a robot cat, but it's not the same. All right. So let's take a look at the code. Wait, that's something else. Where is the code? Oh, I forgot to load it. All right. Well, in any case, we're using this Leap Motion. Yes, this Leap Motion is going to allow us, with nothing more than his hand waves, to control these servos, moving this cat toy for the invisible internet cat on the other side. Oh, wait.
Can't see the toy. All right. We broke it.
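The code never made it onto the screen, but in spirit it's the same pattern again. A rough sketch, in which the Leap Motion port, the :hand event name, the hand payload, and the servo pins are all assumptions:

    require 'artoo'

    connection :leapmotion, adaptor: :leapmotion, port: '127.0.0.1:6437'
    connection :digispark,  adaptor: :littlewire, vendor: '0x1781', product: '0x0c9f'

    device :leapmotion, driver: :leapmotion, connection: :leapmotion
    device :pan,        driver: :servo, pin: 0, connection: :digispark
    device :tilt,       driver: :servo, pin: 1, connection: :digispark

    work do
      # Hypothetical wiring: map the palm position of each detected hand
      # onto the two servo angles (0..180 degrees).
      on leapmotion, :hand => proc { |_sender, hand|
        pan.move(hand[:x].to_i % 180)
        tilt.move(hand[:y].to_i % 180)
      }
    end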
I don't know how long this would last with a real cat, but it's still cool. Look, Ma! Just hands! Thank you. OK. So, so now let's
switch to something else. The BeagleBone Black. So one of the important robot ops patterns that we want to share with you is, you do not want to think you are going to develop robotics on your notebook computer unless you plan on duct taping it to a drone. Which you might try. It might work for you. On the other hand, there are amazing single-board system-on-chip
or SOC Linux computers that are very, very inexpensive. The Raspberry Pi is one. Another one, though, that is a little bit more powerful, but is also open source hardware, is the BeagleBone Black. Where is my video? There we go. So the BeagleBone Black is a very, very cool, also ARM
Cortex-powered single-board computer. It has a one gigahertz processor and 512 megabytes of RAM. In this particular case, we are running an Arch Linux distro that we have built, which is also available via a link from the artoo.io website, and it includes everything you're going to need, software-wise, to turn this into a complete, full physical computing
and robotics platform. Can you see this? OK. Yeah, we need it on the other side. So let's take a closer look. It's, it's naked. We have cases, but. So you see that it has a lot of different pins that
you can plug into for digital I/O, for analog I/O, for pulse-width modulation, and for I2C. Now, I might mention, you saw before the difference between drivers and connections. Well, we have generic drivers for general-purpose I/O and for I2C devices. So
you can actually use these same drivers on an Arduino, on a Raspberry Pi, on the BeagleBone Black, on the DigiSpark, or on any other platform Artoo supports for GPIO. Think about that. Kinda fun. So what we're gonna do here is show the Blink program that we showed before, with a slightly different syntax but the same idea. Except in this case we're gonna use a connection to the BeagleBone, using the BeagleBone adapter, and we have a slightly different pin-numbering scheme, because the BeagleBone Black's pins are different, and in similar fashion the Raspberry Pi's pins. The pin name is actually what the pin is labeled on the device, so you're not going off to a lookup table. I mean, there's software for that. It's called Artoo.
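Same blink, different board; the pin label below is an example of that silkscreen-style naming, not necessarily the pin used on stage:

    require 'artoo'

    connection :beaglebone, adaptor: :beaglebone
    device :led, driver: :led, pin: 'P9_12'   # addressed by the label printed on the board

    work do
      every(1.second) { led.toggle }
    end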
All right. Back to the camera. So now he's actually SSH'd into this Linux computer and is going to make our gigantic LED start flashing. If all goes well: Blink on the BeagleBone. Yes, it's real. All right.
So that was cool, but can we get a little more exciting? Like, yes. So let's bring in another toy, another robotic device: the Sphero. So the Sphero is from Orbotix. Oh, yeah, the camera. The camera. So the Sphero is a small robotic sphere from Orbotix, based out of Boulder, Colorado. Fantastically interesting toy. It might be the minimum viable robot, because it actually possesses input. It
has accelerometers that can detect collisions. It has output: it can change its color. And it can move around of its own volition. It is a Bluetooth device, so we're gonna connect up to it using another Artoo program. Let me show you the code for that.
All right. So in this case, we're gonna require Artoo. We're gonna make a connection to the Sphero using the Sphero adapter on a particular socket address. Another one of the lessons from the robot ops playbook is that you definitely want to use serial-to-socket connections; you don't want to try to connect directly to the serial ports. That way we can use nice TCP- and UDP-style software development. In this case, the work that we're gonna do is: every one second, we're going to display a little message, then we're gonna set the Sphero's color to a random RGB color, and then we're going to roll at a speed of 90 in a random direction. Crazy Sphero.
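That Sphero program, roughly (the host-and-port address is whatever your serial-to-socket bridge exposes):

    require 'artoo'

    # The "port" is a host:port socket address, not a serial device file.
    connection :sphero, adaptor: :sphero, port: '127.0.0.1:4560'
    device :sphero, driver: :sphero

    work do
      every(1.second) do
        puts 'Rolling...'
        sphero.set_color(rand(255), rand(255), rand(255))  # random RGB color
        sphero.roll(90, rand(360))                         # speed 90, random heading
      end
    end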
So let's go and see what happens. Blue means we're connected to the Bluetooth device. Oh, by the way, we are running this off of the BeagleBone Black as well. It is
alive. So one thing we did want to mention before
we go any further is, choose your own hardware adventure. What good is this stuff if you don't have some hardware? Well, luckily we have a lot of wonderful friends. And these friends said please give away hardware to the awesome Ruby community. So you get to choose your own hardware adventure. Now, not everyone is gonna get hardware today. Only those who
go to the Twitterverse and appeal to the magnificence of the robotic overlords, and say, please give me a microcontroller. So if you tweet Digistump and ArtooIO, you can win one of our Artoo DigiSpark starter kits. It comes with the microcontroller, it comes with an RGB LED shield, and it comes with all the little connectors that you will need to connect it to motors or servos or other things. So: Digistump and ArtooIO to win a DigiSpark starter kit. If you want to win a BeagleBone Black starter kit that includes a BeagleBone Black, SD card, jumpers, breadboard, everything you need to build your own robot, tweet BeagleBoard.org, kind of long, sorry, and ArtooIO to win that. And if you want to win a Sphero 2.0, the hot new item, then you tweet GoSphero and ArtooIO. All right, so we'll run through that again: Digistump and ArtooIO to win the DigiSpark. BeagleBoard.org and ArtooIO if you go in that direction. And if you go north, you get to try to win with GoSphero and ArtooIO. So again, our criterion is whichever tweet we like most. So, beg. It's OK. All right. On to the demo. So, Conway's Game of Life. Who knows about Conway's Game of Life? A decent percentage, but let's just do a quick mathematical review. So John Conway was
a math-a-magician who invented this famous example of what we call cellular automata. It basically says that by using a very, very simple algorithm we can get interesting emergent behaviors, essentially kind of like a swarm of robots. And we thought, let's just do the rules here real fast. So it's usually played on graph paper, using a paper
and pencil. By the way, I highly recommend graph paper for doing creative work. It's fantastic. All right. So with graph paper, you would draw some cells, which are the dots. And then the rules are: if a cell has fewer than two neighbors, it dies on the next turn. If an empty space has exactly three neighbors, a new cell is born into it. And if a cell has more than three neighbors, it dies from overpopulation. So that would be the second move. So first move, second move, and so on. Well, we thought it would be really cool to do Conway's Game of Life with robots. But
we realized we'd have to do things a little tiny bit differently. One of the differences is that the Sphero does not possess the ability- I don't want to start connecting. The Sphero does not possess the ability to see other Spheros. However, it does have an accelerometer to detect collisions. So by doing a little bit of an inverse Fourier transform, we can basically turn the collisions into
an estimation of proximity within a slice of time, and thereby we can make a decision about whether or not this was actually a collision, and whether you live or die. So let's watch artificial life with Artoo and Sphero. I'm alive. I'm looking for love in all the
wrong places. They need just a little contact. Not
human. Sphero. Actually, human contact would probably work, but. Oh! Two died. I feel traumatized, even when artificial life loses it. They can come back to
life. We call that zombie mode. And eventually they might all die, or it might just go on for long periods of time. It's very hard for me to kill off any life form, artificial or natural. So let's give it a brief moment. The last
Sphero. There's something kind of epic. Maybe someone will compose a ballad. Ooh! Anyway, I think you guys kind of get the idea. Let's take a quick look at some code.
So in this case we're actually using Artoo's modular mode. We're declaring a class, which is the Conway Sphero robot. It's connecting to a Sphero; the device is a Sphero. The work it's gonna do is: first, it's born. Then, on a Sphero collision, and here we see an example of Artoo's evented syntax, we're gonna call the contact method. Every three seconds we're gonna make a movement if we're alive, and every ten seconds a birthday, if we're still alive. Life is short, hard, and colorful in Spheroland. Then you see some of our helpers: check if we're alive, rebirth. If we actually follow the rules, we can be born. Life and death. So you kind of get the idea. Oh, wait. There's one last thing that's kind of important here. We declare a hash with all of our different socket addresses and the names of the Spheros, and then for each Sphero we create a new robot and tell all of them to go to work at the same time. A swarm of Spheros.
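Condensed, the modular-mode version looks something like this; the socket addresses are illustrative and the life-and-death bookkeeping is simplified:

    require 'artoo'

    class ConwaySphero < Artoo::Robot
      connection :sphero, adaptor: :sphero
      device :sphero, driver: :sphero

      work do
        born
        on sphero, :collision => :contact       # Artoo's evented syntax
        every(3.seconds)  { move if alive? }
        every(10.seconds) { birthday if alive? }
      end

      def born
        @alive    = true
        @contacts = 0
        sphero.set_color(0, 255, 0)
      end

      def contact(*args)
        @contacts += 1
      end

      def move
        sphero.roll(60, rand(360))
      end

      def alive?
        @alive
      end

      def birthday
        # Simplified Conway-ish rule: the right number of contacts and you
        # live on (or are reborn); otherwise you die.
        @contacts.between?(2, 3) ? born : die
        @contacts = 0
      end

      def die
        @alive = false
        sphero.set_color(255, 0, 0)
      end
    end

    # One robot per Sphero, then tell them all to go to work at once: a swarm.
    spheros = {
      'Sphero-ONE' => '127.0.0.1:4560',
      'Sphero-TWO' => '127.0.0.1:4561'
    }

    robots = spheros.map do |name, port|
      ConwaySphero.new(connections: { sphero: { port: port } }, name: name)
    end

    Artoo::Robot.work!(robots)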
All right. So now let's do something completely different. This is the time to put on your protective gear, if you have some. So we're gonna demo the ARDrone, and we're moving over there so we have some space. Oh, yeah.
We forgot to set this up before. Whoops. Oh, the Sphero. I mean, sorry. The ARDrone. Yeah. That thing. So, many people got to see Jim Weirich's Argus gem. We're actually using the Argus gem wrapped up inside of the Artoo ARDrone adapter, and we've done
a few contributions to it ourselves. It's very, very cool. Thank you, Jim. We really appreciate it. Standing on the shoulders of giants is awesome. All right. So what we're gonna do here is take a quick look at some, some code.
All right. So in this case we're gonna require Artoo. We're gonna make a connection to the ARDrone via its adapter. The device is gonna be the drone. You're seeing a pattern forming. So the work we're gonna do is: first we're gonna start the drone. Then the drone's gonna take off. After fifteen seconds it's going to hover and land. And then after twenty seconds, stop.
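That flight plan as an Artoo script, roughly (192.168.1.1 is the ARDrone's usual default address):

    require 'artoo'

    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1'
    device :drone, driver: :ardrone

    work do
      drone.start
      drone.take_off

      after(15.seconds) do
        drone.hover
        drone.land
      end

      after(20.seconds) { drone.stop }
    end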
So, a little bit of automated drone flight. So this is the drone. Hello. ARDrone and Artoo.
So now, for this next demonstration, we're going to
need a courageous volunteer from our studio audience. I mean courageous. Like, this is kind of dangerous. And you have to be tall. Oh yeah, let's just use one of the hybrid group members, because I believe we are insured for them. And if not I could just drive them home. Daniel
Fisher, a Hybrid Group strongman. All right. Oh, OK. Yes, thank you. All right. So, what we're going to do: recently we added OpenCV support to Artoo. If you're not familiar with it, OpenCV is probably the most important computer vision library. It's open source.
It's a very deep and rich platform. And so what we're gonna do here is: we're gonna make a connection to the capture device. Then we're gonna make a connection to the video device. And we're gonna make a connection to the ARDrone. We're gonna use a facial-recognition data set. And then the work that we're gonna do is: we're gonna capture each frame and display it in a window, which we'll see in a moment. We're gonna start the drone and take off. After eight seconds it's gonna boost up to about facehugger level. After ten seconds it will hover again, and then at the mysterious thirteen-second mark it will begin its facial recognition mission.
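Only a loose sketch of this one is worth writing down: the overall Artoo structure is the same as before, but the OpenCV adaptor, driver, and event names, and the movement calls below, are all assumptions:

    require 'artoo'

    connection :opencv,  adaptor: :opencv
    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1'

    device :capture, driver: :capture, connection: :opencv   # camera frames (assumed driver)
    device :window,  driver: :window,  connection: :opencv   # on-screen display (assumed driver)
    device :drone,   driver: :ardrone, connection: :ardrone

    HAAR = 'haarcascade_frontalface_alt.xml'   # facial-recognition training data

    work do
      detecting = false

      on capture, :frame => proc { |_sender, frame|
        window.show(frame)                         # display each captured frame
        if detecting && !capture.detect(HAAR, frame).empty?
          drone.turn_left(0.1)                     # nudge toward the detected face
        end
      }

      drone.start
      drone.take_off
      after(8.seconds)  { drone.up(0.5) }          # boost to about face height
      after(10.seconds) { drone.hover }
      after(13.seconds) { detecting = true }       # begin the face-finding mission
    end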
It should detect Daniel's face, and then as he tries to evade it, it should follow him. I think you see now why we chose our own volunteer. All right. So without further risk to us, cause it's gonna be over there. Get behind me, man. All
right. This is how you know it's real. We swear. All right. Wanna enlarge the window, please?
Well when we say customer service drone, we mean customer service drone. Evasive. We put in code to
stop it before it got too dangerous. So, thank you Daniel. I owe you a drink, man. All right. So we promised new hardware every time
we do a show, and basically we cannot disappoint. So why is it getting dark? Oh, that explains it. I'm like, it's all getting dark in here. Around the edges, especially. So we have some really awesome new hardware. It's really small, so it's hard
to find. Oh, OK, yeah, let's do that first. All right. Actually, we do have two kinds of new hardware today. So the first thing is: many of you might have seen us fly the ARDrone around with a Wii Classic controller using an Arduino. But even Nintendo has stopped dealing with the Wii. So we thought, hey, it's time to get into the modern generation. So we now support the PS3 controller and the Xbox 360 controller. So Adrian, who is serious programming guy and test pilot, is going to use this generic GameStop PS3-style controller to fly this ARDrone around. Let's see
if, do I have the code here? Yeah, I went the wrong way. There we go. So we can see that we're gonna declare a connection to the ARDrone, we're gonna declare the drone device and a connection to the joystick, and then we see that we're gonna handle a bunch of controller events. For example, when he hits the square button it will take off, the triangle button it will hover, the X button it will land, et cetera.
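As a sketch, with the PS3 driver name, the button event names, and the handler wiring treated as assumptions:

    require 'artoo'

    connection :ardrone,  adaptor: :ardrone, port: '192.168.1.1'
    connection :joystick, adaptor: :joystick

    device :drone,      driver: :ardrone, connection: :ardrone
    device :controller, driver: :ps3,     connection: :joystick

    work do
      drone.start

      on controller, :button_square   => proc { drone.take_off }
      on controller, :button_triangle => proc { drone.hover }
      on controller, :button_x        => proc { drone.land }
    end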
And so now, if all goes as expected... oh yes, reset the drone. If it comes really close to you, please duck. This is human-powered flight, so blame him. No, it all comes here. Blame me. I have Band-Aids. I
think. Standing by. This is Tower. How you
doing Control? Still trying to connect to the Wi-Fi on the drone. We're, we're standing by Ground Control. There's a certain cadence to this. If you don't do it right, it literally won't take off. I mean if you don't say it right. If there's anybody
from Rocket City here, please correct my English. American. Standing by. Yeah, get some altitude. Get some altitude. AR drone PS3 controller. Aw, you're
not gonna buzz the back row? No, don't do it, don't do it. It's I want you to.
It's tempting, but no. Not today. OK. So now: the grand finale. What you've all been waiting for. The Crazyflie, the ARDrone, and the PS3 controller. So what is the Crazyflie? Are we crazy? We are. Extremely, if you haven't noticed. It's probably the minimum viable quadcopter. This is the Crazyflie from Bitcraze, out of Sweden. It's, it's gonna hurt, you know. It's really small. How bad can it hurt? So this is actually a very, very impressive piece of technology. It also has an ARM Cortex processor running a real-time operating system. It's got a three-axis accelerometer. It's got a magnetometer, a.k.a. a compass. And it also has a barometer for altitude detection. It's actually quite an acrobatic drone. Very,
very hard to control. Luckily it's very small. Also luckily it only has about a seven minute lifespan on that battery. So there's that. If you can get away from it for six and a half minutes, you're fine. Actually in about five minutes the sensor
starts going off, so if you throw some tin foil, run that way. All right. So what we're going to do, for the first time ever seen anywhere, is we're actually gonna control both of these drones with the same code. Let's take a look at it. All right. So, first we're going to require Artoo. Then we're gonna make a connection to the Crazyflie using its adapter. Then we're going to connect to the joystick. Then we're gonna connect to the ARDrone. So the work that we're gonna actually do here is: we're gonna use the controller to control the Crazyflie, and then we're gonna use the ARDrone to do an automated take-off.
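Two drones, one sketch; the Crazyflie adaptor name, its radio address, and the controller events are assumptions:

    require 'artoo'

    connection :crazyflie, adaptor: :crazyflie, port: 'radio://0/10/250K'
    connection :joystick,  adaptor: :joystick
    connection :ardrone,   adaptor: :ardrone, port: '192.168.1.1'

    device :copter,     driver: :crazyflie, connection: :crazyflie
    device :controller, driver: :ps3,       connection: :joystick
    device :drone,      driver: :ardrone,   connection: :ardrone

    work do
      # The controller flies the Crazyflie (button events illustrative)...
      on controller, :button_square => proc { copter.take_off }
      on controller, :button_x      => proc { copter.land }

      # ...while the ARDrone does its own automated take-off and hover.
      drone.start
      drone.take_off
      after(10.seconds) { drone.hover }
    end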
And if this all goes as expected, the ARDrone should take off and hover, and then Adrian should be able to kind of fly around it manually using the Crazyflie. Should be interesting. Let's do it. Standing by. Multi-drone. Two drones, one code. We so crazy. And our test pilot. Was
that fun? I love this stuff. Yeah, let's, the
question was, is it possible to control them both with one controller? The answer is yes. However, because there's a significantly different thrust vector on the Crazyflie versus the ARDrone, we didn't really have time to get that perfected with the amount of space we had in the hotel room here. And we kind of didn't want to spoil all
the surprise, because as soon as you start flying something around, people start coming in and swarming on it. So, so we'll get to that. Come and see us at robots.conf in December here back in Florida. But wait. There's more. There's always more. We heard that, you know, some people really, really like JavaScript
these days, and so we thought we'd like to put some robots on JavaScript. So Cylon.js is a project we just announced last month, and it lets you use CoffeeScript or JavaScript with Node.js to do basically the same exact thing that you just saw, except in those languages. And it's actually available now. It doesn't have all twelve platforms supported like Artoo does, but we're getting there. And then today we announced Gobot, because we heard that the Go programming language was something you guys were kind of interested in too, and we sort of liked it ourselves. And we thought: Go, Go Robot. So Gobot is this month's project announcement. It's literally very, very hot and fresh. In other words, it barely works, but it kind of does. Exactly. We have artificial life with Gobot already. So check it out. Join the robot evolution, because we need all of you to help us build this future. So: artoo.io, or follow us at ArtooIO on Twitter. Cylonjs.com, or follow us at CylonJS on Twitter. Or gobot.io, and follow us at GobotIO on Twitter. So once again: artoo.io, cylonjs.com, and gobot.io. All right. So I, for one, say welcome to the machines. But there are a few questions that we have. One of them perhaps has to do with robot economics. So when machines are doing the jobs that humans do now, what will we do?
Kurt Vonnegut in Player Piano kind of posited a future where the satisfaction that we have would be greatly lacking because of a meaning and a purpose that we needed in our lives. And what about the pay that people need for jobs? And what about robot ethics? Do robots actually have ethics? Or do they only have the ethics that we give to them?
So what is going to happen? The answer is, I don't know. However, I know some actual professional futurists, and Daniel Rasmus is a very dear friend of mine. He wrote a great book called Listening to the Future, where he talks about something called scenario analysis. So we're gonna do a little scenario analysis of what we think is gonna happen. And we're gonna use two axes. The first
one is robot sentience: will they become intelligent or not? And the other one is robot friendliness: will they be friendly or hostile? And because we are a Los Angeles-based company, The Hybrid Group, we look at everything in terms of Hollywood movies. So if the robots do not become intelligent, and they're not friendly, we get the movie Brazil. On the other hand, if the robots become intelligent but they're not friendly, we get Terminator. Enough said. Now, if the robots are not sentient but friendly, we get Power Rangers. And then, if the robots are both
sentient and friendly, we get Singularity. So this guy spent a lot of time thinking about what was gonna happen in the future. I hope everyone knows that this is Isaac Asimov. You didn't know Conway's Game of Life, but thank
God you know Asimov. So he wrote the three laws of robotics. And I'm gonna read them to you now. Number one. A robot may not injure a human being or, through inaction, allow a human being to come to harm. Number two. A robot must obey the orders given to it by human beings, except where such orders would conflict with
the first law. And then number three. A robot must protect its own existence, as long as such protection does not conflict with the first or second laws. So how's that been working out for us, guys? But I don't think it's fair to blame the robots, because this, in fact, is not a robot. It is a drone. And it is being
controlled by a person in a hidden bunker, where they are far, far away from the battlefield where these weapons are actually being utilized against real humans. So we propose a small change, just a little patch revision to Asimov's first law, version 1.1.
A human may not injure a human being, or through inaction, allow a human being to come to harm. Imagine that future. Let's make that future. Thank you.