Ember as Song
Formal Metadata

Title: Ember as Song
Title of Series: EmberConf 2020
Number of Parts: 24
Author: Davis, James C.
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/46569 (DOI)
Language: English
EmberConf 2020, part 2 of 24
Transcript: English (auto-generated)
00:04
Hi, I'm James C. Davis. I'm a professional programmer and an amateur
00:25
songwriter, and I'm going to talk to you today about something I call Ember as Song. It started with an idea. There was a brainstorming session for EmberConf proposals where Melanie Sumner had thrown out the idea of composing a song alongside building up an Ember app: building up the
00:45
song piece by piece, and the Ember app along with it, the two coinciding. I really liked this idea, but I wasn't sure how to make it happen. So the idea grew; it just sat with me for a while. How best can I compare building an app with
01:01
composing a song? Can I map the elements of a song to concepts in Ember? I wasn't sure how to do that and make it work, and then it hit me: I could create an Ember app that is a song. That led to a whole exploration. I've long thought that programming and songwriting
01:24
are very similar. They're both very creative endeavors, they're both complex, and they share many fundamental concepts: patterns, loops, conditionals, problem-solving. When you're creating music, you're often
01:41
trying to figure out how to get things to work together, and bug fixing: sometimes something's wrong and you have to figure out how to make it work. So what's in a song? A song is composed of sections: an intro, a verse, a chorus, a bridge. These may repeat and be
02:03
intermixed with each other. And what are sections composed of? A section has instruments, the instruments play different parts, and the parts are made up of individual notes. Those notes are organized into measures, and measures can be
02:24
strung together to create musical phrases, like sentences or paragraphs. So I'm going to map these concepts to Ember. Here are some of the mappings I've made. Sections I've mapped to routes, because they're
02:41
like locations in the song. Instruments I've mapped to services, which I'll demonstrate in a little bit. Parts are components, because components are what make up the app the same way parts make up the song. And notes I've mapped to contextual
03:02
components; specifically, a note is a contextual component within the part, because it needs context. So a section maps to a route. It's a location in the song, composed of different instruments playing parts together, the
03:21
same way a route has a route template with different things working together. It's really the musical equivalent of a page. An instrument maps to a service because instruments are global: you use them all throughout the song, and they
03:42
maintain state when moving through sections. They keep playing, so if a note is sounding at the end of a section, it continues into the next one. Volume is a global control on an instrument, as are any effects applied to it. A part I've
04:02
mapped to a component, because a part is made up of notes for one instrument, it's often a pattern that repeats within a section, and it can be reused across multiple sections. You might have a part in one section and
04:23
use that same part later, the same way you'd reuse a component. Notes I've made contextual components because they belong to a part: they need the context, they need to know which instrument they go with, and they need to know when to play relative to other notes. There are
04:46
other globals as well, like tempo, the speed the song plays at. There's the master volume, all the instruments together, a global volume control. There's also the playing state, such as start,
05:04
pause, stop, and where we are in the timeline. These could be handled by a single service or multiple services, depending on how you want to group them. So how do I take this concept of Ember as Song into an actual implementation?
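The talk doesn't show this code, but as a rough sketch of the idea (the names and shape here are my own, not the speaker's), the globals could be grouped into one service-like class. In a real Ember app this would be an Ember Service whose tempo, volume, and transport changes are forwarded to Tone.js (Tone.Transport, the master output, and so on):

```javascript
// Hypothetical sketch of a "globals" service for the song.
// In a real Ember app this would be an @ember/service class, and
// these properties would delegate to Tone.js rather than being
// plain fields.
class SongGlobals {
  constructor({ tempo = 120, masterVolume = 0 } = {}) {
    this.tempo = tempo;               // beats per minute
    this.masterVolume = masterVolume; // decibels; 0 dB = unity gain
    this.state = "stopped";           // "started" | "paused" | "stopped"
    this.position = 0;                // playhead position in seconds
  }

  start() { this.state = "started"; }
  pause() { this.state = "paused"; }
  stop() {
    this.state = "stopped";
    this.position = 0; // stopping rewinds to the beginning
  }
}

const globals = new SongGlobals({ tempo: 90 });
globals.start();
globals.stop();
```

Whether this lives in one service or several is exactly the grouping decision mentioned above.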
05:23
So I started with the Web Audio API. The Web Audio API is extremely powerful, but it's very low level. I wanted to start with something a little higher level for making music, and I found Tone.js. Tone.js uses the Web Audio API; it's
05:42
built on top of it, but it provides primitives for making music, not just sounds, which is basically what the Web Audio API gives you. So I wanted to create a hello world. My goal was to create one instrument, a piano let's say, and play a single note: middle C. Middle C
06:03
is the white key right in the middle of the piano keyboard. So I generated a service, called it piano, and added a name so we'd have something human-readable. In this service I created a property called inst, for the instrument, and I used
06:24
something called a sample library, an extension to Tone.js that lets you play samples. Samples are just recorded instruments: little single-note recordings of instruments. So I load up the piano
06:41
sample and send it to master, and master is basically your output, your speakers. I wanted to be able to work with the instrument within the template. Tone.js is made for creating music in JavaScript, and I really wanted to create it in Ember, in Ember templates. So what I did
07:02
was create a component to represent the instrument. The component displays the title of the instrument so you can see what it is, and it has a container, which I'll talk about later. It yields a contextual component called part, passing through the instrument plus volume,
07:24
humanize, and other things I'll talk about later. Okay, so the part component. The part component is responsible for connecting notes to an instrument: for scheduling notes to play. And I say scheduling because in Tone.js you schedule notes to play at certain times.
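To make the scheduling concrete: Tone.js transport positions are commonly written as "measure:beat" strings against the transport tempo. Here is a simplified illustration of my own (real Tone.js transport time also supports a third "sixteenths" field and tempo changes) of turning such a position into seconds:

```javascript
// Convert a simplified "measure:beat" transport position to seconds.
// Assumes a fixed tempo and time signature; this is an illustration,
// not Tone.js's actual implementation.
function positionToSeconds(position, bpm = 120, beatsPerMeasure = 4) {
  const [measure, beat] = position.split(":").map(Number);
  const secondsPerBeat = 60 / bpm;
  return (measure * beatsPerMeasure + beat) * secondsPerBeat;
}

// At 120 BPM in 4/4, each beat lasts 0.5 s:
positionToSeconds("0:0"); // start of the song → 0
positionToSeconds("0:2"); // third beat of the first measure → 1
positionToSeconds("1:0"); // downbeat of the second measure → 2
```

Scheduling a note at "1:0" therefore means asking the audio clock to fire it 2 seconds in, rather than calling play when JavaScript happens to get around to it.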
07:47
The timing in JavaScript is not exact enough to just say "play this note right now"; you actually schedule it to play at a certain time, because music has to be very precise to sound right. The part is also responsible for scheduling draws,
08:01
which are anything where you manipulate the DOM (we'll see that a little later), and for looping, meaning any repeats. Those are all handled by the part component. The template for the part component looks like this: we have a did-insert modifier that calls an
08:21
initPart action, and it yields a contextual component called note (as I said earlier, I map notes to contextual components), passing it something called addNote and something called activeNote. So let's look at this note component. The template is very simple: it has a local class (I'm using ember-css-modules here) and a conditional
08:48
class bound to this.active, where active gives it a background color. If we look at the backing class, we can see
09:01
a constructor that calls the addNote that was passed in, grabbing a bunch of properties off the arguments, and active is a computed getter. So let's play it. Here's how you would use it: you invoke
09:21
the instrument component and pass it the instrument. The instrument is a service, and we're now in a route template, so we need a controller. I create a controller and inject the piano service into it; that's really all the controller is there for, getting the piano service into the template. The component yields i, the contextual
09:45
part component, which yields p, the contextual note component, and here we say pitch equals C4. C4 is middle C. Tone.js uses the standard way of naming notes: it starts all the way down at C0, goes C D E F G A B,
10:07
then C1, because those seven note names repeat at each octave, so C4 is right in the middle. Let's see what that sounds like. There we
10:22
go: we pressed middle C, and you also saw that it lit up, which we'll talk about with visualization in a moment. So let's do a scale. A scale is a string of notes, one after another, ascending (getting higher) or descending in pitch; for the C scale at least, it's basically hitting all the
10:43
white keys on the piano. We can implement a scale like this: we have our instrument, we pass it the piano, and down here in the part I've divided things up into measures. The measure component is really just for visual display; it divides things up visually, here and in the output.
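As an illustration (this is my own sketch of the underlying data, not the talk's actual component code), the scale part amounts to a list of notes, each with a pitch and a "measure:beat" time:

```javascript
// The C major scale as note data: eight pitches, four beats per
// measure, so the notes fall at 0:0..0:3 and then 1:0..1:3.
const pitches = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"];

const BEATS_PER_MEASURE = 4;
const scale = pitches.map((pitch, i) => ({
  pitch,
  time: `${Math.floor(i / BEATS_PER_MEASURE)}:${i % BEATS_PER_MEASURE}`,
}));

// scale[0] → { pitch: "C4", time: "0:0" }
// scale[7] → { pitch: "C5", time: "1:3" }
```

In the app, each entry becomes one invocation of the contextual note component, and the measure component just groups the first four and the last four.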
11:08
You can see I have pitches starting at C4, ascending C D E F G A B, and then C5; that's the next octave, because once you wrap back around, that's an octave. And here I've specified the time argument. Time in Tone.js is
11:28
defined in measures and beats. Standard time, a standard time signature, is 4/4, which is four beats per measure. So you see our four beats: it
11:42
starts at 0, so 0 1 2 3, then the second measure, 0 1 2 3. Let's hear that scale. There we go, the C scale. All right, so how does this
12:01
work? Here's the part component we looked at earlier; you can see that it yields this note component. And here's the backing class for the note component. We can see that it calls the addNote it was passed,
12:20
handing it, basically, the arguments (that's supposed to say args there where it says this). Now, the part component; let's take a look, since we haven't seen this yet. It defines an empty array of notes, and it
12:44
defines an action called addNote that takes a note, calculates an ID for it (just the next one in the list, using the array length), pushes it onto notes with that ID, and returns the ID. You can see in the lower left, back in the note component, where addNote is called and
13:04
the returned ID gets assigned to that note. Then we have triggerSynth. triggerSynth takes a time and a note and calls triggerAttackRelease. The attack is basically pressing the key, and
13:21
the release is releasing the key, for a piano or any instrument: on and off. In Tone.js there are separate triggerAttack and triggerRelease calls, so you could actually hold a note for a long time, but we're just doing one hit here. And what this is actually doing is
13:42
not playing the note right away; when you call triggerAttackRelease, it schedules the note for a certain time. Next we have initPart. The initPart action is what actually ties the notes together with triggerSynth:
14:01
it instantiates a new Part, and Part with a capital P here is something from Tone.js, the Tone.js Part. You pass it triggerSynth, which is a callback, and you pass it the array of notes, and it goes
14:20
through that array of notes and calls triggerSynth for each of them, scheduling them all at the appropriate time. So let's talk about visualization for a minute. If we look at the template for the note component, you can see we have this active class, and we have a getter
14:43
in the backing class where we compare the activeNote argument to the ID of the note. active sets a background color, that orangey color you see when the notes light up. So let's see how this actually
15:01
works. In the part component, you can see we set activeNote to negative one; there will never be a note with an ID of negative one, because IDs start at zero. And in triggerSynth we've added a little more. You see this
15:22
Draw.schedule: Draw.schedule is something Tone.js gives us to be able to manipulate the DOM at a certain time, in sync with the music. This is a little tricky because the DOM takes a little time to update, so it actually factors that in to get things to line up nicely. What
15:42
is happening here is that at the same time we've scheduled the note to play, we set activeNote to the ID of the note scheduled for that time. That lights it up: it matches the appropriate note component, sets
16:05
its class to active, and it lights up orange. But we want it to turn off on release, so the next Draw.schedule right here sets activeNote back to negative one. What we have to do here is set a time in the future, when we know the note will be released: we have the
16:26
duration, and the duration up there, as you can see, is "4n". "4n" is how you specify a quarter note in Tone.js, one quarter of a measure in 4/4, so
16:42
the time here has to be in seconds. Tone.js provides Time, with a capital T, where you can pass a duration and convert it to seconds, and we just add that to the note's time so it knows when to turn off. Okay, so what if we want multiple instruments at the same time? Two instruments: on the left we
17:03
have the piano scale we had before, and on the right we've created something new, a violin. These would actually be in the same template. It doesn't matter what order they go in as far as when things play, but visually, the one that is above the other will be
17:22
above the other on the page. The violin is another service, very similar to the piano service; we inject it into the controller and use it here. Here we're using half notes; you can see a duration specified on the note instead of the default quarter note, so we have just two of them in a
17:42
measure, and then a little riff there. Let's see what that sounds like. And there you go: piano and violin playing together. Okay, what about multiple parts?
18:03
You might have multiple parts for an instrument if you want some to repeat, or you just want to break them up into different phrases. Here's where you can use start. start schedules that part, the whole set of notes, to happen at a certain time,
18:20
specified in measures. So here we have the first part, flute (another sampled-instrument service, injected into the controller), started at zero, and we have two measures; then we start the next part at two, so that will be the third and fourth measures. So
18:42
it plays the first part and then the second part. Let's see what that sounds like. ...And the second part. All right, let's talk about loops. A loop is
19:03
just a repeat, and as I said, parts are responsible for loops. To specify a loop, you pass the loop argument with the number of times to loop, but you also need to specify loopEnd, which is how
19:21
long to go before you loop. There's also a loopStart, if you don't want to loop all the way back to the beginning, but here we just loop back to the beginning: we've defined two measures and said we want to loop after two measures, so just repeat the whole thing. Let's see what that sounds like. So it just played through twice. All right, let's
19:50
create a drum kit. Drums are pretty interesting because they're an instrument that's actually composed of multiple sub-instruments. What I've done here is create a drum service; these are all just
20:04
one thing, I've just broken it apart for display here. I give it the name drum kit, and then instead of creating an inst I create a kick. The kick drum is the big one down on the floor, the bass drum. I give it a name, and I give it a default pitch; what defaultPitch lets me do is say, if I don't give it
20:24
a pitch, use this one. Drums are tuned, they do have pitch, and here we've given it a C1, a very low C, because it's a bass drum. And for the inst we've created a MembraneSynth. This
20:42
comes from Tone.js, and a synth, as opposed to a sample: while samples are recordings of instruments, a synth is completely generated by the computer. You give it all these various parameters, and you could do a whole talk on oscillators and envelopes and different wave-
21:02
forms and combining them, but basically it's computer-generated sound. And then we send that to master, our output. The snare: a snare is actually pretty hard to synthesize well, so I'm using a sample here again, with a default pitch. And then the hi-hat. This one's a little more
21:23
complicated because it uses a PanVol to get some effect, but it demonstrates how you can chain things together in Tone.js: you create the PanVol effect, send that to master, and then connect your hi-hat synth to it. So you chain
21:44
instruments to effects to the master output. The hi-hat itself is a MetalSynth, which is just a different kind of synth. The way you use this is to invoke the instrument component and pass it drums, but in the part you specify the sub-instrument, and it
22:03
pulls that sub-instrument off of the instrument, because again, the instrument is passed through to the contextual part component. So we do that here, and then we have the snare part. The snare was a little loud, so I pass a
22:22
volume to turn it down a little, and I've specified half notes here to let the snare ring a little while; it sounds better. And finally, down at the bottom, we have the hi-hat. The hi-hat was very loud, so I turned it down a good bit. I'm also passing this thing called humanize,
22:41
and what humanize does in Tone.js is introduce a little random variance instead of hitting right exactly on the beat. Humans don't always hit exactly on the beat, and it turns out computer-generated music can feel sort of sterile because it's so precise, so
23:00
this changes that up just a little bit. Now, I've scheduled the snare to start at one and the hi-hat at two, because for this demonstration I just wanted to play them one after the other: kick drum, then snare, then hi-hat. Okay, we have a drum kit.
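A standard rock beat like the one used here can be written down as plain note data. This is my own sketch of the pattern, not the talk's component code; beats are zero-based, as in Tone.js transport time:

```javascript
// One measure of 4/4 as a basic rock beat: kick on the first beat,
// snare on the third, hi-hat on every beat. "0:2" means measure 0,
// beat 2 (the third beat), since beats are zero-based.
const drumPattern = [
  { instrument: "kick",  time: "0:0" },
  { instrument: "snare", time: "0:2" },
  { instrument: "hihat", time: "0:0" },
  { instrument: "hihat", time: "0:1" },
  { instrument: "hihat", time: "0:2" },
  { instrument: "hihat", time: "0:3" },
];

// Looping the measure is conceptually just re-scheduling the same
// pattern with the measure offset added, e.g. N times through:
function loopPattern(pattern, times, beatsPerMeasure = 4) {
  const hits = [];
  for (let i = 0; i < times; i++) {
    for (const { instrument, time } of pattern) {
      const [measure, beat] = time.split(":").map(Number);
      hits.push({ instrument, time: `${measure + i}:${beat}` });
    }
  }
  return hits;
}

loopPattern(drumPattern, 2).length; // 12 hits over two measures
```

In the app itself, Tone.js's loop/loopEnd arguments do this re-scheduling for you; the sketch just shows what the loop works out to.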
23:26
Let's loop it: drum loops. Often in a song you have a riff, like a drum riff, a pattern that plays over and over again in a loop. The way you do that here is to pass the loop argument to the instrument, and
23:44
it passes that down to all the parts. Then we schedule these to all play simultaneously. The default start is zero, so I haven't specified that here, but what I have done is specify notes to play at various
24:00
times: the kick is on the first beat, the snare is on the third beat, and the hi-hat is on every beat. This is actually a very standard rock beat. Let's see what that sounds like. A very
24:26
standard beat. Cool. All right, let's put it all together. I've actually composed a song with Ember, which I'm going to demonstrate in a minute. I've created routes and controllers for the different sections: the intro, the verse,
24:42
and the chorus. And I've set it up to auto-advance among the sections, so it plays an intro, a verse, a chorus, and then a final verse. The routes handle all the timing and advancing; there's some timing involved in getting these sections to play in relation to each other, and it's all
25:04
handled by the routes, and then the advancing, which is just transitionTo. The controllers are really just there to inject the instruments, the services, to get them into the templates. And I set this up with a parent route for the drums, because I want those drums to loop all
25:24
the way through the whole song without redefining them for every section, and then child routes for each of the sections. So let's have a listen. ...And there we go, there's a song. So, future plans: I want
26:50
to write more songs. This was actually a lot of fun once I got things set up and going; it was fun to build, and really fun to work with. I really want to turn this into an addon and put it out in the community; I
27:01
think it would be great for other people to be able to play with this. I want to continue adding features; there's a lot more we could do, and a lot more I could get into the template to make it usable. And I want to try collaboratively writing songs using GitHub. I think that would be pretty cool. It's probably been done before, I'm sure, but I think this would
27:23
be a neat way to do it. So I want to thank you all for listening to my talk. I'm going to put all of this up at jamescdavis.github.io/embersong, and if you want to follow me, I will tweet when I release this as an addon and do more work on it. It's jamcdavis on
27:43
Twitter, without the E, and everywhere else, Discord and GitHub, it's jamescdavis. Thank you very much.