
Functional Programming the Long Road to Enlightenment: a Historical and Personal Narrative


Formal Metadata

Title
Functional Programming the Long Road to Enlightenment: a Historical and Personal Narrative
Series title
Number of parts
170
Author
License
CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You may use, modify and reproduce, distribute and make the work or its content publicly accessible in unchanged or modified form for any legal and non-commercial purpose, provided that you credit the author/rights holder in the manner they specify and pass on the work or this content, including in modified form, only under the terms of this license.
Identifiers
Publisher
Publication year
Language

Content Metadata

Subject area
Genre
Abstract
This talk outlines developments in programming from the beginning of programming (in 1948) to today. In particular I'll talk about the development of Erlang, about the trends in programming that led to Erlang, and what these trends mean for the future. Work on Erlang started in 1985, so we'll turn the clock back to 1985 and see what the world looked like then. C++, Java, Haskell, JavaScript, Ruby, Python and Perl had yet to be invented. Most people believed, incorrectly as it turned out, that Ada and PL/1 would dominate the future. But I was more interested in Prolog and Smalltalk. Prolog was the answer, but I didn't know what the question was. I'll talk about how we grew a programming language, and what the main factors were in spreading the language. I'll relate my personal experience with Erlang to broader developments in programming and try to see how the two fit together. I'll also take a peep into the future and speculate about where computing is going.
Transcript: English (automatically generated)
Hello, can you hear me? Anybody hear me? Anybody not hear me? Put your hands up if you can't hear me. Yeah, I'm giving two talks here, so I hope I've been given the right talk. This is the one about functional programming, and I've sort of tried to interleave what I've done in functional programming with the history of what was happening at the time. So I'm gonna go back to about 1985, when I started doing this, and kind of give it a historical perspective. When I started programming there were only a few programming languages. I had to choose between Fortran and assembler and COBOL, and now I think there are 2,500 programming languages to choose between. It's actually much easier to choose between three than 2,500, and of those there are probably only about 30 or 40 that are worth using anyway. And that's tomorrow's lecture: how we got into this colossal mess and how we've totally fucked up all the software structures with far too many programming languages. And I'm one of the people who've been making programming languages, so I've kind of contributed to this. So tomorrow I'll admit my sins and possibly talk about some little ways of getting out of that, but that's tomorrow's lecture, not today's lecture. So this lecture is all about, well, the title is Functional Programming: the Long Road to Enlightenment. And I'm going to tell you about two things. I'm going to tell you a little bit about the history of functional programming, and I'm going to interleave that with my own personal involvement with it,
so you can kind of see how things fit in. So I'll run it along a sort of historical timeline, and that timeline starts back, for my part, in about 1985. This slide actually is quite an old slide. I've used it quite a lot, because it represents the world as it was in 1998 or 1999, when I gave an invited talk at ICFP, the International Conference on Functional Programming, where I gave the history of Erlang. And it stops there for a reason, which I couldn't reveal at that conference. So I shall reveal why it stops there, and then what happened after that. So I'm gonna go back to about 1985. I used to be a physicist, and I got a job at Ericsson in the Computer Science Lab, and that was a newly started lab, and so we were just kind of messing around with programming languages. So here I am in 1985. I was a young lad of, you know, how old was I then, 30-something, you know, with a glint in my eye, and didn't know anything about programming. I used to be a physicist, and I was working as a computer scientist. It's quite fun.
If you look, oh, sorry, just back off a little bit. If we look at the decades of programming, not many languages have really survived into the future. Some of them took ideas with them and they've lasted forever; others kind of died, or they're still around because there's legacy code, but they haven't influenced future languages. So we have these decades of programming, and starting off in the 1950s, that was the first stage: programming languages like LISP, Algol, COBOL, Mercury Autocode, things like that came along. And what lived into the future was, I guess, FORTRAN and COBOL and LISP. LISP was the progenitor of all the dynamic languages, and FORTRAN was the progenitor of the statically typed imperative languages, and so on. And then in the 60s, APL and PL/1. At the time, of course, everybody said PL/1 is the language of the future, everybody will be programming in PL/1, and it wasn't the case. It didn't work out that way at all. The 70s: BASIC, Smalltalk, Scheme, the Bourne shell, C came along. Smalltalk, of course, the progenitor of the object-oriented languages. I mean, this was object-oriented done correctly. It was replaced by Java and C++, and object-oriented done incorrectly. Smalltalk only got one thing wrong, which was the concurrency model, but apart from that it's a pretty decent language. And then, oh, wait a minute, well, it's a language which has survived today and is still used. I should have put Prolog in the list, which is 1972, but I don't think many people use it today. It's highly influential, and it's one of those nice languages that should have survived, but unfortunately not many people use it. It has survived in niche domains like constraint logic programming; it's used to schedule airlines and things like that. Basically, Prolog is so good, there aren't any problems that are worthy of its use. Robert Kowalski said that Prolog was a solution in search of a problem, but there weren't any problems that were difficult enough for it. So people use Java and things like that instead. And then in the 90s, Haskell and a load of scripting languages came along.
The real sort of Haskell-type languages came relatively late; they came in the 1990s. The logic programming languages came in the 70s, so actually 20 years ahead of the functional programming languages. And I'm not talking about functional programming as an academic discipline; of course there's Church and the lambda calculus from the 1930s, but nobody knew about that, because you couldn't actually run any programs. And the typed lambda calculus, 1936 or something like that. And then in 2000 we've got C#, and Scala, and Go. And 2010, I don't know which languages will survive from 2010, possibly Julia, which seems to be a very nice language. Well, I'd have to give a lecture in 2030 to know. I probably won't be alive then, so somebody here will have to give the lecture in 2030. Right, so these are the kind of significant languages. I found this on the net somewhere; somebody's just made a list, and I'm just looking at the dates when they came in. 1957, Fortran. The first dynamic language, Lisp. 1960. The first logic programming language, Prolog; whoops, the arrow points wrongly, 1970; I just said 72, it says 70 there. Smalltalk, 80; actually Smalltalk-72, which surprisingly was done in 1972, you could guess that from its name. And Standard ML came in in 1984, Haskell 1990. Talking about functional programming, really there wasn't much functional programming before ML, I think Standard ML in 1984. Of course, although it came out in 1984, I didn't know about it in 1984.
So when I started work, which was in 1985, what did I know about? Well, I knew about logic programming, I knew about Prolog, and I knew about Smalltalk. And of course everybody knew about Ada and PL/1 and the Algol family of languages. So I didn't actually realize I was a functional programmer; I was a closet functional programmer for five years, because I started off with Prolog and Smalltalk, kind of merged the two together to make a parallel logic programming language, and that with time became more and more functional. So I sort of slowly moved over to functional programming.
Right, I just thought this was nice as well. If you're functional programmers, there's a very nice talk by David Turner which gives the history of the road to Haskell, and it starts, of course, with the lambda calculus, or the typed lambda calculus, in 1936. These are from his lecture. Lisp came in 1960. Lisp is not actually an implementation of the lambda calculus, because it turns out that McCarthy didn't actually know about the lambda calculus at the time and wasn't influenced by Church in the slightest; it was an independent invention. Then Algol 60. ISWIM, "if you see what I mean", Peter Landin; he wrote this great paper, The Next 700 Programming Languages I believe, and he had this ISWIM notation, which was the first. And then PAL and SASL, and in the late, 1969 to 80, NPL, Hope, and so on; they came from Edinburgh. And by a strange coincidence I had a job in Edinburgh, because I was a physicist and I just got a job working at Edinburgh. So I was fortunate enough to learn Prolog from Robert Kowalski when Prolog was very early. As students there, we were all going like, well, Kowalski's either mad or he's a genius. We couldn't figure out which of the two it was, because we didn't really understand what he was talking about. It's only years later I've realized that he certainly wasn't mad; I mean, he was very, very smart. And then Rod Burstall did this new programming language, and then ML, and then Hope, and then Miranda came out of that, and then Haskell came out. And it's quite interesting, I talked to David Turner about this, and he remembers coming to Ericsson, giving us a talk on SASL when I was doing Erlang, and we were exchanging ideas there. We both remember this. And we got talking about types, dynamic types and static types: what do you think about types? He's like, oh well, ah, doesn't really matter, you know. So SASL was dynamically typed, Haskell statically typed, and you know, it's not a religion; I mean, it's just a sort of practical thing, they're equally good, he said. I thought that was quite nice, because he was a father of the polymorphic statically typed languages and yet he
really wasn't very religious about it; a very practical man. And then, okay, so that's functional programming. Logic programming actually came earlier: 1972, Alain Colmerauer and Philippe Roussel, based on the work of Kowalski. Kowalski made a theorem prover for Horn clauses which could be programmed in an efficient way, and out came logic, I mean sequential logic, Prolog. Because there's no time or anything like that, Prolog turns out to be quite easy to parallelize. So the first parallel logic programming languages came out in the early 80s: Concurrent Prolog and Parlog and KL1, Flat GHC. These are languages that very much influenced Erlang, not in the syntax or anything like that, but in the implementation. Because when we were doing Erlang, we were looking at how Parlog was implemented and how KL1 was implemented to gain inspiration.
This is all before I started work; this is prehistory. In 1982 something very significant happened. Japan, the Ministry of International Trade, started an 850 million dollar project to create a massively parallel computer based on Prolog. This was a Japanese effort, and it came out in 1982. And this guy, Ehud Shapiro, went off to Japan, was sent there, and he wrote a study and reported back, and he said: as part of Japan's effort to become a leader in the computer industry, the Institute for New Generation Computer Technology has launched a revolutionary ten-year plan for the development of large computer systems which will be applicable to knowledge information systems. This was the fifth generation. And that report didn't have much significance, I think, until Feigenbaum wrote this book, The Fifth Generation, which came out, I can't remember the date, I think it's '83, something like that. Yeah, I think it's '83. This book caused a storm in America, and they said the Japanese are going to build this thing, this massively powerful parallel computer that can do everything, and we have to do something about this. And so, as a result of that, in the US, very quickly they formed the Strategic Computing Initiative, with DARPA funding, which was going to fund it with 1 billion dollars from 1983 to 1993. And one of the people in there wrote: the machine would run 10 billion instructions a second to see, hear, and think like a human. You know, it was going to do real-time natural language translation and all these things. It was fantastic. Wow, this is great, once we've built this mega-LIPS Prolog machine. I just put there: in 1987 they cut all the funding for this project because they weren't getting anywhere, and the project kind of fizzled out by 1997. But in '83 there was loads of money around, and in Britain the Alvey project started, and they chucked in 350 million pounds to build a machine, a declarative machine, to run Hope. Hope was a sort of, it came out of ML, a new programming language from Edinburgh. Hope was called after Hope Park Square, which is in Edinburgh,
with the Department of Machine Intelligence and Artificial Intelligence centered around Hope Park Square. And they were going to make this declarative architecture that would execute declarative, functional, and logic programs extremely quickly. And if you look at Hope, that's a factorial function in Hope; it looks like that.
And you'll notice it's very similar to what factorial looks like in Haskell or Erlang or any of these programming languages. So they all kind of date back from there. So that's the prehistory, and so then I arrive at Ericsson in 1985. The projects haven't been cancelled, because it's not yet up to 1987, where they realized none of this stuff was going to work and cut all the funding. And the funding is absolutely amazing. So it's really good, and everybody's rushing around going, what the hell, you know, Prolog, Prolog, Prolog, let's build declarative machines and do all this stuff. So I sort of arrived there and I learned Prolog. Wow, everybody's gonna do everything in Prolog in the future, I thought. Turned out to be completely wrong, but that's what I thought at the time. So here we are in 1985, and at the time everybody said, well, Ada and PL/1 are going to be the languages of the future. Everybody's gonna be programming PL/1 forever. And, you know, how many of you program in PL/1? How many in Ada? Oh, one person, very good, thank you very much. Yes, right. So I take it with a pinch of salt when somebody says we're all gonna be programming in Java++ in the future or whatever, because these things last a few years and then something better comes along. Well, if something better doesn't come along, we're sunk, because we need better stuff.
In 1985 IBM was at its height. It had 450,000 employees worldwide, and it actually started sinking after that. Microsoft Windows was released in November, and machines had a few megabytes of memory. So this is 1985. This is a typical PC. Typical, no, it wasn't typical; it's a PC you could buy if you had lots of money. It's not the one you bought at home; it's one they bought for you at work. And it had a colossal 256 kilobytes of RAM. You could extend it up to three megabytes by adding a few expansion cards, and you needed a lot of these big, big cards you plug in. And it had this blazingly fast 8 megahertz clock, and you could have a 20 or 10 megabyte hard disk. That's 1985, actually. So you have to ask yourself, you know, that thing in 1985 would boot in 120 seconds. Processors today are 10,000 times faster, so it should boot in 12 milliseconds. But my machine doesn't boot in 12 milliseconds. So what the hell went wrong? I'll be talking about that in my lecture tomorrow, what the hell went wrong; different subject. So, using machines like this, I wanted to program this. Of course I was working for Ericsson; this is a telephony
flow diagram. It shows three parallel processes. Time is proceeding down the screen, and messages are sent between the three communicating finite-state machines, and I wanted to find a convenient way to program that. So, right about 1985, I discovered Prolog and I started writing these things in Prolog. I was trying to dig out the first version; I've completely lost the first version. So the earliest thing I could find is this documentation for version 1.06, and in the comments it says: version 1.03 lost in the mists of time. So I haven't actually got anything that goes back beyond that; that's the first ever thing that I could find. Yeah, so that's kind of round about 1986. I was actually developing a programming language, but I didn't know I was developing a programming language at the time. If anybody had said that in 30 years' time or 20 years' time lots of people would be using it, I'd have said they were mad, you know, because I didn't even know I was making a programming language. So, having developed a programming language, the first significant thing that happened was we had the good fortune to come into contact with a group of people who wanted to use
the technology we were developing. This is Kerstin Ödling, and she had an application; she wanted to build something. Now I reckon, you know, it's very good, if you're making a new technology, to get a user. You know, computer scientists will just invent stuff if they're given nothing to do. Computer scientists will just invent stuff which is totally useless, right, because they don't actually have any problems. Given no problems at all, they will invent their own problems and solve them. Right, so you want a real human being of flesh and blood, someone who isn't interested in programming, isn't interested in computer science, and wants to build something, right, and then you cherish them. So this is Kerstin Ödling, and what she wanted to do was program this thing. This is a telephone exchange, it's an MD110, and she wanted to work out a better way of programming these. These were programmed in a language called PLEX. PLEX was, for its time, it comes from 1978, a very good programming language. It's probably the precursor of a lot of object-oriented programming languages, but it was proprietary, it was secret. Nobody knew anything about it; for that reason it never spread. But it had certain concepts in it, like blocks and signals, which have correspondences in languages like Prolog, sorry, not Prolog, Smalltalk and languages like that. And above all it had memory protection between processes. So Erlang's got a lot of things that came from PLEX, actually; the memory protection in particular came from PLEX. So she wanted to program this thing, and she was a great fan of what you'd call fishbone diagrams. These are logic trees, sheer decision trees; they haven't got any cycles in them, things like that. She would write diagrams like this. That's a bit of a fishbone diagram; it's just a finite state machine with no cycles in it. But what you want to do is execute lots and lots of them in parallel, so you've got thousands of them running in parallel. So how do you run lots and lots of these in parallel? That was the question
I wanted to answer: given a finite state machine, how do you do that? Well, one way of doing it is to run it and pull it out of a database. That's not a really good way of doing it. There are other ways of doing it; the way we hit upon, or thought up, was just to put this in a process; it's in memory. We need a very lightweight mechanism for doing this. So I wrote this in Prolog, and, oh, there we are back again. If you look at that diagram there, it turns out to be a bit of code that looked like that. That's Prolog, actually, because Prolog has infix operators, and so I could do that in Prolog. But Prolog is a sequential language, and so the way of implementing concurrency was: you execute a sequential process, and then when it runs out of things to do, I mean, more or less run to completion, actually run for a certain number of inferences, or until it wants to receive a message and there is no such message, and then suspend it.
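The suspend-and-resume scheme he's describing, run each sequential process until it wants a message it doesn't have, park it, and round-robin over the rest, can be sketched with Python generators standing in for lightweight processes. All of the names here (`Proc`, `scheduler`, `phone`) are invented for illustration; the original was written in Prolog:

```python
from collections import deque

class Proc:
    """A lightweight process: a generator body plus a mailbox of messages."""
    def __init__(self, body):
        self.body = body        # generator; each `yield` asks for the next message
        self.mailbox = deque()  # messages waiting to be received
        self.alive = True

def scheduler(procs):
    """Simple round-robin scheduler: a process runs until it wants to
    receive; if its mailbox is empty it stays suspended and the
    scheduler just moves on to the next process."""
    started = set()
    while any(p.alive for p in procs):
        progressed = False
        for p in procs:
            if not p.alive:
                continue
            try:
                if id(p) not in started:
                    next(p.body)                      # run to the first receive point
                    started.add(id(p))
                    progressed = True
                elif p.mailbox:
                    p.body.send(p.mailbox.popleft())  # resume with one message
                    progressed = True
            except StopIteration:
                p.alive = False                       # the process ran to completion
        if not progressed:
            break                                     # everyone suspended: nothing to do

def phone(log):
    """A tiny cycle-free 'fishbone' state machine for one phone line."""
    state = 'idle'
    while True:
        msg = yield                                   # suspend until a message arrives
        if state == 'idle' and msg == 'incoming':
            state = 'ringing'
            log.append('ring')
        elif state == 'ringing' and msg == 'answer':
            state = 'connected'
            log.append('connected')
        elif msg == 'hangup':
            return

log = []
p = Proc(phone(log))
p.mailbox.extend(['incoming', 'answer', 'hangup'])
scheduler([p])
print(log)  # → ['ring', 'connected']
```

Running thousands of these state machines "in parallel" is then just a matter of putting thousands of `Proc` objects in the list; no OS threads are involved, which is the lightweight property the talk is after.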
Suspending it means putting it in a database, and then you just run a simple round-robin scheduler. So I wrote that stuff, and then by 1988 this was delivered to Kerstin Ödling and her group, and they built a telephony exchange with it, a prototype of that. But it wasn't very quick. Here you can see what I called the multiprocessor version. This is the first version that had parallel processes in, and as you can see, it took four days for a total rewrite of the language. So I mean, that's just a smallish Prolog program, maybe a couple of thousand lines of Prolog; you could rewrite it all in a few days. And it was pretty slow. It was doing 245 reductions a second, a reduction is just a function call, but it was fast enough to be able to prototype this telephone exchange that we wanted to build. In 1988 Ericsson took a decision to make a product based on that. We decided that we would have to speed up the implementation language; it was far, far too slow. It had to be about 8,000 times faster to make it into a product. Then we needed documentation and all this other stuff. So we started two activities in parallel: one was the programming of the product, which we expected to take about two years, and the other was speeding up the virtual machine, which we also expected to take about two years. We thought they would converge, it'd be fast enough, and
We could do all this stuff. So 1998 we started that project. We need to develop courses. This is before PowerPoint so It was just sort of hand-drawn overheads on on Slides, I'm not very good artist, but I quite fun doing the slides I think I think PowerPoints just sort of killed all creativity
You know, this was a head and a tail of a list. And we needed documentation; they said, you've got to have documentation. So I wrote the complete documentation of the first Erlang system, which fitted onto one page, actually. So that's the first documentation. I mean, it's now 120,000 files of XML or something, so it's a bit bigger than that. But that was good enough, and the users were very good: they didn't complain if you changed the language, you know, if the next version of the language wasn't the same as the version the week before. They just happily swallowed that without complaining, so it was very good. And we had small machines. Remember I said that machine was pretty small: stuff we didn't use, we removed. Nowadays you've got whacking great big machines, full of all this crap that you should throw away. You don't have to throw it away, because your machines are so big you can swallow all the crap as well as the good stuff. But in those days we had to throw away all the rubbish, because space was limited. Right: performance. Well, performance was absolutely lousy.
This is a sort of time plot, up to the late 1980s. It just shows the number of reductions per second and the technology we used to get there. The green and red: green is experiments, and red is production, when they put this stuff into production. And you'll see that there's a gap, about a year-and-a-half period, when you mess around with an interpreter or an implementation technique, and then you deploy it, and you've got this phasing in and out of different technologies. So the one thousand reductions per second: that was the interpreters.
We did a failed experiment with Strand. Strand was a logic programming language developed by Ian Foster, based on KL1 and the Japanese fifth-generation languages. And that's the first and last time I ever predicted how far something would go before I had implemented it, because we confidently told people how fast this thing would go before we'd implemented and measured it, and we were completely wrong. So I've never done that since. All the project managers keep going: well, when will your program run? I don't know. How fast will it be? I don't know. You just stonewall forever. If a project manager asks you when your software will be ready, just say: I don't know, I've never done it before. And never give in, right, because you always get into trouble. If you say it's going to take six weeks and it takes two years, you get into trouble, and even if you say six weeks and it takes two minutes, you know, you also get into trouble, and you never know. So just say you don't know, right? So that was a failed experiment; there was no production. And then there was the JAM, the Joe's Abstract Machine, based on the Warren Abstract Machine, and that was in production for quite a long time, until it was replaced by a better machine. And here we get up to about 1988. So, I don't know if you want to know how Erlang works.
Anybody want to know how Erlang works, briefly? Oh, jolly good. So this is how Erlang works. Each process has got a stack and a heap and some registers, and all of that fits into about 350 bytes, okay, and it's pre-allocated. Actually, I think we pre-allocate a kilobyte per process: the stack and the heap, which grow towards each other, and a set of registers. Let's just look at how the JAM worked, actually. Erlang's got terms, symbolic terms; think of them as structs in C. Here's one containing an atom, rectangle (that's a symbol), and a couple of integers, 10 and 20.
So in memory that's represented as a tagged pointer. The tag says: hey, I'm pointing to a tuple. That can be on the stack or the heap, and there's an address. At that point on the heap (in fact, for a tuple, it's always on the heap) it points to a tagged word that says: hey, I'm arity 3, I'm a struct with three things in it. And then there are three tagged words: one says, hey, I'm an atom, with a pointer into an atom table; and another says, hey, I'm an immediate integer, and the value 10 is right in there. This is for a 32-bit or a 64-bit machine. So that's the memory organization.
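To make the tagged layout concrete, here is a toy model in Python. The names are invented, and (tag, value) pairs stand in for the tag bits that were really packed into 32- or 64-bit machine words; this is an illustration of the idea, not the real JAM memory format.

```python
# Each heap "word" is a (tag, value) pair: a stand-in for real tag bits.
def make_tuple_term(*elems):
    # A tuple is a tagged pointer to an arity word followed by the elements.
    return ("TUPLE", [("ARITY", len(elems))] + list(elems))

atom = ("ATOM", "rectangle")          # really a pointer into the atom table
term = make_tuple_term(atom, ("INT", 10), ("INT", 20))

tag, words = term
print(tag, words[0])   # TUPLE ('ARITY', 3)
```

Everything carries its own tag, so the emulator can tell an atom from an immediate integer from a boxed tuple just by looking at the word itself, which is exactly the property described above.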
So if you wanted to build that tuple, you could say: push integer 20, push integer 10, push atom rectangle, make tuple 3. This is pretty much how the old Warren machine used to work: you're just moving things between the stack and the heap with instructions. Okay. So when you've done all that, you're going to have the built object, the tuple: a tagged pointer that points to something on the heap that says, I'm a struct three words long, and then, I'm an atom, and here's a pointer. Everything's fully tagged, and it's just sitting on the heap; it doesn't take much space. And the code to do that: let's suppose we've got a function foo that returns a tuple of 10 and abc. The complete code just says: enter foo. That's just a label that says this is the start of the foo function.
What are you going to do? Push atom abc, push int 10, make tuple 2, and return. Now, each of these things just becomes bytecode. If I take those two instructions, push int 10 and make tuple 2: push int 10 might be the bytecode 16, because it's a bytecode machine, with the immediate value 10 following it; it's a push-short-integer instruction. 20 might mean make tuple, with the argument 2. And then there's just a little C interpreter that interprets that. So it's pretty simple: the compiler literally just spits out bytecode, and the emulator just executes it. It's very much like the JVM, very much like the .NET virtual machine; the instructions are in many instances pretty similar, actually. I mean, push integer just sticks an integer on the stack; it can't really do anything else. So I wrote a compiler for that in Erlang itself, because we had one in Prolog, and then I wrote an emulator for that, in
Prolog again, and here it was: the JAM. You can see from the handwriting. I wanted to call it JOSEPH, Joe's Own Super Erlang Programming something; no, that didn't really fit. And then Joe's Own Engine, JOE; no. So it's called the JAM. And it actually went at about 3 to 12 reductions per second, Erlang reductions per second, when it was interpreted, and we could compile it into this bytecode and execute that, again all in Prolog, and that ran at 35 reductions per second when it was compiled. And we could compile it once a day: when I went home, it would be ready the next morning, so you could run it. But that was good enough: we ran through all the test bench suites, everything, and it ran at 35 reductions per second. But the whole machine design was ready. And so Mike Williams then came along. Mike's my mate who knew C. He read my... well, I was writing this in C.
I had never written C before I started writing it to speed it up; I'd written Fortran before that. My first ever C program was a C virtual machine to run Erlang, and Mike read it and said: this is the worst program I have ever seen in my entire life. So he rewrote it. So now there were three of us. I mean, Robert had joined me about a year before, so he was writing all the libraries, I wrote the compiler, and Mike was writing the emulator. We did that, and now it beetled along at, oh, about a hundred thousand reductions per second. It was really quite quick. Well, C is actually better than Prolog for implementing virtual machines, you know; it's almost as good as assembler. So it's pretty good. So here we are: we're at 1990.
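The whole JAM pipeline just described, stack instructions compiled to bytecodes and run by a small dispatch loop, can be sketched like this. The opcode numbers 16 and 20 are the ones mentioned in the talk; everything else (the names, the other opcodes, the Python itself) is invented for illustration and is not the real C emulator.

```python
# 16 and 20 are the opcode numbers from the talk; 21 and 22 are made up.
PUSH_INT, MAKE_TUPLE, PUSH_ATOM, RETURN = 16, 20, 21, 22

def interpret(code):
    """A tiny fetch-dispatch loop, a toy stand-in for the little C
    interpreter: each opcode manipulates a value stack."""
    stack, pc = [], 0
    while pc < len(code):
        op = code[pc]; pc += 1
        if op in (PUSH_INT, PUSH_ATOM):
            stack.append(code[pc]); pc += 1      # operand follows the opcode
        elif op == MAKE_TUPLE:
            n = code[pc]; pc += 1
            elems = stack[-n:]
            del stack[-n:]
            stack.append(tuple(elems))           # build the tuple on the "heap"
        elif op == RETURN:
            return stack.pop()

# foo: push atom abc, push int 10, make tuple 2, return
print(interpret([PUSH_ATOM, "abc", PUSH_INT, 10, MAKE_TUPLE, 2, RETURN]))
# -> ('abc', 10)
```

The compiler's only job is to emit flat lists like the one in the last line; the emulator's only job is this loop, which is why the design could be rewritten in C so directly.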
Yeah, and we've got this thing going. So what happens now? Show the movie... no, I won't show that, I'll hop over it. And we had great fun. This is the lab in 1992. We had to illustrate concurrency for people, parallelism. So we thought: how do you do that? Well, we have a model train and a telephone exchange, and we'll control them from different parallel processes in the same virtual machine. The people who went on the Erlang course always remember this, because of the exercise: you put two trains on the track, one train pointing to the left and one pointing to the right, and the programming exercise was to swap them. There were little sensors that told you when a train passed: real-time programming, actually. Great fun. And there's Robert. And we showed this at a trade fair, together with the telephony,
controlling the exchange at the same time. That was in 1992, so that was quite fun. And then what happened? Well, nothing much happened from 1992 to 1995. We started Erlang user conferences and things, and that's one of the first ones. At that time it wasn't open source; it was limited to Ericsson, and we had one room which held 80 people, and that was the number of people who could come, and we had our yearly conference. Great fun. Then two things happened that have led to Erlang being a successful programming language. The first of these two things happened
on the 8th of December 1995, a project called AXE-N was cancelled. That was a big project. It was going to do everything in C++, had been running for about six years, and a lot of programmers were involved in it. And it was cancelled: Ericsson had built the hardware and the software, and that software just didn't work. By coincidence, we were working in the same building as this project, so we knew all about it, and we'd also been itching to program their hardware. So we managed to get hold of the hardware, after a lot of opposition, and we were programming the same application as they were programming. And we'd done it with six people, and they'd done it with about 800 people. And at the time, well, words were said; we were very untactful in those days, I think, about the differences there. So we got a lot of enemies by mistake, which was coincidental.
But anyway, they cancelled the AXE-N project and decided to keep all the hardware and now do it in Erlang, okay, because we could do that. So we started an Erlang group, and that started this OTP stuff. I moved from the lab into a production organization; we formed this OTP group, I was technically responsible for it, and we started building this OTP system. So in 1996 this project called AXD started, and stuff happened really, really quickly. We built up a group; we retrained 60 programmers, who became Erlang programmers, and they programmed away, and off we went. And from 1996 to 1998 not much happened; they were just building their stuff. This is typical: you have these periods where stuff happens really quickly, and then you have these long periods where nothing appears to happen, and then things happen really, really quickly again. So in 1998... nothing much happened until we get to the end of the slide.
There's a reason why the slide ends there, but I have to go back a bit. See, why it ends here is: in 1998 the AXD is a tremendous success. It works stupendously well and is sold all over the place. And it was such a success that Ericsson banned Erlang, right? And there were reasons for that. Well, two of the reasons were: the first reason was that it wasn't Java, and the second reason was that it wasn't C++, and Ericsson had taken some strategy decisions to only program in Java and C++. So it wasn't what it was that upset people; it was what it wasn't that upset people. So they decided to ban it, and that's why this slide stops there. Because I had to give a talk at ICFP at about the same date, after it had been banned, and so we were in this rather awkward position of going out there and saying why Erlang is great:
you know, Ericsson's using it, we're doing all this wonderful stuff. I didn't really want to say it had been banned, because that would sort of stifle any enthusiasm in the audience: you know, maybe I'll go and try it... and then they find out it's been banned. So I just sort of forgot to tell them that, which was funny. Somebody said C++ has been banned 11 times, so I thought, well, we've only been banned once, I think. Yeah, so we were banned, and that was really bad news. Well, at the time it was really bad news. I remember I couldn't sleep and was getting ulcers, and oh dear, what are we going to do?
Well, what happened after that was quite fun. After it was banned, there was about a four-month period where the computer science lab people went into a little huddle: what should we do? What should we do? Well, we can't fight the technical director, because he's a very big, powerful man. So, I know: we'll all quit and start a company. We want to use Erlang, right? So then, after four months, Erlang became open source, through a mechanism that still amazes me: we actually managed to persuade the Ericsson management, saying, well, if we're not going to use it, release it as open source. And we managed to do that, amazingly; I still don't understand how. And four days after it was released as open source, we all left, by a strange coincidence. And it turned out, you know, like in a Dilbert cartoon, venture capitalists would just hand you money and stuff. Well, that was in the golden days of 1998, you know.
So four days later we formed Bluetail, and the Erlang development split into two groups, basically. The people who had been in the computer science lab all left and started a company, and the people inside Ericsson who had built the commercial product stuff went into flying-under-the-radar mode. They didn't really want to annoy anybody, so they kept changing the name so nobody would know it was Erlang. It was quite good. Seriously, if your project is banned, change the name, okay? No, I'm serious. Years later I was working at Ericsson, and my boss came in and said: this project you're doing, do you want the good news or the bad news? Well, what's the bad news? The project you're working on has been cancelled. So what's the good news? We're starting a new project. Just change the name; it takes six months to catch on, right? It's a very good method. Right, so it split into two, and
now I'm not actually in Ericsson anymore, I'm outside Ericsson, and I haven't been watching carefully what's been going on. Haskell is now becoming pretty popular. Prolog has lost popularity totally; it has sunk like a stone, niched into constraint logic programming. And interest in parallel machines has gone. There was a whole flurry of trying to make parallel architectures; that's gone, because every single project that tried to make a parallel machine failed, and every single research project that tried to parallelize legacy code failed, or managed to get a 15% speedup after massive effort. I mean, people have been trying to parallelize Fortran for as long as Fortran has been around, and they've never got more than 15%. And everybody's arguing about dynamic and static typing and lazy evaluation, and we're all having great arguments. This is 1998. Right, in this period now, from
1998 to 2014, things moved onto the track to where we are today. Bluetail was formed, and after two years Bluetail was acquired for 54 million dollars, which was kind of nice, because we'd formed it and owned stock in it, and it was quite fun. And a guy called Alexey Shchepin (I don't know if I'm pronouncing his name correctly) started building an XMPP server in Erlang; he was in Ukraine. Bluetail was acquired by Alteon WebSystems, Alteon WebSystems was acquired by Nortel Networks, Nortel Networks went bankrupt,
everybody got fired. And then, out of the embers of these groups, three companies formed. Tail-f formed out of the embers of the collapse of Nortel Networks. Klarna was founded. These were founded in 2005; we'll learn more about those later. In 2006 Alexey Shchepin was awarded Erlang User of the Year. In 2007 I wrote a book on Erlang, because I hadn't written one for 14 years.
Then the applications started coming out. Facebook Chat: it was suddenly announced that two guys at Facebook had written a chat server in Erlang; it was deployed, and it was running the chat services inside Facebook. 2008: this book that I'd written about Erlang stimulated the Haskell people, so Bryan O'Sullivan and his co-authors wrote Real World Haskell, and suddenly O'Reilly said, well, why aren't we publishing books on this stuff? I'd been saying for years: publish the stuff, get the books out so people can learn to do it. And a whole movement started building up then.
In 2009 WhatsApp was founded, actually quite interestingly, for a number of reasons. One was that they were looking at the Facebook chat server at the time, and that seemed to be good: it was the highest-performing XMPP server that you could make on the planet. Okay, so a company called ProcessOne had taken that Erlang server and was competing with a Java server. Now, the Erlang server was not only faster than the Java server, I think about four times faster, but it was free, and the Java server cost money. So they had a product that was four times faster and free, and that swept the world, and we had sixty, seventy percent of the XMPP market. XMPP is this instant-messaging protocol, an XML-based protocol; corporations tend to use it more, and there are open Jabber servers, for example. That XMPP server was the basis of the Facebook chat engine.
And at that time, oh, there were now one or two books. I'd written my book, and there was one by my good friend Francesco Cesarini, with Simon Thompson. Francesco was a master's student that I supervised at Ericsson, and he went on to form Erlang Solutions. He subsequently wrote Erlang Programming, or Programming Erlang, I can never remember which is the O'Reilly book. And so the WhatsApp people said: whoa, okay, we're going to build this application. And they started building it in Erlang, with ten engineers, in... when did I say, 2009?
The same year as the Erlang books came out. And then a load more books came out, and then in 2014, of course, WhatsApp was acquired for 19 billion dollars by Facebook. Which is ironic, really, because by then Facebook had dropped Facebook Chat and reimplemented it in Java or C++ or something. But the reason for that was, they said, that the Erlang stuff didn't fit into their infrastructure. So what they had built themselves, with their own engineers, they then bought later for 19 billion dollars from somebody else. Well, not only the technology, but the user base, which I think is interesting, to say the least.
So there we go. If you look back at this, you'll see on this timescale that when you're developing a technology, it seems to go in periods. There's a gap of about three to five years before you rush into a new area: we're going to use this new technology, you get all excited, you start a project or a company or something like that, and then nothing seems to happen, because you've got to do three to five years of work before there's a result. And then it pops out, three to five years downstream, and there's a massive flurry of attention. Erlang's gone like that over the years. We keep going: oh, nothing's happening, nobody's using it. And then suddenly people start tweeting: WhatsApp's been sold in the largest acquisition ever, and it's all programmed in Erlang, you know? Oh, well, that's cool. And then the Erlang user conference in San Francisco was suddenly quite funny. They had parallel sessions, and the WhatsApp guys were talking about how they'd built the WhatsApp application in Erlang, and there were some parallel sessions, but not many people went to them. In fact, no people at all went to the parallel sessions, and I think they'd scheduled the WhatsApp talk in the smallest room or something, so there was a great room change at the last moment. Anyway. So, how it spread: it's actually been rather like
the spread of printing from Gutenberg, who made the first printing press. There was a 14-year cycle: it took seven years to be an apprentice, and once you'd been an apprentice for seven years you had to work for your master, or whatever it's called, for a seven-year period, and then you could go and start your own company. So about every 14 years there was a doubling. What we've seen in the Erlang development is similar. The Erlang DNA started in the computer science lab and stayed there for quite a long time. Then it split into Bluetail (this is outside Ericsson, by the way). Bluetail lasted three or four years, and it split into Tail-f and Klarna; this is the sort of genetic material. Klarna is an online banking company in Sweden, the hottest IT banking startup in Sweden; it now employs, oh, I don't know, lots and lots of people, and it's expanding. The back end is written in Erlang, with front ends written in Java and things like that. Tail-f does NETCONF and sells into Cisco and places like that. And then Erlang Solutions sort of split off and does consulting. And there's also Basho, which did the Riak database,
which found itself all over the place. There are other things outside that family tree: CouchDB, for example. There are quite a few databases written in Erlang. CouchDB is written in Erlang, and was chosen by CERN for the Large Hadron Collider experiments, and so helped discover the Higgs boson. So Erlang has helped discover the Higgs boson, which is quite nice, because I'm an ex-physicist. Where have we got to? Oh, yes, the technical side. In the same period that these companies and things were going on, I think we've come back full circle now to looking at parallelism again, because
the thing that Erlang is pretty well suited for, which gave it a boost in about 2004, 2005, was how to handle concurrency, because the multicore computers came along. In 2007 Intel made, I think it was called Polaris, I can't remember the name, an 80-core network-on-chip architecture that did 1.1 teraflops at 62 watts. That's totally amazing performance. In 2007 Tilera came out with the TILE64, and we managed to get hold of the first production batches of those things, and we took it into the lab and ported Erlang to it. We took an application we had left over in the lab and ported it to the TILE64. Without doing anything, it ran 33 times faster, okay, just the first time; we didn't touch the code at all. I was very pleased about that. We showed it to the management, and they said: why doesn't it go 64 times faster? There are 64 cores. And we said, well, you know, Amdahl and all that. Who's he? Amdahl's law? Never heard of it. Well, yeah, but I want it to go 64 times faster. So we're kind of working on that, making it go faster. Now I'm looking at Adapteva, which has made this Parallella board. I think these network-on-chip architectures are really interesting.
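Amdahl's law, which the management had not heard of, answers their question: if a fraction p of the work is perfectly parallel on n cores, the speedup is 1/((1-p) + p/n). A hypothetical back-of-envelope check, not taken from the talk: the observed 33x on 64 cores corresponds to roughly 98.5% of the work being parallel.

```python
def amdahl_speedup(p, n):
    """Speedup when a fraction p of the work is perfectly parallel on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

print(amdahl_speedup(0.985, 64))   # about 33: matches the observed port
print(amdahl_speedup(1.0, 64))     # 64.0: only 100% parallel code scales fully
```

So "why doesn't it go 64 times faster" has a precise answer: even a 1.5% sequential remainder caps a 64-core machine at about 33x.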
In 2015 we'll probably see 1,024 cores commercially available, running at about 15 watts. These are supercomputers on a single chip, and I think the key to the future is actually low energy. The people making the chips say these are high-performance chips, but nobody can think of applications that need that performance. I mean, unless we do whole-brain simulations and things like that, which need massive computing power, there are very few problems that need that amount of computing power. On the other hand, we do need low-energy computation, and low-energy computation can be achieved by massive parallelization and by dropping clock frequencies. So there's a new generation of processors which are highly parallel, with very low clock frequencies, and which have different, funny cache behaviours. Learning to take code to those, learning to program those, is essential for generating low-energy applications. So I think that's very interesting, and the Erlang programming model actually fits into that quite nicely, because caching is extremely important there, and Erlang processes just have their own little stacks and heaps. So, having written your program with processes, you've already ensured that those processes are cache-friendly: they can't look at anybody else's memory or anything like that. So I think Erlang is well placed for that. And now just a few observations. The predictions that I
thought people were making in 1985 were almost completely wrong. The predictions as to which languages would be used in the future turned out to be completely wrong. Predictions about the use of legacy code turned out to be completely wrong. I remember quite clearly, when I worked for Ericsson, talking to the head of strategy, and he said: in the future, everybody will be programming PLEX, because that was the language we programmed in. And I said: no, we won't. He said: yes, we will, it's company strategy. New products will be programmed in PLEX, as they always have been, and we've promised our customers backwards compatibility with PLEX. And I said: no, it won't happen. In 20 years' time, not a single line of PLEX will be written. I was right. We don't write any PLEX at all: zero, zilch, zero. The only PLEX we've got, we've bunged into virtual machines. We don't dare change a single line of code, because we'll introduce 2.5 errors for every line of code we change. We put it in a virtual machine, and nobody knows what it does, and hopefully one day we'll chuck it in the bin, because it's rubbish. Well, it's not rubbish: it works, but nobody knows why it works. That's my lecture tomorrow: the people who wrote the stuff are dead, and there's no spec.
Led to to surrounding escaping and things were totally unplanned and were non-technical in nature They were things like being banned. There are things like Projects failing and running in when they fail all this good argument just never worked ever
Well it did but you sort of do it, you know this engineer stuff. What's the problem? list the sort of 10 best solutions take three of them study them and build Prototypes and then go to the management with the results of the best prototype and they'll go and do that That doesn't work at all Wait for a crisis and run in quickly
That's what you should do look for projects that are gonna fail. That's a good tip, you know, sort of When all else fail, you know when they're drowning when all else fails save me So they rush in that's that's the time to do it Some predictions of future technology were correct. I remember remember my some old blokes and go parallel young man
you know, the future is parallel. We said that in about the mid-1980s, and it's completely correct: that's where the future is, parallel computing. We're going into this transition period of trying to learn how to program parallel computers. I noticed that a couple of days ago Apple released this Swift, a parallel functional programming language, and I went: oh, it's great, you know, Apple's joined the functional programming bandwagon. Yeah, they have; but instead of climbing on board with the Haskell people and making a Cocoa bridge, which would be good, they make their own one. And they will gain market share due to that, and in 15 years' time we'll be cursing them, because they'll have all this legacy code that nobody knows how it works, and we'll have changed language yet again. Can't they just join forces, please? You know, join the Haskell people, join the Erlang people, join the ML people. Don't make your own mistakes; benefit from the previous mistakes that we've made.
Yeah, these large legacy systems we've built will have totally collapsed and won't be used. I'll talk about this tomorrow. Management has this weird view of fixing up legacy code: we've got these millions of lines of code, and you want to rewrite it all? Yes. Is it going to be quicker? No. Yes. No. And they don't do it, you know. So the new startups don't take the legacy code; they just build from scratch in the best technology of the day. Some of those are going to win, okay? A Darwin-type survival of the fittest will happen. Software hit complexity boundaries years ago, and it's in a complete mess. Functional programming offers some slight improvement on that. It's not proof, it's not proof; it's fewer things you can shoot. You know, strong type systems and things like that provide you with fewer ways to blow your foot off by mistake. I mean, in tomorrow's lecture, I
think, you know: three 32-bit... no, six 32-bit integers in C have more possible states than the number of atoms on the planet, okay? But this machine here has got a 250-gigabyte solid-state disk. The number of states this machine can be in is 2 to the power of 250 gigabytes, right? That is more than the total number of states in the universe, okay? So when I've got a problem on my computer and somebody says, well, I had the same problem, I tried this and it worked, and I try it and it doesn't work: that's because our machines are in different states to start with. We need mathematics. We need strong tools to prove systems correct, and we don't know how to do that. And we need to compose systems from small bits, where we try to prove those bits correct, and build them together, and we don't know how to do this. And my generation of programmers, over the last 30 years, has created trillions of years of mess that you guys are going to have to clean up. I'll talk about that tomorrow.
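The state-counting claims above can be checked with a quick illustrative computation (the atom counts are rough order-of-magnitude figures, not exact values):

```python
# Six 32-bit integers have 2**(6*32) = 2**192 joint states.
states_six_ints = 2 ** (6 * 32)
atoms_on_earth = 10 ** 50                 # rough order-of-magnitude estimate
print(states_six_ints > atoms_on_earth)   # True: ~6.3e57 vs ~1e50

# A 250 GB disk is 250e9 * 8 = 2e12 bits, so 2**(2e12) states; even the
# ~1e80 atoms estimated in the observable universe is nothing next to that.
bits_on_disk = 250 * 10 ** 9 * 8
print(bits_on_disk)                       # 2000000000000
```

The second figure is the point of the argument: the *exponent* alone is two trillion, so exhaustively reasoning about machine states is hopeless without composition and proof.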
So so I think that's about what I wanted to say, so thank you very much questions
What? Oh, sorry. Yes Oh, yeah, the question with the strings are just lists of characters Why? Well, cuz that's the correct way to do it No, I mean a string well in Erlang a string
It's just a list of integers where the integers are code points and they can be Unicode code points or Latin one code points or whatever you want, but there's no notion of a UTF-8 string or any particular encoding there Though they're just syntactic strings don't exist. Okay, basically strings don't exist in any language
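What "a list of integers, where the integers are code points" means, sketched in Python for illustration: in Erlang itself, the literal "abc" simply denotes the list [97, 98, 99].

```python
# In Erlang, "abc" is sugar for the list [97, 98, 99]: one integer code
# point per character, with no particular encoding implied.
def as_code_points(s):
    return [ord(c) for c in s]

print(as_code_points("abc"))   # [97, 98, 99]
print(as_code_points("å"))     # [229]: a code point, not UTF-8 bytes
```

Note the last line: 229 is the code point of å, not the two bytes UTF-8 would use, which is exactly the "no particular encoding" point being made.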
They're syntactic sugar for these things with quotes around them. You see, it's just an internal representation of this literal that's got quotes around it; that's what you call a string. They don't actually exist; they're not like integers. Yes? Well, did I include... oh,
well, I had Lisp on my list of languages. Well, Lisp is a functional language, but that slide was about the evolution of Haskell, and Lisp didn't play any role in the evolution of Haskell. But, yeah, okay. Yes? No, no, no: that thing I showed was an excerpt from David Turner's paper, and it's what he wrote in his paper, and he didn't include Lisp in that chain of history. So there we are. Right, more questions? No? Okay. Thank you very much.