Coming Soon: Machine-Checked Mathematical Proofs in Everyday Software and Hardware Development

Speech Transcript
So, our next speaker today will tell us more about the glorious future in which machine-checked proofs will be used in everyday software development, all the way down to hardware development. Our speaker is a faculty member at MIT; he builds computer systems and proves them correct using Coq, and it's not as far off as you might think. Thank you.
From the earliest days of programming, people started to realize that the actual act of writing the code was going to be a relatively small fraction of the time spent building useful things. Here's a quote from one of the pioneers of computer science, telling us how at some point he realized that a good part of his life was going to be spent finding errors in his own code. So we know it would make a huge difference for developers, and they'd get a lot more cool stuff built, if they could reduce the time they spend trying to get the code to be correct, trying to understand what the code does, all these activities besides just writing it. I want to tell you about some opportunities with technologies that are getting mature enough to potentially make a big impact in saving a lot of people time in building correct systems, and I think they're just ready enough that, if you want to be on the bleeding edge, you can start looking into them today.
So first, let me summarize my perspective on what programmers are doing today to gain confidence that their code is correct. There's debugging, the classic way to find a flaw in a program, where you do some detective work: make successive runs of the program, change one small thing between runs, and see what happens differently on each execution. I'll graph these different approaches on two wishy-washy axes. One is rigor, which refers to, when you use some method to understand your program, how closely tied that method is to the actual code that runs, so you know it's accurate. The other is completeness, which covers how many different execution scenarios you get some guarantees about. With debugging, we only get guarantees about the scenarios we actually run, so that's low completeness, but I'm calling it high rigor, because it is literally running your code. You can also, when you write new code in the first place, go to a whiteboard, or just in your head, think through carefully all the things that can happen: consider all sorts of what-if scenarios and convince yourself, don't worry, my program is ready for those. There can be all sorts of branching, and if we think very carefully and are very principled, we can get high levels of completeness, but low rigor, because this is just an abstract, whiteboard-level exercise.
Testing has a very similar profile to debugging: we create a bunch of scenarios, a list of test cases, that we can run over and over again in an automatic way. An alternative is to put lots of comments in your program that capture exactly your expectations, what should be true whenever execution reaches each comment, and then read the code and think through the interactions between all those expectations. We could call this the comments approach. It might have a high level of completeness, since the comments can be everywhere, saying a lot of things about a lot of code, but it's not so rigorous, because there's typically no tool that connects the comments to the actual running behavior of the code. Another approach is modeling, and there's
a whole spectrum of different levels of seriousness we can apply there. Another thing we can do is code review: a relatively cheap audit, where we may read a lot of code but don't think too hard about it, or a more in-depth audit, where we probably read a smaller amount of code but can be much more thorough in checking its properties. The techniques I want to tell you about are a way to sort of get the best of both worlds, to get high ratings on both of these axes, in contrast to the traditional methods. So, I'm a professor
at a university, and you're probably used to people from that sort of place trotting out all these little Greek letters and mathematical symbols and saying: this is formal methods, this is great, we're going to prove things about programs. I think this is an even more popular research area here in Germany than in the US, where I'm from. It's been around for a long time, but it's still not too widely used in industry or the open-source world. Maybe there's hope, but I think most people view it as an area
where people doing formal-methods research say: actually, the right thing to do is to prove the program correct; don't you want to do the right thing? And the developers sort of nod and smile politely and then get back to work, because they need to implement a few new features for the version of the system that ships next week. I want to argue that formal methods, if not already there today, are very close to the point where they will actually save you time: they will reduce the cost of developing systems, especially those where bugs are especially costly. So the pitch I'm
trying to make is that in the history of computing we've gone through layers upon layers of building new high-level abstractions that lower the cost of creating particular functionality. We build these levels on top of earlier ones, and they provide new possibilities for modularity and abstraction. Formal methods is actually an enabler for the next level of possibilities for dividing big programs into small pieces that we can understand effectively. So what I'm
claiming should be feasible by, let's say, ten years from now, with the right kind of community forming around it and doing the work, is completely deployed computer systems with top-to-bottom correctness proofs, meaning from the hardware up to the applications, where these proofs are checked algorithmically, so we don't have to worry about the low reliability of people reading long mathematical documents. We will get to the point where none of the code that you run needs to be trusted to guarantee the properties you care about, whether that's security or other kinds of correctness. And as if that weren't good enough, by the way, this will apply to hardware designs, not just software code.
And this kind of approach should be able to lower development costs in the same way that a good systematic testing framework can, and to a large extent it should be able to replace very time-intensive activities like debugging, testing, and code review. I'd better go into a little more detail on what kind of techniques I'm talking about here and why we should believe this stuff is ready. But before I get into the next level of detail, let me just summarize how this kind of approach can substitute for the traditional development activities. Instead of debugging, where we explore concrete executions of a system, we can do proving, where we explore symbolic arguments that cover all possible executions. Instead of testing, which applies to concrete scenarios, we can do specification writing, where we describe general requirements that cover all scenarios. And instead of reviewing our code in detail, we audit the specifications in detail. Specifications should be much shorter, and they typically don't include optimizations, which can be some of the hardest parts of code to understand. So here is
a summary of what I'm going to cover in the rest of this talk. I'm going to introduce the key technology, which is called proof assistants. I'll cover some common objections and the standard answers to those questions: why do we believe this can actually be applied to real systems, and applied cost-effectively? I'll explain a few different ways this technology can be used in the development of computer systems, then talk about two case studies, one in hardware and one in cryptographic software, and I'll finish up with some pointers to further reading for continuing the journey of learning the subject. So, one of the first
questions people have about formal methods is essentially based on the argument embodied in a paper from 1979 by De Millo, Lipton, and Perlis, called "Social Processes and Proofs of Theorems and Programs." They had a lot to say in that paper, but I would boil it down to this: they say real systems are so complex, and they change so frequently, that it is beyond the powers of human concentration to read careful proofs about those systems. And it's not just one proof; it's version after version of the system, and you'll need a new proof to cover the correctness of each new version. I would say today we've learned that these authors were both right and wrong about this point. They were probably right that people can't be trusted to read proofs about real systems, but they were implicitly assuming it would be people reading the proofs. Today we have a better way: let's get algorithms to read the proofs. So,
here's how this kind of proving can scale up to real systems. Here is a proof engineer trying to prove a theorem, writing a proof of it, and they don't need to write it all themselves: there's an ecosystem of libraries of existing proved facts, as well as proof techniques, that can be reused. And the proof itself is just source code: it's checked into version control, like Git or GitHub, and that allows a number of developers to collaborate on writing the different theorems of a larger development, get them all lined up in the same place, and have their versions tracked. We can plug version control into continuous integration, which effectively runs over and over again, checking: is this proof still convincing? Is it still convincing? Every time there's a change. And how do we know it's convincing? We have an automatic proof checker that reads the source code of our proofs and applies some straightforward rules to validate that they really prove the facts they claim to prove. By the way, one risk here is that some of these proof engineers prove the wrong theorems, so we do some code review, where each proof engineer might read the theorem statements written by the others and make sure we're really proving what we set out to prove. Now, probably some of you are thinking this picture looks just like the way software development works today in the real world, and that's very true. It's also true that we already have all these tools for doing this kind of development for systems with correctness proofs, and I'll try to demonstrate that in the rest of the talk. So this picture is more or less here today, though there are a lot more rough edges on the proof versions of some of these tools than on the traditional programming versions, and there's a lot of opportunity to get involved with improving them. So, the key system I'm going to use here
is called a proof assistant. It's a software package that is basically an IDE for stating and proving mathematical theorems, where you do manual work to write the proofs of the theorems as ASCII source code, but then they're checked automatically, just like we're used to with, say, type checkers in programming. There are a number of proof assistants that have been developed over the last few decades and are pretty usable today. One is called Isabelle/HOL; a large part of its development was done here in Germany, at TU Munich, and probably its best-known application is the seL4 project, which has done a correctness proof of an operating system kernel. Another system, called Coq, was developed at INRIA in France, and one of its best-known applications is CompCert, a C compiler with a proof of correctness bridging the gap from the C code down to the assembly code that is produced. I'm going to focus on the Coq system in this presentation, because that's what I use in most of my work. So how does this process work,
where we have a sort of collaboration between human and machine to come to a convincing proof? Well, the human user repeatedly makes suggestions to the automatic engine, basically saying: I think the following strategy ought to work for the next step of this proof. The engine spins for a while and then says: OK, we did that; here's what's left to prove; please tell me what to do next. The library of steps, or tactics, is not fixed: we can plug new libraries into the tactic engine and use higher and higher level abstractions of what constitutes a valid piece of an argument. In the end, after you finish this cycle and there is nothing left to prove, the tactic engine outputs what's called a proof term, which is essentially a program in a small language with just a few reasoning steps built into it. And in fact, everything I've shown you up to this point is untrusted, in the sense of trusted computing bases in security. The only thing we need to trust, the only place where faulty infrastructure could lead us to accept an incorrect proof, is a separate, relatively small proof checker that only looks at the proof term, ignores the way it was produced, and checks whether it actually establishes the theorem we were interested in. So what are those
primitive deduction steps that the small proof checker implements? Let me give you one classic example of the kind of rule that is more or less built into systems like Coq. This is the modus ponens rule: it says if you know P implies Q, and you know P, then you're allowed to deduce Q. So imagine a set of roughly ten rules like this, written out in ASCII syntax, forming the core proof terms of the system. This is actually a language inspired by functional programming; it's a lot like OCaml. In fact, Caml was invented and used to implement Coq in the eighties. So we have terms that essentially chain together proof steps, with little applications of deduction rules like modus ponens along the way. And believe it or not, these chains of steps are enough to reproduce all the results from math, computer science, and all sorts of other logic-based fields, and it's easy to write a relatively small, trustworthy checker for applications of these rules that will apply to all those different domains.
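As a small illustration (my own example, not a slide from the talk), the modus ponens pattern can be stated and proved directly in Coq:

```coq
(* If P implies Q, and P holds, then Q holds. *)
Theorem modus_ponens : forall P Q : Prop,
  (P -> Q) -> P -> Q.
Proof.
  intros P Q H HP.
  apply H.    (* reduces the goal Q to proving P *)
  exact HP.   (* P is exactly our hypothesis *)
Qed.
```

The proof term that the `Qed` command checks here is just a small functional program: a function that takes evidence for `P -> Q` and evidence for `P` and applies one to the other.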
So let me make this a bit more concrete and show you an example of a proof in Coq. I'm going to be working with this code in Emacs, in an extension for Coq authoring. Coq is based on a functional programming language; it's a lot like Haskell or OCaml, for those of you who know those. We can do things like write this expression, which is a list of natural numbers. I run the Check command, pressing some keys, and I get this highlighting in a region, which indicates how far the proof assistant's processing has moved up to, and it tells me this is a list of natural numbers, not a list of Booleans. I can also use an alternative notation for lists, with an infix operator, colon colon, for adding a new element at the front of a list; so this is the same list, built element by element, with the empty list at the end. That's another way of writing it. And I've also written a recursive function here, called length, that, for any type A, takes in a list of A and returns a natural number. We do pattern matching and say: the empty list maps to the number 0, and a nonempty list maps to 1 plus the length of the tail of the list, where I name the tail ls'. I don't have time to go over all the classic ideas from functional programming like this one, but hopefully it will make just enough sense for people who aren't used to seeing them; it's very similar to the notation in those languages.
So let's prove a theorem: the length of this list is 3. Good. I've also defined another classic function that takes two lists and essentially concatenates them, making a new list that is first the elements of the first list, then the elements of the second list; you'd just write this as + for lists in Python, for instance. Here I'm calling append on the lists [1; 2] and [3; 4; 5], and I get [1; 2; 3; 4; 5]. Down here you can already see some of these elements of Coq as an IDE: we're getting constant feedback as we go, much like the ominous red squigglies in Eclipse when you make a mistake in your code. We'll have those coming up shortly, when we try to do some proofs and get this feedback on the computational behavior we're defining.
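Reconstructed from the narration, the demo's definitions probably looked something like the following. The exact numbers and names are my guesses from the transcript; I use primed names to avoid clashing with Coq's standard library:

```coq
Require Import List.
Import ListNotations.

(* Ask Coq for the type of an expression: *)
Check [5; 1; 3; 8; 9].             (* : list nat *)

(* The same list, built with the infix cons operator: *)
Check 5 :: 1 :: 3 :: 8 :: 9 :: [].

(* A recursive length function, by pattern matching: *)
Fixpoint length' {A} (ls : list A) : nat :=
  match ls with
  | [] => 0
  | _ :: ls' => 1 + length' ls'
  end.

(* Concatenation of two lists: *)
Fixpoint app' {A} (ls1 ls2 : list A) : list A :=
  match ls1 with
  | [] => ls2
  | x :: ls1' => x :: app' ls1' ls2
  end.

Compute app' [1; 2] [3; 4; 5].     (* = [1; 2; 3; 4; 5] *)
```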
So here's a theorem: for an arbitrary type and two lists of that type, if I append them and then take the length, that's the same as taking the sum of the individual lengths. There's this sort of red background on "Admitted," telling us this is a fact we haven't proved yet. So let's prove it, so we don't have to look at that red anymore. As my first step, I'm going to prove this by induction on the variable l1, and what this does is create two new subgoals from our current goal, which is just repeated from up here. When I run this, and give us some more space, there are now two subgoals: a base case and an inductive case. In the base case, l1 was replaced with the empty list; in the inductive case, l1 was replaced with an arbitrary nonempty list, with some head element and some tail. So let's start proving the base case. What I'll do, since I notice the goal here starts with a "forall," is run a step called intros: that's going to take out the "forall" and instead give us a local variable called l2. Then I notice that this looks pretty obvious just from the definition of append, so let's ask Coq to do algebraic simplification with that definition, essentially running parts of the program at compile time, if you will. And now it looks pretty clear: the same thing is on both sides of the equality, so I can say "reflexivity," and that case is proved. I'll start the inductive case the same way, getting rid of that "forall" down below, and simplifying. Now I notice this term here, the length of an append of blah and blah, which is actually also present up here. What appears below the double line is the current goal; above it are our hypotheses, the known facts we're allowed to appeal to. Here we have an induction hypothesis, which is the theorem statement essentially applied specifically to the tail of the current list. So I can take that known equality and use it to rewrite: it will match this with this, and therefore replace it with this. We run that, and now reflexivity applies. Good. And now the most exciting moment of any proving session: I write QED and process that, and if the system doesn't complain, that's the equivalent of taking your code into Eclipse and not getting any red squigglies. This is a correct proof; Coq is convinced the theorem is true. Thank you.
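The proof narrated above, reconstructed as a Coq script (the lemma name and variable names are my own; the tactic steps follow the narration):

```coq
Require Import List.

Theorem length_app : forall A (l1 l2 : list A),
  length (l1 ++ l2) = length l1 + length l2.
Proof.
  induction l1.
  - (* base case: l1 is the empty list *)
    intros. simpl. reflexivity.
  - (* inductive case: l1 has a head and a tail *)
    intros. simpl.
    rewrite IHl1.   (* the induction hypothesis, used as a rewrite *)
    reflexivity.
Qed.
```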
One more example, to show how we reuse lemmas in proofs. This is a reverse function for lists that takes a list, steps down it, and builds a new list, using the append function to repeatedly add the elements that were on the front onto the end. So reversing [1; 2; 3] gives [3; 2; 1]. Then we can prove a theorem about it: I'll do this by induction on the list I'm reversing, and show that the length doesn't change when we reverse a list. The base case is pretty simple: use algebraic simplification, and that's obviously true. Let's see what happens in the inductive case. Here I see I have a length of an append, which doesn't directly match my induction hypothesis, but luckily I proved something earlier, called length_app: remember, that's the lemma which lets me replace the length of an append with the sum of the lengths. And now, to the right of the plus, I have the length of a singleton list, which I can simplify. Now I'm very close: I can rewrite with my induction hypothesis. By the way, this capital S is the plus-one function; it stands for successor. So this is pretty obviously true, though capital S is not literally defined as "plus 1," which makes it a little different. But I can call a procedure called omega, which is a solver for what's called linear arithmetic, and which knows how to prove all sorts of numerical facts like that. So that was an example, primarily, of using a fact we'd already proved within a larger proof, and this is what allows us to build up to proofs that cover pretty significant systems and their correctness properties.
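A reconstruction of the reverse example (again a sketch; recent Coq versions replace the `omega` tactic mentioned in the talk with `lia`):

```coq
Require Import List Lia.
Import ListNotations.

Fixpoint rev' {A} (ls : list A) : list A :=
  match ls with
  | [] => []
  | x :: ls' => rev' ls' ++ [x]   (* move the head to the end *)
  end.

Compute rev' [1; 2; 3].           (* = [3; 2; 1] *)

Theorem length_rev' : forall A (ls : list A),
  length (rev' ls) = length ls.
Proof.
  induction ls.
  - simpl. reflexivity.
  - simpl.
    rewrite app_length.  (* the standard library's version of length_app *)
    simpl. rewrite IHls.
    lia.                 (* linear-arithmetic solver; 'omega' in older Coq *)
Qed.
```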
One last example. I won't go through it in much detail; I just want to show that even for a small program, like one I can write in one page of code, it's not straightforward to establish correctness naively. I define a language of abstract syntax trees for expressions, with plus and times as well as constants and variables. I've defined a recursive function to evaluate those expressions to numbers, given values for all the variables, and I've defined a medium-complexity optimizer that finds parts of the arithmetic that we can actually simplify at compile time; this is a classic optimization called constant folding. With all that code there, as one page, I started trying to prove the correctness theorem, namely: when you constant-fold and then interpret, you get the same answer as if you interpret the original program in the same variable environment. And I started the proof by induction. The first case was easy, the second case was easy, but then I started needing to do case analysis on which constructor was used to build particular terms in the abstract syntax tree type, and I needed a sub-case analysis inside that one, and a sub-case analysis inside that one, doing the same thing over and over again. A lot of stuff is going on here, but it's very repetitive, and at some point I just gave up and wrote "Admitted": that's not due to any fundamental problem; I just declined to finish it manually. So keep that in mind: for now, this is some reason to be skeptical that this approach will scale to full-scale, realistic, deployed systems, if we have to work this hard on a program that fits in one page. But I will come back to that.
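The one-page development described above might look roughly like this; the constructor and function names are my guesses, and the proof is left unfinished with `Admitted`, as in the talk:

```coq
Require Import String.

(* Abstract syntax trees for arithmetic expressions. *)
Inductive exp : Set :=
| Const (n : nat)
| Var (x : string)
| Plus (e1 e2 : exp)
| Times (e1 e2 : exp).

(* Evaluate an expression, given values for the variables. *)
Fixpoint eval (env : string -> nat) (e : exp) : nat :=
  match e with
  | Const n => n
  | Var x => env x
  | Plus e1 e2 => eval env e1 + eval env e2
  | Times e1 e2 => eval env e1 * eval env e2
  end.

(* Constant folding: compute subterms that don't mention variables. *)
Fixpoint cfold (e : exp) : exp :=
  match e with
  | Plus e1 e2 =>
    match cfold e1, cfold e2 with
    | Const n1, Const n2 => Const (n1 + n2)
    | e1', e2' => Plus e1' e2'
    end
  | Times e1 e2 =>
    match cfold e1, cfold e2 with
    | Const n1, Const n2 => Const (n1 * n2)
    | e1', e2' => Times e1' e2'
    end
  | _ => e
  end.

(* The correctness statement that got tedious to prove by hand: *)
Theorem cfold_correct : forall (env : string -> nat) (e : exp),
  eval env (cfold e) = eval env e.
Proof.
  induction e.
  (* ...case analysis on which constructor built each folded subterm,
     sub-cases inside sub-cases, very repetitive... *)
Admitted.
```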
All right. So, having made things concrete with a demo of a proof assistant, you might have a few questions, and I'll try to answer the standard ones now. One is: OK, we know it's pretty hard for programmers to get the code right, despite all these techniques that we have; there are still frequent, high-cost bugs in the wild. Is it really that much better to make programmers get the specs right? For many systems, we have trouble imagining how an unambiguous specification would be much shorter or simpler than the original code.
One answer is: I would claim verification of this kind is most worthwhile when we apply it to the most commonly used system infrastructure that sits underneath everything else that we write. So processors, operating systems, compilers, databases, and so on: first, they're the foundation for everything else we do, and second, they do turn out to have much shorter specifications than implementations. For instance, for a compiler, the specification doesn't say anything about the particular optimizations the compiler uses, even though those are the most complex parts of it and the easiest places to introduce bugs. In my experience, when we think about what makes a spec a lot shorter than the original program, it's this: if we take a spec, which covers the functional behavior of a system, and then add the optimizations that make the system fast enough, use little enough memory, and so on, then we recover the original implementation. So to the extent that there are a lot of optimizations in a system, there's a real opportunity for the spec to be much clearer than the original system. But still, this is
essentially programming, just in a different language, and it's still hard to get right. So let's take an example and think through what might be the biggest challenges. Say we have a compiler whose correctness theorem essentially says the compiler bridges the gap between the semantics of the source language and the target language; for this purpose, let's say the target language is machine language. And the word "semantics" here really means a reference interpreter or something like that, though usually it's written in mathematical or logical notation. Now, we wrote the compiler because we wanted to use it to compile some applications. Imagine each application has its own specification, and then the application theorem kind of snaps together with the compiler theorem, via the semantics of the compiler's source language. Down below, I have my processor, which implements the machine-language semantics, on top of the semantics of the hardware description language the processor is written in. So now think of all this together as one composite verified unit: the theorems we've proved about the different parts all compose with each other, to create the abstraction of a thing that sits on a particular circuit board and behaves in this application-specific way. The really neat thing about this style of development is that, first, none of the code we wrote is trusted anymore: it's all internal details inside this big block, and it's all covered by the proofs that we've done. Second, even the internal specifications are now no longer trusted; they become encapsulated details of the system and its proof. So a mistake in, say, the semantics of the C programming language, which you can imagine is pretty easy to make if you try to write out the meaning of C and not miss any cases, does not allow you to accidentally accept an incorrect system in this style, as long as you keep that specification internal to the overall functionality that you're building. So let's
contrast this with what we have to do to convince ourselves in the current standard software development approach. We have to run not just unit tests of our individual libraries, but also full-system integration tests, because as we compose libraries and modules together, the state space of the system grows exponentially, and therefore it's much easier to have complete-feeling coverage for individual functions than for full systems. In contrast, with the proof assistant approach, the system integration theorems that we prove actually imply that all the components inside the box are doing what they need to do to make the full system work, and we really, fundamentally don't need the equivalent of unit tests in the central role they have today. In the old way, we have to do careful code review of all of our components, because a bug in a corner case in any one of them could completely destroy our guarantees about the whole system. In contrast, with the proof-based style, we only need to do careful review of the externally facing specifications. In fact, you could even accept, in theory, a component from your worst enemy, as long as it was proved to meet the appropriate specification; you can run it without even inspecting the code. And this is potentially a way to support, say, downloading applications from untrusted parties and doing proof checking to guarantee that they meet particular security policies. So, another concern:
I showed that as I tried to write out the proof for a medium-sized program, modeled after a compiler optimization, it got out of hand quickly, and I ran out of patience. Well, we're familiar with this kind of scenario. Imagine it's 1950 or so, and someone says: writing machine-code programs sure is a lot of work; this is never going to be very popular. Today, we know we have compilers that go from successively higher levels of abstraction and automatically write that code for us. And, by the way, we also have libraries at each of these language levels that we can typically reuse for most of the work; say you're putting together a web application with a popular framework, we don't expect to rewrite everything each time. In fact, the same story applies in the world of machine-checked theorem proving, where we can build up layers of automatic proof generators, and we can build up libraries of lemmas that we appeal to for commonly useful facts. And this is the secret to building up an ecosystem of tools that makes it relatively low-cost to create a new development, with a proof, that is roughly similar to one you've done previously. So let
me now go back and make those proofs that I wrote before a little more appetizing. We started with this one; I wrote this proof out here with some very manual steps, but it turns out I can just do this. I run the induction, and we need a semicolon here, which says: take this, run this step, and then, on the bunch of new subgoals that exist afterward, run the next thing on all of them. The next one has a kind of funny name, "intuition"; it actually refers to intuitionistic logic, although people typically experience it in a different way. So, first the induction with simplification, and then that just proves everything. Let's see if we can get lucky with the same approach on the next proof, the length of reverse. And it almost, almost worked. What's missing, you'll recall, is the lemma I mentioned down here: we already proved it, but Coq hadn't been told that it should consider that lemma, so it didn't try it. So I add a step: take that lemma we already proved, it's called length_app, and say, remember that lemma I proved; it will be useful; try to use it as a rewriting rule. Then I can say: rewrite with those hints, and now it's applied here, and we're almost done. I just need to say that these three steps should be done over and over again, until the theorem is proved. So this is an example of a loop. This is actually a Turing-complete scripting language, very different from the primitive proof-term language with just ten or so steps in it; here we can build up whatever abstractions we need, with loops and recursion and data structures. So what we're really doing is writing scripts that find the proofs for us. A particular script is designed to be readable by humans, but just because you can read the script doesn't mean you understand every precise detail of the low-level proof it's generating; it would generally be completely overwhelming to try to understand those details. And a nice thing about writing scripts like this is that often, if you change the system you're verifying but wrote the script well, the script will keep working. That's a partial answer to the concern from that 1979 article about proofs keeping up with changes to systems. It's an engineering problem, just like good library design: we want to design proofs that are adaptable to many similar situations, and we don't have to give up any formal assurance if we do this properly. OK.
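Reconstructed, the automated versions of the two list lemmas might look like this. The demo used the `intuition` and `omega` finishers; I substitute close equivalents I am confident go through, and use `Hint Rewrite`/`autorewrite` for the step that registers the earlier lemma as a rewriting rule:

```coq
Require Import List Lia.

(* The same theorem as before, now with a one-line script. *)
Theorem length_app : forall A (l1 l2 : list A),
  length (l1 ++ l2) = length l1 + length l2.
Proof.
  induction l1; simpl; intros; try rewrite IHl1; reflexivity.
Qed.

(* Tell the automation to consider this lemma as a rewrite rule. *)
Hint Rewrite length_app : lists.

Theorem length_rev : forall A (ls : list A),
  length (rev ls) = length ls.
Proof.
  (* simplify, apply registered rewrite rules, finish with arithmetic,
     repeated across all subgoals by the semicolons *)
  induction ls; simpl; autorewrite with lists; simpl; lia.
Qed.
```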
there's this example which I didn't have time to explain in
detail as well as show how short the proof can be approved someone before won't even need those what we do here induction E much of
the simple tuition a a few
cases are left when I see you I like this
is the district program contains a program-level pattern-match over term tests we should that that's essentially a case split in the program we should mirror with a K. split in the proof so let's try that match Goals construction for examining what we're trying to prove and
doing something differently based on what we see and I well do so look for any pattern match this is a syntax for matching any pattern-match inside the goal and I will do case analysis on that and then simplify things afterward
In all say it do this over and over again on all the subgoals that we generated the now a
very close to done with just a bunch of arithmetic facts and here we have the vowel on a particular term which give the definition of the valve take their word for it can be simplified suffice to say simple not just below
the line but everywhere than those will go to
some performs as well now notice that this form with a bunch of things are equal to evaluation results I would use those equalities to rewrite down below here the still need you that is well repeatedly look for the cases of hypothesis that says something is equal to evaluate something rewrite that and then get so now
all those are going to get support and things like N. equals 0 % and it turns out that I have how these it
says, I have 53 of these facts — a whole bunch of not-too-interesting things with zeros and pluses and so on. It turns out all of these follow from the axioms of the algebraic structure called a semiring, which the natural numbers I'm using here belong to, so I can just say "ring", and — boom — that completes the proof.
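To give a feel for what a tactic like "ring" is doing under the hood, here is a rough Python analogue of my own (not the actual Coq implementation): normalize both sides of a candidate equation to a canonical polynomial form and compare the normal forms.

```python
# Toy sketch (mine, not Coq's ring tactic): polynomials in one variable n
# are represented canonically as dicts mapping degree -> coefficient.

def padd(p, q):
    # add two polynomials, dropping zero coefficients
    r = dict(p)
    for d, c in q.items():
        r[d] = r.get(d, 0) + c
    return {d: c for d, c in r.items() if c}

def pmul(p, q):
    # multiply two polynomials by convolving coefficients
    r = {}
    for d1, c1 in p.items():
        for d2, c2 in q.items():
            r[d1 + d2] = r.get(d1 + d2, 0) + c1 * c2
    return r

N = {1: 1}              # the variable n
ZERO, ONE = {}, {0: 1}

# (n + 0) * (n + 1) == n*n + n : both sides normalize to the same dict,
# so the equation follows from the (semi)ring axioms alone.
lhs = pmul(padd(N, ZERO), padd(N, ONE))
rhs = padd(pmul(N, N), N)
assert lhs == rhs
```

The real tactic works over any declared (semi)ring structure and produces a genuine proof term, but the normalize-and-compare idea is the same.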
A cool thing about this kind of proof: imagine I added a completely new kind of node to my syntax trees — say I got really ambitious and added subtraction. This same proof would pretty much work without any changes, and even more ambitious changes to the language or the optimization could lead to similar outcomes. This, essentially, is how proof engineering scales to large systems and to rapidly changing systems. So, back to the slides.
So this is the underlying technology for a bunch of different cool things we can do with proof assistants. The traditional mode is math proofs: the user has some theorem they want to prove, passes a proof off to the proof checker, and we make sure it's convincing. We do the same thing for programmers: we're given an implementation, we prove a theorem about it, and we check that the proof is valid.
That's called verification. There's also kind of the other direction, called synthesis, where we don't even have the artifact to begin with; we just have a wish list, a specification that says "I sure wish I had a program that did that." Then we pass that specification into some automated system that builds the implementation for us, and actually builds the proof about the implementation at the same time, and we can check that proof without having to trust the engine that did the work. Both of these can often be useful, and they build on the same technology. I should also say, the examples so far
only write programs in the functional language that's built into Coq, but it's such a rich logic that we can encode pretty much any other language you can think of: we can write down its syntax and semantics. Here are some examples of languages that developments in Coq have worked with, by formalizing exactly what these languages are, going all the way from high-level database query languages down to hardware description languages. We can define these, compose them, do all sorts of logical things with them. We can also use a wide variety of
tools to construct our programs and our proofs: compilers between languages, automated provers that live within the Coq ecosystem and generate Coq proofs, synthesis tools. The key thing is that every tool we introduce in this ecosystem needs to connect to a common proof format — just as, in a regular software ecosystem, everything needs to somehow connect to a common binary object format. The most straightforward way of doing that is to take each of these pieces — say, some arrow doing a translation — and prove that the translator itself is correct: it always maps inputs to appropriate outputs. Another approach is to create a proof-generating piece that, for instance, takes an input and produces not just the output but also a proof that that particular output is correct. In different circumstances one of these two styles is more appropriate, but the same proof checker can be used with both of them.
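As a toy illustration of the second, proof-generating style (my own sketch, not code from the talk), here is a tiny expression-to-stack-machine translator whose output is validated per run by a small independent checker; the first style would instead prove the translator correct once and for all, with no per-run check.

```python
# Hypothetical sketch (mine, not from the talk) of a "proof-generating"
# translator: alongside each output, a small trusted checker confirms that
# this particular output means the same as this particular input.

def eval_expr(e):
    # reference semantics: e is ("lit", n) or ("add", e1, e2)
    return e[1] if e[0] == "lit" else eval_expr(e[1]) + eval_expr(e[2])

def compile_expr(e):
    # the untrusted translator, to postfix stack code
    if e[0] == "lit":
        return [("push", e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [("add",)]

def run_stack(code):
    # semantics of the target stack machine
    st = []
    for op in code:
        if op[0] == "push":
            st.append(op[1])
        else:
            b, a = st.pop(), st.pop()
            st.append(a + b)
    return st[-1]

def check(e, code):
    # the per-run "certificate check": source and target agree
    return run_stack(code) == eval_expr(e)

e = ("add", ("lit", 2), ("add", ("lit", 3), ("lit", 4)))
code = compile_expr(e)
assert check(e, code) and run_stack(code) == 9
```

In the real ecosystem the certificate is a Coq proof term checked by the Coq kernel, not a re-execution, but the trust structure is the same: only the checker needs to be trusted, not the translator.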
So, to get more concrete, let me tell you about two projects I've been working on recently. The first one is called Kami; it's all about proof support for digital hardware. It's joint work with a number of other people at MIT, listed here. Basically the story is this: of
course, imagine that we want to support rapid open-source development of new digital hardware systems by mashing up components from libraries. For instance, you might pull a processor out of one library and a cache-coherent memory system out of another, then write your own new accelerator for a particular purpose. And by the way, all these boxes come with proofs, so when you assemble your system, you don't just run it: you also combine the proofs of the pieces into a proof that the full system is correct, and you can potentially know correctness without needing to do any debugging — provided you've written your top-level theorem correctly and checked that it follows from the constituent proofs. What do I mean by correctness?
Well, imagine we have a highly optimized implementation of some hardware component, and we've also written a specification, which is essentially the simplest possible way to express functionally what that box is supposed to do — ignoring optimization, performance, and so on. We might call this a reference implementation, not just a spec. We want to prove what in the semantics literature we would call a contextual refinement property. What that really means is: let's invite into the picture an omniscient judge, who is going to very carefully inspect both boxes, feed them inputs, look at the outputs, and try to find any difference in the way they behave. If this judge can ever get the implementation to exhibit some externally visible behavior that the spec couldn't possibly generate, then our system is incorrect; but if every possible interaction he could have with the left side can also be had with the right side, then we say it's a correct implementation.
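This judge-based notion of refinement can be illustrated with a toy Python example of my own (nothing to do with the actual Kami proofs): the spec names the set of allowed observable outputs, and the judge enumerates interactions looking for an implementation behavior outside that set.

```python
from itertools import product

# Toy refinement check (mine, not Kami): the optimized implementation must
# never produce an output the reference spec disallows, on any input the
# "judge" can supply.

def spec_mul(a, b):
    # reference implementation: the simplest possible statement of the job
    return {a * b}

def impl_mul(a, b):
    # "optimized" shift-and-add multiplier, the way hardware might do it
    acc = 0
    while b:
        if b & 1:
            acc += a
        a <<= 1
        b >>= 1
    return acc

# the omniscient judge: try every interaction in a bounded space and look
# for any observable difference
assert all(impl_mul(a, b) in spec_mul(a, b)
           for a, b in product(range(16), repeat=2))
```

The real proofs, of course, cover all inputs at once by deduction rather than enumeration — that's exactly what the proof assistant buys you over testing.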
In our system we write these hardware components in a Bluespec-inspired high-level hardware language, which makes it particularly straightforward to define exactly what the legal ways to interact with a system are. Importantly, many internal implementation details may not be observed directly — just as you can't peek into the private fields of a Java class — and that's important for facilitating optimization without exposing the details in the spec. So, a few words
about the Bluespec-style language. It's a kind of object-oriented style, where program modules are objects that have private state and public methods; the methods are the only things allowed to access the private state, and methods can call the methods of other objects. Every method call appears to run atomically, without any interleaving concurrency, and we can summarize any atomic method call with a trace of the sub-method-calls that happened — some coming into this module, some going out — together with their arguments and return values. Then the property that connects the implementation to the spec basically says: any trace that my implementation could ever generate could also be generated by the specification. This is a nice modular way of setting things up, because it
allows you to link modules or objects together; as you do that, certain methods become private and stop appearing in the traces, and you can hide the details of how your spec is implemented by a particular hierarchy of optimized components. So let me show you a
quick example of some code in the Kami framework; this comes from the tutorial. Here is some code for a producer–consumer example, with two hardware nodes and a queue in between them. This is the producer side: a module with a particular register in it, and a rule that might run on a particular clock cycle, read a register, call a method of another module, et cetera. One thing I want to point out here: Coq has an extensible parser, so in this framework we define the syntax of this hardware description language ourselves — there's no built-in keyword for module, register, rule, or any of that. We define the syntax and semantics of this hardware language in the logic, and that lets us use it to define systems and do proofs. The consumer
definition is a lot shorter. Then we define a spec that actually combines the producer and the consumer into one module and avoids the intermediate queue, and then, in less than 100 lines in this file,
we get down to proving a theorem that the implementation is a correct refinement of the specification. Take my word that all this can be scaled up to processors and cache-coherent memory systems — we've done all those examples. We prove them individually, we compose them and compose the proofs at the same time, and it's quite general.
I also want to mention one application area we've been working on with this technology. Some of you might have heard of the RISC-V instruction set — I think there was another verification talk about it on the first day of the Congress. As the slide back here says, "instruction sets want to be free": RISC-V is basically the Linux of hardware instruction sets, if we think of Linux in 1995 or so. It's an open instruction set controlled by a nonprofit foundation, in terms of the definition of what the instructions are and what they do, and there are a lot of opportunities for getting involved, including in formal methods. So we've been working within the RISC-V Foundation to formally define the meaning of the instructions; we've been verifying components that you can use to assemble your own RISC-V systems with correctness proofs; and we've been working toward support for verifying the correctness of RISC-V machine code, which involves infrastructure for composing those proofs together. There are also some nice
consequences of this open model — it fits really well with formal methods. In the traditional world, the processor manufacturers control the intellectual property that defines the instruction set and the processors, and they actually don't want most developers to know how things really work, because then those developers could potentially build competing processors and hurt the bottom lines of the companies that own the IP. But in the RISC-V open model, we would have a formal semantics at the center of everything, saying exactly what the instruction set is. It allows everyone to build their own version of the processor; all of those can be made open source, with theorems associated with them, connected to this formal spec; others can come along and mash up existing components; and it can dramatically lower the cost of getting into hardware development in this kind of space. So we're trying to build the formal semantics to enable that story. The second case study I'll talk about, to finish
up here, is what we've done on cryptographic code — namely elliptic-curve cryptography, which is used in TLS and many other settings. This is also joint work with a few folks at MIT. So, the story here:
cryptography, as we know, is really important, and most applications using cryptography take libraries off the shelf which were written by a small handful of elite cryptographic implementers. Think about how hard they have to work to put everything together here. There's a medium-sized set of cryptographic algorithms — all the primitives we need — and it turns out many of these are parameterized by large prime numbers that serve as the moduli for arithmetic; and there's a medium-sized set of different hardware architectures we'd like to target. It turns out that in practice, for every choice of one circle from each of these bubbles, an expert spends at least a few days rewriting everything from scratch, possibly in assembly language. This is not a very scalable process, and
of course, when doing that sort of thing, you can believe that bugs tend to be introduced that have serious security consequences. So what if we could have a compiler that automatically did that work when you choose something from each of these bubbles, producing not just the fast low-level code but also a machine-checkable proof that it truly implements the whiteboard-level math? That's what
we're after here, so we're doing crypto verification. One kind of property people often prove about cryptography is high-level security properties — things like "if you don't know the secret key, it will take you more than polynomial time to compute something"; that's protocol verification, where we show that a particular mathematical algorithm meets such a property. Then, through an implementation-synthesis process, we can generate low-level efficient code that follows the mathematical algorithm. Our project is only in that second category, but there's a lot of interesting work in the first category too. And our
system has been adopted by Chrome, through the BoringSSL library, to replace the manually written elliptic-curve finite-field arithmetic they had previously. You can find our implementation of Curve25519 in Chrome 64, which is in beta now, and the next version of Chrome after that will have our implementation of P-256. Together, these are the curves used in the vast majority of TLS connections. So, a really quick
demo of running this generation process. What we've done is build a library of functional programs that capture the generality of arithmetic in this domain. It turns out these numbers are way too big to fit in hardware registers, so we have to be clever about splitting one number among many registers — essentially the implementation tricks people rediscover by hand, with some catches for performance on popular processors — and we do that using standard functional-programming stuff like folds and maps and so on. For instance, here's a definition of
multiplication using flat-map and map. I won't go into details on the exact representation — this is a somewhat simplified version for demo purposes — but we can ask Coq to process all this,
and I'll scroll down to the bottom of this file, where I'm going to state, essentially, the following goal. Assume we have two 10-tuples of integers — it turns out that on 32-bit processors, for Curve25519, you want to represent each number with ten word-sized digits, which is why we have 10-tuples — and we say: please find an element of the set of tuples that are correct answers to the problem of multiplying the two inputs mod 2^255 − 19.
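To give a sense of what the ten-digit representation means, here is a simplified sketch of my own (not the generated library code): 32-bit Curve25519 implementations conventionally split a 255-bit field element into ten limbs of alternating 26 and 25 bits (radix 2^25.5), so limb products leave slack bits for carries.

```python
# Simplified sketch (mine, not the generated code): a Curve25519 field
# element split into ten limbs of 26 and 25 bits, alternating.
P = 2**255 - 19
SIZES = [26, 25] * 5          # ten limbs, 255 bits in total

def split(x):
    # peel off the low bits of each limb in turn
    limbs = []
    for s in SIZES:
        limbs.append(x & ((1 << s) - 1))
        x >>= s
    return limbs

def combine(limbs):
    # reassemble the integer from its limbs
    x, shift = 0, 0
    for limb, s in zip(limbs, SIZES):
        x += limb << shift
        shift += s
    return x

a = (2**200 + 12345) % P
assert len(split(a)) == 10 and combine(split(a)) == a
```

The generated code then computes products limb-by-limb, folding the overflow past bit 255 back in via the factor 19 — which is where constants like 19 and 38 in the output come from.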
I'll run a few steps here that I won't explain in detail, to start the symbolic manipulation, and then this step here is going to run for about ten seconds, I think. We're asking Coq to do a correct-by-construction derivation of concrete code
for computing this multiplication result. We get a bunch of output like this — I think this is the last digit of the answer — and some of the arithmetic for computing it has a lot of redundant parts, so let's use the ring algebraic laws to simplify it a bit more. This one takes a few tens of seconds to run, but what's happening is that the system is automatically computing the code that has traditionally been written by hand for this cryptographic primitive, and it's doing it in a way that doesn't just build the code: it also builds the proof relating it back to the original whiteboard-level arithmetic definition. This will
finish any moment now, and then we'll no longer have all these constants — they'll be combined into a nicer form. And the output we get here is not the end of the story: there are also later phases that lower it into C-like code and do bounds analysis to make sure we've assigned enough bits to each register. All of that happens automatically, given the parameters.
Here we have an example of the output, which might look familiar to people who have implemented Curve25519. It's still in a relatively high-level form, without breaking things down to individual temporary variables, but we didn't have to manually figure out, for instance, that there should be a 38 here and a 19 there and all those things — that was done for us automatically. And you can put in different parameters and generate the standard code for any of the other curves and any of the hardware architectures that folks like to use.
So, to finish this up: our implementation automatically builds the code from a way of writing down the prime modulus in a suggestive form, like you see here — typically these primes are written as sums and differences of powers of 2, or small multiples of them — and we wrote a Python script that uses that form to generate a few lines of configuration for our system.
And then we scraped the elliptic-curves mailing list at moderncrypto.org to find every prime number that appears in the archives, and we used them all as inputs for automatically generating code. There were only a couple of weird primes posted there that are actually very inefficient to implement — probably a newbie to the list made a suggestion that was quickly rebuffed — but we tried to generate code for those too; we sometimes ran out of memory on them.
We automatically built code for all these cases and compared against GMP, a very standard off-the-shelf library for big-integer arithmetic. Here are results generated for 64-bit x86, and here for 32-bit ARM running on an Android phone. In general, the generic multiprecision version takes 2 to 10 times longer to run one of the elliptic-curve operations than our code, which was created automatically and also comes with strong guarantees of correctness.
So let me leave you with a few pointers to places to follow up if you'd like to learn more about this subject. The number-one place to start is an online book called Software Foundations, written by Benjamin Pierce and others; it introduces Coq assuming just standard undergraduate computer-science material, with no formal-methods or even functional-programming experience. I should also advertise
our multi-university initiative in the US called The Science of Deep Specification — DeepSpec for short. In particular, in summer 2018 we'll be running a summer school that introduces all this material, with hands-on practice in hardware verification, C program verification, and two other topics in Coq. It will be held in Princeton, New Jersey, and you can subscribe to a mailing list on the web site if you'd like to get updates as we go. More details are here: the GitHub projects for the projects of my own that I mentioned, and two others that we made code contributions to. Thank you.

Host: We have about two minutes for Q&A at the microphones.

Question: Great talk, I enjoyed it very much. What strategies do you have for finding the right theorems? I think this is the hardest part. For example, when you have a reverse function and a sorting function, you can state the invariant that the length always stays the same when you execute them, but maybe that's not enough of a theorem. So what strategies do you use? Maybe a combination with concrete examples that test the functions would also be good here, if theorems alone are not enough.

Answer: Yes — this relates to a number of the frequently-asked questions I addressed in the middle of the talk. Generally we build infrastructure to support applications, and applications tend to have more straightforward specifications, in many cases, than the systems infrastructure below them. We try to set things up so that, if we got an infrastructure theorem wrong, we would not be able to use it to prove the application results. So in general, one technique is: when you build a verified component, use it in as many different contexts as possible, and don't peek back inside the implementation — only look at the original theorem you chose,
and by forcing yourself to prove all the larger systems that use your component, you will shake out the bugs in the specification of the component you're reusing. You mentioned it might be useful to have a set of test cases as part of the specification of a component; that could be a good strategy too, but it's largely covered by reusing your component in many different settings, where you naturally call a function with different arguments and need that function to be correct for the larger system to be correct.

Question: Thank you very much for the excellent talk. The specifications of many problems also involve time: you not only have to produce the correct answer, you have to produce it before the TCP session times out, or before the machine gets rebooted, or before the deadline is over. How do you write down timing specifications in Coq — something like "this will finish after at most this many milliseconds"?

Answer: I'll probably hold off on trying to prove that you don't get bored, for now — but maybe someday. As for the timing property: we're working in a very general mathematical framework that can essentially encode anything you'd find in a math textbook, and it's pretty easy to define what time is from the program-execution perspective. There have been projects that, for instance, prove that a C compiler preserves running time between the source code and the assembly code, with an admittedly fairly naive timing model at the bottom. I don't have any projects of my own in that vein so far, but a number of people have looked at time. So my main answer is: it's a very flexible framework — you can encode whatever you want and prove whatever you want; it's just a question of how much labor is required, and we'll learn more over time about the abstractions that reduce that labor.

Host: We have a question from the Signal Angel.

Question: Maybe somewhat related to the last one — can this help against
side-channel attacks, for example by proving that a certain function runs in constant time? The main point of handwritten implementations is to avoid side-channel attacks — do the automatically created implementations you presented address this issue?

Answer: The automatically created implementations I presented are actually in a very restrictive language that enforces constant time by construction — it's all straight-line code; think of it as the performance-critical inner loops of the cryptographic library. So yes, that one does provide constant time by construction. Other formal-methods projects have done automatic program analysis to guarantee constant-time properties. I think this is an important kind of analysis to do, and it's among the easier ones for formal methods, which is great given the big payoff in security. So I'm optimistic about scaling it up to whatever settings are needed.

Question: Speaking of
scaling up — to what extent do you look at the user, at human factors? Because in many cases that's the hard part.

Answer: Well, it depends. If you're talking about the humans who are writing the proofs, then that's us — we're very concerned about ourselves, and we make an effort on our own behalf by improving the tools. Or do you mean the users of the software or hardware we're creating, who might in some way be worse off because of this methodology? Here we're talking about full-system specifications, so it's really a question of what you include in your assumptions. We do things like: the top level of the hardware model has an input channel and an output channel; we have no idea what's on the other side, but the specification says "given this input, you produce that output," that sort of thing. I don't know of anyone who's been doing anything like cognitively modeling users and what you should expect from them in this kind of setting. That could be worthwhile, though I'd expect a lot of disagreement about exactly how to write it down. Thanks.

Question: Let me start by saying that I agree with everything, so you can't call me on any of the following — I basically believe that the average industrial programmer will never do proofs. Anyway, Tony Hoare has this quote from a while back: "Researchers into formal methods (and I was the most mistaken among them) predicted that the programming world would embrace with gratitude every assistance promised by formalization to solve the problems of reliability that arise when programs get large and more safety-critical." And then he concludes — and this is the most dismal, sad thing ever — "It has turned out that the world just does not suffer significantly from the kind of problem that our research was originally intended to solve." OK, so why do you believe that programmers will accept
this now, when they haven't for 30 years, and so many research cycles have been spent on ideas that didn't catch on?

Answer: In my experience, at least some isolated communities of developers — like the authors of the main crypto libraries — do worry about correctness, and they're very interested in exploring these kinds of techniques. So that's one answer: let's start with the very high-assurance domains, where people are especially interested and especially willing to accept certain costs for correctness. The other answer I'd give is that I think we're progressively improving usability, and we're creating many component libraries that can do most of the work for you in future projects, so you don't have to start from scratch. I think we're heading toward passing a critical point where the ecosystem is big enough that there's a fundamental change in the cost of applying these tools, and if we get there, then I think that makes the difference for the kind of barrier Hoare was talking about.

Follow-up: But I also think about my example of null references: all you need in order to eliminate that whole class of errors in a language is parametric polymorphism and sum types, and people still haven't adopted that everywhere — but I guess we can discuss this later.

Question: How does the halting problem limit what you can prove, and is that a problem in real-life applications?

Answer: The halting problem applies to arbitrary pieces of code that get thrown down in front of you, about which you then have to answer questions. Luckily, the pieces of code we care about were written by people who had understandings of them in mind, and I personally believe that the author of the code should be responsible for distributing a proof with it. And it is decidable to check whether a proof is convincing — which is the real problem that matters for this domain.

Question: In the examples you gave, the proofs still look scary to me,
and these were very simple examples. So is more optimization possible, and are people working on that?

Answer: There is also a wide variety of approaches to proof automation. I'd say those proofs are pretty short — definitely significantly shorter than the programs they apply to, maybe one-fifth the size of the program — which, if you compare with how these things looked twenty years ago, is a big improvement already, and we can extrapolate from that and hope for even shorter proofs in the future.

Question at microphone 3: You said that in the semantics you use, methods execute atomically. What actually happens if there is a failure during a method?

Answer: Well, this is in hardware land, where in effect everything that runs is just propagation of signals over wires, so you have to plan ahead and detect the failures of guard conditions at the beginning — and then you just don't look at what's flowing down that part of the circuit, knowing that it's safe to ignore.
Programmierumgebung
Analysis
Umwandlungsenthalpie
Minimierung
Compiler
Datenhaltung
Programmverifikation
Programm
Implementierung
NP-hartes Problem
Physikalisches System
Code
Programmfehler
Eins
Systemprogrammierung
Font
Rechter Winkel
Fokalpunkt
Mereologie
Rechenschieber
Notepad-Computer
Normalvektor
Demo <Programm>
Standardabweichung
Instantiierung
Maschinencode
Subtraktion
Compiler
Minimierung
Formale Sprache
Implementierung
Kartesische Koordinaten
Bridge <Kommunikationstechnik>
NP-hartes Problem
Formale Semantik
Zahlensystem
Rechter Winkel
Theorem
Maßerweiterung
Implementierung
Umwandlungsenthalpie
Interpretierer
Approximationstheorie
Softwareentwicklung
Quellcode
Physikalisches System
Software
Festspeicher
Wort <Informatik>
Compiler
Maschinencode
Komponententest
Quader
Desintegration <Mathematik>
Formale Sprache
NP-hartes Problem
Kartesische Koordinaten
Äquivalenzklasse
Raum-Zeit
Code
Physikalische Theorie
Formale Semantik
Physikalisches System
Komponente <Software>
Softwaretest
Einheit <Mathematik>
Rechter Winkel
Code
Theorem
Programmbibliothek
Notepad-Computer
Coprozessor
Kontrast <Statistik>
Softwareentwickler
Softwaretest
Umwandlungsenthalpie
Programmiersprache
Lineares Funktional
Vervollständigung <Mathematik>
Feinstruktur <Mengenlehre>
Raum-Zeit
Abstraktionsebene
Computersicherheit
Gebäude <Mathematik>
Systemintegration
p-Block
Physikalisches System
Modul
Coprozessor
Arithmetisches Mittel
Einheit <Mathematik>
Datenverarbeitungssystem
Komponente <Software>
Mereologie
Aggregatzustand
Standardabweichung
Programm
Maschinencode
Minimierung
Compiler
Abstraktionsebene
Web-Applikation
Formale Sprache
Softwareentwicklung
Framework <Informatik>
Code
Übergang
Virtuelle Maschine
Generator <Informatik>
Theorem
Maschinencode
Lemma <Logik>
Programmierparadigma
Programmbibliothek
Softwareentwickler
Unterring
Elektronische Publikation
Prozess <Informatik>
Statistische Schlussweise
Default
Reflexiver Raum
Tablet PC
Template
Endogene Variable
Zeichenkette
Font
Mailing-Liste
Rechenschieber
Skript <Programm>
Intuitionistische Mathematik
Demo <Programm>
Normalvektor
Theorem
Fehlermeldung
Elektronische Publikation
Datentyp
Statistische Schlussweise
Automatische Handlungsplanung
Schlussregel
Reflexiver Raum
Programmierumgebung
Template
Steuerwerk
Endogene Variable
Steifes Anfangswertproblem
Gruppe <Mathematik>
Lemma <Logik>
Speicherabzug
Skript <Programm>
Datentyp
Vervollständigung <Mathematik>
Mathematik
Statistische Schlussweise
Abstraktionsebene
Güte der Anpassung
Formale Sprache
Physikalisches System
Term
Endogene Variable
Loop
Turing-Test
Theorem
Speicherabzug
Programmbibliothek
Skript <Programm>
Skript <Programm>
Rekursive Funktion
Primitive <Informatik>
Datenstruktur
Softwaretest
Konstruktor <Informatik>
Theorem
Elektronische Publikation
Matching <Graphentheorie>
Statistische Schlussweise
Softwareentwicklung
Term
Kontextbezogenes System
Template
Endogene Variable
Zeichenkette
Pufferspeicher
Puffer <Netzplantechnik>
Mustervergleich
Heegaard-Zerlegung
Skript <Programm>
FAQ
Analysis
Resultante
Elektronische Publikation
Statistische Schlussweise
Term
Template
Statistische Hypothese
Kontextbezogenes System
Endogene Variable
Zeichenkette
Bildschirmmaske
Differenzkern
Wort <Informatik>
Skript <Programm>
Gerade
Leistungsbewertung
Subtraktion
Minimierung
Natürliche Zahl
Formale Sprache
Gruppenkeim
PASS <Programm>
Kontextbezogenes System
Template
Knotenmenge
Algebraische Struktur
Pufferspeicher
Unterring
Abstrakter Syntaxbaum
Total <Mathematik>
Skript <Programm>
Zentrische Streckung
Elektronische Publikation
Mathematik
Statistische Schlussweise
Reflexiver Raum
Physikalisches System
Endogene Variable
Rechenschieber
Zeichenkette
Hilfesystem
Axiom
Speicherverwaltung
Subtraktion
Programmverifikation
Programm
Gruppenkeim
Implementierung
Mathematik
Physikalische Theorie
Richtung
Font
Theorem
Rechenschieber
Notepad-Computer
Beamer
Normalvektor
Demo <Programm>
Umwandlungsenthalpie
ATM
Mathematik
Logiksynthese
Default
Programmverifikation
Mailing-Liste
Physikalisches System
Quick-Sort
ATM
Logiksynthese
Compiler
Formale Sprache
Binärcode
Formale Semantik
Regulärer Graph
Code
Retrievalsprache
Translation <Mathematik>
Funktionale Programmierung
Zeitrichtung
Softwareentwickler
Funktion <Mathematik>
Logiksynthese
Hardwarebeschreibungssprache
Machsches Prinzip
Softwareentwicklung
Physikalisches System
Ein-Ausgabe
Lineares Funktional
Mapping <Computergraphik>
Objekt <Kategorie>
Generator <Informatik>
Formale Sprache
ATM
Dateiformat
Logische Schaltung
Instantiierung
Quader
Open Source
Logische Schaltung
Programmverifikation
Zahlenbereich
Physikalisches System
Assembler
ROM <Informatik>
Coprozessor
Eins
Komponente <Software>
Open Source
Physikalisches System
Modul <Datentyp>
Komponente <Software>
Theorem
Festspeicher
Programmbibliothek
Projektive Ebene
Logische Schaltung
Softwareentwickler
Hardware
Instantiierung
Umwandlungsenthalpie
Theorem
Subtraktion
Hardware
Quader
Kategorie <Mathematik>
Minimierung
Klasse <Mathematik>
Applet
Formale Sprache
Implementierung
Interaktives Fernsehen
Physikalisches System
Ein-Ausgabe
Arithmetischer Ausdruck
Formale Semantik
Komponente <Software>
Datenfeld
Funktion <Mathematik>
Komponente <Software>
Wort <Informatik>
Implementierung
Funktion <Mathematik>
Inklusion <Mathematik>
Programm
Objekt <Kategorie>
Umwandlungsenthalpie
Parametersystem
Kategorie <Mathematik>
Datenparallelität
Minimierung
Formale Sprache
Hierarchische Struktur
Systemaufruf
Implementierung
Softwareentwicklung
Ordinalzahl
Binder <Informatik>
Modul
Systemaufruf
Eins
Objekt <Kategorie>
Modul <Datentyp>
Komponente <Software>
Ablaufverfolgung
Aggregatzustand
Objekt <Kategorie>
Stellenring
Reihenfolgeproblem
Punkt
Formale Sprache
Template
Framework <Informatik>
Code
Kontextbezogenes System
Formale Semantik
Mailing-Liste
Message-Passing
Pufferspeicher
Knotenmenge
Font
Rechenschieber
Skript <Programm>
Modelltheorie
Modul
Demo <Programm>
Normalvektor
Schreiben <Datenverarbeitung>
Hardware
Hardwarebeschreibungssprache
Statistische Schlussweise
Default
Schlussregel
Physikalisches System
Modul
Variable
Systemaufruf
Gruppenoperation
Schlussregel
Endogene Variable
Eingebettetes System
Puffer <Netzplantechnik>
Parametersystem
Dreiecksfreier Graph
Lesen <Datenverarbeitung>
Hilfesystem
Vollständigkeit
Logische Schaltung
Offene Menge
Formale Semantik
Theorem
Maschinencode
Prozess <Physik>
RISC
Sampler <Musikinstrument>
Implementierung
Statistische Hypothese
Kartesische Koordinaten
Oval
Term
Komponente <Software>
Physikalisches System
Umkehrung <Mathematik>
Theorem
CAE
Gerade
Implementierung
Modul
Formale Grammatik
Schreiben <Datenverarbeitung>
Hardware
Default
Programmverifikation
Validität
Physikalisches System
Elektronische Publikation
Modul
Systemaufruf
Schlussregel
Arithmetisches Mittel
Software
Flächeninhalt
Funktion <Mathematik>
Komponente <Software>
Festspeicher
Helmholtz-Zerlegung
Hilfesystem
Wort <Informatik>
Zentraleinheit
Formale Grammatik
Beobachtungsstudie
Mashup <Internet>
Formale Semantik
Softwareentwickler
Prozess <Physik>
Kategorie <Mathematik>
Open Source
Formale Grammatik
Versionsverwaltung
Raum-Zeit
Formale Semantik
Menge
Offene Menge
Theorem
Minimum
Modelltheorie
Coprozessor
p-Block
Softwareentwickler
Zentraleinheit
Gerade
Klon <Mathematik>
Subtraktion
Primideal
Implementierung
Kartesische Koordinaten
Code
Dämpfung
Algorithmus
Kryptologie
Code
Programmbibliothek
Primitive <Informatik>
Auswahlaxiom
Elliptische Kurve
Expertensystem
Expertensystem
Hardware
Kreisfläche
Primzahl
Assembler
Kryptologie
Modul
Arbeit <Physik>
Menge
Vorwärtsfehlerkorrektur
Programmbibliothek
Unternehmensarchitektur
Prozess <Physik>
Primideal
Polynom
Compiler
Programmverifikation
Implementierung
Statistische Hypothese
Assembler
Abstraktionsebene
Whiteboard
Code
Übergang
Virtuelle Maschine
Algorithmus
Elektronische Unterschrift
Code
Protokoll <Datenverarbeitungssystem>
Computersicherheit
Implementierung
Expertensystem
Algorithmus
Mathematik
Protokoll <Datenverarbeitungssystem>
Kategorie <Mathematik>
Logiksynthese
Computersicherheit
Kryptologie
Programmverifikation
Mathematisierung
Quick-Sort
Programmfehler
Arbeit <Physik>
Kategorie <Mathematik>
Projektive Ebene
Schlüsselverwaltung
Demo <Programm>
Prozess <Physik>
Versionsverwaltung
Statistische Hypothese
Template
Font
Umkehrung <Mathematik>
Code
Rechenschieber
Kurvenanpassung
Elliptische Kurve
Hardware
Kryptologie
Softwareentwicklung
Ranking
Generator <Informatik>
Lemma <Logik>
Funktion <Mathematik>
Rechter Winkel
Heegaard-Zerlegung
Benutzerführung
Faltung <Mathematik>
Programmbibliothek
Versionsverwaltung
Standardabweichung
Instantiierung
Theorem
Zahlenbereich
Implementierung
Kurvenanpassung
Domain-Name
Pufferspeicher
Mailing-Liste
Programmbibliothek
Galois-Feld
Demo <Programm>
Einfach zusammenhängender Raum
Elektronische Publikation
Betafunktion
Statistische Schlussweise
Default
Physikalisches System
Mapping <Computergraphik>
Helmholtz-Zerlegung
Hilfesystem
Vorwärtsfehlerkorrektur
Funktionale Programmiersprache
Benutzerführung
Term
Demo <Programm>
Bit
Subtraktion
Selbstrepräsentation
n-Tupel
Versionsverwaltung
Zahlenbereich
Case-Modding
Element <Mathematik>
Oval
Steuerwerk
Multiplikation
Mailing-Liste
Skript <Programm>
Coprozessor
Kurvenanpassung
Demo <Programm>
Gammafunktion
Elektronische Publikation
Prozess <Informatik>
Krümmung
Statistische Schlussweise
Dämon <Informatik>
Elektronische Publikation
Ein-Ausgabe
Endogene Variable
Mapping <Computergraphik>
Tupel
Lemma <Logik>
Existenzsatz
Menge
Ganze Zahl
Hilfesystem
Wort <Informatik>
Logische Schaltung
Aggregatzustand
Resultante
Konstruktor <Informatik>
Bit
Zehn
Rechenzeit
Relationentheorie
Dämon <Informatik>
Zwei
Derivation <Algebra>
Physikalisches System
Gesetz <Physik>
Code
Whiteboard
Endogene Variable
Übergang
Unterring
Mereologie
Skript <Programm>
Computeralgebra
Modelltheorie
Logische Schaltung
Demo <Programm>
Bit
Momentenproblem
Code
Übergang
Gebundener Zustand
Bildschirmmaske
Variable
Pufferspeicher
Skript <Programm>
Kurvenanpassung
Figurierte Zahl
Phasenumwandlung
Analysis
Funktion <Mathematik>
Demo <Programm>
Parametersystem
Hardware
Elektronische Publikation
Rechenzeit
Dämon <Informatik>
Brennen <Datenverarbeitung>
Endogene Variable
Konstante
Speicherverwaltung
Instantiierung
Resultante
Subtraktion
Bit
Mereologie
Gewichtete Summe
Primideal
Sampler <Musikinstrument>
Kurvenanpassung
Implementierung
Code
Multiplikation
Kryptologie
Code
Programmbibliothek
Skript <Programm>
Skript <Programm>
Modelltheorie
E-Mail
Konfigurationsraum
Gerade
Elliptische Kurve
Implementierung
Leistung <Physik>
Primzahl
Benchmark
Mailing-Liste
Physikalisches System
Ein-Ausgabe
Ellipse
Packprogramm
Primzahltest
Ganze Zahl
Zahlenbereich
Festspeicher
Parametersystem
Körpertheorie
Programmbibliothek
Resultante
Bit
Punkt
Compiler
Formale Sprache
t-Test
Versionsverwaltung
Kartesische Koordinaten
Automatische Programmierung
Service provider
Arbeit <Physik>
Reverse Engineering
Theorem
Minimum
Seitenkanalattacke
E-Mail
Gerade
Elliptische Kurve
Umwandlungsenthalpie
Softwaretest
Konstruktor <Informatik>
Schnelltaste
Nichtlinearer Operator
Parametersystem
Lineares Funktional
Dicke
Assembler
Kategorie <Mathematik>
Abstraktionsebene
Kryptologie
Systemaufruf
Benchmark
Softwareentwicklung
Quellcode
Konstante
Software
Druckverlauf
Primzahltest
Menge
Strategisches Spiel
Projektive Ebene
Notepad-Computer
Standardabweichung
Instantiierung
SCI <Informatik>
Subtraktion
Web Site
Multiplikation
Körperarithmetik
Invarianz
Schaltnetz
Zahlenbereich
Implementierung
Code
Framework <Informatik>
Physikalische Theorie
Loop
Virtuelle Maschine
Entwurfsautomation
Perspektive
Inverser Limes
Modelltheorie
Datenstruktur
Zeiger <Informatik>
Speicher <Informatik>
Informatik
Hilfesystem
Analysis
Mathematik
Programmverifikation
Mailing-Liste
Physikalisches System
Programmfehler
Komponente <Software>
Mereologie
Gamecontroller
Körpertheorie
Funktionale Programmiersprache
Innerer Punkt
Einfügungsdämpfung
Modifikation <Mathematik>
Formale Sprache
Ausbreitungsfunktion
Programm
Formale Grammatik
Kartesische Koordinaten
Formale Semantik
Dämpfung
Halteproblem
Kryptologie
Notepad-Computer
Metropolitan area network
Funktion <Mathematik>
Feuchteleitung
Umwandlungsenthalpie
Parametersystem
Hardware
Benutzerfreundlichkeit
Kryptologie
Softwareentwicklung
Ein-Ausgabe
Teilbarkeit
Software
Kritischer Punkt
Konditionszahl
Projektive Ebene
Ordnung <Mathematik>
Logische Schaltung
Notepad-Computer
Aggregatzustand
Varietät <Mathematik>
Subtraktion
Multiplikation
Code
Data Mining
Domain-Name
Computerspiel
Reelle Zahl
Software
Datentyp
Programmbibliothek
Modelltheorie
Softwareentwickler
Maßerweiterung
Autorisierung
Mathematik
Physikalisches System
Quick-Sort
Komponente <Software>
Dreiecksfreier Graph
Hypermedia
Mereologie
Hypermedia
Medianwert
Systemprogrammierung

Metadata

Formal Metadata

Title: Coming Soon: Machine-Checked Mathematical Proofs in Everyday Software and Hardware Development
Series Title: 34th Chaos Communication Congress
Author: Chlipala, Adam
License: CC Attribution 4.0 International: You may use, modify, and reproduce the work or its content for any legal purpose, and distribute and make it publicly available in modified or unmodified form, provided you credit the author/rights holder in the manner specified by them.
DOI: 10.5446/34867
Publisher: Chaos Computer Club e.V.
Publication Year: 2017
Language: English

Content Metadata

Subject Area: Computer Science
Abstract: Most working engineers view machine-checked mathematical proofs as an academic curiosity, if they have ever heard of the concept at all. In contrast, activities like testing, debugging, and code review are accepted as essential. They are woven into the lives of nearly all developers. In this talk, I will explain how I see machine-checked proofs enabling new everyday activities for developers of computer software and hardware. These activities have the potential to lower development effort dramatically, at the same time as they increase our assurance that systems behave correctly and securely. I will give a cosmological overview of this field, answering the FAQs that seem to stand in the way of practicality; and I will illustrate the principles with examples from projects that you can clone from GitHub today, covering the computing stack from digital hardware design to cryptographic software and applications.
Keywords: Resilience
