## 6th HLF – Laureate Lectures: Algebra, Logic, Geometry at the Foundation of Computer Science


- **Title:** 6th HLF – Laureate Lectures: Algebra, Logic, Geometry at the Foundation of Computer Science
- **Series:** 6th Heidelberg Laureate Forum (HLF), 2018
- **Author:** Hoare, C. Antony R.
- **License:** No Open Access License. German copyright law applies; this film may be used for your own use, but it may not be distributed via the internet or passed on to external parties.
- **DOI:** 10.5446/40180
- **Publisher:** Heidelberg Laureate Forum Foundation
- **Release Date:** 2018
- **Language:** English
- **Subject Area:** Mathematics

### Abstract

Sir C. Antony R. Hoare: "Algebra, Logic, Geometry at the Foundation of Computer Science"

I show by examples how the Theory of Programming can be taught to first-year CS undergraduates. The only prerequisite is their high-school acquaintance with algebra, geometry and the propositional calculus. The students' motive is to learn how to construct and debug their programs. I start with the familiar laws for disjunction in Boolean algebra. A deductive rule for proof by cases is derived from the algebra. The algebra is then extended by the operators of spatial and temporal logic: William of Occam's 'while' (|) and 'then' (;). They satisfy the same familiar algebraic laws, including distribution through disjunction. A weak interchange law describes how they distribute through one another by 'shuffling'. Proof rules are then derived for a modal logic of time and space. If this logic is applied to propositions about the behaviour of programs, the 'while' and 'then' operators can be reinterpreted as the concurrent and sequential compositions of programs. The proof rules of the modal logic are then definitionally equivalent to two historic logics due to Hoare and Milner, which are now used widely for mechanical reasoning about the correctness of programs and about implementations of programming languages.

The opinions expressed in this video do not necessarily reflect the views of the Heidelberg Laureate Forum Foundation or any other person or associated institution involved in the making and distribution of the video.
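As a concrete companion to the abstract's starting point, the familiar laws for disjunction and the derived rule of proof by cases can be machine-checked exhaustively over the two-element Boolean algebra. A minimal Python sketch (the encoding, including the name `leq` for the ordering defined by p ∨ q = q, is my own, not from the lecture):

```python
from itertools import product

def leq(p, q):
    # The ordering on propositions: p ⊑ q is defined by p ∨ q = q.
    return (p or q) == q

for p, q, r in product([False, True], repeat=3):
    # Familiar laws for disjunction.
    assert (p or p) == p                        # idempotence
    assert (p or q) == (q or p)                 # commutativity
    assert ((p or q) or r) == (p or (q or r))   # associativity
    # Derived proof rules for the ordering.
    assert leq(p, p)                            # reflexivity
    if leq(p, q) and leq(q, r):
        assert leq(p, r)                        # transitivity
    if leq(p, r) and leq(q, r):
        assert leq(p or q, r)                   # proof by cases
```

Because the algebra is finite here, every law is checked by brute force; the same checks make a natural first-year exercise before attempting the algebraic proofs by hand.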

### Transcript

[Music] [Moderator] Who knows Algorithm 64? Quicksort was published as Algorithm 64 in 1961, I believe. I checked on Google Scholar and found some 28,000 papers with "quicksort" in their title. Sir Tony Hoare, our next speaker, invented it as part of a language-translation project, if I remember correctly, but that is actually a small thing among the many things he did. He invented a logic for arguing about programs and proving them correct, and he worked on concurrent processes and how to prove things about them; and he is still working. Today he will be telling us about geometry, logic, algebra, and how they relate to computer science. Sir Tony Hoare, we are very glad to have you here. [Applause]

[Hoare] Thank you very much. I wish I could repay
your applause by giving a lecture as good as the one we have just heard, but this is the best I can do. You will, I hope, find some similarities between my career and that of John Hennessy, and differences in other respects. Now, I have three theses. First, that the foundations of computer programming can be taught as an aid to practical programming throughout a degree in computing science: not as a separate course, but always as part of the practical course, which will immediately apply what has been learnt in the theory. Second, that the program development environment for all practical classes should be based on the theory; it is also very important to have a program development environment of the kind that is now widely available in industry for professional programmers, which helps at least in the testing of programs and increasingly in the other phases of the development of the program, right up to delivery and beyond. And the final thesis is that the initial level of mathematics required to teach these courses is available in high school; and for that I will quote from my own experience. The summary of what I am going to talk about is already published in your blue book, so, very quickly, I start with a review of Boolean algebra. George Boole is clearly…
…you only have to prove the antecedents first, and after you have done that you can use the consequent anywhere subsequently in your proof. So let us try to prove a proof rule, called proof by cases. First of all, assume the two antecedents; then take the idempotence axiom and apply it to R. Now we can use the antecedents, with the substitution rule, to replace both R's by the two different right-hand sides, P or R and Q or R; and finally we can use the distribution of 'or' through 'or' to prove the theorem. The conclusion follows by the definition of the ordering. Now, the purpose of a proof rule is to give you a strategy for constructing proofs, and that, I think, is very important also as a strategy for constructing programs: they are based on proof rules, as we will see later, I hope. Next, a few theorems relating to the ordering relation; perhaps many of you have memories of going through this little exercise of proving the proof rules. The nice thing in this case is that each proof rule shown here (reflexivity, transitivity, antisymmetry) depends not on all the axioms of the partial order, but on just one axiom each, the one relating to its definition. I will ask you to verify, if you wish, that those derived proof rules are valid. Now, there is a very important property in computer science called monotonicity, which is a rather horrible word; I prefer to use the categorical term covariance. If you know that two variables or expressions P and Q are relatively ordered, then you can substitute one of them for the other in many interesting contexts, and the result will preserve the ordering of the operands. A covariant, or monotonic, operator, when applied to an operand, preserves the ordering of the operands. It is a very important engineering principle, because when we replace a
component in an engineering assembly by one which is better, the assembly cannot get worse; it can only get better. If you put a more reliable component into your assembly, the assembly will become more reliable. That is not always true if the laws and the principles you are using in your design are incorrect, so the science has got to be right; but if everything is working as it normally does, then improving components will improve the product. It is a very important principle also in the delivery of computer programs and in their optimization: optimize a small part, and you won't make the whole thing worse. That is the principle; lesson learnt. Let us move on to a branch of logic which is not so widely studied: spatial and temporal logic, sometimes called modal logic. It has in fact been studied by philosophers and logicians ever since Aristotle, and especially in the Middle Ages, when they hoped to find in modal logic a way of reconciling the great theological paradox of the simultaneous benevolence and omnipotence of the deity. I am going to draw your attention
to William of Occam, the author of Occam's razor, which I have always used as my philosophical and mathematical goal: not to make unnecessary postulates about entities. Partly, too, because William of Occam was a Franciscan friar and a scholar at Merton College, Oxford, and many centuries later I was also a scholar at Merton College, Oxford. It was at Merton that I pursued my studies of logic, in my spare time, and obtained my only degree from any university: an MA in philosophy from Oxford University. The point about temporal and spatial logic is that they give you two more methods of composing propositions. If your propositions are describing something, then, on the next slide here, the propositions…
… The algebra from which I have deduced these axioms is a great deal shorter and a great deal simpler than the laws now known as Hoare logic. So I tend to use the word "Hoare" to mean what I called the old Hoare, before I came to this realization; and then I realized that that could be misunderstood, as though I were now the old Hoare and the other one was my previous incarnation. Now I am going to use the second interpretation to define Robin Milner's notation, by which he defines languages such as CCS. The superscripted notation says that if you want to execute R, then one of the ways of doing it is to execute Q first and then P. When I substitute that definition into the same theorem, I get Milner's law for the semicolon. It is not actually due to Milner, but it is what Milner would have written if he had been interested in the semicolon, which he wasn't. The two corollaries are what Milner would have called structural rules; he would express them as equivalences, whereas I express them as orderings. So we have two theories. Robin Milner was a great friend of mine, and we always regarded each other as scientific rivals; we fought like anything about the theories we were promoting at that time. In fact we both collaborated on a three-year and very expensive research grant to try to bring our theories together into a single unified theory, which was very necessary, because you cannot really sell to industry any theory on which the experts disagree. If you go into an engineer's office and announce your marvellous idea, he will ask you what you thought of the theory of the last person who came into his office; and if you say "well, that's all rubbish really", he says "well, that's what they said about your theory; go away, work something out, and come back when you agree". And we never did come back. Now we have agreed, unfortunately too late for Robin, but I like to say this as a tribute
to him. There are lots of other proofs of the consistency of a correctness logic with an operational semantics, but they occupy many thousands of pages of symbols. How are we doing? Maybe I can introduce you to the basic principle of concurrency. I need a new law which describes the principle that a concurrent program can be executed sequentially, by interleaving: something which has been well known since the days of Turing, but the proof of it was pretty difficult, because it had to use an interpretation by a Turing machine. If you make it into an axiom, then firstly, as Turing has shown, it will actually be true, and secondly, it will be much simpler; indeed you won't have a proof at all, because it is an axiom. First of all I have to introduce concurrency. It has the same laws, really: it is obviously associative, it has a unit, and it distributes through 'or'. The new axiom that I need is called the interchange law, in category theory and in my theory now, and it looks a little bit complicated. I think it is called the interchange law because it interchanges the symbols which appear in the formula: the semicolon and the parallel bars. Let us have a look at it again; I have used colouring. Trying to deduce some of the consequences of the interchange law, we can prove that sequential composition is stronger than, or contained in, parallel composition. If we start with P parallel Q, we can execute the operands in the order P then Q, or in the order Q then P; the point, I think, is obscured by a misprint on the slide. Anyway, the point is that we can now use the law of proof by cases to establish that the non-deterministic choice, the 'or', between P-semicolon-Q and Q-semicolon-P is one of the ways of implementing P parallel Q. So for sequences of no more than two operands I have proved that, if I interpret the ordering as the refinement ordering "can
be implemented as", then I have proved the principle of concurrency in this very simple case. For the general case: here I have listed the four different ways of executing programs with three operands, and here I have speeded up to deal with four operands. In the end you get an interleaving of the red operand and the black operand, but within each operand the original order is preserved: as with a shuffled pack of cards, we can take the red cards out, and the black cards out, and each will still be in its original order. So the interleaving operator on the first line is really a shuffle; it is often called the shuffle operator, because it satisfies this property.
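The shuffle operator, and the interchange law built on it, can be made concrete in a tiny trace model: a program is modelled as a set of event sequences, the semicolon as pointwise concatenation, and the parallel bars as shuffling. This Python encoding is an illustration of my own, not taken from the lecture slides:

```python
def shuffle(xs, ys):
    """All interleavings of two tuples, each preserving its own internal
    order, like riffling a red pack and a black pack of cards together."""
    if not xs:
        return {ys}
    if not ys:
        return {xs}
    return ({(xs[0],) + t for t in shuffle(xs[1:], ys)} |
            {(ys[0],) + t for t in shuffle(xs, ys[1:])})

def seq(A, B):
    # Sequential composition of trace sets: each trace of A, then one of B.
    return {a + b for a in A for b in B}

def par(A, B):
    # Parallel composition of trace sets: all interleavings.
    return {t for a in A for b in B for t in shuffle(a, b)}

# Sequential composition refines parallel composition (P;Q ⊑ P‖Q),
# read as trace-set inclusion.
P, Q = {('p',)}, {('q',)}
assert seq(P, Q) <= par(P, Q)

# The interchange law: (P ‖ P') ; (Q ‖ Q')  ⊑  (P ; Q) ‖ (P' ; Q').
P_, Q_ = {("p'",)}, {("q'",)}
assert seq(par(P, P_), par(Q, Q_)) <= par(seq(P, Q), seq(P_, Q_))
```

In the last assertion the left-hand side is in fact a proper subset of the right: the two-phase schedule, synchronizing at the semicolons, exhibits only some of the behaviours of the full parallel composition, which is precisely the refinement the rule asserts.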
Am I doing all right? Do I have five minutes? Thank you; then I can go quickly through the interchange rule. From the statement of the interchange law I can give a simple proof of an appropriate rule of reasoning, which is displayed on this slide. Its conclusion says that if you have to implement a concurrent composition, it may be implemented as a sequential composition in which you have taken one pair of operands, either the dashed pair or the undashed pair, and run them separately. This describes the calculation of R parallel R-dashed in two phases. R is implemented by P-semicolon-Q, and on the right-hand side I start by executing P in parallel with P-dashed, as shown here, up to the point at which R reaches its semicolon and R-dashed reaches its semicolon. Those two semicolons are, as it were, synchronized at that point, and the remainder that still remains to be executed is the parallel composition of Q and Q-dashed. Summarizing: when we translate these laws according to the same definitions that I have given for the ordering previously, we get a law which was rediscovered by Peter O'Hearn in his separation logic, which is one of the most effective ways of dealing in a rigorous way with concurrency, and which is expressed as a rule in Hoare logic; and we also get, as an operational semantics, a law which Milner postulated as defining sequential composition in his language CCS. They are both equivalent to the concurrency rule, which was proved directly from the algebraic axioms. So: I think algebra is wonderful. It is so much simpler; read my little eulogy of algebra. But it is also so useful, because algebra is what is used in any correct transformation of a program while preserving its semantics. So all optimizations are dependent
on applications of algebraic inequations: algebraic orderings, refinements. If you want to preserve the semantics, you can apply algebra in compilation, to translate from one language to another; in optimization; in refactoring of code; in obfuscation of code; and in the automatic generation of program code. All of these things can be set as exercises for your first-year students: optimize this formula. The unification of theories is really very similar to the unification of arithmetic that is provided in the foundations of mathematics. The mathematical models given in the foundations of mathematics use sets and pairs and sequences, and every kind of number has a very different modelling structure, but the algebraic laws which they obey are very similar. That is what I really mean by a unification of theories: not absolute identity, but really understanding the fundamentals of what makes them what they are, how similar they are, and why they are so different. Now, here are some deficiencies of the algebraic approach, and they explain why we need geometry as well, to serve as a model for the algebra. In another lecture I have given a description of how geometry, the kind of finite geometry, underlies the algebra that I have been describing in this lecture; if you look back at the Heidelberg lectures of 2016, you will see me giving such a lecture. Anybody against algebra? Anybody? Thank you.

[Moderator] We have time for one or two questions; and, audience, don't run away, there is supposed to be a picture happening. If you don't ask a question, I will give you some other people's answers to this question.

[Question] You told us the rather reasonable idea that if you improve one part of an operation, it won't make the whole thing worse; but then you warned us that perhaps that idea might not be true if your theory was wrong. Could you give an example of when that might be untrue?

[Hoare] Yes,
it seems to happen whenever I replace the tap in my home. Can I think of an example? Let us put it this way: when it does happen, you have to change your theory. It is like an experiment that has disproved your theory: when the law of refinement is not monotonic, is not a covariance, you don't want to make that mistake again, so you have to revise your theory. I am afraid I cannot, offhand, think of a nice example.

[Moderator] We will try one more question. Well, if there are no takers: thank you very much for this wonderful lecture. [Music]
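Hoare's closing point, that compilation, optimization, refactoring and code generation are all applications of algebraic orderings and equations, can be set as exactly the kind of first-year exercise he mentions ("optimize this formula"). A toy rewriter over expression trees, where each rewrite step is justified by an algebraic law such as x + 0 = x or x × 1 = x (the encoding is my own):

```python
def simplify(e):
    """Bottom-up rewriting of expression trees built from ('+'/'*', a, b)
    tuples over atoms, where each rewrite is an algebraic law:
    x + 0 = x, x * 1 = x, x * 0 = 0."""
    if not isinstance(e, tuple):
        return e  # an atom: a variable name or a constant
    op, a, b = e[0], simplify(e[1]), simplify(e[2])
    if op == '+' and a == 0:
        return b
    if op == '+' and b == 0:
        return a
    if op == '*' and (a == 0 or b == 0):
        return 0
    if op == '*' and a == 1:
        return b
    if op == '*' and b == 1:
        return a
    return (op, a, b)
```

For example, `simplify(('*', ('+', 'x', 0), 1))` returns `'x'` by two applications of the unit laws; because every rewrite step is a valid equation, the optimized expression is guaranteed to denote the same value as the original.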
