
6th HLF – Laureate Lectures: Algebra, Logic, Geometry: at the Foundation of Computer Science


Formal Metadata

Title 6th HLF – Laureate Lectures: Algebra, Logic, Geometry: at the Foundation of Computer Science
Title of Series 6th Heidelberg Laureate Forum (HLF), 2018
Author Hoare, C. Antony R.
License No Open Access License: German copyright law applies. This film may be used for your own use, but it may not be distributed via the internet or passed on to external parties.
DOI 10.5446/40180
Publisher Heidelberg Laureate Forum Foundation
Release Date 2018
Language English

Content Metadata

Subject Area Computer Science, Mathematics
Abstract Sir C. Antony R. Hoare: "Algebra, Logic, Geometry at the Foundation of Computer Science" I show by examples how the Theory of Programming can be taught to first-year CS undergraduates. The only prerequisite is their high-school acquaintance with algebra, geometry and propositional calculus. The students' motive is to learn how to construct and debug their programs. I start with the familiar laws of Boolean algebra, illustrated by disjunction. A deductive rule for proof by cases is derived from the algebra. The algebra is extended by the operators of spatial and temporal logic: William of Occam's 'while' (|) and 'then' (;). They satisfy the same familiar algebraic laws, including distribution through disjunction. A weak interchange law describes how they distribute through one another by 'shuffling'. Proof rules are then derived for a modal logic of time and space. If this logic is applied to propositions about the behaviour of programs, the 'while' and 'then' operations can be reinterpreted as concurrent and sequential compositions of programs. The proof rules of the modal logic are then definitionally equivalent to two historic logics due to Hoare and Milner. They are now widely used for mechanical reasoning about the correctness of programs and about implementations of programming languages. The opinions expressed in this video do not necessarily reflect the views of the Heidelberg Laureate Forum Foundation or any other person or associated institution involved in the making and distribution of the video.

Transcript
[Music]

[Moderator] Who knows Algorithm 64? Algorithm 64 was published in 1961, I believe, as Quicksort. I checked on Google Scholar and found some 28,000 papers with "quicksort" in their title. Sir Tony Hoare, our next speaker, invented it as part of, I think, a language translation project, if I remember correctly. But that is actually a small thing among the many things he did: he invented a logic for reasoning about programs and proving them correct; he worked on concurrent processes and how to prove things about them; and he is still working today. He'll be telling us about geometry, logic, algebra and how they relate to computer science. Sir Tony Hoare, we're very glad to have you here. [Applause]
[Tony Hoare] Thank you very much. I wish I could repay your applause by giving a lecture as good as the one we've just heard, but my best is the best I can do, and you will, I hope, find some similarities between my career and that of John Hennessy, and differences in other respects.

Now, I have three theses. First, that the foundations of computer programming can be taught as an aid to practical programming throughout a degree in computing science: not as a separate course, but always as part of the practical course, which will immediately apply what has been learnt in the theory. Second, the program development environment for all practical classes should be based on the theory; it is also very important to have a program development environment of the kind that is now widely available in industry for professional programmers, which helps in at least the testing of programs and, increasingly, in the other phases of development, right up to delivery and beyond. And the final thesis is that the initial level of mathematics required to teach these courses is available in high school. For that I will quote from my own experience.

The summary of what I'm going to talk about is already published in your blue book, so, very quickly, I start with a review of Boolean algebra.
George Boole is clearly the author of Boolean algebra, and he had an interesting life. He was born into a poor family; his father was a cobbler, a shoemaker, who was not a very good businessman, and at the age of twelve George became the principal breadwinner of his family. Within four or five years, when his father died, he was headmaster of a small school, and he later moved to a larger school. He was always interested in mathematics; he read and studied it independently, but in correspondence with the leading mathematicians of his day, who were willing and happy to help him make progress, in much the same way as Augustus De Morgan helped Ada Lovelace in her logical and mathematical investigations. His book was "An Investigation of the Laws of Thought": in those days it was psychology and philosophy that they were interested in, and he gave a mathematical model of intelligent reasoning.

Could you turn me up, please? Thank you. Unfortunately I have caught a cold and my voice is not what it usually is; I don't think I can remedy it very well myself, but I will try.

Because he had to earn his living from the age of twelve on, Boole never took very much in the way of examinations and had no formal qualifications in mathematics; but nevertheless, in 1848 I think it was, he was invited to join the new Queen's College at Cork as a professor of mathematics. (Thank you very much; that's better, is it? Is that better? Yes? Thank you for mentioning it.)

I was interested in George Boole before I myself was invited to be a professor at the Queen's University of Belfast, originally the Queen's College at Belfast, also founded by Queen Victoria in 1848. My mathematical qualification was that I had studied mathematics until the age of fourteen, had given it up since then, and studied it as a hobby, as Boole did. We both share the property that we were appointed as professors into a faculty that we were not qualified to enter as students; I think that reinforces David's story too.
Now let's start. I will give you the first lecture, perhaps, of your computer science course, which is about Boolean logic, and introduce it with an example: the operator 'or'. The important thing is the algebra of the operator: it is associative, commutative and idempotent. And of course its purpose is to be used to prove theorems. I've quoted a theorem that may not be familiar to you: 'or' distributes through itself. Did anybody know that law before I told you? Wow. I'm not quite in the league of Michael Atiyah, but to prove in front of your eyes a law that none of you has heard of before: I'm very happy to do so. Here's the proof. The very full proof shows that the theorem depends on all three of the properties that I've postulated, which is slightly satisfying; the novelty of the theorem is also satisfying; but the real reason why I've chosen to show it to you first is that it's actually useful: I needed it.

Well, I said we would talk about geometry, so obviously I should talk about geometry. The relevant geometry for Boolean algebra is Venn diagrams, and here I've given you a picture of a Venn diagram. A comparison between two Boolean values (and between many) is given in the definition, which shows pictorially that the first operand of the 'less than or equal' is contained in the second operand; it's written as R = P or R. So the first theorem about the ordering operator is that 'or' is an expansive operator: if you disjoin something with P, you can only make it fuller, larger, expanded. The proof is more or less a single line from the definition: if you expand the definition, you get something which is immediately obvious by idempotence. I have suppressed, as mathematicians do, the uses of the associative law, and I'll do that again; it's really what justifies having associativity as an axiom. The value of associativity is that you don't have to use brackets. So that's my introduction to algebra.
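The slide itself is not captured in the transcript; the following is a reconstruction of the laws and of the self-distribution proof in standard notation, with \lor for 'or' (the step-by-step layout is my own):

```latex
% The three laws of disjunction:
%   associativity:  P \lor (Q \lor R) = (P \lor Q) \lor R
%   commutativity:  P \lor Q = Q \lor P
%   idempotence:    P \lor P = P
%
% Theorem ('or' distributes through itself):
%   P \lor (Q \lor R) = (P \lor Q) \lor (P \lor R)
\begin{aligned}
(P \lor Q) \lor (P \lor R)
  &= P \lor (Q \lor (P \lor R))   && \text{associativity}\\
  &= P \lor ((Q \lor P) \lor R)   && \text{associativity}\\
  &= P \lor ((P \lor Q) \lor R)   && \text{commutativity}\\
  &= ((P \lor P) \lor Q) \lor R   && \text{associativity, twice}\\
  &= (P \lor Q) \lor R            && \text{idempotence}\\
  &= P \lor (Q \lor R)            && \text{associativity}
\end{aligned}
% The ordering (containment in the Venn diagram):
%   P \le R   is defined as   R = P \lor R
% Expansion theorem:  P \le P \lor Q, since
%   P \lor (P \lor Q) = (P \lor P) \lor Q = P \lor Q.
```

As the lecture says, all three laws are used in the proof.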
Let me move on to the deductive logic of propositions. Deductive logic goes back to the ancient Greek philosopher Aristotle, who studied in the Academy of Plato and founded another academy, called the Lyceum, where he lectured, and the students wrote the lecture notes, on a number of different subjects, across the faculties of a modern university. I'd like to single out two of his particular discoveries of which we know he was the originator: biology and logic. In all the other lectures he gave, he referred to other ancient Greek philosophers as the originators of ideas, some of which were worse than his own, of course; but for these two he gives no references, and so we have reason to believe that he invented them both. And they are not unconnected, because his logic was designed to be useful in predicting the consequences of his discoveries as a biologist. Nowadays we use advanced mathematics and computers to deduce the consequences of our theories and predict the results of our experiments; Aristotle used deductive logic for the same purpose. I expect you've all seen an Aristotelian syllogism; I've shown a famous example at the top of this slide. The two statements above the line are called antecedents. I don't need to go through all this, you're mathematicians, aren't you? The important thing is that if you already know the algebra, if you know what an algebraic proof is, you can use the algebra to prove the correctness of a deduction rule; and then, to use the rule,
you only have to prove the antecedents first, and after you've done that you can use the consequent anywhere subsequently in your proof. So let's prove a proof rule: it's called proof by cases. First of all, assume the two antecedents. Then you take the idempotence axiom and apply it to R. Now we can use the antecedents as substitution rules to replace both R's by the two different right-hand sides, P or R and Q or R; and finally we can use the distribution of 'or' through 'or' to prove the theorem, and the conclusion follows by the definition of the ordering. Now, the purpose of a proof rule is to give you a strategy for constructing proofs, and that, I think, is very important also as a strategy for constructing programs: they are based on proof rules, as we will see later, I hope.

Now a few theorems: we want to prove some proof rules relating to the ordering relation, and perhaps many of you have memories of going through this little exercise. The nice thing in this case is that each proof rule shown here (reflexivity, transitivity, antisymmetry) depends not on all the axioms of the partial order, but on one axiom each. I will ask you to verify, if you wish, that those derived proof rules are valid.

There's a very important property in computer science called monotonicity, which is a rather horrible word; I prefer to use the categorical term covariance. If you know that two variables or expressions P and Q are relatively ordered, then you can substitute one of them for the other in many interesting contexts, and the result will preserve the ordering of the operands: a covariant, or monotonic, operator, when applied to an operand, preserves the ordering of the operands. This is a very important engineering principle, because when we replace a component in an engineering assembly by one which is better, then the assembly cannot get worse; it can only get better. If you put a more reliable component into your assembly, the assembly will become more reliable. That is not always true: if the laws and the principles you are using in your design are incorrect, it can fail, so the science has got to be right. But if everything is working as it normally does, then improving components will improve the product. It's a very important principle also in the delivery of computer programs and in their optimization: optimize a small part, and you won't make the whole thing worse. That's the lesson to learn.
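The proof-by-cases derivation just described, written out in the same reconstructed notation (again my own rendering of the slide):

```latex
% Proof by cases: from P \le R and Q \le R, infer P \lor Q \le R.
% The antecedents, by definition of the ordering:
%   R = P \lor R   and   R = Q \lor R
\begin{aligned}
R &= R \lor R                     && \text{idempotence, applied to } R\\
  &= (P \lor R) \lor (Q \lor R)   && \text{antecedents, used as substitutions}\\
  &= (P \lor Q) \lor R            && \text{self-distribution of } \lor \text{ (with commutativity)}
\end{aligned}
% Hence R = (P \lor Q) \lor R, i.e. P \lor Q \le R by definition.
%
% Covariance (monotonicity): for each operator F of the algebra,
%   if P \le Q then F(P) \le F(Q).
```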
Now let me move on to a branch of logic which is not so widely studied: spatial and temporal logic, sometimes called modal logics. Ever since Aristotle it has in fact been studied by philosophers and logicians, especially in the Middle Ages, where they hoped to find in a modal logic a way of reconciling the great theological paradox of the simultaneous benevolence and omnipotence of the deity. I'm going to draw your attention to William of Occam, who was the author of Occam's razor, which I've always used as my philosophical and mathematical goal: not to make unnecessary postulates about entities. And partly because William of Occam was a Franciscan friar and a scholar at Merton College, Oxford. Many centuries later I was also a scholar at Merton College, Oxford, where I studied philosophy; it was at Merton that I pursued my studies of logic, in my spare time, and obtained my only degree from any university, an MA in philosophy from Oxford University. The point about temporal and spatial logic is that it gives you two more methods of composing propositions.
The propositions describe actions occurring in a particular region of space during some interval of time. So, if you'd like to think of a fairly large proposition, take a volume called "A History of France, 1789 to 1815". This is of course split into two volumes, one of which goes from 1789 to 1800 and one from 1800 to 1815; that split is a sequential composition. The two volumes sitting next to each other on the shelf are sequentially composed, because they relate to immediately adjacent intervals of time; and similarly for a history of England during the same period. If you wish to spread your interest further over Europe, if you have a history of England over the same period in your library, on your shelf, then it's related to the history of France by being run concurrently: everything that actually happens happens in one of the two sub-intervals.

I think this slide is more or less a repetition, but it's an important concept. We can talk about P and Q in terms of their start times and their end times, and we can define P concurrent with Q (P 'while' Q) by starting them both simultaneously and finishing them both simultaneously; the actions are, in each case, a disjoint union of the actions in the two intervals. Now the axioms: we have a special case of a proposition which says that nothing has happened, and we use it as the unit of our sequential and concurrent compositions. Both of these compositions are associative, and they have the same unit. It is very unusual in algebra to find two operators with the same unit, but it has some consequences that are very important for applications to concurrency.
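A compact summary of the two compositions and the laws just stated, as I read them (writing 1 for the 'nothing happens' proposition; the symbol is my choice, and ; and | stand for the lecture's 'then' and 'while'):

```latex
% P ; Q  -- 'then':  P holds on an interval immediately followed by one where Q holds
% P | Q  -- 'while': P and Q hold on the same interval, over disjoint parts of the action
\begin{aligned}
(P ; Q) ; R &= P ; (Q ; R)  && \text{; is associative}\\
(P | Q) | R &= P | (Q | R)  && \text{| is associative}\\
P ; 1 = P &= 1 ; P          && \text{shared unit}\\
P | 1 = P &= 1 | P          && \text{(two operators, one unit)}\\
(P \lor Q) ; R &= (P ; R) \lor (Q ; R)  && \text{both distribute through } \lor\\
(P \lor Q) | R &= (P | R) \lor (Q | R)
\end{aligned}
```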
So we can prove a proof rule, sequential proof by cases, which gives you two antecedents that differ only in the first argument of the semicolon, P in one of them and P′ in the other, and prove a simple theorem which is going to be useful: by assuming the two antecedents we can quite rapidly prove that the new proof rule is true. And here's a proof rule which relates to sequential composition alone, which uses the associativity of the operator as its primary key; again, a simple substitution in what I've singled out in red leads to a conclusion which turns out, by the definition of the ordering operator, to be equivalent to the consequent. There are various corollaries of this sequential composition rule: if I replace Q by 'do nothing', I get the second of these lines, and if I replace R by 'do nothing', I get the third of them.

Now, a brief note on unifying theories of programming. They're very important to my central project, because you won't be able to teach the theory to your class unless you teach them the same theory throughout the course. All the tools must be based on the same theory, or they're going to be inconsistent; and the actual language that you teach doesn't matter, as long as it obeys the same laws and the same theory. Or, where languages don't agree, it's very interesting to point that out; at least you've got a theoretical framework in which you can find out, in a very deep sense, why two programming languages are different.

So let me give you the two interpretations of the ordering relationship that I've been using, in terms of programs. If P describes what happens from the start of R (the two sides of the equation have the same start, of course) and Q describes what happens from the end of P up to the end of R, then R correctly describes the whole of the proposition P;Q, which holds throughout the interval of R itself: the intervals on both sides are the same, and if P;Q holds in one of them it must hold in the other. Alternatively you can say: supposing R is a program and you want to execute it, then one of the ways you can do it is first to execute P, and this is followed by Q. So I am going to define the Hoare triple as just this proposition that I've been describing. Now, when I look back at the proof rule for the semicolon which I've just given you and do an absolutely symbol-by-symbol translation using this definition, I get the Hoare rule for semicolon, a direct consequence of associativity, and the two corollaries are what appear in my original paper as the rules of consequence. That was a shattering discovery to make in my more advanced years, I may say, because it suggests that Hoare logic is much more complicated than it really needs to be.
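Written out, the definition and the derived rule look like this (my transcription of the slide, using the notation above):

```latex
% The Hoare triple, defined from the ordering:
%   {P} Q {R}   is defined as   P ; Q \le R
%
% The rule for semicolon, a symbol-by-symbol translation of the
% sequential proof rule (it needs only associativity and covariance of ;):
\frac{\{P\}\; Q \;\{S\} \qquad \{S\}\; Q' \;\{R\}}
     {\{P\}\; Q ; Q' \;\{R\}}
% since  P ; (Q ; Q') = (P ; Q) ; Q' \le S ; Q' \le R.
%
% Replacing Q or Q' by 'do nothing' (the unit 1) yields the two corollaries,
% the rules of consequence:
%   from P \le S and {S} Q {R}, infer {P} Q {R};
%   from {P} Q {S} and S \le R, infer {P} Q {R}.
```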
The algebra from which I've deduced these axioms is a great deal shorter and a great deal simpler than the laws known as Hoare logic. So I tended to use the word 'Hoare' to mean the old Hoare, before I came to this realization; and then I realized that that could be misunderstood: that I was now the old Hoare, and the other one was my previous incarnation.

Now I'm going to use the second interpretation to define Robin Milner's notation, by which he defines languages such as CCS. The superscripted notation says that if you want to execute R, then one of the ways of doing it is to execute Q first and then P. When I substitute that definition into the same theorem, I get the Milner law for the semicolon. It's not actually due to Milner, but it's what Milner would have written if he'd been interested in the semicolon, which he wasn't. And the two corollaries are what he would call structural rules; he would express them as equivalences, I express them as orderings.

So we have two theories. Robin Milner was a great friend of mine; we always regarded each other as scientific rivals, and we fought like anything about the theories we were promoting at that time. In fact we collaborated on a three-year and very expensive research grant to try to bring our theories together into a single unified theory, which was very necessary, because you can't really sell to industry any theory on which the experts disagree. If you go into an engineer's office and say you have this marvellous idea, they will ask you what you thought of the theory that the last person who came into the office was selling; and if you say "well, that's all rubbish really", he says "well, that's what they said about your theory; go away, work something out, and come back when you agree". And we never did come back. Now we have agreed, unfortunately too late for Robin, but I'd like to say this as a tribute to him. There are lots of other proofs of the consistency of a correctness logic with an operational semantics, but they occupy many thousands of pages of symbols.

How are we doing? Maybe I could introduce you to the basic principle of concurrency. I need a new law which describes the principle that a concurrent program can be executed sequentially by interleaving, something which has been well known since the days of Turing; but the proof of it was pretty difficult, because it had to use an interpretation in terms of a Turing machine. If you make it an axiom, firstly it will actually be true, and secondly it will be much simpler; you won't have a proof at all, because it's an axiom. Well, first of all I've got to introduce concurrency. It really has the same laws: it's obviously associative, it has a unit, and it distributes through 'or'. The new axiom that I need is called the interchange law, in category theory and in my theory now, and it looks a little bit complicated. I think it's called the interchange law because it interchanges the symbols which appear in the formula: the semicolon and the parallel bars. So let's have a look at it again; I've used colouring.
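The Milner-style definition and the interchange law, again in reconstructed notation (the arrow approximates Milner's transition notation, and the direction of the inequality is as I understand it from the lecture):

```latex
% Milner transition, defined from the same ordering:
%   R --Q--> P   is defined as   Q ; P \le R
% 'one of the ways of executing R is to execute Q first and then P'
%
% The (weak) interchange law, exchanging ; and | :
(P | Q) ; (P' | Q') \;\le\; (P ; P') \,|\, (Q ; Q')
```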
Trying to deduce some of the consequences of the interchange law: we can prove that sequential composition is stronger than, contained in, parallel composition. If we start with P parallel Q, we can execute them sequentially, in the order P then Q, or Q then P. I hope you see what I mean; I think the slide is obscured by a misprint. Anyway, the point is that we can now use the law of proof by cases to establish that a non-deterministic choice, the 'or' between P;Q and Q;P, is one of the ways of implementing P parallel Q. So for no more than two operands I have proved that, if I interpret the ordering as a refinement ordering ('can be implemented as'), then I've proved the principle of concurrency in this very simple case. For the general case: here I've listed the four different ways of executing programs with three operands, and here I've speeded up to deal with four operands. In the end you get an interleaving of the red operands and the black operands, but within each operand the original order is preserved: we can take the red cards out and the black cards out, and they will be in their original order. So the interleaving operator on the first line is often called a shuffle operator, because it satisfies this property.
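To make the shuffle reading concrete, here is a minimal Python sketch (my own illustration, not from the talk) that enumerates the interleavings of two sequences, the 'red' and 'black' cards, and checks that each colour keeps its original order, and that both purely sequential executions appear among the interleavings:

```python
from itertools import combinations

def interleavings(p, q):
    """All interleavings of p and q: choose which positions of the
    merged run are taken by p; order within p and within q is preserved."""
    n, m = len(p), len(q)
    runs = set()
    for ppos in combinations(range(n + m), n):
        merged, pi, qi = [], 0, 0
        for k in range(n + m):
            if k in ppos:
                merged.append(p[pi]); pi += 1
            else:
                merged.append(q[qi]); qi += 1
        runs.add(tuple(merged))
    return runs

red, black = ("r1", "r2"), ("b1", "b2")
runs = interleavings(red, black)

# Each colour keeps its own order inside every interleaving...
for run in runs:
    assert [c for c in run if c in red] == list(red)
    assert [c for c in run if c in black] == list(black)

# ...and the two sequential runs 'red then black' and 'black then red'
# are among them, matching (P;Q) or (Q;P) <= P | Q in the lecture's ordering.
assert tuple(red + black) in runs and tuple(black + red) in runs
print(len(runs), "interleavings")   # 6 for two two-card hands
```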
Am I doing all right; do I have five minutes? Thank you, then I can go quickly through the interchange rule. From the statement of the interchange law I can give a simple proof of an appropriate rule of reasoning, which is displayed on this slide. The conclusion says that if you've got to implement a concurrent composition, it may be implemented as a sequential composition in which you've taken the operands, the dashed pair and the undashed pair, and run them separately, which describes the calculation of R parallel R′ in two phases. I've used the wrong letters, I'm afraid: R is implemented by P semicolon Q, and on the right-hand side I start by executing P in parallel with P′, as shown here, up to the point at which R reaches its semicolon and the point at which R′ reaches its semicolon. Those two semicolons are, as it were, synchronized at that point, and the remainder that still remains to be executed is the parallel composition of Q and Q′.

Summarizing: when we translate these laws according to the same definitions that I've given for the ordering previously, we get a law which was rediscovered by Peter O'Hearn in his separation logic, one of the most effective ways of dealing in a rigorous way with concurrency, expressed as a rule in Hoare logic; and we also get, in an operational semantics, a law which Milner postulated as defining sequential composition in his language CCS. They are both equivalent to the concurrency rule, which was proved directly from the algebraic axioms.
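The concurrency rule just mentioned, the form rediscovered by Peter O'Hearn in separation logic, reads as follows in the triple notation defined earlier (my transcription; the two-line proof is the point of the algebraic approach):

```latex
% Concurrency rule:
\frac{\{P\}\; Q \;\{R\} \qquad \{P'\}\; Q' \;\{R'\}}
     {\{P | P'\}\;\; Q | Q' \;\;\{R | R'\}}
% Proof from the axioms:
%   (P | P') ; (Q | Q') \le (P ; Q) | (P' ; Q')   -- interchange law
%                       \le R | R'                -- covariance of | and the antecedents
```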
So: algebra. I think algebra is wonderful; you can read my little eulogy of algebra. But it's also so useful, because algebra is what is used in any correct transformation of a program while preserving its semantics. All optimizations depend on applications of algebraic inequations, algebraic orderings, refinements; and if you want to preserve the semantics, you can apply algebra in compilation to translate from one language to another, in optimization, in refactoring of code, in obfuscation of code, and in automatic generation of program code. All of these things can be set as exercises to your first-year students: optimize this formula.

Unification of theories is really very similar to the unification of arithmetic that is provided in the foundations of mathematics. The mathematical models given in the foundations of mathematics use sets and pairs and sequences, and every kind of number has a very different modelling structure; but the algebraic laws which they obey are very similar. This is what I really mean by a unification of theories: not absolute identity, but really understanding the fundamentals of what makes the theories what they are, how similar they are, and why they are different.

Now, here are some deficiencies of the algebraic approach; they explain why we need geometry as well, to serve as a model for the algebra. In another lecture I have given a description of how geometry, the kinds of finite geometry, underlies the algebra that I've been describing in this lecture; if you look back at the Heidelberg lectures of 2016, you'll see me giving that lecture. Anybody against algebra? Anybody? Thank you. [Applause]

[Moderator] So we'd have time for one or two questions; and don't run away, there's supposed to be a picture happening. If you don't ask a question, I will give you some other people's answers.

[Audience] Well, you told us the rather reasonable idea that if you improve one part of an operation, it won't make the whole thing worse; but then you warned us that perhaps that idea might not be true if your theory was wrong. Could you give an example of when that might be untrue?

[Tony Hoare] Yes,
it seems to happen whenever I replace the tap in my home. Can I think of an example? Let's put it this way: when it does happen, you have to change your theory. It's like an experiment that has disproved your theory: when the law of refinement is not monotonic, not covariant, you don't want to make that mistake again, so you have to revise your theory. I'm afraid I can't, offhand, think of a nice example.

[Moderator] We'll try one more question. Well, if there are no takers: thank you very much for this wonderful lecture. [Applause] [Music]