Decoding algorithms for topological codes
Speech transcript (auto-generated)
00:00
He was invited to give this talk, but unfortunately he is in India, so I would like to thank the organizers for giving me the opportunity to replace him today and to present the results we obtained in the last couple of years regarding the decoding problem for topological codes. A big part of this work was done in collaboration with Héctor Bombín from the Perimeter Institute, who introduced this idea of a local equivalence of topological codes, and we are going to discuss the application of this mapping to quantum error correction. My title
00:46
was quite simple and explicit, I guess, and given all that we have heard up until now I don't think I need to make much effort to convince you that topological codes are interesting. Let me summarize this by saying that they are promising candidates for fault-tolerant quantum computation. But as a physicist I see another interest in these codes, and I would like to emphasize that. First of all, they are physically relevant because they have local interactions, which seems physically realistic: if two systems are far from one another, you don't expect them to couple directly. Then there is this idea of topological order, which can get very abstract very fast, whereas these systems are pretty simple, we can solve them exactly, and they give us examples of topological order. In the same spirit, they give us examples of anyonic excitations and anyonic models, so we can study those as well. Moreover, the concepts introduced for topological codes, for instance twists or boundaries, have a very natural physical interpretation: twists are permutations of the anyons, so they are symmetries of the anyon model, and boundaries have to do with particle or string condensation. This being said, here is what I would like to cover. I will focus on points three to five; the first two we have already heard pretty much everywhere, so I will go over them quite fast, and, time permitting, I would like to discuss new ideas we have regarding this problem. The main part of the talk introduces a method to decode the toric code which was inspired by our physics background, some kind of RG flow that we use to decode. I will discuss the results obtained for the toric code, and then I would like to present this application of the local equivalence mapping to quantum
error correction, to extend the decoder to other codes, for instance color codes. We have already seen the toric code, so let me just recall the physical picture, if you will. We take a system of qubits sitting on the edges of a lattice and define a Hamiltonian on it which is just minus the sum of all the stabilizer generators. These are of two kinds: loop operators of X's located at each vertex of the lattice, and on each plaquette a similar operator which is a loop of Z's. Since the ground space is our code space, we actually have a stabilizer code, meaning the states we keep are the +1 eigenstates of each of these generators. There is a nice link between the topology and the logical action of operators. Simply put, if you have some string operator which closes into a loop and which can be contracted to a point, then it has a trivial effect on the code. Non-trivial cycles, on the other hand, are non-trivial logical operators, because you can't shrink them down: they wrap around the lattice if you consider, for instance, periodic boundary conditions. Counting them, you find two logical qubits encoded in this structure. So there is this nice link between homology and the logical effect of string operators.
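To make the stabilizer structure concrete, here is a small sketch (our own illustration, not the speaker's; the edge-labelling convention is ours). Qubits sit on the edges of an L x L periodic lattice; a star applies X to the four edges at a vertex, a plaquette applies Z to the four edges around a face, and an X-type and a Z-type Pauli operator commute exactly when their supports overlap on an even number of edges:

```python
# Toric-code commutation check on an L x L torus.  Edges are labelled
# (x, y, d) with d = 0 for the horizontal edge leaving vertex (x, y)
# and d = 1 for the vertical one (a convention we chose for this sketch).
L = 4

def edge(x, y, d):
    return (x % L, y % L, d)

def star(x, y):
    # X-type operator: the four edges touching vertex (x, y)
    return {edge(x, y, 0), edge(x - 1, y, 0), edge(x, y, 1), edge(x, y - 1, 1)}

def plaquette(x, y):
    # Z-type operator: the four edges around the face at (x, y)
    return {edge(x, y, 0), edge(x, y + 1, 0), edge(x, y, 1), edge(x + 1, y, 1)}

def commute(xs, zs):
    # An X-string and a Z-string commute iff they share an even number of edges
    return len(xs & zs) % 2 == 0

assert all(commute(star(i, j), plaquette(k, l))
           for i in range(L) for j in range(L)
           for k in range(L) for l in range(L))
print("all stars commute with all plaquettes")
```

Every star/plaquette pair overlaps on either zero or two edges, which is the only property the code space relies on.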
04:31
Now consider what happens when we start acting with errors, for instance a bit flip on a qubit. This X operation anticommutes with the neighbouring plaquettes, so it creates two syndrome defects. In the physical picture, these are quasiparticles: they are excitations of my Hamiltonian, because they flip the eigenvalues of the checks from +1 to -1. So errors can create particles, and the syndrome can be viewed as a configuration of particles, or defects. Errors can also make the particles diffuse, that is, move them around. To see this, consider this very simple case and add another error: the defect has moved from left to right, and the number of defects has not changed. So from this very simple example we can convince ourselves that errors can create and move the defects around. More generally, considering error chains, you can convince yourself that the only places where you find defects are the endpoints of the chain. The syndrome configuration depends only on the endpoints; it does not depend on the geometry of the path, on its length or its shape. In some sense these error chains can stretch freely,
because you only pay a constant energy cost, the energy necessary to create the two particles, and this cost does not change as you increase the length of the chain. This is why you need active error correction in these systems: you have, in some sense, to confine the defects. So particles can be created and can diffuse; they can also be annihilated back to the vacuum. For instance, take a chain of errors with defects at its endpoints: if you add one more error, it fuses the two defects back together and we are back in the code space; the only thing left behind is the world-line, the history of the particles that were created, moved around and then annihilated. If we leave the diffusion unchecked, particles can drift over very long distances, because it is free, and if they then fuse back to the vacuum after having wrapped around a non-trivial direction, as in this example, they will have implemented a logical operation. This would corrupt our memory if we were not aware of it, so we have to keep the defects localized and prevent this kind of process; otherwise the toric code is useless as a memory. You can do something completely analogous with Z strings, except that they create defects on the vertices, on the star operators instead of the plaquette operators; it is exactly the same thing but on the dual lattice. So now, given what I have said about the toric code, I would like to discuss the decoding problem in this setting.
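The statement that the syndrome only sees the endpoints of an error chain is easy to check numerically. In this sketch (our own conventions again, not the speaker's code) X errors live on edges, each edge belongs to exactly two plaquettes, and two different error paths between the same pair of faces produce the same pair of defects:

```python
# Syndrome of an X-error chain on an L x L torus: the violated Z-plaquettes
# are exactly those sharing an odd number of edges with the chain.
# Edge convention (ours): (x, y, 0) is horizontal, (x, y, 1) vertical.
L = 5

def plaquette(x, y):
    x, y = x % L, y % L
    return {(x, y, 0), (x, (y + 1) % L, 0), (x, y, 1), ((x + 1) % L, y, 1)}

def syndrome(errors):
    return {(x, y) for x in range(L) for y in range(L)
            if len(plaquette(x, y) & errors) % 2 == 1}

# Two different paths moving a defect from face (1, 1) to face (3, 1):
straight = {(2, 1, 1), (3, 1, 1)}                        # two steps to the right
detour   = {(1, 2, 0), (2, 2, 1), (3, 2, 1), (3, 2, 0)}  # up, right, right, down

print(syndrome(straight) == syndrome(detour))  # -> True: same endpoint defects
```

Both chains flag only the faces (1, 1) and (3, 1); everything in between cancels pairwise, which is the "constant energy cost" the talk refers to.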
07:42
I am going to consider noise models we have heard about earlier this week. They are very simple, which also makes our analysis easier: the depolarizing and bit-flip error models, which are quite standard and probably the simplest. Here is for instance the
07:58
depolarizing channel acting on the toric code. We take some noise rate, say 15 per cent, which means there is a 5 per cent chance of having either X, Y or Z on each qubit. The errors act, some defects are created, and we measure the positions of the particles by measuring all the plaquettes. This is the information we have, the syndrome, and the decoding problem is trying to undo the error given only that. There are many ways we could fuse all the particles back to the vacuum; we want to go back to the code space, which is what decoding is about, so we have to fuse all these defects together until they all disappear. This would be one possible correction, and this is another possible one. Are they the same or different? One way to see it is to look at the sum of the two: if you sum this configuration with this one, what you are left with is a non-contractible loop, which means it implements a non-trivial logical operation. So these two corrections are actually inequivalent: there is a difference between doing one or the other. You have to apply a correction which has the same logical effect as your error; if you don't choose it properly, you will implement a logical error and you will lose the information. So you have to infer some world-line, some correction, for the defects, and you have to be careful to do this right.
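The equivalence test just described, summing two corrections and asking whether the resulting closed loop wraps around the torus, amounts to computing mod-2 winding numbers. Here is a hedged sketch with our own conventions (edges written as (x, y, direction) on an L = 5 torus):

```python
# Homology class of a closed chain of X errors on an L x L torus: the
# parity of its crossings of one fixed vertical cut and one fixed
# horizontal cut.  (0, 0) means contractible, anything else is a logical.
L = 5

def logical_class(loop):
    h = sum((0, y, 1) in loop for y in range(L)) % 2   # crossings of cut x = 0
    v = sum((x, 0, 0) in loop for x in range(L)) % 2   # crossings of cut y = 0
    return (h, v)

error        = {(2, 1, 1), (3, 1, 1)}                       # defects at (1,1), (3,1)
correction_a = {(1, 2, 0), (2, 2, 1), (3, 2, 1), (3, 2, 0)} # detour, same endpoints
correction_b = {(4, 1, 1), (0, 1, 1), (1, 1, 1)}            # goes the other way around

print(logical_class(error ^ correction_a))  # -> (0, 0): trivial, good correction
print(logical_class(error ^ correction_b))  # -> (1, 0): wraps the torus, logical error
```

Both corrections remove the syndrome; only the first one leaves the encoded information intact.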
09:24
about an existing method, which is matching. The idea is to find the shortest paths connecting all the defects: remember we assume independent noise, so the shortest correction is the one with the highest probability, which seems a pretty reasonable thing to do. Again, you can map this problem, as we heard earlier this week, to a problem from statistical mechanics, the random-bond Ising model, and in that language decoding is equivalent to minimizing the energy of the system given the excitations. This is done with the perfect matching algorithm, whose complexity is polynomial in the size of the system. All of this has been studied at length; the complexity is polynomial, but it is still quite slow in practice. Fortunately, very recently, and we heard about it two days ago if I remember correctly, this method was improved very much by Austin Fowler and collaborators, who managed to parallelize and simplify it so that you can do average constant-time decoding with only marginal losses of performance, which is pretty good news for this method. But we can ask what we put aside when we decode this way. First, remember I told you that many different strings, many different corrections, are actually equivalent; by choosing only the shortest one we forget that fact, and maybe we should take it into account. Moreover, if there are correlations, for instance between X and Z, which is the case for the depolarizing channel, which introduces this correlation through the Y operator, then these methods, which decode the X and Z defects individually, do not take the correlations into account. These are the things that motivated us to search for another way to think about decoding. OK, so let me expand on these two points, degeneracy and correlations. Imagine you have this configuration of defects. You could say, given what we have just discussed, that there is only one minimal-weight correction. That is true within its class; but if you take something logically different, for instance this one, you can see that there are many corrections of the same weight which are all logically equivalent: if I just deform this segment to this one, I didn't change anything, right? So I should count these different strings which are in the same class, not make the inference on a single chain. This is just saying that there is a lot of degeneracy in the possible corrections. And again, going back to the mapping onto a physical model, optimal decoding corresponds to minimizing not the energy but the free energy; the difference is a term which takes the degeneracy into account, which we call entropy. That is the first point. The second point was the correlations. For instance, if we have site and plaquette defects arranged in this manner, this would be the minimum-weight matching inference: these strings have weight three and weight two, so weight five in total. But that is not true for the depolarizing channel, because a Y error acts as a Z and an X at the same time, so where the strings cross you don't have to pay twice for having an error on that qubit. So taking the correlations into account changes the effective weights, and in general this is not captured by the independent model.
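As a stand-in for the matching decoder discussed above, here is a brute-force minimum-weight perfect matching on a small set of defects (our sketch; real implementations use Edmonds' blossom algorithm, this exhaustive version only shows what quantity is being minimized):

```python
# Exact minimum-weight perfect matching of defects on an L x L torus by
# exhausting all pairings.  Fine for a handful of defects; the defect
# count must be even, as it always is for toric-code syndromes.
L = 8

def torus_dist(a, b):
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return min(dx, L - dx) + min(dy, L - dy)   # distances wrap around

def min_weight_matching(defects):
    defects = list(defects)
    if not defects:
        return []
    first, rest = defects[0], defects[1:]
    best = None
    for i in range(len(rest)):
        # pair `first` with rest[i], match the remainder optimally
        cand = [(first, rest[i])] + min_weight_matching(rest[:i] + rest[i + 1:])
        cost = sum(torus_dist(p, q) for p, q in cand)
        if best is None or cost < best[0]:
            best = (cost, cand)
    return best[1]

m = min_weight_matching([(0, 0), (1, 0), (5, 5), (6, 6)])
print(m)  # pairs the two nearby defects together: total weight 3
```

Note this picks a single shortest pairing; the degeneracy and X/Z correlations the talk emphasizes are exactly what this objective ignores.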
13:05
Given what I have said, and given what we knew from our physics background, we thought of approximating this minimization of the free energy, which would be optimal decoding, using a renormalization-group algorithm. The point is that exact minimization of the free energy is intractable: the number of error configurations you have to consider is exponential in the size of the system. So our approach is to approximate this optimal decoding, and this leads to our decoder. When we talk about RG, we would like to see, in some sense, scale invariance. Say we start with the usual toric code stabilizers; I would like to make a change of basis such that we get something scale-invariant. The first thing we do is the following: consider groups of four plaquette operators, remove one of the four from your set of generators, and replace it by its product with the other three. This gives you a coarse-grained plaquette. So we
14:10
started with this generator, we replace it by its product with the other three, and this is what we get. Then on the next scale you can do exactly the same thing: each of these coarse plaquettes, we take it out and replace it by the one twice as big, and you can do this recursively until you reach the trivial one-by-one lattice with a single cell. Looking at this new set of stabilizer generators, we actually didn't do anything, in the sense that we just changed the generators, not the stabilizer group; but now we have this scale invariance. What is interesting is that this structure is very similar to what we see in concatenated codes: you really have this idea of grouping qubits into cells so that you create a hierarchy. So we are going to use what we know about decoding concatenated codes: we run some decoding on each block of the first layer, say all the red blocks, and the information that comes out of each of these decoders is used as input for the decoder on the next scale, the blue cells, and so on, until we reach our logical information.
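The change of generators just described is plain linear algebra over GF(2): writing each plaquette as a bit vector over the edges, replacing one plaquette of a 2 x 2 block by the product of all four leaves the row space, and hence the stabilizer group, unchanged. A small self-contained check (the bit encoding is our own):

```python
# Plaquettes of a 2 x 2 block on an L = 4 torus, encoded as bitmasks over
# the 2 * L * L edges; products of operators become XORs of masks.
L = 4

def edge_index(x, y, d):
    return ((x % L) * L + (y % L)) * 2 + d

def plaquette_mask(x, y):
    m = 0
    for ex, ey, ed in [(x, y, 0), (x, y + 1, 0), (x, y, 1), (x + 1, y, 1)]:
        m ^= 1 << edge_index(ex, ey, ed)
    return m

def reduce_vec(v, basis):
    for b in basis:
        v = min(v, v ^ b)   # clear b's pivot (leading) bit from v if set
    return v

def make_basis(vectors):
    basis = []
    for v in vectors:
        v = reduce_vec(v, basis)
        if v:
            basis.append(v)
            basis.sort(reverse=True)   # keep pivots in decreasing order
    return basis

def same_span(u, w):
    bu = make_basis(u)
    return len(bu) == len(make_basis(w)) and all(reduce_vec(v, bu) == 0 for v in w)

P = [plaquette_mask(0, 0), plaquette_mask(1, 0),
     plaquette_mask(0, 1), plaquette_mask(1, 1)]
coarse = [P[0] ^ P[1] ^ P[2] ^ P[3], P[1], P[2], P[3]]  # super-plaquette replaces P[0]
print(same_span(P, coarse))  # -> True: same stabilizer group, new generators
```

The first element of `coarse` is the eight-edge boundary of the 2 x 2 block, the coarse-grained plaquette of the next RG scale.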
15:19
To make this a bit more precise, let me tell you how we decode concatenated codes. Here is a concatenated version of a very simple three-qubit code. The way we decode this, if we want to do it optimally and efficiently, is layer by layer: we have a decoder, which could just be a brute-force decoder, for one cell, and we run the decoding algorithm
15:43
by brute force on one small block. This gives us the marginal probability of the logical qubit of that block.
15:52
Once this is done, it is used as the input of the block on the layer just above: the logical output of one layer becomes the physical input of the layer above, and this is how you can decode concatenated codes efficiently and optimally. This is exactly the structure we built with the cells, so just think of the toric code as
16:12
a concatenation of this very small open-boundary surface code: a tile with two smooth boundaries and two rough boundaries. If you do a brute-force decoding on it, you end up with the marginal probability on the two logical qubits of this very small surface code; the logical operators are just strings going through the cell vertically or horizontally. So given some syndrome pattern inside the cell, you run a brute-force decoder and it spits out 16 probabilities corresponding to these two logical qubits: is there a string going through horizontally or not, vertically or not, and is it X or Z; combining everything gives 16 possibilities. As I said, this is done by brute force: you just sum over all possible errors consistent with the syndrome inside your cell.
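The layer-by-layer message passing is the same as in classical concatenated decoding. As a toy with the identical structure (a two-level classical 3-bit repetition code, not the speaker's surface-code cells), each level-1 block turns its syndrome into a marginal flip probability, and those marginals become the input error rates of the level above:

```python
from itertools import product

def block_posterior(ps, syndrome):
    """Posterior probability that a 3-bit repetition block carries a logical
    flip, given per-bit flip probabilities ps and syndrome (b0^b1, b1^b2)."""
    num = den = 0.0
    for e in product((0, 1), repeat=3):
        if (e[0] ^ e[1], e[1] ^ e[2]) != syndrome:
            continue                      # error pattern inconsistent with syndrome
        pr = 1.0
        for bit, p in zip(e, ps):
            pr *= p if bit else 1 - p
        den += pr
        if sum(e) >= 2:                   # majority flipped -> logical bit flipped
            num += pr
    return num / den

p = 0.1
synd1 = [(1, 0), (0, 0), (0, 1)]                  # observed level-1 syndromes
q = [block_posterior([p] * 3, s) for s in synd1]  # soft outputs of the first layer
top = block_posterior(q, (0, 0))                  # they feed the block above
print(top)  # far below p: the hierarchy suppresses the logical error rate
```

The quantum cells do the analogous sum over errors per logical class, producing 16 marginals instead of one.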
17:12
there's a catch: the toric code is not a concatenated code, and this is where the approximation comes in. If you consider a small block like the one I just described, you can see that some of its stabilizers are incomplete: comparing them with the stabilizers of the toric code, there are some qubits missing. The syndrome measurements are done on the toric code, so you don't have the values of these partial stabilizers, if you will. As I
17:49
said, you cannot break the toric code into such cells exactly, so one solution is to use overlapping cells to close the gaps. Everywhere a qubit is missing, we just add it; this is what is represented by these dashed lines. Some qubits are then shared between several cells, and now we can run the decoding, because we have all the syndrome information we need: the stabilizers are closed. But again,
18:18
this implies making an approximation. Remember, visualize one toric code and only two such blocks, and focus on the qubits labelled by a green circle here. I have some syndrome configuration and each cell runs its decoder, but this qubit belongs to both cells, so every time one cell supposes there is an X error on it, the other one should do the same, right? If you look only at the left cell, this correction seems completely reasonable: a string going out through this qubit, consistent with this syndrome. If you concentrate on the right cell, this one also seems completely reasonable, since it corrects its own syndrome. But there is an inconsistency: I have two different error assignments for the same qubit, which doesn't make sense. The simplest thing to do would be to just forget about these correlations, but then the algorithm doesn't work: you can find constant-weight errors which make the decoder fail independently of the size of the lattice. So we want some kind of compromise between forgetting about these correlations completely and taking them fully into account, which would couple all the decoders together and make the computation intractable. One thing to note is that if we compute, given the syndrome inside one cell, the marginal error probability of a shared qubit, in general it will not match the marginal error probability of the same qubit computed in the other cell, because the syndromes are different: they involve different qubits. So the compromise is to adjust the noise models of the two cells so that these marginals agree: every time one cell says that 40 per cent of the time there is an error on this qubit, we require that the other one also says that 40 per cent of the time there should be an error. The way we do this is by implementing belief propagation: each block computes the marginal beliefs on its shared qubits, exchanges this information with its neighbours, and uses it to update its noise model, so that at some point it should converge to agreeing marginals. Actually we don't run it all the way to that point, because belief propagation has problems here, for instance the 4-cycles in this geometry, but it works even if it is not perfect: simply running this belief propagation for a few rounds, even with imperfect agreement, makes the decoder work. So at the physical layer we have l squared such cells, but you can see that this parallelizes very easily and naturally, because each block is decoded independently; you can give one cell to each processor. The only thing you are left with is the number of levels of the hierarchy, which you cannot parallelize, and this is where the log l time complexity comes from. Now let me show you the results we obtained for the toric code. What do we have here? We ran a couple of rounds of belief propagation, so we don't have perfect agreement in our consistency relations, but it seems to be enough. On the x axis you have the depolarizing noise strength, and on the y
axis, on a logarithmic scale, you have the decoder error probability, with different curves for different lattice sizes, going from l = 8 to l = 64. We didn't do a sophisticated analysis, but I think it is relatively convincing that there exists a threshold somewhere around 15 per cent. If you look back at earlier results, you will find that using the perfect matching algorithm one obtains 15.5 per cent when decoding X and Z separately. So we have lost a bit of performance, but the decoding time improved quite dramatically. Let me say that this is the result we obtained about a year and a half ago, so we didn't know about Austin Fowler's results, which were quite an improvement over perfect matching. I would also like to emphasize that this scheme is quite flexible. I showed you the 2-by-2 cells, but you could use, for instance, 2-by-1 cells, which are smaller, and these constants go into your complexity. What a 2-by-1 cell does is shrink the lattice in one direction only, say the horizontal x direction; so you do an RG step in one direction, and then you use the output of that step with the transposed unit cell to shrink the lattice in the other, vertical direction. By alternating these two steps, with two cells that are each the transpose of the other, you get an effective renormalization in both directions. This enabled us to decode codes of linear size up to a thousand, which is millions of qubits. There is a mistake on the slide, by the way: this is for the bit-flip channel. On the bit-flip channel the perfect matching algorithm gives roughly 10.3 per cent, and we obtain a somewhat lower value, so again we lose a bit of performance, but we gained a lot in running time by dropping the constants,
and we were able to decode codes with millions of qubits. So there is this trade-off between the speed of the algorithm and the precision, the threshold you get. Where it gets very interesting is when we add some preprocessing. This is a bit tricky, but remember that because of this RG change of basis on the generators, there are some syndrome bits which we do not use until the very last steps, and this seems to be problematic in some cases. One compromise is to run a very basic qubit-wise belief propagation between all the checks and the error qubits, to adapt the noise model so that it takes into account all of the syndrome information without committing to it: it only updates the noise model. By doing this, we take into account early on information that would otherwise have been used only at the very end of the computation, and this improves the performance quite a bit: we were able to beat the 15.5 per cent threshold, so with this technique we can go above what was done with perfect matching. OK, so I have covered these two points; now let me tell you about an application of the local equivalence results obtained last year. We have this RG decoder for the toric code; can we use it to decode other codes, for example color codes? Take the
25:53
topological color code; I think we heard about this yesterday. We have one qubit per vertex, and, very similarly to the toric code, we have loop operators, an X-type and a Z-type for each face of the lattice. The question was: can we decode this? In this case perfect matching does not work in a straightforward fashion; it is not known how to run the equivalent of the matching efficiently on this lattice. Recently, Sarvepalli and Raussendorf came up with an adaptation of the matching scheme to the 6-6-6 color code, the hexagonal lattice. It is not straightforward, it is not just running the algorithm on this lattice: for instance you have many shared qubits, so you have to do some kind of coarse-graining and to optimize the runtime. But here I actually want to propose something a bit different: instead of inventing a new decoder, bring the decoding problem back to the decoding problem of the toric code. Remember the result that every two-dimensional, translationally invariant topological stabilizer code with local generators and macroscopic minimal distance is locally equivalent to a number of copies of Kitaev's toric code: there exists a local Clifford map that brings your code to a number of copies of the toric code. In particular, one can show that the topological color code is equivalent to two copies of the toric code; the one I showed you is the 4-8-8 lattice. So what we propose is to apply the mapping to the syndrome and to the noise model, so that you can actually run the toric code decoder and decode the color code with it. The two things we need are a map of the noise model from the color code to the toric codes, and a map of the syndromes.
this is just part
27:56
of the mapping; there are two columns missing. What is pictured: blue means a single-qubit X, red means a Z. The first column, labelled CC, shows operators on the topological color code lattice. This entry, for instance, says that an X on the qubit living on top of a diamond maps, on the two copies of the toric code with qubits on the edges, to an X on this qubit of the first toric code and to these two operators on the second one; and you can go on and check the table rule by rule. What is interesting, first, is that this mapping is local, which is very important if we want to map a local noise model: a single-qubit error maps to a correlated error on at most three qubits, and everything stays local. This is encouraging; in fact it is required. And you can check by inspection that the mapping preserves the commutation relations. Using this we can actually do the mapping. The mapping of the noise is quite straightforward: the simplest thing you can do is just forget about the correlations, and you get an effective noise strength for each of the qubits. So much for the mapping of the noise model. The
29:23
other thing, which is a bit trickier but quite simple once you have the table, is the mapping of the syndromes. For instance, on the color code this Z defect sits at the end of a string of X's, and using the mapping you can see that it gets moved by this very simple string of operators. So this operator moves these defects around, and from the mapping we can read off that this color code defect is equivalent to an electric, or charge, defect on the first toric code; that is just the conclusion drawn from the mapping here. You can do the same for the operators moving the other defects, the hopping operators that move the particles around on your color code, and this directly implies that this defect on the blue faces is a magnetic defect, because this is the hopping operator for the magnetic defect on the toric code. So we have a way of mapping the syndrome and a way of mapping the noise, and the only thing left is to run our algorithm on each of these toric codes. If you do this very crudely, you get a threshold of about 5 per cent. If, moreover, you run some preprocessing, the belief propagation preprocessing just as I described for the toric code, you can increase the performance quite dramatically: again with noise strength versus decoding error probability for different sizes, the threshold goes up significantly. It is known that this code has the same optimal threshold as the toric code, so obviously we have a suboptimal decoder, because we lost some information about the correlations between the two toric codes and so on; but it is not that bad, and we were quite surprised that the performance losses are not more dramatic. So with this technique, given the mapping, which is ensured by the equivalence construction for any topological stabilizer code, you can decode any such code by mapping it to a number of copies of the toric code and then running the toric code decoder. And it doesn't have to be the RG decoder: if you prefer the fast matching decoder proposed by Austin Fowler, you could run that algorithm instead. You would get something suboptimal, but it is systematic. In the
31:48
remaining time, I would like to point out that we were also able to decode the topological subsystem color code, which was introduced by Bombín. What is interesting is that this code is not equivalent to a number of copies of the toric code: you do not need the complete local equivalence to do the decoding. What is needed is just that every particle is its own antiparticle and that the particles are moved around by string operators. Looking at the details of this code: first of all it is interesting because it is a gauge code which only involves two-body operators, and moreover one can show that it carries three fermions, so it is not equivalent to toric codes; but the fermions are the only particles in this setting, and they are moved by string operators. So you can come up with a linear map for the noise and a linear map for the syndromes, and actually run the decoder. Suchara, Bravyi and Terhal came up with a decoder for a similar code, the five-squares code, under a depolarizing channel, and their decoder was tailored to that particular code; using our very general technique on this code we obtain a
32:54
threshold which is also around 2 per cent. So this is quite encouraging: we did not lose very much by doing the mapping in this case. I will just add one thing: if you go to the phenomenological model, where you take the toric code, you want fault-tolerant storage with the toric code, and the syndrome measurements themselves can be faulty, you can adapt the RG method to work in three dimensions, because in this case it just becomes a quantum error correction problem in 3D. This was shown in the paper by Dennis et al., "Topological quantum memory", and you can adapt the RG decoder to work in three dimensions. We implemented it with a
33:44
simple two-by-one-by-one unit cell that you run in the x, z and y directions one after the other, to implement an effective renormalization of the whole space. If you do this fault-tolerant storage in this very basic phenomenological model, all we got was something just above 1.8 per cent, and this has to be compared with the number reported in the literature using perfect matching, which was 2.9 per cent. So again we have some loss, but we were able to decode systems of linear size L = 64 in three dimensions, so we again have this logarithmic complexity. OK, I will skip ahead to the conclusion.
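A minimal sketch of this directional coarse-graining idea, with made-up details: each pass XOR-merges adjacent defect parities along one axis (a stand-in for a two-by-one-by-one cell), and cycling through the x, z and y axes halves every dimension, giving an effective 2×2×2 renormalization step; O(log L) such rounds reduce the whole lattice.

```python
import numpy as np

def coarsen(defects, axis):
    """One directional pass: merge adjacent cells along `axis` by combining
    their defect parities (XOR). A toy stand-in for the 2x1x1 RG cell."""
    d = np.moveaxis(defects, axis, 0)
    d = d[0::2] ^ d[1::2]
    return np.moveaxis(d, 0, axis)

rng = np.random.default_rng(0)
L = 8
defects = rng.integers(0, 2, size=(L, L, L))
total_parity = int(defects.sum()) % 2

# One effective 2x2x2 renormalization step = three directional passes (x, z, y).
for axis in (0, 2, 1):
    defects = coarsen(defects, axis)

assert defects.shape == (L // 2, L // 2, L // 2)
# XOR-merging conserves total defect parity, the quantity any decoder must respect.
assert int(defects.sum()) % 2 == total_parity
```

This only mimics the bookkeeping of the cell sweep; the actual decoder also propagates marginal probabilities between levels, which is where the real work happens.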
34:27
conclude we I presented the coding problem so it's inferring were lines are presented the RGD quotas which is very fast loyal times that's and this is worse disability this is also an interesting point is versatile artificial and some very specific conditions the parsing noise it is you're a stick with but that there exists a threshold but it's not the but at least there exists a potential and we will show that we can extend beyond and also give this this application of Articles mapping to quantum errorcorrection and that would like to thank you for your attention the experimental questions I just want comments about their it the threshold a rigorous proof that so clearly there's a universal threshold for all ddimensional twoplusfour codes which is 32 the negative to punch dimensions there in 2 dimensions this 32 the for current 32 the metaphor so is it is it has been improved was missing so the new opera is it's ideas for caught so it's a bit about how you decoder and syndromes erroneous on 3D model all the art world or the syndrome measurement errors you're you quickly went all the 3D yeah errorcorrection moral he said at the adapted treaty more your question is about every or or we can about that in this case the article this yet the RGD dichloro and the syndrome measurements of matters Kay ever use this as you are you aware of the paper topological quantum memory by then this at all yes so the idea is that in this case when you have got only consider for instance that for handle Europe your chick operators in this to post 1 new suppose 1 structures actually become stockbreeders with set so that the legs in the plane the physical errors and delays in the other direction lots of the measurement it was this this is said we still have just a lattice of all these stars right so we can use the exact same tree of skill in bond so you do this change of basis in your stars such that or you can instead of thinking of all stars you could you could think of Q 4 you can use 
the same trick of St. take a bunch of tubes and replace 1 of them by the product of this this time the other and you can rebuild a scaleinvariant structure for the 3 cases then you can do this very simple bruteforce a calling for a small cell again it's the same thing you have some logical due to some over all strings which of the rights and and you just do this layer by layer and its it goes for exactly the same fashion conceptual it's the same thing the first one 1 last question do you plan to consider the case when you might have to cubic if you like more on the sacred circuit mall shop I'm not sure yet because I couldn't talk about this but we have another idea which involves a simulating confinement the calling and I don't know I have a picture which may
37:58
be will give you all an idea what I mean so we
38:01
wanna have so this is the new idea the idea we have is to have a unit cell control units on each for each check it's only allowed to speak with its neighbors and its neighboring cubits and the idea would be to simulate in some sense confining potentials so this inspired from physics so for instance if each part each defect was assumed that gravity you would these units cells would be useful implement the like actual equation evolution of some military might feel something that and then we moved to the FEC according to this to this new this according to the the force created by this potential so this idea of simulating confinement is where 1 after this we could have this this but we want to this intermediate step step before let's thank him again the the we have the next week
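The confinement heuristic sketched in that last answer can be caricatured in a few lines; every detail here (the diffusion potential, the greedy moves, the lattice size) is invented for illustration. Each defect sources an attractive potential built by diffusing the defect field; defects then hop one step toward the largest neighboring potential, and two defects meeting on a site annihilate.

```python
import numpy as np

def potential(defects, rounds=20, alpha=0.2):
    """Diffuse the defect density on a periodic lattice to build a smooth
    attractive potential (a crude stand-in for solving a field equation)."""
    phi = defects.astype(float)
    for _ in range(rounds):
        phi = (1 - 4 * alpha) * phi + alpha * (
            np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
            + np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    return phi

def step(defects):
    """Move each defect one step toward the largest neighboring potential;
    a defect landing on an occupied site annihilates with it (parity XOR)."""
    d = defects.copy()
    phi = potential(d)
    for x, y in zip(*np.nonzero(defects)):
        if not d[x, y]:
            continue                    # already annihilated this round
        moves = [((x + dx) % d.shape[0], (y + dy) % d.shape[1])
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        nx, ny = max(moves, key=lambda p: phi[p])
        d[x, y] = 0
        d[nx, ny] ^= 1                  # pairwise annihilation
    return d

lattice = np.zeros((8, 8), dtype=int)
lattice[2, 2] = lattice[2, 5] = 1       # one defect pair
for _ in range(10):                     # a couple of rounds suffice here
    if not lattice.any():
        break
    lattice = step(lattice)
```

Each update is purely local (a cell only looks at its neighborhood), which is the point of the proposal: the decoding would emerge from local dynamics rather than from a global matching.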
Metadata
Formal metadata
Title: Decoding algorithms for topological codes
Series title: Second International Conference on Quantum Error Correction (QEC11)
Author: Duclos-Cianci, Guillaume
License: CC Attribution - NonCommercial - NoDerivatives 3.0 Germany: You may use, copy, distribute and make the work or its content publicly available in unchanged form for any legal, non-commercial purpose, provided you name the author/rights holder in the manner specified by them.
DOI: 10.5446/35296
Publisher: University of Southern California (USC)
Publication year: 2011
Language: English
Content metadata
Subject areas: Computer Science, Mathematics, Physics
Abstract: I will talk about the problem of decoding a topological code, which consists of identifying the optimal recovery operation given the syndrome of an error, or equivalently of inferring the most likely worldline homology given a defect configuration. I will describe a new decoding algorithm [Phys. Rev. Lett. 104, 050504; arXiv:0911.0581 and arXiv:1006.1362] for Kitaev's toric code (KTC) that runs in a time proportional to the log of the number of particles, an improvement over the previously known polynomial-time decoding algorithm. This algorithm also achieves a higher threshold on the depolarizing channel. Moreover, we have recently shown that all two-dimensional topological stabilizer codes can be mapped onto each other by local transformations [arXiv:1103.4606, arXiv:1107.2707]. This local mapping enables us to use any decoding algorithm suitable for one of these codes to decode other codes in the same topological phase. We illustrate this idea with the topological color code, which is found to be locally equivalent to two copies of KTC, and we extend it to decode the topological subsystem color code.