
Fault-tolerant quantum computing with color codes

Speech transcript
At this conference, Andrew Landahl will speak about fault-tolerant quantum computing with color codes. OK, so for this talk I'd like to take us on a magical journey. Let's imagine a giant twister has come, sucked us up, and taken us to the wonderful world of Oz. As everybody knows, the wonderful land of Oz is defined by color, so we're now in the world of Oz, and what I'll be talking to you about is how color can make fault-tolerant quantum computing wonderful and delightful. This is work based on a paper we have on the arXiv, co-written with my grad students Jonas Anderson and Patrick Rice. As a shameless plug, I'll mention that Jonas is graduating this year, so if you're interested in the topics I'm talking about and you're looking for a postdoc with training and experience in this area, do talk to him; Jonas also has a poster on homological quantum computing upstairs. OK, so color codes are defined by a
trivalent lattice embedded in a surface such that the faces of the embedding are 3-colorable. If you look for lattices that are semi-regular — by semi-regular I mean vertex-transitive, so that every vertex locally looks the same as every other vertex — there are only three with this property. The lattices are named by vertex notation: 4.8.8, for example, means that at every vertex you see a square and two octagons around it. So you've only got three choices; mathematicians, wonderfully, have classified all these things, so we know for sure there are just three. Of these three choices, which is the most interesting? Well, they're all kind of interesting, but I think the 4.8.8 lattice might be the most interesting, because it takes the fewest qubits to achieve a given code distance. It also has the nice feature that the S gate, the phase gate, is transversal for this code — not true for the other two color codes. The code is defined relative to the lattice as follows: we imagine there's a qubit at each vertex of the lattice, and the stabilizer generators are defined by the product of sigma-x over all the qubits on a face and, simultaneously, the product of sigma-z over all the qubits on a face. This funny criterion of being trivalent and face-3-colorable ensures that the stabilizer generators commute with one another. As an aside, I want to mention that these codes are not the topological subsystem color codes that Robert Raussendorf spoke about in his tutorial; those also arise from color codes, and they have the additional feature that all of the stabilizer generators can be measured using weight-2 operators — another nice feature — and it turns out they also enable any color code to have the S gate transversal, as Robert discussed. The focus of this talk, however, is on these 4.8.8 "traditional" color codes.
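The commutation condition just described can be checked mechanically. Here is a minimal sketch, assuming a hypothetical toy set of faces (not an actual 4.8.8 patch): for a CSS-style color code, the X-type generator on face f and the Z-type generator on face g commute exactly when they overlap on an even number of qubits, and the X and Z generators on the same face commute when the face has even weight.

```python
from itertools import combinations

# Hypothetical toy faces (NOT a real 4.8.8 patch): each face is the set of
# qubits on its boundary.  In a color code, trivalence plus face
# 3-colorability guarantees every pair of faces shares an even number of
# qubits, and every face has even weight (4 or 8 on the 4.8.8 lattice).
faces = [{0, 1, 2, 3}, {2, 3, 4, 5}, {0, 1, 4, 5}]

def generators_commute(faces):
    """X(f) and Z(g) commute iff their supports overlap on an even set."""
    same_face_ok = all(len(f) % 2 == 0 for f in faces)
    cross_ok = all(len(f & g) % 2 == 0 for f, g in combinations(faces, 2))
    return same_face_ok and cross_ok

print(generators_commute(faces))  # True for this toy example
```

The same check works for any CSS stabilizer assignment, which is why the trivalent, face-3-colorable condition is exactly what is needed.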
Now, to be a bit more practical: although I could imagine arbitrary surfaces, higher-genus models, and whatnot, let's just imagine the plane — a bounded plane. If you look at planar color codes, it turns out that if they have a boundary, the number of boundaries has to be a multiple of three, so the simplest such object is a triangle. I'm therefore going to focus on triangular 4.8.8 color codes. The smallest triangular 4.8.8 color code turns out to be our familiar friend the Steane code, and these codes are a way of growing the Steane code in an organic, lattice-like way, as opposed to by concatenation. These codes are of course well suited to the 2-D technologies we've heard about, where long-distance transport is infeasible or impractical for one reason or another. We've heard multiple times at this conference that the 2-D surface codes are amazing — they're the Wizard of Oz in our wonderful, delightful land — and we'd like to see whether the color codes share some of the same delights. In particular, we're excited by the facts that they're local in two dimensions, they have a very high accuracy threshold, and they don't need the verified ancilla distillation that is a huge overhead cost in traditional concatenated coding. So how do the color codes compare? As I said in my
tutorial, whenever we want to answer a question like this, we have to be very clear about our control and noise models. The control model I'll be considering here is one where the gate basis is basically the standard basis, but with a magic state for the T gate, and I'll make the standard assumptions: we have massively parallel operation, we have refreshable ancillas, we can classically compute as fast as we like, and of course the 2-D layout with its local-processing restriction — which is really the relevant restriction for the color codes. Before describing the noise models, it's helpful to introduce a couple of noise channels used to define them. The first is the BP channel, which is the bit-flip channel followed by the phase-flip channel; each of these channels causes the Bloch sphere of the qubit to contract — there's a little picture of it on the slide. I also have the DP channel, which is the standard two-qubit depolarizing channel: with probability p it applies one of the 15 nontrivial two-qubit Pauli products, each with probability p/15. These channels are used to define the noise models. I'll assume that qubits don't leak — John talked about how the surface codes can handle very high leakage rates, and color codes might be able to as well; I think that's an interesting open question — and also that classical computation is reliable. There are three levels of noise model that I'm going to consider. The first and most fine-grained is the circuit noise model: we model the syndrome-extraction process in detail using this full gate basis, and we model each faulty one-qubit gate as an ideal gate followed by the BP channel, each faulty CNOT gate as an ideal gate followed by the DP channel, and each measurement as being preceded by the BP channel and having its result flipped with probability p. I should mention that that second aspect — the result flipped with probability p — is important and often overlooked, because if all our
model had was that the bit gets corrupted in some way and then we measure it ideally, that wouldn't be a measurement error at all: the measurement would be consistent with what happened to the data. A measurement error is when the measurement is inconsistent with what happened to the data, so we add that extra flip as well. Then we can have a more coarse-grained model, the so-called phenomenological model, where we don't worry about the details of the syndrome-extraction circuit at all; we just lump the probability of error into one number p and say that the whole measurement process fails with some probability p. This has the advantage of informing us about the threshold in the advent of new technologies that give us new gates: the threshold of the circuit model assumes a very specific set of gates, but if some technologist comes along and says, "I've got a great new gate set in which the whole syndrome extraction is a single gate," then the phenomenological noise model tells you an upper bound on what the threshold will be, no matter what your gate basis is. So this model treats everything exactly as above, but errors appear only on the data at each time step and on the syndrome-measurement outcomes; the only other place they appear is when we do encoded computation. Finally, the third model, which is really just a property of the code itself, is the code-capacity noise model: just as a channel has a capacity for sending quantum information, it has a capacity when protected by an error-correcting code — a sort of single-shot, one-round capacity. It's the same as the phenomenological model with the syndrome measurements perfect. OK, so those are the three noise models.
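The two channels above are easy to sample. A minimal sketch in Python follows; note the p/15 weighting for the two-qubit depolarizing channel is my reading of the garbled audio, so treat it as an assumption:

```python
import random

def sample_bp(p, rng=random):
    """BP channel: independent bit flip (X) then phase flip (Z), each with
    probability p.  X followed by Z is Y, up to a global phase."""
    x = rng.random() < p
    z = rng.random() < p
    return {(False, False): "I", (True, False): "X",
            (False, True): "Z", (True, True): "Y"}[(x, z)]

def sample_dp(p, rng=random):
    """Two-qubit depolarizing channel: with probability p, apply one of the
    15 non-identity two-qubit Pauli products, chosen uniformly."""
    if rng.random() >= p:
        return ("I", "I")
    pairs = [(a, b) for a in "IXYZ" for b in "IXYZ"][1:]  # drop (I, I)
    return rng.choice(pairs)

print(sample_bp(0.0))  # "I": no flips at p = 0
```

Both the circuit-level and phenomenological simulations in the talk amount to drawing from channels like these at the appropriate circuit locations.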
Now we need to figure out how we're going to decode this and what the threshold is going to be. I talked in my tutorial about optimal and about suboptimal most-likely-error decoding, and here's a table from our paper that lists all the known thresholds for topological codes that we were aware of — one of them, the 4.6.12 code, doesn't have any entries at all. The first three rows are the three possible color codes I mentioned; then we've got the toric code on the square lattice — it too can only be put on three different lattices if you demand these semi-regularity conditions — then the topological subsystem color codes, and another code defined by a hypergraph; they don't have a great name for it, so I'm just going to lump it in broadly. Those are all the topological codes I'm aware of. If we look at the code-capacity row, we see that, optimally decoded, these are all about 11 per cent, and that's actually close to the best you could ever hope for; I would say they're basically the same, if you allow dithering about the decimal points. We know that for the surface codes it drops a little bit, but not significantly — again basically the same — and a 10.3 per cent threshold is obtained if you use a suboptimal decoder. I should mention that the colored numbers are numbers that appeared after we made the table: Pradeep Sarvepalli and Robert Raussendorf got the 7.8 per cent threshold number for the subsystem color codes using a heuristic decoder, and, as you heard in Austin's talk, he has since improved this estimate to about 0.9 per cent this year. So our first result, presented in sort of striptease fashion: we find 10.56 per cent for the color codes if you use most-likely-error decoding — so again, not a major drop. In the phenomenological noise model the threshold number drops: it's about 3.3 per cent for the surface codes, dropping to 2.93 per cent when you do the suboptimal most-likely-error decoding; for the color codes, optimally, it's a bit
higher, up at about 4.5 per cent. So this is saying the color codes might be doing a little bit better at fault tolerance than the surface codes — but maybe that's special to optimal decoding. I wondered whether this would survive MLE decoding, and in fact it does: when we drop to MLE decoding it's 3.05 plus or minus 0.04 per cent, so it's still somewhat higher. That's good. But now we get to the closest model to reality that we have here, where we actually work out the details of all the circuits. In this case we know that for the surface codes the threshold is close to 1 per cent using a most-likely-error decoder; the optimal-decoder entries are missing here — we don't even know what the optimal threshold is, because the mapping at the circuit level is very complicated — and we don't know what it is for any of the color codes either. A heuristic decoder that another group used reported about 0.1 per cent for the color codes, and in order to achieve such a high threshold they also added verified ancilla distillation into the decoding — which is sort of anathema to the whole spirit of using these 2-D codes, but it does give you a very high boost to the threshold. So that report was 0.1 per cent; we were curious to see what we'd get when we don't do that, and the result is that we get 0.082 per cent — so again nearly 0.1 per cent; it's 0.082(3) per cent, to be precise. So unfortunately, while things started out looking promising at the code-capacity and phenomenological levels, at the circuit level the threshold is looking about a factor of ten worse than what we find for the surface codes. To understand this hierarchy of performance,
since we're at the circuit level, we have to be very detailed about the actual circuits we use. I mentioned in my tutorial that there are certain rules you want to obey: you don't want key errors to propagate catastrophically. So we said: well, let's imagine there's one syndrome bit at the center of each face, and we just CNOT into or out of that syndrome bit in sequence. If it's measuring a Z check, we just do the CNOTs all around — I guess in book order here, though it's a little harder to see: 1, 2, 3, 4, 5, 6, 7, 8 — so we have them go in sequence into the bit at the center to measure the Z check, and then to measure the X check we follow up with 1 through 8 with the CNOTs going in the other direction. So it takes a total of 16 CNOTs to measure both the X and the Z checks. We thought: that takes a long time — if you do that serial error correction, you're waiting around a long while, with the X errors building up, before you ever check for them. We know from studies of concatenated coding that by using more compact syndrome-extraction circuits — for example with the Bacon-Shor codes — we get improvements in the threshold. So we thought we should think a little harder about how to compress the schedule, and we came up with an interleaved schedule that takes just 8 total coherent time steps, which we call the X-Z interleaved schedule: the X checks are measured in book order, 1 through 8, and the Z checks are measured in Hebrew book order — you read from right to left — 1 through 8. If you do that, you can show that errors don't propagate badly: the schedule satisfies all the right criteria, and if you examine errors happening at different points in the schedule, they propagate out, but only to a bounded distance. And since we're looking at these triangular codes, there are finite-size effects and boundary effects, so
we have to be a little careful on the boundaries as well when we define the schedule.
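One property any such schedule must satisfy is that no qubit takes part in two gates during the same time step. A sketch of that check, using hypothetical stand-in schedules rather than the actual 4.8.8 circuits:

```python
def schedule_is_parallelizable(schedule):
    """schedule: list of time steps, each a list of (control, target) CNOTs.
    Valid only if no qubit appears in two gates within one time step."""
    for step in schedule:
        touched = [q for gate in step for q in gate]
        if len(touched) != len(set(touched)):
            return False
    return True

# Sequential extraction: one CNOT per step into a single ancilla "a".
sequential = [[(q, "a")] for q in range(8)]   # 8 steps, trivially valid

# A bad step: two CNOTs sharing the same ancilla simultaneously.
clashing = [[(0, "a"), (1, "a")]]

print(schedule_is_parallelizable(sequential))  # True
print(schedule_is_parallelizable(clashing))    # False
```

An interleaved schedule packs many non-clashing CNOTs into each of its 8 steps; the error-propagation analysis in the talk is a separate, stronger condition on top of this basic one.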
OK, so now we've explained how we extract the syndrome; how do we process it and figure out what we think has gone wrong, given the syndrome? We developed a decoder by reformulating the decoding problem as a most-likely-error integer program — it actually works for every CSS code, though that was probably obvious. You want to minimize the total number of flips that you think have happened, so there's a variable x_i for a possible flip at each vertex, and because it's a CSS code you can decode bit flips and phase flips independently of one another. The fact that this is really the most-likely error is a function of the noise being a bit-flip channel followed by a phase-flip channel: if we instead used a depolarizing channel, there would be subtle correlations between those two noise processes, because a Y flip is more probable than an X followed by a Z. That's why we can use the words "most likely" in this case. So we want to minimize the sum of these flip variables subject to the constraint that they're consistent with all the syndrome values we see on every face. That's the optimization problem, and we can write it in linear algebra, if you will: the parity constraint can be written with the face-vertex incidence matrix H, which is just the parity-check matrix for the code. A challenge of formulating it this way is that the constraints have to be satisfied modulo 2, and integer programs over GF(2) are difficult, so the trick is to add auxiliary slack variables that let you do the integer program over the integers. You can reformulate it to do that, and this is the decoding algorithm we use to identify errors. In three dimensions — with repeated, possibly faulty measurements — we don't just respond to the syndromes themselves, because if, for example, a data error happened and we repeat the syndrome measurement, that syndrome flip will persist, and we don't want to imagine that a data flip happened at each time step
when in fact it happened only at the first time step. The more correct thing to do is to respond to syndrome differences: changes in the syndrome are what tell us that something has happened. I'll just assert that there is a way to reformulate this in terms of syndrome changes, doing it all in a nice way with slack variables, and that's what we do. So: what is the threshold?
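As a sanity check on the integer-program formulation, for tiny codes the same problem can be solved by brute force: minimize the weight of e subject to H e = s (mod 2). A sketch for the bit-flip side of an arbitrary CSS parity-check matrix (exponential in the number of qubits, so illustration only):

```python
from itertools import product

def mle_bitflip_decode(H, s):
    """Return a minimum-weight error e with H e = s (mod 2), by exhaustion.
    H: list of rows (lists of 0/1); s: syndrome as a list of 0/1."""
    n = len(H[0])
    best = None
    for e in product((0, 1), repeat=n):
        if all(sum(h * x for h, x in zip(row, e)) % 2 == b
               for row, b in zip(H, s)):
            if best is None or sum(e) < sum(best):
                best = e
    return best

# 3-bit repetition code with checks Z1Z2 and Z2Z3: a flip on qubit 0 trips
# only the first check, and the decoder recovers that single flip.
H = [[1, 1, 0], [0, 1, 1]]
print(mle_bitflip_decode(H, [1, 0]))  # (1, 0, 0)
```

The integer program in the talk solves exactly this minimization, with slack variables standing in for the mod-2 arithmetic so that an off-the-shelf integer-program solver can be used.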
Let's look at the code-capacity threshold first. The probability of failure of decoding is the sum over all error patterns of the probability of that pattern occurring, times an indicator of whether decoding fails on it. For small distances we can actually iterate over every possible error, look at its syndrome, decode, and see what happened, so we can get exact curves at small distance using our decoder — that's what these nice curves of p_fail versus p are. At higher distances we had to resort to Monte Carlo estimation: we throw down an error pattern, ask whether our decoding failed or succeeded, and repeat this many times, building up enough statistics that the error bars are small; a couple of such data points are overlaid on the plot. As I mentioned in my tutorial, these curves should all have the same inflection point — there will be some finite-size effects, but eventually they should all cross at the same place. To find the threshold, what we don't do — which you might be tempted to do — is just look at the crossing of all these curves; that's not a stable procedure, especially at small sizes. A more robust approach is to take advantage of the beautiful mapping between the threshold for fault-tolerant quantum computing with these lattice codes and an associated order-disorder phase transition of a classical Ising model on a certain lattice. Because of that, we can use our understanding from physics of the scaling laws and universality parameters of these Ising models to do curve fitting: near where all these curves cross, we expect the behavior to be linear, and we understand how the slopes are supposed to scale because of the universality class. This lets us do a curve fit — the method is described in a paper by Wang et al. — and the fit gives the precision on our reported threshold. This is why we report 10.56(1) per cent.
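The Monte Carlo procedure is the standard one: sample an i.i.d. error pattern, decode, and score whether a logical failure occurred. A toy version using the 3-bit repetition code with majority-vote decoding — a hypothetical stand-in for the color-code decoder, shown only to illustrate the sampling loop:

```python
import random

def logical_failure_rate(p, n=3, trials=20000, rng=None):
    """Estimate the failure rate of majority-vote decoding of an n-bit
    repetition code under i.i.d. bit flips with probability p."""
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    fails = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n))
        if flips > n // 2:          # majority decoding fails
            fails += 1
    return fails / trials

print(logical_failure_rate(0.0))   # 0.0: no errors, no failures
```

Below threshold the estimated failure rate falls under the physical rate p, and the crossing behavior of these curves as n grows is what the curve fitting in the talk extracts.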
Now, that curve fitting is assuming universal scaling; I would say that some finite-size effects may actually enter into this, beyond what you would see if you just took the crossing, so there is, I think, a little bit of artifice in these small error bars. OK — the phenomenological
threshold we compute the same way; in the interest of time I'll skip the algorithm here — it may be of interest only to specialists — but we do have a Monte Carlo algorithm for that as well, and from the data we infer the threshold value the same way. The circuit threshold comes with an additional challenge: the gates in our model can cause single errors to propagate to multiple qubits. These are the "hook" errors from the surface-code paper I worked on some time ago; we just generalized them to the color codes, where they can be of weight 2, 3, or 4 because we have higher-weight checks. That means that near the crossing point the behavior isn't necessarily linear — we have to consider the next order, a quadratic term — and we get a better fit when we do our curve fitting with the quadratic term in there as well. We calculated the threshold for both our interleaved and our sequential schedules, and, I have to say, somewhat disappointingly, we found that all the cleverness we put into interleaving the schedules was for naught: there really was no difference in the threshold between interleaving and not interleaving — the estimates are consistent with each other within error bars, so there's no real gain. In addition to numerical simulations, it's
valuable to have rigorous analytic bounds on the threshold, so that you know you're not being fooled by numerical round-off errors in the computer or anything like that. So what we did is adapt a technique that we call the self-avoiding-walk technique, from the surface-code analysis of some years ago, to the color codes. The general idea is as follows. We can bound the probability of failure by the probability that the support of a logical operator is contained in the error pattern consisting of the actual errors that happened together with our best guess of what happened: if that combination contains a logical operator, we've failed. For color codes the logical operators are not just string-like; they can be what's called string-net-like — I haven't gone into that, but they're a little more elaborate. Fortunately, for every string-net there is a topologically equivalent string that has the same likelihood, so we just double the sum to get a bound, and the probability that a logical operator contains a string is bounded by the probability that a self-avoiding walk is contained in the error pattern. So we can just count the number of self-avoiding walks and look at the probability of having a flip along each one: there are n possible places for the walk to start; given where it starts, there's a certain number of walks of length L at distance d or greater — we pessimistically assume that every error pattern of weight at least d causes failure — there are ways to choose the faults on that path, and each has its probability. The only thing we need to work out is how the number of self-avoiding walks scales with the length, and this is something mathematicians have again studied in great detail. The coarsest bound you could state is that at the first step I can walk in any one of delta different directions, where delta is the degree of the lattice, and at
the next step I can go in any of delta minus 1 directions, and so forth. More generally, one uses what's called the connective constant of the lattice, and for the 4.8.8 lattice it's known exceptionally well; if you like bounds as opposed to approximations, you can use a bound on it instead. Using this approach, we find that for the code-capacity noise model we get a lower bound of about 8 per cent, which is less than the 10.56 per cent we estimated, but it is a rigorous bound. For the phenomenological noise model things get more elaborate: this is the structure you have to do the walk on — a three-dimensional lattice that is a sort of prismatic extension of the 4.8.8 lattice, allowing couplings between the syndrome bits and the data bits. If you do that, you get a bound of about half a per cent, again less than the roughly 3 per cent estimate.
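The counting argument is easy to check numerically on the square lattice, where the coarse bound says the number of self-avoiding walks of length L from a fixed origin is at most delta times (delta - 1) to the (L - 1), with delta the lattice degree. A sketch (square lattice for concreteness; the 4.8.8 case differs only in the neighbor structure):

```python
def count_saws(length, pos=(0, 0), visited=None):
    """Count self-avoiding walks of the given length on the square lattice,
    starting at pos, by depth-first search."""
    if visited is None:
        visited = {pos}
    if length == 0:
        return 1
    x, y = pos
    total = 0
    for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if nxt not in visited:
            total += count_saws(length - 1, nxt, visited | {nxt})
    return total

for L in range(1, 5):
    exact = count_saws(L)
    bound = 4 * 3 ** (L - 1)   # delta * (delta - 1)^(L - 1), delta = 4
    print(L, exact, bound)
```

The exact counts (4, 12, 36, 100, ...) grow like the connective constant to the L, which for the square lattice is strictly below the coarse value of 3; the tighter the constant, the tighter the resulting threshold lower bound.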
Finally, then, we actually pushed this through for the quantum computation threshold, and it turns out that in the transversal model all the gates have thresholds roughly the same as the memory threshold — destructive measurement and preparation are much higher — and in this code you can also do code deformation, just as in the surface codes, in which case all gates have the memory threshold. So, to summarize:
we find thresholds at or above those of the surface codes in the phenomenological and code-capacity models, but about 10 times worse than the surface codes when we look at circuits. We have some rigorous bounds, but they're fairly weak; it would be nice to tighten those up to really understand things better, especially in the circuit model. The overhead of these codes is comparable to the surface codes. And there are some interesting open questions: do the topological subsystem color codes have a better threshold or not? What about leakage in these codes — can we handle it just as well? There's tightening up the lower bounds; and magic-state injection is something that has not been studied well, I'd say, in either the surface codes or the color codes — not the whole distillation process, but the injection process: you might be concerned that while the injected states are not encoded very well, they could propagate errors badly as they grow to the size of the code. That's an interesting question. And finally, for those of you interested in topological quantum computation, it would be interesting to see whether we can extend the color codes to a quantum double model, in the way that Kitaev originally did for the surface codes, to think about the different anyonic theories that might realize topological quantum computation. So with that, it's time for me to click my heels and return back to our reality here. [Applause] [Question:] Following on from the question before lunch: is there a simple explanation for why the circuit threshold is a factor of 10 lower? Is it possibly because you're measuring high-weight stabilizers? [Landahl:] Let's see — numerics only give you answers; they don't tell you why things act the way they do. My guess is that it has to do with the high-weight stabilizer generators. A way to examine that would be to see whether, say, the 4.6.12 codes, which have even higher-weight generators, perform worse; that would be one way to get a handle on it. I don't know, so I would
say a comprehensive study of these other codes might give some more evidence for that line of thinking. But that's certainly my thinking — that the high-weight stabilizer generators are what's causing the factor. That said, I also believe that's why the phenomenological model does better than for the surface codes: I'm getting information from 8 bits in a single step, whereas in the surface codes I'd only get 4 bits in a single step. So I think it might explain both sides of the equation. [Moderator:] Let's thank Andrew one more time — and now we have lunch, and we have a group photograph outside.

Metadata

Formal metadata

Title Fault-tolerant quantum computing with color codes
Series title Second International Conference on Quantum Error Correction (QEC11)
Author Landahl, Andrew
License CC Attribution - NonCommercial - NoDerivatives 3.0 Germany:
You may use, copy, distribute, and make the work or its content publicly available in unchanged form for any legal, non-commercial purpose, provided you credit the author/rights holder in the manner they have specified.
DOI 10.5446/35307
Publisher University of Southern California (USC)
Publication year 2011
Language English

Content metadata

Subject area Computer Science, Mathematics, Physics
Abstract We present and analyze protocols for fault-tolerant quantum computing using color codes. To process these codes, no qubit movement is necessary; nearest-neighbor gates in two spatial dimensions suffice. Our focus is on the color codes defined by the 4.8.8 semiregular lattice, as they provide the best error protection per physical qubit among color codes. We present circuit-level schemes for extracting the error syndrome of these codes fault-tolerantly. We further present an integer-program-based decoding algorithm for identifying the most likely error given the (possibly faulty) syndrome. We simulated our syndrome extraction and decoding algorithms against three physically-motivated noise models using Monte Carlo methods, and used the simulations to estimate the corresponding accuracy thresholds for fault-tolerant quantum error correction. We also used a self-avoiding walk analysis to lower-bound the accuracy threshold for two of these noise models. We present two methods for fault-tolerantly computing with these codes. In the first, many of the operations are transversal and therefore spatially local if two-dimensional arrays of qubits are stacked atop each other. In the second, code deformation techniques are used so that all quantum processing is spatially local in just two dimensions. In both cases, the accuracy threshold for computation is comparable to that for error correction. Our analysis demonstrates that color codes perform slightly better than Kitaev's surface codes when circuit details are ignored.
When these details are considered, we estimate that color codes achieve a threshold of 0.082(3)\%, which is higher than the threshold of 1.3 \times 10^ achieved by concatenated coding schemes restricted to nearest-neighbor gates in two dimensions [Spedalieri and Roychowdhury, Quant.\ Inf.\ Comp.\ \textbf, 666 (2009)] but lower than the threshold of 0.75\% to 1.1\% reported for the Kitaev codes subject to the same restrictions [Raussendorf and Harrington, Phys.\ Rev.\ Lett.\ \textbf, 190504 (2007); Wang et al., Phys. Rev. A \textbf, 020302(R) (2011)]. Finally, because the behavior of our decoder's performance for two of the noise models we consider maps onto an order-disorder phase transition in the three-body random-bond Ising model in 2D and the corresponding random-plaquette gauge model in 3D, our results also answer the Nishimori conjecture for these models in the negative: the statistical-mechanical classical spin systems associated to the 4.8.8 color codes are counterintuitively more ordered at positive temperature than at zero temperature.
