Bound on quantum computation time: Quantum error correction in a critical environment

Transcript

... and that was me back in 2007, a few pounds lighter. I start with this slide because it was actually my last slide in 2007, and I will end this talk with the same slide. The reason is that, during the process of working out correlated environments and quantum error correction, I learned a couple of things. For example, at that time I was wondering whether you could take some parameter of the environment and keep your computer protected against correlations as you change this parameter. What I found out, me and other people, is that the threshold theorem still holds if you satisfy a certain dimensional criterion. And when I look at this, it looks very much like the phase diagram of a quantum phase transition: we have something we can call an upper critical dimension, above which perturbation theory works, where the expansion works, and below the upper critical dimension you don't really know. I didn't know, but I could guess, based on the schematic phase diagram of a quantum phase transition, that there could be a region where a threshold still exists but you need to sum up a lot of terms in order to see it, and another region where you cannot compute at all, where the correlations are so strong that computation is completely suppressed, no possibility to compute. OK, so now let me start.
What is the motivation? We are all here because we would like to protect quantum information against the environment. There are a lot of strategies: decoherence-free subspaces, dynamical decoupling, topological systems, and error correction, which is supposed to be very general. From my perspective, what I find interesting about error correction in a correlated environment is that what you are doing is actually driving the system strongly, and with that you are creating the possibility of a new state of matter, a different phase. It has the flavour of a strongly correlated problem: with a correlated environment you have a many-body system coupled to a many-body system. So, as usual,
I'll start with the threshold theorem, which is the starting point of all these discussions, because it tells us that if we are below a certain noise threshold we can process quantum information for arbitrarily long times.
The traditional assumptions are usually fast measurements, which are not really fundamental, fast gates, again not fundamental in my opinion, and error models in which errors occur with probabilities rather than amplitudes. So what happens if you start from a Hamiltonian description? If you want to use those assumptions, you usually have to go in and derive some master equation, and the notion of a local error probability, using a Markov approximation. But, as Alicki and collaborators pointed out in 2006, the assumption of a constant supply of fresh cold ancillas and the Markov approximation may not be mutually consistent. So we may have a problem; we may have to choose our poison.
What happens in real systems is that we will most likely have correlations in space and time. You have an array of qubits, they emit photons or phonons, and through those they start talking to each other; correlations in space and time are everywhere, and if we want to throw them out we need to justify our assumptions, we need to justify throwing the correlations away. So here is a back-of-the-envelope calculation, actually a very simple calculation. Consider a purely dephasing bath with an Ohmic spectral density acting on a single logical qubit, the five qubits of the five-qubit code, and look at a lower bound for the trace distance between the non-evolving state (the perfect memory) and the memory whose dynamics is driven by the bath. Don't pay much attention to the numbers; the scales were chosen to make the point. You start your evolution with memory effects which, for the Ohmic bath, are very small and grow slowly, because they are logarithmic in time; but eventually, as the spatial correlations start to kick in, you see that this lower bound on the trace distance starts to increase very fast. What can I take from this graph? If you convince yourself that your local error probabilities are small and completely forget that you may have spatial correlations, the spatial correlations can sometimes push you past your practical threshold value. So you need to be careful with statements obtained by measuring errors on one qubit and then putting a bunch of qubits together.
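As a rough illustration of where that slow logarithmic onset comes from (this is the textbook single-qubit result for pure dephasing by an Ohmic bath at zero temperature, not the multi-qubit calculation on the slide), the off-diagonal element of the qubit density matrix decays as

\[
\rho_{01}(t)=\rho_{01}(0)\,e^{-\Gamma(t)},\qquad
\Gamma(t)\simeq\frac{\alpha}{2}\ln\!\bigl(1+\omega_c^{2}t^{2}\bigr)\approx\alpha\ln(\omega_c t)\quad\text{for }\omega_c t\gg 1,
\]

up to convention-dependent factors, where \(\alpha\) is the dimensionless coupling and \(\omega_c\) the bath cutoff. The trace distance to the unperturbed state is then of order \(1-e^{-\Gamma(t)}\), which indeed grows only logarithmically until other effects, such as the spatial correlations just mentioned, take over.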
A lot of people have actually worked on this problem, and these are some of them. They use approaches that are different from the way I think about the problem; in particular, the Terhal and Burkard paper was one of the special ones that I really enjoyed reading and learned quite a lot from.
But what really motivated the whole framework that I'm going to talk about was a completely different paper, by Dorit Aharonov. In 2000 she wrote a paper analyzing the threshold as a quantum-to-classical transition. She was saying that if you are below a certain error probability, in the threshold regime, you are resilient: entanglement spreads through the computation and through the whole computer, so this is like a low-temperature regime, the quantum regime. On the other hand, when you are above this threshold value, your computer is a noisy computer that is well simulated by a Turing machine; that is a high-temperature, classical regime, a classical system. This quantum-to-classical transition motivated the way I was thinking about the problem: if I can imagine the error probability as a temperature, so this is low temperature and that is high temperature, can I now add another axis here and ask what correlations do to this quantum-to-classical transition? So it certainly makes sense
to think that there may be a quantum-to-quantum transition, because a quantum phase transition is defined by a qualitative change in the ground-state wave function; that is how you identify these quantum phase transitions, and that is what is written there: you have a ground-state wave function, it changes as a function of a parameter, and you have a critical point. Here, you see, we are not talking about the Hamiltonian: we have a driven system, we are forcing a particular state, the state that we want to preserve throughout the computer's evolution. In this sense we are defining a quantum phase by the state that we want to preserve, and when you have a correlated environment the question is whether the correlated environment can drive you away from that particular state. That is how I started thinking about this problem in 2007. The Gaussian model of the noise is based on a generalization of the spin-boson model: you have a free Hamiltonian for the bath, which is a set of harmonic oscillators, and an interaction Hamiltonian between the degrees of freedom of the bath and your qubits. You also assume that the correlation functions obey a power law; if they decayed exponentially there would be no problem, because then you can formally define the Markov approximation. Power laws are really the problem. The Gaussian assumption enters because, for practical purposes, you need to decompose the n-point correlation functions into products of two-point correlation functions. The important parameters, which will show up later, are D, the spatial dimension of your quantum computer, z, the so-called dynamical exponent, and delta, the scaling dimension of the interaction vertex. OK.
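To fix notation, a generic spin-boson-type form of the kind being described might look as follows (a schematic sketch only; the specific operators and couplings in the talk may differ):

\[
H=H_{0}+H_{I},\qquad
H_{0}=\sum_{k}\omega_{k}\,a_{k}^{\dagger}a_{k},\qquad
H_{I}=\lambda\sum_{\mathbf{x}}f(\mathbf{x})\,\sigma^{z}_{\mathbf{x}},
\]

where \(f(\mathbf{x})\) is a local bath operator built from the bosonic modes. Gaussian means that higher correlation functions factor, by Wick's theorem, into sums of products of two-point functions,

\[
\langle f_{1}f_{2}f_{3}f_{4}\rangle=\langle f_{1}f_{2}\rangle\langle f_{3}f_{4}\rangle+\langle f_{1}f_{3}\rangle\langle f_{2}f_{4}\rangle+\langle f_{1}f_{4}\rangle\langle f_{2}f_{3}\rangle,
\]

and the two-point function itself decays as a power law characterized by the scaling dimension \(\delta\) and the dynamical exponent \(z\): schematically, \(\langle f(\mathbf{x},t)f(0,0)\rangle\sim|\mathbf{x}|^{-2\delta}\) at equal times and \(\sim|t|^{-2\delta/z}\) at equal positions.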
How generic is this model? It can actually describe many situations in physics: antiferromagnetic fluctuations, charge fluctuations, and so on. It does not cover everything; it does not cover a spin bath, for example. But you can also make a very general argument. Think about a very large, generic environment: after applying all possible hardware solutions to reduce decoherence, dynamical decoupling, decoherence-free subspaces, everything, you end up with some residual decoherence, some residual interaction between your computer and the environment. This environment is still very large compared to the computer, so it is very unlikely that the computer will have a strong influence on the environment, and the environment will sit near the minimum of its local energy landscape. Sitting near that minimum, you take a harmonic approximation to the landscape and a linear coupling between the computer and the environment, and that is how you arrive at this model. This is actually a paraphrase of the original Caldeira-Leggett argument for deriving the spin-boson model, and what I am trying to say is that we may end up with something very similar after several layers of hardware protection on the system. The basic assumption in 2007 was: OK, I have spins, qubits, in a lattice, but they are separated by a minimum distance, and Delta here is the time it takes me to perform an error-correction step. I wanted them separated by a minimum distance because, to start with, I wanted to be able to define what I was calling the local error probability.
When you look at the whole array of qubits evolving in time, I could separate the correlations into two parts: correlations that occur within a single error-correction period, which dress the probability of having errors, and correlations that connect different error-correction periods.
By separating these two, I was creating an expansion in errors and correlations: I would construct dressed probabilities of having errors at given positions in space and time, and also compute how corrections due to correlations between errors in different error-correction periods affect the result. What I wanted to accomplish was a systematic expansion that includes these correlations, so that I could study the stability of the expansion as a function of the correlations. The whole point was: I start with a well-defined system, the state that I want to protect, and now I start adding correlations and see how stable this perturbation theory is.
It turns out that it is stable inside this region, which I was loosely calling above the upper critical dimension. So we again have a threshold here, as before, in the temperature, that is, in the local error probabilities, which are well defined because the qubits sit individually inside each of those hypercubes, those boxes, and I now also have a correlation axis on this graph. If the correlations decay sufficiently fast, the correlations between the errors will not matter. But what are these two phases? Do they really exist? What do we really have there?
That was the question I didn't know how to answer in 2007: what do these phases mean? Do they really exist? Can I really engineer those situations? And how do I treat the problem of a dense set of physical spins, without this hypercube hypothesis? It turns out that I needed to change the question.
The reason I needed to change the question is fairly obvious: when you are on one side of a phase transition, you are looking at a particular fixed point, and from that point it is very hard to see what is on the other side of the transition; you would need to sum an infinite set of diagrams, an infinite family of terms, to see what actually happens there. So starting from the perspective of fault-tolerant quantum computation would not tell me anything about the other two possible phases that I was guessing might exist. I therefore changed the question to something very different: for how long can we compute? Make all the most favourable assumptions you can, and ask yourself for how long you can actually compute. The friendly assumptions: everything from the threshold theorem works; there are no errors in the measurements, no errors in state preparation, no errors in the gates; and all the syndromes are taken into account. I am saying that I always measure no errors, that the computer runs flawlessly, without any errors, all the time. The unfriendly assumption is that I have power-law correlations in time and space due to the bath.
So what is left over, if I assume that I measure no errors? Well, you encoded the physical qubits in a larger Hilbert space, and you have your correctable errors and also the uncorrectable errors. What I am saying is that I will look at these uncorrectable errors, because they are the things that are going to evolve my logical qubit. That is pretty much everything that can happen: only the uncorrectable errors matter, and everything else is suppressed or not taken into account, because I am assuming that
I have this perfect history of no errors at all: every time I measure a syndrome, I measure no errors. So out of all the possible evolutions of the quantum computer, I confine myself to this one. Why this one? Because from the experience of 2007 I know that this is the one that leads to the least possible decoherence; this one gives the longest computational time available to you.
The example I will consider is the five-qubit code. Could I use any other code? Yes, but this one is the smallest, so it makes things easier. So you take the stabilizer formalism, write down the stabilizers for the code and the logical words, and then you look at this table and you see that there are third-order events that are allowed, and of course higher-order events as well, because the code has distance three.
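For reference, the standard generators of the [[5,1,3]] code and a quick consistency check can be written out explicitly. This is a minimal sketch using the usual textbook convention for the generators, which is not necessarily the convention on the slide:

```python
# Minimal sketch: standard [[5,1,3]] stabilizer generators and a check,
# via the symplectic inner product, that they commute with each other and
# with the logical operators (which in turn anticommute with each other).
from itertools import combinations

STABILIZERS = ["XZZXI", "IXZZX", "XIXZZ", "ZXIXZ"]   # usual generators
LOGICALS = {"X_L": "XXXXX", "Z_L": "ZZZZZ"}

def symplectic(pauli):
    """Map a Pauli string to its binary (x, z) representation."""
    x = [1 if p in "XY" else 0 for p in pauli]
    z = [1 if p in "ZY" else 0 for p in pauli]
    return x, z

def commute(p, q):
    """Two Pauli strings commute iff the symplectic form vanishes mod 2."""
    (x1, z1), (x2, z2) = symplectic(p), symplectic(q)
    form = sum(a * d + b * c for a, b, c, d in zip(x1, z1, x2, z2))
    return form % 2 == 0

for s, t in combinations(STABILIZERS, 2):
    assert commute(s, t), (s, t)
for name, op in LOGICALS.items():
    assert all(commute(op, s) for s in STABILIZERS), name
assert not commute(LOGICALS["X_L"], LOGICALS["Z_L"])
print("Stabilizers commute; logicals commute with them and anticommute with each other.")
```

The check only confirms the algebra; the point used in the talk is the distance-three statement, namely that the lowest-weight operators acting nontrivially on the encoded qubit have weight three.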
The third-order events, which are products of stabilizer elements with the logical words, are the ones I keep, because they are the lowest order in the coupling between the computer and the environment; so, again, this gives me an upper bound, a very generous limit on what it is possible to compute. Now, how are the qubits organized in space? I assume spatial locality. This is not fundamental, but it is useful, and it is also very physical: measurements and gates are hard to do, so you expect two logical qubits to sit very close together, and likewise the physical qubits that make up a logical qubit to sit very close together. So I place them close together, in a certain pattern, on a d-dimensional lattice. Now we look at the time evolution in the interaction picture: we start with the usual Dyson series, and I expand just to first order, because I am assuming that over a short time, within one QEC period, the likelihood of very high-order events is small. What I am doing is creating an expansion parameter, the coupling between the bath and the computer times the QEC period. This will be my new small parameter, and I will organize the whole expansion around it from this point on, because I am interested in a very large computer and very long times, and that is the question I am aiming to address.
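In symbols, the step being described is just the leading term of the Dyson series over one QEC period \(\Delta\) (a standard formula, written here only for orientation):

\[
U(\Delta)=\mathcal{T}\exp\!\Bigl(-i\int_{0}^{\Delta}dt\,H_{I}(t)\Bigr)
=\mathbb{1}-i\int_{0}^{\Delta}dt\,H_{I}(t)+\dots,
\]

so that, with \(H_{I}\propto\lambda\), each additional vertex within a single QEC period costs roughly a factor of \(\lambda\Delta\). That product is the small parameter the rest of the expansion is organized in.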
So you just expand that, you look back at your five-qubit code, and you restrict yourself to the third-order events, the ones allowed by the five-qubit code. You are going to have, for example, two Z errors and an X error, which is allowed by the code, and here is your logical qubit operator. So inside this logical qubit, even while measuring no errors at all, I still have this kind of evolution to carry along in time. OK, so this evolution with no errors can now be reorganized into two parts. I have my five qubits and my three operators coupling to the bath, the environment, and I can break the term into two pieces. The first one is a higher-order correlation event: it tells me, as you can see in the figure, that I can have these three spins flipped, this one, this one and this one, but they do not contract, they do not talk with anything else within this same QEC period; they will talk with something in a different logical qubit, or at a different time. Using the spatial locality assumption again, I can argue that this is a smaller contribution to my corrections. On the other hand (I think the pointer is working there), I can have a contraction between two qubits inside the QEC period: for example, two X errors that talk to each other, plus a Z error that does not talk to anything now but will talk to something later on in the computer's evolution. These two X errors contracting with each other create a renormalized coupling constant. So now I have these higher-order correlations here and also a renormalized coupling constant, and my QEC period of evolution has been broken down into these two components. I then throw out the higher-order correlations, using the spatial locality assumption for my physical and logical qubits, and keep only the dressed correlation, the dressed coupling constant.
What did I gain? Now I can re-exponentiate those operators and write down an evolution operator for the logical qubits. This is the evolution operator for the logical qubits: it is time-ordered, it contains my effective coupling constant, the coupling between the computer and the environment, now coarse-grained, and the logical qubit operators. So this is the evolution of the whole set of logical qubits as time goes by, following this very particular history of syndromes. What did I gain from QEC? First of all, my coupling constant is now of much higher order: it is third order in the coupling between the computer and the bath. Also, the ultraviolet cutoff has been reduced: it is no longer the bare cutoff of the bath, it is set by the time it takes me to complete the QEC cycle, so all the high-frequency modes of the bath have, in a sense, dropped out. But what bothers me about this expression, what is the problem? The problem is that I have recreated exactly the same functional form I started from: I now have an expression for the evolution of the logical qubits coupled to the bath in exactly the same form as the original physical qubits talking to the bath.
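Schematically, and only as a paraphrase of the statement just made (the exact expression is on the slide and is not reproduced here), the coarse-grained evolution has the same structure as the microscopic one:

\[
U_{\mathrm{eff}}=\mathcal{T}\exp\!\Bigl(-i\int dt\,\sum_{\mathbf{x}}\lambda_{\mathrm{eff}}\,F(\mathbf{x},t)\,\bar{\sigma}_{\mathbf{x}}\Bigr),
\qquad
\lambda_{\mathrm{eff}}\sim\lambda^{3},
\]

with \(\bar{\sigma}\) the logical Pauli operators, \(F\) a coarse-grained bath operator, and an ultraviolet cutoff reduced from the bare bath cutoff to roughly the inverse QEC period, \(1/\Delta\). The third power of \(\lambda\) reflects the distance-three code discussed above.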
So how am I going to compute this upper bound on the time available to do the computation? I will once again look at the trace distance: the trace distance between the evolved density matrix of the logical qubits, subject to this residual interaction, and the ideal density matrix with absolutely no evolution at all. This tells me how hard it will be to distinguish the two by measurements.
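The quantity in question is the standard trace distance; for completeness,

\[
D(\rho,\sigma)=\tfrac12\lVert\rho-\sigma\rVert_{1}=\tfrac12\,\mathrm{Tr}\,|\rho-\sigma|
=\max_{0\le P\le\mathbb{1}}\mathrm{Tr}\bigl[P(\rho-\sigma)\bigr],
\]

where the last form makes the operational meaning explicit: it is the largest bias with which any single measurement can tell the two states apart.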
The information lost by a single logical qubit is very simple to calculate, because it has now become a single-spin problem: I no longer have to worry about five spins evolving in time, I have reduced everything to a single spin, and I can look it up in the books. This is the expression, and these are the kinds of kernels, of functions, that you need to compute.
You need the expectation values of sigma-z and sigma-x of the logical qubit; this is the decoherence function, and you can go to many books and look it up. This is the integral in momentum space that you need to do to find out what happens, and this is the result: the trace distance, depending on certain parameters of the environment, has four different possible evolutions. The controlling quantity is an exponent, call it zeta, built from the dynamical exponent z (which tells you how time and space are related), the scaling dimension delta that I mentioned at the beginning, and D, the number of spatial dimensions of the bath. First, if zeta is smaller than zero, the trace distance is essentially independent of time: there is an initial small decay, and after that it does not grow any more; it saturates. If zeta is exactly zero, then you get logarithmic growth: the trace distance starts to grow, but very slowly, logarithmically in the number of QEC steps. If zeta is between zero and two, it grows as a power law. And beyond that it formally diverges, because it depends on the size of the bath: for a very large bath the trace distance goes to one very fast. So if you invert that expression for the time, the number of QEC steps, and assume there is a maximal trace distance you can tolerate between the ideal, noiseless evolution and the evolution you actually have due to this residual interaction between the computer and the bath, you find the following. For the super-Ohmic case you have the initial decay of the trace distance and after that it does not grow any more, so if you start below the threshold you stay there forever. For the Ohmic case, zeta close to zero, it grows, but the growth is not that bad; it is only logarithmic.
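A toy numerical illustration of those qualitative behaviours can be sketched as follows; the functional forms below are assumed purely for illustration and are not the integrals from the slide.

```python
# Toy model (assumed forms, not the talk's expressions) of how an accumulated
# deviation grows with the number of QEC steps: it saturates, grows
# logarithmically, or grows as a power law depending on the sign of zeta.
import math

def deviation(n_steps, zeta, prefactor=1e-3):
    """Schematic trace-distance growth after n_steps QEC cycles."""
    if zeta < 0:                          # "super-Ohmic-like": saturates
        return prefactor * (1.0 - n_steps ** zeta)
    if zeta == 0:                         # marginal case: logarithmic growth
        return prefactor * math.log(n_steps)
    return prefactor * n_steps ** zeta    # "sub-Ohmic-like": power-law growth

for zeta in (-0.5, 0.0, 0.5):
    values = [deviation(n, zeta) for n in (10, 10**3, 10**6)]
    print(f"zeta = {zeta:+.1f}:", ["%.2e" % v for v in values])
```

The point is only the trend: a negative exponent saturates, the marginal case creeps up logarithmically, and a positive exponent eventually crosses any tolerance you set.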
Now, what about an array of qubits? To calculate the trace distance for an array of qubits: what I really have is a spin array coupled to this bosonic bath, which mimics a lot of what you do in solid state, but here I have learned to handle it without having to worry about the QEC evolution and the measurements, because all of that has already been taken into account. For an array this is a hard problem; I cannot do it for the trace distance directly, but I can do it for a norm that bounds the trace distance: the trace distance for the array of qubits is bounded by this norm, where N is the number of logical qubits, and this is the relation between the two norms.
The norm can be written in this form, and these are the matrix elements of that density-matrix difference that you need to evaluate; once again you face integrals in momentum space that you can do. So I just need to calculate all these integrals, and the result is very similar to what you had before, but now you have the self-interacting part, the diagonal terms of the density matrix, which defines one zeta, and the correlated part, the off-diagonal terms, which defines a different zeta. What makes the difference between the two, for the correlation part, is a term that depends on the dimension of your computer: if you lay the spins down on a square lattice or on a cubic lattice, the criterion changes. Going back, you see that once again there are four possible behaviours: a super-Ohmic one, where the result is independent of the number of QEC steps you perform, and three others, which do depend on the number of QEC steps. So for how long can you compute? You can invert those expressions and find out. If you start below the threshold for the super-Ohmic bath, you can compute for an infinite amount of time, assuming all these friendly assumptions. In the Ohmic case it grows, but it does not grow that badly; this is the critical distance you assume you can tolerate and still compute. In the other cases it moves away really fast. Graphically, this is the pictorial version: for the super-Ohmic case, if you start below what I call the threshold distance, you have an initial decay and then you stay below the threshold; for the Ohmic bath you eventually grow, very slowly, and cross the threshold distance very slowly; and finally, for the sub-Ohmic and the adversarial scenarios, you move away from the ideal evolution really fast.
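One standard inequality of this kind, although whether it is exactly the one used in the talk is not clear from the audio, bounds the trace norm of the deviation \(A=\rho-\rho_{\mathrm{ideal}}\) for \(N\) logical qubits by its Hilbert-Schmidt (Frobenius) norm:

\[
\lVert A\rVert_{1}\le\sqrt{\operatorname{rank}A}\,\lVert A\rVert_{2}\le 2^{N/2}\lVert A\rVert_{2},
\qquad
\lVert A\rVert_{2}=\sqrt{\mathrm{Tr}\,A^{\dagger}A},
\]

and since \(\lVert A\rVert_{2}^{2}\) is a sum over squared matrix elements, it splits naturally into diagonal and off-diagonal contributions, which is one way the self and correlated parts can enter separately.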
So what does this tell us? First of all, there are adversarial environments for QEC: if you have a correlated environment then, depending on the interplay between the dimensions of the bath, the dimensions of the computer, and the delta parameter, some environments are really bad, and the correlations will actually drive you away from your desired quantum state. There are also situations where it is possible to improve a lot by engineering your qubits: you do not need to put them all close together, you can move them apart and try to reduce these effects. So this is not a no-go statement saying it is impossible to compute in certain situations; it just means you need to rethink your engineering when you find yourself in such situations. In all cases the way the qubits are laid out enters the expression for the total amount of time available, but that is expected; it is not something you would fail to anticipate even in a simple model. And the three regimes that I just described really give a quantitative interpretation of resilience as a dynamical phase transition.
So what I am saying is this. When you are in the super-Ohmic regime, you are in the traditional threshold-theorem limit, a regime where perturbation theory works fine and you can correct for the existence of correlations systematically. In the Ohmic regime, which you could call the upper critical dimension, the trace distance starts to grow very slowly; it is a regime where you would need to sum an infinite family of logarithms to obtain a quantitative result. In the sub-Ohmic regime you have a finite amount of time available to do the computation, and the reason is that all these leftovers, these uncorrectable errors, start to accumulate and drive you away from your ideal state faster than logarithmically in the number of QEC steps. Finally, below what plays the role of a lower critical dimension, the condition where zeta crosses into the last regime, it is formally not possible to compute: the expressions are controlled by the size of the bath, and as the bath available to you gets bigger, the time available to compute becomes too small. OK, so that's it. Thank you very much.

[Question] You started, and ended, with the same slide as the previous time, and I have the same question I had then. There are designs of qubits that would let you couple to the bath not through a dipole coupling but through a quadrupole coupling, higher multipoles, and so on, and that would raise your delta a lot. In an ion quantum computer you could use dark transitions; in spin qubits, instead of a plain spin-up/spin-down encoding, you could use two-spin or three-spin encodings, and those would change your delta dramatically.

[Answer] Yes, the delta would go up, and that moves you away from the regime where it is not possible to compute; those are exactly the kinds of tricks that experimentalists are very good at. But that is why, when I started the discussion, I was trying to say how generic this is. What I am trying to explain, or to argue, is that this kind of bath, this bosonic bath, is very ubiquitous: after you do everything you can think of, decoherence-free subspaces, dynamical decoupling, there will still be some leftovers, and it is very likely that you will be able to model those leftover interactions by a bosonic bath. I am not saying that the bosons I am discussing are necessarily electromagnetic fluctuations or phonons; they are whatever is left over after you go to a logical qubit, after a first layer of hardware protection.

[Question] You did this analysis for one code. Do you have any sense of how these properties, these crossing points, depend on the properties of the code?

[Answer] What the code does for you is increase or reduce the power of the protection through its distance: when you increase the distance of the code, the effective coupling constant becomes a higher power of the bare coupling constant. In terms of the general analysis nothing changes much, but the effective coupling constant is strongly reduced by a larger-distance code. That is why the surface code is interesting: the distance of the surface code grows as you grow the lattice. On the other hand, the qubits are very close together, so there
would be a lot of correlations there too.

[Question] I would just like to ask for a clarification: could you please go back to the slides where you show the curves of how the quantum information degrades over time? There is something there that seems too good to be true: you are not losing any information in the limit of infinite time.

[Answer] That's right, and let me clarify, because that puzzled me as well. I am assuming that I have an infinite amount of time, but also an infinite computer. What I am saying is that, given all these very favourable evolutions, the leftovers appear at higher and higher order in the bare coupling, so they are very dilute: when you take the thermodynamic limit, these errors, these leftovers, are very dilute in space and time, and when you calculate the trace distance they do not take you away.

[Question] I am asking because even with one of the phenomenological models that is completely uncorrelated in time, for such an error model sooner or later you lose your quantum information, and in the limit of infinite time you lose everything; you end up with a completely depolarized density matrix. So I am wondering about those curves and about the slopes. For those three regimes, and you may have said it already, what do the slopes depend on? Do they depend on how many encoded qubits you have in your computer?

[Answer] The slopes there are just schematic; they are drawn, not calculated, because this is a qualitative statement. But if you go back and ask what the key quantity really is, in terms of the code, as I said, it is the distance of the code: the larger the distance, the smaller the slope, and the longer it takes to cross the threshold. That is pretty much what I have to say about it. The Ohmic case, as I think we discussed before, is very much like what happens when you calculate an echo: you have an initial drop of the fidelity, for example, and then it comes back.

[Moderator] Are there any further questions? We will hear more about that last point a little later today in one of the afternoon talks. Let's thank the speaker one more time.

Metadata

Formal metadata

Title Bound on quantum computation time: Quantum error correction in a critical environment
Series title Second International Conference on Quantum Error Correction (QEC11)
Author Novais, Eduardo
License CC Attribution - NonCommercial - NoDerivatives 3.0 Germany:
You may use, copy, distribute, and make the work or its content publicly available in unaltered form for any legal, non-commercial purpose, provided that you credit the author/rights holder in the manner they specify.
DOI 10.5446/35291
Publisher University of Southern California (USC)
Publication year 2011
Language English

Content metadata

Subject Computer Science, Mathematics, Physics
Abstract We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
