
Panel moderated by Byrd

Speech transcript
[Moderator's opening remarks; much of the audio is unintelligible.] I want to thank the organizers [unintelligible], and I apologize for this little interruption. [unintelligible] I had searched Google Images in advance [unintelligible] the title of the panel [unintelligible] the question put to the panel: what one idea or theory [unintelligible].

[unintelligible] there is a gap between theory and experiment [unintelligible] the New York Times, "all the news that's fit to print" [unintelligible]
On predicting the future of computing: [unintelligible] "Proteus 2033" [unintelligible] a prediction of when [a quantum computer] would arrive, around 2033, and the date had been revised 596 times since. I thought, OK, that's not quite right; somebody had predicted it by 2020 [unintelligible], but even though the prediction has changed 596 times [unintelligible]. So I wanted to pose a slightly more specific question: we now have [qubits and small demonstrations], but how do we get from here to [a useful quantum computer]? That is the sort of thing I want the panel to address. What we will do first is give everyone a chance to respond to these questions, say three minutes or so each,
and then we will use the rest of the time responding to the panelists' responses.

[First panelist:] Thank you. The questions we were asked all capture a theme for me, one that was also captured in the title: practical questions associated with quantum computing. It is really interesting to note that if you look back at the history of the field over the last ten years or so, people first worried about realizing a qubit, then they worried about realizing a multi-qubit gate, and now they worry not so much even about realizing algorithms; that is still very ambitious, but it is something that has been demonstrated in ways we understand. The questions are starting to trend, in my view, toward engineering questions. It is not "does error correction work at some level" but: can you repeat a quantum error-correction cycle and preserve a qubit for an extended time? What tasks and challenges come into play there? Can you suppress errors to the 10^-5 level for fault tolerance? We are seeing these things starting to be addressed in experiment, but they are not fundamental any more; they are not foundational questions of quantum physics, they are practical questions. So what becomes of interest to me, as an experimentalist with limited mathematical ability (you heard my diatribe earlier), is theoretical work that addresses practical problems. I want to be very clear: I think it is really useful that we have a division of labor in the community. I do not think it would be good if every theorist started working on the effects of timing resolution or delays in a coupling, and I think it is great that people do deep dives into color codes and subsystem codes and all sorts of other codes; that is extremely healthy and useful. One thing I would like to see more of is consideration of engineering aspects and connections with the engineering community. There is a really big body of literature on classical control theory; we have seen a little of that in the talks over the last couple of days, and it would be nice to see it become more prevalent. That involves a growing body of theorists who have a special capability to function in a laboratory setting; [one colleague] was a great example, in that he could change the diode in a laser himself. That is a little extreme, but the idea that you can traverse both worlds is, I think, a requirement if we are really going to tackle these hard problems. And it is true for the experimentalists as well: we are required to learn more and more about group-theoretical concepts and code constructions and to translate them into an experimental setting. I do not want to give a specific answer about exactly which problem to work on; instead I give this thematic answer: engineering-oriented problems, and experiment meeting theory at the laboratory level, with the realistic constraints of the systems. If you tell me you can do some optimal modulation that requires 19 digits of precision in the amplitude of a pulse, that is not really useful. It is important to understand what we can really do, and I hope at least some people will start thinking about that very seriously. Thank you.
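As a rough orientation for the 10^-5 figure mentioned above (my gloss, not part of the panel discussion): the textbook fault-tolerance picture says that once the physical error rate p is below a threshold p_th, the logical error rate of a distance-d code is suppressed roughly as

% Hedged, textbook-style estimate (not stated on the panel):
% logical error rate of a distance-d code below threshold
p_{\mathrm{logical}} \;\approx\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{\lfloor (d+1)/2 \rfloor}

so the further the physical error rate sits below threshold, the smaller the code distance, and hence the overhead, needed to reach a target logical error rate.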
[Second panelist:] I understood the question in a slightly different way, namely which particular procedures and methods are most useful for your particular experiments, so rather than general remarks I would like to come back to the things we are doing, as you heard in my talk: we use decoherence-free subspaces for error prevention where needed; we use appropriate encodings of qubits where necessary; and then of course you see the trade-off in the number of qubits, the timing, the precision and the coherence time. We use error correction, real active error correction, and [unintelligible]. And then you ask what is lacking. I would say we have a bunch of operations that are very complex and very powerful, and I do not yet see how they could be used for further, deeper levels of error correction; maybe that is something that could be addressed by theorists. Error-correction protocols to overcome correlated noise were really new to me; I got some comments after my talk about noise in the device, and I would be very interested in collaborating on this, because this is something that really prevents us from doing better error correction and getting further. Then of course the general question: we have a number of problems, too many to pinpoint here, and for each individual technical shortcoming there is possibly a solution, but an encoding that works well in one case is not optimal for another. So the question is what would be an optimal choice of decoherence-free subspace; this probably again depends on the implementation, and here again the advice of theorists would be very welcome. And all our current results rely on the availability of what I called a quantum compiler, which translates the circuits the theorists write down into the pulse sequences that actually run on the machine, and so far this is not an optimizing compiler. When I approached theorists about whether they would be interested in doing this, many simply said it is a systems-engineering problem, while the engineers cannot deal with it and say we need physicists to do it. I would be really interested if somebody would attack the problem, because we need something that optimizes whole sequences, not optimal control in the abstract sense, but something adapted to the experiments at hand. This is central to what I think would make things much more useful for us.

[Third panelist:] OK, I will try to go through the questions one by one. The first question was which strategies we are currently using. Basically all of them, but the emphasis varies. I have not used decoherence-free subspaces in the past. We definitely use dynamical decoupling in all the systems that we operate, and we operate very different systems: nuclear spins, electron spins, optical systems. In all of them we use dynamical decoupling in some form; for us it has proved the most useful because it is relatively simple to implement, has little overhead, and works very well, with improvements of orders of magnitude in coherence. We have applied it to single spins and to multi-qubit systems.
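To make the "simple to implement, little overhead" point concrete, here is a minimal sketch (my own illustration, not code from any of the groups on the panel) of how the pi-pulse times are laid out for two standard dynamical-decoupling sequences, CPMG and Uhrig (UDD), over a total evolution time T:

# Minimal sketch: pulse timings for two standard dynamical-decoupling
# sequences. Illustration only; real experiments add pulse-phase patterns
# (e.g. XY-8) and account for finite pulse durations.
import math

def cpmg_times(n_pulses: int, total_time: float) -> list[float]:
    """CPMG: n equally spaced pi pulses at t_j = (j - 1/2) * T / n."""
    return [(j - 0.5) * total_time / n_pulses for j in range(1, n_pulses + 1)]

def udd_times(n_pulses: int, total_time: float) -> list[float]:
    """Uhrig DD: t_j = T * sin^2(pi * j / (2n + 2))."""
    return [total_time * math.sin(math.pi * j / (2 * n_pulses + 2)) ** 2
            for j in range(1, n_pulses + 1)]

if __name__ == "__main__":
    T = 1e-3  # 1 ms of free evolution, arbitrary example value
    print("CPMG:", [round(t * 1e6, 1) for t in cpmg_times(4, T)], "us")
    print("UDD: ", [round(t * 1e6, 1) for t in udd_times(4, T)], "us")

The only experimental overhead beyond the bare evolution is the pulses themselves, which is why the speaker describes it as cheap compared with full error correction.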
Error correction is a lot harder, and we have not been able to do it in all our systems. OK, the second question, I believe, was which theoretical constraints have not been adequately addressed. I would say none of them; if they had been adequately addressed, we would already be [much further along toward] the operation of a quantum computer. I am also not so sure about the phrase "commercially viable": in fact we have had quantum computers of a sort for 14 years now, but they are not what you would call commercially viable, although I would claim that the NMR spectrometer is a commercial quantum device of sorts; maybe that is a different issue.
As for the constraints that hurt most: it is sometimes frustrating for us experimentalists when theory papers show how to reach infidelities of 10^-10 or below while we are struggling to achieve 99 per cent fidelity; obviously there is a big gap between experimental capabilities and theoretical assumptions. OK, the wish-list for theorists: I would actually agree with the two previous speakers that most of what would make experimental life easier is on the engineering side. If we had the ability to make the right nanostructures, if we had, for instance, isotopically clean diamond and NV centers placed exactly where we want them, we would be a long way toward a useful quantum computer. Obviously, even with existing materials we can still do better, and we will in the future, I am very sure about that. So far my observation is that the theorists provide many more possible ways of doing things than we can implement.

[Fourth panelist:] Can I answer that in a slightly different light, because I do like theorists; I have worked with several of them, so what I would put in that context is how it can become easier to collaborate and how such a collaboration can become more fruitful. The collaboration [mentioned earlier] was an excellent example for me of how theorists can help: right at the beginning he explained what he wanted to achieve, and did so in a way which is in principle possible to implement. He says, for instance, "I want to preserve the coherence for as long as possible"; that is a possible goal, and it is something the experimentalist also wants, so there is clear common ground. The other side is that very often the secondary goal of theoretical work is incompatible with experimental implementation: if you show that you can decouple to infinite order, that is great, but you need infinite resources, and we do not have those. So we have to find common ground; the theorist may not be interested in implementing the highest order, but, as that collaboration showed, the longest possible coherence time is what matters. It requires willingness on both sides to learn each other's language. As general guidelines: avoid infinities, and take the experiments into account; otherwise the theorists are perfect.

[Fifth panelist, a theorist:] One of the great things about this field is that it has allowed us to cross boundaries and collaborate with all sorts of experimentalists, theorists, computer scientists and mathematicians. Someone joked about a universal translator; quantum information seems to be one, breaking down all sorts of barriers. Getting back to the question, I would really like to see deeper collaboration of the kind I was talking about today. One of the exciting things for me as a theorist is that the development of the theory has allowed all sorts of textbook experiments to happen, where
before it was very hard, and it has allowed all sorts of theorists to do all sorts of exciting things; I think that will continue, and I am glad to see it. When I gave my talk I talked about coherent recovery, where you do not bring the syndromes back out to the classical world and process them there. It is interesting to note that all the experiments done so far have done exactly that, and I think it simply takes too much effort, time and resources to go via the classical world; so it would be great to see more work on fault-tolerant schemes with this kind of coherent recovery. I also noticed the Mølmer–Sørensen gates, these global gates that entangle whole banks of qubits. I banged on about this in my talk as well: having access to global addressing really helps save resources, and I would like to see it pushed into wider use; it is genuinely useful to be able to act globally and not only on individual qubits. Then there are the correlated errors that people are finding; I think more of those are going to come up, and I was very excited to see this kind of noise spectroscopy happening. As was said earlier, know your enemy: being able to actually characterize your environment and then adapt your error correction to it is most efficient. There was a talk showing you can get quite far in beating [correlated noise] that way, so I think that is pretty exciting. The thing I wonder most about is whether there will be some jumps on the physics side, people discovering interesting topological systems and building them artificially, which would do a lot of the protection for you, so you would need much less software on top.

[Next panelist:] Thanks. There are lots of things I want the experimentalists to do. First of all, the things that make fault-tolerant quantum computing work better: highly parallelized processing; efficient extraction of entropy; fast and accurate gates and fast and accurate measurements; rare qubit leakage; non-local couplings between qubits if you can manage it; experimental evidence that noise is weakly correlated in multi-qubit systems; and certainly it would be nice to see a demonstration of quantum error correction really working against naturally occurring noise, which would be good for everyone's morale. Christmas is coming, so here is what I really, really want: really low error rates, well below the accuracy threshold, way, way below it; the lower the better, because the lower the error rate, the less overhead we will need for fault tolerance. So, a serious question, I hope: how low can we go in making gates better? We have heard a lot at this conference about robust dynamical-decoupling methods, clever control protocols and other tricks. How far can we go in improving the performance of our elementary gates beyond where they are now? And then something which was just mentioned, but which we have not heard much about at this conference, is topological quantum computing, and, more generally, whether we can achieve
gates with really, really low error rates because of a clever physical encoding in a system with many degrees of freedom. Maybe we cannot make the error rate so low that we do not have to use quantum error correction at all, but if we can make error rates much lower than they are now, things are just going to be easier and it is going to work. Something I would like to see is a direct experimental demonstration of non-Abelian anyon statistics in a many-body system. That would be a milestone for quantum condensed-matter physics in itself, but also because it would confirm that quantum codes involving many degrees of freedom really can be realized in nature. We have indirect evidence for topological order in fractional quantum Hall systems now, and I would welcome much more direct evidence. The reason we want to build one is partly just to see whether nature allows it; and if we find that robust quantum codes can be realized in the lab in many-body systems, that would be a big step forward. Even if we never use topological quantum computing in the long run, such a demonstration would settle a nagging point of principle, that we can really get big quantum codes to work to protect quantum information, which seems to be what is going on in these systems with non-Abelian topological order. As I said in my talk, I think continuous-variable systems, superconducting circuits and quantum-optical systems, may have capabilities for protected quantum information processing that go beyond those of qubit or spin systems and that we have not fully learned how to exploit; specifically, in the example I gave, Josephson junctions have just the right kind of nonlinearity to realize potentially powerful codes in superconducting systems. Now, things I want the theorists to do. We have essentially two ideas for how to get an accuracy threshold: [concatenated] codes and topological codes. Another idea, another family of codes with a threshold, would be interesting and important for quantum computing. And by the way, once error rates are low enough, families of codes like the Bacon–Shor codes can be very powerful even if they do not formally have a threshold. For improving fidelities, magic-state distillation is a great trick, thank you, Sergey, if he is still here, but it has made us a bit lazy: we know that stabilizer operations plus magic-state distillation suffice, so we think we are done. But we do not see a general reason why we cannot do fault tolerance with a universal set of gates more directly; there are other routes to a universal gate set besides magic-state distillation, like code deformation. An example of that, as an existence proof, is the Levin–Wen codes, which realize non-Abelian anyons: by a sequence of code deformations you get to a universal set of gates. I would like to see those ideas developed.
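For context on the magic-state remark (my addition, a standard textbook figure rather than anything stated on the panel): the usual 15-to-1 distillation protocol takes fifteen noisy copies of a magic state with error probability p_in and, assuming ideal Clifford operations, outputs one copy with

% Standard 15-to-1 magic-state distillation error suppression
% (textbook estimate, assuming perfect Clifford operations):
p_{\mathrm{out}} \;\approx\; 35\, p_{\mathrm{in}}^{3}

which converges very rapidly once p_in is below a few per cent, at the cost of a large multiplicative overhead in states consumed per round.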
One more thing for the theorists in particular to think about, and it is very relevant to experiment: what are the things that really go beyond what can be simulated classically that we could do in the relatively near future without using quantum error correction at all? The reason we are excited about quantum computing is that we want to do things that cannot be simulated with classical computers. I do not think we are going to be factoring 2048-bit numbers without quantum error correction; but with a system of order a hundred physical qubits, without using error correction at all, are there interesting things we can do for which we can argue there is no classical simulation? That is something to do before you retire, even if you cannot keep a qubit alive indefinitely; it would be exciting.

[Moderator:] [unintelligible] We cannot do brute-force simulations with 300 qubits, so think of something to do with 300 qubits.

[Next panelist:] Well, that was quite a Christmas tree. In my family we usually get one present at Christmas, so I will just ask for one, and that is: two qubits, demonstrated, with universal gates at two nines of fidelity. If we can get that, we are in business; the error correction will follow.

[Next panelist, a theorist:] I think we have a lot of work to do as a community on realism. We need to improve the realism of our noise models, including correlated errors, and we really need to think much more about geometric constraints. That is a very under-considered problem, because we tend to assume the qubits are magic objects that interact with one another however we like, and when you actually introduce these things into a technology there is a cost in time or fidelity; we really need to take those considerations into account when we study fault tolerance. And we should not assume perfect anything: anywhere we sit down and say "let us assume we have a perfect [operation]", we should stop, because it is not going to be relevant. So those would be my two things: realism about correlated errors and geometry, and the two nines.

[Exchange:] Just to be clear, that means initialization and measurement of a single qubit at that level too? Yes: two qubits in a scalable system, so those have to be there. There was a paper where they do arbitrary rotations and they only get somewhat above 99 per cent; that is the kind of thing we are after, one system in which state preparation, initialization, gates and measurement are all at that level. It has been shown piece by piece that each can be done, but not yet all together in one device.
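A back-of-the-envelope illustration of the "300 qubits" remark above (my own arithmetic, not from the panel): brute-force classical simulation stores one complex amplitude per basis state, so memory grows as 2^n.

# Memory needed to store a full n-qubit state vector at double precision
# (16 bytes per complex amplitude). Illustrates why brute-force simulation
# stops at a few tens of qubits.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 45, 300):
    print(f"{n:3d} qubits: 2^{n} amplitudes, about {state_vector_bytes(n):.3e} bytes")

Thirty qubits is roughly 17 GB, 45 qubits roughly half a petabyte, and 300 qubits is astronomically beyond any conceivable memory, which is the sense in which even a modest uncorrected device could outrun brute-force classical simulation.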
[Next panelist, a theorist:] OK, now about Christmas; I guess I will talk about what is not on the list. On the first question, I think there are good candidates underutilized by experimentalists, and I would single out adiabatic quantum computing. If you told an outsider that there are four models of quantum computing, and that one of them, the circuit model, requires a huge overhead and very demanding hardware, their first question would not only be "can you make those requirements better?"; they would also ask "what about the other models?". When you look at those other models, take the adiabatic model: we know it is universal, in that it can do anything the circuit model can do efficiently; on paper it looks very robust; there are arguments suggesting that dephasing might not matter, that you have control robustness because you only have to stay adiabatic and it does not matter exactly which path you take, and that energy-changing errors are suppressed by a gap. Yet in experiment there is really only one game in town, and that is what D-Wave is doing. There is no reason to believe that superconductors are any more amenable to that architecture than any other technology, and it behooves us to look at the various technologies out there and ask whether they can realize adiabatic quantum computing, because there is great potential to reduce the high demands that the circuit model places on us. I think the circuit model is a bit of a siren song: at first it seems very nice, any quantum computation can be decomposed into elementary pieces of information we call qubits and elementary transformations we call gates, and so there is a mandate to work really hard at making these gates very well on these qubits, and an implied contract with the experimentalists that there is a path to putting the pieces together and doing arbitrary things. But that path looks very daunting right now, and I think we should examine alternative paths. I would say the experimental communities are already realizing this is a very difficult path: I see many of my experimentalist colleagues looking at analog quantum simulation, because they see that the gate-based case is very hard. Some groups are still working hard on digital simulation, but more and more are looking at analog simulation, I think because it is more tractable with the techniques they already have. I do have a big open question about the fault tolerance of that model, and I think we should think carefully about how we can combine quantum error correction and the adiabatic model; it is a bit like those Reese's commercials from the seventies and eighties, two great things that taste great together; maybe there is a way to put them together and do something interesting.
To address the second question, what I would like to hear from the experimentalists: I would like to hear what your scalability challenges are. The experimentalists are very good at telling us what the hardware challenges are, what the gates are, how difficult it is to do this or that. Partly, I think, that is because the theorists have drummed the importance of the threshold into them, so they focus, very effectively, on making the individual components better, and the prevailing attitude is that the scalability part is "just engineering". But the theorists are supposed to be providing a path forward, and we need to understand what the requirements are so we can come up with proposals that are plausible. For example, a requirement that is overlooked in, I think, all of the theoretical analyses we talked about today is that these experiments are controlled by classical electronics, typically, and use wires to control things; it turns out that wires have thickness and they run into one another. If your qubits are very small and your wires are big and fat, you are going to have a problem getting the wires to the qubits. That is the kind of issue you need to think about if you imagine putting together more and more pieces of this into a fault-tolerant architecture. So if the experimentalists could convey a little more about the scalability challenges, rather than dismissing them as "just engineering", and if the theorists could take the attitude that it is also not just engineering, and embrace it, that would be helpful. The final question the moderator asked was what modifications theorists could make to be more useful. I think the biggest modification is one of working protocol with people: we need an attitude of more service toward our hardware colleagues. There is certainly communication already; this is a great conference, with real exchange between experimentalists and theorists, and that is excellent. But there is a tendency for a theorist to take a couple of ideas from the experimentalists, and then the wall goes up, they go off and do their own thing, and they are in continuous divergence from reality for a while afterwards. With constant communication, and an attitude that the theoretical effort is there to support the experimental effort, I think we would make a lot more progress.

[Moderator:] I think maybe a second round is in order, with two questions. One was: what are your scalability challenges? The other: how good do you actually think you can make the gates, what sort of accuracy do you think you can reach in the next few years?

[Experimentalist:] Let me give a general comment first. Although it was put politely, at least some theorists seem to have the perception that experimentalists are trying to sweep scalability under the rug, or not address it; I do not think that is a fair picture. OK, the challenges: I will speak for ions and for semiconducting quantum dots. For ions, there are a lot of challenges around how long it takes to move things: if you think about the quantum-CCD model, can you really physically transport the ions, the information carriers, given that there is no quantum wire? Monroe and others are
very interested in this approach where they do local operations using the standard Coulomb-interaction gates on the motional modes, with, say, up to ten ions in one trap, and then use interconversion to photons and heralded entanglement for the large-scale issues. In principle I think it is really beautiful, but there are a lot of challenges about what fidelities can actually be realized in the interconversion: how long does it take to generate an entangled pair with which you can teleport information using these protocols? Those questions almost all come down to photon loss, to reflections from fiber optics, really nasty things that I had never heard of and never particularly wanted to hear about. These are the kinds of questions where I go to a manufacturer and ask what ultra-high-reflectivity specialty coating they can give me at 399 [nm], they quote, say, 99.5 per cent or whatever, that number goes into some calculation, and it still turns out to take 10^6 cycles or whatever to generate the pair. These are really hard practical questions with very significant effects on scalability. I do think people are working on them, for example the NIST experiment on transporting ions through a junction, where they reached very high transport fidelity and loss was essentially never observed; I was involved in an experiment where we did it so many times that we could never observe a loss. So it is a very practical question that, to be honest, is not at all sexy, and so it does not get the same kind of attention as, say, multi-ion cat states. On the semiconducting side, the challenges all relate to noise in the substrate; it is still an unsolved problem, and until we figure out how to make robust two-qubit gates in that technology (I know there has been a recent demonstration using singlet–triplet qubits, but it is still a major challenge at the one- and two-qubit device level), we cannot really have a clear picture of scalability.

[Question:] A minor question, really for everyone: based on experimental results to date, it appears to me that shuttling through junctions is far more practical than optical coupling; optical coupling seems almost insurmountable by comparison, and it is not clear what advantage it would give you.

[Answer:] I am not sure anyone has really thought these things through to the end, how many of each interconnect you would need; these are all ideas. Shuttling is certainly the one that is viable now, and we do it in our laboratory, but I think we have to look for hybrid technologies here. Speaking for the ion trappers this time, because this is my specialty and we talked about it over lunch: my personal view is rather to have logical qubits encoded locally that do not move, maybe with something like a surface-code encoding, and then interconnect them by shuttling ions. That is one way of doing it, but it is inherently slow, so photonic interconnects are, I think, also among the practical solutions. So I think there is a viable path there.
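A toy version of the kind of rate estimate alluded to above (my own illustration; every number is a placeholder, not a measured value): for heralded photonic entanglement between two modules, the expected number of attempts is 1/p_success, so small per-attempt success probabilities translate directly into long link-generation times.

# Toy rate estimate for heralded photonic entanglement between two nodes.
# All parameter values are placeholders chosen only for illustration.
def expected_link_time(p_collect: float, p_detect: float,
                       p_fiber: float, attempt_rate_hz: float) -> float:
    """Both photons must be collected, transmitted and detected,
    and a linear-optics heralding scheme succeeds at most half the time."""
    p_photon = p_collect * p_fiber * p_detect
    p_success = 0.5 * p_photon ** 2
    return 1.0 / (p_success * attempt_rate_hz)

# Example: 1% collection, 95% fiber transmission, 80% detector efficiency,
# 10^5 attempts per second -> average seconds per entangled pair.
print(expected_link_time(0.01, 0.95, 0.80, 1e5))

With these placeholder numbers the link succeeds roughly once in 3x10^4 attempts, illustrating how coating reflectivities and coupling efficiencies feed directly into the "10^6 cycles" style of calculation the speaker describes.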
But again, you have to think about the other approaches and go for other things as well.

[Experimentalist:] I am a little surprised at the suggestion that we should talk about the gory details of the experiments. I can do that if you like; I can go on for another two hours and tell you about all the gory details we want to overcome and for which we have no solution.

[Theorist:] That is not what I meant. I do not mean the gory details; are there not abstract principles that can be drawn when you think about the surrounding control apparatus, whether it is electronic or optical or whatever it is? Let me give an example. One of the assumptions brought up multiple times in the talks was the need for parallelism to do fault-tolerant quantum computing, and a number of technologies have challenges in getting maximal parallelism, so there may be a need for multiplexing of one kind or another, optical, electronic, or what have you. That means you cannot talk to every qubit at every time step, but maybe only every ten or hundred time steps, and that will impact the threshold you can obtain and the resources required to achieve fault tolerance. Understanding, even at a general level, what kinds of things impact something like parallelism is valuable for theorists trying to design architectures.

[Experimentalist:] But it makes a real difference whether you do something with ions in a trap or with atoms in an optical lattice; you really have to talk to the people who run the actual apparatus, and a lot of it cannot be generalized. For example, we have problems like the stabilization of the laser, frequency stabilization and things like that, which are really not generic.

[Theorist:] I think there are some things that can be generalized: certainly the concept of spin echo has been, dynamical decoupling has been applied across many platforms, and pulse-shaping techniques are a general concept, whether the pulses are optical or
electronic, voltage pulses, whatever you have.

[Experimentalist:] Sure, pulse shaping and all of that is shared across fields; this is fairly well known, and these are things that can be controlled relatively easily. But as I said before, you have to know the whole model system to make an impact, and that just takes time. And my experience is just the opposite of what you say: when I talk to the theorists in-house, they are not the least bit interested in these things.

[Theorist:] To be clear, the appeal was not that theorists should do the calculations for you; my appeal was to understand general constraints and the extent to which they can be generalized. To be most valuable to particular hardware they would of course have to be specific.

[Experimentalist:] I have the feeling that research goes ahead slowly, and then suddenly something happens, a whole new discovery, and theorists can quickly jump in and make an advance on that; but if they were all to slave away on the particular nitty-gritty of one platform... this relates to the division of labor we talked about.

[Question:] What do you think limits you; what levels do you expect to reach?

[Experimentalist:] We are trying to improve the technology as much as we can, but it depends on how well we can overcome the noise sources. Right now we are developing better stabilization, so that we can stabilize the phase and the amplitude of the laser and work with that; but that is about it, that is the limit as far as I can see. Then you have to make sure the [trap and fields are] really stabilized, and that is technologically very challenging. What would really help is if theorists came up with protocols that are insensitive to these noise sources. For instance, the gate the way we do it now is insensitive to residual motion; we originally worked with operations that were sensitive to the motion, and we would never have reached 99.5 per cent fidelity that way, nowhere near, whereas now we do it routinely. That is where "physics versus engineering" breaks down: all of these things have to go hand in hand. Then of course, to verify such numbers we would have to set aside measurement time, to be capable of measuring, say, four nines, and that is not easy at all, and you want to be sure you get it right. For example, take tomography: if you look at how people derive the resulting fidelities, you find fidelities quoted for the same state differing by plus or minus five per cent depending on the analysis, and that is just not right. So we have to unify these things, make sure we speak the same language and use the same measurement procedures. We all talk about, say, 99.9 per cent, and once you want to go beyond that you should really talk about infidelities, and those are very hard to measure.
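To illustrate why "four nines" takes real measurement time (my own back-of-the-envelope, not numbers quoted on the panel): if each shot is treated as an independent pass/fail event with error probability eps, the binomial standard error on the estimated infidelity is sqrt(eps(1-eps)/N), so resolving eps itself to a given relative precision fixes the number of shots.

# Shots needed to estimate an infidelity eps to a given relative precision,
# treating each measurement as an independent pass/fail trial (binomial model).
def shots_required(eps: float, relative_precision: float = 0.1) -> int:
    variance_per_shot = eps * (1.0 - eps)
    target_std = relative_precision * eps
    return int(variance_per_shot / target_std ** 2) + 1

for eps in (1e-2, 1e-3, 1e-4):
    print(f"infidelity {eps:.0e}: ~{shots_required(eps):,} shots for 10% precision")

Roughly 10^4, 10^5 and 10^6 shots per data point respectively; at realistic repetition rates that quickly becomes hours of data, which is the sense in which measuring beyond 99.9 per cent is precision metrology.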
This is really precision spectroscopy, precision metrology, and it takes time; it is not at all sexy, because people want to see these things applied to long chains of computations, and it is often not very helpful to just optimize single gates in isolation. So rather than optimizing whole algorithms in one go, we extend our toolbox with subroutines; the subroutines are checked and carefully characterized, and then you can concatenate them later on. I will not try to predict how this develops, but I am sure we will not get the whole thing within the next three years, even if we can extend things to more qubits.

[Panelist:] Could I make a comment on that? We could support that kind of work a lot more. In my opinion, if someone came out with a paper titled "a universal set of gates with minimum fidelity 95 per cent", and that was beyond the current state of the art, it should get into a very high-ranking journal and we in this room should support it; and if six months later someone comes out with the same title but 96 per cent, achieved differently, we should support that too as very high-quality work.

[Experimentalist:] If you knew how much work that is: to show that you have a universal set of gates, and to really characterize it by measurement, is an enormous amount of work. And then do I publish a paper saying I got one per cent more than the other group?

[Panelist:] And why does that happen? Because too many people say it is not valuable, it is incremental, it is just engineering. We should be changing that attitude: it should be considered a big achievement, and it is imperative if we want to make progress.

[Experimentalist:] This goes in phases. Right now we are in a phase where we demonstrate simulations and other things, and that was possible on the basis of the technological progress made over the last three years; now we need another phase where we push the technology again to get to the next level, but that remains to be seen, and I cannot predict how it goes, because, as you say, at the moment, as a community, we have forced the experimentalists to tell stories, to come up with a new story and a new experiment, rather than focusing on the incremental experimental work, which we also need to do. We should reward good work in a good direction, and the funding agencies should too, but they want to see well-ranked papers, so we may be stuck in a loop here.

[Theorist:] Can I make a different comment, about this compiler that was mentioned? I think there is some general work that can be done there, because it sounds as though those things can be engineered once you have the parameters of the physics you are compiling to, and perhaps made robust to all sorts of platform details.

[Experimentalist:] I have been trying to recruit theorists in my department on exactly this, and the answer is that it would be just engineering work, maybe something for computer scientists, while the computer scientists in our department are not willing to step in because they know no quantum physics. The optimization procedure we have only works with a human sitting in front of it as the interface. So I would really like people to look into that, because the way I see it, whatever we do, once we really start
looking into scaling up quantum computing, simulations and the like, we need, for every individual implementation, optimized pulse-level control. My ions are essentially all the same, just at slightly different frequencies; what we need are the optimal pulse sequences to run on them, and we can now do that with amplitude modulation, but that exact recipe is not particularly useful for another platform, which has to do it slightly differently, even though the underlying principles are the same. I would like someone to take that on.

[Experimentalist:] Very quickly, because there was some controversy about what kinds of gate capabilities we can achieve and what problems relate to that, specifically the question about the challenges. It is really important to keep in mind that when we talk about pushing even single-qubit gate infidelities to the 10^-5 level, we are not only talking about the state of the art in quantum computing; we are talking about the state of the art in commercial hardware. You are talking about extraordinary requirements on the phase noise of a master oscillator if you want a gate infidelity, over some period and some number of cycles of your pulse, at the 10^-5 level or below.

[Moderator:] So how low do we actually want the error to be?

[Panelist:] As was said explicitly before: not just low, but as low as you can possibly get.

[Experimentalist:] I am not totally on board with that; there is a lot of classical electrical engineering involved. The kind of box you need costs [a great deal] just to get to the 10^-6 level in phase noise, in principle. So this is a really hard problem, which of course depends on the system.
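A rough translation of that requirement into numbers (my own estimate, not one quoted on the panel): for a single-qubit gate whose dominant error is a small random over- or under-rotation by an angle delta, the average gate infidelity is approximately

% Hedged small-angle estimate for a coherent rotation-angle (phase) error:
1 - F_{\mathrm{avg}} \;\approx\; \frac{\langle \delta^{2} \rangle}{6}

so an infidelity of 10^-5 requires the rms control-phase error over the gate to stay below roughly \sqrt{6\times 10^{-5}} \approx 8 mrad, which is what pushes master-oscillator phase noise and pulse-amplitude stability into "commercial state of the art" territory.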
[Brief unintelligible aside about whether that makes things cheaper or more expensive.]

[NMR/NV panelist:] I would like to say a few words about the scalability issue and where we see potential. One thing I have always missed in the context of liquid-state NMR: it was always said it is not scalable because you do not have pure states, but there has been a lot of work demonstrating that you do not need pure states; there are algorithms which work with mixed states, such as the one-clean-qubit model, where you need only one pure qubit and the rest can be maximally mixed. Still, most of the algorithms that have been published require pure states, so I think there is a lot of potential there that has not been tapped. With liquid state, as was said, you will not get to very many qubits; maybe you could get to tens, and that is certainly not impossible if you are willing to deal with the complications. But if you look at diamond and the NV centers, there you can work with pure states, and you could probably have quite a few qubits if you are willing to handle a complicated network, assuming random doping or random positioning of the centers, because engineering cannot give us equally spaced centers. What people tell me they can do is put, say, fifty NV centers within about 20 nanometers. The problem is that you would first have to completely characterize the system and adapt all the algorithms to the specific couplings of that network, the orientation of the axes and so on. It would be a lot of work to determine the Hamiltonian, but eventually we could do it, and if you can bend the algorithms to such a pre-existing system, that would be great.

[Theorist:] I get the impression that a lot of the experimental groups have deliberately not looked too carefully into this scalability question, for fear that it might look worse than they would want it to be seen; there is a hope that somewhere down the road a solution to the hard problems will appear.

[Experimentalist:] As an experimentalist, I want to do the science at small scale first, and I think nobody wants to end up in the graveyard: if suddenly, as in that roadmap report from 2004 for quantum information processing, scalability shows up as one of the DiVincenzo criteria, one of the metrics, and you get a red dot on that chart; so yes, there is some concern about that, of course.

[Theorist:] So I am asking for a bit more openness, I guess, about the scalability challenges, to see what can be addressed. And actually, I was in the audience, and you did a great service to the community by pointing out that it is not all about having a high threshold. The way I read that paper, it was not "here is a way to get a high threshold"; it was "here is a message to the community: it is not all about threshold, there is more going on than just threshold". That was very important, and I think it is starting to sink in; we are seeing it in the talks at this conference. It behooves us to think carefully about what the challenges are and the extent to which they can be generalized across technologies; a theorist who is not just a house theorist for one platform might be able to address those. There have been times in the past when we thought certain pulse techniques were only good for, say, NMR, for making better
resonance-type experiments, and it was not too many years later that we saw the general principles involved applied to other systems. So that is the appeal, the Christmas appeal.

[Moderator:] Do you have another comment?

[Panelist:] In the first round everyone made great comments, and they all succeeded in being much less obnoxious than I was. There was a lot of good advice: we should learn one another's language, we should avoid infinities. But the best advice came from the colleague who said there should be more theorists like [the one mentioned]; I would do that if I could. One of the things that came up in Andrew's comments is the distinction between digital and analog quantum simulation, which was also mentioned in one of the talks. So here I am again, coming back to: what can we do without quantum error correction in the near term, if we want to reach what we might call quantum supremacy, doing things in the lab that cannot be simulated classically? One approach is analog quantum simulation, and it is a very active field with lots of different systems: ultracold atoms, molecules, people trying to do it with superconductors and so on. The interesting question is: given the limited control in those systems, can one obtain answers to computational problems which are really hard to simulate, given that the systems are noisy and you do not know exactly what Hamiltonian you have? The hope, once people in the field realize that this is what they are trying to do, is to do something that cannot be simulated on digital computers; but on the other hand, since you do not know exactly what system you are simulating, you are left wanting to study robust properties which are not that sensitive to deformations of the Hamiltonian, and how hard those are to simulate is part of the question. We have this prejudice, and it is more than a prejudice, it is a well-founded one, that in the regime where you have noise and no quantum error correction you cannot do things that we cannot do classically; but as a matter of principle, if there is some way of getting the entropy out, then in principle we can still do things that cannot be simulated. I think we as a community should try to give sharper answers to the question: how hard are the things that people are trying to observe or achieve with analog quantum simulators?

[Experimentalist:] I really support that very much, because very often the point of view, especially from the computer-science side, is that if it is not universal and it does not do number-crunching, there is no point. We have done a lot of analog quantum simulations for exactly that reason: to see whether we can do things better, or differently, than you would on a classical computer. So I would really recommend that we explore these things much more, and that we convey the idea that we can get to, as you call it, quantum supremacy quite soon, both ways, even without the full machinery of quantum error correction; I think we should really broaden that. My personal view is that the real applications, in the short and medium term, are not going to be so much number-crunching or simulation; it is also metrology, because what you are doing is trying to enhance metrology with quantum methods, and that can be achieved with just a few
qubits, either digitally or in an analog way.
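For reference, the standard figures of merit behind that remark (my gloss, not spelled out on the panel): with N uncorrelated probes the phase uncertainty of a measurement scales at the standard quantum limit, while entangled probes can in principle reach the Heisenberg limit,

% Standard quantum limit vs. Heisenberg limit for phase estimation with N probes:
\Delta\phi_{\mathrm{SQL}} \;\sim\; \frac{1}{\sqrt{N}}, \qquad \Delta\phi_{\mathrm{HL}} \;\sim\; \frac{1}{N}

so even a handful of entangled qubits can buy a useful constant-factor gain in a clock or sensor, without any error correction.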
I think this is a resource we still have to open up. If you look back, computers were at first used only for number-crunching, for large computations; nowadays almost every dishwasher has a microprocessor built in, which thirty years ago nobody would have dreamed of. In the same way we can potentially enhance every measurement, every measuring device, with quantum methods, and that is something we should pursue. Quantum error correction enriches what we can do there, but it is not a necessary ingredient for delivering on that promise.

[Experimentalist:] There is a flip side to that argument, which ties into something else I feel strongly about: not only can quantum effects and quantum-information techniques be useful for metrology; metrological research is extraordinarily useful for quantum information. To tie this to the cultural discussion we were having about what kind of work gets supported: I strongly believe that if you want this field to succeed and make serious progress, then when there is an open position in your department you should not only hire in quantum information, you should also hire first-class precision metrology, because everything we do depends on ultra-stable lasers and low-noise sources. Look at the bag of tricks in use today: dynamical decoupling is spectroscopy, the atomic clock is spectroscopy, laser spectroscopy underpins it all. So we should support precision metrology to advance quantum information, and vice versa.

[Moderator:] OK, we have about 20 minutes left, and I want to leave some time for questions from the audience.

[Audience member:] Andrew brought up the question of adiabatic quantum computing, which was promptly ignored by all the experimentalists, and I want to force it back onto the table; would you please address it?

[NMR panelist:] I just want to say that we did quite a bit of adiabatic quantum computing. I guess everybody knows that Shor's algorithm was implemented for 15; well, the adiabatic factoring algorithm has been run for 21 and larger numbers, and there has been adiabatic quantum computing for quantum phase transitions and a number of other things. Of course that was done in NMR, and it is in a sense indirect, because one normally considers NMR as implementing the network model, but you can use the network model to simulate adiabatic quantum computing, and it was real adiabatic evolution.

[Theorist:] I think there is a funny asymmetry developing in the community. For gate-model quantum computing, people push very hard, demanding to know what kind of fault tolerance and scalability you can realize if you are an experimentalist; equally hard questions are not generally asked about adiabatic quantum computing, or even topological quantum computing. If topological quantum computing works, it is, first of all, beautiful physics, no doubt about it, but what new problems come into play? There is this deep-dive, really hard, critical look at the scalability of transporting trapped ions for quantum algorithms, yet we brush similar issues aside in other approaches. And about D-Wave being the only game in town: D-Wave has written a paper in which they say they can never realize, with their approach, the kind of couplings that give the equivalence between
adiabatic quantum computing and the circuit model. So it certainly is a useful approach, and they do all sorts of very interesting physics work that has a lot of utility, but I do not think we are approaching the two models' problems in the same way at this point.

[Second theorist:] If I may: I think part of the reticence of experimental groups to get engaged in adiabatic quantum computing is this ongoing war about the computational power of adiabatic quantum computers for solving optimization problems; there is this back and forth, can they or can they not, but that is actually a sideshow, I believe. The real question is universal adiabatic quantum computing, which the D-Wave machine does not do, and I think you first need a sufficiently rich repertoire of interactions to achieve that. We need something like DiVincenzo-type criteria for that model, saying: if you have this set of interactions, this kind of layout, and so on, then you can achieve universal adiabatic quantum computing. That would set a target that experimentalists could look at and say "I have this interaction, dipolar, or exchange, and this kind of layout; can I do it?". So we should not get distracted by the question of optimization and what the D-Wave machine is or is not doing; for universal computing we should be looking at other experimental systems, asking whether they can realize universal adiabatic quantum computing, and in particular whether the theoretical promises of robustness are borne out in the laboratory. It is one thing to say in a paper that a gap protects you from noise and that you have control robustness; demonstrating it experimentally is another. And there is value in doing things adiabatically even in the gate model: certain gates, adiabatic passage and adiabatic transport have a long and rich history, so I think it is a valuable idea to ask whether we can process quantum information adiabatically, and to think seriously about what hardware requirements are needed for that.

[Experimentalist:] Actually, I have a comment on that. You mentioned that adiabatic pulses or gates are used in the gate model, and I really see these not as two separate models: we can implement both in the same processor, the same computer. Some things we do adiabatically, others we do with fast, non-adiabatic pulses; for some operations one works better, for others the other. It has been said a lot that adiabatic is less prone to errors, but I do not think anybody has proved that; I think it depends a lot on what you are actually trying to achieve. For some things it is definitely more fault-tolerant than gates, but for others it is definitely not, and I do not see a general rule that says one or the other is better. I would also like to comment on your remark that the systems have to be universal. I must say I do not really care: what I would like is to be able to implement a specific algorithm, to solve a specific problem; that is also why I mentioned quantum simulations, for which you do not need a general-purpose quantum computer. I think at the moment we are really not at a stage where we need general-purpose computers; what we need is computers that can solve specific problems, and that is
Actually, I have a comment on that. You mentioned that adiabatic processes or gates are used in the gate model, for instance, and I really see these not as two separate models. We can implement both in the same processor, the same computer: some things we do adiabatically, other things we do with pulses, with non-adiabatic pulses, and for some tasks one works better, for others the other. There is a claim that the adiabatic approach is less prone to errors, but I do not think anybody has a proof of that. I think it depends a lot on what you are actually trying to achieve; for some things it is definitely more fault tolerant than gates, for others it is definitely not, and I do not see a general rule that says one or the other is better. I would also like to comment on your other remark that the systems have to be universal. I must say I do not really care. What I would like is to be able to implement a specific algorithm, to solve a specific problem, for instance, as was also mentioned, quantum simulations. Do quantum simulations need a general-purpose quantum computer? I think at the moment we are really not at a stage where we need general-purpose computers. What we need are computers that can solve specific problems, and that is what most experimentalists are working on. Of course the long-term goal would be to have a general-purpose computer, but that is, at least for us, further off.

Just two brief responses to that. Regarding your first comment, I
completely agree that there is not yet a clear answer about the robustness of adiabatic quantum computing; quantum information is an experimental science, and I think this needs to be addressed experimentally, and I would encourage my experimental colleagues to investigate it. And regarding universal adiabatic quantum computing: I agree, I was perhaps overstating the point. I think there is room in the middle to do interesting things: metrology is an interesting example, small-scale simulation, all kinds of things at a small scale. I just think the adiabatic approach is something that has not been explored sufficiently well so far.

OK, let me, as someone whose actual job it is to put programs together and push this science forward, ask the following question. I have heard experimentalists lament the issue of not having certain types of good design tools, or of not having people who can operate those design tools. I would like to understand what design tools are needed. Is there a need for a quantum version of VHDL or Verilog? Is there some sort of quantum SPICE that needs to come into existence? What could be put into such a tool, who could contribute to it, and what would experimentalists like to have in order to move forward? Is there some sort of standard gate set? Maybe that is looking too far into the future, but what pieces, what tools, have to come into existence to make things move forward a little faster?

I think it simply takes a long time: an idea comes very quickly, and then it takes quite a while to put it into practice. I do not think it has to be that way, but I think we have to build some tools around it, and I would like to hear more about that; if anyone has concrete ideas, contact me, because that is what a program can grow out of. A few years ago the analogy that was used was electronic circuits: there a standard developed out of one of these programs, the UC Berkeley SPICE, which was based on a program written years earlier and is now used and run in four or five different variants.

What I really need now for our implementation, and I am sure the same will come for the other implementations sooner or later, is this: because we have such a multitude of gate operations available, there is no uniquely defined sequence for a given task. We need an optimizing compiler that seriously repacks the sequence and just gives us an optimal program, so that we can run it within the coherence time we have available. That coherence time admittedly keeps extending, every year we make some progress, but it is simply not doable by hand without the right program. Ten years ago, when I did the first teleportation experiment, it was only successful because we were able to optimize the algorithm more or less by hand; that was not so complicated, a sequence of about 140 or 150 gate operations. You simply cannot do this by hand anymore, and that is why I personally find this piece is missing. I do not think it needs a huge program; we just need to find someone, or a few people, who really tackle the problem.
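The kind of "repacking" pass meant here can be illustrated with a deliberately small sketch. The gate names, the tuple representation, and the two rules below (cancelling adjacent self-inverse gates and merging consecutive Z rotations) are my own assumptions for illustration; a real compiler would target the native operations and constraints of the specific hardware.

```python
# Minimal sketch of a gate-sequence "repacking" pass (illustrative assumptions only):
# cancel adjacent self-inverse gates and merge consecutive Z-rotations on one qubit.
from math import pi, isclose

SELF_INVERSE = {"X", "Y", "Z", "H", "CNOT"}

def repack(seq):
    """seq: list of (name, qubits, angle) tuples; returns a shorter equivalent list."""
    out = []
    for gate in seq:
        name, qubits, angle = gate
        if out:
            pname, pqubits, pangle = out[-1]
            # Two identical self-inverse gates in a row cancel.
            if name == pname and qubits == pqubits and name in SELF_INVERSE:
                out.pop()
                continue
            # Consecutive RZ rotations on the same qubit merge into one.
            if name == "RZ" and pname == "RZ" and qubits == pqubits:
                merged = (pangle + angle) % (2 * pi)
                out.pop()
                if not isclose(merged, 0.0, abs_tol=1e-12):
                    out.append(("RZ", qubits, merged))
                continue
        out.append(gate)
    return out

seq = [("H", (0,), None), ("RZ", (0,), pi / 4), ("RZ", (0,), pi / 4),
       ("CNOT", (0, 1), None), ("CNOT", (0, 1), None), ("H", (0,), None)]
print(repack(seq))  # -> [('H', (0,), None), ('RZ', (0,), 1.5707...), ('H', (0,), None)]
```

A single pass like this can leave newly adjacent pairs uncancelled, so a real tool would iterate such rules to a fixed point and then weigh the shortened sequence against the available coherence time.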
A lot of these tools have already been built, but they are not very usable by the larger community. With something like GRAPE you can derive all sorts of robust pulse sequences, but it is like a fine art: you need someone to take all the constraints, all the particular physical parameters, what control Hamiltonians are available to you, what powers you can apply and on what time scales, and feed that in to get an answer. Up until now I have not really seen a general tool available.

It is not a general tool in the sense of a universal compiler, but of course there is a scientific field in the background, which is optimal control theory, and that is what people use; it is the basis for the GRAPE algorithm and all the other techniques that are used in magnetic resonance and in laser spectroscopy. The point is that it has to be adapted to the various needs, and that is closely related work that really calls for serious theoreticians; I think many of them do not want to get into it, and I do not feel able, as an experimentalist, to do it alone. We have done it in specific cases, but it certainly needs some serious theoretical effort first.
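For readers who have not met it, GRAPE-style pulse design iterates gradient updates on a piecewise-constant control to maximise the fidelity of the resulting unitary. The toy sketch below is an assumption-laden stand-in, a single qubit, one sigma_x control, and a finite-difference gradient rather than the analytic GRAPE gradient, meant only to show the loop structure that such packages automate and that has to be fed with the hardware constraints mentioned above.

```python
# Toy sketch of GRAPE-style pulse optimisation (illustrative assumptions only):
# piecewise-constant sigma_x control steering a single qubit toward an X gate,
# using a finite-difference gradient instead of the analytic GRAPE gradient.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
U_target = X                       # target unitary
n_steps, dt = 20, 0.05             # 20 slices of duration 0.05 (arbitrary units)

def propagate(amps):
    U = np.eye(2, dtype=complex)
    for a in amps:
        U = expm(-1j * a * X * dt) @ U
    return U

def fidelity(amps):
    U = propagate(amps)
    return abs(np.trace(U_target.conj().T @ U)) ** 2 / 4.0

rng = np.random.default_rng(0)
amps = rng.normal(0.0, 1.0, n_steps)          # initial guess for the pulse amplitudes
eps, lr = 1e-6, 5.0
for _ in range(200):
    base = fidelity(amps)
    grad = np.zeros(n_steps)
    for k in range(n_steps):                  # finite-difference gradient
        shifted = amps.copy()
        shifted[k] += eps
        grad[k] = (fidelity(shifted) - base) / eps
    amps = amps + lr * grad                   # gradient ascent step
print("final fidelity:", round(fidelity(amps), 6))
```

Everything the panelists describe as the "fine art", realistic drift Hamiltonians, amplitude and bandwidth limits, robustness terms, enters as extra structure in the propagation and the cost function.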
Maybe I can continue, because I think there is no single answer to your question; there are dozens, and basically every implementation has its own needs. In the semiconductor community, of course, a lot of this is present already from the microelectronics industry; they have to advance it to actually get it going, but at least they can rely on the existing foundry basis and build on an excellent foundation, which we do not have, for instance, for ions. The hardware side is different for every system. In diamond, there are people now developing implantation schemes where they can deposit single nitrogen ions with a precision of a few nanometres, but these are small efforts distributed around the globe, and again it is a completely different issue; it requires people from ion-beam physics, from materials science, from electronics, and so on. If you go to semiconductors or superconductors, it is again a completely different technology basis in each case. From my perspective the semiconductor people have the best basis to build on, although the people doing superconductors do, to some degree, rely on the same lithographic tools.

It might not only be a matter of lithographic tools. When you are building SQUIDs or LC resonators and making quantum devices, you have to start asking questions about tolerances, and about how the tolerances you achieve translate back into the error models. Can you do that at the design level? That is the point I am asking about, and I am hearing a great silence from the theory side, so maybe these problems are all solved and nobody wants to say so.

Actually, I was trying to think about whom I could talk to. I have a few names in mind of students I have worked with in the past on these kinds of problems, but first I need to be convinced that they can really do
better than what you can already do in your own labs. To some extent my initial feeling is: can we really do better than you can do already? But I would certainly like to help. We have done GRAPE-type work, things that took a lot of effort to understand what is actually happening in the apparatus and what controls are possible, but those were very tailored solutions, and it would be great to have a more widely distributed software package, possibly something like the quantum SPICE you mentioned.

I just want to emphasize that one of the challenges we face is how different even small variations in implementation are. Two ion-trap groups can look barely distinguishable from the outside, except that one uses calcium ions with an optical transition while I favor beryllium ions with a hyperfine qubit; the transition frequencies are totally different, the capabilities are very different, and even when it comes to the gates, a Mølmer-Sørensen gate versus a geometric phase gate, it is very challenging to build a single model that addresses even those differences. There is so much variation in play. That said, I do think there is a huge amount of opportunity in taking the however-many decades of control theory research and translating it into the language we actually use in the lab, if we have the Rosetta stone, a person who can move in both worlds; that would be extremely useful. There are a few people like that, but everybody has lots and lots of interests, many-body physics and all these things, and it is of course hard to tell anybody to work on any one thing. Still, that is an area where I see a lot of opportunity, if we had a better tool set for translating existing control concepts into what we actually do.

In the interest of time we should continue; are there any other questions? Maybe one last one.

This is a question for the experimentalists, about overhead. People work on the overhead in fault-tolerance models and try to make it as low as possible. Looking forward, would you be willing to trade higher-fidelity gates for better overhead?

This is a very tricky, very hard question, and it depends very much on what you really want to do. When you are talking about a computer architecture, the overhead you are talking about depends on how it is laid out. Someone already mentioned the wiring, someone already mentioned the scalability issues in ions, in light, and so on; the overhead is enormous, and it depends very much on how you do the layout of the system. To give an example: if you really want to build a large-scale ion chip, it would be nice to have most of the wiring underneath the chip, optical and electronic, the readout and all the rest, and that is one kind of trade-off. If you stay small scale, you can just add a few more particles. It very much depends on what you want. Do we want a supercomputing, number-crunching device? Then you really have to start thinking about a real architecture. For the time being, at least in my laboratory, we are not talking about 300 qubits; we are talking about maybe 20, and maybe 40 qubits in five years, and for that the overhead still seems manageable. I would
rather try to manage those 40 qubits well than think too much about a complicated, or let us say a more economical, large-scale architecture. I do not have the time right now to think about scaling this up to large-scale systems myself, but there are groups that investigate these things quite a bit, and we are collaborating with them; they have also been writing something like a quantum SPICE right now, just for comparison, and I personally think that is a nice tool with which we can start thinking in that direction and see what kind of specifications are needed on the other side. But I cannot really answer your question in general, because we do not have a unified picture, not even for ions: whether you have one big trap, many small traps, a segmented trap or a Penning trap, the infrastructure and the overhead depend on the implementation, and it is different again if you are talking about superconducting qubits, or semiconductors, or atoms in optical lattices.

Let me clarify the question. I was asking about the overhead involved in fault-tolerant protocols. Theoretically that is obviously quite high, and the question is whether we can bring it down. Would you aim at tolerating relatively higher error rates and paying the overhead needed to correct them, or is it worth pushing for lower error rates so that the overhead can come down?

Again, that depends on the implementation. If qubits are cheap, for example in a solid-state device where qubits are cheap and you can manage all the wiring and addressing issues, fine, then you can go for fault tolerance with a large code straight away; in optical lattices qubits are also cheap. So it really depends on where the overhead sits in the implementation and what error rates you have. In ions I would probably not do two levels of error correction; I would rather stay with the encoding at one logical level than go to something like the 49-qubit concatenated code.

Let me make a comment on that: even though qubits might be cheap, I think control is expensive. Even if you have many qubits, if you want individually addressable control that is scalable to huge numbers of qubits, it is just too costly. This goes back to my earlier comments about global control: if you can do global control, that can remove a lot of the addressing and technical overhead in dealing with this concatenation.

On the same point: when I heard about this 49-qubit code, people were half joking, because really we would be happy with much less. We have been working on a 5- and a 7-qubit code right now; it is hard, it is working, but I see absolutely no hope of encoding in 49 physical qubits in the near future. I would rather have an order of magnitude better gates on a 7-qubit system, if those are the numbers you are thinking of. Unfortunately, for small systems the options are very limited; there are simply not that many codes available.

But thinking in the long term, as we scale up: suppose you do have hundreds of thousands of qubits. Would you rather be spending, say, 100 physical qubits per logical qubit with relatively higher error rates, or 10 to 1, or 5 to 1, with lower error rates?

The future is hard to predict. As the others said already, it depends on the implementation; it is impossible to say in general.
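To put rough numbers on the trade-off being debated, the textbook estimate for a distance-3 concatenated code such as the seven-qubit Steane code is p_L ≈ p_th (p / p_th)^(2^k) after k levels of concatenation, with a factor of seven in qubit overhead per level. The threshold and target values in the sketch below are assumptions picked only to show the shape of the trade-off; they are not figures quoted by anyone on the panel.

```python
# Back-of-envelope sketch (illustrative assumptions, not panel figures):
# physical-qubit overhead of a concatenated [[7,1,3]] code needed to reach a
# target logical error rate, using p_L ~ p_th * (p / p_th) ** (2 ** k).
P_TH = 1e-4          # assumed threshold for this estimate
TARGET = 1e-15       # assumed target logical error rate per gate

def overhead(p_phys, p_th=P_TH, target=TARGET, max_levels=10):
    """Return (levels, physical qubits per logical qubit), or None if p >= threshold."""
    if p_phys >= p_th:
        return None
    for k in range(max_levels + 1):
        p_logical = p_th * (p_phys / p_th) ** (2 ** k)
        if p_logical <= target:
            return k, 7 ** k
    return None

for p in (5e-5, 1e-5, 1e-6):
    result = overhead(p)
    if result:
        levels, qubits = result
        print(f"p = {p:.0e}: {levels} levels, {qubits} physical qubits per logical qubit")
```

Under these assumed numbers, improving the physical error rate from 5e-5 to 1e-6 cuts the requirement from six levels of concatenation to three, which is the sense in which better gates buy you lower overhead.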
I think it is more likely that we will see continued advances on the error-rate side than in how many components we can pack into a unit of space. You run into some fundamental limits there; we talked about how much control circuitry you can fit underneath an ion trap, and I think those limits are closer than whatever limits we face on laser amplitude stability and phase noise.

OK, I think we are out of time. Let us thank the panelists; there is a poster session upstairs.

Metadata

Formal Metadata

Title Panel moderated by Byrd
Series Title Second International Conference on Quantum Error Correction (QEC11)
Author Byrd, Mark
License CC Attribution - NonCommercial - NoDerivatives 3.0 Germany:
You may use, copy, distribute, and make the work or its content publicly available in unaltered form for any legal, non-commercial purpose, provided you credit the author/rights holder in the manner specified by them.
DOI 10.5446/35314
Publisher University of Southern California (USC)
Publication Year 2011
Language English

Content Metadata

Subject Area Computer Science, Mathematics, Physics
