Highest Rated Comments
qiskit · 85 karma
This is quite difficult; we won't know for sure until the technology gets there, but I imagine physics/chemistry simulations will probably have the biggest positive effect on individuals. There are thousands of proposed applications across all sorts of different cases, but it's difficult at a glance to know which will actually be useful and which are just noise.
I don't know much about AI/ML and what challenges it currently faces, but there is definitely a lot of research in that area. You might be able to read more about some of the algorithms here.
-- Frank
qiskit · 48 karma
I'm still more of a scientist than a corporate shill (or at least I hope I am), so I'm still very interested in seeing multiple approaches being taken. There's still a lot of fundamental physics to discover by learning how to isolate and manipulate a good qubit in different quantum systems.
At IBM Research we also maintain enough curiosity to look into other approaches, at least in a more limited way. We are part of a consortium in Switzerland (I'm at IBM Research - Zurich) that looks into spin qubits, for example.
Nevertheless, superconducting qubits show great potential for scalability and allow us to commit to a firm roadmap for building ever bigger devices. So they are definitely our main focus.
--James
qiskit · 43 karma
I know this will be unsatisfying, but I honestly can't tell. At some point, we can guarantee some kind of speedup for things like protein folding through Grover's algorithm, but the computers will need millions of error-corrected logical qubits to outperform current classical computers (unless some smart people work out a better quantum algorithm).
There are a lot of heuristic quantum algorithms coming out that claim to work better on noisy devices, but many have no guaranteed speedup, so we won't know for sure whether they're actually useful until we run them on a big enough computer.
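To make the Grover point concrete, here is a toy sketch in Qiskit (my own minimal example, nothing like a protein-folding circuit): a two-qubit Grover search whose oracle marks the state |11>, simulated with qiskit.quantum_info.Statevector. At this size, a single Grover iteration already finds the marked state.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)

# Start in an even superposition over all four basis states.
qc.h([0, 1])

# Oracle: flip the phase of the marked state |11>.
qc.cz(0, 1)

# Diffuser: reflect all amplitudes about their average.
qc.h([0, 1])
qc.x([0, 1])
qc.cz(0, 1)
qc.x([0, 1])
qc.h([0, 1])

# After one iteration the marked state has probability ~1.
print(Statevector(qc).probabilities_dict())  # ~{'11': 1.0}
```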
Something I forgot to mention was that random number generation could also be a contender for an early use case (and I actually have a source for this one).
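That one is also easy to sketch: put qubits into superposition with Hadamard gates and measure them. Below is a minimal Qiskit version (sampling from the ideal Statevector is my own shortcut to keep the example self-contained; certified-randomness protocols are considerably more involved).

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Eight qubits in equal superposition: each measurement
# yields one uniformly random byte.
n_bits = 8
qc = QuantumCircuit(n_bits)
qc.h(range(n_bits))

# Draw a few random bitstrings from the ideal distribution.
print(Statevector(qc).sample_counts(shots=4))
# e.g. {'01101100': 1, '11010001': 1, ...}
```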
-- Frank
qiskit · 36 karma
Quantum Turing machines do exist, and James wrote this nice piece about how we prove certain sets of operations are universal. If you pick a universal set of quantum gates, that would be your most basic, complete set of instructions. In this model, the inputs and outputs are qubits.
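As a sketch of that "basic instruction set" idea: Qiskit's transpile can rewrite any circuit into a chosen universal basis. The particular basis below, {cx, rz, sx, x}, is my example (it's a common hardware basis), but any universal set would do.

```python
from qiskit import QuantumCircuit, transpile

# A circuit written with 'high-level' gates: Hadamard and Toffoli.
qc = QuantumCircuit(3)
qc.h(0)
qc.ccx(0, 1, 2)

# Rewrite it using only gates from a small universal set;
# every quantum algorithm compiles down to instructions like these.
compiled = transpile(qc, basis_gates=['cx', 'rz', 'sx', 'x'])
print(compiled.count_ops())  # only cx, rz, sx, x remain
```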
-- Frank
qiskit · 174 karma
I think that a lot of people don't think too deeply about what normal computers are, so it's good to start there.
At a simple level, a computer takes in information, processes it and then spits out other information.
In a normal digital computer, that information is expressed in binary: rewritten in an alphabet of zeros and ones. The processing is done by algorithms, but these are all compiled down to simple operations at the level of bits, such as the AND operation, which checks whether two bits are both 1 and outputs a 1 if they are. With lots of these basic 'atoms of computation', as I call them, you get a computer.
The first section of our textbook is called 'The Atoms of Computation' for this reason, and aims to show that a lot of these basic facts are true for quantum computers too. They are still about processing information expressed in binary.
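For instance, the AND operation above has a direct quantum counterpart: the Toffoli gate writes the AND of two qubits onto a third. Here is a quick sketch (my own toy example, in Qiskit):

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# A reversible quantum AND: the Toffoli (CCX) gate flips the
# target qubit only if both control qubits are 1.
qc = QuantumCircuit(3)
qc.x([0, 1])     # set both input qubits to 1
qc.ccx(0, 1, 2)  # qubit 2 becomes 1 AND 1 = 1

# Qiskit prints bitstrings with qubit 2 leftmost: |111>.
print(Statevector(qc).probabilities_dict())  # {'111': 1.0}
```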
The difference is that the bits are implemented in a different way. They are not just some simple voltage that is high or low. Instead, they are implemented with an object that obeys quantum mechanics. Various objects can be used (we use superconducting circuits at IBM Quantum). The important thing is just that it is a system described by quantum mechanics, and that you can find a pair of states to store a 0 and 1.
The fact that it is described by quantum mechanics means that you have more basic manipulations to compile your algorithm down to. You can make use of quantum effects such as superposition and entanglement. And it's been shown that this allows algorithms to be designed in completely different ways, and to solve some problems much, much faster than conventional computers could ever dream of.
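A two-line illustration of those effects in Qiskit (my own minimal example): a Hadamard puts one qubit into superposition, and a CNOT then entangles a second qubit with it, producing a Bell state in which the two qubits always agree.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Superposition + entanglement: the Bell state (|00> + |11>)/sqrt(2).
qc = QuantumCircuit(2)
qc.h(0)      # qubit 0 into an equal superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

# Measurements of the two qubits are perfectly correlated.
print(Statevector(qc).probabilities_dict())  # ~{'00': 0.5, '11': 0.5}
```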
I've gone well beyond your brief of being brief now, so I'm going to stop. Hope that helps!
--James
edit: fixed a Boolean logic error