Intel recently announced the delivery of a 17-qubit superconducting test chip for quantum computing to QuTech, its quantum research partner in the Netherlands. The chip's computing power isn't a revelation on its own, but these are the early days of a rigorous, long-term effort.
Quantum computers promise to solve problems that are out of reach of traditional computers, whether because of their complexity or the processing speed required. The technology could, for example, simulate nature for research in chemistry, materials science, and molecular modeling.
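To see why the qubit count matters, consider that describing an n-qubit register classically takes 2^n complex amplitudes, so the state space grows exponentially. The following minimal sketch (not Intel's or QuTech's code, just an illustration using NumPy) builds the state vector for a 17-qubit register in an equal superposition, the size of the chip delivered here:

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Return the state vector of n qubits in an equal superposition.

    The vector has 2**n_qubits entries, each with equal amplitude,
    which is why classical simulation costs explode as qubits are added.
    """
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(17)  # the size of Intel's test chip
print(state.size)  # 131072 amplitudes for just 17 qubits
# Sanity check: the probabilities across all amplitudes sum to 1.
print(np.isclose(np.vdot(state, state).real, 1.0))  # True
```

Every additional qubit doubles the memory a classical simulation needs, which is one concrete sense in which these machines reach beyond traditional computers.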
Creating stable, uniform qubits is not easy, and it remains a challenge for the immediate future. Qubits are fragile: noise from the environment can easily destroy the data they hold. For this reason the chip must operate at about 20 millikelvin, roughly 250 times colder than deep space.
That isn’t easy when the field of computing they’re attempting to enter is still largely theoretical. That’s why partners like QuTech, a research institute under TU Delft, are essential. Intel isn’t short on R&D resources, but a dedicated facility at a major technical university is likely more fertile ground for this kind of project.
The basic relationship is that Intel makes the chips and QuTech tests them with the latest algorithms, models, and instruments. The researchers then turn around and say something like “that was great, but we’ll need at least 14 qubits to do this next thing, and we saw a lot of interference under such and such conditions.” Intel jots it down and, a few months later (there’s no set timeline), out comes a new chip, and the cycle repeats.
There’s a long way to go in the quantum computing world, but it’s a no-brainer for companies like Intel to bet on the concept; its billions of dollars in fabrication infrastructure make it well placed to do so. The possibilities are enormous, even if practical applications are still years away.