As the quantum computing industry continues to move forward, so do the goalposts.
The long-sought goal was to achieve quantum "supremacy": demonstrating that a quantum computer can perform a calculation that no conventional computer on Earth can, with no requirement that the calculation be practically useful.
Google claimed to have achieved that goal in a breakthrough scientific paper in 2019, though IBM, in particular, expressed skepticism. In any case, it was a computer science exercise with no practical relevance in the real world.
Since Google's announcement, the industry has stepped up its efforts to achieve quantum "advantage," defined as delivering a business or scientific benefit by exceeding the computing power of the largest supercomputers on a given application.
As a benchmark, quantum advantage is certainly more useful than quantum supremacy. It is often associated with achieving significant advances in drug discovery, financial trading, or battery development.
However, quantum advantage misses an important point. Do we really have to wait for million-qubit quantum mainframes, those steampunk golden chandeliers, to outperform supercomputers before we consider them meaningful? Or should we instead focus on measuring performance gains against the units of hardware used in today's classical computers, such as individual CPUs (central processing units), GPUs (graphics processing units), and FPGAs (field-programmable gate arrays)?
That is because a more valuable goal for this still-nascent industry is to achieve quantum "utility" as soon as possible. Quantum utility is defined as a quantum system outperforming classical processors of comparable size, weight, and power in a similar environment.
Anyone who has looked closely at quantum computing knows the enormous impact it will have on IT, business, the economy, and society. A future of quantum supercomputing mainframes with exponential speedups, error-corrected qubits, and a quantum internet will be very different from the one we live in today.
That said, like the classical mainframes of the 1960s, quantum mainframes will remain large, fragile machines for the foreseeable future, requiring ultra-low temperatures and complex control systems to operate. Even when fully functional, only a small number of quantum mainframes will be deployed in supercomputing and cloud facilities around the world.
The quantum computing industry should emulate the success of classical computing. With the advent of personal computers in the late 1970s and early 1980s, IBM and other companies could sell new models every year, each an incremental improvement over its predecessor. That market dynamic is what propelled Moore's Law.
Quantum computing needs a similar market dynamic to scale and thrive. Investors cannot be expected to keep pouring money into quantum computers while waiting for them to outperform supercomputers. Annual releases of new, improved, and more "useful" quantum computers would provide the revenue stream that underwrites the long-term investment needed to unlock the technology's full potential.
With a steady supply of quantum systems useful for a variety of applications, there would be no reason to queue up for compute time on one of the few huge quantum mainframes available in the cloud when a quantum processor could sit right next to, and be integrated with, your existing classical system. Your application may need an immediate calculation that "quantum in the cloud" cannot deliver in time, or cloud access may not be available at all, forcing you to rely on on-premises or on-board computing.
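The trade-off described above can be sketched as a simple deadline check. All of the numbers below are illustrative assumptions, not figures from the article: an edge workload with a hard real-time budget cannot wait on a remote, queued quantum mainframe, while a slack-rich batch job can.

```python
# Illustrative sketch (all numbers are assumptions): when can an application
# tolerate a cloud-hosted quantum mainframe, and when does it need a local
# quantum accelerator next to its classical system?

def can_use_cloud(deadline_ms: float, round_trip_ms: float, queue_wait_ms: float) -> bool:
    """True if the cloud service can respond within the application's deadline."""
    return round_trip_ms + queue_wait_ms <= deadline_ms

# A self-driving car's perception loop (assumed 50 ms budget) vs. an assumed
# 80 ms network round trip plus a 500 ms job queue: cloud is too slow.
print(can_use_cloud(deadline_ms=50, round_trip_ms=80, queue_wait_ms=500))

# An overnight chemistry simulation (hours of slack) tolerates the cloud easily.
print(can_use_cloud(deadline_ms=3_600_000, round_trip_ms=80, queue_wait_ms=500))
```

The point is not the specific numbers but the shape of the decision: whenever the left-hand side of that inequality exceeds the deadline, only local (on-premises or on-board) quantum hardware will do.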
Expanding on the concept of quantum utility, we can imagine scenarios such as:
- Signal and image processing for autonomous and intelligent technology at the network edge, in robots, self-driving cars, and satellites.
- Industry 4.0 endpoint applications, such as digital twins in manufacturing facilities.
- Distributed network applications, such as battlefield situational awareness in defense.
- Classical computing accessories that provide an on-demand boost for laptops and other consumer devices.
Realizing these quantum "accelerator" applications in the next few years will require room-temperature quantum computing in a small form factor. Several approaches are being pursued, but the most promising uses so-called nitrogen-vacancy centers in diamond to create qubits.
Realizing the technology
Room-temperature diamond quantum computing works by leveraging an array of processor nodes, each consisting of a nitrogen-vacancy (NV) center, a defect in an ultra-pure diamond lattice, together with a cluster of nuclear spins. The nuclear spins act as the computer's qubits, while the NV center acts as a quantum bus, mediating operations between the qubits and their inputs/outputs.
The main reason diamond quantum computers can operate at room temperature is that the ultra-hard diamond lattice acts as a kind of quantum mechanical dead zone in which qubit states survive for hundreds of microseconds.
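To see why a coherence time of "hundreds of microseconds" matters, a rough back-of-the-envelope estimate helps. The coherence and gate times below are illustrative assumptions, not measured values from the article: they show how coherence time bounds the number of sequential gate operations a qubit can perform before its state decays.

```python
# Back-of-the-envelope estimate: how many gate operations fit within one
# coherence window? Both numbers below are illustrative assumptions.

coherence_time_us = 300.0   # assumed qubit coherence time: hundreds of microseconds
gate_time_us = 0.1          # assumed duration of a single gate: 100 nanoseconds

max_gates = int(coherence_time_us / gate_time_us)
print(f"Roughly {max_gates} sequential gates fit in one coherence window")
```

Under these assumed figures, a few thousand operations fit in a single coherence window, which is enough for short algorithms and error-mitigated workloads even without full error correction.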
Quantum scientists at the University of Stuttgart in Germany pioneered many of diamond quantum computing's achievements in algorithms, simulation, error correction, and high-fidelity operations. However, attempts to scale these systems beyond a small number of qubits ran into obstacles with qubit manufacturing yield and precision.
Since then, Australian quantum scientists have discovered ways to address the scaling issues, as well as the miniaturization and integration of the electrical, optical, and magnetic control systems of diamond quantum computers. Their research allows qubit counts to scale up while the size, weight, and power of diamond quantum systems scale down.
The scientists have also demonstrated that compact, robust quantum accelerators are feasible for massively parallel applications, spanning robotics, autonomous systems, and mobile satellite platforms, as well as molecular-dynamics simulation for drug design, chemical synthesis, energy storage, and nanotechnology.
Thanks to the unique advantages of diamond-based computing, research is now under way globally at major academic institutions, including the University of Cambridge and Harvard. The Australian National University's diamond-based quantum computing research has entered its first commercialization stage.
Other quantum computing technologies that operate at room temperature in relatively small form factors, including trapped-ion and cold-atom quantum computers, are also advancing. However, they require a vacuum system and/or a precision laser system. One quantum computing start-up has successfully developed a trapped-ion system that fits in two server racks, but it is unclear whether such systems can be made much smaller.
Readjusting assumptions
To realize the vision of quantum accelerators delivering quantum utility across the industry, the technology must be compatible with scalable semiconductor manufacturing processes that form the qubits and integrate them with robust, low-maintenance, long-lived control systems. As classical computing has shown, the best way to do this is to develop miniaturized, integrated quantum chips.
As with the first transistors in the early days of ubiquitous classical computing in the 1960s, the key technical challenge in achieving broad quantum utility lies in manufacturing integrated quantum chips. But, as with classical computing, that manufacturing will make the devices easier to use and deploy.
Early useful quantum systems will have significantly fewer qubits than quantum mainframes, but once the first integrated chips are manufactured, they have the potential to become the focus of the industry and of future markets.
The downstream impact is almost unimaginable: there may be no area in which room-temperature quantum systems do not bring about fundamental change in how problems are solved. So there is a clear message for all product designers, software developers, market forecasters, and social observers: it's time to start thinking about quantum computing.
In the near term, useful quantum computers will disrupt supply chains, and even entire value chains, on a large scale. Being prepared for that impact means understanding not only the technology but its economic consequences as well. And, of course, technology moving this fast offers remarkable investment opportunities.
Quantum utility also means that the quantum future can be a diverse one, with accelerators running side by side with mainframes and deployed for a variety of reasons and applications. That fosters a groundswell of cooperation rather than direct competition, accelerating innovation and adoption across the quantum industry.
Why quantum ‘utility’ should replace quantum advantage – TechCrunch