Electronic Products & Technology

Quantum computation center set to radically alter our data perceptions

By Bob Sutor, vice-president, IBM Q Strategy & Ecosystem, IBM Research   

Estimated global data usage to continue rising into zettabytes

Every person on the planet will create 1.7MB of data per second by 2020, according to an IDC report – roughly 40 zettabytes in a year, where a zettabyte is a 1 followed by 21 zeros. For comparison, that same report estimated global data would grow to just 2.7 zettabytes in 2012. All this data passes through aptly named datacenters, which currently sprawl across 1.94 billion square feet around the globe, from the Arctic Circle to under the sea (for another comparison: that’s about 25 square miles larger than Vancouver). Our notion of how datacenters operate and grow is about to change radically with the arrival of a completely new kind of computer: the quantum computer.

IBM scientists work in the IBM Q computation center at the Thomas J Watson Research Center in Yorktown Heights, New York. The new center houses IBM’s most advanced quantum computers, which are accessible via the IBM Cloud to organizations participating in the IBM Q Network. (Photo by Connie Zhou)

Quantum computers entered the public consciousness in 2016, when IBM put a 5-qubit system on the cloud for anyone to experiment with. Today, IBM has several quantum computers, including an open, public 16-qubit system and a 20-qubit commercial system – housed in a computation center that a dozen companies, universities and US government labs are tapping into for case studies and application research.

I say “computation center,” not full-blown datacenter, because quantum computers will perform computations in coordination with traditional classical computers, which will handle the data tasks. Quantum computers operate in a fundamentally different way and may accelerate some of the underlying calculations in areas such as AI, for example.

Fragile, powerful: Tapping into a quantum computer

A qubit, short for quantum bit, is the building block of a quantum computer. It’s analogous to a classical computer bit in that it holds information to be processed. However, instead of representing just a 1 or 0, as in a classical system, a qubit’s quantum state can be a superposition – a weighted combination of 0 and 1 – during operation. Like a classical bit, though, a qubit will read out as just 0 or 1 when you look at the answer at the end of the computation.
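The arithmetic behind that idea can be sketched in a few lines of plain Python: a qubit’s state is a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. This is only an illustration of the math, not how real hardware is programmed:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.
zero = (1 + 0j, 0 + 0j)            # the classical-like state "definitely 0"

def hadamard(state):
    """Put a qubit into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """On measurement the qubit collapses to 0 or 1 with these probabilities."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

plus = hadamard(zero)
print(probabilities(plus))   # ~ (0.5, 0.5): a fair coin until you look
```

Until it is measured, the superposed qubit carries both amplitudes at once; looking at it forces the 0-or-1 outcome described above.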

This exponential capacity to represent additional states is what excites scientists about qubits – and now developers and business leaders, too. Just 50 of them can represent more than one quadrillion values simultaneously. That’s more combinations of zeros and ones than any supercomputer on earth can track.
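The scaling behind the quadrillion figure is simply powers of two: n qubits span 2^n basis states. A quick check of the 50-qubit claim:

```python
# n qubits span 2**n basis states; check the figure for n = 50.
n_states = 2 ** 50
print(n_states)             # 1125899906842624 – just over one quadrillion
print(n_states > 10 ** 15)  # True: more than one quadrillion values
```

Adding a single qubit doubles the state space, which is why even modest qubit counts outrun classical memory so quickly.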

Stabilizing a quantum computer’s qubits requires cooling the chips with liquid helium to 15 millikelvin – a fraction above absolute zero, and colder than outer space. The system’s 2,000-plus parts are protected from the elements inside a pillar-shaped dilution refrigerator that hangs from the lab’s ceiling – definitely not a traditional server. Without that isolation, any disturbance or noise from light, movement, or sound will collapse the qubits’ quantum states.

Input – a signal from an algorithm execution, sent by a classical computer – moves through the dilution refrigerator’s microwave lines, down through a mixing chamber that cools and attenuates it until it reaches the cryoperm shield at the bottom, where it is processed by the qubits. Output – the measurement of the qubits’ calculation – travels back up the system through amplifiers and coaxial cables (at a balmy 4 kelvin) to be read as ones and zeros by a classical computer, then sent back through the cloud to the user.

Even with every setting perfect, qubits are only coherent – available for access and measurement – for about 100 microseconds. In other words, we have less than one-three-hundredth of an eye blink to ask this exponentially powerful device a question. And the answer can only be a “yes” or a “no.”

These early days of quantum computers hearken back to room-sized mainframes, which were large and fragile – and, as we know now, full of potential. So before you decide to wait for scientists to eliminate decoherence and build fault-tolerant qubits, know that these complex, cold systems with brief execution times can still offer “quantum advantages” worth exploring.

That’s why 70,000 users have signed up to run more than two million experiments on the IBM Q Experience – and, to date, have published at least 50 papers based on those experiments. It’s also why a dozen Fortune 500 companies, academic institutions, and US government labs signed up with the IBM Q Network. They want to be “quantum ready” with capabilities and applications for their industry that offer advantages over classical computers.

Getting quantum ready means developing quantum algorithms. Last year, our scientists used algorithms running on a 7-qubit system to model beryllium hydride – the largest molecule simulated on a quantum computer to date. Today, IBM Q Experience users have access to a 16-qubit system, while IBM Q Network clients have access to 20 qubits, and IBM scientists have announced a prototype 50-qubit machine. Users can write their own algorithms and test them via the cloud. For example, one professor created a quantum battleships game, while a major financial company is working on quantum applications in trading strategies, portfolio optimization, asset pricing and risk analysis.
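To give a feel for what “writing your own algorithm” involves, here is a minimal, library-free Python sketch of one of the first circuits newcomers run: a two-qubit entangling circuit (a Hadamard followed by a CNOT), simulated by hand. The helper names are mine for illustration – the actual cloud service provides its own SDK:

```python
import math

# Two-qubit statevector: amplitudes for basis states |00>, |01>, |10>, |11>,
# indexed so that qubit 0 is the least-significant bit.
state = [1, 0, 0, 0]                  # start in |00>

def apply_h_q0(s):
    """Hadamard on qubit 0: mixes each amplitude pair that differs in bit 0."""
    r = 1 / math.sqrt(2)
    out = s[:]
    for i in (0, 2):                  # pairs (|00>,|01>) and (|10>,|11>)
        out[i], out[i + 1] = r * (s[i] + s[i + 1]), r * (s[i] - s[i + 1])
    return out

def apply_cnot(s):
    """CNOT with qubit 0 as control: flips qubit 1 when qubit 0 is 1."""
    out = s[:]
    out[1], out[3] = s[3], s[1]       # swap |01> <-> |11>
    return out

bell = apply_cnot(apply_h_q0(state))
probs = [abs(a) ** 2 for a in bell]
print(probs)   # ~[0.5, 0, 0, 0.5]: measurements give 00 or 11, never 01 or 10
```

The two qubits end up entangled: each measurement alone looks like a coin flip, yet the pair always agrees – a correlation no pair of classical bits can reproduce.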

Quantum power is in a bigger ecosystem, not a bigger building

IBM Q’s computation center takes up about 2,000 square feet – a far cry from the millions of square feet that datacenters occupy. As we improve the technology and add machines, our footprint will likely grow, but our bigger focus is growing our ecosystem of users. Today, 1,500 universities, 300 high schools, and 300 private institutions use the IBM Q Experience in their curricula. We’re even offering institutions, students, and developers monetary awards for the best course materials, tutorials, and papers, among other creative ideas. Our IBM Q Network partners, members, and hubs are working with us – and with their own clients – on use cases for their industries.

What if everyone in the 1950s had had 5 to 10 years to prepare for the mainframe, from hardware to programming, while the machines were still prototypes? In hindsight, we can all see that jumping in early would have been the right decision. That’s where we are with quantum computing today. Now is the time to begin exploring what we can do with quantum computers across a variety of potential applications. Those who wait years for a fault-tolerant quantum machine to appear risk falling behind on the shorter-term benefits we are starting to discover today.


Quantum doesn’t follow the (Moore’s) law

Quantum computers will not pick up the Moore’s Law torch when today’s classical processors can no longer keep doubling in computational power on a regular basis. Even though qubits can tackle calculations intractable for supercomputers, their results must still be read and interpreted by a classical computer. So yes, quantum computers may soon solve previously impossible problems – but they will work alongside classical computers for the foreseeable future.
