Showing posts with label IBM. Show all posts

Wednesday, December 6

More Than 1,000 Qubits



One of IBM’s latest quantum processors has improved the reliability of its qubits. Credit: Ryan Lavine for IBM



IBM has unveiled the first quantum computer with more than 1,000 qubits — the quantum equivalent of the digital bits in an ordinary computer. But the company says it will now shift gears and focus on making its machines more error-resistant rather than larger.

For years, IBM has been following a quantum-computing road map that roughly doubled the number of qubits every year. The chip unveiled on 4 December, called Condor, has 1,121 superconducting qubits arranged in a honeycomb pattern. It follows IBM's other record-setting, bird-named machines, including a 127-qubit chip in 2021 and a 433-qubit one last year.

Quantum computers promise to perform certain computations that are beyond the reach of classical computers. They will do so by exploiting uniquely quantum phenomena such as entanglement and superposition, which allow multiple qubits to exist in multiple collective states at once.  READ MORE...
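The superposition and entanglement mentioned above can be made concrete with a tiny state-vector sketch. This is an illustrative simulation, not IBM's hardware or software: a two-qubit state is just four complex amplitudes, and a Hadamard followed by a CNOT produces the classic entangled Bell state.

```python
import math

# A 2-qubit state is a vector of 4 amplitudes over |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_hadamard_q0(s):
    """Hadamard on the first qubit: mixes the |0x> and |1x> amplitudes."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT with the first qubit as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

bell = apply_cnot(apply_hadamard_q0(state))
# bell is (|00> + |11>)/sqrt(2): measuring one qubit fixes the other.
print([round(abs(a) ** 2, 3) for a in bell])  # probabilities [0.5, 0.0, 0.0, 0.5]
```

The two qubits here occupy two collective states at once, and the number of amplitudes doubles with every qubit added, which is why simulating large qubit counts classically quickly becomes intractable.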

Monday, August 28

Vibrations Prevent Quantum Computing Losses


Michigan State University researchers have discovered how to utilize vibrations, usually an obstacle in quantum computing, as a tool to stabilize quantum states. Their research provides insights into controlling environmental factors in quantum systems and has implications for the advancement of quantum technology.




When quantum systems, such as those used in quantum computers, operate in the real world, they can lose information to mechanical vibrations.

New research led by Michigan State University, however, shows that a better understanding of the coupling between the quantum system and these vibrations can be used to mitigate loss.

The research, published in the journal Nature Communications, could help improve the design of quantum computers that companies such as IBM and Google are currently developing.

The Challenge of Isolation in Quantum Computing

Nothing exists in a vacuum, but physicists often wish this weren’t the case. Because if the systems that scientists study could be completely isolated from the outside world, things would be a lot easier.

Take quantum computing. It’s a field that’s already drawing billions of dollars in support from tech investors and industry heavyweights including IBM, Google, and Microsoft. But if the tiniest vibrations creep in from the outside world, they can cause a quantum system to lose information.

For instance, even light can cause information leaks if it has enough energy to jiggle the atoms within a quantum processor chip.

The Problem of Vibrations
“Everyone is really excited about building quantum computers to answer really hard and important questions,” said Joe Kitzman, a doctoral student at Michigan State University. “But vibrational excitations can really mess up a quantum processor.”

However, with new research published in the journal Nature Communications, Kitzman and his colleagues are showing that these vibrations need not be a hindrance. In fact, they could benefit quantum technology.     READ MORE...
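The information loss described in this post is often pictured as an exponential decay of qubit coherence over time. The sketch below is illustrative only, with made-up coherence times (the article gives no numbers): reducing coupling to vibrations corresponds to a longer T2, and therefore more information surviving after the same delay.

```python
import math

# Illustrative model: coherence decays as F(t) = exp(-t / T2).
# The T2 values are hypothetical, chosen only to show the effect of
# mitigating environmental coupling such as mechanical vibrations.
def coherence(t_us, t2_us):
    """Fraction of coherence remaining after t_us microseconds."""
    return math.exp(-t_us / t2_us)

noisy_t2, mitigated_t2 = 50.0, 200.0   # microseconds (assumed values)
t = 100.0
print(round(coherence(t, noisy_t2), 3))      # 0.135
print(round(coherence(t, mitigated_t2), 3))  # 0.607
```

Under this toy model, quadrupling the coherence time raises the surviving coherence at 100 microseconds from about 14% to about 61%, which is the kind of gain that better control of system-environment coupling is after.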

Monday, August 21

AI Chip That Works Like a Human Brain


Tech corporation IBM has unveiled a new "prototype" of an analog AI chip that works like a human brain and performs complex computations in various deep neural networks (DNN) tasks.

The chip promises more. IBM says the state-of-the-art chip can make artificial intelligence remarkably efficient and less battery-draining for computers and smartphones.

Introducing the chip in a paper published by IBM Research, the company said: “The fully integrated chip features 64 AIMC cores interconnected via an on-chip communication network. It also implements the digital activation functions and additional processing involved in individual convolutional layers and long short-term memory units.”

Reinventing ways in which AI is computed
The new AI chip was developed at IBM’s Albany NanoTech Complex and comprises 64 analog in-memory compute cores. By borrowing key features of how neural networks run in biological brains, IBM explains that it has embedded the chip with compact, time-based analog-to-digital converters in each tile or core to transition between the analog and digital worlds.

Each tile (or core) is also integrated with lightweight digital processing units that perform simple nonlinear neuronal activation functions and scaling operations, explained IBM in a blog published on August 10.
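The division of labor described above — analog multiply-accumulate in the memory array, then a lightweight digital nonlinearity — can be sketched classically. This is a conceptual model, not IBM's design: analog in-memory computing typically stores weights as conductances and sums currents by Ohm's and Kirchhoff's laws, with the noise term below standing in for analog device variation.

```python
import random

def aimc_matvec(conductances, voltages, noise=0.01):
    """Model of an analog multiply-accumulate: each output current is
    sum(G[i][j] * V[j]), plus Gaussian noise for analog imprecision."""
    return [sum(g * v for g, v in zip(row, voltages)) + random.gauss(0, noise)
            for row in conductances]

def relu(xs):
    """A simple nonlinear activation, applied by the tile's digital unit."""
    return [max(0.0, x) for x in xs]

# A tiny layer: weights held in place as conductances; inputs arrive as voltages.
weights = [[0.5, -0.2], [0.1, 0.9]]
inputs = [1.0, 0.5]
print(relu(aimc_matvec(weights, inputs)))
```

The appeal of this arrangement is that the expensive matrix-vector product happens where the weights already live, avoiding the constant weight movement between memory and processor that drains power in conventional digital chips.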

A replacement for current digital chips?
In the future, IBM's prototype chip could replace the current chips powering heavy AI applications in computers and phones. “A global digital processing unit is integrated into the middle of the chip that implements more complex operations that are critical for the execution of certain types of neural networks,” the blog added.  READ MORE...

Friday, May 27

The Revolutionary World of Quantum Computers

The inside of an IBM System One quantum computer.  Bryan Walsh/Vox


A few weeks ago, I woke up unusually early in the morning in Brooklyn, got in my car, and headed up the Hudson River to the small Westchester County community of Yorktown Heights. There, amid the rolling hills and old farmhouses, sits the Thomas J. Watson Research Center, the Eero Saarinen-designed, 1960s Jet Age-era headquarters for IBM Research.

Deep inside that building, through endless corridors and security gates guarded by iris scanners, is where the company’s scientists are hard at work developing what IBM director of research Dario Gil told me is “the next branch of computing”: quantum computers.

I was at the Watson Center to preview IBM’s updated technical roadmap for achieving large-scale, practical quantum computing. This involved a great deal of talk about “qubit count,” “quantum coherence,” “error mitigation,” “software orchestration” and other topics you’d need to be an electrical engineer with a background in computer science and a familiarity with quantum mechanics to fully follow.

I am not any of those things, but I have watched the quantum computing space long enough to know that the work being done here by IBM researchers — along with that of their competitors at companies like Google and Microsoft, and countless startups around the world — stands to drive the next great leap in computing. Which, given that computing is a “horizontal technology that touches everything,” as Gil told me, will have major implications for progress in everything from cybersecurity to artificial intelligence to designing better batteries.

Provided, of course, they can actually make these things work.

Entering the quantum realm
The best way to understand a quantum computer — short of setting aside several years for grad school at MIT or Caltech — is to compare it to the kind of machine I’m typing this piece on: a classical computer.

My MacBook Air runs on an M1 chip, which is packed with 16 billion transistors. Each of those transistors can represent either the “1” or “0” of binary information at any one time — a bit. The sheer number of transistors is what gives the machine its computing power.

Sixteen billion transistors packed onto a 120.5 sq. mm chip is a lot — TRADIC, the first transistorized computer, had fewer than 800. The semiconductor industry’s ability to engineer ever more transistors onto a chip, a trend forecast by Intel co-founder Gordon Moore in the law that bears his name, is what has made possible the exponential growth of computing power, which in turn has made possible pretty much everything else.  READ MORE...
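The growth from TRADIC to the M1 can be checked with a bit of arithmetic. The figures below are rough (TRADIC's transistor count is stated as "fewer than 800" above, and the years are assumptions for illustration), but they recover roughly the doubling cadence Moore described.

```python
import math

# Rough Moore's-law check: how many doublings separate TRADIC
# (~800 transistors, mid-1950s) from Apple's M1 (16 billion, 2020)?
# The counts and years are approximate, for illustration only.
tradic, m1 = 800, 16_000_000_000
doublings = math.log2(m1 / tradic)
years = 2020 - 1955
print(round(doublings, 1))          # ~24.3 doublings
print(round(years / doublings, 1))  # ~2.7 years per doubling
```

About 24 doublings over roughly 65 years works out to one doubling every two to three years, in line with the commonly quoted statement of Moore's law.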

Monday, February 14

Cognitive Computing


The goal of cognitive computing is to simulate human thought processes in a computerized model. Using self-learning algorithms that use data mining, pattern recognition and natural language processing, the computer can mimic the way the human brain works.

While computers have been faster at calculations and processing than humans for decades, they haven’t been able to accomplish tasks that humans take for granted as simple, like understanding natural language, or recognizing unique objects in an image.


Some people say that cognitive computing represents the third era of computing: we went from computers that could tabulate sums (1900s) to programmable systems (1950s), and now to cognitive systems.

These cognitive systems, most notably IBM’s Watson, rely on deep learning algorithms and neural networks to process information by comparing it to a teaching set of data. The more data the system is exposed to, the more it learns, and the more accurate it becomes over time. The neural network itself is a complex “tree” of decisions the computer can make to arrive at an answer.
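The learn-from-a-teaching-set idea can be shown with the simplest possible learner. This is a generic single-neuron (perceptron) sketch, not Watson's algorithm: the model nudges its weights on every labeled example until its predictions match the teaching set.

```python
# A minimal "teaching set" sketch: a single perceptron adjusts its weights
# on each labeled example, becoming more accurate as training proceeds.
def train(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred                      # 0 when already correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Teaching set for a simple rule (logical OR of two inputs).
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train(data)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in data])  # matches the labels: [0, 1, 1, 1]
```

Systems like Watson stack many such units into deep networks and train them on vastly larger teaching sets, but the core loop — predict, compare to the label, adjust — is the same.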

What can cognitive computing do?

For example, according to this TED Talk video from IBM, Watson could eventually be applied in a healthcare setting to help collate the span of knowledge around a condition, including patient history, journal articles, best practices, diagnostic tools, etc., analyze that vast quantity of information, and provide a recommendation.  READ MORE...

Wednesday, January 26

Nuclear Quantum Computing


A trio of separate research teams from three different continents published individual papers indicating similar quantum computing breakthroughs yesterday. All three were funded in part by the US Army and each paper appears to be a slam dunk for the future of quantum computing.

But only one of them heralds the onset of the age of nuclear quantum computers.

Maybe it’s the whole concept of entanglement, but for a long time it’s felt like we were suspended in a state where functional quantum machines were both “right around the corner” and “decades or more away.”

But the past few years have seen a more rapid advancement toward functional quantum systems than most technologists could have imagined in their wildest dreams.

The likes of IBM, Microsoft, D-Wave, and Google putting hybrid quantum systems on the cloud, coupled with Google's remarkable time crystal breakthrough, have made 2018-2021 the opening years of what promises to be a golden age for quantum computing.

Despite this amazing progress, there are still holdouts who believe we’ll never have a truly useful, fully-functional, qubit-based quantum computing system.

The main reason given by these cynics is usually because quantum systems are incredibly error-prone.  READ MORE...