Showing posts with label Quantum Coherence.

Friday, September 29

Boosting Quantum Devices


MIT physicists, inspired by noise-canceling headphones, have extended the coherence time of quantum bits 20-fold, marking significant progress for quantum computing. The team used an “unbalanced echo” technique to counteract system noise, and they believe further improvements are possible. The advance has broad potential applications, from quantum sensors in biology to quantum memory.



MIT researchers develop a protocol to extend the life of quantum coherence.

For years, researchers have tried various ways to coax quantum bits — or qubits, the basic building blocks of quantum computers — to remain in their quantum state for ever-longer times, a key step in creating devices like quantum sensors, gyroscopes, and memories.

A team of physicists from MIT has taken an important step forward in that quest, and to do it, the researchers borrowed a concept from an unlikely source: noise-canceling headphones.


Led by Ju Li, the Battelle Energy Alliance Professor in Nuclear Engineering and professor of materials science and engineering, and Paola Cappellaro, the Ford Professor of Engineering in the Department of Nuclear Science and Engineering and Research Laboratory of Electronics, and a professor of physics, the team described a method to achieve a 20-fold increase in the coherence times for nuclear-spin qubits. 

The work is described in a paper published in Physical Review Letters. The first author of the study is Guoqing Wang PhD ’23, a recent doctoral graduate from Cappellaro’s lab who is now a postdoc at MIT.


“This is one of the main problems in quantum information,” Li says. “Nuclear spin [ensembles] are very attractive platforms for quantum sensors, gyroscopes, and quantum memory, [but] they have coherence times on the order of 150 microseconds in the presence of electronic spins … and then the information just disappears. What we have shown is that, if we can understand the interactions, or the noise, in these systems, we can actually do much better.”  READ MORE...
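
The headphone analogy can be made concrete with the classic spin echo, the forerunner of echo techniques like the one reported here. The Python sketch below is purely illustrative and is not the team’s “unbalanced echo”; the ensemble size, the 10 kHz detuning spread, and the 150-microsecond window are invented for the demonstration. It shows how a midpoint refocusing pulse cancels static frequency noise that would otherwise destroy ensemble coherence.

import numpy as np

rng = np.random.default_rng(0)

# Ensemble of spins, each with a random static frequency offset (the "noise").
n_spins = 10_000
detunings = rng.normal(0.0, 2 * np.pi * 10e3, n_spins)  # rad/s; 10 kHz spread (assumed)

def coherence_free(t):
    """Free evolution: each spin accumulates phase detuning * t, and the
    ensemble average of exp(i * phase) decays as the spins fan out."""
    return abs(np.mean(np.exp(1j * detunings * t)))

def coherence_echo(t):
    """Spin echo: a pi pulse at t/2 reverses the sign of the phase being
    accumulated, so a static detuning cancels exactly by time t."""
    phase = detunings * (t / 2) - detunings * (t / 2)  # +t/2 out, then -t/2 back
    return abs(np.mean(np.exp(1j * phase)))

t = 150e-6  # the ~150-microsecond coherence time quoted above
print(f"free coherence at {t * 1e6:.0f} us: {coherence_free(t):.3f}")  # ~0.01 (sampling noise floor)
print(f"echo coherence at {t * 1e6:.0f} us: {coherence_echo(t):.3f}")  # 1.000

For purely static noise the refocusing is exact, which is why the echoed coherence comes out at 1.0 while the freely evolving ensemble has dephased to roughly zero. Real noise fluctuates in time, so a simple echo helps only partially; sequences tailored to the understood noise, like the unbalanced echo in the paper, are what buy the larger gains.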

Friday, May 27

The Revolutionary World of Quantum Computers

The inside of an IBM System One quantum computer.  Bryan Walsh/Vox


A few weeks ago, I woke up unusually early in the morning in Brooklyn, got in my car, and headed up the Hudson River to the small Westchester County community of Yorktown Heights. There, amid the rolling hills and old farmhouses, sits the Thomas J. Watson Research Center, the Eero Saarinen-designed, 1960s Jet Age-era headquarters for IBM Research.

Deep inside that building, through endless corridors and security gates guarded by iris scanners, is where the company’s scientists are hard at work developing what IBM director of research Dario Gil told me is “the next branch of computing”: quantum computers.

I was at the Watson Center to preview IBM’s updated technical roadmap for achieving large-scale, practical quantum computing. This involved a great deal of talk about “qubit count,” “quantum coherence,” “error mitigation,” “software orchestration” and other topics you’d need to be an electrical engineer with a background in computer science and a familiarity with quantum mechanics to fully follow.

I am not any of those things, but I have watched the quantum computing space long enough to know that the work being done here by IBM researchers — along with their competitors at companies like Google and Microsoft, as well as countless startups around the world — stands to drive the next great leap in computing. Which, given that computing is a “horizontal technology that touches everything,” as Gil told me, will have major implications for progress in everything from cybersecurity to artificial intelligence to designing better batteries.

Provided, of course, they can actually make these things work.

Entering the quantum realm
The best way to understand a quantum computer — short of setting aside several years for grad school at MIT or Caltech — is to compare it to the kind of machine I’m typing this piece on: a classical computer.

My MacBook Air runs on an M1 chip, which is packed with 16 billion transistors. Each of those transistors can represent either the “1” or the “0” of binary information at any given time — a bit. The sheer number of transistors is what gives the machine its computing power.
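
The difference between a bit and a qubit is easiest to see in the arithmetic. A classical bit holds one of two values; a qubit’s state is a pair of complex amplitudes, and describing n qubits takes 2^n amplitudes. The sketch below is a conceptual illustration in plain Python with NumPy, not IBM’s software stack.

import numpy as np

# A classical bit holds exactly one of two values at a time.
bit = 0

# A qubit state |psi> = a|0> + b|1> is a pair of complex amplitudes
# with |a|**2 + |b|**2 == 1. Here, an equal superposition:
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Measuring yields 0 or 1 with probabilities given by squared magnitudes.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each

# n qubits need 2**n amplitudes to describe; n classical bits need only n values.
for n in (16, 53, 300):
    print(f"{n} qubits -> {2 ** n:.3e} amplitudes to simulate classically")

The last loop is the crux: simulating even 53 qubits classically means tracking about 9 × 10^15 amplitudes, which is why building quantum hardware, rather than simulating it, is the goal.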

Sixteen billion transistors packed onto a 120.5 sq. mm chip is a lot — TRADIC, the first transistorized computer, had fewer than 800. The semiconductor industry’s ability to engineer ever more transistors onto a chip, a trend forecast by Intel co-founder Gordon Moore in the law that bears his name, is what has made possible the exponential growth of computing power, which in turn has made possible pretty much everything else.  READ MORE...
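
Those two data points let you sanity-check Moore’s law in a few lines of arithmetic. The mid-1950s date for TRADIC and the textbook two-year doubling period below are assumptions made for the sake of the estimate:

import math

# Transistor counts quoted in the article.
tradic = 800             # TRADIC, mid-1950s ("fewer than 800")
m1 = 16_000_000_000      # Apple M1, 2020

doublings = math.log2(m1 / tradic)
print(f"doublings: {doublings:.1f}")                        # ~24.3

# Assuming the textbook Moore's-law pace of one doubling every ~2 years:
print(f"years at a 2-year doubling: {2 * doublings:.0f}")   # ~49

# The actual span between the two machines is roughly 65 years:
print(f"implied average doubling time: {65 / doublings:.1f} years")

Roughly 24 doublings separate the two machines. At one doubling every two years that would take about 49 years; the actual span is closer to 65, implying an average doubling time of about 2.7 years, which is in the right neighborhood for a rule of thumb stretched across seven decades.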