
Google’s Quantum Chip Willow


MalibuSheriff


I'm not a quantum computing person, but I've read a few white papers and understand what they stand to solve. The promise of quantum computing is to probabilistically solve certain problems that are intractable for conventional computers, like factoring huge numbers, in a tiny fraction of the time.

An NP problem is one where it's very easy to check a solution's correctness but extremely hard (as far as we know) to find a correct solution in the first place. That asymmetry is what underlies cryptography and our current paradigm of encryption (and yes, I know there are "quantum-safe" protocols now): if you have the right key, you know very easily that you do, and if you don't have the right key, you're almost certainly NOT going to guess the right one.
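To make that asymmetry concrete, here's a toy sketch (my own illustration, not anything from the announcement): verifying a guessed key against a hash takes one cheap operation, while finding the key means searching an exponentially large space.

Code

import hashlib
import itertools
import string

# Hypothetical example: a 5-character lowercase "key" hidden behind a SHA-256 hash.
SECRET = "qubit"
TARGET = hashlib.sha256(SECRET.encode()).hexdigest()

def check(guess: str) -> bool:
    """Checking a candidate key is one cheap hash: the easy direction."""
    return hashlib.sha256(guess.encode()).hexdigest() == TARGET

def brute_force(length: int = 5) -> str:
    """Finding the key means trying up to 26**length candidates: the hard direction."""
    for chars in itertools.product(string.ascii_lowercase, repeat=length):
        guess = "".join(chars)
        if check(guess):
            return guess
    raise ValueError("key not found")

print(check("wrong"))  # False, and it ran in microseconds
print(brute_force())   # "qubit", but only after up to ~11.8 million hashes

Add one more character of key length and the search gets 26 times slower; the check stays just as fast.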

That's sort of what the announcement gestures at, in that the chip solved a problem in minutes that would otherwise take 10,000,000,000,000,000,000,000,000 (that's 10^25) years.

But this is all just research in a lab with idealized problems, not generalized solutions to large-scale problems. As far as we know, at least lol.

Definitely interesting in the world of computer science, though.


Grade A Bullshit.

The tweet says "a breakthrough that can reduce errors exponentially as we scale up using more qubits, cracking a 30-year challenge in the field"

Breaking that down: 

The "30-year challenge" is the fundamental problem in quantum computing that "qubits", the rough quantum computing equivalent of a bit, are unreliable and prone to failure.

https://www.livescience.com/technology/computing/google-willow-quantum-computing-chip-solved-a-problem-the-best-supercomputer-taken-a-quadrillion-times-age-of-the-universe-to-crack

Quote

Quantum computers are inherently "noisy," meaning that, without error-correction technologies, every one in 1,000 qubits — the fundamental building blocks of a quantum computer — fails.

It also means coherence times (how long the qubits can remain in a superposition so they can process calculations in parallel) remain short. By contrast, every one in 1 billion billion bits fails in conventional computers.

 

The breakthrough, and maybe it is a breakthrough, is that they use some type of array to error-correct, and the error correction gets better as the array gets bigger. Sounds great, right? Until you read deeper and realize the scale of what they tested:

This image is from the Google blog: https://research.google/blog/making-quantum-error-correction-work/

[Image: QuantumHW1-Qubits.width-1250.png]

 

Yep, you're reading that right: the breakthrough research is with 17 qubits, 49 qubits, and 97 qubits.

The entire Willow chip has a grand total of 105 qubits.  Not exactly ready to take on any conventional computers.
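For what it's worth, the scaling model behind the claim is simple to state: each time you grow the array from distance 3 (17 qubits) to distance 5 (49) to distance 7 (97), the logical error rate drops by a roughly constant factor they call Λ (reportedly a bit over 2 for Willow). Here's a back-of-the-envelope sketch of that model; the starting error rate is an assumption I picked just to show the trend, not a number from their paper:

Code

# Toy model of surface-code error suppression (illustrative numbers only).
# Claim: logical error rate falls by a constant factor LAMBDA every time the
# code distance d increases by 2, at the cost of roughly 2*d^2 physical qubits.

LAMBDA = 2.0   # assumed suppression factor; Google reported a bit over 2
EPS_D3 = 3e-3  # assumed logical error rate at distance 3, picked for illustration

def physical_qubits(d: int) -> int:
    # A distance-d surface code uses d^2 data qubits plus d^2 - 1 measure qubits.
    return 2 * d * d - 1

def logical_error_rate(d: int) -> float:
    # One factor of 1/LAMBDA for each step of 2 in distance beyond d = 3.
    return EPS_D3 / LAMBDA ** ((d - 3) / 2)

for d in (3, 5, 7, 9, 11):
    print(f"d={d:2d}: {physical_qubits(d):4d} qubits, "
          f"logical error ~{logical_error_rate(d):.1e} per cycle")

Run it and you see both sides of the argument: the error rate really does fall exponentially, but getting to the roughly one-in-a-million error rates people talk about for useful algorithms takes arrays far bigger than anything tested here.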

 

As for the claims of solving a problem that would take a conventional computer a lifetime: almost certainly bullshit. I don't care enough to dig into what they are talking about here, but just about every other time a similar claim has been made for quantum computing, someone looks at it and realizes they have the conventional computer running some unoptimized brute-force algorithm. A quick rewrite to a good algorithm and the conventional machine crushes the quantum computer. Or, another one I've seen: the "beat the conventional computer" claim is based on a simulation of what could happen if you built the quantum computer scaled up by a million times.

 


The error reduction breakthrough is a big, big deal. What was holding quantum computing back was that you couldn't scale it, because errors increased with size. Google is basically saying they've figured that out and can now begin building large-scale quantum computers.

They are now saying that the larger the system (more qubits), the more accurate the results will actually become, where previously more qubits meant less accurate results.

Edited by MonkeyDoughnut

The Bullshit spigot has fully opened.

 

The Headline: Google says its new quantum chip indicates that multiple universes exist

The Link: https://www.yahoo.com/finance/news/google-says-quantum-chip-indicates-192059739.html

The Money Quotes:

Quote

Google Quantum AI founder Hartmut Neven wrote in his blog post that this chip was so mind-boggling fast that it must have borrowed computational power from other universes.

 

Quote

Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25 or 10 septillion years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

 

The benchmark that they claim proves a multiverse is the same crap they've been pushing for years. It's called "Random Circuit Sampling" (RCS), and it's a made-up task of "sampling from a random quantum circuit." Yes, that's right: the benchmark requires a quantum circuit. Classical computers don't have quantum circuits to sample from, so they have to simulate the quantum circuit and do stupid shit like generate all possible outcomes. It's as stupid as if you wanted to benchmark an Apple computer reading from an Apple-branded disk against a PC. But since PCs don't have Apple-branded disks, you make the PC simulate an entire Apple computer, including macOS and the Apple disk drive. Everyone would laugh at that, so they wrap the RCS test in layers upon layers of PhD-level math and physics.

 I was struggling to describe RCS, so I asked ChatGPT about it:

Quote

Why It's Not True Benchmarking

True benchmarking involves comparing systems based on their ability to perform the same task under the same constraints, often with practical applications. RCS differs because:

  1. Specialized Task: The task (sampling from a random quantum circuit) is designed to favor quantum systems and has no immediate practical application.
  2. Asymmetric Requirements:
    • The quantum computer generates samples quickly but doesn’t output the full probability distribution.
    • The classical computer is tasked with verifying these samples or simulating the distribution, which is exponentially harder.
  3. No Practical Use Case: Unlike traditional benchmarks, which assess performance on tasks like sorting or matrix multiplication, RCS is not solving a real-world problem.
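That "exponentially harder" bit is the whole trick. A brute-force classical simulation has to carry the complete quantum state, which is 2^n complex amplitudes for n qubits. Here's a rough sketch of just the memory cost (my own illustration; real simulators are smarter than this, but they still hit an exponential wall somewhere):

Code

# Memory for a full n-qubit statevector: 2**n complex amplitudes at 16 bytes
# each (two 64-bit floats). This is why "just simulate the quantum circuit"
# stops being a fair fight long before 105 qubits.

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50, 105):
    print(f"{n:3d} qubits -> {statevector_bytes(n):.2e} bytes")

# 10 qubits  -> 1.64e+04 bytes (fits in a CPU cache)
# 30 qubits  -> 1.72e+10 bytes (16 GiB, a desktop's worth of RAM)
# 50 qubits  -> 1.80e+16 bytes (beyond any single machine's memory)
# 105 qubits -> 6.49e+32 bytes (beyond all storage ever manufactured)

So the quantum chip "wins" the benchmark partly because the classical side is forced into a representation that blows up exponentially, which is exactly the Apple-disk complaint above.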

 

