Is this an actually good explanation? The introduction immediately made me pause:
> In classical computers, error-resistant memory is achieved by duplicating bits to detect and correct errors. A method called majority voting is often used, where multiple copies of a bit are compared, and the majority value is taken as the correct bit
No, in classical computers memory errors are handled with error-correcting codes, not by duplicating bits and majority voting. Duplicating bits would be a very wasteful strategy when you can add significantly fewer bits and achieve the same result, which is what error correction techniques like ECC give you. Maybe they confused it with logic circuits, where there isn't a more efficient strategy?
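To make the overhead gap concrete: a majority-vote scheme spends 2 extra bits per data bit, while even a toy Hamming(7,4) code protects 4 data bits with only 3 check bits and still corrects any single bit flip. A minimal sketch (illustrative only; real ECC memory uses wider SECDED codes such as Hamming(72,64), not this toy code):

```python
# Toy Hamming(7,4): 4 data bits + 3 parity bits, corrects any 1-bit error.
# Bit positions are 1-indexed; parity bits sit at positions 1, 2, 4.

def encode(d1, d2, d3, d4):
    p1 = d1 ^ d2 ^ d4            # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4            # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4            # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # syndrome = 1-indexed position of the flip
    if pos:
        c[pos - 1] ^= 1          # correct the single-bit error
    return c[2], c[4], c[5], c[6]

word = encode(1, 0, 1, 1)
word[4] ^= 1                     # simulate a bit flip in memory
assert decode(word) == (1, 0, 1, 1)
```

Triplicating the same 4 bits would cost 8 extra bits instead of 3, which is exactly the waste the comment is pointing at.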
cwillu
Wow, they managed to make a website that scales everything except the main text when adjusting the browser's zoom setting.
terminalbraid
Note the paper they are referring to was published August 27, 2024:
https://arxiv.org/pdf/2408.13687
While I'm still eager to see where Quantum Computing leads, I've got a new threshold for "breakthrough": Until a quantum computer can factor products of primes larger than a few bits, I'll consider it a work in progress at best.
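For context on why factoring is the benchmark: Shor's algorithm reduces factoring N to finding the multiplicative order r of a random base a mod N, and only that order-finding step needs a quantum computer. A toy sketch of the classical wrapper, brute-forcing the order (the part that is exponentially hard classically):

```python
from math import gcd

def order(a, n):
    # Smallest r with a^r = 1 (mod n). This brute force is the step
    # a quantum computer replaces with polynomial-time period finding.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n, a):
    assert 1 < a < n and gcd(a, n) == 1
    r = order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None              # unlucky base a; retry with another
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)

print(factor(15, 7))             # (3, 5)
```

Published quantum demonstrations have factored numbers on the order of 15 or 21 this way; anything of cryptographic size is far beyond current error-corrected hardware, hence the commenter's threshold.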
dangerlibrary
I'm someone not really aware of the consequences of each quantum of progress in quantum computing. But I know I'm exposed to QC risk: at some point I'll need to change every security key I've ever generated and every crypto algorithm every piece of software uses.
How much closer does this work bring us to the Quantum Crypto Apocalypse? How much time do I have left before I need to start budgeting it into my quarterly engineering plan?
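Roughly speaking, Shor's algorithm breaks the public-key primitives outright (RSA, finite-field DH, elliptic curves), while Grover's algorithm only gives a quadratic speedup against symmetric ciphers and hashes, so those survive with doubled parameters. A hypothetical inventory helper sketching that split (the table entries and the `triage` helper are mine for illustration, not from any standard):

```python
# Hypothetical migration-triage table; the split between "broken by
# Shor" and "weakened by Grover" is standard, the helper itself is not.
IMPACT = {
    "rsa":     "broken by Shor -- replace with a post-quantum scheme",
    "ecdsa":   "broken by Shor -- replace",
    "ecdh":    "broken by Shor -- replace",
    "dh":      "broken by Shor -- replace",
    "aes-128": "weakened by Grover -- move to AES-256",
    "aes-256": "fine as far as is currently known",
    "sha-256": "margin shrinks -- SHA-384/512 is more conservative",
}

def triage(algorithm: str) -> str:
    return IMPACT.get(algorithm.lower(), "unknown -- audit manually")

for alg in ("RSA", "AES-128", "ChaCha20"):
    print(f"{alg}: {triage(alg)}")
```

The usual caveat applies: traffic recorded today can be decrypted retroactively once large machines exist ("harvest now, decrypt later"), which is why key exchange tends to be migrated first.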
computerdork
Does anyone on HN have an understanding of how close this achievement brings us to useful quantum computers?
bawolff
Doesn't feel like a breakthrough. A positive engineering step forward, sure, but not a breakthrough.
And wtf does AI have to do with this?