
Quantum Computing and Blockchain: Facts and Myths

Ahmed Banafa / 7 min read.
October 28, 2019

The biggest danger to Blockchain networks from quantum computing is its ability to break traditional encryption [3].

Google sent shockwaves around the internet when it claimed to have built a quantum computer able to solve formerly impossible mathematical calculations, with some fearing the crypto industry could be at risk [7]. Google states that its experiment is the first experimental challenge against the extended Church-Turing thesis, also known as the computability thesis, which claims that traditional computers can effectively carry out any reasonable model of computation.

What is Quantum Computing?

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously [5].

A Comparison of Classical and Quantum Computing

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra. Data must be processed in an exclusive binary state at any point in time: bits. While the time that each transistor or capacitor needs to be in either a 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold at which the classical laws of physics cease to apply. Beyond this, the quantum world takes over. In a quantum computer, elementary particles such as electrons or photons can be used, with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit, and the nature and behavior of these particles form the basis of quantum computing [5].

Quantum Superposition and Entanglement

The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition: Think of a qubit as an electron in a magnetic field. The electron’s spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. According to quantum law, the particle enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1.

Entanglement: Particles that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle – up or down – allows one to know that the spin of its mate is in the opposite direction. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated. Taken together, quantum superposition and entanglement create enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. Each additional qubit doubles this capacity, so it grows exponentially [5].
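To make the register comparison concrete, here is a minimal sketch (assuming Python with NumPy, neither of which the article specifies) that builds a uniform 2-qubit superposition as a plain state vector and shows why the number of amplitudes grows as 2^n:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate: |0> -> (|0> + |1>)/sqrt(2)

plus = H @ ket0               # one qubit in an equal superposition of 0 and 1
state = np.kron(plus, plus)   # 2-qubit register: four amplitudes at once

print(state)                  # [0.5 0.5 0.5 0.5] -> weight on 00, 01, 10 and 11
print(np.abs(state) ** 2)     # measurement probabilities, summing to 1

# n qubits require 2**n amplitudes, which is why classically simulating even
# ~50 qubits is already at the edge of what supercomputers can handle.
for n in (2, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```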

Difficulties with Quantum Computers

  • Interference – During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say a stray photon or wave of EM radiation) causes the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase.
  • Error correction – Given the nature of quantum computing, error correction is ultra-critical – even a single error in a calculation can cause the validity of the entire computation to collapse.
  • Output observance – Closely related to the above two, retrieving output data after a quantum calculation is complete risks corrupting the data.

What is Quantum Supremacy?

According to the Financial Times, Google claims to have successfully built the world’s most powerful quantum computer [7]. What that means, according to Google’s researchers, is that a calculation that would normally take more than 10,000 years to perform was completed by its computer in about 200 seconds, which could potentially mean that Blockchain, and the encryption that underpins it, could be broken.

Asymmetric cryptography used in crypto relies on key pairs, namely a private and a public key. Public keys can be calculated from their private counterparts, but not the other way around. This rests on the practical impossibility of solving certain mathematical problems with classical computers. Quantum computers are orders of magnitude more efficient at such problems, and if the calculation can be performed in the other direction, the whole scheme breaks [3].
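As a rough illustration of that one-way relationship, the sketch below uses the third-party Python `ecdsa` package (an assumption; the article names no library): the public key is derived from the private key and used to verify signatures, while going from the public key back to the private key is exactly the problem Shor’s algorithm would attack on a sufficiently large quantum computer.

```python
# pip install ecdsa  -- third-party package used here for illustration only
from ecdsa import SigningKey, SECP256k1

# Private key: effectively a large random number.
private_key = SigningKey.generate(curve=SECP256k1)

# Public key: computed from the private key via elliptic-curve point
# multiplication. This direction is cheap; inverting it is the elliptic-curve
# discrete-logarithm problem, infeasible for classical computers.
public_key = private_key.get_verifying_key()

# Hypothetical message, e.g. a transaction to be signed.
message = b"transfer 1 BTC to Alice"
signature = private_key.sign(message)

print(public_key.verify(signature, message))  # True: anyone can verify,
                                              # only the key holder can sign
```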

It would appear Google is still some way away from building a quantum computer that could be a threat to Blockchain cryptography or other encryption.

“Google’s supercomputer currently has 53 qubits,” said Dragos Ilie, a quantum computing and encryption researcher at Imperial College London.

“In order to have any effect on bitcoin or most other financial systems it would take at least about 1500 qubits and the system must allow for the entanglement of all of them,” Ilie said.

Meanwhile, scaling quantum computers is “a huge challenge,” according to Ilie [1].

Blockchain networks, including Bitcoin’s architecture, rely on two algorithms: the Elliptic Curve Digital Signature Algorithm (ECDSA) for digital signatures and SHA-256 as a hash function. A quantum computer could use Shor’s algorithm [8] to derive your private key from your public key, but the most optimistic scientific estimates say that even if this were possible, it won’t happen during this decade.
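For the hashing half, Python’s standard `hashlib` is enough to sketch what SHA-256 does; the double hash mirrors how Bitcoin hashes block headers, and the input bytes below are made up for illustration.

```python
import hashlib

data = b"example block header"  # hypothetical payload

digest = hashlib.sha256(data).hexdigest()
double_digest = hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

print(digest)         # 256-bit (64 hex character) fingerprint of the input
print(double_digest)  # Bitcoin-style double SHA-256; flipping one input bit
                      # produces a completely different output
```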

A 160-bit elliptic curve cryptographic key could be broken on a quantum computer using around 1,000 qubits, while factoring the security-wise equivalent 1024-bit RSA modulus would require about 1,500 to 2,000 qubits, according to a research paper on the matter published by Cornell University. By comparison, Google’s measly 53 qubits are still no match for this kind of cryptography.

But that isn’t to say that there’s no cause for alarm. While the encryption algorithms native to Blockchain applications are safe for now, the rate of advancement in quantum technology is increasing, and that could, in time, pose a threat. “We expect their computational power will continue to grow at a double exponential rate,” Google researchers said.

Quantum cryptography?

Quantum cryptography uses physics to develop a cryptosystem that is completely secure against being compromised without the knowledge of the sender or the receiver of the messages. The word quantum itself refers to the most fundamental behavior of the smallest particles of matter and energy.

Quantum cryptography differs from traditional cryptographic systems in that it relies on physics, rather than mathematics, as the key aspect of its security model.



Essentially, quantum cryptography is based on the use of individual particles/waves of light (photons) and their intrinsic quantum properties to develop an unbreakable cryptosystem (because it is impossible to measure the quantum state of any system without disturbing that system).

Quantum cryptography uses photons to transmit a key. Once the key is transmitted, encoding and decoding using the normal secret-key method can take place. But how does a photon become a key? How do you attach information to a photon’s spin?

This is where binary code comes into play. Each of a photon’s spin states represents one piece of information — usually a 1 or a 0, for binary code. This code uses strings of 1s and 0s to create a coherent message. For example, a string such as 11100100110 could correspond to h-e-l-l-o under an agreed encoding. So a binary value can be assigned to each photon — for example, a photon that has a vertical spin ( | ) can be assigned a 1.
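The article does not name a specific protocol, but the standard way a stream of polarized photons becomes a shared key is BB84-style quantum key distribution. The toy Python simulation below (no quantum hardware; every name and number is invented for illustration) models only the classical sifting step: the sender encodes random bits in random bases, the receiver measures in random bases, and both keep just the positions where their bases happened to match.

```python
import secrets

def random_bits(n):
    """Return n random bits (raw key material or basis choices)."""
    return [secrets.randbelow(2) for _ in range(n)]

def measure(bit, send_basis, recv_basis):
    """Toy measurement: a matching basis reproduces the bit exactly, while a
    mismatched basis yields a random outcome (mimicking conjugate polarizations)."""
    return bit if send_basis == recv_basis else secrets.randbelow(2)

n = 16
alice_bits = random_bits(n)    # 0/1 values encoded in photon polarizations
alice_bases = random_bits(n)   # 0 = rectilinear basis (| and -), 1 = diagonal
bob_bases = random_bits(n)
bob_results = [measure(b, a, r) for b, a, r in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: the two sides publicly compare bases (never the bits themselves)
# and keep only the positions where the bases agree.
key = [r for r, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print(key)  # shared secret bits; an eavesdropper measuring the photons in
            # transit would disturb them and show up as errors in this key
```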

“If you build it correctly, no hacker can hack the system. The question is what it means to build it correctly,” said physicist Renato Renner from the Institute of Theoretical Physics in Zurich.

Regular, non-quantum encryption can work in a variety of ways, but generally a message is scrambled and can only be unscrambled using a secret key. The trick is to make sure that whoever you’re trying to hide your communication from doesn’t get their hands on your secret key. Cracking the private key in a modern cryptosystem would generally require figuring out the factors of a number that is the product of two insanely huge prime numbers.

The numbers are chosen to be so large that, with the given processing power of computers, it would take longer than the lifetime of the universe for an algorithm to factor their product.
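A small sketch of that asymmetry, using deliberately tiny primes (real RSA moduli are hundreds of digits long): multiplying two primes is instant, while recovering them from their product by naive trial division has to search up to the square root of the modulus, and that search grows exponentially with the number of digits.

```python
import math

def trial_factor(n):
    """Naive factoring by trial division; work grows with sqrt(n)."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1  # n is prime

p, q = 10007, 10009      # toy primes for illustration only
n = p * q                # multiplication is instant, even for huge primes

print(n)                 # 100160063
print(trial_factor(n))   # (10007, 10009) -- fast here, hopeless at 2048 bits

# No known classical algorithm factors a 2048-bit RSA modulus in feasible
# time; Shor's algorithm on a large, fault-tolerant quantum computer would.
```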

Encryption techniques have their vulnerabilities. Certain products of primes, known as weak keys, happen to be easier to factor than others. Also, Moore’s Law continually ups the processing power of our computers. Even more importantly, mathematicians are constantly developing new algorithms that allow for easier factorization.

Quantum cryptography avoids all these issues. Here, the key is encrypted into a series of photons that get passed between two parties trying to share secret information. The Heisenberg Uncertainty Principle dictates that an adversary can’t look at these photons without changing or destroying them.

“In this case, it doesn’t matter what technology the adversary has; they’ll never be able to break the laws of physics,” said physicist Richard Hughes of Los Alamos National Laboratory in New Mexico, who works on quantum cryptography [6].

References:

[1] https://www.forbes.com/sites/billybambrough/2019/10/02/could-google-be-about-to-break-bitcoin/#1d78c5373329

[2] https://decrypt.co/9642/what-google-quantum-computer-means-for-bitcoin/

[3] https://www.coindesk.com/how-should-crypto-prepare-for-googles-quantum-supremacy?

[4] https://www.ccn.com/google-quantum-bitcoin/

[5] https://www.linkedin.com/pulse/20140503185010-246665791-quantum-computing/

[6] https://www.linkedin.com/pulse/20140608053056-246665791-understanding-quantum-cryptography/

[7] https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html

[8] https://qudev.phys.ethz.ch/static/content/QSIT15/Shors%20Algorithm.pdf

Categories: Blockchain
Tags: blockchain, encryption, Quantum

About Ahmed Banafa

Prof. Ahmed Banafa has extensive experience in research, operations and management, with a focus on IoT, Blockchain, Cybersecurity and AI. He is a reviewer and a technical contributor for the publication of several technical books. He has served as an instructor at well-known universities and colleges, including Stanford University; University of California, Berkeley; California State University, East Bay; San Jose State University; and University of Massachusetts. He is the recipient of several awards, including the Distinguished Tenured Staff Award, Instructor of the Year for four years in a row, and a Certificate of Honor from the City and County of San Francisco. He was named the No. 1 tech voice to follow, technology fortune teller and influencer by LinkedIn in 2018. His research has been featured in many reputable sites and magazines, including Forbes, IEEE and MIT Technology Review, and he has been interviewed by ABC, CBS, NBC, BBC, NPR and Fox TV and radio stations. He is a member of the MIT Technology Review Global Panel. He studied Electrical Engineering at Lehigh University, Cybersecurity at Harvard University and Digital Transformation at Massachusetts Institute of Technology (MIT). He is the author of the books 'Secure and Smart Internet of Things (IoT) Using Blockchain and Artificial Intelligence (AI)' and 'Blockchain Technology and Applications', and winner of the 2019 Author & Artist Award of San Jose State University for the 'Secure and Smart IoT' book.
