Utilising Quantum Entanglement

Quantum entanglement is a phenomenon whereby two or more quantum systems become connected so that the state of one system can affect the state of the other(s), even when separated by large distances. Classical physics does not explain this connection, or “correlation,” between the systems. 

It has been a subject of intense study and debate since it was first proposed by Albert Einstein, Boris Podolsky, and Nathan Rosen in 1935. Quantum entanglement is one of the fundamental properties behind quantum computing, and its potential impact on finance, business, and society is enormous. 

We will discuss the history of this scientific field, ongoing research, how entanglement relates to the exciting field of quantum computing, and how these technologies may solve some essential and potentially fundamental questions as we advance.

What Is Quantum Entanglement?

The concept of quantum entanglement was born in the 20th century during the atomic age. One of the first and most famous examples of quantum entanglement is the Einstein-Podolsky-Rosen (EPR) paradox. 

In this thought experiment, two particles are created at the same point in space and then separated by a considerable distance. The spin state of one particle is measured and found to be ‘up’. According to classical physics, the other particle’s spin state should independently be ‘up’ or ‘down’ with 50-50 odds. 

However, in quantum mechanics, the measurement of the first particle instantaneously determines the state of the second: in the standard example of two particles created with total spin zero, the second particle’s spin will be found ‘down’, perfectly anti-correlated with the first. Einstein famously derided this as ‘spooky action at a distance’ and considered it a sign that quantum mechanics was incomplete.

This graphic, courtesy of NASA/JPL-Caltech, is intended to explain ‘entangled particles’. Alice and Bob represent photon detectors developed by the Jet Propulsion Laboratory (under NASA) and the National Institute of Standards and Technology.

Back in 1964

The theoretical groundwork for testing quantum entanglement was laid by physicist John Bell in 1964. Bell proposed an inequality, now known as Bell’s inequality, which sets a strict limit on how correlated the measurements of two separated particles can be if classical physics (more precisely, any local hidden-variable theory) is correct. 

However, experiments, beginning with John Clauser’s in 1972 and continuing through today’s loophole-free tests, have repeatedly shown that measurements on entangled particles violate Bell’s inequality, confirming the existence of quantum entanglement.

One of the most important implications of quantum entanglement is quantum teleportation: the process of transferring the state of a quantum system from one location to another without physically moving the system. The protocol was proposed in 1993 by physicist Charles Bennett and his colleagues, and in 1997 experimental groups first teleported a photon’s state across a laboratory bench. 

Since then, scientists have teleported the states of atoms and ions and have extended photonic teleportation over increasingly large distances, both inside and outside the laboratory. Quantum entanglement has also been used to create highly secure forms of encryption.

For example, in a quantum key distribution (QKD) process, two parties can communicate securely by sharing a secret key encoded in entangled particles. Furthermore, because any attempt to measure the state of these particles will alter it, any third party attempting to intercept the communication can be detected. We’ll discuss the implications of QKD a bit more in a subsequent section.
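
To make the idea concrete, here is a deliberately simplified, classical simulation of the key-sifting step of an entanglement-based QKD exchange (in the spirit of the BBM92 protocol). It is a toy sketch, not a real implementation: the basis labels, the 20-round length, and the omission of an eavesdropper model are all illustrative assumptions, and genuine security requires actual quantum hardware.

```python
import random

# Toy simulation of entanglement-based key sifting (illustrative only).
# Each round, Alice and Bob measure one half of a shared entangled pair
# in a randomly chosen basis ("+" or "x"). When their bases happen to
# match, their outcomes are perfectly correlated and yield a key bit;
# mismatched rounds are discarded. An eavesdropper measuring in transit
# would disturb the correlations and show up as errors (not modeled here).
rounds = 20
alice_bases = [random.choice("+x") for _ in range(rounds)]
bob_bases = [random.choice("+x") for _ in range(rounds)]
# The shared outcome of each pair when measured in a matching basis.
pair_outcomes = [random.randint(0, 1) for _ in range(rounds)]

sifted_key = [bit for bit, a, b in zip(pair_outcomes, alice_bases, bob_bases)
              if a == b]
print("sifted key bits:", sifted_key)
```

On average, half the rounds survive sifting, so distributing an n-bit key requires roughly 2n entangled pairs before error checking.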

The Potential of Quantum Entanglement

Several research areas are exploring practical applications of quantum entanglement. One possible application is in the field of quantum sensing.

Quantum sensors use entanglement properties to measure physical phenomena with unprecedented accuracy. For example, quantum sensors can measure temperature, pressure, and acceleration more precisely than classical sensors. Quantum sensors can also detect faint signals, such as gravitational waves, that are otherwise difficult to detect.

Quantum entanglement is also being researched as a possible technology for quantum communication. A quantum communication network would use entangled particles to transmit information, making the communication highly secure and resistant to eavesdropping.

In medicine, quantum entanglement is being researched to develop quantum-based diagnostic tools. 

For example, in a study published in the journal Nature Communications, researchers from China proposed a new method to detect cancer cells using entangled photons; this technique is highly sensitive, non-invasive, and could be used for early cancer diagnosis.

Quantum Entanglement and Quantum Computing

Quantum entanglement and quantum computing are closely related. Quantum computing relies on the properties of quantum mechanics, including superposition and entanglement, to perform certain types of calculations much faster than classical computers can. In a quantum computer, information is stored in qubits, which can exist in superposition and become entangled with one another. Using these properties, a quantum computer can solve certain complex mathematical problems, such as factoring large numbers, that would take a classical computer an impractical amount of time.
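
As a concrete illustration of entangled qubits, the following minimal sketch (plain Python with NumPy, deliberately not tied to any vendor’s quantum SDK) builds the two-qubit Bell state and samples measurements from it, showing that the two qubits’ outcomes always agree:

```python
import numpy as np

# Single-qubit basis states |0> and |1>.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Hadamard gate and CNOT gate (control qubit 0, target qubit 1).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to qubit 0, then CNOT: the result is the Bell
# state (|00> + |11>) / sqrt(2), a maximally entangled two-qubit state.
state = CNOT @ np.kron(H @ zero, zero)

# Measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
samples = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)  # only "00" and "11" occur: measuring one qubit fixes the other
```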

Quantum communication (introduced above) is also being incorporated into quantum computing. Quantum communication allows qubits to be shared between different locations, which is necessary for the distributed nature of quantum computing. This enables quantum computing to be performed on a large scale, with many qubits distributed across different locations, allowing for more powerful quantum algorithms.

Possible Uses of Quantum Entanglement

Quantum entanglement has the potential to be used in several ways across finance, business, healthcare, and more.

  1. Quantum cryptography. Quantum key distribution (QKD) allows two parties to communicate securely by sharing a secret key encoded in entangled particles. QKD could be used to protect sensitive financial transactions such as online banking or stock trading. On the other hand, quantum computers could also break many of the encryption algorithms currently used to protect sensitive information. This code-breaking could have significant implications for online transactions, medical records, and communications security.
  2. Quantum computing. Quantum computers could be used to solve complex optimization problems in finance, such as portfolio optimisation or risk management.
  3. Quantum machine learning. Quantum machine learning (QML) is a field that combines the power of quantum computing with machine learning algorithms. QML could be used to analyze large sets of financial data, such as stock market trends, and make more accurate predictions, which could have applications in fields beyond finance, such as healthcare and transportation.
  4. Quantum internet. The idea of a quantum internet is based on using quantum entanglement to transmit information in a highly secure way. This could be used to create a new kind of internet that would be highly resistant to hacking, which could be important for financial institutions that must protect sensitive information.
  5. Quantum random number generation. Quantum entanglement can be used to generate truly random numbers, which could serve as secure encryption keys for financial transactions and for encoding sensitive information (see the sketch after this list).
  6. Drug discovery. Quantum computers could be used to simulate the behaviour of molecules. This ability could accelerate the drug discovery process and make it more efficient, produce better health outcomes for many patients, and prevent the growing problem of antibiotic resistance.
  7. Optimization problems. Quantum computers could solve specific optimization problems faster than classical computers. Such optimization could have applications in logistics, finance, and energy management.
  8. Quantum simulation. Quantum computers could simulate the behaviour of quantum systems with high accuracy. This simulation could study the properties of materials, predict the behaviour of complex systems, and understand the properties of fundamental particles such as quarks and gluons.
  9. Quantum chemistry. Quantum computers can be used to simulate the behaviour of chemical compounds and predict their properties, which could help speed up the discovery of new materials, catalysts, and drugs.
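
As a sketch of item 5, the snippet below mimics how measuring a qubit held in an equal superposition yields unbiased random bits. One caveat: a software simulation necessarily falls back on a pseudorandom generator, so the “truly random” part genuinely requires quantum hardware.

```python
import numpy as np

# A qubit in the state (|0> + |1>)/sqrt(2) gives 0 or 1 with equal
# probability on measurement. Hardware QRNGs exploit this physics;
# here we only reproduce the statistics with NumPy's pseudorandomness.
amplitudes = np.array([1.0, 1.0]) / np.sqrt(2)
probs = np.abs(amplitudes) ** 2  # [0.5, 0.5]

bits = np.random.choice([0, 1], size=32, p=probs)
print("sampled bits:", "".join(map(str, bits)))
```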

It’s important to note that these are only potential applications. All show great promise, but most are still in the research and development stage, and it will likely take some time before they become practical technologies. However, many of them overlap, and if one core challenge were solved, a host of other applications would soon follow.

Closing Thoughts

Quantum entanglement is a mysterious and fascinating phenomenon. It has already been used to create highly secure forms of encryption and to teleport the state of quantum systems over large distances. The study of quantum entanglement continues to be an active area of research, with scientists working to uncover its true nature, properties, and potential uses. 

Despite its many potential applications, the true nature of quantum entanglement still needs to be fully understood. Some theories propose that entanglement is a fundamental aspect of the universe, while others suggest that it results from more complex interactions between particles. 

Whatever the truth behind quantum entanglement, scientists will continue to research how it, and the quantum computers built on its principles, can solve some of the most challenging problems and questions we are currently asking. If it lives up to the hype, quantum entanglement could prompt a new technological age and fundamentally alter our understanding of physics. 

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice; it is provided for educational purposes only. By using this, you agree that the information does not constitute investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

Data and Machines of the Future

As we move toward our future, we increasingly notice two concepts that have always been at odds: data and computing power. The rift runs as follows: we have always had more data than we can process, and much of the data we have is not the best data to be processing. We are now reaching the point where these two issues are starting to blur.

First, we are creating computers with the ability to process the vast amounts of data we now generate. Second, we are creating synthetic data that may not be “real” but that users may actually prefer if it is “authentic.” Let’s discuss these two topics and how they will interact in the future.

Rise of the Machines

A new class of computers is emerging that stretches the boundaries of the problems they are capable of solving. These new devices, drawn from three areas discussed below (high-performance computing, quantum computing, and biology-inspired computing), are pushing aside the limits of Moore’s Law and creating a new computing-capability curve. 

Companies and industries have always been defined by their limitations, the currently unsolvable problems. However, these new machines may help companies solve and move beyond the presently unsolvable.

These ongoing challenges define the boundaries of companies and their core products, services, and overall strategies at any given time. For decades, the financial services industry has operated under the assumption that predicting the movement of the stock market and accurately modeling market risk is somewhere between hard and impossible, but that assumption may not hold in the near future.

Emerging technologies, when combined, can potentially make these core challenges achievable. With quantum computing as the next level of problem-solving, joined with high-performance computers (HPCs) or massively parallel processing supercomputers (MPPSCs), it becomes possible to use never-before-seen swaths of data. 

As business leaders, we must create partnerships and inroads to understanding the latest technological developments in the computing field and in our industry at large. This creative process includes experimentation and the design of a skills pipeline that will lead to future success.  

New Data Types

With the increases in chatbots, augmented reality (AR), and synthetic data (including deep fake audio, images, and video), we are forced to evaluate what is “real” and what is not. When we see news of the latest global issue, we want to know that it is real, but do we care if the newest advertisement for our favorite snack is? 

We may even prefer the unreal. Say we are discussing a sensitive health issue with a synthetic (e.g., AR-rendered) nurse, or we are training an AI on synthesized data designed to remove historical discrimination; in such cases, the unreal may be the preference. 

As technology progresses, we will shift from a desire for the real to a desire for the authentic, and authenticity is defined by four foundational measures (sketched in code after the list):

1. Provenance. What is the source of the data?

2. Policy. How has the data been restricted?

3. People. Who is responsible for the data?

4. Purpose. What is the data trying to accomplish?
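
One way to operationalize these four measures is to attach a small metadata record to every dataset. The sketch below is our own illustration; the class and field names are hypothetical, not an established standard:

```python
from dataclasses import dataclass

# Hypothetical metadata record capturing the four authenticity measures.
@dataclass
class AuthenticityRecord:
    provenance: str  # Provenance: what is the source of the data?
    policy: str      # Policy: how has the data been restricted?
    people: str      # People: who is responsible for the data?
    purpose: str     # Purpose: what is the data trying to accomplish?

record = AuthenticityRecord(
    provenance="synthetic, derived from anonymized customer records",
    policy="internal analytics only; no re-identification",
    people="data-governance team",
    purpose="train a support chatbot without exposing real customers",
)
print(record)
```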

Synthetic data aims to correct data bias, protect data privacy, and make AI algorithms fairer and more secure. Synthetic content helps design more seamless experiences and provides novel interactions with AI that save time and energy and reduce costs. However, the use of synthetic data will be complex and controversial.

Data and Computing

High performance is a growing necessity. IDC reported that 64.2 zettabytes (ZB) of data were created or replicated in 2020, a figure expected to nearly triple to 180 ZB by 2025. Only 10.6% of the 2020 data was useful for analysis, and of that, only 44% was actually used, roughly 3 ZB in all.

The answer to this massive issue is high-performance computing (HPC), and while the field of HPC is not new, its potential has expanded. Today’s smartphones contain the processing power of supercomputers from three decades ago. 

Now, GPUs, ASICs, and other purpose-built processors, such as Tesla’s D1 Dojo chips, designed specifically for computer-vision neural networks and intended as the foundation of autonomous-driving technology, are pushing HPC capability to new levels.

The Unreal World

Another issue for the future is the unreal. Dealing with a call-center bot that does not understand your request is maddening, but the potential of AI is already becoming indispensable in business. It constantly improves, and what was once a “differentiator” for businesses has now become a necessity.

Synthetic data is being used for AI model training in cases where real-world data cannot be used. This “realish” yet unreal data can be shared, protecting confidentiality and privacy while maintaining statistical properties. Further, synthetic data can counter human-introduced biases to increase diversity.
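
As a minimal sketch of that idea, the snippet below fits only aggregate statistics (mean and covariance) of a private dataset and samples shareable synthetic records from them. A plain Gaussian fit is our simplifying assumption here; production tools use far richer generative models:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Stand-in for a private dataset: 500 records with 3 numeric features.
real = rng.normal(loc=[50.0, 1.2, 300.0], scale=[10.0, 0.3, 40.0],
                  size=(500, 3))

# Fit only aggregate statistics, never releasing any original row.
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Sample synthetic records that preserve those statistics.
synthetic = rng.multivariate_normal(mean, cov, size=500)
print("real mean:     ", np.round(mean, 2))
print("synthetic mean:", np.round(synthetic.mean(axis=0), 2))
```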

However, synthetic data comes with significant challenges. The world of deep fakes and disinformation is causing predictable damage, and the use of AI algorithms in social media creates echo chambers, filter bubbles, and algorithmic confounding that can reinforce false narratives. 

New Computing Technologies

Quantum Computing

While HPCs and supercomputers are able to process ever more data, they are essentially faster versions of the same classical architecture. 

The next generation of computing evolution will likely arrive when quantum computers begin to solve problems we currently consider intractable. Quantum research is still in its infancy but is likely to follow an exponential curve. 

The estimated number of qubits needed to crack current cybersecurity standards is several thousand. The devices being designed by IBM and Google have reached an announced 127 qubits, while others claim to have reached 256. That is up from 53 qubits for both IBM and Google in 2019. 

A doubling every two years sounds like Moore’s Law, but Moore’s Law does not translate directly to quantum computing. Because of entanglement, adding one more qubit to a quantum system doubles the amount of information the system can represent. The move from 53 to 127 qubits therefore means the accessible state space has doubled 74 times, a factor of 2^74, in just three years.
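
A back-of-the-envelope calculation (plain Python, nothing vendor-specific) shows the scale of that growth:

```python
# An n-qubit register is described by 2**n complex amplitudes, so each
# added qubit doubles the size of the state space.
for n in (53, 127):
    print(f"{n} qubits -> 2**{n} = {2**n:.3e} amplitudes")

# Going from 53 to 127 qubits multiplies the state space by 2**(127 - 53).
print(f"growth factor: 2**74 = {2**74:.3e}")
```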

Mimicking and Using Nature

The other technology reshaping computing takes its lessons from nature. Biology-inspired computing draws its ideas from a 3.7-billion-year-old system. There are two subclasses of biocomputing:

1. Biomimicry, or computing systems that draw their inspiration from biological processes.

2. Biocomputing, or systems that use biological processes to conduct computational functions.

Biomimicry systems have been used in chip architectures and data-science algorithms. However, we are now beginning to see machines that not only mimic biological operations but actually leverage biological processes to compute. 

Data storage is a biocomputing darling for good reason. One estimate based on natural DNA predicts that an exabyte of data (one million terabytes) could be stored in a single cubic centimeter of space and persist for over 700,000 years. 

Moving Forward

How do businesses incorporate new forms of data and new ways of computing into practice? 

The first action is to begin evaluating how these different technologies will shape the industry and your operations. Which problems are currently written off as a cost of doing business, and what would change if they could be solved? How can synthetic data improve your current business functions, and what pitfalls could have a negative impact? What kinds of machines could affect your business first?

Those who desire to take an active role in shaping the future should consider what hardware can be used to solve the currently unsolvable.  

No matter the industry, forging partnerships is an essential step. Most businesses can gain skills and capabilities from such partnerships, and many industry problems require collaboration at a comprehensive scale. 

Alliances and partnerships formed today will produce the industry standouts of tomorrow.

Closing Thoughts

We have always been defined by our unanswerable questions, and the advent of computers has helped us solve grand challenges. We are also facing a synthetic, unreal world intended to improve our lives, but, depending on the user’s intent, such data and its progeny can also be a tool of malice.

Both of these concepts have reached the point where business leaders can no longer treat them as abstract. They are rapidly improving, and their impacts on industries will be profound in the coming decade. The unsolvable will become a thing of the past, and what we believe is real will come into question. 

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

Intercontinental Quantum Cryptography Is Now Here to Stay

China’s Micius quantum satellite launched in 2016. A year later, it became part of the first satellite-to-ground quantum network, and in 2018 it enabled the first intercontinental quantum cryptography service: a videoconference between China and Austria secured with the Advanced Encryption Standard (AES), using keys exchanged through quantum key distribution. 

This demonstrated secure communication on a global scale and laid the groundwork for a future quantum internet. 

As MIT Technology Review stresses, many organizations await the commercial availability of this type of secure communication. When will this happen? In the world of COVID-19, chances are sooner rather than later. 

Source: https://www.technologyreview.com/2018/01/30/3454/chinese-satellite-uses-quantum-cryptography-for-secure-video-conference-between-continents/
