AI’s Transformation of Oncology

Artificial intelligence (AI) is steadily reshaping our lives. It saves companies and consumers time and money, and its applications in medicine have the potential to save lives.

Understanding AI’s evolution and achievements helps us model future development strategies. One of AI’s most significant medical impacts is already being felt, and will continue to grow, in oncology.

AI has opened essential opportunities for cancer patient management and is being applied to aid in the fight against cancer on several fronts. We will look into these and see where AI can best aid doctors and patients in the future. 

Where Did AI Come From?

Alan Turing first conceived the idea of computers mimicking critical thinking and intelligent behavior in 1950, and by 1956 John McCarthy had coined the term Artificial Intelligence (AI).

AI started as a simple set of “if A then B” computing rules but has advanced dramatically in the years since, comprising complex multi-faceted algorithms modeled after and performing similar functions to the human brain.

AI and Oncology

AI has now taken hold in so many aspects of our lives that we often do not even realize it. Yet it remains an emerging and evolving technology that benefits many scientific fields, including offering a pathway of support to those who manage cancer patients.

AI excels at a specific kind of task: recognizing patterns and interactions once it has been given sufficient training samples. It uses the training data to develop a representative model, then applies that model to process new data and aid decision-making in a specific field.
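As a rough illustration of that train-then-decide workflow, the sketch below fits a simple classifier to a handful of labeled samples and then applies it to a new case. The toy data, feature meanings, and choice of scikit-learn are illustrative assumptions, not anything taken from the article.

```python
# Minimal sketch of the train-then-decide pattern described above.
# The toy features and labels are hypothetical, not clinical data.
from sklearn.linear_model import LogisticRegression

# Training samples: each row is a case with two measured features,
# and each label records the known outcome for that case.
X_train = [[0.2, 1.1], [0.4, 0.9], [1.8, 3.2], [2.1, 2.9]]
y_train = [0, 0, 1, 1]  # 0 = unremarkable pattern, 1 = suspicious pattern

model = LogisticRegression().fit(X_train, y_train)  # build the representative model

# Apply the trained model to an unseen case to support a decision.
print(model.predict([[1.9, 3.0]]))  # -> [1]
```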

When applied to precision oncology, AI can reshape the existing processes. It can integrate a large amount of data obtained by multi-omics analysis. This integration is possible because of advances in high-performance computing and several novel deep-learning strategies. 

Notably, applications of AI are constantly expanding in cancer screening and detection, diagnosis, and classification. AI is also aiding in the characterization of cancer genomics and the analysis of the tumor microenvironment, as well as the assessment of biomarkers for prognostic and predictive purposes. AI has also been applied to follow-up care strategies and drug discovery.  

Machine Learning and Deep Learning

To better understand the current and future roles of AI, two essential terms that fall under the AI umbrella must be clearly defined: machine learning and deep learning.

Machine Learning

Machine learning is a general concept describing the ability of a machine (a computer) to learn from data and thereby improve its patterns and models of analysis.

Deep Learning

Deep learning, on the other hand, is a machine learning method that uses layered algorithmic systems, called deep neural networks, which mimic networks of biological neurons. Once trained, these deep networks can achieve high predictive performance.

Both machine and deep learning are central to the AI management of cancer patients.  

Current Applications of AI in Oncology

To understand the roles and potential of AI in managing cancer patients and show where the future uses of AI can lead, here are some of the current applications of AI in oncology.  

In the charts below, “a” refers to oncology and related fields and “b” to the types of cancers being diagnosed.

Courtesy of the British Journal of Cancer. (a) Oncology and related fields: cancer radiology 54.9%, pathology 19.7%, radiation oncology 8.5%, gastroenterology 8.5%, clinical oncology 7.0%, gynecology 1.4%. (b) Tumor types: general cancers 33.8%, breast cancer 31.0%, lung cancer 8.5%, prostate cancer 8.5%, colorectal cancer 7.0%, brain tumors 2.8%, and six other tumor types at 1.4% each.

The above graph, from the British Journal of Cancer, summarizes all FDA-approved artificial intelligence-based devices for oncology and related specialties. The research found that 71 devices have been approved. 

As we can see, most of these are for cancer radiology, meaning they detect cancer through various radiological scans. According to the researchers, the vast majority of the approved devices (>80%) relate to the complicated area of cancer diagnostics.

Courtesy of cancer.gov

The image above shows a deep learning algorithm trained to analyze MRI images and predict the presence of an IDH1 gene mutation in brain tumors.

Concerning the different tumor types that AI-enhanced devices can investigate, most devices are applied to a broad spectrum of solid malignancies classed as cancer in general (33.8%). However, the specific tumor that accounts for the largest number of AI devices is breast cancer (31.0%), followed by lung and prostate cancer (both 8.5%), colorectal cancer (7.0%), brain tumors (2.8%), and six other types (1.4% each).

Moving Forward with AI

From its origin, AI has shown its capabilities in nearly all scientific branches and continues to possess impressive future growth potential in oncology.  

The devices that have already been approved are not conceived as a substitute for classical oncological analysis and diagnosis, but as an integrative tool for exceptional cases and for improving the management of cancer patients.

A cancer diagnosis has classically represented a starting point from which appropriate therapeutic and disease management approaches are designed. AI-based diagnosis is a step forward and will continue to be an essential focus in ongoing and future development. However, it will likely be expanded to other vital areas, such as drug discovery, drug delivery, therapy administration, and treatment follow-up strategies.

The cancer types with a specific AI focus today (breast, lung, and prostate cancer) are all high in incidence. This leaves room for AI-driven improvements in diagnosis and treatment for other tumor types, including rare cancers that still lack standardized approaches.

However, it will take longer to build large and reliable data sets for rare cancers. Taken as a group, rare cancers are one of the essential categories in precision oncology, and this group will become a growing focus for AI.

Given the positive results already seen with AI in oncology, AI should be allowed to expand its reach into the cancer-related questions it has the potential to resolve. If given this opportunity, AI could be harnessed as the next step in a cancer treatment revolution.

Closing Thoughts

Artificial intelligence (AI) is reshaping many fields, including medicine and the entire landscape of oncology. AI brings to oncology several new opportunities for improving the management of cancer patients. 

It has already proven its abilities in diagnosis, as shown by the number of FDA-approved devices in practice. AI’s focus so far has been on the cancers with the highest incidence, but rare cancers, taken as a group, represent a massive avenue of potential.

The next stage will be to create multidisciplinary platforms that use AI to fight all cancers, including rare tumors. We are at the beginning of the oncology AI revolution. 

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

The Convergence of Technology and Healthcare

The Covid-19 pandemic acted as a catalyst for change in life sciences and healthcare. This article will discuss how new technologies, including blockchain and cybersecurity, and the talent needed behind them, are impacting the medical sector.

Recent Changes to Healthcare

We have seen how the past few years have been shaped by the Covid-19 pandemic, which disrupted and revolutionized nearly every sector of our economy. 

When we look at monetary investment, it’s evident that technology spending is focused on healthcare. A report from Bain and Co found that even amid economic uncertainty, healthcare providers still plan to invest in technology, with software a top-five strategic priority for 80% of providers and a top-three priority for 40%.

This spending is for several reasons: efficiency, cost reduction, and telemedicine, whether by phone or video. Heavy technology investment in the era of Covid-19 caused healthcare to leapfrog into patients’ homes. 

These changes will be the driver of healthcare’s growth for the next few years. Yet we need to have a strong understanding of how the consumer fits into this system of delivering service, what their preferences are, and the new habits they are forming.

Once Before, in the 1920s

Periods of economic and geopolitical uncertainty have led to healthcare advancements. 

In the 1920s, there were many geopolitical tensions that eventually led to wars, but throughout the decade and the rest of the 20th century, there were remarkable advances in medicine. 

The construction of hospitals that followed the passing of the Hill-Burton Act in 1946 laid the foundation of our current health delivery system, in the same way that our highway system and other infrastructure changed the face of America and its economy. We’ll likely see a similar change around needed vaccines and other overdue innovations.

Rather than roads, bridges, and buildings, this time we’ll see digital infrastructure. Building on the development of the first mRNA Covid vaccines, we’ll find many ways to accelerate drug and vaccine development through biotechnology and innovation. Technology is an added dimension to healthcare innovation that has emerged from the Covid turmoil, and when it is added to the mix, we’re going to see some fantastic opportunities.

The Covid Cause

It’s remarkable to think that a significant, globally impacting event became the catalyst that accelerated healthcare’s tech investment. If the necessary Covid closures had lasted only a single week, many of these changes would never have occurred.

Doctor visits would simply have been pushed back a week instead of prompting the remote solutions needed to deliver required services, along with the changed behaviors those solutions have brought. The R&D plans that are now part of biotech and medical companies’ strategies would likely never have materialized.

But necessity is the mother of innovation, and because of Covid-19, these changes are now embedded and permanent. Many experts believe that the two years of Covid moved the industry ahead by 5 to 10 years.

A Move Toward NFTs in Healthcare

Non-fungible tokens (NFTs) have been an investment darling in the art world but have yet to gain much prominence outside the art and collecting arenas. This lack of diversified use is starting to change, and healthcare is up next.

NFTs are an exciting area for healthcare services. It’s easy to imagine a world where an NFT becomes a patient’s healthcare profile, carrying personal information such as the entire genome, full medical history, and payment information as a unique footprint.

An NFT could also give its owner a pathway into the healthcare system and its services. This information could be combined with the banking system, making care delivery more viable. Imagine a health savings account tied directly to the NFT through an oracle (a third-party data gateway).

This would allow someone to fund their health savings account through their W2-qualifying job, with eligible charges withdrawn from the account automatically.

This kind of payment system is just starting to appear at the municipal level. Cities like New York and Miami have begun moving toward such systems, with Philadelphia and Dearborn, Michigan, signaling similar moves. It’s not far-fetched to imagine a similar approach applied to healthcare payments.

Cybersecurity in Healthcare

Wherever there is human involvement, there is potential for security vulnerabilities. Another issue all companies are dealing with is finding talent capable of building systems and products that protect company and personal data. There is an ongoing global shortage of nearly 3.5 million cybersecurity professionals across all industries, with 700,000 unfilled cybersecurity jobs in the US.

Cybersecurity for healthcare also requires developing technicians who can play defense, responding to cyberattacks quickly and in real time. Hacking is accelerating and is a top risk for many companies, not just in tech.

Interestingly, one of hacking’s growing tools, AI, may also be its best defense as more information and services are digitized. Significant investment is flowing into software projects that help protect and defend data. In November 2022, Crunchbase showed 258 privacy startups that had raised over $4.3 billion, with $800 million of that total raised in the last year.

Life sciences and healthcare are industries that shape policy and security practices. Many boards and audit committees in the healthcare and life science sectors are working to identify cyber risks and vulnerabilities, and the demand for cyber-fluent personnel is fully expected to increase dramatically.

Permanent Changes Coming to Healthcare

Tech is now taking over in several areas, including consumer electronics. Wearables and connected devices are becoming a more common source of medical information. Alivecor’s KardiaMobile device is a 6-lead EKG that can send information via smartphone directly to the patient’s cardiologist for review.  

Source: Alivecor

The Consumer Electronics Show in Las Vegas is filled with sensors, apps, and embedded personalization. This expansion of health devices will only increase as 5G networks extend their reach across the United States. The impacts will be wide-ranging but ultimately focused on enhancing our lives through tech.

One crucial, long-term benefit is that we are now seeing the healthcare economy moving from a sickness focus to a wellness mindset. This change is easier to accomplish with technology as we can monitor our health and see when things change.  

Upcoming Healthcare Trends

The healthcare sector will first see a move toward modernization in human resources, finance, and procurement through cloud services. Moving all legacy enterprise systems to the cloud will take nearly ten years. 

Next, innovation must tackle the back-office-to-front-office connection, including consumer-level devices. We have been discussing healthcare costs for decades, and the technology to make care more efficient is now available. This change can drive out costs and potentially deliver care to all.

Closing Thoughts

Covid-19 has accelerated technology in healthcare, pushing forward digital health access and drug and vaccine innovation. These trends are altering research and development pathways for healthcare.

NFTs have begun to enter the healthcare space and, in the future, will likely be a secure way to provide needed information to providers, including genome and medical history. Cybersecurity issues will come to the forefront in healthcare tech with more need for talent and solutions to keep users’ data secure. 

Alongside this need for talent comes the opportunity for tech to provide equitable solutions that lower costs and bring healthcare to all. A process of modernization that puts enterprise services on the cloud will be the biggest change we see. Further, it will promote a focus on wellness over sickness as consumer devices become ubiquitous.

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

How AI Transforms Medical Research

Using artificial intelligence (AI), businesses had been moving toward digital transformation long before the Covid-19 pandemic, in their collective quest to optimize production, product quality, safety, services, and customer experiences. Some also actively pursued a more sustainable planet for all.

The advantages of the next digital era feel limitless. Still, businesses have been hesitant to adopt these technologies because they require significant behavioral and structural changes, such as new business models, operating procedures, worker skill sets, and mindsets. The technologies include not only AI but also machine learning and deep learning “at the edge” (where rapid, local automation occurs).

The pandemic acted as a wake-up call to drastically accelerate the timescale for digital transformation since it put our way of life in danger. 

The need is urgent and lifesaving, and the time is now. This is supported by a recent IBM poll showing that the Covid-19 pandemic caused the majority of global organizations (six out of 10) to accelerate their digital transformation strategies.

Source: https://www.globaldata.com/covid-19-accelerated-digital-transformation-timeline-pharmaceutical-industry/

The pandemic has shown how creative problem-solving and once-in-a-lifetime risk-taking lead to incredible breakthroughs and significant improvements.

Medical research is one vital area that is reaping the benefits of accelerated AI adoption. 

AI and Predicting Outbreaks

Epidemiologists are already benefiting from improved AI algorithms, which evaluate ever-increasing amounts of publicly accessible data to track the onset and spread of infectious illnesses. To forecast the spread of the flu and other diseases in various regions, researchers are analyzing geographical data and internet search queries on common symptoms.

Time is an advantage. Before calling a doctor, people are already aware that they are unwell. Before obtaining professional assistance, many people attempt to self-diagnose online. 

Epidemiologists may use machine learning models to anticipate the spread of the flu in a particular location with a high degree of confidence if they see a surge in searches for phrases like “sore throat” or “difficulty swallowing” originating from IP addresses in a specific ZIP code.

Source: https://time.com/5780683/coronavirus-ai/
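As a rough sketch of that search-based forecasting idea, the example below fits a simple classifier to weekly symptom-search counts for one ZIP code and estimates the probability of a flu surge. The counts, labels, and choice of scikit-learn are illustrative assumptions, not data from any of the cited sources.

```python
# Illustrative only: estimate the likelihood of a local flu surge from
# symptom-related search volumes. All numbers below are hypothetical.
from sklearn.linear_model import LogisticRegression

# Weekly counts of searches such as "sore throat" and "difficulty swallowing"
# from one ZIP code, paired with whether a flu surge followed (1) or not (0).
X = [[120, 40], [90, 25], [480, 200], [530, 260], [150, 60], [610, 310]]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X, y)

this_week = [[500, 240]]  # a sudden spike in symptom searches
print(model.predict_proba(this_week)[0][1])  # estimated probability of a surge
```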

Governmental health organizations assess crowd densities by location and analyze that information, using public data and demographic mapping, to forecast the likelihood of future outbreaks. For instance, health authorities in Europe, Israel, China, and elsewhere use anonymized mobile phone traffic density data to train machine learning models that predict how many people will visit specific sites on a given day. Venues might limit attendance, reduce visiting hours, or even close if the total rises to pandemic levels.

Optimizing Treatment

AI is already being used to diagnose diseases such as cancer earlier and with more accuracy. The American Cancer Society notes that mammograms produce a high rate of misleading findings, with roughly one in two healthy women receiving a false alarm over years of regular screening. Thanks to AI, mammogram reviews and interpretations can be 30 times faster and 99% accurate, reducing the need for unnecessary biopsies.

People with chronic or lifelong diseases may also fare better thanks to AI. One inspiring example: machine learning models analyze cochlear implant sensor data to give deaf patients feedback on how they sound, so they can interact with the hearing world more effectively.

Computer Vision

In contrast to the human eye, AI-based computer vision can quickly sift through thousands of images to find patterns. In medical diagnostics, where overworked radiologists struggle to pick up every detail of one image after seeing hundreds of others, this technology is a great help. AI assists human specialists in situations like this by prioritizing visuals that are most likely to show a problem.

Source: https://www.altexsoft.com/blog/computer-vision-healthcare/

X-rays, CT scans, MRIs, ultrasound pictures, and other medical images provide a rich environment for creating AI-based tools that support clinicians with identifying various problems.

Drug Discovery

Small-molecule drug development can benefit from AI in four ways: access to new biology, enhanced or unique chemistry, higher success rates, and faster, less expensive discovery procedures. This addresses numerous problems and limitations in conventional research and development. Each application gives drug research teams new information, and in certain situations it might completely change tried-and-tested methods.

Source: https://zitniklab.hms.harvard.edu/drugml/

BioXcel Therapeutics uses AI to find and create novel drugs in the areas of neurology and immuno-oncology. The company’s drug re-innovation initiative also uses AI to uncover fresh uses for current medications and to identify new patient populations for them.

Transforming the Patient Experience

Time is money in the healthcare sector. Hospitals, clinics, and doctors can treat more patients each day when they deliver a smooth patient experience.

In 2020, more than 33 million patients were admitted into U.S. hospitals, each with unique medical needs, insurance coverage, and circumstances that affected the quality of care. According to studies, hospitals with satisfied patients make more money, while those with dissatisfied patients may suffer financial losses.

New advancements in AI healthcare technologies are streamlining the patient experience, enabling medical personnel to handle millions, if not billions, of data points more effectively.

Employers who want to give their staff the tools to maintain good mental health can use Spring Health’s mental health benefits solution.

Each person’s whole dataset is collected as part of the clinically approved technology’s operation, and it is compared to hundreds of thousands of other data points. Using a machine learning approach, the software then matches users with the appropriate specialist for in-person care or telemedicine sessions.

For treating chronic illnesses like diabetes and high blood pressure, One Drop offers a discreet solution. With interactive coaching from real-world experts, predictive glucose readings powered by AI and data science, learning resources, and daily records taken from One Drop’s Bluetooth-enabled glucose reader, the One Drop Premium app empowers people to take control of their conditions.

AI Does Not Replace Humanity

Faster, more accurate diagnoses and lower claim processing error rates are just two of the potential benefits of AI that CEOs at healthcare organizations already see. But they must also realize that no amount of advanced technology will ever fully replace the human experience.

Business executives must also consider the possibility of bias in AI algorithms based on past beliefs and data sets, and put safeguards in place to address this problem. For instance, there has historically been discrimination in how specific populations’ medical illnesses are identified and treated.

AI is there to augment human decision-making in healthcare–not replace it. 

Closing Thoughts

It has often been tricky to tell whether AI is living up to its potential or whether everything we read is merely hype. For several years, due to the roadblocks outlined at the beginning of this article, progress was slow and leaned on that hype. However, the pandemic is genuinely accelerating the integration of AI in healthcare and medical research. It almost sounds cliché now, but Covid-19 has initiated a “new normal” in healthcare.

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

Artificial Neural Networks for Finance

Back in the early days of data science, before it was even called data science, the programs that handled financial applications were called Expert Systems. These were a domain of AI developed using the knowledge of a “human expert.” The expert’s knowledge was used to create a set of programming rules to help the algorithm make decisions.

At its most basic level, an Expert System would look like this:

If the price of asset “A” exceeds that of asset “B” by more than X%, then sell asset A (or buy asset B, or do both); or:

If a prospective borrower has a credit score below 591, do not lend them anything.
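Written out as code, those two rules might look like the sketch below. Only the 591 credit-score cutoff comes from the text; the function names, thresholds, and return values are illustrative assumptions.

```python
# A minimal sketch of the two expert-system rules above as explicit
# "if A then B" logic. Only the 591 credit-score cutoff comes from the text.

def pairs_trade_signal(price_a: float, price_b: float, x_pct: float) -> str:
    """Rule 1: if asset A's price exceeds asset B's by more than X%, sell A."""
    if price_b > 0 and (price_a - price_b) / price_b * 100 > x_pct:
        return "SELL A (and/or BUY B)"
    return "HOLD"

def lending_decision(credit_score: int) -> str:
    """Rule 2: do not lend to borrowers with a credit score below 591."""
    return "DECLINE" if credit_score < 591 else "REVIEW FURTHER"

print(pairs_trade_signal(price_a=110.0, price_b=100.0, x_pct=5.0))  # SELL A (and/or BUY B)
print(lending_decision(credit_score=560))                           # DECLINE
```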

Such expert systems have been used successfully in fraud detection, medical diagnosis, and even mineral prospecting. However, they have a major limitation: they require complete information as input, which means they perform poorly, or not at all, under uncertainty.

Financial applications primarily deal with predicting future events from past data. This is why artificial neural networks (ANNs) have become so popular in recent times, especially in the finance industry: they handle uncertainty better than expert systems. When we consider the various scenarios that involve prediction, a few primary areas are enhanced by using ANNs:

1. Predicting the movement of the stock market, both indexes and individual stocks
2. Predicting loan application underwriting and repayment success
3. Finding suitable credit card clients

In this article, we will explain the basics of artificial neural networks and go deeper into the applications where artificial neural networks can be the most successful and beneficial for the financial, banking, and insurance industries. Finally, we will finish with an example outline of an ANN for making credit decisions.

Artificial Neural Networks in Brief

ANNs are designed to mimic the actions of biological neural networks seen in life forms with nervous systems and brains such as humans.  

Image courtesy of Quora

The biological nerve cell takes chemical input into its dendrites, and if the signal is sufficient, it transfers the signal down its axon to the axon terminals, where it produces its own chemical signal for the next nerve cell.

The artificial neuron (sometimes called a perceptron) takes its inputs and combines them with a weighted summing function plus a bias term. An activation function then decides what to do with the result: whether to send the signal on at all, and to what degree.

This perceptron was created by Frank Rosenblatt back in the 1950s and was used by the US Navy for image recognition tasks as well as many other applications.  
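A bare-bones version of such a neuron can be written in a few lines; the weights, bias, and step activation below are illustrative choices, not the original Rosenblatt implementation.

```python
# A minimal sketch of a single artificial neuron (perceptron): a weighted sum
# of the inputs plus a bias, passed through a step activation that decides
# whether the signal is passed on. All values here are illustrative.

def perceptron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias  # summing function
    return 1 if total > 0 else 0  # step activation: fire (1) or stay silent (0)

# Example: two inputs with hand-picked weights.
print(perceptron(inputs=[0.7, 0.2], weights=[0.9, -0.4], bias=-0.3))  # -> 1
```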

The ANN expands on the perceptron and consists of many interconnected neurons all performing their summing functions with the data inputs. Each of the following circles is a single artificial neuron.

Image courtesy of techvidvan.com

The ANN is made up of input and output layers, and a network will have at least one hidden layer between these, but can have dozens of hidden layers with numerous neurons in each layer depending on the model. 

Each neuron (a colored circle above) has its own weights in its summing function; it takes input data arriving from the left and passes its result to the next layer on the right. Information is stored in the weights of the connections between the neurons. As an ANN is “trained,” it is these weights that change in order to improve the results the model provides as its output.

This example is a “feed-forward” architecture, the type most commonly used in ANN applications. Other types of neural networks exist and perform better in certain specialized applications.

ANNs give the user the ability to utilize the data available fully and to determine the structure and parameters of a model without restrictive modeling assumptions.  

Artificial Neural Network Applications

ANNs are especially appealing in finance, banking, and insurance because an abundance of high-quality data is available in these fields. That abundance means plenty of inputs; before ANNs, there was a shortage of testable financial models able to deal with all this data.

Predicting Stock Movements

ANNs predict stock market indices and individual stock values by training on the vast supply of historical data and then forecasting based on several parameters. The accuracy of the prediction is enhanced by the choice of variables and the information provided during the training process.

Accuracy can be improved further with an ANN structure that has more hidden layers and more training variables. One group attempting to predict NASDAQ movements found that a network with three hidden layers, configured with 20-40-20 neurons, gave them an optimized network and a resulting accuracy of 94.08%.
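For a sense of what such an architecture looks like in code, the sketch below builds a 20-40-20 feed-forward classifier on synthetic data using scikit-learn. The input features, labels, and training settings are all illustrative assumptions, not the cited study’s actual setup.

```python
# A sketch of a feed-forward network with three hidden layers (20-40-20 neurons),
# mirroring the configuration described above. The features and labels are
# synthetic stand-ins, not real market data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                  # 10 hypothetical market features per day
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # stand-in label: index up (1) or down (0)

model = MLPClassifier(hidden_layer_sizes=(20, 40, 20), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])                     # train on the first 400 days
print(model.score(X[400:], y[400:]))            # hold-out accuracy on the last 100 days
```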

While there are other types of neural networks, these types of feed-forward networks are the most widely used because they offer generalization abilities and can be implemented easily.   

Searching for Credit Card Customers

Some credit card companies are using ANNs to decide whether to grant credit card applications. The underwriting process analyzes past failures and the past experience of other cardholders to inform current decisions.

All banks in the credit card business want ideal customers who help them remain profitable. A client who neither spends much with their credit card nor uses the revolving line of credit is not profitable: that customer’s per-card revenue will be much lower than the per-card cost, and the result is a low breakeven percentage.

A group of researchers used an artificial neural network to approach this problem and predict ideal customers more accurately. The study used features referred to as eigenvalues to minimize error rates when selecting the best customers. After several rounds of testing, the researchers settled on the 14 eigenvalues with the lowest error rates for choosing the most suitable customers.

This process eliminates instances where credit cards are issued to customers who have no credit card needs, and it gives the bank more meaningful questions to ask on a credit application to better identify the ideal customer.  

This approach is now broadening beyond the yes-or-no approval decision to determining how much credit to extend to approved customers.

Evaluating Loan Applications

Financial institutions will provide loans to their clients for different reasons, and these decisions are based on various factors. ANNs can be employed to aid in the underwriting process, deciding whether to approve or decline the loan application. 

Any lending institution wants to minimize the default rate on the loans it approves and maximize the returns on the loans it issues. A research group tested the accuracy of an ANN in predicting the success of loan recovery and found an accuracy of 92.6%.

Additionally, the error rates for Type I errors (making a bad loan) and Type II errors (rejecting a good loan) were 6.5% and 8.2%, respectively. The failure rates seen for loans approved using ANNs are lower than those of some of the best traditional methods.

Other Applications

Beyond those applications listed above, ANNs can be applied to several valuable use cases:

  • Forex price predictions
  • Futures movements and pricing
  • Bond ratings
  • Prediction of business failures
  • Assessment of debt risk
  • Predicting bank failure
  • Bank theft
  • Predicting recessions

How an Artificial Neural Network Decision Works

To illustrate how an ANN reaches a decision, let’s consider the inputs that could be used to make a creditworthiness decision.

Inputs

  • Age
  • Gender
  • Annual income
  • Length of time at current job
  • Marital status
  • Number of children
  • Number of children in the home
  • Education level attained
  • Homeowner or renter
  • Cars owned
  • Address/area
  • Commute distance
  • Credit score

Training and Testing

A large set of clean data containing all of the inputs, together with known outcomes, is created and fed into the ANN to train it (this is called the training set). Training changes the weighted variables at each neural node to increase the model’s prediction accuracy.

Once the ANN is trained, a different set of input data is supplied (none of which appears in the training set), and the ANN’s “loan approved” results are obtained. This second run-through of data is the “test set” and can even be done using real-time data as it comes in.

Based on what was “learned” during the training phase, the model’s predictive accuracy is refined. That accuracy depends on the input factors that go into the model as well as the number of hidden layers added to the network, which are tuned until the optimum level of accuracy is achieved.
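The sketch below walks through that train-then-test workflow on synthetic applicant data. The features loosely mirror the input list above, but the values, the stand-in approval rule, and the network size are all illustrative assumptions rather than a real underwriting model.

```python
# Illustrative training/testing workflow for a credit-decision ANN.
# All data and the "known result" rule below are synthetic assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 1000
# Columns: age, annual income, years at current job, number of children, credit score
X = np.column_stack([
    rng.integers(21, 70, n),
    rng.normal(55_000, 20_000, n),
    rng.integers(0, 30, n),
    rng.integers(0, 5, n),
    rng.integers(450, 850, n),
])
# Stand-in "known result": historically approved if score and income were high enough.
y = ((X[:, 4] > 620) & (X[:, 1] > 35_000)).astype(int)

# The training set teaches the network; the held-out test set checks its predictions.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
model = make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(16, 8),
                                                      max_iter=3000, random_state=1))
model.fit(X_train, y_train)
print("Test-set accuracy:", model.score(X_test, y_test))
```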

Closing Thoughts

ANNs have continued to improve, and their use has broadened as computing costs fall and computing power keeps increasing. They will likely remain a foundation for financial and economic models, but they may need to evolve with the adoption of quantum computers.

As we move forward, ANNs will become an even more useful tool to automate service- or data-oriented tasks. The financial, banking, and insurance worlds have an abundance of clean data that can feed into ANNs. 

Care must be taken to remove bias from any data going into the model, which helps ensure that bias does not come out of the model. Essentially, if we treat these models with care, they will bring us enormous value.

Disclaimer: The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

Banking Meets Blockchain

Initially, the banking industry ignored the world of blockchain. Blockchain’s origins were in direct opposition to the banking system and the control that banking has over our lives. 

As the blockchain industry gained momentum and investors earned their profits, the banking industry noticed. And when Ethereum and other crypto assets added smart contract functionality, the innovative vanguard of the industry saw massive potential.

It’s unwise to bet against the banks. Banks have strong incentives to invest and adapt, and they fight tooth and nail to keep their customers. While a minority of investors believe that blockchain could lead to a revolution displacing the power of large financial institutions, this is unlikely.

Prior to Covid, in 2018, Deloitte conducted its Global Blockchain Survey, polling more than 1,000 executives. The survey demonstrated how much interest the financial world already had in blockchain technology: more than 95% of respondents confirmed they were investing or planned to invest in distributed ledger or blockchain technology.

Graph Courtesy of the 2018 Deloitte Global Blockchain Survey

As we move into mid-2022, after wrestling with the pandemic, the initial curiosity seen in Deloitte’s study has turned into realized projects.

A Need for Change

Many banking services remain costly and slow while other sectors move ahead quickly, replacing antiquated products and services with new versions.

Phones, cars, computers, and even lightbulbs are being reimagined, becoming more functional and efficient. Yet much of the too-big-to-fail banking system is in no hurry to evolve, largely because of the fees it collects.

As for-profit organizations, banks want to optimize returns. They earn a spread between the interest they pay on deposits and the interest they collect on loans. Depositors receive low interest rates (fractions of a percent), but banks lend at much higher rates:

  • Today’s 30-year lending interest rate: 4.921%*
  • Student lending interest rates: 4.5–7.3%*
  • Average credit card lending interest rate: 19.53%*

*Rates at the time of writing.
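The spread itself is simple arithmetic: the lending rate minus what the bank pays depositors. The short calculation below uses the rates quoted above plus an assumed 0.5% deposit rate (the deposit figure is an assumption, not from the text).

```python
# Illustrative spread calculation: lending rate minus deposit rate.
deposit_rate = 0.005          # assumed rate paid to depositors (0.5%)
mortgage_rate = 0.04921       # 30-year lending rate quoted above
credit_card_rate = 0.1953     # average credit card rate quoted above

print(f"Mortgage spread:    {(mortgage_rate - deposit_rate) * 100:.2f} percentage points")
print(f"Credit card spread: {(credit_card_rate - deposit_rate) * 100:.2f} percentage points")
```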

Banks easily found customers because choices were limited, and debtors rarely complained, accepting their situations. With blockchain, debtors can access lower rates from more competitive lenders.

Retail Banks Circumventing Competition

As blockchain evolved, more users learned that distributed ledger technology enables real-time transfers with no middlemen and no fixed costs.

Consumer finance players now realize that blockchain projects pose significant threats to their similar services. They understand that they will lose their customers if they fail to evolve.   

How do banks fight back? They create blockchain-based solutions at prices low enough to prevent consumer switching.   

In Deloitte’s most recent Global Blockchain Survey, they found that many organizations were investing in projects across the board. 

Data courtesy of 2021 Deloitte Global Blockchain Survey

Representing only a portion of the industry, financial institutions understand the need to connect with non-financial blockchain projects growing in parallel to them. Defining these necessary projects or solutions and integrating them effectively is crucial.  

The Central Bank Movement Has Started

Globally, even slow-moving governments and central banks are beginning to create or overhaul their digital infrastructures.

The Biden administration made its first public move through an executive order recognizing the popularity of cryptos and their potential to destabilize traditional finance. The same order directed the federal government to create a crypto regulation plan, including exploring the creation of a digital dollar.

Data courtesy of 2021 Deloitte Global Blockchain Survey

Other nations’ central banks are adopting blockchain-based innovations and are overhauling their digital infrastructures to address complex operational challenges. Some central banks have already incorporated these technologies into their daily operations. 

In 2019, the Bank of England undertook a proof-of-concept test to determine how real-time gross settlement (RTGS) could evolve with blockchain. RTGS is a funds transfer system allowing the instantaneous transfer of money and/or securities.

In 2017, they synchronized the movement of two different currencies across two different real-time gross settlement systems using Ripple. Great Britain has actively researched digitizing its economy’s governance and investigated a blockchain-linked pound sterling.

The BoE’s report says that digital currency could open a number of opportunities for achieving its financial and monetary stability objectives.

Returning Power to Central Banks

With national digital currencies, central banks can counter the dominance of Visa, Mastercard, and other private networks by lowering transaction costs for users and small businesses. A “Digital Dollar,” “Britcoin,” or the digital yuan (China’s CBDC) will each accelerate the creation and adoption of other national digital currencies.

Beyond Cost Savings

Banks look to blockchains for more than cost savings or improvements to their network efficiencies. They see blockchains as foundations for RTGS revolutions worldwide.

Through blockchain’s benefits, banks can increase the security of digital transactions and prevent errors, double counting, confusion, and fraud. Bookkeeping and auditing are examples of industries overdue for disruption by blockchain.   

Distributed ledgers also address the world’s new realities. Global populations, particularly in Asia and Africa, were already reducing their use of cash before the worldwide pandemic. Since then, reductions have quickened, and the use of digital payments reached $5.4 trillion, growing by 16% year-over-year from 2020.

Much of the growth was seen in Europe and the United States, but they are far from catching up to China, where digital payments reached almost $3 trillion (over half of all digital transactions) in 2020; China may become cashless soon.

The Digital Yuan

China is aggressively pushing the use of its “digital yuan,” its CBDC. It has gifted millions in the digital currency to its citizens in order to evaluate the feasibility of going cashless. While the initiative is not a true blockchain innovation, as the digital yuan is controlled by the central government and not decentralized, it demonstrates the increased use of digital infrastructure within the global financial system.

China’s mission is to ensure that any commercialization inherent to a blockchain-driven digital world matches its political makeup. Through its CBDC, China is playing a bit of a shell game: giving digital currency to users while maintaining tight, centralized control. This is not the idea underpinning a decentralized, distributed ledger technology.

However, democracies want transaction transparency, and more of them are demanding that transaction costs be reduced. An open blockchain achieves both objectives because it has the following five traits:

  • Open
  • Permissionless
  • Transparent
  • Provides both finality and immutability of transactions
  • Maximizes on-chain liquidity

These features create more intelligent, compelling solutions.

Continuing Evolution

More businesses will utilize blockchain as it continues to evolve. However, not all blockchain projects are the same; the winners must meet the demands of heavy data and transaction volumes.

Bitcoin presents many solutions. It reduces cost, increases trust, bypasses third parties, and prevents the sources of inflation inherent to centralized fiat currencies.

Tall orders, yes, but Bitcoin successfully delivers, albeit with some limits. It suffers from a roughly seven-transactions-per-second cap. Layer-2 solutions (such as its Lightning Network) add throughput and functionality, and some other layer-1 blockchains solve this as well.

Any successful blockchain project must be cost-efficient, stable, and scalable (what layer-1 Bitcoin lacks). In October 2020, the Italian Banking Association introduced its “Spunta” node network, intending it to be cost-efficient, quick, stable, and scalable.

Spunta integrates most of the country’s banks and processes transactions quickly. This interbank cooperation creates a transparent landscape and standardizes activities across the Italian banking sector.

Moving Forward

The current state of blockchain and crypto feels akin to the mid-1990s internet boom. Blockchain is still not fully understood, and there will be a mix of successful projects (e.g., RTGS) and unsuccessful ones (e.g., the next Pets.com).

However, consumer banking must evolve to keep its customer base given the alternatives already presented by blockchain-based solutions. Central banks will have a similar task of creating digital systems balancing governmental desires with those of their citizens.

Disclaimer: The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees. This information should not be interpreted as an endorsement of cryptocurrency or any specific provider, service, or offering. It is not a recommendation to trade.

On Stablecoins, Banks, and Beyond 

The digital currency space exploded with the advent of Bitcoin, Ethereum, and the cryptocurrencies that followed, which now include stablecoins.

Cryptos have several benefits that proponents like. They are not controlled by a central body, many have low or no inflation, they are transparent, and they provide safe, anonymous transactions. However, there is one issue that remains.

They still have extreme volatility.  

The prices of the two top cryptos, Bitcoin and Ethereum, regularly move by five percent in a single day. Smaller market cap coins tend to swing by much more. 

Proponents compare major cryptos to gold, arguing that as market cap grows, volatility lessens. However, it’s hard to imagine gold’s price fluctuating to the same degree.

There are two avenues by which the global financial system can move toward a future where digital payments are the norm: stablecoins and central bank digital currencies (CBDCs).

Both allow users to transact with others directly, eliminating the need for third-party transaction processing agents. Cryptos and CBDCs are not so dissimilar by design, but they do have some key differences that we will cover.

Central Bank Digital Currencies (CBDCs)

The most important difference between cryptos and CBDCs is that the latter are not cryptos.

A CBDC, like a fiat currency, is produced and regulated by a central bank such as the Federal Reserve or the Bank of England. Rather than printing cash, the central bank issues and “stores” its digital currency.

This storage is not done on a distributed ledger, as with blockchain and crypto, but with a more centralized method. The digital cash would replace fiat cash, and your details would be attached to your CBDC assets, removing that part of the anonymity, although transaction details would only be available to the sender, the receiver, and the bank. With this structure, CBDCs are controlled and monitored by their respective countries.

In its 2022 Global CBDC Index and Stablecoin Overview, PwC found that around 80% of central banks worldwide are at least considering adding a digital version of their national currency.

Such a shift would supply central banks with additional powers, including enhanced tax monitoring. However, their goals are as simple as “providing a digital form of cash” to citizens and “better financial inclusion.”

The Bank for International Settlements (BIS) released a significant collection of reports covering the CBDC plans of 26 developing nations, from Argentina to India. In these, central banks claim that having a CBDC will achieve “greater payment system efficiency” and strengthen competition between payment service providers (PSPs).

The BIS report did acknowledge concerns that countries had put forward, which included cyber risks such as hacks, network resilience, cost, sufficient scalability, bank disintermediation, and the potential for low adoption. It reported that more than half of the banks worried that if:

“Not carefully managed, [cross-border CBDCs] could spur currency substitution, exchange rate volatility, and tax avoidance.”

These are the very things they are trying to prevent by creating a CBDC in the first place.

Stablecoins

Stablecoins are a type of cryptocurrency. However, they differ from Bitcoin and most other cryptos in that their volatility is much lower. 

A stablecoin is specifically designed to combat the volatility seen with a conventional crypto by fixing its value to a particular fiat currency. In most cases, this is the United States dollar. 

However, there are also stablecoins linked to assets, such as gold. If the value of the asset that the stablecoin is linked to remains stable, then the coin will also remain stable. 

Tether (USDT) holds the top spot among stablecoins, and its reserves are mostly cash and cash equivalents. In the past five years, Tether’s wildest swings have stayed between $0.95 and $1.03.

Image courtesy of CoinDesk

There are three types of stablecoins:

  • Fiat-collateralized. These are backed by a fiat currency or a commodity. Top examples include TrueUSD and USD Coin (USDC, issued by Circle), which, like Tether, aim for a value of one dollar per coin. Fiat-collateralized coins use reserves of the currency or asset to back their supply; the reserves are maintained by independent financial institutions acting as custodians and are supposed to be audited regularly. Unfortunately, that doesn’t always happen.
  • Crypto-collateralized. These differ from fiat-collateralized coins in that they are backed with other cryptos. This backing makes them less stable and requires them to hold larger reserves; relying on a single crypto as the reserve is the most volatile option and requires the largest buffer. For every $1,000 in stablecoins, there might be $2,000 in crypto reserves, held in ETH for example.
  • Algorithmic. These use computer programs to maintain their stability. If pegged to USD, an algorithmic coin’s code tracks its market price against the peg and adjusts the number of coins in circulation accordingly, expanding supply when the price rises above the peg and contracting it when the price falls below (see the sketch after this list). While this may sound like a good solution, Federal Reserve Board researchers reported this year that algorithmic stablecoins “may experience instability or design flaws.”
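The supply-adjustment idea behind an algorithmic stablecoin can be sketched in a few lines. The rebase rule and numbers below are purely illustrative; real protocols are considerably more complex.

```python
# A simplified sketch of the "rebase" logic an algorithmic stablecoin might use
# to hold a $1.00 peg. The proportional adjustment rule here is illustrative.

def rebase(total_supply: float, market_price: float, peg: float = 1.00) -> float:
    """Expand supply when price is above the peg, contract it when below."""
    adjustment = (market_price - peg) / peg   # e.g., +5% if trading at $1.05
    return total_supply * (1 + adjustment)    # mint or burn proportionally

supply = 1_000_000.0
print(rebase(supply, market_price=1.05))  # above peg -> supply grows to 1,050,000
print(rebase(supply, market_price=0.97))  # below peg -> supply shrinks to 970,000
```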

Stablecoins provide two specific attributes that make them an important part of the digital payment space: 

  1. They make it easy to transfer value, putting that ability within reach of anyone with an internet connection.
  2. They represent a foundation for programmable money able to run on various blockchain networks. Blockchains such as Ethereum, Polkadot, and others can provide the infrastructure for the creation of smart contracts interacting with stablecoins. 

This functionality has attracted the attention of large financial services companies such as Mastercard, which, in the summer of 2021, said it was piloting a program using Circle’s USDC to allow cryptocurrency payments between cardholders and merchants.

In 2020, Circle had 1.1 billion digital dollars in circulation and over $58.7 billion transferred on-chain. In two years, Circle’s circulation has grown to $50 billion (Tether presently stands at $74 billion). With its planned SPAC listing carrying a $9 billion valuation, Circle and USDC have become a force to be reckoned with.

Consequently, tensions between governments and stablecoins have increased, with U.S. regulators voicing concerns about their threat to financial stability. In February, New Jersey representative Josh Gottheimer released draft legislation that would define stablecoins, and in November, the Biden administration recommended that Congress regulate stablecoins to prevent them from posing a “systemic risk.”

Moving Forward

We have seen that cryptocurrencies and CBDCs are both similar and dissimilar. The general goal is to provide users with a digital payment system that, at a minimum:

  • Achieves high adoption
  • Has low volatility
  • Fosters competition among payment processors
  • Increases monetary efficiency
  • Lowers costs for users
  • Provides public anonymity
  • Can be used with smart contracts
  • Allows for low-cost cross-border payments
  • Allows a country to control its monetary policy

This combination is a tall order but not impossible and is the target of any meaningful stablecoin or CBDC. 

Central banks and governments don’t wish to give up their control. They believe a stable economy requires controlling the money supply: being able to inject money into the economy when needed and to alter interbank lending rates.

The decentralized camp wants to keep governments out of economies entirely, programming cryptos with predefined rates of inflation or other controls over supply.

The most difficult task will be accomplishing cross-border transactions between nations. Countries worry that if their currencies are seamlessly tied to those of other nations, currency substitution could occur. Fortunately, the euro provides ample practical knowledge on the dos and don’ts.

Creating a global payments system that maintains currency stability would enable the world to progress. Put another way, it’s vital to leave behind the disjointed system of economic fiefdoms that has stood for thousands of years.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees. This information should not be interpreted as an endorsement of cryptocurrency or any specific provider, service, or offering. It is not a recommendation to trade. 

Combatting Deforestation With 3D Printing

Environmental pioneers have reason to celebrate the news that 3D printing could disrupt the timber industry and curb deforestation.

A new method for growing material similar to traditional wood could largely eliminate the need for logging while expanding the already broad uses of 3D printing.

Currently, roughly 10 million hectares of forest are lost each year, with industrial forestry a leading driver of global deforestation. The newfound capacity to create custom “timber” in a lab setting therefore reduces waste while allowing forests to remain untouched.

Scientists at MIT demonstrated how this material can be grown from cell cultures and tailored to meet specific property requirements. Just as we have oak, birch, and mahogany, different wood-like materials could be grown on demand.

3D Printing “Wood” to Stop Deforestation

In the first stage, the scientists extracted cells from the leaves of young Zinnia elegans plants. The cells were then grown in a liquid medium for two days before being transferred into a gel.

In this second, gel stage, hormone levels are adjusted to give the cells specific physical and mechanical properties, such as density and stiffness. According to the MIT researchers, the cells then behave similarly to stem cells. 

Using 3D printing, or bioprinting, these plant materials could be grown into shapes, sizes, and forms impossible to achieve through traditional methods. Growing new “wood” in this way avoids deforestation as well as the waste associated with manufacturing and carpentry. 

In other words, manufacturers specify the exact parts needed and their quantities. These materials are then grown to meet specifications such as strength, durability, color, shape, and texture. Again, there is no cutting involved and no processing stage beyond transportation. 

“The idea is that you can grow these plant materials in exactly the shape that you need, so you don’t need to do any subtractive manufacturing after the fact, which reduces the amount of energy and waste. There is a lot of potential to expand this and grow three-dimensional structures,” said Ashley Beckwith, lead author of the research paper published in Materials Today.

Upgrading On-Demand Manufacturing

In this context, a 3D printer deposits the gel solution in the desired form into a petri dish. The cultures then incubate for three months, growing at roughly double the rate of a tree’s natural growth. 

Lower hormone levels within the culture generally result in plant materials with lower density, while higher hormone levels yield denser and stiffer materials. 

The researchers at MIT acknowledge that this is a pioneering study. More research is required to understand how the plant materials can be made more wood-like, and in particular whether cells could be extracted from sources beyond the common zinnia plant, such as the commercially valuable pine. 

Sustainable Finance and Global Equality

Conclusions from the World Economic Forum in Davos, Switzerland, indicate that sustainable finance disproportionately supports high-income countries, thereby increasing global inequality. 

This not only widens the financing gap for meeting the UN’s Sustainable Development Goals (SDGs); it also adds to the current crisis of global inequality. In this sense, “sustainable finance” undermines its own ESG aims, and the irony is profound. 

The push for sustainability and sustainable finance inadvertently amplifies inequalities, as 97% of new sustainable investment funds are concentrated in higher-income countries. Lower-income countries consequently receive fewer resources for their recoveries and development plans. 

This is because policies that avoid non-sustainable sectors (and, by extension, regions or countries) bypass low-income nations still heavily reliant upon carbon-intensive activities. These nations often depend on such activities for lack of alternatives, since developing alternatives requires capital and existing income that is out of reach without outside support. 

How More Action Can Help

A secondary cause is the lack of data needed to demonstrate compliance with sustainability standards. Otherwise viable investment opportunities remain hidden, exacerbating existing biases in investment decision-making and perpetuating the mismatch between what countries need and what sustainable finance packages offer. 

Third, there remains no coherent structure supporting low-income or developing countries. For example, the ESG and sustainable finance communities contend with “more than 200 sustainability initiatives or coalitions of actors.” Both low- and high-income countries must navigate scores of individual requirements and taxonomies, depending on which investors they intend to solicit. 

By placing sole emphasis on rules without considering the limitations inherent to developing nations, a key global mechanism designed to combat inequality, sustainable finance, has instead deepened it. An independent, competent third party must answer the call by transparently and objectively connecting deserving nations to sustainable investment funds.

A Lithium-Ion Battery Now Gets 60% Charge in 5.6 Minutes

A team of China-based researchers, publishing their work in Science Advances, has revamped the standard lithium-ion battery in a way that could transform electric vehicle charging from something akin to a full afternoon siesta into a quick pit stop.

For example, it typically takes around 45 minutes to charge the lithium-ion battery in a Tesla from 40% to 80%, and the bottleneck hampering faster charging lies in the battery’s anode. So, when it comes to electric vehicle news, this is nothing short of extraordinary. 

During discharge, lithium ions move from the anode (negative electrode) to the cathode (positive electrode) through an electrolyte separator. Historically, anodes were first made from coal-based carbon before manufacturers shifted to graphite to improve charge capacity. 

The Problem With Graphite in a Lithium-Ion Battery

Yet as energy demands increase and electric vehicle charging stations become more widespread, graphite fails to keep pace. In addition, the slurry of the graphite anode is typically disorganized and inefficient at passing electrical current. 

Therefore, the researchers built particle-level theoretical models to redesign and optimize the spatial distribution of different-sized particles while also accounting for electrode porosity. Guided by these findings, they coated a standard graphite anode with copper and mixed copper nanowires into the slurry. Heating and then cooling the anode compressed the slurry further, increasing its efficiency. 

By using this copper-infused anode in place of standard, disorderly graphite, they increased the charging rate by roughly 50%. Their control battery reached a 40% charge in 5.6 minutes, whereas the copper-infused battery reached 60% in the same time and 80% in 11.4 minutes. 
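
As a rough check on those figures, the short Python sketch below compares the average charging rates; it assumes the rate is roughly constant over each reported interval, an illustrative simplification rather than anything from the paper itself.

  # Back-of-the-envelope check of the reported charging figures.
  # Assumes an approximately constant charging rate over each reported
  # interval (an illustrative simplification, not the paper's model).

  control_rate = 40 / 5.6   # percent of capacity per minute, control cell
  copper_rate = 60 / 5.6    # percent per minute, copper-infused cell

  improvement = copper_rate / control_rate - 1
  print(f"Control: {control_rate:.1f} %/min")
  print(f"Copper:  {copper_rate:.1f} %/min")
  print(f"Relative improvement: {improvement:.0%}")  # ~50%, matching the article

  # A purely linear estimate would put an 80% charge at about 7.5 minutes;
  # the reported 11.4 minutes reflects charging slowing at higher states of charge.
  print(f"Linear estimate to 80%: {80 / copper_rate:.1f} minutes")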

While the solution seems simple (coating with copper, then heating and compressing), the ramifications are profound. Charging stops could become as quick as today’s gas stops for most urban and semi-urban commuters, further paving the way for mass EV adoption. 

Going Green With Fossil-Free Steel

For centuries, artisans, crafters, and smelters have created steel by combining coal and iron at temperatures surpassing 1,600°C. Because it relies on coal, this process inevitably contributes to carbon dioxide emissions and global warming, and calls for going green now extend to steel production as well. 

For example, McKinsey & Company found that every ton of steel produced in 2018 emitted, on average, 1.85 tons of carbon dioxide, and that steelmaking accounted for about 8 percent of global carbon dioxide emissions that year. Enter: hydrogen. 

What Is Green Steel?

Swedish start-up Hybrit is now answering the call for going green with “green steel.” Instead of using coal, it uses hydrogen to manufacture sponge iron, which has little use on its own but is ultimately processed into steel. 

The demonstration facility for this hydrogen-based process is due to be constructed in Vitåfors, Sweden, by 2026. Currently, Hybrit is researching the best location and design to minimize the plant’s future environmental impact. 

The necessary hydrogen is produced on demand by electrolyzing water and is fed into the reduction shaft. This eliminates coal, its carbon footprint, and its associated transportation. 

Hybrit’s new process produces less than 10 percent of the carbon dioxide emissions of traditional steel production. 
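
Combining the McKinsey figure above with Hybrit’s stated reduction, a minimal sketch of what this could mean per ton of steel follows; the per-ton value for Hybrit’s process is an upper bound derived from those two numbers, not a figure reported here.

  # Rough per-ton comparison derived from the figures quoted above.
  # Hybrit's exact per-ton emissions are not reported in this article; the bound
  # below simply applies "less than 10 percent" to the traditional average.

  traditional = 1.85                        # t CO2 per t of steel (McKinsey, 2018 average)
  hybrit_upper_bound = 0.10 * traditional

  print(f"Traditional steel:    {traditional:.2f} t CO2 per t of steel")
  print(f"Hybrit (upper bound): <{hybrit_upper_bound:.3f} t CO2 per t of steel")
  print(f"Savings per ton:      >{traditional - hybrit_upper_bound:.2f} t CO2")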

The carbon-conscious and electric vehicle worlds are eagerly awaiting further news on Hybrit’s demonstration plant as the company leads the charge into green manufacturing.
