Data and Machines of the Future

As we move toward the future, we increasingly notice two concepts that have always been at odds: data and computing power. The rift runs as follows: we have always had more data than we can process, and the data we have is not always the best data to be processing.

We are now reaching the point where these two issues are starting to blur.

First, we are creating computers with the ability to process the vast amounts of data we are now creating. Second, we are creating synthetic data that may not be “real”; however, if it is “authentic,” users may actually prefer it. Let’s discuss these two topics and how they will interact in the future.

Rise of the Machines

A new class of computers is emerging that stretches the boundaries of the problems we can solve. These new devices, drawn from three defined areas, are pushing aside the limits of Moore’s Law and creating a new computing capability curve.

Companies and industries have always been defined by their limitations, the currently unsolvable problems. However, these new machines may help companies solve and move beyond the presently unsolvable.

These ongoing challenges define the boundaries of companies and their core products, services, and overall strategies at any time. For decades, the financial services industry has operated under the assumption that predicting the movement of the stock market and accurately modeling market risk are either extremely hard or impossible; in the near future, that may no longer be true.

Emerging technologies, when combined, can potentially make these core challenges achievable. With quantum computing as the next level of problem-solving, paired with high-performance computers (HPCs) and massively parallel processing supercomputers, the ability to use never-before-seen swaths of data becomes possible.

As business leaders, we must create partnerships and inroads to understanding the latest technological developments in the computing field and in our industry at large. This creative process includes experimentation and the design of a skills pipeline that will lead to future success.  

New Data Types

With the increases in chatbots, augmented reality (AR), and synthetic data (including deep fake audio, images, and video), we are forced to evaluate what is “real” and what is not. When we see news of the latest global issue, we want to know that it is real, but do we care if the newest advertisement for our favorite snack is? 

We may even prefer the unreal. Say we are discussing a sensitive health issue with a synthetic (i.e. AR) nurse or we are training an AI using synthesized data that is designed to remove historical discrimination–unreal may be the preference. 

As technology progresses, we will shift from a desire for the real to a desire for the authentic, and authenticity is defined by four foundational measures:

1.     Provenance. What is the source of the data?

2.     Policy. How has the data been restricted?

3.     People. Who is responsible for the data?

4.     Purpose. What is the data trying to accomplish?

Synthetic data aims to correct data bias, protect data privacy, and make AI algorithms fairer and more secure. Synthetic content helps design more seamless experiences and provides novel interactions with AI that save time, energy, and cost. However, the use of synthetic data will be complex and controversial.

Data and Computing

High-performance computing is a growing necessity. IDC reported that 64.2 zettabytes (ZB) of data was created or replicated in 2020, a figure expected to nearly triple to 180 ZB by 2025. Only 10.6% of the 2020 data was useful for analysis, and of that, only 44% was actually used.
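A quick back-of-envelope check of these figures (the values are taken directly from the paragraph above, not from an independent IDC dataset) shows just how little of the world’s data is ever analyzed:

```python
# Back-of-envelope check of the figures cited above.
created_2020_zb = 64.2    # zettabytes created or replicated in 2020
useful_share = 0.106      # fraction of 2020 data useful for analysis
analyzed_share = 0.44     # fraction of the useful data actually used

useful_zb = created_2020_zb * useful_share
analyzed_zb = useful_zb * analyzed_share
overall_share = useful_share * analyzed_share

print(f"Useful data:            {useful_zb:.2f} ZB")
print(f"Analyzed data:          {analyzed_zb:.2f} ZB")
print(f"Overall share analyzed: {overall_share:.1%}")
```

Multiplying the two shares, under 5% of all the data created in 2020 was ever analyzed, which is precisely the gap high-performance computing aims to close.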

The answer to this massive issue is high-performance computing (HPC), and while the field of HPC is not new, its potential has expanded. The smartphones of today contain the processing power of supercomputers from three decades ago.

Now, GPUs, ASICs, and other purpose-built processors, such as Tesla’s D1 Dojo chip, built specifically for the computer vision neural networks that will underpin autonomous driving, are pushing HPC capability to new levels.

The Unreal World

Another issue for the future is the unreal. Dealing with a call-center bot that does not understand your request is maddening, but AI is already becoming indispensable in business. It constantly improves, and what was once a “differentiator” for businesses has now become a necessity.

Synthetic data is being used to train AI models in cases where real-world data cannot be used. This “realish” yet unreal data can be shared, protecting confidentiality and privacy while maintaining the statistical properties of the original. Further, synthetic data can counter human biases and increase diversity.
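As a toy illustration of the idea, the sketch below fits a simple model to a “real” dataset and samples fresh synthetic records from it. All the numbers here are invented for the example, and production synthetic-data tools (GAN- or copula-based) capture far richer structure than a single fitted distribution:

```python
import random
import statistics

random.seed(42)

# Toy "real" dataset: imagine sensitive measurements we cannot share.
real = [random.gauss(120, 15) for _ in range(10_000)]

# Minimal synthetic-data generator: estimate the distribution's
# parameters, then sample fresh records from the fitted model.
mu, sigma = statistics.mean(real), statistics.stdev(real)
synthetic = [random.gauss(mu, sigma) for _ in range(10_000)]

# The synthetic set shares the statistical properties of the original
# without containing any actual record from it.
print(f"real:      mean={statistics.mean(real):.1f}, sd={statistics.stdev(real):.1f}")
print(f"synthetic: mean={statistics.mean(synthetic):.1f}, sd={statistics.stdev(synthetic):.1f}")
```

The synthetic records can be shared or used for training while the originals stay private, which is the core privacy argument for synthetic data.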

However, synthetic data comes with significant challenges. The world of deep fakes and disinformation is causing predictable damage, and the use of AI algorithms in social media creates echo chambers, filter bubbles, and algorithmic confounding that can reinforce false narratives. 

New Computing Technologies

Quantum Computing

While HPCs and supercomputers can process ever more data, they are ultimately faster versions of the same underlying approach.

The next generation of computer evolution will likely arrive when quantum computers begin to solve problems we consider intractable. Quantum research is still in its infancy but is likely to follow an exponential curve.

The estimated number of qubits needed to crack current encryption standards is several thousand. The devices being designed by IBM and Google have reached an announced 127 qubits, while others claim 256; this is up from 53 for IBM and 54 for Google in 2019.

A doubling every two years sounds like Moore’s law, but quantum computing follows a different curve. Because of entanglement, adding one more qubit to a quantum system doubles the information the system can compute. The move from 53 to 127 qubits therefore means computing power has doubled 74 times in just a few years.
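The arithmetic behind that claim can be sketched directly, using the qubit counts quoted above:

```python
# Qubit counts cited in this article.
qubits_2019 = 53
qubits_now = 127

# Moore's law: classical capacity doubles roughly every two years.
# Quantum systems: each ADDED qubit doubles the representable state
# space, so capacity grows as 2**n in the number of qubits n.
doublings = qubits_now - qubits_2019
growth_factor = 2 ** doublings

print(f"Added qubits (= doublings): {doublings}")
print(f"State-space growth factor:  2**{doublings} ≈ {growth_factor:.2e}")
```

Seventy-four doublings on the classical curve would have taken nearly a century and a half at Moore’s-law pace, which is why the quantum curve is so striking.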

Mimicking and Using Nature

The other technology reshaping computing takes its lessons from nature. Biology-inspired computing draws its ideas from a 3.7-billion-year-old system and splits into two subclasses:

1.     Biomimicry, or computing systems that draw their inspiration from biological processes.

2.     Biocomputing, or systems that use biological processes to conduct computational functions.  

Biomimicry systems have been used in chip architectures and data science algorithms. However, we are now beginning to see machines that do not merely mimic biological operations but directly leverage biological processes.

Data storage is a biocomputing darling for good reason. One estimate predicts that, using DNA-based storage, an exabyte of data (one million terabytes) could be stored in a single cubic centimeter of space and persist for over 700,000 years.
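Taking that density estimate at face value (one exabyte per cubic centimeter, which is an assumption from the estimate above rather than a measured figure), the arithmetic is striking when combined with the 180 ZB projection cited earlier in this article:

```python
# Rough arithmetic on the DNA-storage density estimate above.
EXABYTES_PER_CM3 = 1.0        # claimed density: 1 EB per cubic centimeter
global_data_zb = 180.0        # projected data created/replicated by 2025
global_data_eb = global_data_zb * 1000   # 1 ZB = 1,000 EB

volume_cm3 = global_data_eb / EXABYTES_PER_CM3
volume_liters = volume_cm3 / 1000        # 1,000 cm^3 per liter

print(f"{global_data_zb} ZB -> {volume_cm3:,.0f} cm^3 ≈ {volume_liters:.0f} liters")
```

By this estimate, everything the world is projected to create in 2025 would fit in roughly 180 liters, about the volume of a bathtub.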

Moving Forward

How do businesses incorporate new forms of data and new ways of computing into practice? 

The first action is to begin evaluating how these technologies will shape the industry and your operations. What problems are considered a cost of doing business, and what would change if they could be solved? How can synthetic data improve your current business functions, and what risks should you watch for? What kinds of machines could affect your business first?

Those who desire to take an active role in shaping the future should consider what hardware can be used to solve the currently unsolvable.  

No matter the industry, forging partnerships is essential. Most businesses can gain skills and capabilities from such partnerships, and many industry problems require collaboration at scale.

The alliances and partnerships formed today will define the industry leaders of tomorrow.

Closing Thoughts

We have always been defined by our unanswerable questions, and the advent of computers has helped us solve grand challenges. We are also facing a synthetic, unreal world intended to improve our lives; yet, depending on the user’s intent, such data and its progeny can be tools of malice.

Both of these developments have reached the point where business leaders can no longer treat them as abstract. They are rapidly improving, and their impacts on industries will be profound in the coming decade. The unsolvable will be a thing of the past, and what we believe is real will come into question.

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

The Web3 Race

The race to create the future internet, Web3, is heating up daily. Providers are competing against each other, a sign of the healthy, expanding, and decentralized Web3 ecosystem to come.

The Basics

Users interact through various open-source applications such as MetaMask, Web3 gaming, the metaverse, and DeFi protocols. However, they don’t usually stop and think about what happens behind the Web3 scenes, or what is piecing the blockchain-based web together. If we imagine Web3 as a burgeoning new metropolis, it’s the providers of the underlying infrastructure and power grid that make all these operations possible.  

Every Dapp relies on communication with one or more blockchains. Each day, full nodes serve billions of requests from Dapps to read and write data on a blockchain. A massive node infrastructure is required to keep up with the ever-expanding Dapp ecosystem.

However, running nodes is both time- and capital-intensive, so Dapp builders must turn to external providers for remote access to nodes. This requirement creates strong monetary incentives for infrastructure providers to serve as much of the Web3 ecosystem as possible. But who is winning this race?

The Problem of Centralization

The most expeditious way to provide reliable infrastructure that can power Dapp ecosystems is for centralized companies to set up a web of blockchain nodes, commonly hosted in data centers such as those of Amazon Web Services (AWS), and let developers access them from anywhere for a subscription.

This is precisely what a few players in the Web3 space did, but it resulted in centralization, which runs against the ideals of a self-described decentralized space.

Centralization is a significant issue for the Web3 economy because it means, first, that the ecosystem becomes susceptible to 51% attacks and, second, that it sits at the mercy of a few powerful players.

Consider that 81% of Ethereum beacon chain nodes are located in the United States and Europe, and that if the three largest mining pools were to collude, they could conduct a 51% attack on the Ethereum network. Today’s blockchains are less distributed and more centralized than we think, or would like, them to be. This structure starkly contrasts with the vision expounded in Satoshi Nakamoto’s Bitcoin white paper.
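The 51% threshold itself is simple arithmetic. The sketch below uses hypothetical pool shares (invented for illustration; the real distribution shifts constantly and is not given in this article) to show how quickly a handful of pools can cross it:

```python
# Hypothetical shares of total network hash power held by mining pools.
# These numbers are invented for illustration; real shares change daily.
shares = [0.26, 0.18, 0.09, 0.07, 0.05, 0.04]

# Sum the three largest shares and compare against the 51% threshold.
top3 = sum(sorted(shares, reverse=True)[:3])
print(f"Top-3 combined share: {top3:.0%}")
print("Collusion could mount a 51% attack:", top3 > 0.51)
```

With shares like these, no single pool is a threat, yet three colluding pools comfortably clear the majority threshold, which is exactly the concentration risk the paragraph above describes.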

If the large node providers were to collude, the advantages Web3 holds over Web2 would be lost. What’s more, user reliance on centralized providers increases the chances of system outages. The 2020 Ethereum outage caused by Infura, one of the larger node providers, shows the problems of a centralized system: several crypto services, including MetaMask, Coinbase, and Binance, had to suspend withdrawals of Ether and ERC-20 tokens because they could not fully rely on the Infura nodes.

It must be noted that Amazon is often the backbone of these centralized providers. It has suffered several outages of its own, creating a second, severe layer of vulnerability.

The Infura outage was not the only one. The Ethereum network’s move to Ethereum 2.0, “The Merge,” was interrupted by a seven-hour outage resulting from a hardware failure on a single node. A genuinely decentralized network would not face these worries.

Solana’s Problem

Decentralization remains a crucial tenet of Web3 and its economy, and a centralized blockchain infrastructure is a threat capable of undermining it. The Solana blockchain has suffered multiple outages, all due to a lack of decentralized nodes: the network could not handle spikes in traffic.

Solana’s problem is common among blockchains trying to scale their operations and throughput. Many of the top decentralized blockchain protocols continue to struggle to scale while remaining decentralized. The largest blockchains, Bitcoin and Ethereum, remain steadfast in their commitment to decentralization, but Ethereum is still vulnerable.

In the early days of blockchain, on June 8, 2013, Feathercoin (FTC) was the victim of a 51% attack: one entity controlled over half of the FTC network’s total processing power. This allowed the attacker to reverse confirmed transactions on the chain and even prevent new transactions from going through. FTC has since fallen into obscurity, its price plummeting alongside delisting from all major exchanges.

The ongoing centralization stems from an overreliance on Web2 cloud providers such as AWS and Infura, which have carried their infrastructure roles over into Web3 and its economy. However, the push to avoid centralization and blockchain’s problematic “single point of failure” is gaining significant steam. This is good news for Web3 ecosystems that wish to remain healthy, secure, and decentralized.

Better Solutions With Decentralized Infrastructure

Novel innovations have given rise to a new breed of decentralized provider. These node providers run their services on-premises, or even in users’ homes, instead of relying on centralized cloud providers.

The Architecture of Web2 vs. Web3 (figure courtesy of Coinbase Blog)

The key advantage of decentralized nodes is that they cannot be taken down through a single point of failure, and they can provide faster connections for users around the globe. Additionally, decentralized node infrastructure creates new economies in which independent providers serve requests for data and earn rewards in native tokens.

Increasing Competition

Several providers in the decentralized Web3 space, such as Flux, Ankr, and QuickNode, compete for market share. This competition ensures that providers are consistently motivated to improve their services and provide the best possible user experience for their customers. Such a competitive environment is good for the Web3 economy because it leads to innovation while also lowering prices.  

Investors are also seeing strong returns acting as pooled node providers. Yieldnodes, which sets up decentralized nodes for investors, has reportedly paid around 10% returns a month for over two years, with a high of over 19% in February 2021.
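For context on what such numbers imply, compounding a steady 10% monthly return (the figure quoted above; this is arithmetic only, not a claim that such returns are verified or sustainable) gives:

```python
# Compounding arithmetic on the monthly return figure quoted above.
monthly_rate = 0.10
months = 24

growth = (1 + monthly_rate) ** months          # multiple of initial capital
annualized = (1 + monthly_rate) ** 12 - 1      # equivalent annual return

print(f"Growth over {months} months: {growth:.1f}x initial capital")
print(f"Equivalent annual return:    {annualized:.0%}")
```

Compounded, 10% a month is well over 200% a year, a rate far beyond conventional investments, which is why readers should treat such figures with due diligence.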

More important still, competition for blockchain infrastructure results in a more decentralized Web3 economy. The more decentralized the network, the more resilient it is to censorship, and the closer 51% attacks come to being an issue of the past.

Closing Thoughts

The idea behind Web3 is not just to create a better internet but a better world. Decentralized infrastructure providers are building an internet foundation that is more equitable, censor-resistant, and secure. 

By shaking things up, they are supplanting the status quo of giant, centralized hosting providers, carryovers from Web2 that make blockchains more susceptible to attacks, outages, and censorship. The new decentralized providers are on the cutting edge and have an incentive to push innovation, providing their users with both the best possible service and the greatest level of integrity.

Disclaimer:  The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees. This information should not be interpreted as an endorsement of cryptocurrency or any specific provider, service, or offering. It is not a recommendation to trade. 

NFTs and Deep Learning

Non-fungible tokens (NFTs) are becoming more popular by the day. According to DappRadar, the trade volume of NFTs in 2021 was $24.9 billion, up from roughly $95 million in 2020.

One of the most significant developments in the cryptocurrency ecosystem is the rise of non-fungible tokens. The initial generation of NFTs concentrated on developing the fundamental components of the NFT market’s infrastructure: ownership representation, transfer, and automation.

Even the most basic kinds of NFTs capture great value, but industry hype makes it difficult to tell signal from noise. As the market develops, the value of NFTs should shift from static images or text to more intelligent and dynamic collectables. The upcoming wave of NFTs will be heavily shaped by artificial intelligence (AI).

NFTs and AI

To understand how intelligent NFTs can be created, we need to know which AI disciplines intersect with the current generation of NFTs. NFTs are represented virtually using digital media, including photos, videos, text, and audio, and these representations map remarkably well onto several AI sub-disciplines.

The “deep learning” branch of AI uses deep neural networks to generalize information from datasets. The concepts underpinning deep learning have been known since the 1970s, but in the last ten years they have experienced a new boom thanks to platforms and frameworks that have accelerated their widespread use. Deep learning can significantly impact a few critical areas, enabling the intelligence of NFTs.

Computer Vision

NFTs today are mostly pictures and videos, making them ideal platforms for the latest developments in computer vision. Convolutional neural networks (CNNs), generative adversarial networks (GANs), and transformers are among the approaches that have advanced computer vision in recent years.

The next wave of NFT technologies can use image generation, object identification, and scene understanding, among other computer vision techniques. Integrating computer vision with NFTs seems an obvious fit in the field of generative art.

James Allison, a Nobel Prize-winning cancer researcher, was the subject of an NFT that the University of California, Berkeley auctioned off on June 8 for more than US$50,000. Designers scanned faxes, handwritten notes, and legal documents related to Allison’s key findings filed with the university. Anyone may view this piece of art, titled The Fourth Pillar, online; the NFT was created to prove ownership.

Natural Language Processing

Language is the primary means through which cognition, including notions of ownership, is expressed. Over the past ten years, some of the most significant advances in deep learning have been in natural language understanding (NLU).

In NLU, transformer-powered models such as GPT-3 have achieved new milestones. New versions of NFTs could benefit from research in fields like sentiment analysis, question answering, and summarization. Adding language comprehension to NFTs in their current forms seems a straightforward way to improve their usability and engagement.

For instance, Eponym, a program that enables the translation of words into art and the direct development of NFTs, was recently released by Art AI.

Voice Recognition

Speech intelligence is the third branch of deep learning that can immediately affect NFTs. The field has recently evolved thanks to techniques like CNNs and recurrent neural networks (RNNs). Attractive NFT designs may be powered by features like speech recognition or tone analysis, and audio NFTs appear to be the ideal application for speech intelligence techniques.

Voice AI matters for NFTs because it enables people to interact with their digital collectables naturally; for instance, it may be used to query an NFT or issue commands to it. NFTs that become more dynamic and engaging in this way will be all the more valuable in the future. Platforms such as Enjin allow users to create music-industry NFTs, which could be game-changing.

Improvements in language, vision, and voice intelligence increase the potential of NFTs. The value unleashed where AI and NFTs converge will influence several aspects of the NFT ecosystem, and three essential categories in the current NFT environment may be immediately reinvented by introducing AI capabilities.

Using AI to Generate NFTs

This aspect of the NFT ecosystem stands to gain the most from recent developments in AI. By applying deep learning techniques in computer vision, language, and voice, the experience for NFT creators may be enhanced to heights we haven’t seen before. We can already see this tendency in generative art, though such projects remain limited in the AI techniques they employ and the use cases they address.

We should soon see the usefulness of AI-generated NFTs spread beyond generative art into other general NFT utility categories.

Digital artists like Refik Anadol, who are experimenting with cutting-edge deep learning techniques to develop NFTs, illustrate this value proposition. To produce astounding graphics, Anadol’s studio trained models on hundreds of millions of photos and audio snippets using techniques like GANs and quantum computing.

Natively Embedding AI

Even if we can create NFTs using AI, they won’t necessarily be intelligent themselves. But imagine if they were. Another commercial opportunity presented by the convergence of these two technologies is the native integration of AI capabilities into NFTs. Imagine NFTs with language and speech skills that can interact with a specific environment, converse with people, or answer questions about their meaning. Platforms like Alethea AI and Fetch.ai are beginning to make headway here.

NFT Infrastructures With AI

Building blocks like NFT marketplaces, oracles, and NFT data platforms that incorporate AI capabilities can lay the groundwork for gradually enabling intelligence across the whole NFT ecosystem. Consider NFT marketplaces that use computer vision to give consumers intelligent suggestions, or NFT data APIs and oracles that derive intelligent signals from on-chain statistics. The market for NFTs will increasingly depend on data and intelligence APIs.

Closing Thoughts

AI is reshaping nearly every industry. By combining with AI, NFTs can go from simple, rudimentary forms of ownership to intelligent, self-evolving versions that allow for richer digital experiences and much greater forms of value for NFT creators and users. 

Smart NFT technology does not require any far-fetched technological innovation. The flexibility of NFT technologies, combined with recent developments in computer vision, natural language comprehension, and voice analysis, already provides an excellent environment for launching new innovations in the ever-growing digital asset space.

NFT and Crypto in GameFi

The GameFi sector has been one of the main contributors to the explosive growth of the cryptocurrency market over the past few years.

Gamers can earn incentives while playing thanks to GameFi, a combination of the words “gaming” and “finance.”

With consistent growth, the market has a token market cap of almost $9.2 billion. Notably, despite the crypto winters, GameFi networks have endured and thrived. 

By 2031, the sector is expected to be worth $74.2 billion.

What Is GameFi?

GameFi combines blockchain technology, non-fungible tokens (NFTs), and game mechanics to build virtual worlds where users can interact and earn tokens.

Video games used to be stored on centralized servers, giving publishers and creators complete control over everything in their games. This meant that none of the digital objects players had gathered over many hours, or even years, of gaming belonged to them.

These objects, ranging from avatars and virtual territories to weapons and clothing (sometimes referred to as “skins”), had little use outside the game. There was therefore no practical mechanism for players to be compensated for their online time, or to access the value of their in-game goods, without pursuing a professional gaming career.

In GameFi games, players earn rewards by completing objectives and progressing through stages. Unlike conventional in-game money and equipment, these prizes have monetary worth outside the game.

The industry has been dubbed “play-to-earn” because gaming rewards are given in the form of NFTs, “accomplishment tokens” that may be exchanged on NFT marketplaces or cryptocurrency exchanges.

Even though “play-to-earn” is the preferred terminology, participating in GameFi involves risk, including potentially significant upfront fees that a player might lose.

How GameFi Works

Almost all blockchain-based games have an in-game currency, marketplace, and token economy. In contrast to conventional games, there is no centralized authority; instead, the community generally manages and governs GameFi projects, and users participate in decision-making.

Although each GameFi project has its own unique mechanics and economics, they have similar characteristics:

  • Blockchain Technology: The distributed ledger of a blockchain powers GameFi initiatives, maintaining player ownership records and guaranteeing the openness of all transactions.
  • Play-to-Earn (P2E) Model: In contrast to traditional gaming, where players play to win, GameFi initiatives employ a P2E business model. By providing incentives with measurable value outside the game, typically NFTs or in-game money, these games encourage players to play more.
  • Asset Ownership: In conventional gaming, in-game purchases are locked investments trapped inside a particular game. Via P2E, players own their tokenized in-game assets and, in most cases, can trade them for cryptocurrencies and, ultimately, money. Assets are tokenized on the blockchain and might include anything from a suit of armor to a piece of virtual real estate.
  • DeFi Features: Many GameFi initiatives also incorporate decentralized finance (DeFi) solutions, such as yield farming, liquidity mining, and staking, giving participants more ways to grow their token holdings.
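The ownership model in the list above can be sketched as a deliberately simplified, in-memory ledger. There is no real blockchain here; the class and names are invented for illustration, but the core rule, that only the current owner can transfer an asset, is exactly what the chain enforces cryptographically:

```python
# A toy, in-memory sketch of tokenized asset ownership (illustrative
# only; a real chain replaces these checks with cryptographic signatures).
class ToyAssetLedger:
    def __init__(self):
        self.owners = {}  # asset_id -> current owner

    def mint(self, asset_id, owner):
        assert asset_id not in self.owners, "asset already exists"
        self.owners[asset_id] = owner

    def transfer(self, asset_id, seller, buyer):
        # Only the current owner can sell: the guarantee that makes
        # in-game assets genuinely player-owned and tradable.
        assert self.owners.get(asset_id) == seller, "seller does not own asset"
        self.owners[asset_id] = buyer

ledger = ToyAssetLedger()
ledger.mint("sword#1", "alice")
ledger.transfer("sword#1", "alice", "bob")   # alice sells her sword
print(ledger.owners["sword#1"])
```

Because ownership lives on the shared ledger rather than on a publisher’s server, the asset outlives any single game and can be traded on external markets.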

Decentraland, The Sandbox, Axie Infinity, and Gala are examples of well-known blockchain gaming networks that use the P2E GameFi architecture.

Axie Infinity

Consider the Ethereum-based game Axie Infinity, which gained popularity in 2021 and became the most-Googled NFT worldwide in March 2022. Players in Axie Infinity collect, breed, train, and battle creatures called “Axies.” Unlike most in-game products, each Axie may be exchanged on the game’s marketplace for real money (to give you an idea, the most expensive Axie ever sold went for US$820,000).

The game has two native cryptocurrencies: Axie Infinity Shards (AXS), which can be bought and sold on exchanges like Crypto.com, and Smooth Love Potion (SLP), which users earn by playing. AXS also serves as a governance token, letting holders vote on how the gaming experience will evolve.

That said, games like Axie Infinity can have a significant barrier to entry: players must purchase three pet characters before they can start. A typical team setup used to cost roughly $300, although prices have recently decreased by approximately one-third.

Despite the price decline, this initial cost remains a significant barrier for many, especially given that the vast majority of blockchain gaming players come from developing nations. This barrier has given rise to gaming guilds, which allow NFT owners to lend out in-game assets in exchange for a percentage of the earnings those assets generate.

GameFi Is Boosting Growth

Because GameFi initiatives use cryptocurrencies to settle transactions, the use of digital currencies has grown significantly in recent years.

The number of Unique Active Wallets (UAW) connected to the blockchain gaming industry increased significantly in the third quarter of 2021, according to a report recently released by DappRadar, a platform that monitors activities on decentralized applications (DApps). 

These wallets made up roughly 49% of the 1.54 million daily UAWs registered during that period. The data supports GameFi’s sector-disruptive potential and points to the rising use of cryptocurrencies, which in turn encourages broader adoption.

Another study on the subject was recently published by Chainplay, an NFT game aggregation platform. It showed that 75% of GameFi investors entered the cryptocurrency markets as a result of their engagement with GameFi, demonstrating GameFi’s expanding influence on crypto adoption.

In addition to an expanding crypto universe, and growing retail crypto exchanges, GameFi has significantly contributed to the NFT market expansion. NFTs are used more often on the blockchain since GameFi largely relies on them for in-game assets. The growth of the GameFi market in 2021 closely mirrored the NFT boom.

Sales of GameFi’s NFTs increased from $82 million in 2020 to $5.17 billion in 2021. 
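As arithmetic, that jump is roughly a 63-fold increase in a single year. A quick check using only the figures quoted above:

```python
# GameFi NFT sales figures quoted in the text (USD).
sales_2020 = 82_000_000
sales_2021 = 5_170_000_000

# Year-over-year growth multiple.
multiple = sales_2021 / sales_2020
print(f"{multiple:.1f}x")  # roughly 63x
```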

Closing Thoughts

Because GameFi is a part of the cryptocurrency sector, it is also impacted by its many ups and downs. As a result, activity in the GameFi sector increases during uptrends while it declines during downtrends.

GameFi platform developers must work hard to create captivating games and help crypto ecosystems withstand market declines if they hope to keep users onboard. 

Although the endeavor is easier said than done, GameFi investors are now focusing on enhancing gaming experiences with the clear objective of sustainability.

There are many obstacles for developers to overcome, but if they can draw gamers in with excellent gameplay, the future of blockchain-based gaming is more than promising.

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial instructions. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment. Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business. Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.

The Future of NFTs in Web3 and Web4

The market for non-fungible tokens, or NFTs, was valued at $232 million in 2020, increasing to $22 billion in 2021. Because of its growing popularity in collectible trading and the developing significance of Decentralized Finance (DeFi), this market is anticipated to expand by around three times by 2031 as Web3 comes into being.
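A tripling over the coming decade implies a fairly modest compound annual growth rate. A back-of-envelope check, assuming the threefold expansion plays out over the ten years from 2021 to 2031:

```python
# Implied compound annual growth rate (CAGR) if the NFT market
# triples between 2021 and 2031, a 10-year span.
growth_multiple = 3
years = 10
cagr = growth_multiple ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 11.6% per year
```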

This article delves into why NFTs are an essential part of the future and how they fit into the evolution of the internet, known as Web3 and, eventually, Web4. 

What Are NFTs?

NFTs are digital tokens that operate on a blockchain, much like cryptocurrencies. They differ from ordinary crypto tokens, however, because each one is unique. This means they can confer “uniqueness” on the assets to which they are connected. Digital art has proven to be the most common initial use for NFTs, with pieces made by artists like Grimes and Beeple being well-liked by online collectors, frequently fetching high prices.
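The distinction between fungible and non-fungible tokens can be sketched in a few lines of Python. This is a toy illustration, not a real token contract; the ledger names and the `transfer_nft` helper are hypothetical:

```python
# Minimal sketch contrasting fungible and non-fungible ledgers.
# All names are illustrative; real tokens live in smart contracts.

# Fungible: only balances matter, and any unit is interchangeable.
balances = {"alice": 100, "bob": 50}

# Non-fungible: each token ID is unique, maps to exactly one owner,
# and can carry its own metadata (what makes the asset bespoke).
nft_owners = {1: "alice", 2: "bob"}
nft_metadata = {1: {"artist": "Beeple"}, 2: {"artist": "Grimes"}}

def transfer_nft(token_id, sender, recipient):
    """Move a unique token between owners, checking ownership first."""
    if nft_owners.get(token_id) != sender:
        raise PermissionError("sender does not own this token")
    nft_owners[token_id] = recipient

transfer_nft(1, "alice", "bob")
print(nft_owners[1])  # bob
```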

NFTs are most frequently stored on the Ethereum blockchain. However, they may also be kept on other blockchains, including Polygon and Binance.

What Is Web3?

The phrase Web3 represents the notion of an innovative, improved internet. In essence, Web3 leverages blockchains, cryptocurrencies, and NFTs to return ownership and authority to the consumers. As put by a tweet from 2020: Web1 was read-only, Web2 is read-write, and Web3 will be read-write-own.

Some of Web3’s core virtues are:

  • Decentralized. Ownership of the internet is distributed among its builders and users, rather than with a controlled entity. 
  • Inclusive. Everyone has equal access to participate.
  • Merit-driven. Web3 uses economic mechanisms and incentives rather than relying on third parties.

Let’s give a contextual example of how Web3 works. 

Web3 offers you control of your digital assets. Take the scenario of playing a Web2 game: an in-game item you buy is linked directly to your account. If the game’s developers terminate your account, or if you quit the game, you lose it.

Direct ownership is possible with Web3, thanks to NFTs. Nobody, not even the game designers, has the authority to revoke your ownership. Additionally, if you decide to stop playing, you may sell or trade your in-game possessions on open marketplaces to recoup their worth.

What Is Web4?

The semantic web, where computers instead of people will generate new information, is commonly seen as the culmination of Web3. Web4, in turn, will be the Internet of Things (IoT): a web of intelligent links.

The core features of Web4 include: 

  • A hazy, blurred gap between man and machine
  • Information transmitted from every part of the world
  • Artificial intelligence capable of human-like communication
  • Complete transparency and traceability
  • Incredible speed and resilience

Everything around us is changing because of Web4, including the economy, logistics, and even medicine. The customer would have complete control over the internet and unbridled access to their activities and data. Web4 expands the potential of any internet-related sphere of activity by enabling a connection between man and machine.

NFTs in Web3 and Web4

Web4 will be synonymous with the digital economy and all digital assets. The impending metaverse will bridge the digital divide, making today’s “tokenomics” look like a mere prelude.

In this context, NFTs will prove key to online communities, events, exchangeable assets, digital identities, and more. They will also greatly add to the rising popularity of retail cryptocurrency trading, which has already produced its fair share of mavericks and winners. In essence, NFT technology protects the integrity of the growing digital asset space. 

NFTs have been utilized to grant exclusive access to offline events in addition to online groups and events. The permanent evidence of ownership provided by NFTs on the blockchain makes the technology well suited to address significant problems in the realm of event tickets, such as forging and digital theft.

NFTs have been integrated into blockchain games like DeFi Kingdoms, Axie Infinity, and Crabada, resulting in the development of thriving in-game economies where NFTs are valued according to their characteristics and statistics. In these games, playing more is highly rewarded since leveling up NFT assets increases profits and boosts the likelihood that uncommon and expensive item drops will occur. 

Concerned that someone could take your metaverse username? Through the Ethereum Name Service (ENS), NFTs have already made it possible for users to possess unique “.eth” Ethereum wallet addresses. The network is attracting a huge number of new addresses each day.

These unique addresses, which are NFTs, are linked to other decentralized services and make complicated wallet addresses more individualized and much simpler to remember.
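The idea behind ENS-style resolution can be sketched as a simple registry lookup with a reverse index. This is an illustrative toy, not the actual ENS protocol, whose records live in Ethereum smart contracts; the name and address below are made up:

```python
# Toy sketch of name resolution in the style of ENS: a registry
# maps human-readable names to wallet addresses, plus a reverse
# index so an address can be displayed as its friendly name.
registry = {
    "alice.eth": "0x1234abcd...",  # truncated illustrative address
}
reverse = {addr: name for name, addr in registry.items()}

def resolve(name):
    """Return the wallet address for a registered name, else None."""
    return registry.get(name)

print(resolve("alice.eth"))
print(reverse[registry["alice.eth"]])  # alice.eth
```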

Closing Thoughts

Usernames and wallet addresses are no longer the primary means of identifying assets in the metaverse–non-fungible tokens have taken their place. The Sandbox’s metaverse project already uses NFTs to represent virtual locations, furnishings, and other objects.

The Sandbox generated more than $24 million in revenue in March 2022 from selling NFTs representing real estate in the metaverse. Leading companies and well-known individuals from various industries, including Atari, Snoop Dogg, and the South China Morning Post, all own land in the metaverse.

NFTs are building the groundwork for digital communities, tradeable in-game items, and the greater metaverse economy while also revolutionizing the ownership and exchange of digital assets.

Ethereum Is Not Decentralized

With “the Merge” to Ethereum 2.0, the world’s second-largest blockchain network by market cap has been the talk of the town. We have heard and read some postulating that Ethereum (ETH) can succeed even if it does not scale correctly.

Ethereum is more than its potential scalability. It stores billions, and eventually trillions, in value without needing a central monetary authority or bank. However, the necessary keys are increased liquidity and decreased volatility.

Bitcoin (BTC) has followed a similar path to Ethereum, with many saying they wanted the BTC network to scale as a priority. They believe Bitcoin must succeed as a rival to Visa or fail, arguing that it must be a medium of exchange rather than only a “store of value” like gold.

This requirement puts the decentralized network in jeopardy, prone to censorship and government capture. Fortunately, even with the high cost of mining hardware, small block miners have been successful. At the same time, network scaling is happening through “layer-2” solutions (outside the base blockchain layer) and sidechains like Liquid and the Lightning Network. Bitcoin’s base layer can thus keep its role as a store of value while building its own exchange network.

Ethereum is designed to be the world’s computer, with unstoppable code that can run Dapps cheaply and trustlessly. But Ethereum has had to implement a major fix since the DAO hack, forgoing some decentralization, and scalability was also jeopardized when the Dapp CryptoKitties broke the chain’s usability. We hope that Ethereum’s move to proof of stake will solve the throughput issues and lower the gas (transfer) fees for base-layer transactions, which can be excessive.

Source: YCharts

Since Ethereum’s Merge, the supply of new ETH has been slowing. According to data from Ultra Sound Money, the Ethereum issuance rate has fallen by 98%, though it has not become deflationary. At the end of September 2022, issuance sits at just 0.09% annualized growth, with a total of 14,042,583 ETH currently in Ethereum staking contracts, worth $18.7 billion.
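A quick sanity check on those staking figures, using only the numbers quoted above:

```python
# Back-of-envelope check: the staked-ETH count and its dollar
# value quoted in the text imply a price per ETH.
staked_eth = 14_042_583
staked_usd = 18.7e9

implied_price = staked_usd / staked_eth
print(f"${implied_price:,.0f} per ETH")  # roughly $1,330
```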

Ethereum Is Not Decentralized

Glassnode data shows that 85% of Ethereum’s total supply is held by entities that have 100 ETH or more, and 30% of the supply is in the hands of (wallets of) those with over 100,000 ETH.

Source: Glassnode
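The kind of concentration Glassnode reports can be computed directly from a list of wallet balances. The balances below are toy numbers chosen to echo the 85% figure, not real chain data:

```python
# Sketch of the concentration measure described: what share of
# total supply sits in wallets at or above a balance threshold?
def share_held_above(balances, threshold):
    total = sum(balances)
    large = sum(b for b in balances if b >= threshold)
    return large / total

# Hypothetical wallet balances in ETH, for illustration only.
wallets = [850, 60, 40, 30, 20]
print(f"{share_held_above(wallets, 100):.0%}")  # 85% with these toy numbers
```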

Ethereum’s centralization issue is even more apparent with the shift to proof of stake. Being a “staker” does not require the same hardware as proof of work, but a validator needs to have 32 ETH staked to participate, a sum that most cannot afford. Ethereum’s “Beacon Chain” validators illustrate how the PoS system will look. 

Most Beacon Chain validators are large entities, large exchanges, and newly founded staking providers with significant ETH holdings. A large portion of validators are legal entities registered in either the US or the EU, subject to those jurisdictions’ regulations.  

These centralized holdings mean that just under 69% of the total amount of ETH staked on the Beacon Chain is held by a mere 11 providers, with 60% staked by only four providers. A single provider, Lido, makes up 31% of the staked supply.  

Source: TheEylon

When there is a bull market like we saw until the first quarter of 2022, this amount of centralization generally goes unnoticed, but as the tide turns, uncertainty reveals such flaws.  

A proof-of-stake attack requires control of two-thirds of the stake. With just under 68% of the stake in their hands, the top 11 stakers could succeed in such a play if they were to collude.

Source: Glassnode
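Finality in proof of stake generally requires a two-thirds supermajority of stake, so the collusion check can be sketched directly. The provider shares below are hypothetical, patterned on the percentages quoted above (Lido at 31%, four providers at 60%, eleven at roughly 68%):

```python
# Do the top N staking providers jointly cross the 2/3 of stake
# needed to control proof-of-stake finality?
ATTACK_THRESHOLD = 2 / 3

# Hypothetical stake shares for the top 11 providers, largest first.
provider_shares = [0.31, 0.15, 0.08, 0.06, 0.03, 0.02, 0.01,
                   0.01, 0.005, 0.005, 0.005]

def can_collude(shares, n):
    """True if the n largest providers together meet the threshold."""
    return sum(sorted(shares, reverse=True)[:n]) >= ATTACK_THRESHOLD

print(can_collude(provider_shares, 11))  # True with these shares
print(can_collude(provider_shares, 4))   # False: ~60% < 2/3
```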

No Really, DeFi Is Centralized Too

Decentralized finance was never really decentralized in the first place. The truth is that Ethereum is not securing as much decentralized wealth as we think. Much of Ethereum’s value sits in yield farming that produced high annualized yields, and in other Ethereum-based DeFi that worked until 2022’s crash. High yields had kept major players from looking behind the curtain and finding the flaws.

There was massive growth in Ethereum’s “Total Value Locked” (TVL), but the crypto is not actually locked; it is temporarily deposited to capture those remarkably high yields. Hence TVL dropped from $110B to $31B in only a year.

Source: Defillama

Lack of Users

Likely fewer than 500,000 people have interacted with DeFi or with Ethereum. The user numbers reported by YCharts, DappRadar, DefiPulse, Etherscan, and Nansen are all underwhelming.

Source: YCharts
Source: DappRadar

The most valuable Ethereum-based DeFi protocols have only small numbers of active users, partly because fees are high enough to drive new users away from Ethereum. The highest-valued DeFi protocols count their users in the thousands (OpenSea has only 26K daily users; see above). The reason $31 billion is “locked” is the incentive of high-“APY” (yield) liquidity mining.

When users are being paid to borrow money from a DeFi protocol, you cannot consider any one user the same as a long-term holder. They can disappear quickly. 

The Solution

The solution for a centralized system is quite simple–make it more decentralized. Unfortunately, those currently working on Ethereum are rebuilding everything that is wrong with Wall Street and putting it on a blockchain. 

The deep-pocketed backers and in-the-know developers are pushing for centralization because they want the largest share of the pie. As functionally limited as Bitcoin is, it remains the most decentralized cryptocurrency, helped by the prevalence of Bitcoin’s fractional shares. However, Ethereum has more potential and can still succeed.

Staking

Staking was intended to remove the hardware requirements that kept the small player from participating, yet the 32 ETH stake required for PoS participation is itself limiting: in October 2022, 32 ETH is about $44,000.

If there are enough decentralized staking pools, then much of this issue can be resolved. By allowing many to invest through aggregating pools, the small players can finally join in. 
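The aggregation idea can be sketched as a minimal pool that activates a validator each time 32 ETH accumulates. This is an illustrative toy, not any real pool’s design; real pools also handle rewards, withdrawal queues, and slashing:

```python
# Sketch of a staking pool: small deposits are aggregated, and a
# new validator is spun up each time the pool fills a 32 ETH lot.
VALIDATOR_STAKE = 32

class StakingPool:
    def __init__(self):
        self.pending = 0.0    # ETH waiting to fill the next validator
        self.validators = 0   # validators activated so far
        self.shares = {}      # depositor -> ETH contributed

    def deposit(self, who, amount):
        self.shares[who] = self.shares.get(who, 0) + amount
        self.pending += amount
        # Activate validators as whole 32 ETH lots become available.
        while self.pending >= VALIDATOR_STAKE:
            self.pending -= VALIDATOR_STAKE
            self.validators += 1

pool = StakingPool()
for depositor, amt in [("a", 10), ("b", 10), ("c", 15), ("d", 5)]:
    pool.deposit(depositor, amt)

print(pool.validators, pool.pending)  # 1 validator, 8 ETH pending
```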

Layer-2 and Beyond

If the number of nodes (validators) is high enough to lower gas fees and increase ETH’s throughput, then Ethereum may have a successful decentralized future. If other approaches built on layer-2 solutions can make transactions more affordable still, enabling micropayments as µRaiden aims to do, then Ethereum can be truly decentralized. This may draw many more retail, everyday users to cryptocurrency through the many American and European crypto exchanges.

By combining affordable layer-2 solutions with a broad and decentralized PoS system, Ethereum can shed much of its centralization.

Volatility and the Catch-22

The use of ETH is the key, but volatility is the restriction. As long as the kinds of price swings we have seen in the crypto markets (10% or more in a day) persist, all crypto use will be limited. The problem is circular: while volatility is high, acceptance will not be widespread, and while acceptance is not widespread, volatility will remain high.
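The 10%-a-day swings referred to here are easy to flag programmatically. The price series below is hypothetical, for illustration only:

```python
# Count days whose absolute price move is 10% or more, the kind
# of swing the text argues limits everyday crypto use.
def big_move_days(prices, threshold=0.10):
    count = 0
    for prev, cur in zip(prices, prices[1:]):
        if abs(cur - prev) / prev >= threshold:
            count += 1
    return count

eth_usd = [1600, 1750, 1740, 1500, 1520, 1380]  # toy daily closes
print(big_move_days(eth_usd))  # 1 day with a >=10% move in this series
```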

Getting transaction prices down and making micropayments possible will allow for more widespread use, and more widespread use means more stability. If the large holders can release their grip that is centralizing Ethereum, then they will likely do better in the long run for the greater good of decentralization. 

We are incredibly positive about the Ethereum network and look forward to its decentralization. 

The Metaverse and Its Ingredients

Since the metaverse’s early inception, starting with Neal Stephenson’s 1992 novel “Snow Crash,” the idea of an immersive world without limitations has intrigued many. This interest has been boosted by the increased online presence characterizing the Covid pandemic and Facebook’s name change to “Meta.”

There remain several questions about the metaverse, and we have addressed many of these with our previous article examining the metaverse’s relationship with programmable data. Yet there remain several more questions: Can the metaverse live up to its hype? What is it good for? And how is it distinguishable from any other virtual reality-based world? 

The Metaverse’s Core Idea

The metaverse is, at its core, just a named version of the internet’s evolution into a more social, economically sophisticated, and immersive system. The tech world holds two opposing perspectives on how this can be accomplished.

  1. The decentralized approach: An open and interoperable system owned by the communities that maintain it.  
  2. The centralized approach: A centralized system that is closed and controlled by corporate mandates. This is like the current “Web 2.0” system that demands economic rents from creators, donors, and residents. Think of the Apple store that prevents some apps from being distributed and demands a cut of every app’s revenue.

Open versus closed is the distinction separating these two perspectives.

A closed metaverse is a world created by a single entity and controlled by them. They dictate rules, enforce those rules, and can decide who is excluded and why. We can easily imagine Meta developing such a world.

In an open metaverse, individuals govern their own identities, and the collective will enforces property rights and benefits for the users. Transparent, interoperable, and permissionless, an open metaverse enables users to freely build a metaverse of their choice. 

A true metaverse is an open one, where in a Web 3.0 style, the users determine what’s best.

The Seven Ingredients

We have identified seven ingredients that are required to build an open metaverse. 

1.    Decentralized

The overarching, fundamental requirement of a healthy metaverse is that it must be decentralized. Centralized networks start as friendly and cooperative places in order to attract new users and developers. However, as the growth curve slows, they transition to a competitive, extractive system that demands a zero-sum game.

Powerful intermediaries become involved in repeated violations of users’ rights and may ultimately de-platform users or phase out their metaverse entirely. A decentralized platform avoids this by propagating user ownership and a healthy community.

Decentralization is critical. A centralized network stifles innovation while the opposite remains true for a decentralized counterpart. Maintaining decentralization offers the best protection against a failed metaverse. 

2.    Autonomous

The next ingredient of an open metaverse is the self. The first thing you should have in a virtual world is yourself. A person’s identity must persist when crossing the real-to-virtual threshold and across the metaverse. 

Identification is established by authentication, confirming who we are, what we can access, and what information we can provide. This is currently done through an intermediary that conducts the process using solutions like single sign-on (SSO).

Leading tech giants of today (i.e., Google and Meta) built their companies on user data. They collected it by analyzing people’s activity and developing models to provide more relevant and effective marketing. 

Cryptography, which is at the heart of Web 3.0, allows users to authenticate without relying on a central intermediary. Users govern their identity directly or with the support of a chosen service. Crypto wallets (i.e., Metamask or Phantom) can be used for identity authentication. Open-source protocols such as EIP-4361 (Sign-in with Ethereum) or ENS (Ethereum Name Service) can be used by projects to build a decentralized system securing identity. 
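The flow behind wallet-based sign-in in the spirit of EIP-4361 can be sketched as follows. Real implementations verify an ECDSA signature against the wallet address; here a boolean stands in for that step, and all names and values are illustrative:

```python
# Sketch of "Sign-in with Ethereum"-style authentication: the
# server issues a one-time nonce, the wallet signs a structured
# message, and the server checks the nonce and the signature.
import secrets

issued_nonces = set()

def issue_nonce():
    nonce = secrets.token_hex(8)
    issued_nonces.add(nonce)
    return nonce

def build_message(domain, address, nonce):
    # Structured, human-readable message the wallet would sign.
    return (f"{domain} wants you to sign in with your account:\n"
            f"{address}\nNonce: {nonce}")

def verify(message, nonce, signature_ok):
    # signature_ok stands in for real ECDSA recovery/verification.
    if nonce not in issued_nonces:
        return False
    issued_nonces.discard(nonce)   # nonces are single-use
    return signature_ok

nonce = issue_nonce()
msg = build_message("example.org", "0xAbC...", nonce)
print(verify(msg, nonce, signature_ok=True))   # True
print(verify(msg, nonce, signature_ok=True))   # False: nonce spent
```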

3.    Property Rights 

The most popular video games of today make money from the sale of in-game items: skins, weapons, emotes, and other digital things. People who buy these are not really purchasing them but instead renting them. If the game shuts down or unilaterally changes the rules, the players will lose access to their purchases. 

While we are used to this Web 2.0-based system, digital assets could be genuinely “owned,” transferred, sold, or taken outside of games. The same logic of ownership in the physical world can be applied to the digital world. When you buy something, you take ownership. It really is that simple.

These ownership rights should be enforced in the same fashion that courts enforce them in the real world. Digital property rights were not a possibility before the advent of encryption, blockchain, and complementary advances like NFTs.  The metaverse can turn a digital serf into a landowner.  

4.    Flexibility

The mixing and matching of software components in the same way that Legos can be combined is called composability. Each software component is written once, and then it is reused. 

This system is analogous to Moore’s law, or the way interest compounds. The exponential potential that such a system provides has shaped the worlds of finance and computing. It can be applied to the metaverse. 

Promoting metaverse composability, which is closely related to interoperability, requires a high-quality foundation with open technical standards. With Web 2.0, developers build digital goods and novel experiences using a system’s foundational components, like those found in Roblox and Minecraft.

However, using those goods or experiences outside their native settings is more complicated or impossible. Companies offering embeddable services like Twilio’s communications or Stripe’s payments work across multiple websites and apps, but don’t allow developers to change or alter their code. Composability enables developers to use and modify the underlying codes, similar to open source. 

Decentralized finance (DeFi) is a fairly good example of composability and interoperability. Anyone can adapt, change, recycle, or import the underlying code. Further, engineers can work on live programs, such as Uniswap’s automated market-making exchanges or Compound’s lending protocols, using the memory of Ethereum’s shared virtual computing system. 
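Composability can be illustrated by treating each protocol as a small function and chaining them. The protocols, rates, and yields below are entirely hypothetical:

```python
# Sketch of DeFi-style composability: each protocol exposes a
# small function, and new products are built by chaining them
# without asking anyone's permission.
def swap_usdc_for_eth(usdc, rate=0.00075):
    """Toy AMM-style swap at a fixed hypothetical rate."""
    return usdc * rate

def lend_eth(eth, apy=0.04):
    """Toy lending protocol: balance after one year at a fixed APY."""
    return eth * (1 + apy)

def swap_then_lend(usdc):
    """A composed 'yield product' built from the two blocks above."""
    return lend_eth(swap_usdc_for_eth(usdc))

print(round(swap_then_lend(10_000), 4))  # 7.8 ETH after one year
```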

5.    Open Source

Composability is not possible without open source. The finest programmers and producers, not platforms, deserve control so that they may be truly innovative. When codebases, algorithms, protocols, and marketplaces are accessible to all, developers can create more sophisticated and trustworthy experiences.

This openness produces better software, heightens transparency in economic arrangements, and closes information gaps. All these features aid the development of more egalitarian and equitable systems that align all network participants. Such systems can render obsolete many securities laws, which were designed to address the principal-agent dilemma and asymmetric commercial knowledge.

6.    Community Ownership 

When a single entity owns and controls a virtual world, it provides limited escapism without offering a truly virtual experience—like a theme park. Users and programmers must not be forced to adhere to the possibly arbitrary rules of centralized management. All stakeholders should have a say in a metaverse’s governance. 

Community ownership is the ingredient that brings together all network players: builders, investors, creators, and consumers. The metaverse, using blockchain and ownership tokens, can provide this level of coordination. 

Web 3.0’s decentralized autonomous organizations (DAOs) have taken this idea to heart. They are moving away from the rigidity of corporate institutions and toward more flexible, democratic, and informal governance models. DAO communities can be constructed, governed, and pushed forward by their users rather than through centralized bodies.  

7.    Total Social Involvement 

Tech companies would like you to believe that virtual reality (VR) and augmented reality (AR) hardware are essential components of the metaverse. They are not necessary; they are modern Trojan horses. The tech giants see this hardware as their pathway to becoming your primary provider of 3D experiences. Yet the theme park analogy also applies here.

The metaverse does not require VR or AR. The best manifestation of the metaverse demands social immersion. The metaverse enables activities and interactions that are more important than the hardware. People are there to interact, mingle, cooperate, and have fun from anywhere in the real world, as is done with Twitter Spaces, Discord, and Clubhouse.  

Covid-19 showed us the need for more immersive experiences. Zoom, for example, replaced text chatting, while FaceTime and Google Meet took the market by storm.

Closing Thoughts

While companies have started building metaverses, any virtual environment that lacks the above ingredients cannot be considered a fully developed metaverse. Web 3.0 is required to achieve the greatest potential inherent to the word metaverse.   

The metaverse is built with openness and decentralization as its core principles. Autonomy and property rights must endure the influence of centralized powers, and they require decentralization to flourish. With collective ownership, the metaverse avoids the pitfalls of unilateral control, and innovation flourishes.

IoT Devices Enhance Proactive Risk Management

IoT (Internet of Things) is a buzzword that has been around for a few years and is growing in popularity as we slowly connect everything to the net. An enormous amount of data is already being collected, and this is going to the next level through IoT sensors.

While there are many problems with IoT sensor security that still need to be solved, the data that is being supplied by these devices, if useful and used correctly, has the power to disrupt traditional risk management. This article will discuss some proactive uses of IoT for risk management and why IoT will be invaluable in the finance and insurance fields.

IoT’s Growth

The growth of IoT as a technology is unbelievable. IoT use cases are being seen in nearly every business sector, from connected technologies to cloud computing and digital data. Pharma is using IoT for material tracking and machine monitoring. Oil producers are using IoT for safe extraction and delivery. The travel industry is connecting aircraft to regulate seat temperatures and deploying other IoT devices to make travel seamless.

Cannabis producers are using IoT devices for monitoring their plants from seed to store to stay compliant with their local regulations. Any industry can find benefits from IoT devices. And for finance and insurance, this spread of devices can be used for our own needs.  

IoT for Risk Management

The embedding of IoT sensors into physical objects can complement risk mitigation and risk management services. The finance and insurance industries can either piggyback, extracting data from devices that are already installed, or require the use of our own devices’ native sensors. Our goal is to predict and identify risks with reliable accuracy.

During the COVID-19 pandemic, the use of IoT sensors surged in popularity. The shutdowns of the pandemic forced many businesses to rely on IoT sensors to be their eyes and ears.  

These new sensors had the ability to watch over vacant buildings. If a building’s system fails, an IoT sensor identifies the failure and notifies someone to deal with the problem. The ubiquity of these sensors means there is a continuous supply of tracking data, much like the data inherent to finance and insurance.
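The monitoring loop described here reduces to a simple check-and-notify pattern. The sensor IDs, readings, and thresholds below are illustrative:

```python
# Sketch of IoT building monitoring: each sensor reports a
# reading, and an out-of-range value triggers a notification so
# someone can deal with the failure.
def check_readings(readings, low, high, notify):
    alerts = 0
    for sensor_id, value in readings:
        if not (low <= value <= high):
            notify(f"sensor {sensor_id}: reading {value} out of range")
            alerts += 1
    return alerts

log = []  # a stand-in for an alerting channel (email, SMS, pager)
n = check_readings(
    [("hvac-1", 21.5), ("hvac-2", 38.0), ("pump-1", 19.0)],
    low=15.0, high=30.0, notify=log.append,
)
print(n, log)  # 1 alert, for hvac-2
```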

At this year’s Risk Management Society (RIMS) conference, several industry leaders from Waymo, Chubb, and Prologis Inc. spoke about how IoT is being used in their risk mitigation practices.

The team members from Chubb, including their chief risk officer, spoke about how IoT is helping Chubb take risk mitigation and management to the next level, allowing them to predict and even prevent potential damage before it happens. A Chubb team member stated that IoT is having a particularly noteworthy impact on their commercial insurance industry. This change is evolving the way that they are now pricing, underwriting, and servicing commercial insurance. 

IoT in Insurance

The adoption of IoT in the commercial insurance segment has accelerated significantly since the beginning of the pandemic, and Chubb expects it to expand further. Chubb’s senior vice president and IoT lead, Hemant Sharma, said that Chubb sees IoT as a valuable opportunity to offer their clients bespoke risk prevention services that will ultimately reduce or, in some cases, avoid losses.

Prologis Inc’s senior vice president of global risk management, Jeffery Bray, spoke about how critical IoT was to their business. Prologis has a billion-dollar portfolio of warehouses, and they are using IoT to find better ways to manage and predict risk. IoT tech provides the perfect fit as Prologis’s main risk is driven by property exposure. 

The IoT sensors help Prologis get ahead of their operating risks, collect more data in real-time and be more predictive. According to Bray, Prologis is now working on valuing leading indicators as opposed to reacting to lagging counterparts. This switch involves the ideation and development of “autonomous” buildings, those which effectively use IoT devices. 

One new area advancing IoT is drones. After a natural disaster, drones can quickly gather in-field data for any resulting claims. Drones also gather data for building inspections, providing underwriters with more information and claimants with faster payouts.

Potential Uses for IoT in Risk Management

For future uses of IoT, there are two crucial questions to ask:

1. Will this new technology help drive differentiation in the marketplace?

2. Will it stand the scrutiny required of a solid and profitable business case?

The risk management space has many candidates that can potentially fulfill these requirements.

Oil and Gas

The oil and gas industry has consistently invested in its sensor and early warning infrastructure to ensure safety. Some of the most common risks in the energy industry are injuries, fires, hazardous gas leaks, and vehicle accidents. 

Through IoT data, the energy industry and insurers can collaborate to look for the early signs of potential accidents, preventing costly incidents, environmental spills, and insurance claims.

Despite preventive measures, risk is always present in oil and gas, and the costs of adverse events are often devastating. Research covering 1974 to 2015 shows that the total accumulated value of the 100 largest oil and gas disasters exceeds $33 billion. Another report shows that Russian refinery damage alone, from 2011 to 2015, exceeded $1.5 billion.

Infrastructure

The variety of sensors available to commercial infrastructure OEMs has increased substantially. These sensors can monitor safety hazards such as water leakage, smoke, overloading of weight-bearing structures, and the presence of mold and mildew. Infrastructure management systems will be increasingly integrated with IoT data to aid loss prevention programs and trigger preventative actions.

A 2018 study compared a classical risk model (using no telematics or IoT data) against a telematics-based version and a hybrid version (telematics plus traditional factors), measuring each model’s predictiveness. The result: the classical model ranked least predictive.
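The idea behind the hybrid model can be sketched in a few lines. In this toy example (the coefficients are invented for illustration and are not fitted to real data, and this is not the model from the cited study), a classical score uses only traditional rating factors, while the hybrid score adds telematics features such as hard-braking frequency and the share of night driving:

```python
import math

# Illustrative only: hypothetical coefficients, not fitted to real data.
# The classical model uses traditional rating factors; the hybrid model
# adds telematics features (hard-braking rate, night-driving share).

def logistic(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def classical_score(driver_age: int, vehicle_age: int) -> float:
    z = -2.0 - 0.03 * (driver_age - 40) + 0.05 * vehicle_age
    return logistic(z)

def hybrid_score(driver_age: int, vehicle_age: int,
                 hard_brakes_per_100km: float, night_share: float) -> float:
    z = (-2.0 - 0.03 * (driver_age - 40) + 0.05 * vehicle_age
         + 0.4 * hard_brakes_per_100km + 1.5 * night_share)
    return logistic(z)

# Two drivers identical on traditional factors, different behind the wheel:
p_classical = classical_score(30, 5)
p_smooth = hybrid_score(30, 5, hard_brakes_per_100km=0.2, night_share=0.05)
p_harsh = hybrid_score(30, 5, hard_brakes_per_100km=3.0, night_share=0.40)
```

The classical model assigns both drivers the same claim probability; the hybrid model separates them, which is the extra predictiveness the study measured.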

Grocery and Other Retail

With millions of routine visits and potentially hazardous locations within grocery and convenience store aisles, seafood facilities, salad bars, and liquid storage areas, opportunities for proactive risk management are abundant.

IoT devices can be used in accident-prone areas to monitor human traffic patterns, debris, and cleaning. Beyond the logging of activity for compliance reasons, IoT can help prepare injury reports and the necessary remedial actions for reducing claims-based losses. 

Smart Homes

We now see a stream of new connected devices entering our homes. Ring doorbells, smart thermostats, baby monitors, IoT-enabled refrigerators and other appliances, pipe leakage sensors, lighting, and entertainment controls are becoming more commonplace. If utilized correctly, the resulting increase in data can enable innovative new insurance products and deeper engagement with insureds and mortgage borrowers.

Wearables

Connected health wearables such as watches, patches, shoes, and socks, along with a new supply of industrial safety wearables, are entering the market.

These devices monitor biometric data, as well as risky joint angles (improper lifting technique, carpal tunnel syndrome), bad posture, and more. They help prevent injuries and costly medical insurance claims.

Proactive Risk Management in IoT Programs 

IoT technologies continue to evolve, and the real test is whether the technology can benefit both the finance or insurance carrier and the borrower or insured. Until the industry achieves a high engagement index with users, whether personal or commercial, the chance of users opting out remains high, limiting the technology’s potential.

Progressive Insurance and other pioneers in the IoT space have moved in the right direction, initially focusing on the automotive sector. Progressive’s Snapshot program rewards the insured with monetary benefits when they drive safely and avoid high-risk behaviors such as late-night driving or excessive acceleration and braking.

The result is high “stickiness” among the insured population, who keep lower rates by passing the six-month Snapshot test. The program also allows Progressive to identify riskier drivers who will not receive the lower rates, while still alerting those drivers with “beeps” that their actions are hazardous. Additionally, Snapshot has withstood the scrutiny of actuaries, reshaping how insurers assess, limit, and price the risk of their product offerings.
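A toy scoring sketch shows how trip telemetry could feed a discount decision of this kind. The weights and threshold below are hypothetical; Progressive’s actual Snapshot model is proprietary and not described here:

```python
# Toy usage-based-insurance score. Weights and the discount threshold
# are hypothetical; the real Snapshot model is proprietary.

def trip_penalty(hard_brakes: int, minutes_after_midnight: int) -> float:
    # Penalize hard braking and late-night driving, per the behaviors
    # named in the text above.
    return 2.0 * hard_brakes + 0.1 * minutes_after_midnight

def six_month_score(trips: list[tuple[int, int]]) -> float:
    """trips: (hard_brakes, minutes driven after midnight) per trip."""
    return sum(trip_penalty(hb, nm) for hb, nm in trips) / max(len(trips), 1)

def qualifies_for_discount(trips: list[tuple[int, int]],
                           threshold: float = 5.0) -> bool:
    return six_month_score(trips) < threshold

safe = [(0, 0), (1, 0), (0, 10)]     # smooth driving, mostly daytime
risky = [(4, 30), (6, 90), (3, 0)]   # frequent hard braking, night trips
```

Here the safe driver’s average penalty falls under the threshold and earns the discount, while the risky driver’s does not, mirroring the sorting effect described above.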

Image courtesy of Progressive Insurance

So, what can we do to fulfill the two questions of market differentiation and profit?

  • Develop an ecosystem with technology partners. Explore the IoT marketplace thoroughly by studying product roadmaps, vendors, and system integrators. 
  • Continuously experiment. Include businesses and markets adjacent to your usual targets through expanded coverage or product overhauls. 
  • Integrate IoT into operations early. Developers must marry underlying systems to IoT-capable devices starting from the ideation stage. 
  • Plan for the long term. As IoT evolves, business leaders should increasingly take on an “investor mindset,” seeking out opportunities to improve income or reduce costs.

Closing Thoughts

The internet of things (IoT) is flourishing globally as the number of connected devices continues to expand, projected to exceed 19 billion devices in 2025, more than two for every human. This massive expansion, coupled with ongoing improvements in device computing power, is giving rise to new possibilities for the finance and insurance industries.

Possible incentives include better pricing on mortgages and loans, rebates on policies, and discounts for companies that deploy IoT devices. IoT also brings added conveniences, such as reduced employee absence, less downtime, and faster repairs. The key is to remain proactive and consistently seek out the ways in which IoT can reshape the global risk management industry.

Disclaimer: The information provided in this article is solely the author’s opinion and not investment advice – it is provided for educational purposes only. By using this, you agree that the information does not constitute any investment or financial advice. Do conduct your own research and reach out to financial advisors before making any investment decisions.

The author of this text, Jean Chalopin, is a global business leader with a background encompassing banking, biotech, and entertainment.  Mr. Chalopin is Chairman of Deltec International Group, www.deltecbank.com.

The co-author of this text, Robin Trehan, has a bachelor’s degree in economics, a master’s in international business and finance, and an MBA in electronic business.  Mr. Trehan is a Senior VP at Deltec International Group, www.deltecbank.com.

The views, thoughts, and opinions expressed in this text are solely the views of the authors, and do not necessarily reflect those of Deltec International Group, its subsidiaries, and/or its employees.
