August 02, 2024

The Future of DeFi is Automated: PowerPool’s Head of Research Predicts 50% of Transactions Will Be AI-Driven Within Three Years

In Brief

Vasily Sumanov, Head of Research at PowerPool, discusses the innovative DePIN layer, its challenges, and future directions in the Web3 market, emphasizing the transformative potential of automation and AI in blockchain technology.


We are pleased to share with you our discussion with Vasily Sumanov, Head of Research at PowerPool, regarding the innovative DePIN layer. From the initial focus on meta-governance to the latest advancements in automated DeFi solutions and AI agent integration, Vasily offers insights into the current challenges and future directions of the Web3 market, emphasizing the transformative potential of automation and AI in blockchain technology.

Many entrepreneurs are drawn to their field by a specific moment or event. What sparked your interest in this industry, and how has your passion evolved over time?

It was a long time ago, in 2013, when I discovered Bitcoin on one of the developers’ forums. I was curious about what it was and started to dig into it. Since that day in October 2013, I have had a real interest in it, and I started exploring how to mine and participate in all of this.

I bought some GPUs and started mining Litecoin, because by then it was too late to mine Bitcoin on video cards, and Litecoin was still a promising project. I built a mining farm with 120 video cards at the time, began mining, and came to understand how all of this works.

For the first several years, it was more of an enthusiastic vibe. I didn’t consider it as a main business or a full-time work activity; it was more like a hobby that generated some money. I did it with my friends and brother, and we shared all the responsibilities for different tasks. 

However, in 2017, when the ICO boom started, I saw how the space had grown over the years. I understood that I wanted to work on this full-time and dedicate all my efforts to growing in this field. In early 2017, I fully pivoted my career and started working in this space.

My first step was really about choosing the direction of work I wanted to pursue. I have a PhD in chemistry and a background in academic research, so I decided to focus on research activities, because that is what I understand and like.

This is my passion. I focused on token engineering and the economics of decentralized systems. After that, all my work was tied to the economics of decentralized systems, such as PowerPool and other projects, token designs, token analysis, and similar tasks.

Can you tell us about your journey to PowerPool 2.0? We know you started in 2020 as one of the pioneers in DeFi indices. How was that experience?

PowerPool started in 2020, during the COVID pandemic, and I was early in the community there. In 2021, the DAO officially hired me as Head of Research because I had contributed a lot to the project. I was particularly interested in the meta-governance concept because it was one of the first token engineering ideas in the space. PowerPool started as a meta-governance protocol.

After that, influenced by Delphi Digital and other top-tier VCs, we moved towards indices. We had a governance forum. Over the years, the team and the community found that an index is a complicated product to deliver to the market. People like it when indices grow, but nobody wants to invest if the index doesn’t grow. In crypto this is even more pronounced due to the high volatility: an index can grow fast or drop fast, and nobody likes that.

This is why the PowerPool community and team started to focus on what could be the next big narrative for PowerPool. We didn’t start with what was popular but with what we could do really well. We identified a gap in the market that we could fill with our services. We found that we could excel in automation – automating smart contracts or triggering contracts according to certain conditions. 

This is a significant market, and many projects and retail users need it. Many big protocols, like Yearn Finance, started with automating the compounding of yield. This kind of automation is widespread and in demand across different ecosystems, projects, and users.

This is why we focused on automation and built PowerAgent V2, a new version of the automation network or Keeper bot that can automate transactions based on different on-chain and off-chain conditions. 
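To make the idea concrete, here is a minimal sketch of a keeper-style condition check, assuming web3.py and purely hypothetical contract details: the RPC endpoint, vault address, ABI fragment, and function names are placeholders, not PowerAgent’s actual interface.

```python
# Minimal keeper-style loop: act only when an on-chain condition (pending
# yield above a threshold) and a network-level condition (gas price below a
# cap) both hold. All addresses, ABI entries, and names are hypothetical.
import time
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC endpoint

VAULT_ABI = [
    {"name": "pendingYield", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "compound", "type": "function", "stateMutability": "nonpayable",
     "inputs": [], "outputs": []},
]
vault = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder vault
    abi=VAULT_ABI,
)

YIELD_THRESHOLD = 10**18        # 1 token, assuming 18 decimals
MAX_GAS_PRICE_WEI = 30 * 10**9  # 30 gwei cap for a routine job

def should_execute() -> bool:
    pending = vault.functions.pendingYield().call()  # on-chain condition
    gas_ok = w3.eth.gas_price <= MAX_GAS_PRICE_WEI   # network-level condition
    return pending >= YIELD_THRESHOLD and gas_ok

if __name__ == "__main__":
    while True:
        if should_execute():
            # A real keeper would build, sign, and broadcast the compound()
            # transaction here; that part is omitted to keep the sketch short.
            print("Conditions met: submit compound() transaction")
        time.sleep(60)
```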

Recently, we updated our vision and made it even more specific. We focused on AI agents because we believe that transaction execution on behalf of AI agents is a significant market, and it complements the narrative we already have. We are still building the network that automates DeFi protocols, on-chain strategies, and user activities. At the same time, that same network is well suited to powering the whole AI sector.

This is really big. Almost every AI agent that makes decisions and generates triggers needs someone to actually execute the resulting transactions. There is a gap here at the moment, and PowerPool can fill it.

Our journey has taken us from meta-governance to indices, to the first version of our automation network, and now to PowerPool 2.0, which focuses on AI agents and multiple roll-ups. It’s a big expansion of the concept.

How does the configurable execution conditions feature of PowerAgent benefit both routine and high-value tasks?

In Web3, tasks that need automation can be categorized as routine tasks and high-value tasks. Routine tasks are those you want to automate but that don’t carry a significant risk of losing money if the automation fails at a particular point in time.

For example, if you want to compound interest in a wallet and the compounding happens a little later than planned, say two minutes after the desired time, you won’t lose much money. Many such tasks need automation only because you don’t want to spend time on them, and a delayed or failed execution carries no significant risk.

High-value tasks, on the other hand, involve the protection of positions in the lending market from liquidation, where users can lose money if the task isn’t executed on time or under specific conditions. This could be really bad for the user. That’s why in PowerAgent, we created a wide range of conditions and parameters that you can tune to ensure that your high-value task will be executed properly.

First, you can set a stake range for the Keeper to filter out keepers with low stakes, meaning keepers with little at risk. Only those with high stakes, who carry higher responsibility and will lose tokens if they fail to execute your task, will be able to take on your high-value task.

We also have a diversified Keeper set, so if some Keepers fail to execute the transaction on time, another one will take over and execute it. They act as watchdogs for each other to protect the system.

Additionally, you can set execution conditions like gas limits for your task. For high-value tasks, the gas price doesn’t matter because if you risk losing significant amounts of money, you’re willing to pay any gas cost to ensure the task is executed. The configurable execution conditions benefit users by providing flexibility. 

For routine tasks, you can set a gas limit, so if gas prices are high, the task will be postponed to save on costs. For high-value tasks, you can pay more for keepers and set specific gas ranges and execution conditions. The main idea of PowerPool is to make it flexible for different kinds of tasks, allowing users to automate both routine and high-value tasks efficiently.
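As a purely illustrative example of how such settings might be expressed (the field names below are hypothetical, not PowerAgent’s actual job schema), the routine versus high-value split could look like this:

```python
# Illustrative job configurations contrasting a routine task with a high-value
# task. Field names are hypothetical, not PowerAgent's actual schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class JobConfig:
    name: str
    min_keeper_stake: int               # smallest keeper stake allowed to execute (tokens)
    max_gas_price_gwei: Optional[int]   # None means "execute at any gas price"
    max_delay_seconds: int              # how late execution may be before it counts as a failure

# Routine task: compounding yield. Cheap to postpone, so cap the gas price
# and tolerate a generous delay.
routine_job = JobConfig(
    name="compound-vault-yield",
    min_keeper_stake=1_000,
    max_gas_price_gwei=25,
    max_delay_seconds=3_600,
)

# High-value task: protecting a lending position from liquidation. Gas price
# is irrelevant next to the position at risk, and only heavily staked keepers
# (with more to lose if they fail) may execute it.
high_value_job = JobConfig(
    name="liquidation-protection",
    min_keeper_stake=100_000,
    max_gas_price_gwei=None,
    max_delay_seconds=60,
)
```

The point of the contrast is simply that the same automation network can serve both profiles by letting the job owner tune who may execute and under what gas and timing constraints.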

What were your last and next milestones? What is your current strategy, and in which direction do you want to move forward?

We launched on several roll-ups, including the Base network. We also developed a new source of randomness that allows PowerPool to be deployed on various new systems. This is more technical, but it is essential for PowerPool’s operation on new roll-ups and chains where Ethereum’s proof-of-stake consensus isn’t available as a source of random generation.

The DAO approved a proposal for treasury restructuring, and we focused on a growth strategy for different roll-ups. We have massive growth planned for PowerPool on various roll-ups, which is a significant milestone for the DAO. 

We also have a new vision involving AI agents and their automation. This vision focuses on PowerPool 2.0 and its value proposition for the AI agents market, aiming to be a service network for AI agents and DeFi. We are working with partners and projects from the AI sector to implement this vision practically.

What is your perspective on the current Web3 market’s challenges and benefits since 2020?

Web3 is quite big now. When I joined crypto, even the term “Web3” wasn’t widespread. It was mainly about Bitcoin. Now, it’s a huge industry with many subsectors, like ZK and others, forming large sub-industries. 

Web3 lacks user convenience. Currently, users need to manage multiple wallets and switch between networks, which can be quite cumbersome. The user experience is really poor for the average non-crypto user.

Automation can significantly improve this experience because most users don’t want to perform many manual actions on-chain. They just want to make a trade, buy tokens in portions, or automate payments. 

Web3 needs better UI, well-designed automation services, and on-chain agents to facilitate these processes. These agents can help users perform tasks by understanding their needs and executing transactions automatically. This level of automation and user-friendliness is crucial for consumer adoption.

Do you think the focus is shifting from DeFi to AI or other topics?

The focus is shifting for both users and developers. Users mainly use integrators like MetaMask, Zapper, or 1inch, which provide multiple services and simplify actions. AI could become an alternative interface, using human-readable prompts to achieve similar results. Developers are drawn to AI because it offers higher valuations and funding opportunities than DeFi. AI agents in Web3 represent a blue ocean with less competition and more room for innovation.

Building new solutions in DeFi is now complicated due to the established competition and the ability to easily fork main DeFi primitives. Competing in TVL (Total Value Locked) with large projects is challenging. Therefore, spending time on AI seems more valuable as it involves adapting existing AI models to specific tasks and goals in crypto.

Which market segments and narratives are currently relevant? Which products have reached their Product-Market Fit (PMF)?

In the blockchain space, several products have achieved product-market fit. From an infrastructure perspective, indexing projects such as The Graph are essential for building, and various dev tooling projects have also found a strong fit. Some RPC projects have similarly succeeded. Within DeFi, DEX aggregators and lending markets have demonstrated brilliant product-market fit, and DEXs in particular have achieved significant success.

Regarding AI agents, while some protocols have good market valuations, they aren’t yet at a stage where they are fully complete and reliable products. Unlike a new car from a reputable brand that works well without constant updates, AI projects in Web3 are still evolving and being developed. Therefore, while they show promise, they haven’t reached the same level of product-market fit as some established blockchain and DeFi products.

How do you see the current state of the DePIN ecosystem? Why has it recently attracted venture capital interest?

This isn’t the first or second year of the DePIN narrative; it has been around for a while. You can remember the Orchid protocol for decentralized VPN. If I’m not mistaken, it launched back in 2018 or 2019. Other networks have tried to build decentralized VPNs, CDNs, and video streaming, like Livepeer, which was already operational in 2019. I think it was one of the earliest practically operating DePIN networks on the market.

I think there was a big problem for DePIN in 2018, 2019, and 2020, because there was only Ethereum. There was no wide variety of roll-ups or networks with really low gas fees and the ability to make a lot of transactions without paying significant costs. The absence of layer-one scalability and of options for many cheap transactions was a big limiting factor, because DePIN networks need to post proofs of their work.

All DePIN networks consist of physical nodes distributed across the Internet. They all do some physical work; of course, it’s digitized work. It can be video streaming, data transmission, file storage, or something else. However, the DePIN network needs to prove the correct operation of its nodes.

For example, if you store files, you make proof that your files are still stored. Or if you provide other services, you make proof that your services are done. This requires a lot of transactions sometimes. Also, you need to implement staking and slashing mechanics to protect the DePIN network. So, you have a lot of on-chain interaction that is necessary for the correct operation of a DePIN network. And it was not possible to do it on this scale before.

With the introduction of Solana and all these roll-ups and stuff, it’s much easier to launch something new. So, you have some nodes off-chain and some on-chain components, for example, staking or registered workers or proof that the service is running correctly.

The rise of DePIN is connected with the availability of Layer 1s and Layer 2s for operation without significant cost. The community also grew a lot because more and more ordinary users joined crypto and started to think, “Okay, we need VPN. We need this. We need that.” So, when there are more traditional users outside of crypto geeks, people realize that all these digital-physical services are really needed.

So, I think this is a combination of market demand and the ability to build and deliver. The industry is now mature enough to realize the kind of vision you saw in the Silicon Valley series, right? “We want to build decentralized storage,” or something like that.

This idea is not new. Take Siacoin, for example. It existed almost 10 years ago and combined proof-of-work mining with file storage. Then Filecoin came along. Of course, Filecoin is not a purely DePIN network; it has its own blockchain layer, which is much more complicated. But the idea of DePIN is really old. What’s new is the combination of factors needed to implement such a product, deliver it to the market, and attract attention and users, and that only came together about two years ago.

The concept of “proof of physical work” is central to many DePIN projects. Could you explain this concept in depth and compare it to traditional blockchain consensus mechanisms like proof of work or proof of stake? 

For example, in PowerPool, the node makes the transaction. The proof of work is the transaction itself, because you can find it on-chain. You can easily verify that the transaction was made according to the required time or condition and that the job was executed. So, the transaction itself is the result of the work. In PowerPool, the proof of physical work is essentially the result of that work, the product that the PowerPool node delivered. It just happens that in our case it works this way.
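In other words, verifying the proof here means inspecting the transaction itself. Below is a hedged sketch of such an off-chain check, assuming web3.py and placeholder values for the transaction hash, keeper address, and deadline:

```python
# Sketch: verify that a keeper's work was delivered, using the transaction
# itself as the proof. The hash, expected sender, and deadline are placeholders.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC endpoint

def work_was_delivered(tx_hash: str, expected_keeper: str, deadline_ts: int) -> bool:
    receipt = w3.eth.get_transaction_receipt(tx_hash)
    if receipt["status"] != 1:                          # the transaction must have succeeded
        return False
    tx = w3.eth.get_transaction(tx_hash)
    if tx["from"].lower() != expected_keeper.lower():   # it must come from the assigned keeper
        return False
    block = w3.eth.get_block(receipt["blockNumber"])
    return block["timestamp"] <= deadline_ts            # and land before the agreed deadline
```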

For some other DePIN networks, like video streaming, you need to deliver the files you transcoded using your GPU farms. You also need to prove that these transcoded files were delivered, so you need to post a hash. I don’t know exactly how it works in video streaming because I’m not in this development sector.

But from a basic point of view, you always need some simple proof on-chain, verifiable by everyone, that the work was delivered. It exists purely for monitoring. The idea of DePIN is that physical nodes execute tasks.

They receive tasks from the end users. Sometimes they receive them directly; sometimes the network itself receives the task and then distributes it to the nodes. For example, in Livepeer, the video is split into small pieces that are distributed across the nodes according to the staking rules. Then all the nodes do their work, the results are collected back, and the final output is sent to the user. That’s how it works.
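A stake-weighted split of work like the one described can be sketched in a few lines; this illustrates only the general idea, not Livepeer’s actual scheduler:

```python
# Sketch: distribute work chunks across nodes with probability proportional
# to each node's stake. Illustrative only, not any network's real scheduler.
import random

def assign_chunks(chunks: list[str], stakes: dict[str, float]) -> dict[str, list[str]]:
    nodes = list(stakes)
    weights = [stakes[n] for n in nodes]
    assignment: dict[str, list[str]] = {n: [] for n in nodes}
    for chunk in chunks:
        node = random.choices(nodes, weights=weights, k=1)[0]
        assignment[node].append(chunk)
    return assignment

stakes = {"node-a": 500.0, "node-b": 250.0, "node-c": 250.0}
chunks = [f"segment-{i}" for i in range(8)]
print(assign_chunks(chunks, stakes))   # node-a gets roughly half the segments on average
```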

The proof of physical work is needed just to see that all nodes behave correctly and deliver the results, and other nodes can verify them and slash them if necessary. So, this proof is like a cryptoeconomic security mechanism. 

It’s only one piece of the mechanism. The full cryptoeconomic security model is based on the stake a node must put up in order to participate in the network; in exchange, the node receives cash flow from the rewards for executed tasks. But if a node performs malicious actions or fails to deliver the service, it will be slashed. So it’s a very simple design.
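As a toy model of that stake, reward, and slash loop (illustrative only, not any specific network’s implementation; the minimum stake and slash fraction are made-up parameters):

```python
# Toy model of DePIN cryptoeconomics: nodes stake to participate, earn fees
# for verified work, and are slashed for failures. Parameters are illustrative.
class Node:
    def __init__(self, node_id: str, stake: float):
        self.node_id = node_id
        self.stake = stake
        self.earned = 0.0

class DepinNetwork:
    MIN_STAKE = 100.0
    SLASH_FRACTION = 0.10   # 10% of stake lost per failed or malicious task

    def __init__(self):
        self.nodes: dict[str, Node] = {}

    def register(self, node_id: str, stake: float) -> None:
        if stake < self.MIN_STAKE:
            raise ValueError("stake below minimum")
        self.nodes[node_id] = Node(node_id, stake)

    def settle_task(self, node_id: str, fee: float, proof_ok: bool) -> None:
        node = self.nodes[node_id]
        if proof_ok:
            node.earned += fee                        # cash flow from executed tasks
        else:
            node.stake *= (1 - self.SLASH_FRACTION)   # slash for bad or missing work
            if node.stake < self.MIN_STAKE:
                del self.nodes[node_id]               # ejected once stake falls too low

network = DepinNetwork()
network.register("node-1", stake=500.0)
network.settle_task("node-1", fee=2.5, proof_ok=True)
network.settle_task("node-1", fee=2.5, proof_ok=False)
```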

What are the main differences between ChatGPT and true AI agents?

There are not many differences, but some key distinctions exist. First, the AI agent landscape is quite vast, and I can’t say I’m a professional in developing AI agents, particularly. I am an experienced user trying to integrate AI agent decisions into the Web3 blockchain-centric transaction environment. I’m working on the interface between AI and Web3, aiming to settle AI decisions on-chain and facilitate on-chain actions.

With ChatGPT, from the user’s perspective, you write something and receive an answer in text form or links or use some plugins, possibly getting additional functions like video output or images or accessing additional services. Essentially, you’re receiving content. You prompt information, make a request, and receive content back. This is the basic functionality.

In Web3, you might make a prompt or provide your needs in some other way. For instance, you could send the contract name of a token, and the AI agent would assist you in purchasing it. Here, you make a prompt and receive actionable results. The AI might not advise you to buy a certain token but could directly execute the purchase based on market analysis. This approach simplifies the process significantly, especially for non-experienced users who might find the traditional steps cumbersome and confusing.

In essence, AI agents in Web3 should provide advice and direct task execution. They should work seamlessly, much like ChatGPT does in its domain but within the Web3 ecosystem. PowerPool, for instance, aims to bridge the gap from advice to executed transactions, ensuring users don’t just get recommendations but actual results.
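The “from prompt to executed transaction” flow can be sketched at a high level as follows; the intent parser, decision stub, and execute_swap function below are hypothetical glue, not PowerPool’s or any particular model’s actual API:

```python
# High-level sketch of an AI agent that turns a user prompt into an on-chain
# action. The parser, decision logic, and execute_swap() are all hypothetical.
import re

def parse_intent(prompt: str) -> dict:
    """Extract a token contract address and an action from free-form text."""
    address = re.search(r"0x[a-fA-F0-9]{40}", prompt)
    action = "buy" if "buy" in prompt.lower() else "analyze"
    return {"action": action, "token": address.group(0) if address else None}

def decide(intent: dict) -> dict:
    # A real agent would run market and holder analysis here; this stub just
    # approves small, test-sized purchases.
    return {"approved": intent["action"] == "buy" and intent["token"] is not None,
            "amount_eth": 0.05}

def execute_swap(token: str, amount_eth: float) -> str:
    # Placeholder for the execution layer (e.g. an automation network submitting
    # the transaction on the user's behalf). Returns a dummy tx hash here.
    return "0x" + "00" * 32

def handle(prompt: str) -> str:
    intent = parse_intent(prompt)
    decision = decide(intent)
    if not decision["approved"]:
        return "No action taken."
    return "Executed swap in tx " + execute_swap(intent["token"], decision["amount_eth"])

print(handle("Please buy 0x1111111111111111111111111111111111111111 for me"))
```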

What are use cases or examples of routine tasks that can be automated via blockchain and AI technologies?

One clear example involves Uniswap, where an AI agent could assist in trading. Another promising project is called POND. They perform cluster analysis of token holders for specific tokens. You provide a contract address for a token, and POND offers trading strategies based on the analysis of token holder behavior. They identify whether wallets are speculative or long-term investors and provide strategies like buying or selling based on these insights.

In Web3, AI agents mainly process on-chain data but also need off-chain data to understand concepts like decentralized exchanges (DEXs). The unique feature of AI in Web3 is the deep integration with on-chain data alongside available off-chain information. For instance, understanding how a DEX operates requires both on-chain transaction data and off-chain definitions and standards.
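As a rough illustration of holder-behavior clustering of the kind described (not POND’s actual methodology), wallets could be grouped on simple features such as holding period and trading frequency:

```python
# Illustrative wallet clustering on toy features (holding period, trade count,
# average position size). Not POND's actual methodology; in practice the
# features would be normalized and far richer.
import numpy as np
from sklearn.cluster import KMeans

# Each row is a wallet: [days_held, trades_per_week, avg_position_usd]
wallets = np.array([
    [400, 0.2,  50_000],   # long-term holder
    [350, 0.1,  20_000],
    [  5, 30.0,  1_000],   # short-term speculator
    [  3, 45.0,    500],
    [ 90, 2.0,   5_000],   # somewhere in between
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(wallets)
for row, label in zip(wallets, labels):
    kind = "speculative" if row[1] > 10 else "long-term"
    print(f"cluster={label}  heuristic={kind}  features={row.tolist()}")
```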

What are the important conditions in L2 and L1 networks for easier onboarding and automation of DePIN and AI solutions?

The main conditions include low transaction fees and scalability to support numerous transactions without high costs. This is critical because high transaction fees can deter users from engaging with the network frequently. Scalability ensures the network can handle a large volume of transactions efficiently.

Another essential factor is having a substantial user base and liquidity. Automation won’t be effective if a roll-up is empty or lacks liquidity. A robust user base and high liquidity drive the usage of automation, enabling users to utilize their liquidity more efficiently. This can involve various strategies like liquidation protection, advanced limit orders, or smart strategies for capital allocation, such as moving capital between different vaults or staking projects to maximize yield.

User base and liquidity are crucial as they determine the effectiveness of automation. Automation allows users to interact more efficiently with the network, making more transactions and benefiting the network by increasing transaction volume and fees. This, in turn, compensates validators or sequencers in the roll-up and supports the network’s overall health.
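One concrete example of such a capital-allocation strategy is moving funds to the highest-yielding vault only when the expected gain outweighs transaction costs; the sketch below uses made-up vault names and numbers:

```python
# Sketch: rebalance capital to the highest-APY vault only if the extra yield
# over a holding period exceeds the estimated transaction cost. Toy numbers.
def should_rebalance(current_apy: float, best_apy: float, capital_usd: float,
                     tx_cost_usd: float, horizon_days: int = 30) -> bool:
    extra_yield = capital_usd * (best_apy - current_apy) * horizon_days / 365
    return extra_yield > tx_cost_usd

vaults = {"vault-a": 0.04, "vault-b": 0.07, "vault-c": 0.055}   # hypothetical APYs
current = "vault-a"
best = max(vaults, key=vaults.get)

if best != current and should_rebalance(vaults[current], vaults[best],
                                         capital_usd=10_000, tx_cost_usd=12.0):
    print(f"Move capital from {current} to {best}")
else:
    print("Stay put: the yield gain does not cover transaction costs")
```

An automation network is what makes this kind of rule practical, since the check has to run continuously and the resulting transaction has to be submitted without the user doing it by hand.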

How do you see the role of AI in DeFi in the future? What is PowerPool’s role in this context?

I think there can be AI agents that help users manage their portfolios: analyze their assets, automatically report what’s growing and what’s falling, show what yield is available from different strategies, pools, and so on, and look for other opportunities in the market to get more from these assets. So for now, it’s more like a personal assistant or a trading AI.

We won’t have anything much more advanced soon. So, from the user’s perspective, it’s like a personal assistant; what’s more interesting is the developer’s perspective. For example, launching new protocols and products with AI, with code developed and audited by AI, is a big deal for implementation, and it already works.

I know developers who are using ChatGPT or other AI products to develop code faster, perform audits or security checks, or build interfaces. So, from the developer’s perspective, AI will provide much faster go-to-market opportunities.

What do you think about the concerns in the AI and blockchain business regarding centralization problems and monopolies in computing power?

This is a real problem, but from another point of view, how can you train really big AI models like ChatGPT, like groundbreaking AI products, without having huge computing power and huge resources that you need to pay for? 

The thing is, to train AI properly at this scale of data and computation, you need to concentrate a lot of resources in one place and use them. And there are really only a few corporations in the world that can afford that so far.

So, this is why I’m not surprised that it’s all centralized. Those companies aren’t really worried about it, because they have the money and the resources to train AI. Some crypto geeks run around saying, “Oh my God, this is all centralized, this is all centralized.” Of course it is, but how would you create this computing power and decentralize it today? Where would you get the money? Where would you get all of this?

Look at the number of video chips and the amount of computing power being produced: only a handful of factories account for the entire worldwide production of video chips and graphics processors. So the whole computing-power production process is entirely centralized. And there is essentially one company designing all the graphics processors and chips, and that is its monopoly.

There have been projects that tried to pool computing power from GPU farms and offer it for AI development and training. They have had some success, but of course their scale is impossible to compare with what we see at Microsoft, Google, or other corporations.

And because it’s economically out of reach, it’s almost impossible to accumulate as much computing power as Google or Microsoft can with their money and their access to those chips and everything else.

What about the electricity concerns and regulations in AI, especially in Europe?

I understand there are regulatory issues similar to those in crypto. Regulations can limit the wider usage of AI, especially concerning data privacy, ethical considerations, and resource consumption. For instance, Europe has strict regulations to ensure AI systems are transparent and ethical and do not misuse personal data. These regulations can pose challenges to the development and deployment of AI technologies.

How do you foresee the development of blockchain in the next three years?

I think blockchain development is a complicated question because there is a lot going on, and blockchain is a very broad term. It includes all of DeFi, real-world assets, and maybe 30 or 40 sectors, each of them a multi-billion-dollar sector. I think we will have a really well-developed and diversified roll-up ecosystem.

We already have many rollups, and we’ll have even more. We will have a really competitive market for transaction fees, so we’ll always have opportunities to have networks with low transaction fees, which will be really good for users because they will spend less on transactions. 

We will possibly have much more regulated DeFi and financial products that are much more complicated in terms of regulation and who is able to use them, because everybody will implement KYC and AML on-chain and cut off users they consider high-risk. So I think we will see huge regulatory steps.

We will also have a lot of AI that will really assist users. In three years, AI for portfolio management or doing some simple actions will be like using DeFi or Zapier now. So, it’s just some common knowledge: you just access something and run it.

Of course, people will want ZK agents, for example, to have more privacy and to keep all this information away from AI developers. Because of the privacy issue in AI, nobody wants other people to know what they are asking an AI, right? It’s your private information. So I think this will also rise.

We will have a lot of ZK-proofs and prover markets, because ZK-proofs can add trustless components to various cross-chain operations, strategies, and so on. In some cases you won’t need oracles; you can generate proofs and use those proofs instead of oracles, as I understand it.

But I also think that maybe 40 or 50% of user transactions will be automated. They will be automated through strategies, like jobs on automation networks such as Gelato, or they will be automated because people will ask AI to do it for them. Either way, it will be automated.

I published a vision almost one year ago that 40% of block space in some networks will be consumed by automation networks like Gelato, which will make transactions on behalf of the users. In other words, users will not send transactions themselves anymore in the majority of cases. And now I’m feeling even more confident that this vision was correct.

But now I understand that this vision will be realized thanks to the expansion of AI agents, because users don’t even want to perform the transaction or think about how to do it.

They just want to make a prompt and get the results. I think that AI agents will eliminate this barrier to the adoption of basic functions in Web3 and automate transactions using automation networks like Gelato. So, we will have this vibe in the market.


About The Author

Victoria d'Este

Victoria is a writer on a variety of technology topics, including Web3.0, AI, and cryptocurrencies. Her extensive experience allows her to write insightful articles for a wider audience.
