Harvested Compute and Pervasive AI: The Protocol Labs Bull Thesis
The concept of decentralized computing and Edge AI has long been an area of fascination for me. In this article I will spell out a growth thesis for Protocol Labs. I will also try to outline the total market opportunity.
This analysis is based on my personal experience as a former software engineer and ecosystem manager for global software, edge AI and AIoT ecosystems at Intel, Arm and NVIDIA. I now work full time in Web3, building blockchain infrastructure at Saga (https://www.saga.xyz). At Saga, I work extensively on the Web3 gaming and metaverse ecosystem, as well as on infrastructure to help launch sovereign blockchains.
Over the last 12 years, I have personally spoken with CEOs and CTOs at thousands of edge AI, Web3 and AIoT startups. What follows is my analysis of this market and my recommendations for how Protocol Labs can win it.
Current State of Affairs
As a quick recap: Protocol Labs plans to build the world's largest decentralized storage network before shifting increasingly into decentralized compute. This is the vision spelled out in their Master Plan, which they have been executing against over the last several years.
For the most part, the existing data shows that IPFS and Filecoin have continued to gain traction in the market. This is a good thing. When AWS first came on the scene, S3 (its storage solution) proved to be the key driver of cloud computing. Once you convince a customer to store their data with you, convincing that customer to next buy computation-based services becomes much easier.
In other words, before Protocol Labs can sell decentralized computation, they must first build a massive footprint of decentralized storage. That part seems to be going well, leaving me feeling that success in decentralized compute is a possibility.
Protocol Labs seems to be having success attracting customers to use its storage. This success will make it much easier to upsell compute next. That’s coming.
Headwinds
Rising interest rates, fraud-related implosions of major exchanges, the collapse of CeFi, trade war escalations, market downturns and war in Europe are all weighing heavily on the crypto market.
The mess from the FTX, Celsius, 3AC and Luna collapses last year continues to cast a shadow over the Web3 space, along with the ever-present threat of misinformed regulation attempts. I am optimistic we will make it through this with the overall decentralization thesis intact.
Comprehension Gap
A big problem for Protocol Labs is the comprehension gap between its offering and the majority of Web2 developers and enterprises.
Developers must be shown, in a tangible way, why Protocol Labs’ offerings are superior to what they have now, without the distracting verbiage of decentralization and tokens, and in a manner these developers can comprehend natively with low effort.
Web3 is currently doing a spectacularly bad job of helping more developers cross the chasm. Better language, along with tangible, fun, hands-on projects that developers can build in their home labs to test Protocol Labs technology, will be critical to further progress.
The Biggest Headwind: Trust and Security
The #1 problem I see Protocol Labs having in attempting to drive adoption of its decentralized computation and storage offerings is the inherent lack of trust in “sending my storage and compute over the network to god knows where.”
Because Protocol Labs technology is based on distributing storage, and eventually compute, to a network of volunteer nodes, it is saddled with a heavier burden of establishing trust. People are leery of dispatching their sensitive data to such networks.
This same hurdle greatly slowed the adoption of AWS, and it represents a potentially far greater hurdle for Protocol Labs, one which will require far more “proof” and disruptive growth to overcome.
It is my feeling that customers will be dragged, kicking and screaming, into a decentralized storage and compute future. However, there is good news:
I believe that Generative AI represents a disruption in the trust model which has powerful potential to unleash Protocol Labs to become a massive player in decentralized storage and compute, possibly far beyond what anyone might be expecting.
In my opinion, attempting to assuage the fears of customers worried about relying on decentralized compute and storage networks will be a waste of time. The majority of the future “customers” for decentralized storage and compute won’t be humans; they will be AIs!
Generative AI Is The Killer Feature For Protocol Labs
AI is hot right now.
But what people are not talking about enough is that decentralized, open-source AI, free from arbitrary censorship, represents a disruptive attack vector that offers Protocol Labs unique advantages no other market participant can match.
While centralized cloud players such as Amazon, Google and Microsoft have to “Follow the Woke Agenda” (in the parlance of Web3 proponents), decentralized networks are free of such constraints.
Generative AI is like a new form of compression
Generative AI should be viewed as a disruptive new form of compression. Under existing storage schemes, data often needs to be replicated across many nodes to ensure safety.
With generative AI, only the generative “recipe” (the prompt, a reference to the model and the random seed) needs to be replicated. At most a single copy of the generated asset itself needs to be kept, because it can be reproduced on demand.
This has the potential to significantly reduce the cost of decentralized storage, as only the recipe and available compute are needed to recreate any lost, corrupted or stolen data.
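To make the “compression” framing concrete, here is a minimal sketch of the idea, assuming the generator can be made deterministic for a given model, prompt and seed (which is not always guaranteed across different hardware). The GenerativeRecipe structure, the placeholder generate() function and the sample values are all hypothetical, not part of any Protocol Labs API.

```python
import hashlib
import json
from dataclasses import dataclass, asdict


@dataclass
class GenerativeRecipe:
    """The only data that needs wide replication: a tiny 'recipe' for the asset."""
    model_id: str   # content-addressed reference to the model weights (hypothetical)
    prompt: str     # the prompt used to produce the asset
    seed: int       # random seed that makes generation repeatable
    sha256: str     # hash of the original output, used as an integrity check


def generate(model_id: str, prompt: str, seed: int) -> bytes:
    """Placeholder for a deterministic generative model call.

    In practice this would load the referenced model and run inference with a
    fixed seed so that the same bytes come out every time.
    """
    raise NotImplementedError


def restore_asset(recipe: GenerativeRecipe) -> bytes:
    """Recreate a lost or corrupted asset from its recipe, then verify it."""
    asset = generate(recipe.model_id, recipe.prompt, recipe.seed)
    if hashlib.sha256(asset).hexdigest() != recipe.sha256:
        raise ValueError("regenerated asset does not match the recorded hash")
    return asset


# The recipe is a few hundred bytes; the asset it stands in for might be
# hundreds of megabytes of video, audio or imagery.
recipe = GenerativeRecipe(
    model_id="bafy-example-model-weights",  # hypothetical identifier
    prompt="a rainy neon street at night, cinematic, 8 seconds",
    seed=42,
    sha256="<hash of the originally generated asset>",  # illustrative placeholder
)
print(len(json.dumps(asdict(recipe))), "bytes to replicate instead of the full asset")
```

The trade-off, of course, is that restoring data now costs compute instead of raw storage and bandwidth, which is exactly the demand shift the rest of this thesis is about.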
Generative AI and game engines, streaming
I personally believe that generative AI will play a significant role in disrupting both game engines and streaming services.
Instead of viewing a fully rendered movie downloaded from a distant host, we may soon stream only a script, which is then generated into a movie or game on the client or at the edge.
In other words, in the prior era of game engines, game scripts and assets were fed into the game engine to produce the game. In the new era, prompts will be fed into an AI engine and a movie or game will be generated on the fly.
This may restructure the way games and movies are streamed and enable completely new ways to play and watch them.
This new era of AI streaming may strongly benefit Protocol Labs: the very nature of compression itself is being disrupted.
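As a rough illustration of what “streaming a script instead of a movie” might look like, here is a hedged sketch of a generative stream manifest: the host ships a small scene list and the client’s local model renders it on the fly. The manifest format and every field name here are my own invention for illustration, not an existing standard.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Scene:
    prompt: str        # what the client should generate
    seed: int          # keeps every client's render identical
    duration_s: float  # how long the generated clip should run


@dataclass
class GenerativeStreamManifest:
    """What travels over the wire: kilobytes of text instead of gigabytes of video."""
    title: str
    model_id: str      # content-addressed model the client must already hold
    scenes: List[Scene] = field(default_factory=list)


manifest = GenerativeStreamManifest(
    title="Episode 1",
    model_id="bafy-example-video-model",  # hypothetical identifier
    scenes=[
        Scene("wide shot of a desert city at dawn", seed=7, duration_s=12.0),
        Scene("close-up of the protagonist boarding a train", seed=8, duration_s=9.5),
    ],
)

# A client at the edge would walk the scene list and call its local generator
# for each entry, producing the "movie" on demand.
for scene in manifest.scenes:
    print(f"render {scene.duration_s:>5.1f}s from seed {scene.seed}: {scene.prompt}")
```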
Generative AI represents an unblockable attack vector
Generative AI is a truly perfect development for Protocol Labs to achieve its Master Plan, far beyond what most people in the market are currently thinking.
It isn’t that Amazon, Microsoft and Google can’t match the functionality that customers desire; it is that they are bound by a set of rules which will prevent them from ever being able to service the true demand.
Let me spell this out:
- Protocol Labs’ biggest barrier to trust and adoption is that people are not comfortable shipping their data and compute to a volunteer network of untrusted computers located “god knows where”
- Generative AI eliminates this barrier because the code and assets generated are cheap, arbitrary and highly abundant, and carry lower security requirements (because they can be regenerated on the fly)
- Generative AI will also result in tremendous overproduction of heavyweight video, AI art, voice, music and other assets. Because it is now so easy to generate assets, demand for storage and compute will skyrocket!
- Protocol Labs’ offering of “our storage is SIGNIFICANTLY cheaper, at the expense of not being operated by a centralized authority” is the perfect disruptive offering. Many customers will use Protocol Labs to store AI-generated assets and will want cheap access to GPUs to generate new ones.
- Because AI-generated assets are produced so easily, they are nearly “throw-away” in nature. In other words, customers generating these assets will be far less sensitive to allowing them to be stored, or generated on the fly, using decentralized nodes. In many cases, AI will be the one generating these assets, and AI cares less about their security and privacy than humans do.
- Generative AI performed via decentralized networks will be freer from censorship, making it possible to unleash completely new markets via Protocol Labs technologies which competitors will refuse to match
In other words: We are looking at the perfect storm for Protocol Labs to dominate both decentralized storage and computation.
More Tailwinds
Other tailwinds include the upcoming launch of their Filecoin Virtual Machine, the advent of the DePIN market (decentralized physical infrastructure networks), the downturn in the use of GPUs for mining, and the successful traction of Filecoin (rising demand for decentralized storage for NFTs and other assets).
Energy Efficiency and Silicon Supply Glut Are Good
Another powerful advantage that Protocol Labs has is the increasing discussion and emphasis on energy efficiency.
AMD CEO Lisa Su, for example, has gone on record stating that energy efficiency is the only metric that matters for compute in the coming years. My hope is that Protocol Labs is able to position itself in the renewable energy market in a major way.
I believe that Protocol Labs is in a position to make better use of the full installed footprint of Arm processors, which offer a more favorable energy-consumption profile. The Filecoin Green platform is a step in the right direction. They can go much further.
Silicon vendors are on record stating that they have been forced to restrict supply of CPUs and GPUs to maintain pricing in the current market as we are facing a glut of components.
This means that the hardware to perform decentralized computation has never been cheaper or more abundant, laying the groundwork for Compute Over Data (Protocol Labs’ approach to decentralized compute, which will pair with the Filecoin Virtual Machine).
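To show the shape such work might take, here is a hypothetical job description in the spirit of Compute Over Data: the job references its input data by CID, runs wherever that data (and idle hardware) already lives, and publishes its output back to the network. The field names below are illustrative only and are not an actual Protocol Labs interface.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ComputeOverDataJob:
    """Illustrative job spec: ship the computation to the data, not the data to the computation."""
    image: str              # container image holding the workload
    command: List[str]      # what to run inside it
    input_cids: List[str]   # content-addressed inputs already on the storage network
    publish_to: str         # where results are published (e.g. back onto IPFS/Filecoin)
    resources: Dict[str, str] = field(default_factory=dict)


job = ComputeOverDataJob(
    image="ghcr.io/example/image-generator:latest",  # hypothetical image
    command=["python", "generate.py", "--prompt-file", "/inputs/prompts.txt"],
    input_cids=["bafy-example-prompt-batch"],        # hypothetical CID
    publish_to="ipfs",
    resources={"gpu": "1", "memory": "8Gi"},
)

# A scheduler would match this spec against idle nodes that already hold (or can
# cheaply fetch) the referenced inputs; the glut of cheap CPUs and GPUs described
# above is exactly the supply side of that marketplace.
print(job)
```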
I believe that if Protocol Labs executes properly into the tailwinds they have here, they will come out in a highly superior position as the global decentralized compute and storage provider of choice.
Protocol Labs Must Focus On Raspberry Pi and NVIDIA Jetson
It is a big mistake, in my opinion, for Protocol Labs not to devote the majority of its efforts to enabling the Raspberry Pi and NVIDIA Jetson as the compute nodes of choice.
Why? They are cheap, very fast, energy efficient, semi-open source and massively widespread and available.
By focusing on these low cost compute nodes, Protocol Labs can make the argument that the generative AI performed on their networks will be “Greener” than other sources.
Protocol Labs should, in my opinion, dedicate significant ecosystem-enabling energy towards unlocking the Raspberry Pi as the default decentralized compute node of choice.
I would like to see a lot more “broad market developer enabling” in the form of workshops, hackathons and connected, practical solutions built with developers on NVIDIA Jetson and Raspberry Pi. The value proposition of Protocol Labs technology is currently too abstract. I would like to see efforts to ground that value proposition in use cases most hobby developers can directly understand and engage with on a personal level. Then, and only then, once a sufficient “army” of IoT developers understands the value, would I expect to see true exponential growth in developer adoption.
My experience from AI, AIoT and IoT, as well as edge computing, is that most developers like to see use cases they can test out in their home labs. Protocol Labs should prioritize projects and activations which whet these developers’ appetites early and pull them into the ecosystem.
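As one concrete example of the kind of home-lab project I have in mind: a Raspberry Pi running a local Kubo (go-ipfs) daemon can add and fetch content through the daemon’s HTTP RPC API in a handful of lines. This sketch assumes the daemon is running on its default API port and that a file called sensor_log.csv exists; error handling is omitted.

```python
import requests

# Kubo (go-ipfs) exposes an HTTP RPC API on port 5001 by default.
API = "http://127.0.0.1:5001/api/v0"

# Add a file from the Pi's local disk to IPFS and pin it locally.
with open("sensor_log.csv", "rb") as f:
    resp = requests.post(f"{API}/add", files={"file": f})
resp.raise_for_status()
cid = resp.json()["Hash"]
print("added as", cid)

# Read the same content back by CID, from this node or any peer that has it.
content = requests.post(f"{API}/cat", params={"arg": cid})
print(content.content[:80])
```

A weekend project like this (pinning sensor logs from a home automation rig, for example) is exactly the kind of tangible, low-effort entry point that can turn an IoT hobbyist into an ecosystem developer.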
Ecosystem Gaps & Marketing Problems
I see significant gaps in Protocol Labs’ current ecosystem. In particular, I would like to see far greater alignment and involvement with major silicon providers such as Qualcomm, Intel, Arm, NVIDIA and others.
I also see huge problems in how Protocol Labs is messaging and communicating with the existing market. The gap between what the “Web2 boomers” are saying and thinking and what is happening in Web3 is extremely large.
I believe Protocol Labs should closely analyze the existing verbiage of the Web2, silicon and cloud-native markets and align its language more closely with how these existing markets are thinking.
Web3 thinking and messaging is a Looney Tunes space of made-up verbiage and abstract thinking. I believe Protocol Labs should not distance itself too far from the existing market; it risks alienating major potential partners and customers by straying too far from the mark.
Security, Trust, Privacy: Protocol Labs’ Achilles’ Heel Is Turning Into Its Greatest Strength
I believe that the number one barrier to Protocol Labs’ success in compute and storage adoption has turned from a weakness into a strength.
The rise in demand for “jailbroken AI” is a perfect match for Protocol Labs’ future plans for decentralized compute. Likewise, AI-generated assets will be more abundant and, in my opinion, the need to secure them will be lower.
For this reason, cheap and abundant “volunteer” compute networks will be highly desirable for the Generative AI market.
Deliberate Ecosystem: What Should Protocol Labs Focus On (Advantages)
There are a set of forward looking steps which I believe Protocol Labs should take in order to lay the groundwork for a decentralized compute future based on Filecoin Virtual Machine and other technologies.
I will outline a list of them here:
- Persuade the majority of Filecoin ecosystem partners building user interfaces on top of Filecoin to add native compute functionality. The most likely source of compute workload demand will come organically from these partners, who are already servicing customers with decentralized storage demands.
- Build a foundational ecosystem of edge computing and co-location providers to offer a decentralized compute marketplace. These include Equinix, Digital Realty and other bare-metal providers. I believe that Protocol Labs’ next generation of platforms resembles the vision Equinix has advanced of an open compute mesh network. I believe Protocol Labs is in an ideal position to enable this vision, perhaps via a deal or deals with enterprises like Equinix. Protocol Labs should perhaps embrace and extend the vision of Equinix Fabric.
- Protocol Labs must enable the Raspberry Pi broadly as a decentralized compute node as their #1 priority followed by the NVIDIA Jetson.
- Innovate novel use cases in Generative AI to drive demand for decentralized storage. Generative AI should be a powerful new growth vector which will skyrocket demand for both decentralized compute and also decentralized storage.
- Generative AI, by its nature, will also skyrocket demand for compute. Every available GPU and CPU which is sitting idle represents a form of waste. In an ideal world, Protocol Labs technology will skyrocket efficiency of extant compute by increasing the total mobilization of global capacity via decentralized contracts.
- Position Protocol Labs technology as a leading sustainability, climate and energy enabler. An open compute network that prioritizes sustainable sources of compute can help achieve this.
- I believe that Protocol Labs should form a unique body, in collaboration with global silicon providers, edge computing ecosystems and the United States government, on the topic of a national compute grid designed to further national security.
National Compute Grid and Supply Chain Hell
The United States has been taking significant action to secure its access to microprocessors, viewing these chips as critical national security assets.
I believe that they have not gone far enough.
It is not enough to simply enable access to chips by investing in on-shoring and friend shoring. The next phase of action should involve improving the efficiency of the processors we do have access to.
In my opinion, market pricing of compute should be a national security imperative. In other words, passing laws that require datacenters to do a better job of listing their available resources on an open compute grid or open compute marketplace is a critical step toward better ensuring national security.
If the United States both has the best access to semiconductors and ensures that the semiconductors we do have are used most effectively, these two factors together will maximize our national security by keeping a greater total share of compute within our borders.
This open marketplace for compute (and storage), which I am calling “The National Compute Grid,” may best be accomplished by a structure similar to Protocol Labs’ Compute Over Data concept, paired with the Filecoin Virtual Machine.
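To make the open compute grid idea slightly more concrete, here is a hedged sketch of what a capacity listing on such a marketplace might contain. The schema is entirely hypothetical; it is meant only to show the kind of information datacenters would publish and how transparent listings enable market pricing.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class CapacityListing:
    """Hypothetical record a datacenter would publish to an open compute grid."""
    provider_id: str            # who is offering the capacity
    region: str                 # where it physically sits (relevant for a national grid)
    cpu_cores: int
    gpus: int
    storage_tb: float
    price_per_cpu_hour: float   # open, market-set pricing
    renewable_fraction: float   # share of power drawn from renewable sources


listings: List[CapacityListing] = [
    CapacityListing("dc-east-01", "us-east", cpu_cores=4096, gpus=64,
                    storage_tb=900.0, price_per_cpu_hour=0.012, renewable_fraction=0.8),
    CapacityListing("dc-west-07", "us-west", cpu_cores=2048, gpus=128,
                    storage_tb=450.0, price_per_cpu_hour=0.015, renewable_fraction=0.6),
]

# Transparent listings let a scheduler (or a policy maker) see how much domestic
# capacity is actually idle, and route work to the cheapest or greenest slice of it.
cheapest = min(listings, key=lambda listing: listing.price_per_cpu_hour)
print("cheapest available capacity:", cheapest.provider_id, cheapest.price_per_cpu_hour)
```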
Closing Thoughts
I am strongly bullish on Protocol Labs. However, unlocking the full potential here will require massive ecosystem innovation. I am not sure who Protocol Labs has on staff focusing on ecosystem growth and innovation, but whoever it is, they are going to have to overcome massive scale challenges very quickly, enable the right use cases and move as fast as possible to capture this market.
The bets and ecosystems Protocol Labs builds today will be the foundation of its future success.