
Web 3.0 - Future or Buzzword?


CONTEXT: Web 3.0 attempts to address Web 2.0 challenges

WHY IT MATTERS: Web 3.0 will change the business landscape


We’ve all heard the term Web 3.0. For the last year or so, it has been thrown about a lot. Businesses and brands from San Francisco to Sydney have tried to get involved somehow, whether through development, investment, community building, or education. However, not everyone is on board. Jack Dorsey and Elon Musk have famously dunked on the concept. So, what exactly is Web 3.0? Is it as important as many propose, or is it just another buzzword business types use to sound ‘well informed’? Surely if so many people are talking about it and working on it, we can define it - right?

Sounds simple enough. The current consensus is that Web 3.0 is the next iteration of the internet, or the world wide web (WWW). But there doesn’t seem to be a single collective view of what that means. In contrast to the absolute, binary nature of the code the internet runs on, Web 3.0 remains somewhat vague. In some respects, the term represents what the future of computing might look like, and that is at the heart of why the vision for Web 3.0 can seem blurry. If no one can predict the future, how do we know what computing looks like years from now? The answer is: we don’t. So where did the term come from?

The term was apparently coined in 2014 by Gavin Wood. For those not familiar, Wood is a co-founder of Ethereum and the founder of Polkadot - both decentralized blockchain projects. In 2021 the term became synonymous with almost anything related to crypto. With the price of Bitcoin reaching more than USD 60k, it’s no wonder it entered the mainstream vocabulary. So, with the vision for Web 3.0 being relatively subjective, is there anything that proponents agree on?

Despite differing views, there are some common concepts and goals. These common characteristics include:

⁃ Decentralization
⁃ Trustless and permissionless networks
⁃ Self-sovereignty of data and funds
⁃ Search capabilities driven by machine learning and artificial intelligence
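To make the “trustless” idea above a little more concrete, here is a toy Python sketch of content addressing, the verification technique used by decentralized storage systems such as IPFS. It is an illustration only, not any particular Web 3.0 stack: the identifier of a piece of data is derived from the data itself, so anyone can check the content against its address without trusting whoever served it.

```python
import hashlib

def content_address(data: bytes) -> str:
    # Content addressing: the identifier is a hash of the data itself,
    # rather than a location (URL) controlled by a host.
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, address: str) -> bool:
    # Trustless verification: recompute the hash and compare.
    # No intermediary has to vouch for the data's integrity.
    return content_address(data) == address

page = b"hello, decentralized web"
addr = content_address(page)

print(verify(page, addr))               # True: the data matches its address
print(verify(b"tampered data", addr))   # False: any change breaks the match
```

Because the address is bound to the content, a peer in a decentralized network can fetch data from any untrusted node and still be certain it hasn’t been altered - one small example of removing the need for a trusted gatekeeper.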

So why are these features important to technologists? Understanding Web 1.0 and Web 2.0 makes things a little clearer.

Web 1.0 began in the late 1980s, when Tim Berners-Lee conceived a new way to manage information. The system would allow resources and information to be accessed over the internet, and it became known as the world wide web. The 1.0 era consisted mostly of static web pages of information retrieved from a server. Probably the most-used feature was email. Interaction as we know it today didn’t really exist. Web 1.0 was a tool for us to read and consume information served by someone else - a digital noticeboard, so to speak.

Web 2.0 ushered in a major shift that spurred mass global adoption. Unlike its predecessor, Web 2.0 became known for its connectivity. As computing power improved and the internet went mobile, users had far readier access to the internet. Seizing on these developments, companies such as Google, Meta (Facebook), Instagram, Twitter, and others built applications that allowed us to interact with each other in real time. Photo and video sharing, tweeting, and networking meant these companies experienced huge growth in users - as well as profits. We then quickly adopted other applications that dramatically challenged traditional business models. Apps such as Uber, Airbnb, Netflix, and Spotify became the norm. At its heart, some of Web 2.0’s defining features have been:

⁃ User-generated content
⁃ Social connectivity
⁃ Fragmented publishing
⁃ Platform-enabled commerce (gig economy)
⁃ On-demand entertainment
⁃ Real-time media and information

Sounds useful and practical. So why are proponents trying to move on from this era? The movement seems to stem from the control wielded by the major companies that successfully built these applications and platforms. In most cases they have graciously provided access for free. So why complain?

The models employed by big tech give them gatekeeper status over consumer behavior and spending. For example, try to develop an application these days and make it accessible to consumers without going through Apple’s App Store or Google Play. Because these stores are the gateways to mobile devices worldwide, developers have no choice but to give up a slice of their revenue - in some cases up to 30%. There have been antitrust lawsuits on the subject accusing these giants of anti-competitive practices.

Or take Google. For most of us, finding anything on the internet starts with them, and their algorithm decides what shows up in search results. And when it comes to video, let’s face it - YouTube, another Google-owned product, is king. Until the emergence of TikTok, it was effectively the only place to find all the major video content out there.

Let’s not forget Meta (formerly Facebook). It pretty much controls our social lives through its apps Facebook, Instagram, and WhatsApp. A lot of what we share runs through its gates.

Then there are the big banks. Why should institutions decide who can open a bank account or access credit? When it comes to engaging in global commerce, you wouldn’t get very far without a bank account to transact with.

Big corporations argue the benefits of such a system. However, large corporations with massive influence and power have never sat well with the masses. In a Western world dominated by free-market thinking, monopolistic practices are frowned upon in the name of fairness - Standard Oil, U.S. Steel, AT&T, and Microsoft are historical examples.

So, the driving force behind Web 3.0 is a democratization of sorts - an attempt to level the playing field. It aims not only to open the gates but to create many more of them: to rebuild the internet into an open tool, used and governed by its users, not by the application developers. The ideology of fairness and openness seems reasonable. But how it will be achieved is anyone’s guess.

So, what we know is that the future of the internet - Web 3.0, as some define it - should usher in changes that benefit users rather than large corporations. It should be accessible to everyone, with governance in the hands of the many. And it should be decentralized, with no single point of failure.

There seem to be more unknowns than knowns. Questions remain, such as:

⁃ Will future developments truly be decentralized, or will they just appear so?
⁃ Will the technologies be censorship-resistant?
⁃ How does one identify trustworthy information?
⁃ Is blockchain the technology solution that underpins it?
⁃ If blockchain is the answer, which blockchain or blockchains will best support it?
⁃ Can personal and company data safely be stored in a decentralized manner?
⁃ Will all organizations benefit from fewer intermediaries or do centralized concepts still play a role?
⁃ Does my organization need to be educated and upskilled on the technology and its use cases?

Reach out to us to chat more about what the shifting tech landscape means for your business. Find out if your company is ready for ‘Web 3.0’ or whatever it will be labeled when it’s mainstream and operational.
