Web3 Conversations

Conversations around Web3, otherwise called the decentralized web or dWeb, have been happening for a long time, and they have only increased in number of late. For example, here's an excerpt from a post I came across on Opensource.com from 2012.

Deb Nicholson is worried about the future of our data.

They're being stockpiled by organizations whose ability to protect them is diminishing, but whose ability to keep users from accessing or transporting them is increasing. Bit by bit, these data are being centralized--gathered in large repositories outside users' control and sequestered there when competing products or services can't access or read them.

Nicholson, community outreach director for Open Invention Network and community manager for the MediaGoblin project, says movement in the opposite direction--toward decentralization--is necessary for a more open, safe, and competitive future. And in "We Are Legion: Decentralizing the Web," a presentation she delivered today at SouthEast LinuxFest in Charlotte, NC, Nicholson explained how open source projects are at the forefront of attempts to realize this future.

"I'd like to see more people working on decentralized services," Nicholson said.

Centralization's consequences are myriad, Nicholson noted. It can lead to monopoly, as firms with singular control over key pieces of software or data sets can almost completely control how those resources are used. This can lead, in turn, to censorship when entities with complete control of an infrastructure define precisely which kinds of communication are permissible via that infrastructure (Nicholson reminded listeners that the U.S. Postal Service refuses to transport "obscene" materials, though the definition of "obscene" has shifted repeatedly throughout the 20th and 21st centuries). And centralization can accelerate media consolidation, homogenizing the media landscape and narrowing the number of perspectives available to consumers and citizens.

And here's an excerpt from a 2018 post in The Guardian:

With the current web, all that user data concentrated in the hands of a few creates risk that our data will be hacked. It also makes it easier for governments to conduct surveillance and impose censorship. And if any of these centralised entities shuts down, your data and connections are lost. Then there are privacy concerns stemming from the business models of many of the companies, which use the private information we provide freely to target us with ads. “The services are kind of creepy in how much they know about you,” says Brewster Kahle, the founder of the Internet Archive. The DWeb, say proponents, is about giving people a choice: the same services, but decentralised and not creepy. It promises control and privacy, and things can’t all of a sudden disappear because someone decides they should.

Web3 has been receiving increased coverage ever since NFTs started attracting attention: OpenSea raised $300 million in funding at a valuation of $13 billion, and it also banned a handful of NFTs. As the 2012 excerpt shows, calls to move the internet from its present centralized architecture to a decentralized one have been around for a long time. The 2018 excerpt from The Guardian adds more detail to the earlier one. The difference between the two is that when Nicholson presented her thoughts in 2012, she predicted that the centralized internet would create monopolies; by the time of The Guardian's post, the internet had already created them.

Web1 introduced HTTP-based communication, and its content consisted of static web pages. If you owned a website back then, you probably installed your own web server, and for email you would have set up a mail server too. That is not an ordeal for someone with a technical background, but it was a great inconvenience for a layperson: one had to invest time in learning the nuances of setting up and maintaining a website. This inconvenience created the opportunity for Web1 to evolve. The convenience of collaborating, communicating, or quickly setting up a website for one's core business or services without having to understand the back-end technicalities simply did not exist in the Web1 era. Web2, with its centralized architecture, changed that. Providing convenience became the primary motivating factor, and it pushed the internet toward a multifaceted centrality built on the client-server communication structure.

The centralized architecture of Web2 has nurtured companies such as Facebook, Amazon, Microsoft, and Google, which provide centralized services (e.g., cloud data storage and application hosting) on which other businesses have built their own platforms and services. This has translated into a democratization of the benefits the internet furnishes. As a result, people can access services universally with ever-growing convenience, leading to a proliferation of online applications. Furthermore, the evolution of the internet into the centralized web paved the way for technologies such as mobile devices and the Internet of Things, bringing yet more convenience to internet users, both consumers and businesses. The flip side is that while these centralized services undoubtedly provided convenience and sped up the internet's maturation, they also led to the monopolization of online services. That monopolization, together with the centralized architecture, has translated into risks and dependencies for users.

As the monopolies grow in size and number, they pull in more resources, both people and infrastructure, which helps them develop their services further and offer users still more convenience. At the same time, other initiatives and innovations can be crowded out, and the risks and dependencies keep accumulating. These dynamics of Web2 form the premise for the Web3 (decentralized web, or dWeb) conversation.

Consider the following excerpt from The Guardian:

There are two big differences in how the DWeb works compared to the world wide web, explains Matt Zumwalt, the programme manager at Protocol Labs, which builds systems and tools for the DWeb. First, there is this peer-to-peer connectivity, where your computer not only requests services but provides them. Second, how information is stored and retrieved is different. Currently we use http and https links to identify information on the web. Those links point to content by its location, telling our computers to find and retrieve things from those locations using the http protocol. By contrast, DWeb protocols use links that identify information based on its content – what it is rather than where it is. This content-addressed approach makes it possible for websites and files to be stored and passed around in many ways from computer to computer rather than always relying on a single server as the one conduit for exchanging information. “[In the traditional web] we are pointing to this location and pretending [the information] exists in only one place,” says Zumwalt. “And from this comes this whole monopolisation that has followed… because whoever controls the location controls access to the information.”
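To make Zumwalt's contrast concrete, here is a minimal sketch of the idea of content addressing. It is not how IPFS or any particular DWeb protocol is actually implemented, just an illustration in TypeScript using Node's built-in crypto module: the address of a piece of content is a hash of its bytes, so any peer that holds the same bytes can answer the request, and a tampered copy is detectable because it no longer hashes to its own address.

```typescript
// Illustrative content-addressed store: the key is derived from the content.
import { createHash } from "crypto";

// Stand-in for storage that could be spread across many peers.
const store = new Map<string, Buffer>();

// "Publish": the address is the SHA-256 digest of the bytes themselves.
function put(content: Buffer): string {
  const address = createHash("sha256").update(content).digest("hex");
  store.set(address, content);
  return address;
}

// "Fetch": any copy whose bytes hash to the address is, by definition, the
// content we asked for, no matter which peer supplied it.
function get(address: string): Buffer | undefined {
  const content = store.get(address);
  if (content === undefined) return undefined;
  const check = createHash("sha256").update(content).digest("hex");
  return check === address ? content : undefined; // reject tampered copies
}

const addr = put(Buffer.from("hello, dweb"));
console.log(addr);                   // address depends only on the bytes
console.log(get(addr)?.toString()); // "hello, dweb"
```

Contrast this with an http link such as https://example.com/page, which names a location: whoever controls example.com controls both the content and access to it. In a content-addressed system the same address can be answered by any peer that has the bytes, which is the property Zumwalt is pointing to.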

Essentially, Web3 is an effort to provide the convenience and benefits of Web2, but through decentralized services. These twin factors, convenience and dependency, set the stage for the next section.

NPM Corruption

On January 9, 2022, BleepingComputer reported that the developer of the NPM libraries 'colors' and 'faker' had intentionally corrupted them.

From BleepingComputer:

Users of popular open-source libraries 'colors' and 'faker' were left stunned after they saw their applications, using these libraries, printing gibberish data and breaking.

Some surmised if the NPM libraries had been compromised, but it turns out there's much more to the story.

The developer of these libraries intentionally introduced an infinite loop that bricked thousands of projects that depend on 'colors' and 'faker.'

The colors library receives over 20 million weekly downloads on npm alone and has almost 19,000 projects relying on it. Whereas, faker receives over 2.8 million weekly downloads on npm, and has over 2,500 dependents.

The reason behind this mischief on the developer's part appears to be retaliation—against mega-corporations and commercial consumers of open-source projects who extensively rely on cost-free and community-powered software but do not, according to the developer, give back to the community.

In November 2020, Marak had warned that he will no longer be supporting the big corporations with his "free work" and that commercial entities should consider either forking the projects or compensating the dev with a yearly "six figure" salary.

"Respectfully, I am no longer going to support Fortune 500s ( and other smaller sized companies ) with my free work. There isn't much else to say," the developer previously wrote.

"Take this as an opportunity to send me a six figure yearly contract or fork the project and have someone else work on it.

What is evident from this episode is that the open-source community has created both dependencies and convenience for its users. Open-source licenses allow businesses to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software; the same licenses also allow the developers of that software to do with it as they please. Businesses, irrespective of their size, have grown to rely on the open-source community's libraries because this brings convenience to their software development practice. It has also created dependencies that carry risks such as the one described in the excerpt above.
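One practical, if partial, mitigation follows directly from the incident above: do not let a build silently pull in whatever the newest release of a dependency happens to be. The fragment below is an illustrative package.json sketch, not a recommendation of specific releases; the version numbers shown are placeholders, and the actual known-good versions would have to be checked against the advisories for the incident.

```json
{
  "dependencies": {
    "colors": "1.4.0",
    "faker": "5.5.3"
  }
}
```

Pinning exact versions (no ^ or ~ range) and committing the package-lock.json, then installing with npm ci rather than npm install, means a newly published, corrupted release is not picked up automatically. It does not remove the dependency itself, which is the deeper point of this section, but it narrows the window in which an upstream change can break or subvert downstream builds.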