The domination of Internet platforms is not destined to last


There is a widely held belief that internet platforms are too big to be dislodged from their current dominant positions. Many believe that we have no choice but to apply the full force of competition regulation to protect consumers from the harm that results from their size and market power.

But are these platforms really too big to fail? Or are we in the middle of a cycle that has already repeated itself and will continue to repeat itself in the future?

When I first accessed the Internet, it was through VSNL’s Gateway Internet Access service, which offered a 9.6 kilobits per second connection over notoriously spotty dial-up lines. Even under these constraints, it was obvious to early adopters like me that the Internet was a vast storehouse of extraordinarily diverse and useful content.

At that time, the World Wide Web was little more than a loosely linked collection of HTML pages. This meant that it was really hard to find the information you needed unless you knew the URL of the specific web page it was stored on.

I’ve previously written about Archie, the world’s first Internet search engine, which attempted to solve this problem by indexing pages by their titles. In the beginning, browsing the Internet meant wading through Archie’s indexes, hoping to infer a page’s contents from its title.

The inherent limitations of this approach prompted organizations like Yahoo and AltaVista to invest heavily in curation. They hired armies of librarians to organize the Internet, who personally visited hundreds of thousands of websites to manually sort them into a hierarchy that could yield more effective search results. This is how we browsed the web when I first went online. But it was already apparent that, given the speed at which the internet was growing, human curation would very soon be unable to keep pace.

In 1996, news began filtering through the academic grapevine of a magical new search engine that could generate highly relevant results by ranking pages based on the back-links pointing to them. When I first used Google, it was as magical as promised. It surfaced more relevant information than Yahoo or AltaVista had ever been able to suggest. This upstart search engine was so confident in the accuracy of its results that it had a button on its search page that bypassed the results entirely and took you straight to the website of its top-ranked listing. And it rarely disappointed.
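The intuition behind back-link ranking can be captured in a toy sketch: a page is important if important pages link to it, and the scores can be computed by simple iteration. This is an illustrative simplification in the spirit of PageRank, not Google’s actual implementation; the example graph and parameter values here are hypothetical.

```python
# A toy link-based ranking sketch (PageRank-style, illustrative only).
# A page's score is built up from the scores of pages that link to it.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # Each page shares its score equally among the pages it links to.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # A page with no outgoing links spreads its score evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C ranks highest here: it collects links from both A and B.
```

Running this on the toy graph, page C comes out on top because two pages point to it, which is exactly the signal that made Google’s results feel so much more relevant than title- or directory-based search.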

Since then, search has been our primary means of accessing the Internet. But even though it has served us well for over two decades, I recently began to feel something was missing in its quality and completeness. Faced with diverse and often conflicting sources of information, I found it increasingly difficult to locate content I could trust.

This forced me to turn to curation again. Wirecutter is now my first port of call for product reviews, though I often head straight to even more niche websites for things I’m passionate about – DPReview.com (for photography gear), HeadFi.org (for premium headphones) and wholelattelove.com (for coffee).

As good as algorithmic recommendations often are, I prefer to find new artists from playlists manually assembled by friends who share my musical tastes. I stopped relying on online book recommendations unless they came from reading lists of people I admire or were featured on a podcast I subscribed to. I’ve gotten to the point where I’m more confident of getting a sufficiently diverse range of relevant viewpoints on a given issue from Substack authors and Reddit threads than from generic search results.

Benedict Evans summed it up perfectly in a tweet: “All curation grows until it requires search. All search grows until it requires curation.”

It is an inescapable truth that technology evolves in cycles, often swinging like a pendulum between extremes. If it feels like the tech giants of the day are unassailable, it’s only because we can’t yet see the upstarts lurking just around the corner who are going to be their downfall.

Take the Internet itself. Although it was originally designed to be open, our access to it today is almost exclusively mediated by services that determine what we can see. All the content, commerce, entertainment and social connections we consume are pre-packaged into endlessly scrolling streams of information, an online experience that is a far cry from the original open vision of the internet.

I’m not mentioning this to disparage these platforms or decry the current state of the internet, but because I think the pendulum has already started swinging in the opposite direction. As much as the past two decades saw the once-open internet centralized into the hands of a few, over the past five years it has been impossible to ignore the decentralized alternatives – blockchain-based services and decentralized autonomous organizations – that have arisen as a counterpoint to this narrative of extreme centralization.

While I generally agree that we need proper regulations to protect consumers from harm, I’m still not convinced that we need them because Big Tech has become too big to fail. On the contrary, if history is anything to go by, we are probably on the cusp of a major cyclical transformation.

After all, incumbent technologies have always eventually been replaced by better ones.

Rahul Matthan is a partner at Trilegal and also has a podcast under the name Ex Machina. His Twitter handle is @matthan
