The Internet is a Thin Cylinder: Supporting Millions, Supported by One
The internet is often seen as a busy place full of charm, but that is only a glimpse of it.
The other places? Either no one knows they exist, or the only things that know them are the bots.
Why is this an actual thing?
You might be thinking: how could this happen? As of 2026, there are at least 1 billion websites, and if each of them has just a few links to others, everyone should be able to find and visit them.
But this is not the case. The reason is not that people cannot find these sites; it is that they ignore them, because the sites have been washed away by the high-speed development of the Internet.
Some of these sites and tools were legendary in their day. But once new tools and technologies arrived, they were quickly forgotten, leaving only a few people quietly maintaining them.
An example of this
There are millions of examples of old websites that have been forgotten by time. We can take GNU nano as a fairly neutral example: not too active, but not completely dead.
It is not a dead project; in fact, it is quite active. But active does not mean it has a big community: a few people can keep a project alive just by writing code, but they cannot create its charm, because charm requires more than code.
If you look at nano's git log or its mailing list, you will find that the only person contributing heavily to the project is its maintainer.
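As a rough sketch of how you might check this yourself, the snippet below tallies how concentrated a project's commits are. It assumes you have already extracted author names, for example with `git log --format='%an'`; the sample data here is hypothetical, not nano's real history.

```python
from collections import Counter

def contribution_share(authors):
    """Given a list of commit author names (e.g. the lines printed by
    `git log --format='%an'`), return each author's share of total
    commits, most prolific first."""
    counts = Counter(authors)
    total = sum(counts.values())
    return [(name, n / total) for name, n in counts.most_common()]

# Hypothetical sample: one maintainer dominating the log.
sample = ["Maintainer"] * 90 + ["Drive-by contributor"] * 7 + ["Doc fixer"] * 3
for name, share in contribution_share(sample):
    print(f"{name}: {share:.0%}")
```

Run against a real clone, a distribution like this, with one name holding the vast majority of commits, is the "supported by one" pattern this post is about.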
And this project is a text editor that millions of programmers use every day, a very important tool that I also use a lot. I am very thankful its creator built it; at the time, they probably never knew it would become this big.
I am not saying that this, or anything else I mention in this post, is bad. Sometimes these projects are simply "finished", with few remaining bugs; a handful of people is enough to keep them going. And sometimes working alone is just what people prefer, including me.
I chose this project as an example not because it is the worst off; there are thousands of projects with no one maintaining them at all. I chose it because it represents the very fragile state of technology: supporting millions of people, supported by one person.
New sites are affected too
Nano is an old project, but old projects are not the only lonely ones; new ones are lonely too, though in a different way.
Technology only grabs attention when it can actually do something, or when it is interesting. Projects like Nginx and Chromium serve a very clear purpose, so people contribute to them and create an active environment.
But not all websites and projects provide this kind of engagement and interest. Take my own website at vyang.org as an example: it is a simple site about myself.
I get a few hundred unique visitors requesting my site; most of them are robots or Internet-wide scans probing for vulnerabilities. Almost 70% of my requests are made by bots.
This gives you the false feeling that your website is visited by a lot of "people", when most of those "people" are not people at all.
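To make the 70% figure concrete, here is a minimal sketch of how you might estimate the bot share from a server log. The User-Agent markers and the sample log entries are assumptions for illustration, not an exhaustive or authoritative bot list.

```python
import re

# Rough heuristic: substrings that commonly appear in automated clients'
# User-Agent strings. Real-world detection needs a much longer list.
BOT_PATTERN = re.compile(r"bot|crawl|spider|scan|curl|python-requests|zgrab",
                         re.IGNORECASE)

def bot_share(user_agents):
    """Fraction of requests whose User-Agent looks automated."""
    if not user_agents:
        return 0.0
    bots = sum(1 for ua in user_agents if BOT_PATTERN.search(ua))
    return bots / len(user_agents)

# Hypothetical log sample: one human browser, three automated clients.
sample = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/140.0",
    "Mozilla/5.0 (compatible; Googlebot/2.1)",
    "python-requests/2.32.0",
    "zgrab/0.x",  # a typical Internet-wide scanner signature
]
print(f"{bot_share(sample):.0%} of requests look automated")
```

On a small personal site, a quick pass like this over the access log is often enough to reveal that most "visitors" are crawlers and vulnerability scanners.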
This has hurt the creativity of the Internet. Most people now move to big platforms to post their work, unlike the Internet of before, where we had to roam the whole web to find the solution to a problem.
Because of this, the Internet is becoming highly standardized: now, whenever people try to build something, everything looks the same.
Looking back, what we have now
I know there are a lot of problems on the Internet. It is growing fast, and it is very common that we throw things away when we no longer need them.
The issue is that when we throw things away, we do not stop depending on them. While attention moves to new projects, the use of old ones never stops, even as their development slows or halts.
We constantly invent new things and shift our attention to them, but we never stop using the old ones, so a lot of old code ends up constraining the development of the new.
At this point I want to reference the meme picture that circulated during the Cloudflare and AWS outages, the tower of blocks balanced on one thin cylinder, except here the thin cylinder is the old software.
But the worst part is that when the fragile piece is AWS or Cloudflare, there are engineers on call 24/7 to fix the problem; for old software, there may be no one who can fix it in time.
How do we solve it? Or do we need to solve it?
This is basically the entire post. What I want to say is that all of this happened because the Internet developed too fast: people did not have time to absorb all the information and software already created before new things appeared.
What I think is that we should care about the old software that we use every day. You should always remember that behind this software, there are often only one or a few people keeping it working.
The fuller, more detailed version of this blog, containing more of my opinions, is available on vyang.org.