Archive.today CAPTCHA Generates Sustained DDoS-Level Traffic

Published February 2026 · Traffic Analysis · Web Security

A closer inspection of archive.today reveals that its CAPTCHA page runs client-side code that repeatedly sends automated requests to a third-party blog every few hundred milliseconds — behavior consistent with a sustained DDoS-style traffic pattern.

What is happening

When a user opens the archive.today CAPTCHA page, a small JavaScript loop executes in the browser. As long as that page remains open, the script continuously sends requests to a specific blog’s search endpoint.

This is not a one-time request. It repeats multiple times per second, and each request carries a freshly randomized query value, which defeats caching and forces the target site to process every request in full.
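The cache-busting mechanic can be sketched as follows. This is a minimal illustration of the pattern described above, not the actual archive.today code; the URL and function name are placeholders.

```javascript
// Sketch of a cache-busting URL generator: a fresh random token makes
// every URL unique, so neither the browser cache nor a CDN can serve a
// stored response -- the origin server must handle each request itself.
function cacheBustingUrl(base) {
  const token = Math.random().toString(36).slice(2); // e.g. "k3f9x2..."
  return `${base}?s=${token}`;
}

// Two consecutive calls produce different URLs, so no cache layer
// recognizes them as the same resource.
console.log(cacheBustingUrl("https://example.org/"));
console.log(cacheBustingUrl("https://example.org/"));
```

Because the query parameter targets the blog's search endpoint, each uncached request also triggers a database query on the origin, which is considerably more expensive than serving a static page.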

Non-technical explanation

For non-technical readers: imagine thousands of visitors unknowingly refreshing a blog’s search page over and over again, every second. Even if each visitor sends only a few requests, the combined effect can overload a small site.

Observed script behavior

setInterval(function () {
  // A new random value on every request prevents any cache from answering.
  fetch("https://target-site.example/?s=" + Math.random());
}, 300); // fires roughly every 300 ms for as long as the page stays open

At roughly three requests per second per open page, this pattern creates continuous load — a key characteristic of denial-of-service traffic.

Why this matters

Sustained automated traffic can slow sites, spike hosting costs, trigger provider suspensions, or cause full outages — especially for personal blogs and small publishers without enterprise-level protection.

Community discussion

After the behavior was documented, the issue sparked wide discussion on Hacker News and Reddit. Users reviewed screenshots and code samples, and debated responsibility, intent, and the risks posed by third-party client-side scripts.
