I've been sending a nice 10GB gzip bomb (12MB after compression, rate-limited download speed) to people who send various malicious requests. I think I might update it tonight with this other approach.
I could, at the expense of a lot of bandwidth. /dev/urandom doesn't compress, so to send something that would consume 10GB of memory, I'd have to use up 10GB of bandwidth. The 10GB of /dev/zero that I return in response to requests takes up just 11MB of bandwidth. Much more efficient use of my bandwidth.
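For a rough sense of that asymmetry, here's a small sketch (sizes are illustrative: 1 MiB buffers rather than the full 10GB):

```python
import os
import zlib

zeros = b"\x00" * (1 << 20)   # 1 MiB of zeros: compresses extremely well
noise = os.urandom(1 << 20)   # 1 MiB of random bytes: effectively incompressible

packed_zeros = zlib.compress(zeros, 9)
packed_noise = zlib.compress(noise, 9)

print(len(packed_zeros))  # on the order of a kilobyte, a ~1000:1 ratio
print(len(packed_noise))  # slightly *larger* than the input, due to framing overhead
```

So a zero-filled bomb costs roughly a thousandth of its decompressed size in bandwidth, while anything built from /dev/urandom costs bandwidth byte-for-byte.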
A more effective (while still relatively efficient) alternative would be to have a program that returns an infinite gzip-compressed page. That'll catch anyone who doesn't set a timeout on their requests.
I don't imagine it would be too difficult to write a Python app that dynamically creates the content: just have the returned content be the output of a generator. Not sure it's worth it though :)
I had a few minutes. This turns out to be really easy to do with FastAPI:
```python
from fastapi import FastAPI
from starlette.responses import StreamingResponse
from fastapi.middleware.gzip import GZipMiddleware

app = FastAPI()
app.add_middleware(GZipMiddleware, minimum_size=0, compresslevel=9)

def lol_generator():
    # yield "LOL\n" forever; the middleware compresses the stream on the fly
    while True:
        yield "LOL\n"

@app.get("/")
def stream_text():
    return StreamingResponse(lol_generator(), media_type="text/plain")
```
Away it goes, streaming gzip-compressed "LOL" to the receiver, and it will continue for as long as they want it to. I guess either someone's hard disk fills up, they OOM, or they are sensible and have timeouts set on their clients.
Probably needs some work to ensure only clients that accept GZIP get it.
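A sketch of that check (a hypothetical helper, not part of FastAPI; a full version would also honor q-values, so `gzip;q=0` is miscounted here):

```python
from typing import Optional

def client_accepts_gzip(accept_encoding: Optional[str]) -> bool:
    """Crude Accept-Encoding parse: True if 'gzip' or '*' appears as a token."""
    if not accept_encoding:
        return False
    tokens = (t.split(";")[0].strip().lower() for t in accept_encoding.split(","))
    return any(t in ("gzip", "*") for t in tokens)
```

In the handler you could take a `request: Request` parameter, pass `request.headers.get("accept-encoding")` to this, and return a short plain response instead of the stream when it comes back False.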
Yikes, the gzip stdlib module is painfully slow in Python. Even by "I'm used to Python being slow" standards, and even under PyPy. Even if I drop it down to compresslevel=5, I'm more likely to consume all of my own CPU than the target's memory.
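One way to cut the middleware overhead (a sketch, not the author's code) is to drive zlib's streaming API directly and serve pre-compressed bytes: `wbits=31` selects gzip framing, and a sync flush per chunk forces complete blocks out to the socket at a small cost in ratio:

```python
import zlib

def gzip_bomb_stream(chunk: bytes = b"LOL\n" * 4096, level: int = 5):
    """Endless gzip byte stream of a highly repetitive chunk."""
    # wbits=31: 15-bit window plus gzip header/trailer framing
    comp = zlib.compressobj(level, zlib.DEFLATED, 31)
    while True:
        # Z_SYNC_FLUSH emits complete deflate blocks, so each yield is non-empty
        yield comp.compress(chunk) + comp.flush(zlib.Z_SYNC_FLUSH)
```

You could then drop GZipMiddleware and return `StreamingResponse(gzip_bomb_stream(), media_type="text/plain", headers={"Content-Encoding": "gzip"})`; since the input is so repetitive, the compressor's per-byte work stays low.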
A quick port to Rust with Gemini's help has it running significantly faster, with a lot less overhead.