Zipbombing bots and vuln scanners for fun and profit

So, we recently saw this blog post explaining how'a "zipbomb" vuln scanners n the like using gzip. Previously, what we had done was redirect things that looked like such bots (specifically, things requesting PHP or ASPX paths) onto one'a Hetzner's 10GB downloads on their speedtest sites. But, this seems even more effective, since we can make them download sth that unpacks into much more than 10GB (while having an innocuously small compressed size at first).

Now, the method as described in the original post had several issues: gzip's compression ratios ain't great for this, n a bare blob of repeated bytes doesn't exactly look like a real page either. So, we figured we'd improve on't a bit, w better compressors n a payload that at least looks like an HTML file. W that said, we'll demonstrate how we did't.

Generating the payloads

So, first we need a source other than /dev/zero for getting large amountsa the same character. We could just run tr on /dev/zero every time, but tr can be very slow/heavy on the CPU n apparently bottlenecks the faster compressors, in our experience. So, we only use't for generating a relatively small 1GB file:

dd if=/dev/zero bs=1G count=1 | tr '\0' a > aaa
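
Not part of the original recipe, but'f ya wanna double-check that worked, sth like this should show a 1GB file containing nothing but a's:

ls -lh aaa        # should show a ~1.0G file
head -c 16 aaa    # should print aaaaaaaaaaaaaaaa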

Additionally, we said we'd wanna make't look like an HTML file. So:

echo '<!DOCTYPE html><html><head><title>a</title></head><body>' > start
echo '</body></html>' > end

Next, on compression levels n final payload sizes.

Brotli's very slow at levels 10 n 11, but level 9's decently fast while still achieving good compression ratios, enough for compressing several TB into a few KB. We figured anything above 1TB'd prolly be overkill tho.

gzip's just meh overall: bad compression ratios n also rather bad speeds. For what it's worth, the highest level (9) didn't seema achieve better ratios here than the default (6), so we figured we'd just leave't on the default. W ratios this bad, anything that decompresses into more than 100GB also means a compressed file bigger than ya prolly wanna actually serve every time ya get scanned. So, we figured 100GB'd make a good size. I dunno'f Zopfli or ECT might achieve better ratios than gzip for this sorta file, but we can't seema make them read from stdin, so.

Zstd's the fastest compressor here (at least when using multithreading); even at the highest level (19), it's just way faster than the other two. The ratio's not as good as Brotli's tho, so we figured we'd settle for a compromise ‒ sth like 500GB should be good. Zstd's also the only one w its own progress indicator; for the other two we'd suggest adding some rudimentary progress indicator onto stderr like we did here.

for i in {1..100}; do cat aaa && echo "$i" 1>&2; done | cat start - end | gzip > bomb.gz # ~100GB of a's wrapped in the HTML start/end files
for i in {1..1000}; do cat aaa && echo "$i" 1>&2; done | cat start - end | brotli -9 -o bomb.br # ~1TB
for i in {1..500}; do cat aaa; done | cat start - end | zstd -T0 -19 -o bomb.zst # ~500GB, zstd shows its own progress
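
If ya wanna sanity-check the results before deploying them, sth like the following should do (just a suggestion on our end, using the file names from above). We mostly just peek at the start here, since fully decompressing these takes a while:

ls -lh bomb.gz bomb.br bomb.zst   # compressed sizes (gzip's will be the largest by far)
brotli -d -c bomb.br | head -c 64 # should print the HTML wrapper followed by a bunch of a's
zstd -d -c bomb.zst | wc -c       # full size check, streams all ~500GB so't takes a while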

Once that's done, just put the bomb.* files into a directory on ur webserver ‒ in the following examples we assume /srv/www/bombs, but anything that ur webserver can read should work.
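
Sth along these lines should do for that, assuming the /srv/www/bombs path from above (the empty bomb file's needed by Caddy, more on that in the next section):

mkdir -p /srv/www/bombs
mv bomb.gz bomb.br bomb.zst /srv/www/bombs/
touch /srv/www/bombs/bomb   # empty placeholder so Caddy actually serves the compressed variants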

Configuring the webserver

We use Caddy w Caddyfile configs, but ofc other config formats or webservers should work too; we just dunno how ya configure those, so we're only providing our Caddy configs here. (I don't think they're on our Git repo yet, we haven't gotten arounda update the source so far since we're currently migrating the servers onto different hardware n it's taking some time, thx for ur understanding <3) We set up a named route called bot-defense that can be imported in relevant site blocks by putting the line invoke bot-defense in them. Also, do make sure the /srv/www/bombs directory contains a bomb file (e.g. just run touch bomb in there) n not just the compressed files, otherwise Caddy won't attempt sending the compressed files either.

&(bot-defense) {
    @bot `header({'User-Agent': 'nikto'}) || header({'User-Agent': 'sqlmap'}) || path('*.php', '*.aspx', '/wp/*', '/wordpress*', '/wp-*')` # This matches the default nikto n sqlmap user agent headers, as well as anything that looks like a PHP, ASPX or WordPress path ‒ ya may needa adjust this based on what kinda service ur offering, e.g. for a PHP service ya prolly don't wanna match all PHP paths after all
    @compression header_regexp Accept-Encoding ^(.*,\s*)?(gzip|br|zstd)\s*(;[^,]*)?(,.*)?$ # This regex should match any Accept-Encoding header that lists one of these encodings, q-values included
    route @bot {
            handle @compression {
                    rewrite * /bomb
                    file_server {
                            root /srv/www/bombs
                            precompressed br zstd gzip # Preference based on ascending compressed size + descending decompressed size, do note we dunno'f there's clients that support zstd but not brotli, in any case't prolly doesn't hurt having't here
                    }
                    header {
                            -ETag # No point in clients caching this
                            -Server
                            Content-Type text/html
                            X-Content-Type-Options nosniff
                    }
            }
            redir https://hel1-speed.hetzner.com/10GB.bin # When compression ain't supported, this's the default fallback
    }
}
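
To test the whole thing, sth like this's what we'd run, w example.com standing in for whatever site block ya put invoke bot-defense into, n /test.php for any path that trips the @bot matcher (HEAD requests, so nothing actually gets downloaded):

# Should get served the bomb: look for Content-Encoding: br n a small Content-Length
curl -sI -H 'Accept-Encoding: gzip, br, zstd' https://example.com/test.php
# No compression support advertised: should get a redirect onto the Hetzner speedtest file instead
curl -sI https://example.com/test.php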

Well, that concludes stuff Ig. Let us know'f there's any questions or suggestions for improving this ^^