Hi all,
I run a pretty popular site that is constantly experiencing some form of abusive traffic, most notably attackers running password dumps against our login endpoint (sometimes from 10k+ IPv4 addresses spread across several subnets and ASNs). We've mostly mitigated this with rate limiting, CAPTCHAs, and other forms of "suspicious login" detection. But I've recently been pondering ways to waste the attackers' time or resources to make password dumping less appealing.
The most recent attack could be detected accurately and precisely, and I noticed the abusive traffic would follow 301 redirects, so I decided to redirect those requests back to the client's own IP address.
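Roughly what that looks like as a minimal standalone sketch (Python stdlib only; `is_abusive()` here is just a placeholder for the detection logic, not the actual rule I used):

```python
# Minimal sketch of bouncing detected clients back at their own IP.
# is_abusive() is a stand-in for whatever fingerprinting/detection you have.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


def is_abusive(handler):
    # Placeholder: plug in your own signature (UA, path, credential-stuffing pattern, etc.)
    return handler.path == "/login"


class BounceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if is_abusive(self):
            client_ip = self.client_address[0]
            # 301 the client back to its own address
            self.send_response(301)
            self.send_header("Location", f"http://{client_ip}/")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    do_POST = do_GET  # credential stuffing usually POSTs to the login endpoint


if __name__ == "__main__":
    ThreadingHTTPServer(("", 8080), BounceHandler).serve_forever()
```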
I don't think it really slowed them down, but it got me thinking about other ways to stop or slow them:
* 301 redirect them to a "honeypot" server that holds onto sockets for as long as possible and/or makes the client waste time and CPU cycles (perhaps by constantly asking it to renegotiate TLS) - there's a rough tarpit sketch after this list
* 301 redirect to https://nsa.gov so they might end up on the radar of someone with the time and resources to stop them
* Redirect them to a non-HTTP scheme like geo:, potentially stalling the abusive client with a dialog like "Want to open this with <Maps Application>?" (some attacks originate from Android devices, and I'm not sure how deeply the custom protocol handlers are hooked in or whether that would even work)
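For the first idea, the tarpit doesn't need to be fancy: accept the connection and drip out bogus headers slowly so the client sits on the socket. A rough sketch with plain sockets (port, timings, and headers are arbitrary; the TLS-renegotiation angle isn't covered here):

```python
# Rough HTTP tarpit sketch: accept connections and trickle bytes out very slowly
# so the client keeps the socket open. A real deployment would also want
# connection caps and resource limits so you don't tarpit yourself.
import socket
import threading
import time


def tarpit(conn, addr):
    """Hold the socket open as long as possible by dribbling out bogus headers."""
    try:
        conn.settimeout(60)
        conn.recv(4096)                          # swallow whatever request they sent
        conn.sendall(b"HTTP/1.1 200 OK\r\n")
        while True:
            # one meaningless header every few seconds keeps many clients waiting
            conn.sendall(b"X-Please-Wait: 1\r\n")
            time.sleep(5)
    except OSError:
        pass
    finally:
        conn.close()


def main(port=8081):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(128)
    while True:
        conn, addr = srv.accept()
        threading.Thread(target=tarpit, args=(conn, addr), daemon=True).start()


if __name__ == "__main__":
    main()
```

The 301-bounce server above could simply point detected clients at this thing instead of at their own IP.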
I know abusive web traffic is pretty widespread and was curious how others deal with it beyond the standard CAPTCHAs, rate limiting, and iptables rules.