james_b at neurogami.com
Thu Feb 23 01:54:15 EST 2006
This may or may not be a Nitro question. It *could* be, though perhaps
a non-Nitro solution is available. Which is fine, too. :)
I've noticed a remarkable increase in bandwidth usage on ruby-doc.org.
Some of it may be due to a general rise in Ruby popularity, but based on
log file inspection some of the usage is from bad bots or ill-mannered
crawlers.
I can process the logs and ban user agents and IP addresses after the
fact, but by then the damage is done. What I'd like is a way to
intercept requests for certain static content and optionally deny access
if the client has made too many requests in too short a time, or (of
more value) has fetched too many bytes in too short a time. I want a
gatekeeper controller to stop thoughtless clients *before* they can
slurp down several GBs.
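A minimal sketch of the bookkeeping such a gatekeeper would need,
assuming a sliding time window and per-client byte quota (the class
name, limits, and window are all made up here, not Nitro API):

```ruby
# Hypothetical per-client quota tracker: track (timestamp, bytes) pairs
# per IP and deny a request that would push the client over its byte
# budget within the window. Limits below are illustrative only.
class Gatekeeper
  def initialize(max_bytes: 50_000_000, window: 3600)
    @max_bytes = max_bytes                       # bytes allowed per window
    @window    = window                          # window length in seconds
    @clients   = Hash.new { |h, k| h[k] = [] }   # ip => [[time, bytes], ...]
  end

  # Record a request of +bytes+ from +ip+ and report whether it is allowed.
  def allow?(ip, bytes, now = Time.now)
    log = @clients[ip]
    log.reject! { |t, _| now - t > @window }     # drop entries outside window
    return false if log.sum { |_, b| b } + bytes > @max_bytes
    log << [now, bytes]
    true
  end
end
```

The same structure works for counting requests instead of bytes; just
track a count of 1 per entry.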
I know how to grab the client info and check the size of the requested
file, but I'm unsure what to do next if the client is allowed to fetch
the file. (Sending back an "Access denied!" page or whatever it is when
denying a client is not a problem).
One option is to return a redirect header to the real file location.
The downside is that this then exposes an unprotected path to the
resource. (Though this may be a Good Enough solution; few people may
bother to note this redirection if they are actually getting the
content.)
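The redirect branch itself is simple; a rough sketch with a stand-in
response object (the names here are hypothetical, not Nitro's actual
response API):

```ruby
# Stand-in for a framework response object (hypothetical).
Response = Struct.new(:status, :headers, :body)

# Either redirect an allowed client to the real (unprotected) file path,
# or deny an over-quota client outright.
def serve_or_redirect(allowed, real_path)
  res = Response.new(nil, {}, nil)
  if allowed
    res.status = 302                       # point the client at the file
    res.headers['Location'] = real_path
  else
    res.status = 403                       # deny over-quota clients
    res.body   = 'Access denied!'
  end
  res
end
```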
Another is to read the file and stream it back through the Nitro
gatekeeper code. But I'm concerned about performance.
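On the performance worry: reading the file in fixed-size chunks rather
than slurping it whole at least keeps memory usage flat. A rough sketch
(chunk size and the output interface are assumptions):

```ruby
# Stream a file to any IO-like object in fixed-size chunks so the whole
# file is never held in memory at once. 64 KB is an arbitrary choice.
CHUNK = 64 * 1024

def stream_file(path, io)
  File.open(path, 'rb') do |f|
    while (chunk = f.read(CHUNK))   # read returns nil at EOF
      io.write(chunk)
    end
  end
end
```

This still costs a Ruby-process read/write per chunk, so it will never
match Apache serving the file directly; it only bounds memory, not CPU.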
(I'm running Apache2 on Red Hat Linux. I've tried bw_mod, which does
some coarse-grained throttling, but it does nothing to shut off download
hogs. It only slows them. There may be a way to hack iptables or
something, but my knowledge there is weak. Suggestions welcome; we can
take this off-list if it doesn't concern Nitro.)