[Mongrel] http keep-alive?

Kirk Haines wyhaines at gmail.com
Fri Sep 14 20:16:34 EDT 2007

On 9/14/07, Roger Pack <rogerpack2005 at gmail.com> wrote:
> I read this in a previous post
> (http://rubyforge.org/pipermail/mongrel-users/2006-December/002354.html)
> ....
> First, Mongrel accepts remote clients and creates one Thread for each
> request.  Mongrel also enforces a single request/response using
> Connection: close headers because Ruby only supports 1024 open files (so far).  If
> Mongrel doesn't do this then people like yourself can write a simple
> "trickle attack" client that hits the Mongrel server, opens a bunch of
> continuous connections, and then eats up all available files very quickly.
> Basically, a DDoS attack that's very simple to do.
> ....
> Is this still a problem?  If it is, I think it might be sweet if it were
> optional (then load balancers could keep open connections--if only load
> balancers can hit it...).  Just a thought :)
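The Connection: close policy the quoted post describes can be sketched with Ruby's stdlib alone. This is not Mongrel's actual code, just a minimal one-request-per-connection server that answers once, forbids keep-alive, and hangs up (the port-0 trick and the response body are illustrative):

```ruby
require 'socket'

# Minimal one-request-per-connection server illustrating the
# "Connection: close" policy. Port 0 lets the OS pick a free port.
server = TCPServer.new('127.0.0.1', 0)
port = server.addr[1]

handler = Thread.new do
  client = server.accept
  # Consume the request line and headers up to the blank line.
  while (line = client.gets) && line != "\r\n"; end
  body = "hi"
  client.write "HTTP/1.1 200 OK\r\n"
  client.write "Content-Length: #{body.bytesize}\r\n"
  client.write "Connection: close\r\n\r\n"   # forbid keep-alive
  client.write body
  client.close                               # one request, one response, done
end

sock = TCPSocket.new('127.0.0.1', port)
sock.write "GET / HTTP/1.1\r\nHost: localhost\r\n\r\n"
response = sock.read    # reads to EOF, because the server closed the socket
handler.join
puts response.include?("Connection: close")  # => true
```

Because every response carries Connection: close, a client cannot park an open connection on the server after its request completes, which is exactly the descriptor-exhaustion defence the quoted post is talking about.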

It's still possible, and probably will remain so for quite a while.
Ruby uses a select() loop to manage its threads.  Its FD_SETSIZE is
1024.  select()'s performance also degrades as the number of handles
it is managing goes up.
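The same select()-based readiness checking the interpreter relies on is exposed in Ruby as IO.select. A rough stdlib illustration (this shows the mechanism, not the interpreter's actual scheduler; the pipes and timeout are illustrative):

```ruby
# IO.select blocks until one of the watched descriptors is readable.
# Every descriptor passed here must fit in select()'s FD_SET, whose
# compiled-in ceiling (FD_SETSIZE) is typically 1024 -- the limit
# discussed above, and one reason select() scales poorly.
r1, w1 = IO.pipe
r2, w2 = IO.pipe

w2.write("ping")                           # make only the second pipe readable

ready, = IO.select([r1, r2], nil, nil, 1)  # 1-second timeout
puts ready.map { |io| io == r2 }           # only r2 comes back ready
data = ready.first.read_nonblock(4)
```

Under the hood, every call rebuilds and rescans the whole descriptor set, which is why select()'s cost grows with the number of watched handles even when few of them are active.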

With the next version of evented_mongrel I am going to provide a way
for people to specify, if they are on a platform that supports epoll
(Linux 2.6.x), the max number of connections that they want to be able
to handle.  This would, in theory, reduce the threat of the trickle
attack because an evented_mongrel could have many more than 1024
concurrent connections without any problem.
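epoll itself is not exposed by Ruby's stdlib (evented_mongrel reaches it through its event library), but the configurable connection cap described above can be sketched in plain Ruby: once the maximum is reached, the server sheds new connections instead of letting them pile up. The cap value, the 503 response, and the fixed accept count are illustrative assumptions, not evented_mongrel's API:

```ruby
require 'socket'

MAX_CONNS = 2   # illustrative cap; the real server would make this configurable

server = TCPServer.new('127.0.0.1', 0)
port = server.addr[1]
active = []

accept_thread = Thread.new do
  3.times do
    client = server.accept
    active.reject!(&:closed?)
    if active.size >= MAX_CONNS
      # Shed load instead of letting idle connections exhaust descriptors.
      client.write "HTTP/1.1 503 Service Unavailable\r\nConnection: close\r\n\r\n"
      client.close
    else
      active << client   # would normally be handed off to a request handler
    end
  end
end

# Open three connections; on loopback they are accepted in order,
# so the third one lands over the cap and is refused.
socks = 3.times.map { TCPSocket.new('127.0.0.1', port) }
accept_thread.join
rejected = socks.last.read   # the shed connection gets the 503, then EOF
puts rejected
```

A trickle-attack client that opens connections and never completes requests then costs the server at most MAX_CONNS descriptors, rather than everything up to the process's file limit.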

If you read the archives, the subject of keep-alive is somewhat
controversial, though we (the current admin/developer crew on mongrel)
have discussed it at least once.  I think it is something we are
willing to explore further, if I recall the discussion correctly.

Kirk Haines
