[Mongrel] Memory leaks in my site

Jim Powers rancor at mindspring.com
Wed Mar 7 08:03:59 EST 2007

On Wed, 7 Mar 2007 04:14:57 -0700
"Kirk Haines" <wyhaines at gmail.com> wrote:

> > By the way, check the errors section of httperf report, and the
> > production.log. See if there are "fd_unavailable" socket errors in
> > the former, and probably some complaints about "too many files
> > open" in the latter. If there are, you need to either increase the
> > number of file descriptors in the Linux kernel, or decrease the max
> > number of open sockets in the Mongrel(s), with -n option. I don't
> > know if it solves the "RAM footprint growing to 150 Mb" problem...
> > I will know it first thing tomorrow morning :)
> No.  That is probably happening because of the file descriptor limit
> in Ruby.  Your Mongrel has accepted as many connections as Ruby can
> handle; it is out of descriptors.

What file descriptor limit are you referring to?  A typical Linux
default ulimit on file descriptors is 1024, which should be more than
enough for the test Ken is performing.
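
As a sanity check, you can print the limits the Ruby process itself
sees.  A minimal sketch, assuming a Ruby build that exposes
Process.getrlimit (the script name is mine):

# fd_limit.rb -- print the file descriptor limits this Ruby process sees.
soft, hard = Process.getrlimit(Process::RLIMIT_NOFILE)
puts "soft fd limit: #{soft}, hard fd limit: #{hard}"

If the soft limit is already 1024 or higher, the ulimit is unlikely to
be the bottleneck in a test of this size.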

Also, I would recommend running a test that separates Mongrel from
Rails.  Use a simple Mongrel handler like the one below:


require 'mongrel'

# Trivial handler: serves a static plain-text response, no Rails involved.
class SimpleHandler < Mongrel::HttpHandler
  def process(request, response)
    response.start(200) do |head,out|
      head["Content-Type"] = "text/plain"
      out.write("hello!\n")
    end
  end
end

h = Mongrel::HttpServer.new("0.0.0.0", "3000")
h.register("/test", SimpleHandler.new)
h.register("/files", Mongrel::DirHandler.new("."))
h.run.join  # block until the server exits

This should help narrow down the problem area.  If Mongrel itself is
to blame, you should still see lots-o-memory growth; if memory stays
flat, the interface with Rails is the more likely culprit.
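
To watch the growth over time, here is a minimal sketch, assuming
Linux (which exposes per-process memory in /proc); the script name and
output format are mine:

# watch_rss.rb -- sample a process's resident set size every 10 seconds.
# Pass the Mongrel pid on the command line.
pid = ARGV[0] or abort("usage: ruby watch_rss.rb PID")
loop do
  status = File.read("/proc/#{pid}/status")
  rss_kb = status[/VmRSS:\s*(\d+)/, 1]
  puts "#{Time.now.strftime('%H:%M:%S')}  VmRSS: #{rss_kb} kB"
  sleep 10
end

A steadily climbing VmRSS against the bare handler points at Mongrel
or Ruby itself; a flat line points back at the Rails side.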

Jim Powers
