[Mongrel] Possible memory leak problem...

Zed A. Shaw zedshaw at zedshaw.com
Sat Sep 1 01:40:29 EDT 2007


On Fri, 31 Aug 2007 21:54:53 -0700
"Christopher Bailey" <chris at codeintensity.com> wrote:

> We are finding that anytime our application sends back large files to the
> requestor, we start chewing up memory fast.  We haven't determined the
> precise cause or how to solve it yet, but it sounds like the same kind of
> situation.  In our case, the particular large files are usually not
> generated by Builder, but typically are actually files being
> downloaded/transferred (e.g. images, documents, etc.).

This is because Mongrel buffers your entire large file response into a StringIO so that it can keep Rails happy.  Only after the StringIO is complete and Rails leaves the locked section of code does Mongrel shove it out the door on the socket.

You are expecting to write a file, in Rails, using Ruby's crappy IO, and have it go immediately onto the socket.  Instead the whole thing gets buffered in memory first, and that's probably why you're seeing this.
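To make that concrete, here's a rough sketch of the pattern (simplified, not Mongrel's actual source):

require 'stringio'

# The response body accumulates in an in-memory StringIO while Rails is
# inside the locked section; nothing reaches the client until it's done.
buffer = StringIO.new

# Pretend this is a 100 MB file being "streamed" by Rails -- every chunk
# still lands in RAM before a single byte goes out.
(100 * 1024).times { buffer.write("x" * 1024) }

puts "buffered #{buffer.size} bytes in memory before touching the socket"
# socket.write(buffer.string)   # only here does the data hit the wire

So a 700 MB download costs you roughly 700 MB of process memory for as long as it takes to push it out.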

In reality, if it's a large file and you know where it is on disk, then you should let a real web server handle it, or at a minimum write a Mongrel handler to do the real heavy lifting, something like the sketch below.
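Untested sketch of such a handler (the /downloads prefix, DOWNLOAD_ROOT and file layout are made up; check it against your Mongrel version before trusting it):

require 'mongrel'

DOWNLOAD_ROOT = "/var/app/files"   # made-up location

class DownloadHandler < Mongrel::HttpHandler
  def process(request, response)
    # Map the request path to a file; real code must sanitize this
    # properly to block "../" traversal and friends.
    name = File.basename(request.params['PATH_INFO'].to_s)
    path = File.join(DOWNLOAD_ROOT, name)

    unless File.file?(path)
      response.start(404) { |head, out| out.write("not found") }
      return
    end

    response.status = 200
    response.header['Content-Type'] = 'application/octet-stream'
    response.send_status(File.size(path))
    response.send_header
    response.send_file(path)   # chunks straight to the socket, no StringIO
  end
end

# Stand-alone registration, outside of mongrel_rails:
# server = Mongrel::HttpServer.new("0.0.0.0", 3000)
# server.register("/downloads", DownloadHandler.new)
# server.run.join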

There's plenty of information on doing this, but seriously, do not ever use send_file in Rails or anything similar.  It's just a waste of resources.  Even if you need to authenticate, you can use X-Sendfile in Apache or nginx: do the auth in Rails, then let the web server send the file.
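If you're on Apache with mod_xsendfile, the Rails side is just a header; nginx works the same way with X-Accel-Redirect and an internal location.  Rough sketch, with the model, filter and paths made up:

class DownloadsController < ApplicationController
  before_filter :login_required        # hypothetical auth filter -- do the auth in Rails...

  DOWNLOAD_ROOT = "/var/app/files"     # made-up location

  def show
    doc  = Document.find(params[:id])  # hypothetical model
    path = File.join(DOWNLOAD_ROOT, doc.filename)

    # ...then let the web server shovel the actual bytes.
    response.headers['X-Sendfile']   = path
    response.headers['Content-Type'] = 'application/octet-stream'
    render :nothing => true
  end
end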

-- 
Zed A. Shaw
- Hate: http://savingtheinternetwithhate.com/
- Good: http://www.zedshaw.com/
- Evil: http://yearofevil.com/

