[Mongrel] Possible memory leak problem...
chris at codeintensity.com
Tue Sep 4 19:55:36 EDT 2007
Thanks Zed. This is the direction we are going, but it is non-trivial due
to how our storage and authentication system works. A reality
On 8/31/07, Zed A. Shaw <zedshaw at zedshaw.com> wrote:
> On Fri, 31 Aug 2007 21:54:53 -0700
> "Christopher Bailey" <chris at codeintensity.com> wrote:
> > We are finding that anytime our application sends back large files to a
> > requestor, we start chewing up memory fast. We haven't determined the
> > precise cause or how to solve it yet, but it sounds like the same kind of
> > situation. In our case, the particular large files are usually not
> > generated by Builder, but typically are actually files being
> > downloaded/transferred (e.g. images, documents, etc.).
> This is because Mongrel collects your large file response into a StringIO
> so that it can keep rails happy. After the StringIO is done and rails
> leaves the locked section of code, it then shoves that out the door on the
> socket.
> You are expecting to write a file, in rails, using Ruby's crappy IO, and
> have it go immediately onto the socket. That's probably why you're
> seeing this.
> In reality, if it's a large file, and you know where it is, then you
> should let a real web server handle it, or at a minimum write a Mongrel
> Handler to do the real heavy lifting.
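The heavy lifting such a handler would do is essentially a fixed-size chunked copy loop. A minimal sketch, shown against a generic output IO rather than Mongrel's HttpResponse object (names and chunk size are illustrative):

```ruby
# Stream a file to an output IO in fixed-size chunks instead of slurping
# the whole thing into memory first. In a real Mongrel handler, `out`
# would be the response body IO.
CHUNK_SIZE = 16 * 1024  # 16 KB per read keeps memory use flat

def stream_file(path, out)
  File.open(path, "rb") do |f|
    while (chunk = f.read(CHUNK_SIZE))
      out.write(chunk)
    end
  end
end
```

Peak memory per request stays near CHUNK_SIZE regardless of file size, instead of scaling with the file.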
> There's plenty of information on doing this, but seriously, do not ever
> use send_file in rails or similar. It's just a waste of resources, and even
> if you need to authenticate you can use X-Sendfile in apache or nginx and
> that'll let you auth someone and then send the file.
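The X-Sendfile pattern looks roughly like this in a Rack-style Ruby app. The method name, file path, and authorization flag are hypothetical, and the front end needs the matching support enabled (mod_xsendfile for apache, or nginx's equivalent X-Accel-Redirect mechanism):

```ruby
# Hypothetical Rack-style endpoint: the app only authenticates, then
# names the file in the X-Sendfile header. The body stays empty --
# the front-end web server performs the actual transfer.
def serve_download(env, authorized)
  return [403, {"Content-Type" => "text/plain"}, ["Forbidden"]] unless authorized

  [200,
   {"X-Sendfile"   => "/var/files/report.pdf",  # hypothetical path
    "Content-Type" => "application/pdf"},
   []]  # empty body; the server replaces it with the file contents
end
```

The app process never touches the file's bytes, so the auth logic stays in Ruby while the heavy I/O stays in the web server.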
> Zed A. Shaw
> - Hate: http://savingtheinternetwithhate.com/
> - Good: http://www.zedshaw.com/
> - Evil: http://yearofevil.com/
> Mongrel-users mailing list
> Mongrel-users at rubyforge.org