[Mongrel] mongrel memory usage ballooning and process stomping
ezmobius at gmail.com
Thu Jan 18 14:59:24 EST 2007
On Jan 17, 2007, at 11:06 PM, Surendra Singhi wrote:
> First of all, thanks to everyone for their responses.
> On 1/18/07, Ezra Zygmuntowicz <ezmobius at gmail.com> wrote:
> Some linux distros have weird reporting of processes. I
> have seen it
> where top or ps will report 3 mongrels for each one that is really
> running. And looking at your output you can see that each set of
> three mongrels reported have the same port number. This means that
> you really only have 3 mongrels running because only one can be
> running per port at a time.
> The output of 'ps aefux' is below. Though I don't understand ps
> completely it seems that the one mongrel process spawns another
> child process, which in turns spawns a third one.
Yes, I have seen this before, and it is nothing to worry about: even
though ps shows 9 mongrels, you really only have 3 running.
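You can sanity-check this by counting distinct ports instead of raw
process lines. A minimal sketch in Ruby (the ps lines below are made
up to mirror the triplicated output; in practice read real output,
e.g. via IO.popen('ps aux')):

```ruby
# Made-up ps output mirroring the triplicated mongrel lines described
# above; substitute your real `ps aux` output in practice.
sample = <<PS
user 101 mongrel_rails start -p 8000
user 102 mongrel_rails start -p 8000
user 103 mongrel_rails start -p 8000
user 104 mongrel_rails start -p 8001
user 105 mongrel_rails start -p 8001
user 106 mongrel_rails start -p 8001
user 107 mongrel_rails start -p 8002
user 108 mongrel_rails start -p 8002
user 109 mongrel_rails start -p 8002
PS

# Only one process can bind a given port, so the number of distinct
# '-p <port>' values is the real mongrel count: 3 here, not 9.
ports = sample.scan(/-p (\d+)/).uniq
puts ports.size
```

Nine reported lines collapse to three distinct ports, matching the
three mongrels that are actually running.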
> As far as the memory usage goes, that sounds like a classic
> leak in your rails app. I have seen mongrel balloon like that when
> people add an extra :include directive to a find that ends up loading
> a ton of extra records into memory. If you are loading more than a
> few hundred ActiveRecords into memory on any one page then that will
> surely cause memory to balloon like this.
> Yes, we are doing eager loading at many places, and there is a
> sitemap part
> where about 10,000 records are loaded. I will try to optimize those
> parts, and see
> if they make a difference.
This is one of your biggest problems right here. It's easy to return
thousands of records from an AR query, but AR objects are expensive
in terms of CPU and memory to construct. Working with 10,000
records at once is a surefire way to leak tons of memory and make
your mongrels unstable. Try to work with smaller sets of data. I
have yet to see a legitimate use case where you want to display
10k records on one web page ;)
Use pagination or whatever you need to do to work with smaller sets
of data. There is a paginating_find plugin that is useful, and there
is a gem called Paginator that I highly recommend over the stock
Rails pagination helper.
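The batching idea behind those libraries can be sketched in plain
Ruby. This is illustrative only, not either gem's API: FakeModel and
its find_all method stand in for an AR model and its :limit/:offset
find, so the loop runs anywhere:

```ruby
# Illustrative stand-in for an ActiveRecord model: find_all mimics a
# find with :limit and :offset against a 10k-row table.
class FakeModel
  RECORDS = (1..10_000).to_a  # pretend these are 10k AR objects

  def self.find_all(limit, offset)
    RECORDS[offset, limit] || []
  end
end

# Walk the table in small batches instead of loading all 10k records
# at once, so only batch_size records are in memory at any moment.
def each_batch(model, batch_size = 500)
  offset = 0
  loop do
    batch = model.find_all(batch_size, offset)
    break if batch.empty?
    yield batch
    offset += batch_size
  end
end

count = 0
each_batch(FakeModel) { |batch| count += batch.size }
# all 10k records get processed, but never more than 500 at a time
```

The same offset/limit loop works for the sitemap case: emit each
batch, let it be garbage collected, and move on.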
> A little more info about your app and what it does would
> help debug.
> Are you using RMagick?
> Yes we are using that. I will look into mini-magick. Joey thanks
> for that.
MiniMagick is very nice, but also look at ImageScience. If you are
only using RMagick for thumbnailing, resizing, and cropping, then
ImageScience is much better. It's about 5 times faster and it doesn't
leak any memory. Highly recommended.
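For reference, a thumbnailing call with ImageScience looks roughly
like this (a sketch assuming the image_science gem is available; the
paths and the 150px size are placeholders):

```ruby
# Sketch of an ImageScience thumbnail call. The gem may not be
# installed here, so the require is guarded; src/dest paths and the
# default size are placeholders for your own values.
begin
  require 'image_science'
rescue LoadError
  # image_science not available; the method below is just a sketch
end

def make_thumbnail(src, dest, size = 150)
  ImageScience.with_image(src) do |img|
    img.thumbnail(size) { |thumb| thumb.save(dest) }
  end
end
```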
> Are you using send_file or send_data to
> stream out large content?
> We are streaming data for images.
Is this a convenience or a requirement? This should be avoided if at
all possible. The way rails works with mongrel is that mongrel will
not send the finished response to the client until rails has
completely finished. What I mean is: when you use send_data or
send_file with mongrel, rails will try to stream the data in
chunks, but mongrel will just buffer these chunks into a StringIO
until rails is completely finished, and only then will mongrel send
the entire thing to the client. This means that every time you stream
data like this, the entire image has to be loaded into RAM in mongrel
before it is sent to the client.
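The buffering described above can be illustrated in a few lines of
plain Ruby (the chunk count and sizes are made-up numbers; mongrel's
actual internals differ, but the memory effect is the same):

```ruby
require 'stringio'

# What the text describes: the app "streams" chunks, but the server
# side just appends them to a StringIO and sends nothing until the
# response is complete, so the whole payload sits in RAM at once.
buffer = StringIO.new
10.times { buffer << ('x' * 1024) }  # app emits 10 chunks of 1 KB

body = buffer.string  # only now would mongrel write it to the socket
# body holds all 10 KB -- the entire image in memory before any byte
# reaches the client
```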
> We are also using ferret and mediacloth.
I don't think these two are the cause of any of your leaks.
> I guess, I need to investigate more on the above mentioned things.
> Thanks a lot.
> Surendra Singhi
From what you have said, the thing I worry about most is loading
10k AR objects into memory. That is not going to scale and will cause
you no end of problems. Work with smaller sets. Switch to
ImageScience if you can, or at least to MiniMagick. If ImageScience
will do everything you need, then use it above all other options.
Hope that helps.
-- Ezra Zygmuntowicz
-- Lead Rails Evangelist
-- ez at engineyard.com
-- Engine Yard, Serious Rails Hosting
-- (866) 518-YARD (9273)