[Backgroundrb-devel] Sequence of multiple background process submissions

Raghu Srinivasan raghu.srinivasan at gmail.com
Sat Apr 12 02:53:19 EDT 2008

On Fri, Apr 11, 2008 at 11:30 PM, Ryan Leavengood <leavengood at gmail.com> wrote:

> On Sat, Apr 12, 2008 at 1:57 AM, Raghu Srinivasan
> <raghu.srinivasan at gmail.com> wrote:
> >
> > On my site, a user enters an RSS feed to be processed, and since this
> > takes about 5-10 secs, I pass the process off to a background job and
> > meanwhile do some Ajaxy spinners and tap-dancing until the job
> > completes and then redirect_to the appropriate page. This works great.
> Tap-dancing, LOL. I love that.
> > Next, is there a way around this? Can I have 2
> > threads/processes/ports/etc for Bdrb so that the batch job doesn't
> > interfere with a live user's experience? Or is there any other
> > workaround for this? Right now if the web user comes along when 50
> > jobs are left and each job takes 10 secs, then he has a nearly
> > 10-minute wait, which sucks.
> In my opinion you should always separate scheduled long-running
> processes from user-spawned ones. What I would do in this case is
> extract the common RSS processing functionality into a class in your
> Rails lib directory, then create two different BackgrounDRb workers
> that make use of that class. One would be UserRSSWorker and the other
> could be ScheduledRSSWorker. The first should only be used for user
> requests (and in addition you should use thread_pool.defer to allow
> multiple requests at once, which also might solve your original
> problem with one worker), and the other can be set up on your
> schedule. Also, I am not sure how to do it offhand, but you should try
> to set up the ScheduledRSSWorker so that BackgrounDRb instantiates it
> fresh to run and then kills it once it is done, since you don't need
> it sitting around all day doing nothing.

The RSS processing is already extracted into a .rb file in my lib directory.
Right now I have just one worker (called parse_feeds). If I understand you
correctly, here's what I need to do: duplicate parse_feeds so that I have
two identical workers, one called parse_feeds_web and the other
parse_feeds_batch. Then, in my Rails controller, kick off parse_feeds_web
for a live user but use parse_feeds_batch for the nightly job. This will
ensure that parse_feeds_web need not wait for parse_feeds_batch jobs to
complete. Sounds great. I'll try that.
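For the archives, the split described above might look roughly like this. This is only a sketch: it assumes BackgrounDRb ~1.x, the worker names come from this thread, and FeedParser (the shared class in lib/) plus the Feed model are hypothetical stand-ins -- check your BackgrounDRb version's docs for the exact MiddleMan invocation syntax, which changed between releases.

```ruby
# lib/workers/parse_feeds_web_worker.rb -- serves live users
class ParseFeedsWebWorker < BackgrounDRb::MetaWorker
  set_worker_name :parse_feeds_web

  def parse(feed_url)
    # Hand each request to the thread pool so one slow feed
    # doesn't block the next user's request.
    thread_pool.defer(:do_parse, feed_url)
  end

  def do_parse(feed_url)
    FeedParser.process(feed_url)   # hypothetical shared class in lib/
  end
end

# lib/workers/parse_feeds_batch_worker.rb -- nightly batch job
class ParseFeedsBatchWorker < BackgrounDRb::MetaWorker
  set_worker_name :parse_feeds_batch

  def parse_all
    Feed.find(:all).each { |feed| FeedParser.process(feed.url) }
  end
end
```

From the controller, the live-user worker would then be invoked with something along the lines of `MiddleMan.worker(:parse_feeds_web).async_parse(:arg => params[:url])`, leaving parse_feeds_batch to the scheduler.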

> But as I said above in parentheses, in general if you want a worker to
> be able to handle many jobs at once, use thread_pool.defer. Be sure to
> read the documentation, because since this is threaded code there are
> things you need to be careful about.

I did not know about thread_pool.defer. Thanks. I'll be sure to read and
understand it before I mess with threaded stuff. But it will be useful if
several users all plug in their feeds at the same time.
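The benefit of deferring to a thread pool (and the shared-state caution Ryan mentions) can be illustrated in plain stdlib Ruby. This is not BackgrounDRb's actual implementation -- TinyThreadPool and the feed names are made up for the example -- but it shows the idea: queued jobs are picked up by idle worker threads instead of waiting behind a single serial loop, and anything the threads write to in common needs a Mutex.

```ruby
require 'thread'

# Toy fixed-size thread pool: jobs go into a Queue, worker threads pop
# and run them concurrently.
class TinyThreadPool
  def initialize(size)
    @jobs = Queue.new
    @threads = Array.new(size) do
      Thread.new do
        while (job = @jobs.pop)   # a nil job is the shutdown signal
          job.call
        end
      end
    end
  end

  def defer(&block)
    @jobs << block
  end

  def shutdown
    @threads.size.times { @jobs << nil }  # one nil per worker thread
    @threads.each(&:join)
  end
end

# Shared state must be guarded, as Ryan warns -- hence the Mutex.
results = []
mutex = Mutex.new
pool = TinyThreadPool.new(3)

%w[feed_a feed_b feed_c feed_d].each do |feed|
  pool.defer do
    parsed = "parsed #{feed}"            # stand-in for real RSS parsing
    mutex.synchronize { results << parsed }
  end
end

pool.shutdown
puts results.sort.inspect
```

Without the Mutex, concurrent `results <<` calls could interleave badly; with it, all four jobs land safely no matter which thread finishes first.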

> Ryan

Thanks Ryan!
