[Backgroundrb-devel] Creating another database connection for large mysql import?

Mike Garey backgroundrb at gmail.com
Mon Dec 18 17:48:22 EST 2006

I'm using backgroundrb to periodically download a large file via ftp from a
remote location and then import it into the database.  To perform the
import, I was using the following:

ActiveRecord::Base.connection.execute(%{load data infile ...;})

although on a file with 2.5 million records this can take 5 minutes, which
seems to tie up my Rails application while it executes (even though it's
being run from a backgroundrb worker).  I figured this was happening
because I'm using the same database connection as the rest of my Rails app,
so I then tried the following:

Importer.connection.execute(%{load data infile ... ;})

where Importer is a dummy class I created just so I could establish a
database connection separate from the rest of my Rails classes.  However,
this produces the same behavior as using ActiveRecord::Base.connection.

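To illustrate, here's roughly the kind of setup I mean (the connection
settings below are just placeholders, not my real ones).  My understanding
is that a subclass only gets its own connection if establish_connection is
called on it directly; otherwise Importer.connection just hands back the
pool inherited from ActiveRecord::Base:

```ruby
require 'active_record'

class Importer < ActiveRecord::Base
  # No table backs this class; it exists only to hold a connection.
  self.abstract_class = true

  # Without this call, Importer.connection returns the same connection
  # the rest of the Rails app is using, which would explain the blocking.
  # All values here are placeholders.
  establish_connection(
    :adapter  => 'mysql',
    :host     => 'localhost',
    :database => 'myapp_production',
    :username => 'import_user',
    :password => 'secret'
  )
end

Importer.connection.execute(%{load data infile ...;})
```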
Does anyone have any suggestions on how I might be able to resolve this?

Also, I read a message on the list saying that an upcoming version of
backgroundrb will be able to automatically reload a worker when running in
development mode and the worker file changes.  I'm just wondering how to
enable this, since I've been stopping and restarting the server to pick up
file changes.

Thanks for any help, and also a big thanks to Ezra (and any other
contributors) for all the hard work that's been put into backgroundrb, it's
truly appreciated!

