[Backgroundrb-devel] Worker pool worker for page scanners: lost connection to mysql
ezmobius at gmail.com
Thu Aug 17 18:24:06 EDT 2006
On Aug 17, 2006, at 2:15 PM, P. Mark Anderson wrote:
> I want to scan many URLs simultaneously with a new scanner worker for
> each URL, but I keep getting Mysql::Error: Lost connection to MySQL.
> Yes, allow_concurrency=false.
> Each scanner thread needs access to my ActiveRecord models to store
> discovered feeds and whatnot. There is a DB table called page_scans
> that acts as a big queue holding URLs waiting to be scanned.
> It would be nice to run a worker that spawns up to 10 scanner workers
> at a time to chip away at the queue, but first I need to figure out
> what kinds of problems kill the mysql connection in a worker.
> Any thoughts are much appreciated!
> Backgroundrb-devel mailing list
> Backgroundrb-devel at rubyforge.org
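[Editor's sketch] The design described above — a `page_scans` table acting as a queue, drained by up to 10 scanner threads — can be sketched in plain Ruby. This is a minimal, hedged illustration, not BackgroundRB code: the in-memory `Queue` stands in for the `page_scans` table, and the scan itself is stubbed out.

```ruby
require 'thread'

# Hypothetical stand-in for the page_scans table: a thread-safe queue of URLs
# waiting to be scanned.
urls = Queue.new
%w[http://example.com/a http://example.com/b http://example.com/c].each { |u| urls << u }

results = Queue.new
MAX_SCANNERS = 10

# Spawn up to MAX_SCANNERS scanner threads; each pops URLs until the queue drains.
workers = [MAX_SCANNERS, urls.size].min.times.map do
  Thread.new do
    loop do
      url = urls.pop(true) rescue break  # non-blocking pop; break when queue is empty
      # In the real worker, each thread would open its OWN database connection
      # here and save discovered feeds through it -- sharing one ActiveRecord
      # connection across threads is what kills the MySQL connection.
      results << "scanned #{url}"
    end
  end
end
workers.each(&:join)
```

The key point the sketch encodes: the bound on concurrent scanners is the number of threads spawned, while the queue itself serializes who gets which URL.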
Have you tried it with allow_concurrency = true? Stuff like this is
hard because even though ActiveRecord pretends to be thread safe,
I don't really think it is. If it doesn't work with
allow_concurrency = true, then you might need to look at just using
the mysql-ruby bindings directly and *gasp* writing some SQL just for
this part.
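[Editor's sketch] The usual way to make that work is to give every thread its own database handle rather than sharing ActiveRecord's single connection. Below is a minimal illustration of the thread-local-connection pattern; `connect` is a hypothetical factory — with the mysql-ruby bindings it would be something like `Mysql.real_connect("localhost", "user", "pass", "app_db")`, stubbed here so the sketch stands alone.

```ruby
require 'thread'

# Hypothetical connection factory; in real use this would call
# Mysql.real_connect(...) from the mysql-ruby bindings.
connect = lambda { Object.new }

# Each thread lazily opens and caches its own connection in thread-local
# storage, so no handle is ever shared between threads.
def conn_for_thread(factory)
  Thread.current[:db_conn] ||= factory.call
end

conns = Queue.new
threads = 3.times.map do
  Thread.new { conns << conn_for_thread(connect) }
end
threads.each(&:join)
```

Because each scanner thread owns its handle outright, a dropped or timed-out connection in one thread cannot take the others down with it.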