[Mongrel] Uploading Large (100mb+) files
mongrel at philip.pjkh.com
Tue Nov 28 19:55:25 EST 2006
>>>> Out of curiosity, has anyone managed to pair mongrel_upload_progress
>>>> (or any of the other options) with MogileFS?
>>> That's been mentioned in the past, but I believe it wouldn't work
>>> since the file has to be streamed to the local disk and then pushed
>>> up to MogileFS. Since disks are so cheap it turns out to be
>>> cheaper to just upload the file to a big disk and then use a fast
>>> web server to serve the files.
>> The problem with this, I think, is that you won't get the benefits of
>> no single point of failure (unless you use an expensive RAID array or
>> SAN setup).
> So you're saying a couple of web servers running apache or nginx to
> serve files right off a server's disk is LESS reliable than a nearly
> unproven MogileFS setup that uses mysql, trackers, storage nodes, and
> then eventually just transmits off HTTP anyway?
I think he was referring to "big disk... web server", and the obvious lack
of plurals... one disk on one web server is going to give you a single
point of failure...
What we did was to have a "master media" server that everything gets
uploaded to. Then we have several slaves configured so that if they
don't have a requested file they fetch it from the master. We also have
things set up so that the master requests each file from the slaves as
soon as it receives a new one, so within almost no time the file is
replicated to several slaves; and on the off chance a file gets uploaded
manually, it will still get picked up...
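The slave-side "fetch from the master on a miss" behaviour described above could be sketched roughly like this (a hypothetical illustration, not our actual code; the host name and paths are made up):

```ruby
require 'net/http'
require 'fileutils'

# Hypothetical master media server hostname.
MASTER_HOST = 'media-master.example.com'

# Return the local path for +path+, serving it from local disk if present;
# otherwise fetch it from the master over HTTP, cache it locally, and
# return the cached copy. Returns nil if the master doesn't have it either.
def fetch_media(path, local_root)
  local = File.join(local_root, path)
  unless File.exist?(local)
    res = Net::HTTP.get_response(MASTER_HOST, "/#{path}")
    return nil unless res.is_a?(Net::HTTPSuccess)
    FileUtils.mkdir_p(File.dirname(local))
    File.open(local, 'wb') { |f| f.write(res.body) }
  end
  local
end
```

The master-side replication trick then falls out for free: after an upload, the master simply requests the new file from each slave, and the slave's miss path pulls the file across.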
Works for us...