[Mongrel] failed to allocate memory while downloading large files

Amit Tomar lists at ruby-forum.com
Tue Sep 14 10:14:29 EDT 2010

Luis Lavena wrote:
> On Tue, Sep 14, 2010 at 10:18 AM, Amit Tomar <lists at ruby-forum.com> 
> wrote:
>> Thanks Luis,
>> you're right, I have to check how the uploading takes place.
>> Could you suggest some documentation?
> Google is your friend, "Rails File upload"
> http://www.rubyinside.com/rails-file-uploading-101-406.html
> And others.
> If the files are located on a remote server and are not in the public
> directory, then you will have no alternative but to use something like
> Apache to map the files to the remote disk and serve them.
> But please, avoid send_data at all costs, and even the file
> reading/writing you're doing, because with the sizes you're working
> with, you're just slowing your application down with IO operations.
> --
> Luis Lavena
> AREA 17
> -
> Perfection in design is achieved not when there is nothing more to add,
> but rather when there is nothing more to take away.
> Antoine de Saint-Exupéry
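
[For reference, a minimal sketch of the Apache hand-off Luis describes
above, assuming a Rails 2.x app fronted by Apache with mod_xsendfile
enabled; the controller name and the mount path are hypothetical, and no
input sanitising is shown:]

      class DownloadsController < ApplicationController
        def show
          # Hypothetical mount point where the remote disk is mapped.
          path = File.join('/mnt/media', params[:filename])

          # :x_sendfile => true makes Rails emit only an X-Sendfile header;
          # Apache then streams the file itself, so the Ruby process never
          # loads it into memory.
          send_file(path, :disposition => 'attachment', :x_sendfile => true)
        end
      end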

Luis, I don't have an option other than send_data. One thing I can do is
download large files in chunks, but I only get 4096 bytes. This is my
code:

      # Opens the stream file and reads only the first 4096 bytes.
      File.open(@containerformat.streamName, 'rb') do |f|
        @data = f.read(4096)
      end

      ext = File.extname(@containerformat.streamName)

      send_data(@data, :filename => name + ext,
                :disposition => 'attachment')
What is happening here?
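
[For what it's worth: f.read(4096) returns only the first 4096 bytes of
the file, and that single chunk is all send_data ships, which is why the
download stops at 4 KB. A minimal sketch of looping until EOF and
streaming each chunk, using the Rails 2.x proc response body; the names
are reused from the snippet above:]

      path = @containerformat.streamName
      ext  = File.extname(path)

      response.headers['Content-Type']        = 'application/octet-stream'
      response.headers['Content-Disposition'] =
        "attachment; filename=\"#{name}#{ext}\""

      # The proc is invoked when the response body is written; read returns
      # nil at EOF, so every 4 KB chunk is sent without holding the whole
      # file in memory at once.
      render :text => proc { |resp, output|
        File.open(path, 'rb') do |f|
          while chunk = f.read(4096)
            output.write(chunk)
          end
        end
      }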
