Carlos Santana wrote:
> I am really confused now.
> The FAQ says - Mongrel uses one thread per request.
> So it can handle multiple requests (not surprising).
>
> What you are suggesting is that my archiving method is trying to make 
> another HTTP request from within the same HTTP request?
>
> I can see the wget requests in the Mongrel development.log. Clearly, 
> it's not blocking these requests.
>
> However, it is blocking other http requests (if I try to access my 
> application from a browser, then it times out or waits forever).
> So it is not processing other requests. This could be because the server 
> has reached its maximum number of connections (just one possibility).
>   
Rails does not support multithreading and will only handle one request in
one thread at a time.
Mongrel supports multithreading and will handle multiple requests in
multiple threads.

Background threads and forks are not dependent on the webserver and are
always supported.
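
To make the point concrete, here is a minimal, generic Ruby sketch of the background-thread idea (not a Rails action, and not tied to any webserver): the handler hands slow work to a thread and could return immediately. The join and Queue here exist only so the result is observable in a standalone script; in a real request handler you would not join.

```ruby
require "thread"

# Minimal sketch: run slow work (e.g. the wget/zip archiving) in a
# background thread.  A real handler would NOT join the thread --
# it would return the response right away and let the thread finish
# on its own.
results = Queue.new
worker = Thread.new do
  # stand-in for a long-running task
  results << "archive done"
end
worker.join              # only so this script can show the result
puts results.pop         # => archive done
```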

> I tried Passenger and it is really cool. However, there seems to be some 
> serious problem with my archiving code. When I run my app using 
> Passenger and invoke the archiving method, the system slows down so badly 
> that I had to reboot it forcefully.
> wget seems to make infinite calls to the server.
>
> I am posting my archiving code for ref.:
> ----------------
>   def generate_archive
>     dir = "/tmp/topicsys/#{self.id}"
>     title = self.title.gsub(/^\s+/, '').gsub(/\s+$/, '').gsub(/\s+/, '-')
>     id = self.id
>     host = "#{RAILS_ENV}_HOST".upcase.constantize
>     url = url_for :host => host, :controller => :topics, :action => :show, :id => id
>     logger.info "Generating topic - (#{title}) archive '#{dir}'."
>     pid = fork do
>       `wget --page-requisites --html-extension --convert-links --no-directories --recursive --level=1 -np --directory-prefix=#{dir} #{url};`
>       #`mv #{dir}/#{id}.html #{dir}/index.html;`
>       `zip -mj #{dir}/#{title}.zip #{dir}/*;`
>     end
>     Process.detach(pid)
>   end
>
> ----------------
>
> Any clues?
>   

What do you mean by "wget seems to make infinite calls to the server"?
Passenger should be no different from Mongrel here.
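
One thing worth checking in the quoted code, whatever the webserver: the title and directory are interpolated straight into shell commands. A title containing quotes, `&`, or `$(...)` can break the command or spawn extra processes. A hedged sketch of the guard (the example values below are made up, not taken from the app):

```ruby
require "shellwords"

# Sketch: escape user-derived strings before interpolating them into a
# shell command.  `dir` and `title` here are hypothetical example values.
dir   = "/tmp/topicsys/42"
title = "A title; with $(shell) chars"

# Shellwords.escape backslash-escapes every shell-significant character,
# so the string is passed to the shell as a single literal argument.
zip_cmd = "zip -mj #{Shellwords.escape("#{dir}/#{title}.zip")} #{dir}/*"
puts zip_cmd
```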

Anyway, if you are going to do a lot of zipping and wgetting of large
websites, perhaps you should have a look at Starling and Workling.

Have a look at this railscast that explains how to set this up:
http://railscasts.com/episodes/128-starling-and-workling

This setup makes it easy to handle many large background tasks without
overloading the server: when many requests come in, the jobs run one
after another from a queue instead of all simultaneously.
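
The queue idea can be sketched in plain Ruby (this is a generic illustration, not Workling's actual API): jobs are pushed onto a queue and a single worker thread drains them one at a time, so heavy tasks never run concurrently with each other.

```ruby
require "thread"

# Generic job-queue sketch (NOT Workling's API): one worker thread
# processes queued jobs sequentially.
jobs = Queue.new
worker = Thread.new do
  # Queue#pop blocks until a job arrives; a nil sentinel stops the loop.
  while (job = jobs.pop)
    job.call
  end
end

done = []
3.times { |i| jobs << -> { done << i } }  # enqueue three "archive" jobs
jobs << nil                                # sentinel: shut the worker down
worker.join
p done  # => [0, 1, 2]
```

Because only the worker thread runs jobs, they execute strictly in arrival order, which is exactly the "queue instead of simultaneously" behaviour described above.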

/Morgan



You received this message because you are subscribed to the Google Groups "Ruby on Rails: Talk" group.