I have a txt file with some data that I need to import into the database.
I'm using Ruby to import the data, but I have a major problem: when I have
a large amount of data I run out of memory.


  File.open("#{RAILS_ROOT}/public/files/Neighborsville.TXT").each() do
|line|
      @stringArray = line.split("|")
      @i += 1
      puts @i
      @pid = @stringArray[0]
      @chain_id = @stringArray[1]
      @business = Business.find_by_pid_and_chain_id(@pid,@chain_id);
      #Check PID + CHAIN_ID
      @business.pid = @stringArray[0]
      @business.chain_id = @stringArray[1]
      @business.cityname = @stringArray[17]
      @business.state = @stringArray[18]
      @business.business =
Business.find_by_pid_and_chain_id(@pid,@chain_id);
      @business.city = City.new
      @business.business_category = get_category_id(@stringArray[40])
      @business.address = @stringArray[8] +" "+   @stringArray[9] +"
"+ @stringArray[10]+" "+ @stringArray[11] +" "[EMAIL PROTECTED]"
"[EMAIL PROTECTED]" "[EMAIL PROTECTED]
      if @chain_id == nil
        @chain_id = ""
      end
      business.save
      end
    end


I believe that on every cycle of the block Ruby allocates new blocks of memory
for my Business instances. Can someone help me please?
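
For reference, here is a minimal, untested sketch of the direction I am
considering (it assumes Business is an ordinary ActiveRecord model and it
leaves out the city/business assignments for brevity): use local variables
instead of instance variables so that nothing from one line is still
referenced on the next iteration, and build a new record when the lookup
finds nothing.

  File.open("#{RAILS_ROOT}/public/files/Neighborsville.TXT").each_with_index do |line, i|
    fields   = line.split("|")
    pid      = fields[0]
    chain_id = fields[1] || ""
    # reuse the existing row if there is one, otherwise build a new record
    business = Business.find_by_pid_and_chain_id(pid, chain_id) || Business.new
    business.pid               = pid
    business.chain_id          = chain_id
    business.cityname          = fields[17]
    business.state             = fields[18]
    business.business_category = get_category_id(fields[40])
    business.address           = fields[8, 4].join(" ")   # fields 8 through 11
    business.save
    puts i
  end

The idea is simply that once an iteration finishes, none of its objects are
still referenced, so they can be garbage collected.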

Thanks,

Elioncho
