I need to use nltk on the input from the user (things like
stemming the words and tokenizing them) and then use the stemmed,
tokenized words to search my Quran database.  Do you think this will
pose a problem?
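For what it's worth, the stemming step should not clash with the datastore search as long as the query and the stored verse text are normalized the same way. A minimal sketch of that normalization, using a stdlib stand-in for the stemmer (`normalize_query` and the toy `stem` are hypothetical names; in the real app `nltk.stem.PorterStemmer` and nltk's tokenizers would do this properly):

```python
import re

def stem(word):
    # Stand-in for nltk.stem.PorterStemmer().stem(): strips a few common
    # English suffixes so queries and indexed verse text normalize alike.
    for suffix in ("ing", "ers", "er", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def normalize_query(text):
    # Lowercase, tokenize on letter runs, then stem each token.
    tokens = re.findall(r"[a-z']+", text.lower())
    return [stem(t) for t in tokens]

print(normalize_query("The believers were praying"))
# → ['the', 'believ', 'were', 'pray']
```

The same `normalize_query` would be applied once, offline, to the verse text when building the search index, so lookups compare stem against stem.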

Thank you very much....you have been very very helpful to me.

On Mar 17, 3:20 pm, Tim Hoffman <[email protected]> wrote:
> I noticed you posted the same question to the nltk group;
> if you are planning on using nltk to process the text, I think what I
> said earlier won't work,
>
> you may need to do some rethinking on your approach
>
> T
>
> On Mar 17, 11:52 pm, Nora <[email protected]> wrote:
>
>
>
> > Thank you very much for your help....much appreciated.
>
> > One last thing, could you please point me to where to find
> > information about creating model objects... is that the datastore
> > thing?
>
> > On Mar 17, 2:45 pm, Tim Hoffman <[email protected]> wrote:
>
> > > Hi
>
> > > Ok, that's better, much more info ;-)
>
> > > Really you should create model objects representing chapters,
> > > verses and the parallel translations of each verse, along with
> > > appropriate metadata or references that will allow you to find the
> > > appropriate translation.
>
> > > Whilst your example might have worked in dev (that's because the dev
> > > server doesn't enforce transaction/request limits),
> > > you really have no chance of getting this application to scale.
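A sketch of what such models might look like, with the App Engine `db` classes kept in comments (property names and the `verse_key_name` helper are illustrative assumptions, not from the thread):

```python
# On App Engine the model Tim describes might look like this sketch:
#
#   from google.appengine.ext import db
#
#   class Verse(db.Model):
#       chapter = db.IntegerProperty(required=True)
#       number = db.IntegerProperty(required=True)
#       translation = db.StringProperty()   # which of the translations
#       text = db.TextProperty(required=True)

def verse_key_name(translation, chapter, verse):
    # A deterministic key name lets one verse be fetched directly, e.g.
    # Verse.get_by_key_name(verse_key_name('en1', 2, 255)), with no query
    # and no need to hold the whole book in memory on each request.
    return "%s:%d:%d" % (translation, chapter, verse)

print(verse_key_name("en1", 2, 255))  # → en1:2:255
```

The point of the deterministic key is that a request only ever loads the handful of entities it actually needs, which is what keeps the app inside the request time limit.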
>
> > > There is another way (but GAE is really designed for what I
> > > described earlier): you could preparse all the chapters and verses
> > > into the dictionaries as you are currently doing, but offline (i.e.
> > > not even under the dev server), then write those dictionaries out as
> > > strings to Python files, so that the output is something like the
> > > following.
>
> > > For instance, the file mybook.py contains the following:
>
> > > book = {'chapter1': {'verse1':'some text'}}
> > > (I have left out lots of data ;-)
>
> > > Then in your code you could import mybook, e.g.
>
> > > import mybook
> > > mybook.book['chapter1']['verse1']
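The offline preparsing step could be sketched like this (`write_book_module` is a hypothetical name; you would run this on your own machine, not under the dev server, and deploy the generated mybook.py with the app):

```python
import pprint

def write_book_module(book, path="mybook.py"):
    # Offline step: dump the parsed chapters/verses as Python source so
    # the deployed app can simply `import mybook` instead of re-parsing
    # the raw text files on every request.
    with open(path, "w") as f:
        f.write("book = ")
        f.write(pprint.pformat(book))
        f.write("\n")

write_book_module({"chapter1": {"verse1": "some text"}})
```

Importing the generated module is then a one-time cost per app instance, not per request.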
>
> > > But really you should learn all about models ;-)
>
> > > See ya
>
> > > T
>
>
> > > On Mar 17, 11:34 pm, Nora <[email protected]> wrote:
>
> > > > My application is a keyword search query tool that uses the
> > > > Quran.  I am loading two copies of the English Quran (where every
> > > > verse has eight parallel English translations) and two Arabic
> > > > copies of the Quran.  The Quran has 114 chapters and the longest
> > > > chapter has 285 verses... just to give you an idea of the size of
> > > > the data that needs to be loaded.
>
> > > > Does that help at all?  I am really stuck on this, as the
> > > > application worked fine on my computer before uploading it to the
> > > > server, but now it does not!  Very disappointing, and I have little
> > > > background in databases and datastores and how to search these
> > > > structures.
>
> > > > Thank you very much for your great help,
>
> > > > Nora
>
> > > > On Mar 17, 2:14 pm, Tim Hoffman <[email protected]> wrote:
>
> > > > > For instance, you could preparse these text files and generate
> > > > > Python code which you would just import.
>
> > > > > What sort of data do you have in your text files, and how do
> > > > > you plan to use them?
> > > > > How many records (dictionary keys and values) are in these text
> > > > > files?
>
> > > > > That would help me formulate a clearer example.
>
> > > > > However do you really need all of that data in memory for every
> > > > > request?
>
> > > > > T
>
> > > > > On Mar 17, 11:01 pm, Nora <[email protected]> wrote:
>
> > > > > > Thank you for your reply.
>
> > > > > > Well, to make myself clearer: my application has to load a
> > > > > > massive amount of information from text files when it first
> > > > > > loads.  Now, I am interested in your second option, but could
> > > > > > you make it clearer please?
>
> > > > > > Thank you again.
>
> > > > > > On Mar 17, 1:40 pm, Tim Hoffman <[email protected]> wrote:
>
> > > > > > > Why do you need to load that much data into memory on each
> > > > > > > request?  If you are creating a dictionary you could:
>
> > > > > > > 1.  create model entities for each dictionary
> > > > > > > and load them as required by their key
>
> > > > > > > 2. preprocess the text files into dictionary code and import them
>
> > > > > > > T
>
> > > > > > > On Mar 17, 9:59 pm, Nora <[email protected]> wrote:
>
> > > > > > > > Hello,
> > > > > > > > My application needs much time to load some text files
> > > > > > > > into dictionaries in memory.  I am unable to load all the
> > > > > > > > data because of the request time limitation.  Is there
> > > > > > > > another way of getting around this?
>
> > > > > > > > Thank you very much.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---
