Just like this:
import threading

threads = []

for i in xrange(10):
    # add is a function that adds some data to the datastore
    threads.append(threading.Thread(target=add))

for thread in threads:
    thread.start()

for thread in threads:
    thread.join()  # wait for all of them to finish
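If you'd rather keep a fixed pool of worker threads instead of one thread per task, here is a minimal sketch. The names (delete_in_batches, delete_fn, batch_size) are made up for illustration; on App Engine you would pass db.delete as delete_fn, and a list of entity keys as keys:

```python
import threading

def delete_in_batches(keys, delete_fn, batch_size=300, num_threads=4):
    # delete_fn stands in for google.appengine.ext.db.delete here,
    # so this sketch can run (and be tested) outside App Engine.
    batches = [keys[i:i + batch_size]
               for i in range(0, len(keys), batch_size)]
    lock = threading.Lock()

    def worker():
        while True:
            with lock:  # pop the next batch safely
                if not batches:
                    return
                batch = batches.pop()
            delete_fn(batch)  # one datastore RPC per batch of keys

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait until every batch has been handled
```

The pool is created once and each thread keeps pulling batches until none are left, so you pay the thread startup cost only num_threads times rather than once per batch.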

2009/4/30 Sri <[email protected]>

>
> How did you create the threads? Did you create and destroy them each
> time, or did you maintain a pool/manager?
>
> On Apr 30, 3:58 pm, 风笑雪 <[email protected]> wrote:
> > I only tested insert and delete.
> > And they work slower (maybe because of the threads' startup time).
> >
> > 2009/4/30 Sri <[email protected]>
> >
> >
> >
> > > Actually I've tried using multiple threads.
> >
> > > I found uploads were faster... deletes on (non-overlapping) data were
> > > somewhat similar.
> >
> > > On Apr 29, 10:37 pm, 风笑雪 <[email protected]> wrote:
> > > > In my test, using multiple threads is the same speed as 1 thread.
> >
> > > > 2009/4/29 Sri <[email protected]>
> >
> > > > > Actually I've started doing multiple threads a couple of nights
> > > > > ago and it was pretty fast...
> >
> > > > > The same applied to uploading new data. Of course now I am just
> > > > > out of quota... :D
> >
> > > > > Thanks for the tips, guys.
> >
> > > > > cheers
> > > > > Sri
> >
> > > > > On Apr 28, 4:37 pm, Alkis Evlogimenos ('Αλκης Ευλογημένος)
> > > > > <[email protected]> wrote:
> > > > > > Yes, but you can hit them repeatedly and from multiple threads
> > > > > > until everything is deleted.
> >
> > > > > > 2009/4/28 Sri <[email protected]>
> >
> > > > > > > Right, you mean have handlers that delete data instead of
> > > > > > > using remote_api?
> >
> > > > > > > But wouldn't that limit my requests to 30 seconds (well, I
> > > > > > > guess I could do 1000 "delete 100 items" requests), right?
> >
> > > > > > > On Apr 27, 5:09 pm, Alkis Evlogimenos ('Αλκης Ευλογημένος)
> > > > > > > <[email protected]> wrote:
> > > > > > > > If you don't have the keys, it's better to do it on the
> > > > > > > > server side rather than transferring data over the net.
> > > > > > > > No, you don't need to keep the keys locally.
> >
> > > > > > > > 2009/4/27 Sri <[email protected]>
> >
> > > > > > > > > But the issue is that I don't really have the keys on me...
> > > > > > > > > Does that mean that each time I load the datastore I'll
> > > > > > > > > have to keep track of the keys locally as well, so that
> > > > > > > > > when I want to clear them I can use them?
> >
> > > > > > > > > cheers
> > > > > > > > > Sri
> >
> > > > > > > > > On Apr 27, 8:50 am, Alkis Evlogimenos ('Αλκης Ευλογημένος)
> > > > > > > > > <[email protected]> wrote:
> > > > > > > > > > The sample code does:
> > > > > > > > > > MyModel.all().fetch(1000)
> >
> > > > > > > > > > This means: fetch 1000 entities of MyModel. If each
> > > > > > > > > > entity is 10KB, this means 10MB of data read from the
> > > > > > > > > > datastore, 10MB of data sent through the network to your
> > > > > > > > > > running instance, and 10MB of data served from the
> > > > > > > > > > running instance to the machine running the remote
> > > > > > > > > > script.
> >
> > > > > > > > > > If you know the keys then you can do:
> >
> > > > > > > > > > db.delete([db.Key.from_path('MyModel', key_name)
> > > > > > > > > >            for key_name in one_thousand_key_names])
> >
> > > > > > > > > > This just sends the keys to the datastore for deletion.
> > > > > > > > > > It doesn't need to transfer data from the datastore to
> > > > > > > > > > the remote script to read the keys in the first place.
> >
> > > > > > > > > > Eventually the GAE API should give us some way of
> > > > > > > > > > querying the datastore for keys only instead of always
> > > > > > > > > > fetching full entities. This would make this use case
> > > > > > > > > > quite a bit faster, and a lot of others as well.
> >
> > > > > > > > > > 2009/4/26 Devel63 <[email protected]>
> >
> > > > > > > > > > > Can you explain this further? I don't see any
> > > > > > > > > > > reference to key_name in the sample code.
> >
> > > > > > > > > > > More importantly, to me, what's the cost differential
> > > > > > > > > > > between using the string representation of keys and
> > > > > > > > > > > key_names? I've been passing key_names around to the
> > > > > > > > > > > browser because they're shorter, under the assumption
> > > > > > > > > > > that the cost of getting the corresponding key on the
> > > > > > > > > > > server side was negligible.
> >
> > > > > > > > > > > On Apr 25, 9:02 am, Alkis Evlogimenos ('Αλκης
> > > > > > > > > > > Ευλογημένος) <[email protected]> wrote:
> > > > > > > > > > > > Doing it over the remote API means you are going to
> > > > > > > > > > > > transfer all your data, plus transmission overhead,
> > > > > > > > > > > > over the wire. You are probably better off doing
> > > > > > > > > > > > something like this on the server side through an
> > > > > > > > > > > > admin-protected handler.
> >
> > > > > > > > > > > > Also, if you happen to know the keys of your data
> > > > > > > > > > > > (you used key_name), your deletes are going to be a
> > > > > > > > > > > > lot more efficient if you give db.delete a list of
> > > > > > > > > > > > keys instead.
> >
> > > > > > > > > > > > On Sat, Apr 25, 2009 at 2:41 PM, Sri
> > > > > > > > > > > > <[email protected]> wrote:
> >
> > > > > > > > > > > > > Hi,
> >
> > > > > > > > > > > > > Is there a way to completely erase the production
> > > > > > > > > > > > > datastore?
> >
> > > > > > > > > > > > > Currently I am using a script like this via the
> > > > > > > > > > > > > remote API:
> >
> > > > > > > > > > > > > def delete_all_objects(obj_class):
> > > > > > > > > > > > >     num_del = 300
> > > > > > > > > > > > >     while True:
> > > > > > > > > > > > >         try:
> > > > > > > > > > > > >             objs = obj_class.all().fetch(1000)
> > > > > > > > > > > > >             num_objs = len(objs)
> > > > > > > > > > > > >             if num_objs == 0:
> > > > > > > > > > > > >                 return
> > > > > > > > > > > > >             print "Deleting %d/%d objects of class %s" % (
> > > > > > > > > > > > >                 min(num_del, num_objs), num_objs, str(obj_class))
> > > > > > > > > > > > >             db.delete(objs[:num_del])
> > > > > > > > > > > > >         except Timeout:
> > > > > > > > > > > > >             print "Timeout error - continuing ..."
> >
> > > > > > > > > > > > > But with 30,000 entities in the datastore and
> > > > > > > > > > > > > another 3 million (yep, that's right) coming, doing
> > > > > > > > > > > > > a clear this way is extremely slow.
> >
> > > > > > > > > > > > > Any ideas?
> >
> > > > > > > > > > > > > cheers
> > > > > > > > > > > > > Sri
> >
> > > > > > > > > > > > --
> >
> > > > > > > > > > > > Alkis
> >
>

You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
