I've read the schema migration article at
https://developers.google.com/appengine/articles/update_schema?csw=1
and the technical side of things looks easy enough.
How do you guys handle schema migration in practice? Are there any
best practices? My first thought was to have a separate app version
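In case it helps frame the discussion: the pattern the linked article describes boils down to versioning each entity and upgrading it lazily on read, rather than rewriting the whole datastore at once. A minimal sketch of that idea (entities modeled as plain dicts here; `schema_version` is my own property name, not something from the article):

```python
# Lazy schema migration: bring each entity up to date when you touch it,
# instead of migrating the whole datastore in one pass. In a real app this
# logic would live in your model's load hook or request handlers.

CURRENT_VERSION = 2

def migrate(entity):
    """Bring an entity dict up to CURRENT_VERSION, one step at a time."""
    version = entity.get('schema_version', 1)
    if version < 2:
        # Hypothetical v2 change: a new 'tags' property, defaulting to [].
        entity.setdefault('tags', [])
        version = 2
    entity['schema_version'] = version
    return entity
```

Pairing this with a background mapper (to sweep entities that are never read) is what the article's "update schema" job amounts to.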
Even the use of a sub-organization is banned now. We don't need new Google Apps mail accounts; we just want to launch it as a sub-organization so the site is up. I wonder if even that is not allowed now.
And this is the same process I did for many domains of many clients earlier.
it's a Java-standalone application.
I just wonder if it really works, because only Android, iOS, and JavaScript are mentioned in the doc:
https://developers.google.com/appengine/docs/java/endpoints/.
As I understand it, I just annotate my classes and then run a generator to create a client library.
I just solved it!
I changed my download command to this:

bulkloader.py --download --config_file=bulkloader.yaml --kind=<kind> --url=http://<app-id>.appspot.com/_ah/remote_api --filename=SQL/<kind>.csv

The key is *bulkloader.py*, not *appcfg.py*, and the flag is *--download*, not *download_data*.
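If you are downloading several kinds, the command above is easy to assemble in a short script. A minimal sketch (`app_id` and `kind` are placeholders for your own values):

```python
# Build the bulkloader.py download command for a given kind.
# app_id and kind are placeholders; substitute your own values.

def download_command(app_id, kind):
    return (
        'bulkloader.py --download '
        '--config_file=bulkloader.yaml '
        '--kind=%s '
        '--url=http://%s.appspot.com/_ah/remote_api '
        '--filename=SQL/%s.csv' % (kind, app_id, kind)
    )

# Example: print the commands for a couple of kinds.
for kind in ['userMail', 'user']:
    print(download_command('my-app', kind))
```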
On Fri, Aug 30, 2013 at 5:25 AM, Vijay Kumbhani vnkumbh...@gmail.com
wrote:
How do I convert a GMT timestamp to an epoch timestamp?
First of all, it's better to discuss UTC time rather than GMT time; see
here for an explanation of UTC vs. GMT:
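For the conversion itself, the standard library handles it once you treat the timestamp as UTC. A minimal sketch using `calendar.timegm`, which interprets a time tuple as UTC (unlike `time.mktime`, which assumes local time):

```python
import calendar
from datetime import datetime

def utc_to_epoch(dt):
    """Convert a naive datetime, interpreted as UTC, to Unix epoch seconds."""
    return calendar.timegm(dt.utctimetuple())

print(utc_to_epoch(datetime(1970, 1, 1)))   # 0
print(utc_to_epoch(datetime(2013, 6, 15)))  # 1371254400
```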
On Friday, June 14, 2013 9:57:57 PM UTC-4, Andrew Jessup wrote:
This change means that new App Engine customers need to create a paid Google Apps for Business account in order to associate an App Engine application with a custom domain.
We know that many of you simply wish to associate your
Folks,
We have a long-running process that I've been trying to make into a
backend, thus far to no avail. So any help would be much appreciated.
app.yaml
==
application: app-name
version: api-storage-transition
runtime: python27
api_version: 1
threadsafe: yes
handlers:
-
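For reference, a long-running process on App Engine of that era also needs a backends definition next to app.yaml. A minimal backends.yaml sketch (the backend name and instance class are placeholders, not anything from the post above):

```yaml
# backends.yaml -- lives next to app.yaml; 'worker' is a placeholder name.
backends:
- name: worker
  class: B2           # instance class; pick the smallest that fits the job
  instances: 1
  options: dynamic    # start on demand; drop this line for a resident backend
```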
Just in case, these are the working ones:

query = db.GqlQuery( 'SELECT * FROM userMail WHERE user = :1 AND mail = :2', user_ref.key(), mail_ref.key() )
query = db.GqlQuery( 'SELECT * FROM userMail WHERE user = :1', user_ref )
query = db.GqlQuery( 'SELECT * FROM userMail WHERE user = :1',
This is really bothering me; the issue is always deep in my mind. The reason I use App Engine is the possibility of near-infinite scalability, but the Search index size limitation will prevent any decent use case of this otherwise awesome feature. It would be devastating to reach the 250 GB limit.
Not sure if it coincided with the major data centre change, but our API endpoints (basic webapp2 handlers that do some datastore work and then emit JSON) have been running really fast lately. Probably 100 ms lower than before.
Of course, everything above is completely subjective.
Just
I too worry about this limit, and am hoping that the limit at least applies
per-index, and not per-application. We are using Search API in production
today and our per-application usage is growing fast.
re: re-sync: working in bulk against the search API is challenging today.
There is a
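One concrete pain point with bulk re-sync: the Search API caps `Index.put()` at 200 documents per call, so any bulk job has to batch its writes. A minimal batching sketch (the `index.put` call itself is assumed to be `google.appengine.api.search.Index.put` in a real app; the helper below is plain Python):

```python
# The Search API accepts at most 200 documents per Index.put() call,
# so a bulk re-sync has to write in chunks of that size.

BATCH_SIZE = 200

def batches(items, size=BATCH_SIZE):
    """Yield successive lists of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# In a real re-sync job:
#   for chunk in batches(documents):
#       index.put(chunk)   # google.appengine.api.search.Index.put
```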
Wouldn't this:
https://code.google.com/p/googleappengine/issues/detail?id=8528 be great?
Take a moment and go flick on that little star!
j
On Monday, 2 September 2013 16:45:50 UTC-6, Bay of Islands wrote:
Yes it states as well:
*Note:* You must sign up for Google Apps to register this
I did the same thing for URL Fetches in my currently-in-production app. It's one of the worst things you could do to an app: first building it to be fast and agile, then forcefully chopping it, clogging it, rate-limiting it to be slow. Really painful.
2 years ago, it was also practically
This was announced more than two months ago. It shouldn't be taking so
long, unless it has low priority. If that's the case, this needs to be
moved to top priority ASAP.
Same concern here; my app will soon be going into production. That, plus API changes, since it's in preview.
The documentation says 'We hope to raise this limit in the future.', so I suppose it is not a hard limit.
What I did was limit the indexed text per document to 500 to 1000 KB, depending
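Trimming the indexed text to a byte budget, as described above, takes a little care with multi-byte characters. A minimal sketch (the 500 KB default mirrors the approach above; it is not a documented API constant):

```python
# Trim a text field to a byte budget before indexing, so a single
# document stays within whatever size you have budgeted for it.

def truncate_to_bytes(text, max_bytes=500 * 1024):
    """Return text cut to at most max_bytes of UTF-8, on a character boundary."""
    encoded = text.encode('utf-8')
    if len(encoded) <= max_bytes:
        return text
    # 'ignore' drops any partial character left at the cut point.
    return encoded[:max_bytes].decode('utf-8', 'ignore')
```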
That's not fair. I would expect it to take a lot longer than two months of
development to disassociate GAE and Google Apps given how tightly they were
(are) intertwined. Hopefully the GAE team has been working on this for a
while already, long before the public announcement.
On the other hand,
Hello
I have just launched a web application that does not attract big traffic so far, let's say 30 visits a day. My website has a lot of images (around 10,000) and connections between images.
So far I have spent a huge amount of money on Frontend Instance Hours (150 USD a month). Thanks to other
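One thing worth checking for an image-heavy site: requests for static files are served by App Engine's infrastructure directly and never spin up a frontend instance, so moving the images under a static handler can cut instance hours considerably. A minimal app.yaml sketch (paths are placeholders):

```yaml
# Static requests bypass your frontend instances entirely, so they
# consume no instance hours. 'images' is a placeholder directory.
handlers:
- url: /images
  static_dir: images
  expiration: 7d    # let browsers and edge caches hold them
```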
Good morning,
When I've tried taskqueue (in Python) to push a task, it raises a very strange exception. Please help:
Traceback (most recent call last):
File
/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py,
line 196, in Handle
handler =
Hi,
It is a good and real use case. I faced this issue as well, and I would also like to see an answer from a Google App Engine engineer on how to resolve it.
For my use case: the client would like 10K or more requests per second (concurrent access) to the Google App Engine server. I have
Hi Jeff,
GAE log downloads have been flaky for me. Our logs (request + application) average about 10 GB a day. First I used appcfg.py to get all the logs, but eventually discovered that it did not get everything. I was running it every 10 minutes. I notice that there is some sort of buffer of
Dear all,
I was going through the tutorial on how to use Google App Engine in
combination with Maven Archetypes:
https://developers.google.com/appengine/docs/java/tools/maven#installing_maven
It mentions that I need Maven 3.1 or newer, but when I reach the step where
we start a test server on