By the way, here is the client-side code to give you the full picture.

var xhr = new XMLHttpRequest();
xhr.open("POST", {url_obtained_from_the_server}, true);
// "x-goog-resumable:start" is part of the signed string-to-sign, so the
// same header must be sent here or the signature check fails.
xhr.setRequestHeader("x-goog-resumable", "start");
xhr.onload = function(e) {
  if (e.target.status < 400) {
    // The resumable session URI comes back in the Location header.
    var location = e.target.getResponseHeader("Location");
    this.url = location;
    this.sendFile_();
  } else {
    this.onUploadError_(e);
  }
}.bind(this);
xhr.onerror = this.onUploadError_.bind(this);
xhr.send();
On Thursday, May 25, 2017 at 1:09:17 PM UTC-5, jayshekar harkar wrote:
>
> Thanks for the clarification. I am trying to prepare a signed URL on the
> server side so that the client can initiate the request.
> I have come up with the following Java code, but when the client makes a
> request to the signed URL, it fails. Could you or anyone else please take a
> quick look and tell me if I missed anything obvious?
>
> String stringToSign = "POST" + "\n" +
>         "" + "\n" +                       // Content-MD5 (empty)
>         "" + "\n" +                       // Content-Type (empty)
>         expiration + "\n" +
>         "x-goog-resumable:start" + "\n" +
>         "/upload/storage/v1/b/" + myBucket + "/o?uploadType=resumable&name=" + myObject;
>
> byte[] rawSignature = SHA256RSA.signSHA256RSA(stringToSign, myPrivateKey);
> String encodedSignedString =
>         new String(Base64.encodeBase64(rawSignature, false), "UTF-8");
>
> // myObject is the name of the object the client wants to upload
> String baseUrl = "https://www.googleapis.com/upload/storage/v1/b/" + myBucket
>         + "/o?uploadType=resumable&name=" + myObject;
> String googleAccessStorageId = myGoogleServiceAccount;
>
> String queryParams = "&GoogleAccessId=" + googleAccessStorageId +
>         "&Expires=" + expiration +
>         "&Signature=" + URLEncoder.encode(encodedSignedString, "UTF-8");
>
> String fullUrl = baseUrl + queryParams;
>
> return fullUrl;
>
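A note on the code above: V2 signed URLs are validated against the XML API, so the canonical resource in the string-to-sign is the XML-style path `/{bucket}/{object}` (served at storage.googleapis.com), not the JSON API's `/upload/storage/v1/...` path, which is the likely reason the request fails. A minimal, non-authoritative Python sketch of the string-to-sign (bucket/object names and the expiration are placeholders):

```python
# Sketch of a V2 string-to-sign for starting a resumable upload via a
# signed URL. Bucket/object names and the expiration are placeholders.
import time

def make_string_to_sign(bucket, object_name, expiration):
    # The canonical resource uses the XML API path (/bucket/object),
    # not the JSON API /upload/storage/v1/... path.
    return "\n".join([
        "POST",                    # HTTP verb
        "",                        # Content-MD5 (empty)
        "",                        # Content-Type (empty)
        str(expiration),           # expiration, Unix timestamp
        "x-goog-resumable:start",  # canonicalized extension header
        "/{}/{}".format(bucket, object_name),
    ])

expiration = int(time.time()) + 3600  # valid for one hour
string_to_sign = make_string_to_sign("my-bucket", "my-object", expiration)
# Sign string_to_sign with the service account's RSA-SHA256 key, then
# base64- and URL-encode the signature and append it to
# https://storage.googleapis.com/my-bucket/my-object
#     ?GoogleAccessId=...&Expires=...&Signature=...
```

The client then POSTs to that URL with the x-goog-resumable: start header (and no Authorization header) and reads the session URI from the Location response header.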
> On Tuesday, May 23, 2017 at 1:32:16 PM UTC-5, Simon Green wrote:
>>
>> The signed URL that you pass back to the client provides a way for a user
>> who is *unauthenticated* (to GCS) to upload files *directly* to a GCS
>> bucket. Without it, every user would need an account and permissions set
>> up, or all the files would have to go through your own service instead.
>>
>>
>> On Tuesday, 23 May 2017 12:14:14 UTC-6, jayshekar harkar wrote:
>>>
>>> Re: Resumable Uploads,
>>>
>>> 1. Created a signed URL for
>>> https://www.googleapis.com/upload/storage/v1/b/myBucket/o?uploadType=resumable&name=myObject
>>>
>>> 2. To get a resumable session URI, I sent a POST request to the signed
>>> URL (without the Authorization header) and it responded with an HTTP 401
>>> error, but when I include the Authorization header with a valid auth
>>> token, it works.
>>>
>>> So I wonder: what's the point of the signed URL in this particular case?
>>> Signing the URL and also providing the authorization token seems redundant
>>> to me, unless my workflow is incorrect. Please shed some light on this.
>>>
>>>
>>> On Monday, February 20, 2017 at 5:41:32 PM UTC-6, Adam (Cloud Platform
>>> Support) wrote:
>>>>
>>>> The simplest upload option is the POST object call to the XML API
>>>> <https://cloud.google.com/storage/docs/xml-api/post-object>, which
>>>> accepts form data and allows you to upload a file without any JavaScript,
>>>> using just a <form> tag with its 'action' attribute set to the Cloud
>>>> Storage API endpoint, e.g.
>>>>
>>>> <form action="http://<my-bucket>.storage.googleapis.com">
>>>> ...
>>>>
>>>> You can of course use an XHR in JavaScript to post the form as well.
>>>> Authentication for this type of upload is done using a policy document,
>>>> which is described in the Usage and Examples
>>>> <https://cloud.google.com/storage/docs/xml-api/post-object#usage_and_examples>
>>>> section of the documentation, along with Python code examples and an
>>>> example HTML form.
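For reference, a policy document is a base64-encoded JSON blob listing the conditions an upload must meet; a rough Python sketch (the bucket name, key prefix, ACL, and expiration below are made-up placeholder values):

```python
# Rough sketch: building and encoding a POST policy document. The
# bucket name, key prefix, ACL, and expiration are placeholders.
import base64
import json

policy = {
    "expiration": "2017-06-01T12:00:00Z",
    "conditions": [
        {"bucket": "my-bucket"},
        ["starts-with", "$key", "uploads/"],
        {"acl": "public-read"},
        ["content-length-range", 0, 5 * 1024 * 1024],  # cap at 5 MB
    ],
}
encoded_policy = base64.b64encode(
    json.dumps(policy).encode("utf-8")).decode("ascii")
# Sign encoded_policy with the service account's private key, then put
# the two values into hidden form fields named "policy" and "signature".
```

The browser never sees the private key; it only submits the opaque encoded policy and signature alongside the file field.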
>>>>
>>>> A cleaner approach is to use the Google API JavaScript Client Library
>>>> <https://developers.google.com/api-client-library/javascript/start/start-js>
>>>> to upload the file using the JSON API
>>>> <https://cloud.google.com/storage/docs/json_api/>. There's a
>>>> JavaScript example on the googlearchive GitHub page
>>>> <https://github.com/googlearchive/> called
>>>> storage-getting-started-javascript
>>>> <https://github.com/googlearchive/storage-getting-started-javascript/blob/master/index.html>,
>>>> which I still use as a reference and which is still updated. It shows you
>>>> how to do a multipart upload
>>>> <https://cloud.google.com/storage/docs/json_api/v1/how-tos/multipart-upload>,
>>>> which is sufficient for most files under 5 MB. Authentication for this is
>>>> done using OAuth and is handled by the GAPI client.
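As a rough illustration of what the library does under the hood, a multipart upload is a single POST with a multipart/related body: one part holds the JSON metadata, the next holds the file bytes. A sketch (the boundary string and metadata values are arbitrary placeholders):

```python
# Sketch: composing a multipart/related body for
# POST /upload/storage/v1/b/<bucket>/o?uploadType=multipart
# The boundary string and metadata values are placeholders.
import json

def build_multipart_body(metadata, media, media_type, boundary):
    delim = "--" + boundary
    head_lines = [
        delim,
        "Content-Type: application/json; charset=UTF-8",
        "",                    # blank line separates headers from body
        json.dumps(metadata),  # object metadata part
        delim,
        "Content-Type: " + media_type,
        "",                    # blank line, then the raw file bytes
    ]
    head = ("\r\n".join(head_lines) + "\r\n").encode("utf-8")
    tail = ("\r\n" + delim + "--\r\n").encode("utf-8")
    return head + media + tail

body = build_multipart_body(
    {"name": "photo.jpg"}, b"<jpeg bytes>", "image/jpeg", "boundary123")
# Send with the request header:
#   Content-Type: multipart/related; boundary=boundary123
```

The whole object travels in one request, which is why this only suits smaller files; anything large or flaky-network-bound is better served by a resumable upload.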
>>>>
>>>> If you need resumable uploads, it's not too much work to adapt the
>>>> concepts in Performing a Resumable Upload
>>>> <https://cloud.google.com/storage/docs/json_api/v1/how-tos/resumable-upload>,
>>>> which gives raw HTTP examples, to the above code example. If you need to
>>>> do some kind of authentication outside of Google OAuth, you can look into
>>>> Signed URLs <https://cloud.google.com/storage/docs/access-control/signed-urls>.
>>>> The documentation also provides Python code samples for generating them
>>>> <https://cloud.google.com/storage/docs/access-control/create-signed-urls-program>.
>>>>
>>>> Uploading to a signed URL works mostly the same way as for regular GCS
>>>> URLs, with some differences that are covered in the docs.
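To adapt those raw HTTP examples: the flow is two requests, an initiation POST whose Location response header carries the session URI, then a PUT of the bytes to that URI. A non-authoritative Python sketch of the request you would build for step one (the access token and names are placeholders; nothing is sent here):

```python
# Sketch of step 1 of a resumable upload against the JSON API. The
# access token and bucket/object names are placeholders.
import json

def initiation_request(bucket, name, content_type, access_token):
    # POST to the resumable endpoint; the response's Location header
    # holds the session URI for the actual byte upload.
    url = ("https://www.googleapis.com/upload/storage/v1/b/{}/o"
           "?uploadType=resumable".format(bucket))
    headers = {
        "Authorization": "Bearer " + access_token,
        "Content-Type": "application/json; charset=UTF-8",
        "X-Upload-Content-Type": content_type,  # type of the file itself
    }
    body = json.dumps({"name": name})  # object metadata
    return url, headers, body

url, headers, body = initiation_request(
    "my-bucket", "video.mp4", "video/mp4", "ya29.placeholder-token")
# Step 2: PUT the file bytes to the session URI from Location. After an
# interruption, query progress with a PUT carrying
# "Content-Range: bytes */TOTAL_SIZE" and resume from the offset the
# server reports.
```

When the initiation is done against a signed URL instead, the Authorization header is dropped and the x-goog-resumable: start header is sent in its place, as in the client code at the top of this thread.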
>>>>
>>>> On Monday, February 20, 2017 at 10:47:26 AM UTC-5, Richard Cheesmar
>>>> wrote:
>>>>>
>>>>> I am using the standard python app engine environment and currently
>>>>> looking at how one goes about uploading multiple large media files to
>>>>> Google Cloud Storage (Public Readable) using App Engine or the Client
>>>>> directly (preferred).
>>>>>
>>>>> I currently send a bunch of smaller images (max 20, between 30 and
>>>>> 100 KB on average) at the same time, directly via a POST to the server.
>>>>> These images are provided by the client and put in my project's default
>>>>> bucket. I handle the request's images using a separate thread and write
>>>>> them one at a time to the cloud, then associate them with an ndb object.
>>>>> This is all fine and dandy when the images are small and do not cause
>>>>> the request to run out of memory or invoke a DeadlineExceededError.
>>>>>
>>>>> But what is the best approach for large image files of 20 MB+ apiece,
>>>>> or video files of up to 1 GB in size? Are there efficient ways to do
>>>>> this from the client directly? Would this be possible via the JSON API,
>>>>> with a resumable upload, for example? If so, are there any clear
>>>>> examples of how to do this purely in JavaScript on the client? I have
>>>>> looked at the docs, but it's not intuitively obvious, at least to me.
>>>>>
>>>>> I have been looking at the possibilities for a day or two, but nothing
>>>>> hits you with a clear, linear description or approach. I notice in the
>>>>> Google docs there is a way, using PHP, to upload via a POST direct from
>>>>> the client:
>>>>> https://cloud.google.com/appengine/docs/php/googlestorage/user_upload...
>>>>> Is this just relevant to using PHP on App Engine, or is there an
>>>>> equivalent to createUploadUrl for Python or JavaScript?
>>>>>
>>>>>
>>>>> Anyway, I'll keep exploring but any pointers would be greatly
>>>>> appreciated.
>>>>>
>>>>>
>>>>> Cheers
>>>>>
>>>>>
--
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/google-appengine.
To view this discussion on the web visit
https://groups.google.com/d/msgid/google-appengine/37898774-4dde-4ed9-941a-784bbe72a3d7%40googlegroups.com.