Actually, Dave S, your indirect answer showed me where not to go at all, which was
server-side node.js, believe it or not, because it led me to the place where I
actually needed to be. I conjured the final code after about 44 hours of toil.
It works freaking great: pure client-side, with progress reporting too. So I'll
post my code snippets for anyone else in the future.
Here is most of the snippet from the view:
{{block head}}
<link rel="stylesheet" href="{{=URL('static','css/jquery-confirm.min.css')}}"/>
<script src="{{=URL('static','js/jquery-confirm.min.js')}}"></script>
<script src="{{=URL('static','js/aws-sdk-2.456.0.min.js')}}"></script> <!-- aws-sdk-js -->
<script>
var fileChoose, fileChooseUploadButton, fileChooseSpan, fileETag;

function local_file_upload(event) {
    event.preventDefault();
    var submit = jQuery('form#item_edit > div#submit_record__row > div.col-sm-9 > input[type="submit"]');
    var file = fileChoose.prop('files')[0];
    console.log('uploading file: ' + file.name);
    fileETag.prop('readonly', false).val('').prop('readonly', true);
    fileChooseUploadButton.val('Uploading...').prop('disabled', true);
    submit.prop('disabled', true);
    // synchronous GET so the credentials are configured before new AWS.S3() below
    jQuery.get({url: "{{=URL(c='default', f='ajaxAws')}}", async: false}).done(function(t) {
        if (t.response) {
            console.log('rtn t.response: ' + t.response);
            AWS.config.update({
                "accessKeyId": t.accessKeyId,
                "secretAccessKey": t.secretAccessKey,
                "region": "us-west-1"
            });
        }
    });
    var s3 = new AWS.S3();
    var params = {
        Bucket: 'happyBucketDays',
        Key: '{{=professor_folder}}/' + jQuery('form#item_edit > div#lecture_items_filename__row > div.col-sm-9 > input#lecture_items_filename').val(),
        ContentType: file.type,
        Body: file,
        ACL: 'public-read'
    };
    s3.upload(params, function(error, results) {
        if (error) {
            fileETag.prop('readonly', false).val('').prop('readonly', true);
            fileChooseSpan.html(error.message);
            return jQuery.alert({title: "Error Uploading Data: ", content: error});
        } else {
            var etg = results.ETag.replace(/"/g, '');  // strip the surrounding quotes
            fileChooseSpan.html('Successful Upload Confirmed. Be sure to click Submit below.');
            fileETag.prop('readonly', false).val(etg).prop('readonly', true);
            console.log('file uploaded with result: ' + JSON.stringify(results));
            submit.prop('disabled', false);
            return jQuery.alert({title: "Successfully Uploaded File", content: ""});
        }
    }).on('httpUploadProgress', function(progress) {
        fileChooseSpan.html('Uploaded: ' + parseInt((progress.loaded * 100) / progress.total) + '%');
    });
}
function local_filename_change() {
    var e = jQuery('input[type="file"]');
    console.log('local file name: ' + e.prop('files')[0].name);
    jQuery('input#lecture_items_filename').val(e.prop('files')[0].name);
    fileChooseUploadButton.css('display', "inline").val('Upload File').prop('disabled', false);
    console.log(fileChooseUploadButton.prop('type') + " ___ " + fileChooseUploadButton.prop('id'));
    fileChooseSpan.html('Make sure the "Server Filename" is correct.');
    jQuery('form#item_edit > div#lecture_items_filename__row > div.col-sm-9 > input#lecture_items_filename').focus();
}
jQuery(document).ready(function() {
    var sx = "jQuery.version: " + jQuery.fn.jquery;
    console.log("jQuery.document.ready begin..." + sx);
    jQuery('form#item_edit > div#lecture_items_text_before__row').after(
        '<div class="form-group row" id="lecture_items_local_filename__row">' +
        '<label class="form-control-label col-sm-3" for="lecture_items_local_filename" id="lecture_items_local_filename__label">Local Filename</label>' +
        '<div class="col-sm-9" style="text-align:center;">' +
        '<input type="file" class="btn btn-primary" id="lecture_items_file" onchange="local_filename_change();" />' +
        '<input class="btn btn-primary" type="submit" id="lecture_items_file_upload" value="Upload File" onclick="local_file_upload(event);" style="display:none;" />' +
        '<span class="help-block" style="display:block;"></span></div></div>');
    fileChoose = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > input#lecture_items_file');
    fileChooseUploadButton = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > input#lecture_items_file_upload');
    fileChooseSpan = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > span.help-block');
    fileETag = jQuery('form#item_edit > div#lecture_items_fileetag__row > div.col-sm-9 > input#lecture_items_fileetag');
    fileETag.prop('readonly', true);
    console.log("jQuery.document.ready end...");
});
</script>
{{end}}
{{if ('body' in globals()) and (body is not None):
=body
pass}}
Here, "body" is passed as an SQLFORM from the controller, so the fields
referenced in the <script> region are implied by that form.
The jQuery.get is a quick call to the server so that the accessKeyId and
secretAccessKey are not hard-coded into the page and, hopefully, not visible
to the user, the console, or a hack. I welcome anyone correcting me on that
latter point.
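As a side note on the fileETag field the script fills in: for a single-part upload like the one above (no multipart, no SSE-KMS), S3's ETag is just the hex MD5 of the file, so the server can sanity-check what actually landed in the bucket. A minimal sketch; expected_s3_etag is a hypothetical helper, not part of the app above:

```python
import hashlib

def expected_s3_etag(data):
    """MD5 hex digest of the raw bytes. For single-part, non-KMS uploads
    S3 reports this as the object's ETag; multipart ETags instead look
    like '<md5>-<part_count>' and will not match."""
    return hashlib.md5(data).hexdigest()

# e.g. compare expected_s3_etag(open(path, 'rb').read()) against the
# value stored in lecture_items_fileetag (with the quotes stripped).
```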
The real trick was getting the AWS S3 permissions in line, because there are
like a thousand protocols and formats and every other damn thing. I believe
the IAM approach is the most straightforward, and it's set up when you set up
your S3 account. Under python's boto library, you'd set up a policy with boto,
and nothing was really special on the S3 side except your accessKeyId and
secretAccessKey. However, to upload to or access the S3 bucket via client-side
JavaScript (aws-sdk-js), you have to set up the CORS configuration under
S3 > Permissions > CORS configuration for that bucket. Mine is fairly open
and looks like:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <ExposeHeader>ETag</ExposeHeader>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
It's probably wise to delete the "<AllowedMethod>DELETE</AllowedMethod>" line
for added security. You don't need to change or add any of the other S3
permissions, i.e., "Block public access", "Access control list" (ACL), or
"Bucket Policy", for the above script to work and access your bucket. They
are all bears anyway.
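A heads-up for future readers: newer versions of the S3 console only accept CORS rules as JSON, so the XML above would be entered roughly as follows (with DELETE left out, as suggested above):

```json
[
    {
        "AllowedOrigins": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST"],
        "AllowedHeaders": ["*"],
        "ExposeHeaders": ["ETag"]
    }
]
```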
The jQuery.get call points to:

@auth.requires(lambda: any([auth.has_membership(r) for r in ['Administrator', 'Professor']]))
def ajaxAws():
    if (session is not None) and ('ajaxAws_allowed' in session) and (session.ajaxAws_allowed is not None) and session.ajaxAws_allowed:
        from xyz1 import s3AccessKey, s3SecretKey  # xyz1 is a module, not a controller
        session.ajaxAws_allowed = None  # one-shot: the flag is cleared after a single use
        return response.json({'response': True, 'accessKeyId': s3AccessKey, 'secretAccessKey': s3SecretKey})
    return response.json({'response': False})

@auth.requires(lambda: any([auth.has_membership(r) for r in ['Administrator', 'Professor']]))
def lecture_item_upload():
    # SQLFORM creation stored into body
    # ...
    session.ajaxAws_allowed = True
    return dict(body=body, professor_folder=sProfessor.create_lecture_foldername)
As you can see, I've written a website for professors to create their own
courses and upload their podcasts. Since the podcasts can be huge (hundreds
of MB), the uploading is really best done directly from their computers into
S3, as opposed to running it through the server.
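For anyone who wants to go one step further and never ship the raw keys to the browser at all: the server can instead hand the client a short-lived pre-signed URL that the file is PUT to directly. I have not wired this into the app above; the following is only a stdlib sketch of SigV4 query-string presigning (boto3's generate_presigned_url does the same thing with less code), assuming a single-part PUT with an unsigned payload:

```python
import datetime
import hashlib
import hmac
import urllib.parse

def presigned_put_url(bucket, key, access_key, secret_key, region, expires=3600):
    # Builds a SigV4 query-string pre-signed URL for "PUT /key", so the
    # browser can upload directly to S3 without ever seeing the keys.
    host = '%s.s3.%s.amazonaws.com' % (bucket, region)
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime('%Y%m%dT%H%M%SZ')
    datestamp = now.strftime('%Y%m%d')
    scope = '%s/%s/s3/aws4_request' % (datestamp, region)
    params = {
        'X-Amz-Algorithm': 'AWS4-HMAC-SHA256',
        'X-Amz-Credential': '%s/%s' % (access_key, scope),
        'X-Amz-Date': amz_date,
        'X-Amz-Expires': str(expires),
        'X-Amz-SignedHeaders': 'host',
    }
    query = '&'.join('%s=%s' % (urllib.parse.quote(k, safe=''),
                                urllib.parse.quote(v, safe=''))
                     for k, v in sorted(params.items()))
    canonical_request = '\n'.join([
        'PUT',
        '/' + urllib.parse.quote(key),  # '/' within the key stays unescaped
        query,
        'host:%s\n' % host,   # canonical headers (each line ends in \n)
        'host',               # signed headers
        'UNSIGNED-PAYLOAD',
    ])
    string_to_sign = '\n'.join([
        'AWS4-HMAC-SHA256',
        amz_date,
        scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    def sign(k, msg):
        return hmac.new(k, msg.encode(), hashlib.sha256).digest()
    # key derivation: date -> region -> service -> "aws4_request"
    k_signing = sign(sign(sign(sign(('AWS4' + secret_key).encode(),
                                    datestamp), region), 's3'), 'aws4_request')
    signature = hmac.new(k_signing, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return 'https://%s/%s?%s&X-Amz-Signature=%s' % (
        host, urllib.parse.quote(key), query, signature)
```

The client side then needs no AWS SDK at all, just fetch(url, {method: 'PUT', body: file}); the bucket's CORS rules must still allow PUT, as above.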
So, that is it. I hope this helps others. Lucas
---
You received this message because you are subscribed to the Google Groups
"web2py-users" group.