Hi,
I have an issue with the upload form and postbacks. In my application I
need to validate the file contents before accepting a user upload, so
naturally I try to use the "onvalidation" callback for it. The table used
in the form is defined like this:
db.define_table('input_data',
    Field('input_file', 'upload', autodelete=True,
          requires=IS_NOT_EMPTY(), label=T('Input file')),
    Field('output_type', default="xxx",
          requires=IS_IN_SET(['xxx', 'yyy']), label=T('Output format')),
    Field('original_name', writable=False, readable=False),
    Field('created', 'datetime', writable=False, readable=False),
)
My simplified action looks like this:
import datetime

def _validateFileForm(form):
    try:
        validator.validate(form.vars.input_file.file)
        # on success set the hidden fields
        form.vars.original_name = request.vars.input_file.filename
        form.vars.created = datetime.datetime.now()
    except Exception as ex:
        # if validation fails then display an error
        form.errors.input_file = T('file_validation_failed') + ': ' + str(ex)

def index():
    form = SQLFORM(db.input_data, submit_button=T("Upload"))
    if form.process(onvalidation=_validateFileForm).accepted:
        # save the file id in the session and go to the options page
        session.input_data_id = form.vars.id
        redirect(URL('options_odt'))
    return dict(form=form)
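For context, `validator` here is my own object; as a stand-in, a minimal content check of the kind it performs (names and the ZIP magic-number check are just an illustration, not web2py API) might look like this:

```python
# Hypothetical stand-in for validator.validate: accept only ZIP-based
# files (e.g. .odt) by checking the magic bytes at the start of the stream.
def validate(fileobj):
    header = fileobj.read(4)
    fileobj.seek(0)  # rewind so the framework can still store the file
    if header != b'PK\x03\x04':
        raise ValueError('not a ZIP-based document')
```

The point is only that the check reads from the open file object it is given and raises on failure, which the `except` branch above turns into a form error.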
Validation seems to work OK: if the file is valid I am redirected to the
next page, and if it is bad the error message is displayed in red under the
file upload field.
What is interesting is that when validation fails and the form is
re-displayed, it looks like the whole file content is sent back to the
client together with the form. For a large uploaded file this can be a lot
of data: I see long delays, and when I check the HTTP response headers I
see a number in the many megabytes there. Is there a way to avoid sending
the file contents back with the form during a postback?
Thanks,
Andy
--