On Jun 21, 2010, at 1:06 PM, Eugueny Kontsevoy wrote:
> I can think of 3 possible answers here: JS validators (jQuery has lots of
> plugins), smaller AJAX-y forms and, for more complex monolithic forms - fake
> models AKA form classes. I guess the reason I'm not a big fan of FormEncode
> is because it defaults to the 3rd approach. And, again, I myself don't have a
> better design/approach to suggest, without tying together html helpers and
> models. Perhaps it wasn't the best idea coming from me today to even start
> this discussion: I really have nothing constructive to contribute! ;-)
The first doesn't really work, because you can't trust the client, ever. The
second requires substantially more work for the AJAX round-trips as the form is
filled out, but still means that on the final form submit, they could've mucked
with the values, leaving only the last approach as a viable one for actually
knowing the data you received is appropriate for use in your app.
The reason tying form validation to a domain model really terrifies me is that it
encourages developers to design forms to match their database, since that is the
simplest thing for them to implement quickly. This means that regardless of what
is the *best experience for the end-user*, the developer will try to shoe-horn
the form into whatever is easiest to implement. I've used many of these
websites, and it's usually very obvious when this has happened; it generally
leads to a crappy user experience.
I do think there could be more elegant approaches than FormEncode though, and I
think Felix's pycerberus and Chris McDonough's Colander/Deform fill those niches
quite well. They all happen to look similar in many respects to the FormEncode
approach, which to me indicates they're all onto a great pattern to utilize.
This pattern is also rather transferable; for example, in one of my apps I can
do this:
@api(schema=NewPasteAPI())
def api_create(self, req, version):
    if self.json['language'] == 'guess':
        try:
            lex = guess_lexer(self.json['code']).name.lower()
        except ClassNotFound:
            lex = 'text'
        self.json['language'] = lex
    paste = Paste(**self.json)
    visible_id = paste.save()
    return {'status': 'success',
            'url': req.link('view_paste', id=visible_id, qualified=True)}
That is an exposed JSON API method that takes a JSON POST and returns JSON.
Since the decoded JSON is just a dict, it can be validated by a FormEncode
schema, which I supplied. That makes API methods that take input like this
much, much easier to write.
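To sketch the pattern in plain Python (this is a toy stand-in, not FormEncode's
actual API; FormEncode spells it as a Schema subclass whose fields are validator
instances, validated via to_python(). The field names below are guesses from my
example above, and TinySchema/NewPasteSchema/Invalid are names I've made up for
illustration):

```python
# Toy sketch of the schema pattern (NOT FormEncode's real API): a schema
# validates an incoming dict and coerces each value to the right Python
# type before any model code ever sees it.

class Invalid(Exception):
    """Raised when input fails validation."""

class TinySchema(object):
    # fields maps a field name to (coercion function, required?)
    fields = {}

    def to_python(self, data):
        clean = {}
        for name, (coerce, required) in self.fields.items():
            if name not in data:
                if required:
                    raise Invalid('missing field: %s' % name)
                continue
            try:
                clean[name] = coerce(data[name])
            except (TypeError, ValueError):
                raise Invalid('bad value for %s: %r' % (name, data[name]))
        return clean

# Hypothetical schema for the paste API above; field names are guesses.
class NewPasteSchema(TinySchema):
    fields = {
        'code': (str, True),
        'language': (str, False),
    }

clean = NewPasteSchema().to_python({'code': 'print 1', 'language': 'python'})
# clean == {'code': 'print 1', 'language': 'python'}
```

The point is that the schema lives entirely apart from any model, so the same
one can validate an HTML form POST or a JSON body.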
Also, as you can see, I do some extra operations based on the input (which I
need to be *valid before I can even use the model!*). Once you start creating
fake domain models for the purpose of form validation in Rails, you're back to
this approach. The main problem I've had with FormEncode is its clunky API and
sparse documentation, both of which I think Felix and Chris McDonough do a
great job of fixing.
I guess the question is: what exactly is the problem with making a form schema
to represent a form's valid values and how to coerce them into the appropriate
Python datatypes? I know the @validate decorator is a nasty mess; cleaning it
up is definitely on the roadmap for Pylons 1.1, possibly by removing it
entirely in favor of some helpers for a more elegant custom solution. Is it an
organization issue? Are people unsure where to put their form models?
I think streamlining this approach would be best, both for the flexibility and
because making it simple to build forms that provide the best user experience
sounds like a dang good goal.
P.S. - Yes, I've been reading too many user experience / interface books
lately. :)
Cheers,
Ben
--
You received this message because you are subscribed to the Google Groups
"pylons-discuss" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/pylons-discuss?hl=en.