I'll take a chance that the following thought hasn't already been expressed on these threads.
A lot of work has been done in somewhat-numerical realms to avoid getting stuck in local maxima. For example, you could throw a whole bunch of trial balloons into a room with a bumpy ceiling, see which ones end up highest, and artificially smooth out the cavities that the other balloons get stuck in. That way, the ceiling gradually gets smoother, and the highest trial balloons end up higher on later trials.

Simulated annealing works in a slightly different way ( http://en.wikipedia.org/wiki/Simulated_annealing ). You throw a bunch of trial balloons in and let them bounce around on the ceiling, so that they bobble out of the local maxima and tend to bounce up into higher and higher local maxima, hopefully including the global maximum. To make sure that they don't bounce out of the global maximum, you gradually decrease the bounciness of the balloons.

There have been attempts to find good designs through genetic algorithms and through simulated annealing. Both depend on being able to reward better designs, and you also have to have some idea of how to write down a specification for a design.

What we're missing in the kind of design we want to do is the specification language. What would you say in order to ensure that the right design elements are being included? That's what we need to find out from the users we interview in the initial discovery phase. What is the problem space in which we will be doing design? What concepts matter? What concepts are implied? What are the most significant relationships among the concepts, namely the ones that the users will want to follow? (I've got this, I want that, so I'm going to use this relationship to get there.)

This is why I feel strongly that a good conceptual framework is essential to defining the space in which to do design. The complementary method works also, but I'm just finding out about it.
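For anyone who hasn't seen it written out, the balloon analogy maps directly onto the standard simulated-annealing loop. Here is a minimal Python sketch; the objective function `bumpy`, the step size, and the cooling schedule are illustrative assumptions of mine, not anything from this thread.

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t0=1.0, cooling=0.995, iters=5000, seed=0):
    """Maximize f over one real variable by simulated annealing (sketch)."""
    rng = random.Random(seed)
    x, best = x0, x0
    t = t0
    for _ in range(iters):
        # Propose a nearby candidate (the balloon bobbles a little).
        cand = x + rng.uniform(-step, step)
        delta = f(cand) - f(x)
        # Always accept improvements; accept downhill moves with a
        # probability that shrinks as the temperature cools
        # (the balloons gradually lose their bounciness).
        if delta > 0 or rng.random() < math.exp(delta / t):
            x = cand
        if f(x) > f(best):
            best = x
        t *= cooling
    return best

# A bumpy "ceiling": many local maxima, with the global maximum at x = 0.
bumpy = lambda x: math.cos(3 * x) - 0.1 * x * x

print(simulated_annealing(bumpy, x0=4.0))
```

Starting from x = 4.0, plain hill-climbing would stop at the nearest bump; the cooling acceptance rule is what lets the search escape it.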
You put up a candidate design, find out what concepts and relationships are missing, and revise the design. Because you're changing the framework as you go along, I don't think that traditional optimization ideas transfer very well to this sort of design.

A second point about this complementary method. Revising the design requires two steps: adding the concepts and relationships, and figuring out how to build a new design that incorporates them. The new concepts and relationships may be disruptive to the existing design, so in the ideal case you hope that you added the most fundamental concepts first. If you didn't, you may have a radical re-design on your hands. Even if you did, some of the things you inferred in your first design may need to be undone so that the new concepts and relationships can be incorporated as thoroughly as if they had been considered from the start.

Best wishes,

Bruce Esrig

At 06:49 PM 1/2/2008, Nick Iozzo wrote:

>A great discussion appears to be happening under the original topic. I
>wanted to further focus it into something I have been thinking a lot about
>recently. So I have hijacked it and created this new thread. My apologies
>to Oleh.
>
>I find all of the applications coming out to do quick real-time testing of
>alternate design solutions a great tool to have in our toolbox. But it is
>just another tool, not a replacement. I see this as a complement (not a
>supplement) to traditional user research and testing.
>
>If you were to just use these tools to iterate and improve your design, I
>am sure you will be able to get the best design possible considering the
>point you started from. Maybe, however, if you started with a very
>different design, your end iteration would perform even better. So, if the
>only tool you used was a micro-design evolution approach, then you may end
>up not with the best design possible, but with the best design considering
>the point you started from.
>(In mathematical terms, a local maximum:
>http://en.wikipedia.org/wiki/Local_maximum).
>
>So how do you get the very best design possible?
>If you had all the time and money you wanted, you would build the best
>designs you could dream of, covering as much of the design space as
>possible. You would send small but significant traffic to each and keep
>tweaking each design until it was the best it could be. Eventually designs
>would begin to drop out because other designs were performing better.
>Finally, when you are left with a single design performing better than the
>rest, you know you have the best design for that design space.
>
>Of course we have the real-world issue of making bad first impressions on
>users (not to mention the unlimited-money part). So we cannot follow a
>model like this.
>
>Instead, what we can do is develop prototypes of the best design
>possibilities we can dream up. We run small samples of users through these
>prototypes, and we see which performs the best and which appears to have
>the best potential upside and the most design flexibility. It may not be the
>design all the users said they liked the best. (What a user says and what
>they actually think are very different things.)
>
>My opinion (in summary):
>These analytical tools that are coming out to do micro-evaluations of
>design options are great. They will help you climb that hill to get your
>design to the best it can be! But they will not help you pick the tallest
>hill to climb. It is too expensive to try to climb them all, so you need a
>low-cost method to explore and evaluate the potential of each... a method
>like low-fidelity prototypes with usability test evaluations.
>
>We worked with one of our clients, Orbitz, to do a presentation on this
>subject at Gartner's web innovation summit:
>http://agendabuilder.gartner.com/WEB1/WebPages/SessionDetail.aspx?EventSessionId=833.
>
>Its spin was focused more on the business topics of the issue and less on
>the design and design-process side of things.
>
>Nick Iozzo
>Principal User Experience Architect
>
>tandemseven
>
>847.452.7442 mobile
>
>[EMAIL PROTECTED]
>http://www.tandemseven.com/
>________________________________________________________________
>*Come to IxDA Interaction08 | Savannah*
>February 8-10, 2008 in Savannah, GA, USA
>Register today: http://interaction08.ixda.org/
>
>________________________________________________________________
>Welcome to the Interaction Design Association (IxDA)!
>To post to this list ....... [EMAIL PROTECTED]
>Unsubscribe ................ http://www.ixda.org/unsubscribe
>List Guidelines ............ http://www.ixda.org/guidelines
>List Help .................. http://www.ixda.org/help
