Mykola,
 
There is a very thin line in the process between methodology and outcome. More often than not, "how to think" becomes "what to think". I am merely making the point that creative people should choose tools that automate the labor rather than the thinking itself - even if the latter is easier.
 
    Let us move this to an unrelated area - one I actually witnessed recently.
 
    Watch a high-tech medical doctor doing a diagnosis these days. He asks a series of questions, enters the answers into a computer, and prescribes drugs / reads off a diagnosis generated by an "expert system". From the practice owner's perspective he does what he is supposed to do, minimizing insurance and other risks of possible ineffectiveness - in a very predictable, easy-to-schedule process. Best of all, he is easily replaceable. And yes, the process is very clean, repeatable (assuming the patient's spec, i.e. his ability to diagnose himself, does not improve/change), guaranteed to cover everything (the patient told you he cannot think of anything else, and neither can or should the doctor), supported by unit testing of a few key parameters like blood tests, etc. Furthermore, you can scale it, separating the question-asking process from data entry from diagnosis sign-off, making the process as formal as it could and arguably should be.
 
    And yet some people who have an unexplained mistrust of "service networks", or who have historically been in the "process" too long, opt for the "old way" - with custom-tailored diagnostics, fewer drugs, non-traditional treatments - all based on experience / a consensus of contradictory methodologies. There are huge disadvantages in managing these people and this process!
 
    It is a matter of preference; everything else follows. I have seen extremely capable programmers fall in love with patterns/frameworks. That coincided with the Internet revolution, when instead of "surgical teams" of 5-6 people they had to lead 50-60 "bodies" in rapidly growing bodyshops. Most of them reverted to the previous model after the bust; however, the structures they created stay on the pattern/framework path - the only way they know, and one that undeniably works. It's a big world with enough places for everyone, as long as "methodology" is not enforced as a religion or such.
 
        Next, testing methodology. Its basis was actually explored in the '70s, before "structured programming" came around (and that was before "object-oriented" programming). At that time the two most popular coding methodologies were the "full permutation" and "what if not" approaches. All possible cases were supposed to be enumerated and coded with "if" (framework 1) or "if not" (framework 2) statements. The research on which structured programming was founded showed that good programmers would write those "if" statements correctly in 51% of the cases, and bad programmers would get 49% right. And of course there is Murphy's law: "the only case that happens in reality is the one not covered". One of the outcomes of structured programming is that reliable programs have to contain the least possible number of "if" statements (preferably 0) - hence "structured" - so the whole basis of those frameworks was essentially denounced. (Of course, structured programming was in turn denounced by OO, and so on.) The point being that there are limitations to unit testing, and they were already reached in the '70s.
 
    Unit testing is more of an assurance/progress-management tool for management than anything else. It does catch crude bugs (like refactoring errors within a relatively small unit) - the ones based on negligence and the human factors listed in the "if" methodology. It does not solve the bigger problem that manifests itself in large-scale apps. You need a system test on the largest possible user base as a real measurement - and to make sure the "all permutations" approach will work on an event-driven, really RICH application, you need a disproportionate amount of testers/time. While I would make a strong case for having at least 2 testers per developer on large systems - preferably with no programming or subject-matter experience - in reality the testing of large systems has not been guaranteed by any means. It takes automated bug reporting, maintenance teams, time, effort, commitment, money - and I would strongly recommend automated bug tracking from real users in the software over any other method of collecting information.
 
Finally, refactoring in AS. Let us look at all the places in recent Java programs that use hash maps. Let us make sure the names of the objects are not known in advance and are not in constants. Now wrap each use of a value returned from a hashmap in a try/catch block. How comfortable is that? Will unit tests catch all these errors? Does the need for a better "hashmap" seem more plausible?
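The hashmap hazard above can be sketched in plain Java. This is a made-up example (the key names are mine, not from any real program): the write side and the read side disagree on a runtime string key after a rename, the compiler stays silent, and only the read path failing at runtime reveals the drift.

```java
import java.util.HashMap;
import java.util.Map;

public class HashMapDrift {
    // Write side: the key is a plain runtime string, not a shared constant.
    static Map<String, Object> buildContext() {
        Map<String, Object> ctx = new HashMap<>();
        ctx.put("customerName", "ACME");   // imagine this was recently renamed from "custName"
        return ctx;
    }

    // Read side: still asks for the old key; compiles cleanly, returns null at runtime.
    static String readCustomer(Map<String, Object> ctx) {
        return (String) ctx.get("custName");
    }

    public static void main(String[] args) {
        Map<String, Object> ctx = buildContext();
        System.out.println(readCustomer(ctx));   // prints "null" - no compiler error, and a unit test of the write side alone passes
    }
}
```

A test that exercises only `buildContext` would stay green; the bug lives in the seam between the two sides, which is exactly where string-keyed lookups hide refactoring errors.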
 
 
Sincerely,
Anatole
 
----- Original Message -----
Sent: Friday, November 18, 2005 3:33 AM
Subject: Re: [flexcoders] Re: Cairngorm is bad?

Anatole,

I do not agree that refactoring server-side code is a big problem. Sure, if you have no unit tests for the RIA client, you will not be able to detect that something is broken when you change the server interface. But in the agile methodology that is most popular now, you always have to have tests. And it is not impossible to write them for an RIA client.

Another thing that can help is code generation. In our application we completely generate all the Flex-to-Java interaction code (business delegates for all Java services, and value objects), and we also try to use strong typing as much as possible. So if you change the server interface, your RIA code will not compile in most cases.
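The generated delegates are not shown in the thread, but a minimal hypothetical sketch of the idea might look like the following: a generated value object plus one typed delegate method per remote service call, so that regenerating after a server interface change breaks every stale call site at compile time. All names here (CustomerVO, CustomerDelegate) are illustrative, not from any real generator.

```java
public class GeneratedDelegateSketch {
    // Generated value object mirroring the server-side Java class.
    static class CustomerVO {
        final int id;
        final String name;
        CustomerVO(int id, String name) { this.id = id; this.name = name; }
    }

    // Generated delegate: one typed method per remote service method.
    // If the server method gains a parameter or changes its return type,
    // the generator re-emits this signature and every call site stops compiling.
    static class CustomerDelegate {
        CustomerVO getCustomer(int id) {
            // Real generated code would invoke the remote service; stubbed here.
            return new CustomerVO(id, "stub");
        }
    }

    // Example call site: fully typed, so drift is a compile error, not a runtime surprise.
    static String describe(int id) {
        CustomerVO vo = new CustomerDelegate().getCustomer(id);
        return vo.id + ":" + vo.name;
    }

    public static void main(String[] args) {
        System.out.println(describe(42));
    }
}
```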

I agree with you that bright people can work without frameworks, but the code they produce will not be quite as maintainable. One of the main benefits of frameworks is that they make the code cleaner for other people who know the framework, though they do place some bounds on developers. Also, I think any serious project should be built using a single approach. By approach I mean coding conventions, the framework, and the style of framework usage. Each framework can be used in multiple ways, but to succeed you have to choose your own. I have done so many times, and in most cases we delivered on time.

--
Best Regards,
Mykola

On 11/17/05, Anatole Tartakovsky <[EMAIL PROTECTED]> wrote:
Mykola,
    I just want to clarify the point I was making about refactoring and the heterogeneous environment. There is a huge difference between an RIA and a desktop application. It is called a "distributed heterogeneous" application. Refactoring the back-end Java code will not cause compilation notifications for missing/changed fields/methods in the UI - they have to be found via global search and (in larger apps) through testing - and can easily slip into production, especially if done in the maintenance phase. The same applies to changes in the UI - it is very easy to get out of synchronization with the back-end.
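The Flex client reaches a Java method essentially by name, so the coupling behaves like a string-based call. A pure-Java sketch using reflection (service and method names are invented) shows why a back-end rename produces no compile error on the calling side:

```java
import java.lang.reflect.Method;

public class StringCoupledCall {
    // Back-end service; suppose getBalance was just renamed to fetchBalance.
    public static class AccountService {
        public double fetchBalance() { return 100.0; }
    }

    // The "client" holds only the method name, exactly as a remote UI would.
    static Object invokeByName(Object service, String methodName) {
        try {
            Method m = service.getClass().getMethod(methodName);
            return m.invoke(service);
        } catch (ReflectiveOperationException e) {
            return "NoSuchMethod";   // the break surfaces only at runtime
        }
    }

    public static void main(String[] args) {
        // Still compiles with the stale name; fails only when actually exercised.
        System.out.println(invokeByName(new AccountService(), "getBalance"));
    }
}
```

The stale call site compiles, passes any test that never exercises it, and only fails when a user hits that path - which is the "slips into production" scenario above.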
 
    Here is a simpler example for desktop application developers. Let us say you make changes to a heterogeneous piece of the application - a SQL table used by the application - changing column names and datatypes. The compiler (unless it is integrated with a pre-compiler/code generator) is not going to catch those and prompt you that your queries are no longer correct. You can start getting random errors based on buffer overruns due to field lengths, etc.
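A minimal sketch of the SQL case, with an invented schema: the query lives in a string the Java compiler never parses, while a renamed field on a plain value object is flagged at every typed use immediately.

```java
public class UncheckedSql {
    // Suppose the table was refactored: column cust_name is now customer_name.
    // This string still compiles - the Java compiler never inspects SQL text.
    static final String STALE_QUERY =
        "SELECT cust_name, credit_limit FROM customers WHERE id = ?";

    // By contrast, renaming a field on a value object breaks every typed
    // reference at compile time, so the compiler does the global search for you.
    static class Customer {
        String customerName;   // renamed from custName: old uses would not compile
    }

    public static void main(String[] args) {
        System.out.println(STALE_QUERY);   // happily ships the stale column name
    }
}
```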
 
    While programming is definitely fun, retesting a large application after each change is definitely not. Writing test cases that anticipate future enhancements is not possible; adding them along with the change is not a reliable solution but rather a desperate measure to plug the obvious hole.
 
    I am aware of 2 approaches to reducing the impact of refactoring (while hopefully reducing coding as well):
1. usage of higher-level "model" objects that encapsulate communications as well - i.e. "super" dataProviders that are aware of retrieval/update methods, maintain state, provide diagnostics, and can be submitted as part of a bigger transaction - just a few of the features to be dealt with in any application. Such "models" can also greatly simplify the coding of datagrids, forms, and other databound controls
2. use of code generators included as part of the "build" process. You can produce "proxy" objects that facilitate strong type checking in either Java or Flex. They can improve the performance of the software, as the "guesswork" is taken out of both the client and the flashgateway software. You can also generate UI resources so that initial form painting is less time-consuming.
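Neither approach is shown in code in the thread, so here is a hypothetical minimal sketch of approach 1: a "model" object that owns its retrieval logic, queues updates, and reports dirty state so it can join a larger transaction. The names and the Supplier-based retriever are my assumptions, not the author's design.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class ModelSketch {
    // Minimal "super dataProvider": owns retrieval, queues changes, reports state.
    static class Model<T> {
        private final Supplier<List<T>> retriever;          // encapsulated fetch logic
        private final List<T> items = new ArrayList<>();
        private final List<T> pendingChanges = new ArrayList<>();

        Model(Supplier<List<T>> retriever) { this.retriever = retriever; }

        void refresh() { items.clear(); items.addAll(retriever.get()); }
        void update(T item) { pendingChanges.add(item); }   // queued, not sent
        boolean isDirty() { return !pendingChanges.isEmpty(); }
        List<T> submit() {                                  // hand the batch to a bigger transaction
            List<T> batch = new ArrayList<>(pendingChanges);
            pendingChanges.clear();
            return batch;
        }
        int size() { return items.size(); }
    }

    public static void main(String[] args) {
        Model<String> customers = new Model<>(() -> List.of("ACME", "Initech"));
        customers.refresh();
        customers.update("ACME v2");
        System.out.println(customers.size() + " rows, dirty=" + customers.isDirty());
    }
}
```

A databound grid or form would talk only to the model, so changes to retrieval/update plumbing stay inside one object instead of being scattered across the UI.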
 
I use both approaches - actually one on top of the other - to reduce the amount of "manual" code. That drives the application toward using high-level objects instead of a framework - and I personally like it.
 
However, I would say that for every one person who uses these approaches there are 10 people using frameworks. It comes down to project management: small teams of highly trained specialists vs. large development teams uniformly trained in a framework, with a predictable (while long) development schedule and replaceable resources. If you foresee the lifecycle of the project being completed by shrinking the team down to its better technology people while ensuring quality across the modules (and the company budget allows for that), then the framework-less approach is the way to go. If you anticipate retaining a maintenance crew of less creative people who would prefer to do "spot" fixing, then low-level code and a framework might be the better choice.
 
Here is a very simple article on abstractions that is also applicable to refactoring.
Toward the end, Joel gives his estimate of what using code generators or frameworks amounts to:

"The law of leaky abstractions means that whenever somebody comes up with a wizzy new code-generation tool that is supposed to make us all ever-so-efficient, you hear a lot of people saying "learn how to do it manually first, then use the wizzy tool to save time." Code generation tools which pretend to abstract out something, like all abstractions, leak, and the only way to deal with the leaks competently is to learn about how the abstractions work and what they are abstracting. So the abstractions save us time working, but they don't save us time learning.

And all this means that paradoxically, even as we have higher and higher level programming tools with better and better abstractions, becoming a proficient programmer is getting harder and harder."

 
Sincerely,
Anatole Tartakovsky
917-304-3381
  
   





--
Flexcoders Mailing List
FAQ: http://groups.yahoo.com/group/flexcoders/files/flexcodersFAQ.txt
Search Archives: http://www.mail-archive.com/flexcoders%40yahoogroups.com



