Re: studies of naming?

2012-03-29 Thread John Daughtry
Steven,

Given that many programmers have been trained to believe that opportunism
is bad, it is also likely that observation of programming evokes
non-opportunism. For example, in Rosson and Carroll's "Reuse of Uses"
paper, the subjects exhibited opportunism with respect to using an API, but
reflected on the behavior as something that they shouldn't have done
because it wasn't the way code should be written. They are successful and
efficient when they adopt an opportunistic strategy, yet they are taught
that it is bad behavior. Have any thoughts?

I often hear people talking about opportunistic programmers as if this
strategy is a persistent individual difference. Is there any proof for the
assumption of persistence?

I have always found it odd the way we talk about opportunism as being an
alternative to, for example, strategic programming. I find it more
commensurable with my observations to consider opportunism the default
state, and that other approaches, at times and to various degrees,
overload our default opportunistic behaviors. The questions and
descriptions shift slightly under such a view, but it seems more
appropriate given every programmer I have ever observed.

I realize that in this entire email I may just be hitting on the conflict
between personas and a descriptive theory of programming styles; personas
are distinct and immutable for a reason.

John




On Thu, Mar 29, 2012 at 9:28 AM, Steven Clarke
steven.cla...@microsoft.com wrote:

   I'm very aware that there is a lack of respect and understanding of
 these different programming styles. That doesn't mean we should cast out
 these programmers. My colleagues and I spend a lot of time observing
 professional programmers.  And many of those programmers exhibit the
 opportunistic workstyle. A large number of them earn a good living writing
 code this way.

 One of the reasons discussion lists such as this one exist is to discuss
 how to accommodate different approaches to programming. I'd rather figure
 out ways we can improve their development experience.

 Note that our study was focused on API design, not language design. Note
 also that optional and named parameters were not available at the time we
 did the study.
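
 To make the pattern concrete (this sketch is mine, not from the study; the
 class, function, and parameter names are invented), here is the
 create-set-call style next to the same capability expressed with optional
 named parameters, in Python for brevity:

```python
# Hypothetical sketch (not from the study): the create-set-call usage pattern
# versus the same capability with optional named parameters.

class CsvReaderCreateSetCall:
    """Consumed create-set-call style: construct, set properties, then call."""

    def __init__(self):
        self.delimiter = ","        # defaults, overridden by property sets
        self.skip_header = False

    def read(self, path):
        with open(path) as f:
            rows = [line.rstrip("\n").split(self.delimiter) for line in f]
        return rows[1:] if self.skip_header else rows


def read_csv(path, delimiter=",", skip_header=False):
    """Same capability via optional named parameters: one call, no mutable
    intermediate state between construction and use."""
    with open(path) as f:
        rows = [line.rstrip("\n").split(delimiter) for line in f]
    return rows[1:] if skip_header else rows
```

 With named parameters the caller writes one expression, where the
 create-set-call consumer needs a construction plus two property assignments
 before the call that does the work.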

 It certainly wasn't our intent to claim that the 'create set call' pattern
 is preferred by all opportunistic programmers compared to every design
 pattern that ever has and ever will exist.

 Steven
  --
 From: Richard O'Keefe
 Sent: 29/03/2012 07:55
 To: Steven Clarke
 Cc: John Daughtry; Brad Myers; Raoul Duke; Ppig-Discuss-List

 Subject: Re: studies of naming?


 On 29/03/2012, at 3:39 AM, Steven Clarke wrote:
 
  We don’t have the luxury of dismissing these types of programmers. While
 it might strike you with terror that these programmers exist, they are
 successfully building applications in many different domains. They may work
 differently to you and many other programmers but that doesn’t necessarily
 mean that the code they create is worthless.

 Poke around on the web and you will stumble across other people saying
 things like "the one thing that frightens me is cut-and-paste programmers"
 and "As a programmer, boilerplate scares me, because it is prone to errors
 that don't get noticed." Repeated code is an occasion for errors, because
 the code that the boilerplate calls may change. It's easy to miss
 correcting one of the many repetitions.

 Boilerplate prevention happens when designing a programming interface or
 language. It's hard, though. Certain kinds of boilerplate become idiomatic.
 (Analogy from English: "for all intents and purposes.") I catch myself not
 even noticing them. It's hard to revise the code constantly, looking for
 opportunities to strip away unnecessary repetition -- especially when
 working in a lower-level programming language.
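
 As a concrete (and invented) illustration of the repetition risk described
 above: three call sites each repeated the same validate-then-log dance, and
 stripping it into one helper means a change to the shared policy cannot miss
 a copy.

```python
# Hypothetical example of stripping away repetition.  Before the refactor,
# each setter duplicated the same validate-then-log boilerplate; if the
# policy changed, it was easy to miss one of the copies.

import logging

def _checked(name, value):
    """Single home for the formerly repeated policy: validate, log, return."""
    if value < 0:
        raise ValueError(f"{name} must be non-negative, got {value}")
    logging.debug("accepted %s=%r", name, value)
    return value

def set_width(w):
    return _checked("width", w)

def set_height(h):
    return _checked("height", h)

def set_depth(d):
    return _checked("depth", d)
```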

 There are sound reasons why a *platform* like .NET needs to support
 everybody and his dog (and their lemons).  It's not so clear that it is
 necessary for a *language* to have to cater to all personalities and
 workstyles.
 Necessary?  It's not even clear to me that a language *can* cater equally
 well to all personalities and workstyles.

 One issue, of course, is that if a constructor has so many arguments that a
 cut-and-paste programmer is worried about all the dots, it has _too many_
 arguments.  My Smalltalk compiler, for example, only accepts up to 15
 arguments, which is all the ANSI Smalltalk standard requires.  No code I've
 written has ever got past eight parameters.

 I wonder just how 'successful' these applications you speak of really are.
 Or more precisely, I wonder just which of the quality attributes they
 actually possess.  That is, of course, another study.



  Within the Visual Studio team at Microsoft we’ve devoted efforts to
 attempting to make them successful by adapting to their workstyles when
 appropriate.
 
  There are a few blog posts and papers that describe these personas in
 more detail that might

Re: studies of source code line purposes?

2011-10-06 Thread John Daughtry
With respect to the UI portion...
Here is a paper by Myers and Rosson from 1992:
http://dl.acm.org/citation.cfm?id=142789




On Thu, Oct 6, 2011 at 6:58 PM, Raoul Duke rao...@gmail.com wrote:

 hi,

 along the lines of a thought i've had of late ("why is there so much
 *code*!?"), i wonder if anybody has tried to study programs and
 categorize the source code into purposes, so we could get a feel for
 what kinds of complexities are all mixed up in our systems? i have
 done cursory google searching, but it didn't immediately turn up
 anything obviously appropriate based on titles. i.e. i'm wondering how
 much code can be attributed to:
 * wiring data through, components together.
 * control flow.
 * managing namespaces.
 * dealing with errors of all ilk, those can be categories in and of
 themselves.
 * user interface.
 * concurrency.
 * typing.
 * yadda yadda yadda
 there's probably a fair bit of overlap in the semantics/purposes/goals
 of those, making categorization more nuanced?

 thanks and apologies if i missed an obvious paper!
 sincerely.

 --
 The Open University is incorporated by Royal Charter (RC 000391), an exempt
 charity in England & Wales and a charity registered in Scotland (SC 038302).




Re: Call for advice, and possible case study?

2011-06-10 Thread John Daughtry
My two cents...

Use whatever route to Java proficiency the instructor happens to feel most
passionately about. If they really believe the argument that
"INSERT_LANGUAGE_HERE first" works well, then let them do it that way. If
they feel strongly that they can teach Java first, let them do it that way.

The primary component of a curriculum is the instructor and their belief in
and excitement for what they are doing. Someone who believes in teaching Java
first and is excited about doing it that way (and slightly skeptical of
Python) will have much greater success teaching Java first as opposed to
Python then Java.

If the instructor is apathetic about the approach, then fire the instructor
and hire another who cares enough about the topic to have opinions (even if
those opinions are tenuous).

No matter the route to learning Java, the final learning objectives are the
same for this company. They know what they think they want employees to
know. The various routes to getting there each have positive and negative
trade-offs. But, they are a moot point without a decent instructor who
believes in the students, the learning objectives, and the route by which
the learning objectives are accomplished within the context of the
curriculum.

There are countless papers over decades on why one approach is better than
another. Their utility isn't in actually finding the right method, but
instead as a mechanism for instructors to find the method that works for
them given their interests, strengths, abilities, and experience.

As a simple example, one instructor may be fantastic at the Socratic method
while another cannot do it at all. Likewise, one may be really good at
explaining the relevance and utility and history of the Java syntax as a
teaching instrument in itself, while another hates it so much that they
cannot make themselves be excited about it.

John Daughtry




On Fri, Jun 10, 2011 at 3:07 PM, Russel Winder rus...@russel.org.uk wrote:

 On Fri, 2011-06-10 at 19:47 +0100, Stasha Lauria wrote:
  I fully agree on both:
 
  1-  Don't teach Java.
 
  2-  before learning _Java_, it pays to learn something about
 _programming_, and that's definitely easier using Python than using Java.
 
  This is based on my personal experience of teaching programming to First
 year undergraduate students.

 Graham Roberts at UCL is using Groovy and Sarah Mount at Wolverhampton
 and James Shuttleworth at Coventry are using Python to great effect.
 The folk at Leeds are using Python also I believe.

 The big problem though is the issue of type.   Believers in static
 strong typing will object to the use of languages that work with dynamic
 typing even though learners seem to find it easier to do things without
 having to worry about types in the first instance.  I guess someone
 somewhere needs to do some experimentation rather than there just being
 anecdotal evidence and advocacy research?

 --
 Russel.

 =
 Dr Russel Winder  t: +44 20 7585 2200   voip:
 sip:russel.win...@ekiga.net
 41 Buckmaster Roadm: +44 7770 465 077   xmpp: rus...@russel.org.uk
 London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder






Re: evalutation of new tools to teach computer programming

2011-03-01 Thread John Daughtry
I would suggest taking a more holistic view of the design space. Rather than
asking which tool is best, you may be better served by seeking to
empirically describe and explain the underlying trade-offs. In what ways does
option 1 help, hinder, and undermine learning? In what ways does option 2
help, hinder, and undermine learning? In all likelihood, there are answers to
all six questions.

John
--
Associate Research Engineer
The Applied Research Laboratory
Penn State University
daugh...@psu.edu



On Tue, Mar 1, 2011 at 7:08 AM, Thomas Green green...@ntlworld.com wrote:

 Depending on your aims, you might want to measure transfer to other
 problems: that is,  do participants who used tool A for the sorting task,
 then do better when tackling  a new problem, possibly with a different tool,
 than participants who used tool B?

 You might also want to look at memory and savings: how do the participants
 manage two months later? Occasionally cognitive tasks like yours show no
 effect at the time but produce measurable differences when the same people
 do the same tasks later.

 Pretty hard to create a truly fair test, but things to think about are
 controlling for practice and order effects, which should be easy, and
 controlling for experimenter expectation effects. The hardest thing to
 balance for is sometimes the training period: people using a new tool have
 to learn about it, and that gives them practice effects that the controls
 might not get. Sometimes people create a dummy task for the control
 condition to avoid that problem; or you can compare different versions of
 the tools, with differing features.

 I suggest you try to avoid the simple A vs B design and instead look for a
 design where you can predict a trend: find A, B, C such that your theory says
 A > B > C. The statistical power is much better.

 Don't forget to talk to the people afterwards and get their opinions.
 Sometimes you can find they weren't playing the same game that you were.

 Good luck

 Thomas Green




 On 1 Mar 2011, at 11:20, Stefano Federici wrote:

  Dear Collegues,
 I need to plan an evaluation of the improvements brought by the usage of
 specific software tools when learning the basic concepts of computer
 programming (sequence, loop, variables, arrays, etc) and the specific topic
 of sorting algorithms.

 What are the best practices for the necessary steps? I guess the steps
 should be: selection of a test group, a test of initial skills, partition of
 the test group into smaller homogeneous groups, delivery of learning
 materials with or without the tools, a test of final skills, and comparative
 analysis.

 What am I supposed to do to perform a fair test?

 Any help or reference is welcome.

 Best Regards

 Stefano Federici
 -
 Professor of Computer Science
 University of Cagliari
 Dept. of Education and Philosophy
 Via Is Mirrionis 1, 09123 Cagliari, Italy
 -
 Tel: +39 349 818 1955 Fax: +39 070 675 7113




 73 Huntington Rd, York YO31 8RL
 01904-673675
 http://homepage.ntlworld.com/greenery/






Re: Intuitiveness of programming languages/paradigms

2009-11-24 Thread John Daughtry
Maybe it is time to rewrite K-12 math books to be in line with computational
processing.
It would be great. Imagine dad at the dining room table trying to explain to
little Johnny: "100 Cheerios are here, and if we add another, we have -100
Cheerios."
Sorry, I couldn't help myself.

With respect to such problems, I spent the usual amount of time in college
studying various complexities in arithmetic on computers. Yet, I have only
seen problems crop up three times over 10 years of full-time programming
experience. For 'most' programmers, the complexities of computational
arithmetic don't impact us on a daily basis. And, when these problems occur,
they often result in obviously wrong results (as opposed to believable
results).

Thus, if one were to study an aspect of this problem, I would think it would
be better to focus on a higher-impact issue (not that arithmetic doesn't
have worth).




On Tue, Nov 24, 2009 at 4:35 PM, Richard O'Keefe o...@cs.otago.ac.nz wrote:


 On Nov 24, 2009, at 9:35 PM, Derek M Jones wrote:

  Brad,

  "like i said, i'm not sure intuition exists"

 What's quite certain is that *claims* of intuitiveness exist.


 But do they only exist as a reason for justifying the use of
 one particular language?


 I don't think so.  For one thing, in the recent thread that got me
 started on this, other people were recommending a whole range of
 programming languages (Java, C#, Python, AWK, even PERL).  For
 another, when people try to justify one particular language, there
 are lots of other reasons they can and usually do offer.

 I believe that when people say things like "imperative programming
 is more intuitive than [whatever]" they mean _at least_ the
 following things:
  1 I learned imperative programming with only a modest amount
   of trouble or no trouble at all.
  2 I was able to transfer what I learned to other imperative
   languages with little or no trouble.
  3 I find [whatever] much harder to understand.
  4 I know a lot of other people who feel the same.
  5 I do NOT know many people (or even any at all) who came from
   [whatever] to imperative programming and found it hard to
   understand.
  6 The experienced difficulty of [whatever] is not a defect in
   us or our education but a defect in [whatever].

 For the speakers, 1-5 are facts and 6 is felt to be justified by
 those facts.  The possibility of selection bias (people who would
 have been more comfortable learning Haskell or Miranda first
 very seldom get the chance, and leave the field, so we never get
 to hear their opinions about intuition and programming languages)
 is rarely considered.


  How could students tell the difference between having problems
 programming and having problems using a particular kind of
 language?  Perhaps this distinction is not important, they
 could simply try another approach and see if it makes any
 difference.


 That's indeed an operational way of telling the difference.

 That suggestion of mine was not just a half-baked idea, it was
 just set out in the sun for a minute or two.  It would not be
 easy to set up or administer.

 By the way, there's a service paper here for people who want to
 be surveyors.  It covers trigonometry, statistics, a couple of
 other topics, and some programming.  The surveying department
 insisted that the language taught be Visual Basic (more precisely,
 Visual Basic for Applications, inside the Excel spreadsheet).
 I was the only computer science lecturer willing to be involved
 with it.  I only have five one-hour lectures to teach the
 elements of programming.

 I *KNOW* the thing is impossible.

 I spend one lecture explaining that and why computer arithmetic
 does not behave the way they expect arithmetic to behave, for
 example that you can find a number X such that X + 1 = X,
 and numbers X, Y, Z such that X+(Y+Z) differs from (X+Y)+Z,
 and numbers X, Y both different from zero such that X*Y = 0.
 I spend half of another lecture telling them that they need to
 write down what their functions are supposed to do and to TEST
 their functions to make sure that they do.
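
 The three surprises above can be checked directly with IEEE 754 doubles
 (Python floats here; the particular values are my own illustration, not the
 lecture's):

```python
# Each assertion demonstrates one of the "unintuitive" laws of computer
# arithmetic; any sufficiently large or small magnitudes show the same effects.

x = 1e16
assert x + 1 == x            # adding 1 is absorbed: the next representable
                             # double after 1e16 is 2 away

a, b, c = 0.1, 0.2, 0.3
assert (a + b) + c != a + (b + c)   # addition is not associative in floats

tiny = 1e-200
assert tiny != 0 and tiny * tiny == 0.0   # underflow: nonzero * nonzero == 0
```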

 When you stop and think about it, computer arithmetic is
 *stunningly* unintuitive, IF your intution is based on the
 laws of whole numbers and fractions learned at school and the
 laws of the real and complex numbers learned in first year
 mathematics at university.

 I wonder if the question of intuitiveness could be studied
 at the level of arithmetic rather than programming as a whole.
 For example, Smalltalk counts as OO-imperative, but has
 bignum and ratio arithmetic built in and standard:  6/4 gives
 the answer 3/2, not 1.5.  Java _has_ bignum arithmetic, but
 doesn't let you use ordinary notation with it.  And so on.
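
 Smalltalk itself can't run here, but Python's fractions.Fraction shows the
 same exact ratio arithmetic the paragraph describes:

```python
from fractions import Fraction

# Analogue of Smalltalk's built-in ratio arithmetic: 6/4 reduces to the
# exact ratio 3/2 rather than rounding to the float 1.5.
r = Fraction(6, 4)
assert (r.numerator, r.denominator) == (3, 2)

# The exactness is the point: as floats, 0.1 + 0.2 != 0.3, but as ratios
# the schoolbook law holds.
assert Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10)
assert 0.1 + 0.2 != 0.3
```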




Re: static and topicality

2009-08-07 Thread John Daughtry
Chris brings up an interesting point... the underlying strategy. I think the
strategic intention of the usage is more interesting than the pattern by
itself.

Take these strategies for example (off the top of my head):
  1. Use static whenever a method doesn't access attributes
  2. Use static only in utility classes that don't fit the OO metaphor (e.g.,
java.lang.Math)
  3. Use static (naively) to satisfy the compiler (Chris Bogart's example)
  4. Use static only when required (e.g., singletons)
  5. Always use static until the need for an OO approach in a module is
apparent
  6. Never use static because it is inherently evil (along the lines of
global variables)
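
As a sketch of strategy 1 (the class is invented, and Python's @staticmethod
stands in as an analogue of Java's static, since the distinction at issue is
the same: a method that touches no instance state):

```python
# Strategy 1: a method that reads instance state stays an instance method;
# a pure function of its arguments is marked static.

class Temperature:
    def __init__(self, celsius):
        self.celsius = celsius          # instance state

    def in_fahrenheit(self):
        # reads self.celsius, so under strategy 1 it remains an instance method
        return self.celsius * 9 / 5 + 32

    @staticmethod
    def c_to_f(celsius):
        # no access to self: under strategy 1, make it static and callable
        # without an instance
        return celsius * 9 / 5 + 32
```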

What are the strategies being used (many of which may end up surfacing as
the same patterns in source code)? And, what value-systems and training led
to those patterns being formed over the career of the programmers?



On Fri, Aug 7, 2009 at 12:02 PM, Chris Bogart
bog...@eecs.oregonstate.edu wrote:



 In TA'ing a beginning Java class last year, the most typical use of static
 I saw in student assignments was actually a misuse.  Students tended to blur
 the distinction between a class and an object, and they'd try to call a
 method, when they had not instantiated an object yet, and of course get an
 error message.  They'd try making the method static, which in turn caused
 other errors, requiring them to make other methods and member variables
 static as well.  They'd basically end up with a totally static class that
 they used as a singleton.  In some cases that ended up being OK; but you can
 see how it can lead to hard-to-diagnose bugs if you don't really understand
 the issue.

 So, throwing the word "static" in to make the compiler happy is not a
 normative usage of static, but I'd have to call it typical, at least in one
 population of fledgling programmers.

 Chris Bogart
 bog...@eecs.oregonstate.edu