[I couldn't figure out how to quote properly underneath Trevor's message, 
sorry!]

The Kirschner, Sweller, and Clark article [3 in Trevor's comment] offers a good 
view of the implications of Cognitive Load Theory and the associated memory 
research for teaching, and it has been widely read, but it can be a bit of a 
tricky entry point into the literature. Their notion of "direct instruction" 
includes guidance, student engagement, and interaction, and is pitted against a 
collection of less successful (by their measures) "minimal guidance" methods. 
That "minimal guidance" collection is quite a large batch of methods/approaches, 
including Problem-Based and Inquiry-Based methods, for example, which generally 
also feature significant guidance, student engagement, and interaction as part 
of their design, so it becomes difficult to distinguish what is really intended 
by "direct instruction" (as noted in a response article [4], among others).

For the SWC lesson provided as an example here, there was certainly substantial 
guidance: presentation of ideas and examples went on for quite a while before 
participants were given their tasks. The tasks were a sequence of exercises 
(designed/provided by the instructor; this is yet more guidance), presumably of 
increasing difficulty, where they were asked to practice the kinds of things 
that people encounter in scientific computing (dealing with unknown data files, 
grappling with unexpected bugs, etc.) and had relatively on-demand support from 
the helpers and other participants. This is aligned with the kind of 
instruction advocated by Kirschner et al., and probably gave everyone a good 
opportunity to progress from where they started that day.

If the same exercises were just solved live by the instructor, for example, 
maybe all of the exercises would have been solved ("covered"), but I am willing 
to bet that the people who got stuck on the early exercises when working on 
their own would have been lost for most of such a presentation, and gotten much 
less out 
of the experience in terms of building skills (not to mention effects on 
motivation). Or to take another extreme, you could distribute data that you 
know has some interesting patterns and just say "go!" on the first day and let 
people figure it out; participants would undoubtedly learn some meaningful 
things about data exploration and analysis but there would be a lot of 
meandering and they wouldn't be benefitting nearly as much from the 
instructional team's expertise (modeling of problem solving and use of 
resources, providing sensible/well-designed practice tasks, providing feedback 
on practice). 

As for Giuseppe's original question (would cutting up the presentation a bit 
more and weaving the exercises throughout be more effective?), I would agree 
with others in this thread that more frequent chances for practice and feedback 
are likely to help, but there may be reasons to give a long stretch for working 
alone. Complex tasks may need long periods of struggle/practice, so a "putting 
it all together" activity may be better served by a longer block of time for 
working on tasks. If there is a widespread difficulty that isn't terribly 
relevant to the main ideas, like the GUI bugs, a shorter cycle of "try this - 
did everyone get this error? - this is why, there's this bug" might be warranted 
before the longer period of working alone. In my experience, the only practical 
way to discover these sorts of difficulties is to run the lesson, collect that 
sort of information, and do it differently the next time. :)

regards,


Warren
------------------
Warren Code
Associate Director, Science Centre for Learning and Teaching
University of British Columbia


[4]: One of the response articles in the follow-up 42:2 issue of Ed 
Psychologist (paywalled, but abstract below gives the idea):

Scaffolding and Achievement in Problem-Based and Inquiry Learning: A Response 
to Kirschner, Sweller, and Clark (2006). Cindy E. Hmelo-Silver, Ravit Golan 
Duncan, and Clark A. Chinn. Educational Psychologist, 42:2, 99-107.  
http://dx.doi.org/10.1080/00461520701263368

Abstract: "Many innovative approaches to education such as problem-based 
learning (PBL) and inquiry learning (IL) situate learning in problem-solving or 
investigations of complex phenomena. Kirschner, Sweller, and Clark (2006) 
grouped these approaches together with unguided discovery learning. However, 
the problem with their line of argument is that IL and PBL approaches are 
highly scaffolded. In this article, we first demonstrate that Kirschner et al. 
have mistakenly conflated PBL and IL with discovery learning. We then present 
evidence demonstrating that PBL and IL are powerful and effective models of 
learning. Far from being contrary to many of the principles of guided learning 
that Kirschner et al. discussed, both PBL and IL employ scaffolding extensively 
thereby reducing the cognitive load and allowing students to learn in complex 
domains. Moreover, these approaches to learning address important goals of 
education that include content knowledge, epistemic practices, and soft skills 
such as collaboration and self-directed learning."

________________________________________
From: Discuss [[email protected]] on behalf of W. 
Trevor King [[email protected]]
Sent: March-05-16 14:15
To: Giuseppe Profiti
Cc: Software Carpentry Discussion
Subject: Re: [Discuss] Leave students by themselves during hands-on

On Thu, Jan 28, 2016 at 01:33:57AM +0100, Giuseppe Profiti wrote:
> The lesson was something like that: less than 1 hour of explanation,
> then exercises were presented and the students had 1 hour or a bit
> more to complete them. Using colored stickers, students could
> attract our attention when they needed help.
>
> I was a bit skeptical about this approach: the students had no clue
> about the content and the format of the input file provided, some of
> the steps required knowledge about a couple of bugs in the software
> UI, and there was no explanation on the expected output (or on the
> meaning of the output, when there was more than one result). Also,
> given the length of the session there was no way to adjust their
> pace: few finished almost all the exercises, some were stuck at the
> first one and so on.
>
> However, another trainer pointed out that in this way the students
> were forced to think about the problems they were facing and to ask
> for help.

I was catching up on the Computing Education Blog [1] spearheaded by
Mark Guzdial, and came across this comment [2] linking [3] which seems
like a nice review article pointing out lots of research in favor of a
more hands-on approach to teaching.  I haven't had time to go through
and digest it yet, but thought I'd post a reference here in case it
provides a foothold in the existing body of research on this sort of
issue.

Cheers,
Trevor

[1]: https://computinged.wordpress.com/
[2]: 
https://computinged.wordpress.com/2016/03/04/friction-between-programming-professionals-and-beginners/#comment-52575
[3]: https://doi.org/10.1207/s15326985ep4102_1  (open access)
     Paul A. Kirschner, John Sweller, and Richard E. Clark (2006) Why
     Minimal Guidance During Instruction Does Not Work: An Analysis of
     the Failure of Constructivist, Discovery, Problem-Based,
     Experiential, and Inquiry-Based Teaching, Educational
     Psychologist, 41:2, 75-86.


_______________________________________________
Discuss mailing list
[email protected]
http://lists.software-carpentry.org/mailman/listinfo/discuss_lists.software-carpentry.org
