Re: [fonc] Debugging PEGs and Packrats

2011-12-14 Thread Lukas Renggli
> I've experimented in what little time I can devote with OMeta, PetitParser,
> and Treetop. The debugging experience has been roughly consistent across all
> three.

Casey, did you try the PetitParser IDE? If so, what did you miss?

If not, please check it out
(http://jenkins.lukas-renggli.ch/job/PetitParser/lastSuccessfulBuild/artifact/PetitParser-OneClick.zip).
It comes with dedicated tools for grammars (editor, visualizations,
profiler, debugger, ...). An earlier version of the tool is described
in Section 3.5 of this paper
(http://scg.unibe.ch/archive/papers/Reng10cDynamicGrammars.pdf). We
are currently working on an improved IDE with grammar refactorings.

Lukas


> One particular issue which has bugged me: memoization seems to carry a lot of
> instance-state that's really hard to comprehend when the grammar isn't
> working as I expect. It's just really hard to use that ocean of information
> to figure out what I've done wrong.
>
> Given that with these new parsing technologies, we're pretty lucky to see
> a parse error as an error message, I can't help but think that it's worth
> studying debugging strategies. Heh. :D I'm really not complaining, I'm just
> pointing it out.
>
> Has anyone here found any technique(s) which makes debugging a grammar
> written for a PEG/packrat less of a pain in the butt?
>
> I'd be really interested in hearing about it.
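
To make that ocean of information concrete: a packrat parser memoizes the
outcome of every (rule, position) pair, and that table is also the most
direct record of how far each rule got. A minimal sketch of the mechanism
(deliberately plain Python rather than any of the three libraries' APIs):

def memoize(rule):
    def wrapped(self, pos):
        key = (rule.__name__, pos)
        if key not in self.memo:
            self.memo[key] = rule(self, pos)  # cache result (or None for failure)
        return self.memo[key]
    return wrapped

class Digits:
    def __init__(self, text):
        self.text = text
        self.memo = {}                        # (rule name, position) -> result

    @memoize
    def digit(self, pos):
        if pos < len(self.text) and self.text[pos].isdigit():
            return (self.text[pos], pos + 1)  # (matched text, next position)
        return None

    @memoize
    def number(self, pos):                    # number <- digit number / digit
        d = self.digit(pos)
        if d is None:
            return None
        rest = self.number(d[1])
        return (d[0] + rest[0], rest[1]) if rest else d

    def dump(self):                           # inspect the instance-state post-mortem
        for (name, pos), res in sorted(self.memo.items(), key=lambda e: e[0][1]):
            print('%-6s @%d -> %s' % (name, pos, 'fail' if res is None else res))

p = Digits('12x3')
print(p.number(0))                            # ('12', 2): stops at the 'x'
p.dump()

Dumping the table after a failed parse shows, rule by rule, the last
position where each one succeeded or failed -- crude, but a useful first
step before reaching for a real grammar debugger.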






-- 
Lukas Renggli
www.lukas-renggli.ch



Re: [fonc] Debugging PEGs and Packrats

2011-12-14 Thread Yoshiki Ohshima
At Wed, 14 Dec 2011 09:35:06 +0100,
Lukas Renggli wrote:
 
> > I've experimented in what little time I can devote with OMeta, PetitParser,
> > and Treetop. The debugging experience has been roughly consistent across
> > all three.
>
> Casey, did you try the PetitParser IDE? If so, what did you miss?
>
> If not, please check it out
> (http://jenkins.lukas-renggli.ch/job/PetitParser/lastSuccessfulBuild/artifact/PetitParser-OneClick.zip).
> It comes with dedicated tools for grammars (editor, visualizations,
> profiler, debugger, ...). An earlier version of the tool is described
> in Section 3.5 of this paper
> (http://scg.unibe.ch/archive/papers/Reng10cDynamicGrammars.pdf). We
> are currently working on an improved IDE with grammar refactorings.

  Wow.  Pretty cool.

I'm playing with it a bit, and trying to figure out how I'd detect
an error in my grammar.  I introduced a bug in PPJsonGrammar by
changing string to read (omitting the last $" asParser):

string
    ^ $" asParser , char star

and chose "Dynamic", put in {"a": 1, "b": 2}, and said "parse".  I see
the tree and the parse goes as far as position 3.  But of course, it
is not easy to tell that #string has the problem from this.  (I'd
think that this is the nature of grammar writing, where the parser
basically does not know what makes sense, especially if there are
other choices.)
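
One heuristic that would at least narrow it down: track the farthest
position at which any rule failed, and which rules failed there.  A sketch
of the idea (plain Python mocking up the buggy rule above -- not
PetitParser's actual tooling):

class Trace:
    def __init__(self):
        self.farthest, self.rules = -1, set()

    def fail(self, rule, pos):          # remember the deepest failure point
        if pos > self.farthest:
            self.farthest, self.rules = pos, {rule}
        elif pos == self.farthest:
            self.rules.add(rule)
        return None                     # failures still propagate as None

trace = Trace()

def string(text, pos):                  # mock of the buggy rule: $" , char star
    if pos >= len(text) or text[pos] != '"':
        return trace.fail('string', pos)
    while pos + 1 < len(text) and text[pos + 1] != '"':
        pos += 1
    return pos + 1                      # stops before the closing quote: the bug

def member(text, pos):                  # member <- string , $: (abridged)
    end = string(text, pos)
    if end is not None and end < len(text) and text[end] == ':':
        return end + 1
    return trace.fail('member', pos if end is None else end)

member('{"a": 1, "b": 2}', 1)           # start just past the '{'
print(trace.farthest, trace.rules)      # 3 {'member'}

It does not name #string outright, but "stuck at position 3, inside member,
right after string returned" points very quickly at the quote that was
never consumed.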

  OMeta2/Squeak has a feature to pop up a Squeak debugger at the
position where the parse went as far as it could, and then you can
step through the execution.  But of course, it goes into the underlying
implementation and generated code, so it is quite tedious.  What we
would like to see in that debugger is the original grammar code, with
step execution that makes sense at that level.

Of course, the Squeak debugger lets you restart a context, but this
does not help much, as the parser is stateful.  We contemplated whether
Worlds would help to really rewind a rule and try again...
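
Mechanically, what we want is for a rule to carry a snapshot of all parser
state, so a debugger can roll it back and re-execute the rule at will.  A
sketch (illustrative Python with a made-up Parser class -- not Worlds or
OMeta):

import copy

class Parser:
    def __init__(self, text):
        self.text = text
        self.pos = 0
        self.memo = {}                  # packrat table: part of the state too

    def checkpoint(self):               # capture *all* mutable parser state
        return (self.pos, copy.deepcopy(self.memo))

    def rewind(self, snap):
        self.pos, self.memo = snap[0], copy.deepcopy(snap[1])

def retry(parser, rule):
    """Run a rule; on failure, restore the world as it was before."""
    snap = parser.checkpoint()
    result = rule(parser)
    if result is None:
        parser.rewind(snap)             # as if the rule had never executed
    return result

def digit(p):                           # toy rule that mutates state as it goes
    if p.pos < len(p.text) and p.text[p.pos].isdigit():
        p.pos += 1
        return p.text[p.pos - 1]
    return None

p = Parser('x1')
print(retry(p, digit), p.pos)           # None 0 -- state rewound after failure

Worlds would generalize exactly this: side effects made while a rule runs
happen in a child world that is only committed into the parent when the
rule succeeds.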

-- Yoshiki



Re: [fonc] History of computing talks at SJSU

2011-12-14 Thread karl ramberg
One of Alan's points in his talk is that students should be using bleeding-edge
hardware, not just regular laptops. I think he is right to some extent, but he
also recollected the JOSS environment, which was built on a machine about to be
scrapped. Some research and development does not need bleeding-edge hardware;
it can get a long way by using what you have to its fullest.

Karl

On Tue, Dec 13, 2011 at 9:02 PM, Kim Rose kim.r...@vpri.org wrote:

> For those of you looking to hear more from Alan Kay -- you'll find a talk
> from him and several other big names in computer science here -- thanks
> to San Jose State University.
>
> http://www.sjsu.edu/atn/services/webcasting/archives/fall_2011/hist/computing.html
>
> -- Kim





Re: [fonc] Debugging PEGs and Packrats

2011-12-14 Thread Matthew Blount
http://languagejs.com/ is a JavaScript PEG library written for the
Cappuccino project that claims to have a good approach to error handling.
From the page:

> The most unique addition Language.js makes to PEG is how it handles errors.
> No parse ever fails in Language.js, instead SyntaxErrorNodes are placed
> into the resultant tree. This makes it trivial to do things like write
> syntax highlighters that have live error reporting. This also means that
> Language.js is very competent at handling multiple errors (as opposed to
> aborting on the first one that is reached).
>
> A new operator designed specifically to handle errors trivially and
> declaratively is added on top of the normal PEG operators. The naughty OR
> operator (%) behaves just like the choice operator (/), but only gets used
> if the parse first completely fails. Because of this, performance is
> guaranteed to never be affected, regardless of how many error rules you add
> to the grammar. Thus, you are allowed to offer alternative incorrect but
> valid grammars to provide increasingly useful errors to your users.
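
The scheme is easy to mock up. A sketch of the idea (plain Python, not
language.js itself): error alternatives sit behind a flag that is only
switched on after the normal parse has completely failed, so the happy
path never pays for them:

class SyntaxErrorNode:
    def __init__(self, pos, expected):
        self.pos, self.expected = pos, expected
    def __repr__(self):
        return 'SyntaxError@%d(expected %s)' % (self.pos, self.expected)

def value(text, pos, recovering):
    if pos < len(text) and text[pos].isdigit():
        return (int(text[pos]), pos + 1)
    if recovering:                      # the "naughty OR": consulted in pass two only
        return (SyntaxErrorNode(pos, 'digit'), pos + 1)
    return None

def parse_list(text, recovering=False): # list <- value (',' value)*
    out, pos = [], 0
    r = value(text, pos, recovering)
    if r is None:
        return None
    out.append(r[0]); pos = r[1]
    while pos < len(text) and text[pos] == ',':
        r = value(text, pos + 1, recovering)
        if r is None:
            return None
        out.append(r[0]); pos = r[1]
    return out

def parse(text):                        # pass one clean, pass two with recovery
    return parse_list(text) or parse_list(text, recovering=True)

print(parse('1,2,3'))                   # [1, 2, 3] -- error rules never touched
print(parse('1,x,3'))                   # [1, SyntaxError@2(expected digit), 3]

Because the first pass never consults the error alternatives, adding more
of them cannot slow a successful parse down at all -- which is the
guarantee quoted above.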

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] History of computing talks at SJSU

2011-12-14 Thread Jecel Assumpcao Jr.
Karl Ramberg wrote:

> One of Alan's points in his talk is that students should be using
> bleeding-edge hardware, not just regular laptops. I think he is right to
> some extent, but he also recollected the JOSS environment, which was built
> on a machine about to be scrapped. Some research and development does not
> need bleeding-edge hardware; it can get a long way by using what you have
> to its fullest.

You mixed research and development, and they are rather different. One
is building stuff for the computers of 2020, the other for those of
2012.

I was at a talk where Intel was showing their new multicore direction,
and the speaker kept repeating how academic people really should be
changing their courses to teach their students to deal with, for
example, four cores. At the very end he showed an experimental 80-core
chip, and as he ended the talk and took questions, he left that slide up.
When it was my turn to ask, I pointed to the 80-core chip on the screen
and asked if programming it was exactly the same as on a quad core. He
said it was different, so I asked if it wouldn't be a better investment to
teach the students to program the 80-core one instead. He said he didn't
have an answer to that.

About JOSS, we normally like to plot computer improvement on a log
scale. But if you look at it on a linear scale, you see that many years
go by initially where we don't see any change. So the relative
improvement in five years is more or less the same no matter which five
years you pick, but the absolute improvement is very different. When I
needed a serious computer for software development back in 1985, I
built an Apple II clone for myself, even though that machine was already
8 years old at the time (about five Moore cycles). The state of the art
in personal computers at the time was the IBM PC AT (6 MHz iAPX 286), which
was indeed a few times faster than the Apple II, but not enough to make
a qualitative difference for me. If I compare a 1992 PC with one from
2000, the difference is far more important to me.
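
In numbers, assuming for illustration one doubling every 18 months (only a
rough average): every five-year window gives the same roughly 10x relative
gain, while the absolute gain explodes:

prev = None
for year in (0, 5, 10, 15, 20):
    s = 2 ** (year / 1.5)               # speed after `year` years, one doubling per 1.5y
    note = '' if prev is None else '  (absolute gain: %.0f)' % (s - prev)
    print('year %2d: %7.0fx%s' % (year, s, note))
    prev = s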

> On Tue, Dec 13, 2011 at 9:02 PM, Kim Rose wrote:
>
> > For those of you looking to hear more from Alan Kay -- you'll find a talk
> > from him and several other big names in computer science here -- thanks
> > to San Jose State University.
> >
> > http://www.sjsu.edu/atn/services/webcasting/archives/fall_2011/hist/computing.html

Thanks, Kim, for the link!

I have added this and four other talks from 2011 to

http://www.smalltalk.org.br/movies/

I also added a link to the ESUG channel on YouTube, which has lots of
stuff from their recent conferences.

Cheers,
-- Jecel




Re: [fonc] History of computing talks at SJSU

2011-12-14 Thread Casey Ransberger
Inline and greatly abridged. 

On Dec 14, 2011, at 5:09 PM, Jecel Assumpcao Jr. je...@merlintec.com wrote:

> About JOSS, we normally like to plot computer improvement on a log
> scale. But if you look at it on a linear scale, you see that many years
> go by initially where we don't see any change. So the relative
> improvement in five years is more or less the same no matter which five
> years you pick, but the absolute improvement is very different. When I
> needed a serious computer for software development back in 1985, I
> built an Apple II clone for myself, even though that machine was already
> 8 years old at the time (about five Moore cycles).

That's just so cool. Someday I want to make an Apple IIgs clone, because that
thing rocked and the emulator I have is dog-slow :/ but we've talked about that
before!

> The state of the art
> in personal computers at the time was the IBM PC AT (6 MHz iAPX 286), which
> was indeed a few times faster than the Apple II, but not enough to make
> a qualitative difference for me. If I compare a 1992 PC with one from
> 2000, the difference is far more important to me.

Okay, so this is where stuff gets funny to me. My computer, if you look at the
clock and the cores, is blazing fast. You can see it once in a while: when doing
something graphically intensive (the GPU is also really fast) or something
straightforwardly computationally expensive, like compiling C code with all of
the optimizations on.

But in general... my computer is only a tiny bit faster than the one I had in
the early nineties. In terms of day-to-day stuff, it's only gotten a teensy bit
faster. Sometimes I sit there looking at an hourglass or a beach ball and think
to myself, "this only used to happen when I was waiting on a disk to spin up."
There isn't even a disk in this thing. What the hell?

Hypothesis: Mainstream software slows down at a rate slightly less than 
mainstream hardware speeds up. It's an almost-but-not-quite-inverse Moore's 
Law. 

Unless someone else has called this out directly, I'm calling it Joe's Law, 
because I don't want to deal with the backlash!

> Cheers,
> -- Jecel



Re: [fonc] History of computing talks at SJSU

2011-12-14 Thread David Barbour
On Wed, Dec 14, 2011 at 11:02 PM, Casey Ransberger casey.obrie...@gmail.com wrote:

> But in general... my computer is only a tiny bit faster than the one I had
> in the early nineties. In terms of day-to-day stuff, it's only gotten a
> teensy bit faster. Sometimes I sit there looking at an hourglass or a beach
> ball and think to myself, "this only used to happen when I was waiting on a
> disk to spin up." There isn't even a disk in this thing. What the hell?
>
> Hypothesis: Mainstream software slows down at a rate slightly less than
> mainstream hardware speeds up. It's an almost-but-not-quite-inverse Moore's
> Law.
>
> Unless someone else has called this out directly, I'm calling it Joe's
> Law, because I don't want to deal with the backlash!


It's a variation on Parkinson's Law: work expands to fill the resources available.