On Thu, Apr 12, 2018 at 2:56 PM, Gil Barmwater <gbarmwa...@alum.rpi.edu>
wrote:

> Found and fixed the problem, and I have run through all the tests I
> developed for my proof-of-concept package.  In the process, I found several
> things I wasn't expecting.
>
> 1) The parser seems to have reversed the expected positions of the target
> type - STEM, STREAM, or USING - and the "mode" - APPEND or REPLACE.  I
> believe the syntax per the ANSI standard should be e.g. WITH OUTPUT STEM
> APPEND someStem. and not WITH OUTPUT APPEND STEM someStem.
>
> 2) There seems to be a requirement for the stream/stem name to be either a
> literal - "abc.txt" - or, if it is a variable, enclosed in parentheses -
> (someFile).  I expected the () for USING but not for STEM or STREAM.
>
> 3) My package allowed stem "names" to be specified with or without the
> trailing period, but my testing shows the trailing period is required;
> otherwise an error message is generated.  I'm OK with that as long as it
> is documented.
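>
> To illustrate points 1-3 together - this is only a sketch, using a
> hypothetical command, with the first line showing the ordering as I
> believe the standard specifies it:
>
>     address cmd "dir" with output stem out. append    -- mode after the target, as I expected
>     address cmd "dir" with output append stem out.    -- mode first, which is what the parser accepts
>     address cmd "dir" with output stream (someFile)   -- variable stream name needs parentheses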
>
> 4) I get an error (unable to open ... for output) when I specify
>
>     output stream "abc2.txt" error replace stream "abc2.txt"
>
> which was testing the redirection of both output and error to the same
> stream.
>
>
Problem now fixed.

Rick



> 5) Using a non-existent file for redirected input generates an error
> (unable to open ... for reading) while my package treated this as a null
> file.  Probably not a problem as the error message would flag a file name
> typo :-)
>
> As for checking in the code, I am not a committer, so I plan to generate a
> patch file and attach it to RFE 4.
>
> One other item I discovered is that I can get a deadlock on Windows as
> well.  I had not tested sending a lot of data to redirected output, but
> when I did, my package locked up, presumably because the output pipe
> filled.  I plan to experiment with ways around this but will generate the
> patch file with the code as it stands now.
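>
> A minimal sketch of the kind of test that hangs (hypothetical file name;
> the full-pipe explanation is my presumption, not something I have
> confirmed in the debugger):
>
>     /* the command emits far more data than the pipe buffer holds; if
>        nothing drains the pipe while the child runs, the child blocks
>        on its write and my package waits on it forever */
>     address cmd "type large.txt" with output stem out.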
>
> Gil
>
> On 4/11/2018 2:32 PM, Rick McGuire wrote:
>
>
> On Wed, Apr 11, 2018 at 2:26 PM Gil Barmwater <gbarmwa...@alum.rpi.edu>
> wrote:
>
>> Just a quick update on my progress so far.  I've gotten all the code in
>> (hopefully) the right places and I have a clean build.  My initial tests
>> are working but I've just hit my first problem which I need to debug.
>> Hopefully, it will not be too much longer before I'm done.
>
> Checking the code in as-is is also an option... it gets more eyeballs on
> the problem.
>
> Rick
>
>>
>>
>> --
>> Gil Barmwater
>>
>>
>> ------------------------------------------------------------------------------
>> Check out the vibrant tech community on one of the world's most
>> engaging tech sites, Slashdot.org! http://sdm.link/slashdot
>> _______________________________________________
>> Oorexx-devel mailing list
>> Oorexx-devel@lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/oorexx-devel
>>
>
>
>
> --
> Gil Barmwater
>
>
>
>
