Hello,
I would like to use Marpa to parse a text file containing the following data:
EUR=089980
GBP=063886
AUD=135358
...
When I run my program (see below), it displays only the first exchange
rate, namely "089980", even though my DSL file says that Catx (i.e., the
exchange rate file as a whole) should contain one or more Expressions ("Expression+").
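For clarity, given the three sample lines above, I would expect output along
the lines of:
089980
063886
135358
i.e., one rate per Expression, not just the first.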
Furthermore, why can't you specify tokens of a specific length in Marpa
DSLs, e.g.
Label ~ \w{3}
In the Catx data file, currency labels are always 3 characters long.
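(I assume one workaround is to spell the character class out three times, e.g.
Label ~ [\w] [\w] [\w]
but that seems clumsy for longer fixed-length tokens.)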
Many thanks.
Best regards,
Philippe
Perl script:
-------------------
#!/usr/bin/perl
use strict;
use warnings;
use Marpa::R2;
use Data::Dumper;

my $data_file = '/Users/philippe/Desktop/MARPA/data.txt';
my $dsl_file  = '/Users/philippe/Desktop/MARPA/catx.dsl';

my $input = slurp_file($data_file);
my $dsl   = slurp_file($dsl_file);

my $grammar = Marpa::R2::Scanless::G->new( { source => \$dsl } );
my $recce   = Marpa::R2::Scanless::R->new(
    { grammar => $grammar, semantics_package => 'My_Actions' } );

my $length_read = $recce->read( \$input );
die "Read ended after $length_read of ", length $input, " characters"
    if $length_read != length $input;

if ( my $ambiguous_status = $recce->ambiguous() ) {
    chomp $ambiguous_status;
    die "Parse is ambiguous\n", $ambiguous_status;
}

my $value_ref = $recce->value;
print "$$value_ref\n";

# Read an entire file into a single string.
sub slurp_file {
    my $file = shift;
    local $/ = undef;
    open my $fh, '<', $file or die "$!";
    my $data = <$fh>;
    close $fh;
    return $data;
}

# Semantic action for Expression: ignore the label and '=', return the rate.
sub My_Actions::do_extract_rate {
    my ( undef, $t1, undef, $t2 ) = @_;
    return $t2;
}
--------------------
DSL file:
-------------------
:default ::= action => [name,values]
lexeme default = latm => 1
Catx ::= Expression+ action => ::first
Expression ::= Label '=' Rate action => do_extract_rate
Label ~ [\w]+
Rate ~ [\d]+
:discard ~ whitespace
:discard ~ cr
whitespace ~ [\s]+
cr ~ [\n]+
--------------------
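I did wonder whether the "action => ::first" on Catx is what keeps only the
first value. A variant I have in mind (untested, and I am not sure the
[values] array descriptor is the right choice here) would be:

Catx ::= Expression+ action => [values]

with the print in the Perl script replaced by a loop over (or a Dumper of)
the returned array reference instead of printing a single scalar.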