I agree with Marten that these problems have probably been solved.
Basically, the type system now infers types based on how the data
is used downstream, resolving to the least constrained type that
is consistent with the usage.

Thus, if downstream you extract a field from a record, then the
upstream type will be resolved to be a record with at least that
field...

Edward


On 7/23/12 5:41 PM, Marten Lohstroh wrote:
I still had this conversation marked for answering, but I think all the
problems mentioned below are solved by the new JSONToToken and
TokenToJSON actors, along with the changes to the type system that
allow connecting these actors without declaring a type on the output
port. Instead, the type of the output port is now inferred from the
declared or resolved type of the connected receiver's input ports.

Runtime errors can still occur if the JSON data is insufficient to
meet the requirements of the type constraints. However, when this
happens, the error is now produced on the sender's side, whereas before
it would occur on the receiver's side. This allows for more
sophisticated error handling policies than simply crashing. The error
handling, however, is still in development.

Marten

On Thu, Apr 12, 2012 at 8:24 AM, Hogan, D. (GE Energy)<[email protected]>  wrote:
Beth,



For your first example, I have a similar use case with some Fortran programs
that use namelists for I/O.



For other web data streams, the type signature changes.  Here we currently
set the output port type to an empty record so at least the type solver
knows it’s a record, but then we can’t connect this output to a
RecordDisassembler – it would throw an exception when the model is executed.



Are you saying there’s no way to connect to a RecordDisassembler, or there’s
no way to connect to it without manually forcing the output port types
(which can lead to runtime errors)?  I have found the latter to be true, but
maybe we are talking about different things.

Also like you mentioned for objects, JSON is fairly close to a Ptolemy
record, but like Java types, the JSON types are not exactly the same as
Ptolemy types.  (For example, it looks possible that JSON arrays could
contain mixed types in the same array).  We don’t have a good solution for
that yet…  right now we are just extracting Ptolemy types and hoping for the
best.

I was thinking along the lines of a general tool to handle standard Java
data type conversions.  Take that tool and extend it to speak Ptolemy and
perhaps JSON types too.  For example, maybe we could teach
http://transmorph.sourceforge.net how to deal with Ptolemy tokens/types.  If
you want JSON support, you can initialize it with a different set of
converters than the default.  There would be one place for all of the standard
Java data type conversions, including generics (via TypeReferences to work
around type erasure), and conversion to/from Ptolemy types.  For example, it
would be able to handle converting back and forth between a
Map<String,Map<String,Double>> and a RecordToken, or a List<List<Integer>>
and an ArrayToken or IntMatrixToken.  It could be useful for simple cases too,
like int[] to/from ArrayToken.
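
To make this concrete, here is a rough sketch of the kind of conversion layer
I have in mind (the class and method names are made up; only the Ptolemy token
constructors are existing API):

import java.util.Map;

import ptolemy.data.ArrayToken;
import ptolemy.data.DoubleToken;
import ptolemy.data.IntToken;
import ptolemy.data.RecordToken;
import ptolemy.data.Token;
import ptolemy.kernel.util.IllegalActionException;

// Hypothetical utility class, not an existing Ptolemy/Kepler tool.
public class JavaToPtolemy {

    // Convert an int[] to an ArrayToken of IntTokens.
    public static ArrayToken toArrayToken(int[] values)
            throws IllegalActionException {
        Token[] tokens = new Token[values.length];
        for (int i = 0; i < values.length; i++) {
            tokens[i] = new IntToken(values[i]);
        }
        return new ArrayToken(tokens);
    }

    // Convert a Map<String, Double> to a RecordToken keyed by the map keys.
    public static RecordToken toRecordToken(Map<String, Double> map)
            throws IllegalActionException {
        String[] labels = new String[map.size()];
        Token[] values = new Token[map.size()];
        int i = 0;
        for (Map.Entry<String, Double> entry : map.entrySet()) {
            labels[i] = entry.getKey();
            values[i] = new DoubleToken(entry.getValue());
            i++;
        }
        return new RecordToken(labels, values);
    }
}

A real tool would generalize this through a converter registry (which is what
Transmorph provides), handle the nested and generic cases, and convert in the
other direction as well.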



That type of code is ugly, but it would help with quickly creating actors
that interface with existing code.



From: Christopher Brooks [mailto:[email protected]]
Sent: Wednesday, April 04, 2012 12:05 PM
To: Hogan, D. (GE Energy)
Cc: [email protected]; [email protected]
Subject: Re: [Ptango] Re: [kepler-dev] ObjectToRecord related questions




Beth Latronico asked me to forward on the following about this issue.

Beth writes:

We ran into a similar problem trying to build an actor which reads
information in JSON from a web page and outputs a RecordToken.  Here’s a
summary of the discussion and a sample model in case this is helpful for
your situation!

Actor:  ptolemy.actor.lib.conversions.JSONToRecord

Test model:  ptolemy.actor.lib.conversions.test.auto.JSONToRecord1.xml



Ideally we wanted to be able to set the type of the output to a RecordType
including all of the fields and types of those fields (vs. an empty Record),
so that we could use the RecordDisassembler.  However, since the type solver
runs only once, during preinitialize(), we need to get the type information
ahead of time, and the type signature cannot change during execution.  (It’s
OK if the data changes.)



For some web data streams, this seems OK – the type signature is always the
same – so we pull some sample data in preinitialize() to determine the type,
and then check the type signature in fire() to make sure it has not changed.
This assumes that we know the URL of the data source, that the URL remains
fixed, and that sample data is available before the model executes.
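
In rough terms, the actor looks like this (a sketch only; _fetchSample() and
_readData() are placeholders for our actual web-fetch code):

// Inside a TypedAtomicActor subclass with a TypedIOPort named output.
public void preinitialize() throws IllegalActionException {
    super.preinitialize();
    // Pull sample data before type resolution and fix the type signature.
    Token sample = _fetchSample();
    output.setTypeEquals(sample.getType());
}

public void fire() throws IllegalActionException {
    super.fire();
    Token token = _readData();
    // The type signature must not change during execution.
    if (!token.getType().equals(output.getType())) {
        throw new IllegalActionException(this,
                "Type signature changed: expected " + output.getType()
                + " but got " + token.getType());
    }
    output.send(0, token);
}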



For other web data streams, the type signature changes.  Here we currently
set the output port type to an empty record so at least the type solver
knows it’s a record, but then we can’t connect this output to a
RecordDisassembler – it would throw an exception when the model is executed.

// Set the output type to the empty record type, {}; the fields are unknown.
Token value = new RecordToken();
output.setTypeEquals(value.getType());



So, we’re debating 1) whether these two cases should be split into different
actors, and 2) whether it’s appropriate to use RecordDisassembler or whether
we should create some new actors that can handle loosely-typed records.



Also like you mentioned for objects, JSON is fairly close to a Ptolemy
record, but like Java types, the JSON types are not exactly the same as
Ptolemy types.  (For example, it looks possible that JSON arrays could
contain mixed types in the same array).  We don’t have a good solution for
that yet…  right now we are just extracting Ptolemy types and hoping for the
best.



Unfortunately there are more questions than answers at the moment, but we hope
the discussion highlights some of the issues!



Best,

Beth

Marten Lohstroh suggested that we could have a flag that, when set, requires
strict typing.  Certain actors could record the type information in a manner
similar to how the Test actor works.  The Test actor has a shared parameter
called "trainingMode"; when trainingMode is true, the known good results of
the Test actor are recorded.  In a similar manner, certain actors could record
their type information when trainingMode is set to true.
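
In rough terms, such an actor's fire() might look like the following (purely a
sketch; trainingMode and recordedType are hypothetical parameters, and
_readFromSource() stands in for the actual data source):

public void fire() throws IllegalActionException {
    super.fire();
    Token token = _readFromSource();
    boolean training = ((BooleanToken) trainingMode.getToken()).booleanValue();
    if (training) {
        // Record the observed type, analogous to the Test actor recording
        // its known good results.
        recordedType.setExpression(token.getType().toString());
    } else if (!token.getType().toString()
            .equals(recordedType.getExpression())) {
        throw new IllegalActionException(this,
                "Type does not match the recorded type: expected "
                + recordedType.getExpression() + " but got " + token.getType());
    }
    output.send(0, token);
}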


_Christopher

On 4/2/12 6:05 PM, Edward A. Lee wrote:


Ptolemy II is statically typed, so after a model is preinitialized,
the types of all ports must be known. The type system is described in
chapter 5 of this document:

http://www.eecs.berkeley.edu/Pubs/TechRpts/2008/EECS-2008-29.html

and also in this paper:

http://chess.eecs.berkeley.edu/pubs/665.html

For Record types, the most general record type is the empty record type.
That is, a record type {fieldName = double}, for example, is also
an instance of {}, the empty record type. Moreover, a record of
type {a = int, b = double} is also an instance of {a = int}.

So, for example, if you know that your output record will contain
a field named "a" of type "int", then you can force the output type
to be {a = int} (right click on the actor, select Configure Ports,
and enter "{a = int}" in the type field). As long as there is actually
such a field, in the record, this will work. But if you run the model
and there is no such field, I believe you will get a run-time type
error (which is probably what you want).
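
Programmatically, forcing the type in an actor amounts to something like this
(a sketch; "output" is whatever TypedIOPort you are constraining):

// Force the output to be a record with at least a field "a" of type int.
// A record with more fields, such as {a = int, b = double}, still works,
// since record types with more fields are more specific in the lattice.
output.setTypeEquals(new RecordType(
        new String[] { "a" }, new Type[] { BaseType.INT }));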

Edward


On 4/2/12 1:51 PM, Hogan, D. (GE Energy) wrote:

I have an actor that uses the structure of a file generated at runtime
to create an output record.  The actor is working, but I have a few
questions on ObjectToRecord since it is similar.

ObjectToRecord does not set the output type, and this is listed as a FIXME.
The documentation for RecordType mentions using setTypeAtMost(new
RecordType(new String[0], new Type[0])) when you want to specify a
record without specifying the fields.  Should I use that?  Is there a
better way to specify it is a record and the field names/types are
discovered at runtime?  Since unknown matches that type constraint, what
is the best way to handle this?
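
For reference, applying that suggestion in the actor's constructor would look
roughly like this (with "output" being the port whose fields are discovered at
runtime):

// Constrain the output to be some record type without naming the fields.
output.setTypeAtMost(new RecordType(new String[0], new Type[0]));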

ObjectToRecord doesn't try to convert between standard Java types and
Ptolemy types.  I noticed there is a
ptolemy.data.expr.ConversionUtilities, but it looks limited to the needs of
the expression language.  Is there another conversion utility in Kepler?

_______________________________________________
Kepler-dev mailing list
[email protected]
http://lists.nceas.ucsb.edu/kepler/mailman/listinfo/kepler-dev



--
Christopher Brooks, PMP                       University of California
CHESS Executive Director                      US Mail: 337 Cory Hall
Programmer/Analyst CHESS/Ptolemy/Trust        Berkeley, CA 94720-1774
ph: 510.643.9841                                (Office: 545Q Cory)
home: (F-Tu) 707.665.0131 cell: 707.332.0670

_______________________________________________
Ptango mailing list
[email protected]
http://chess.eecs.berkeley.edu/ptango/listinfo/ptango


