I use an Advantech ADAM 5000/TCP Ethernet interface box (roughly comparable
to a Compact FieldPoint) with a 16-channel analog input block. A device
generates a signal, for example 1.35 Volts, or any real voltage between
1.xxx and 5.xxxx Volts. I use the Advantech OPC server and read the value
in LabVIEW 7.0 over a DataSocket connection. I want to display this value
on a waveform graph and a numeric indicator. When I start the application
I see the correct value for about one second; after that, the value is
rounded or truncated to 1 Volt (for example).

Example: if the real generated value is 1.350 Volts, then for about one
second after the program starts it is displayed correctly as 1.350 on both
the graph and the numeric indicator. After that, the graph shows a flat
line at 1 Volt. Inside the OPC server, however, the value stays correct at
1.350 Volts the whole time.
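For illustration only: the "1.350 becomes 1" symptom is what you would see if the double were coerced to an integer type somewhere after the first read (this is an assumption about the cause, not something confirmed). A minimal Python sketch of that effect:

```python
# A double keeps its decimals; coercing it to an integer type
# truncates them, which matches the "1.350 -> 1" symptom.
reading = 1.350

as_double = float(reading)   # what a DBL indicator should show: 1.35
as_integer = int(reading)    # what an integer coercion would show: 1

print(as_double, as_integer)
```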

Inside a While Loop I have a DataSocket "Read DS", a "Variant to Data"
conversion, and a waveform graph plus a numeric indicator. Everything is
set to DBL (double precision). No timeout is used, and the type constant
wired to "Variant to Data" is a constant showing 0,000000.
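In text form, the loop described above amounts to something like the following Python sketch. Here `read_datasocket` and the `dstp://` URL are hypothetical stand-ins for the DataSocket "Read DS" call, not a real API; the point is that the conversion step must produce a double:

```python
def read_datasocket(url):
    """Hypothetical stand-in for LabVIEW's 'Read DS': returns the raw
    variant payload published by the OPC server (simulated here)."""
    return 1.350

samples = []
for _ in range(5):                              # the While Loop
    raw = read_datasocket("dstp://host/channel0")
    value = float(raw)                          # 'Variant to Data' with a DBL type
    samples.append(value)                       # feeds the graph / numeric display

print(samples)
```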

What could be the problem? Thanks.
