Hi Andy,

It sounds like there may be something interesting going on in the data 
values for the records that fail.
You could try adding a "Modified Java Script Value" step before your 
AROutput step to see what the actual length of the value is (in bytes), 
then perform whatever other action(s) are needed.
The following JavaScript worked well for me. (Note: replace "streamField" 
with the name of the field whose value you want to check.)

//Script here

// Count the UTF-8 bytes in a value: each "%XX" escape produced by
// encodeURI represents one byte; plain ASCII characters count as one.
function getByteCount(s)
{
  s = String( s || "" );
  var count = 0;
  for( var i = 0 ; i < s.length ; i++ )
  {
    var partCount = encodeURI( s.charAt(i) ).split("%").length;
    count += (partCount == 1) ? 1 : partCount - 1;
  }
  return count;
}

var strRealLEN;
if (streamField != null) {
  strRealLEN = getByteCount(streamField);
} else {
  strRealLEN = 0;
}


HTH
Leonard Neely



From: Action Request System discussion list(ARSList) 
[mailto:[email protected]] On Behalf Of Andrew Hicox
Sent: Thursday, April 30, 2015 12:56 PM
To: [email protected]
Subject: Re: Atrium integrator AROutput detects bogus field length violation?

**

The job is writing to a custom form.

As far as I can tell, "Strings cut" is doing the job I asked it to do, as 
evidenced by the fact that the value is truncated both in the log and (if you 
set the field length to 0) in the resulting field value.

The problem really seems to be with AROutput. The server does not log the 
field length exception, as if it never actually receives the value. It seems 
like AROutput must have an internal field length validation that is 
generating bogus exceptions.

Anyone know if this is a known bug, perhaps fixed in 8.1.02?

Been looking, but don't see an SW# for it on the bmc site (yet)

Andy
On Apr 30, 2015 1:59 PM, "Jayesh" <[email protected]> wrote:
Hi,

Is the field OOTB or custom?

Actually I'm not at my desk so I couldn't check the exact option, but we do 
have many options in the String Operations transform to modify string 
values. Have you checked there?

You can also use Java in Spoon to modify the character set, or set the 
default character set of the input stream and write it to the output 
stream, if you really think that's the issue.
________________________________
From: Andrew Hicox <[email protected]>
Sent: 30-04-2015 07:05 PM
To: [email protected]
Subject: Re: Atrium integrator AROutput detects bogus field length violation?

Yeah, double-byte chars was my first guess too. The problem is that I don't 
see anything like that in the log. Of course, like I said, it could just be 
some non-printable trash, in which case I'd never see it.

In any other programming environment, I'd put a filter on the input to catch 
trash like that and discard it, but I don't see anything at all in Spoon 
that'd let me do that.

Anyone out there know how to do that sort of thing with spoon?
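One way such a filter could be sketched is in a "Modified Java Script Value" step; a minimal sketch, assuming the goal is to drop anything outside printable ASCII (the names streamField and cleanField are placeholders, not real field names):

```javascript
// Sketch for a "Modified Java Script Value" step: drop control
// characters and anything else outside printable ASCII (0x20-0x7E)
// before the AROutput step sees the value.
// "streamField" is a placeholder for the incoming field name.
function stripNonPrintable(value) {
  if (value == null) return null;
  return String(value).replace(/[^\x20-\x7E]/g, "");
}

// In the step you would then add an output field, e.g.:
// var cleanField = stripNonPrintable(streamField);
```

Note this is a blunt instrument: it throws away legitimate accented characters along with the trash, so it's only appropriate if the field is expected to be plain ASCII.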

I guess my work-around here is to just have AI write to CLOBs all the time 
on bogus landing forms, use workflow to truncate the values, then send them 
off to where they really need to go?

Is that really how everyone else is dealing with this? I have to be missing 
something ... this seems too fishy ...

Thanks everyone,

Andy
On Apr 30, 2015 12:39 AM, "laurent matheo" <[email protected]> wrote:
Happened to me as well importing files (especially with DMT in 7.6.04), but 
most of the time it was due to the database server being in Unicode and 
importing special characters (like accents) which take 2 bytes to be stored 
in the database rather than 1:
« Tête » needs 5 bytes in UTF-8, for example. So if you have a 4-character 
field it wouldn't fit (well, unless the field is a « byte » type in Dev 
Studio, in which case it « just » doubles the char length in the database, 
if memory serves).
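To illustrate that arithmetic, a quick sketch that counts UTF-8 bytes from inside a JavaScript step (it leans on the fact that encodeURIComponent emits one "%XX" escape per encoded byte):

```javascript
// Count the UTF-8 bytes in a string: encodeURIComponent leaves plain
// ASCII as-is and emits one "%XX" escape per byte for everything else.
function utf8ByteLength(s) {
  var encoded = encodeURIComponent(s);
  var escapes = encoded.match(/%[0-9A-Fa-f]{2}/g) || [];
  // each escape is 3 chars ("%XX") but represents a single byte
  return (encoded.length - escapes.length * 3) + escapes.length;
}

// utf8ByteLength("Tête") gives 5, while "Tête".length is 4
```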



On 30 Apr 2015, at 07:10, Jarl Grøneng <[email protected]> wrote:


I have seen something similar, but I have not had time to investigate it.

Importing approx. 200 records from a CSV file, 4 of them fail with ARERR 
306.

--
J

2015-04-30 0:02 GMT+02:00 Andrew Hicox <[email protected]>:

Hi everyone,

I know this isn't strictly ARS, but I thought I'd ask here.

AI has me pulling my hair out, and at this rate, I'll be bald by Friday.

I have a pretty basic job. It queries an ldap server and writes what it finds 
into a landing form.

Because the data on the ldap server is pretty dirty, I'm using the "Strings 
cut" node to truncate all the data so it'll fit in my fields.

And indeed this seems to work. Until it doesn't work.

The log shows that ARERR 306 is encountered because I've tried to set a 
too-long value on a field, and it gives me the field id.

Sure enough, I am truncating the data mapped to that field to the length of 
the field.

I think, "well maybe the indexing isn't really as advertised", so I truncate 
the field to 1 less char than the max length of the field. No good. What the 
hell, I go for 2 less. Still no good.

I insert a "write to log" right before the AROutput step just to verify, 
and yes indeed, there is not one value too long going into the AROutput.

Ok, so maybe some workflow on the server is doing it? Nope! Disabled all the 
workflow, and just to be damn sure, I checked the "skip workflow processing" 
check box on the AROutput node.

Still throws the error.

Ok, just to be sure, turned on filter logging. Not a darn thing firing. The 
arserver does not seem to be throwing the error!

Ok. In desperation, I set 0 length on the field in question. Boom, it works.

So after the job is done, I check what the longest value written to that 
field was.

It is exactly as it should be. Nothing longer than the max length set on 
"Strings cut" ... which is to say, two characters less than the previous 
length of the field.

What the hell?

Only thing I can think of is maybe there's some kind of garbage 
non-printable ASCII on the input that throws Kettle's length detection for 
a loop? If so, I don't really see any kind of charset conversion or 
anything I could use to filter it.
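If the mismatch really is characters vs. bytes, one possible workaround in a "Modified Java Script Value" step is to truncate to a byte budget rather than a character count. A sketch, assuming the length check on the other end is byte-based (it does not handle characters outside the BMP):

```javascript
// Count UTF-8 bytes via encodeURIComponent's "%XX" escapes:
// each escape is 3 chars but represents a single byte.
function utf8Bytes(s) {
  var enc = encodeURIComponent(s);
  var esc = enc.match(/%[0-9A-Fa-f]{2}/g) || [];
  return (enc.length - esc.length * 3) + esc.length;
}

// Trim characters off the end until the value fits maxBytes,
// on the assumption that the validation counts bytes, not characters.
// Note: this sketch does not handle surrogate pairs (non-BMP chars).
function truncateToBytes(s, maxBytes) {
  s = String(s || "");
  while (s.length > 0 && utf8Bytes(s) > maxBytes) {
    s = s.substring(0, s.length - 1);
  }
  return s;
}
```

For example, truncating « Tête » (5 UTF-8 bytes) to a 4-byte budget drops one trailing character rather than one byte, so you never end up with a torn multi-byte sequence.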

I'm on 8.1.01 ... anyone ever run into anything like this before?

-Andy
_ARSlist: "Where the Answers Are" and have been for 20 years_


_______________________________________________________________________________
UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
"Where the Answers Are, and have been for 20 years"
