There should be a strong caveat here.
Where all applications are created in house, this is an excellent option.
However, where disparate vendors' software applications are running in an
environment, application integration does not serve well. Rather, a data
integration model works best: the more "canned" the software, the more
problems application integration will present.
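
The parser -> validating API -> database pipeline discussed in the quoted
messages below could be sketched roughly as follows. This is a minimal
illustration only; all names (parse, validate, load, the pipe-delimited
record format) are hypothetical and no particular EDI product or schema is
assumed.

```python
def parse(raw_record):
    """Stand-in for a third-party or in-house parser: split a
    pipe-delimited record into named fields."""
    part_no, qty = raw_record.split("|")
    return {"part_no": part_no.strip(), "qty": qty.strip()}

def validate(record):
    """The API layer: apply business logic and reject data that
    appears to be rubbish before it ever reaches the database."""
    errors = []
    if not record["part_no"]:
        errors.append("missing part number")
    if not record["qty"].isdigit() or int(record["qty"]) <= 0:
        errors.append("quantity must be a positive integer")
    return errors

def load(raw_records, database):
    """Accept clean rows into the database; hand back rejects so bad
    data can be reviewed or re-requested instead of loaded."""
    rejects = []
    for raw in raw_records:
        record = parse(raw)
        errors = validate(record)
        if errors:
            rejects.append((raw, errors))
        else:
            database.append(record)
    return rejects

db = []
rejects = load(["A100|5", "A101|-2", "|7"], db)
# only the clean record reaches db; the two rejects carry their reasons
```

The point of the middle layer is that inbound edits live in one place under
version control, rather than being scattered across per-partner maps.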
Peter Olivola ([EMAIL PROTECTED])
> -----Original Message-----
> From: Electronic Data Interchange Issues
> [mailto:[EMAIL PROTECTED]]On Behalf Of Eliot Muir
> Sent: Thursday, March 02, 2000 2:47 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Mapping to/from Access and/or Excel
>
>
> Don't forget the third option where your data stream
> goes through:
>
> [Parser (third party or in house)]
> |
> \/
> [API - which can then verify business logic
> and reject data which appears to be rubbish]
> |
> \/
> Database
>
> So you're not dealing with batch loading and you're not
> dealing with a package which maps straight into your
> database.
>
> Cheers,
> Eliot
>
> --
> Eliot Muir, Technical Director iNTERFACEWARE
> mailto:[EMAIL PROTECTED]
> Voice 64-21-333068 http://www.interfaceware.com
>
> Makers of iNTERFACEWARE Chameleon
> "Program to the iNTERFACE not the implementation"
>
> -----Original Message-----
> From: Andrew Clifford [SMTP:[EMAIL PROTECTED]]
> Sent: Thursday, March 02, 2000 6:27 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Mapping to/from Access and/or Excel
>
> I'm interested to hear how people feel about these two approaches to EDI
> integration: flat-file interface vs. direct database calls.
>
> I would think that direct DB calls would be more efficient on
> outbound since
> the data is presumed clean and is what you want to send to your trading
> partner. Even if the data goes through a flat-file intermediate
> step first
> before hitting the mapper, the source is the same.
>
> On inbound, however, I think it's more fraught with peril since you risk
> garbage coming into your database without an adequate opportunity
> to review
> the data. If you add edits into the mapping to- for example- kick out
> questionable data or outright errors, you'll have to constantly tweak the
> maps to keep up with the inventive ways trading partners can make life
> interesting. After you fix the offending data or ask for re-sends, you'll
> have to reprocess the stream and perhaps keep track of duplicates.
>
> Should you decide to change your EDI software - no one would actually do
> that, would they? ;) - you may have an entirely new API to deal with.
>
> For flat-files, I think that overhead is higher and of course batch
> processing is not in "real" time.
>
> However, if you use a flat-file approach, you will still have to deal with
> edits on inbound but you'd be changing load code that you will be able to
> manage through version control and code libraries.
>
> If you then change your EDI software, you'd need only to ensure that the
> mapper writes out identically formatted flat-files, preserving your load
> programs.
>
> I haven't touched on all pros and cons and I'd like to hear what others
> think on this. It's a fundamental issue and I wonder how you arrived at
> your choice.
>
>
>
>
>
=======================================================================
To signoff the EDI-L list, mailto:[EMAIL PROTECTED]
To subscribe, mailto:[EMAIL PROTECTED]
To contact the list owner: mailto:[EMAIL PROTECTED]
Archives at http://www.mail-archive.com/edi-l%40listserv.ucop.edu/