Re: Is there currently a way to access MongoDB from z/OS LE languages?
In of6c884318.84e6f64a-on48257c16.00237d2b-48257c16.00257...@sg.ibm.com, on 11/01/2013 at 02:47 PM, Timothy Sipples sipp...@sg.ibm.com said:

Now, I stipulate that there are many desirable capabilities. Operating on/with EBCDIC data is often useful. There are two ways to try to accomplish that goal:

FSVO "two" larger than the standard value. False dichotomies are not helpful.

3. Select only those open source programs that meet your needs.
4. If there are open source programs that would meet your needs if specific facilities were available, either
   a. Submit an enhancement request
   b. Write and contribute code

-- Shmuel (Seymour J.) Metz, SysProg and JOAT ISO position; see http://patriot.net/~shmuel/resume/brief.html
We don't care. We don't have to care, we're Congress. (S877: The Shut up and Eat Your Spam act of 2003)

-- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Hi all,

While I may change my mind in the future, I've pretty much decided to abandon the project for now, for these reasons:

1. MongoDB data is UTF-8 and not even ASCII. An EBCDIC version is thus irrelevant and not needed. This is different than the situation with the PCRE library, where an EBCDIC version is possible and useful in an EBCDIC environment, and thus relevant.

2. Once the need for EBCDIC is ruled out, there is no real need for a classic z/OS version. The C modules of the C driver could easily be compiled under USS, and the only pieces that might be needed are some COBOL copybooks and code to interface with the DLL, resolve big- and little-endian conflicts, and so on. This is a fairly easy task to do, but I sense that there is no specific need for it. As I've said, I may go back and do it some time in the future.

3. The most important obstacle is the fact that there is no decent development environment that could be reasonably and legally available to open source developers. This, combined with the hostility towards C in classic z/OS shops, curtails the ability of many potential users to actually build from source code and the ability of the project to distribute binaries.

I will concentrate my open source efforts elsewhere.

ZA
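Point 2's big/little-endian conflicts are concrete in MongoDB's case: the BSON spec fixes all integers as little-endian on the wire, while System z is big-endian. A minimal sketch of byte-order-independent access, assuming nothing about the host (the helper names here are hypothetical, not the actual C driver's API):

```c
#include <stdint.h>

/* BSON fixes all integers as little-endian on the wire; System z is
 * big-endian. Assembling the value byte-by-byte works on either kind
 * of host, so no #ifdef on endianness is needed.
 * (Helper names are hypothetical, not the real driver's API.) */
static int32_t bson_read_int32(const unsigned char *p) {
    return (int32_t)((uint32_t)p[0]
                   | ((uint32_t)p[1] << 8)
                   | ((uint32_t)p[2] << 16)
                   | ((uint32_t)p[3] << 24));
}

static void bson_write_int32(unsigned char *p, int32_t v) {
    uint32_t u = (uint32_t)v;
    p[0] = (unsigned char)(u        & 0xFF);
    p[1] = (unsigned char)((u >> 8)  & 0xFF);
    p[2] = (unsigned char)((u >> 16) & 0xFF);
    p[3] = (unsigned char)((u >> 24) & 0xFF);
}
```

Reading and writing byte-by-byte like this avoids any compile-time test on host endianness, which is usually less error-prone than swap macros.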
Re: Is there currently a way to access MongoDB from z/OS LE languages?
David Crayford writes: Do you have any real world experience with open source and porting to z/OS?

Yes, some.

Tony Harminc opines: Sure, if all your application does is crunch numbers or manipulate bytes. But if it has any interaction with the operating system such as calling its services...

Which (other) services?

referencing external names (dsnames and file names being only two of many),

There is no requirement on z/OS to support dataset names unless operating on datasets (VSAM files, PDSes, PDSEs, etc.). Which I've already covered in my statement.

dealing with operator services, etc. etc. etc., you will find it

There is no requirement on z/OS for open source ported applications to deal with operator services. For example, there is no requirement on z/OS that applications produce SMF records. *Requirement*. Everything you're describing may be desirable, even highly desirable, but not REQUIRED.

Now, I stipulate that there are many desirable capabilities. Operating on/with EBCDIC data is often useful. There are two ways to try to accomplish that goal:

1. Complain to and criticize each and every individual open source community for every open source product, that they must accept adding a laundry list of features to their mainline source code in order to exploit z/OS-unique and z/OS-desirable features. That approach doesn't seem to be working, does it?

2. As the Java community has, and IBM has (and continues to do), bolster the generalized runtime environments on z/OS so that more open source products can come to z/OS more easily and (optionally!) exploit z/OS without changes to the mainline source code (or with at least fewer changes).

For example, if you want open source applications to be able to operate on/with EBCDIC data, how about a generic approach that works for all (or at least most) open source products? My /ebcdic path idea isn't necessarily best or even viable, but at least I'm trying. Might I suggest door #2?
What would a Common Transparent z/OS Services Environment look like? I'm not excluding the possibility that IBM might implement *some* of that sort of stuff. IBM already is, e.g. z/OS 2.1's GNUish file I/O. For example, one radical (but probably viable) idea would be a userland GNU/Linux atop z/OS. Anybody interested in doing that?

Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 1 November 2013 02:47, Timothy Sipples sipp...@sg.ibm.com wrote:

Tony Harminc opines: [with respect to the need to use EBCDIC] Sure, if all your application does is crunch numbers or manipulate bytes. But if it has any interaction with the operating system such as calling its services...

Which (other) services?

ENQ/DEQ. WTO/WTOR/QEDIT and friends. OPEN/CLOSE. TPUT/TGET. Many of the UNIX callable services. There are many more. Some of the services merely use EBCDIC constants or values in their invocations; others actually deal with EBCDIC data.

There is no requirement on z/OS to support dataset names unless operating on datasets (VSAM files, PDSes, PDSEs, etc.). Which I've already covered in my statement.

dealing with operator services, etc. etc. etc., you will find it

There is no requirement on z/OS for open source ported applications to deal with operator services. For example, there is no requirement on z/OS that applications produce SMF records.

So my statement you quoted above pretty much covers it. If you are happy for your app to run isolated, you are fine. If you want to talk to the outside world, you need to support EBCDIC to at least some degree. If what you have is a callable subroutine that manipulates data and returns a result, yes of course the caller can do the necessary translations on input and output.

*Requirement*. Everything you're describing may be desirable, even highly desirable, but not REQUIRED.

Oh come on. A program is not REQUIRED to do anything beyond its specifications, and those can be as minimal as you like. I will stipulate that IEFBR14 will run fine in ASCII, EBCDIC, or even UTF-8. What do you accomplish by making this point?

Now, I stipulate that there are many desirable capabilities. Operating on/with EBCDIC data is often useful. There are two ways to try to accomplish that goal:

1. Complain to and criticize each and every individual open source community for every open source product, that they must accept adding a laundry list of features to their mainline source code in order to exploit z/OS-unique and z/OS-desirable features. That approach doesn't seem to be working, does it?

A strange straw man to set up. I'm not aware of this complaining and criticizing you speak of. Can you give an example of such a laundry list? What I hear, and have contributed to, is the notion that people should design and write code in a portable manner. This was once considered evident goodness among almost all programmers, but the notion has faded with respect to non-ASCII character sets, whereas e.g. portability wrt endianness hasn't. If you write non-portable stuff like

  If 'A' <= char <= 'Z' then char_is_uc_alpha = TRUE

then you will have trouble running on any non-ASCII system. But this, and worse, continues to this day. Quite evidently this has happened because while there are many prominent platforms of both big- and little-endian persuasion, there are approximately five current EBCDIC OSs in existence running on two hardware platforms, and most people know nothing about any of them.

2. As the Java community has, and IBM has (and continues to do), bolster the generalized runtime environments on z/OS so that more open source products can come to z/OS more easily and (optionally!) exploit z/OS without changes to the mainline source code (or with at least fewer changes).

There's nothing wrong with this, certainly. But I see little to no connection with your earlier point.

For example, if you want open source applications to be able to operate on/with EBCDIC data, how about a generic approach that works for all (or at least most) open source products? My /ebcdic path idea isn't necessarily best or even viable, but at least I'm trying.

This addresses an already-addressed problem, arguably in a worse way, and it's the narrow problem of UNIXy file I/O.
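Tony's portability example is worth spelling out. In EBCDIC the uppercase letters are not contiguous: they occupy three runs (A-I, J-R, S-Z) with non-letter code points in the gaps, so the single range test silently misfires. A small illustration, written against explicit IBM-1047 code point values so it compiles and behaves identically on an ASCII host:

```c
/* In IBM-1047 EBCDIC: 'A'..'I' = 0xC1..0xC9, 'J'..'R' = 0xD1..0xD9,
 * 'S'..'Z' = 0xE2..0xE9. The bytes in between (e.g. 0xCA..0xD0) are
 * not letters at all. */

/* The non-portable habit: one range test, correct only where the
 * alphabet is contiguous (ASCII). Applied to EBCDIC it accepts junk. */
static int is_upper_naive_ebcdic(unsigned char c) {
    return c >= 0xC1 && c <= 0xE9;
}

/* A correct EBCDIC test needs all three runs. */
static int is_upper_ebcdic(unsigned char c) {
    return (c >= 0xC1 && c <= 0xC9)    /* A-I */
        || (c >= 0xD1 && c <= 0xD9)    /* J-R */
        || (c >= 0xE2 && c <= 0xE9);   /* S-Z */
}
```

Portable code sidesteps the whole issue by calling isupper() from <ctype.h>, which is defined in terms of the compiler's execution character set, whatever that happens to be.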
For example, one radical (but probably viable) idea would be a userland GNU/Linux atop z/OS. Anybody interested in doing that?

Perhaps, but it's hard to see the business case when zLinux and z/OS UNIX already exist. And it certainly won't be IBM distributing it if it's GPL licensed, which any GNU/Linux-based product would be.

Tony H.
Re: Is there currently a way to access MongoDB from z/OS LE languages?
As long as the program accepts the data as valid and doesn't check it for valid ASCII characters, it should work for any character set. Let the operating system determine if it is a valid data set name, path, etc.

-- Mike A Schwab, Springfield IL USA Where do Forest Rangers go to get away from it all?
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Timothy is right - as long as your program doesn't call services that require EBCDIC names, you don't need EBCDIC. What's the open source equivalent of IEFBR14, the Unix true command?

Kirk Wolf Dovetailed Technologies http://dovetail.com
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Dave, I actually looked up JDBC after your post... and was somewhat surprised. I had assumed that JDBC was just a generic Java database connector. Instead it is just for relational databases using SQL. I personally think that this was a huge assumption by the original authors. It should have been JRDBC, leaving the 2nd char as an indicator of what type of DB was being connected. VBG

Rob

On Oct 26, 2013 12:55 AM, David Crayford dcrayf...@gmail.com wrote:

On 25/10/2013 11:13 PM, Rob Schramm wrote: Not sure how to respond... on the one hand you have an excellent point. On the other hand, Google jdbc and mongodb, as well as there being a JDBC link on the MongoDB page in addition to the MongoDB Java connectors. Doesn't really change my intent... grab the MongoDB Java database driver (how does jmdbc driver sound???) and couple it with the COBOL application code.

I understood your original intent, Rob. I was just sounding off about JDBC drivers for non-relational databases. I've never quite grasped why there are so many SQL adapters for non-relational databases. Even IMS has a Java SQL interface with ODBC and I just don't get it. Is SQL really that much better than native APIs? In the case of your typical key/value data store surely get/set is easier than SELECT FROM WHERE/UPDATE SET IN etc.

Rob On Oct 25, 2013 3:03 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 1:51 PM, Rob Schramm wrote: With a JDBC driver and a bit of Java code you could use the COBOL/Java procedure BCDBATCH to help tie the two together. Did a quick scan and there appear to be at least a few JDBC drivers.

I'm scratching my head as to why a JDBC driver is useful with a NoSQL database which has a very specific API. Why not just use the MongoDB Java API? Does JDBC provide some kind of value add?
Rob

Rob Schramm Senior Systems Consultant Imperium Group

On Fri, Oct 25, 2013 at 1:18 AM, David Crayford dcrayf...@gmail.com wrote:

On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianness should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type? If not, then I will abandon the whole project.

I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC, which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete Unicode set.

The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec: http://bsonspec.org/#/specification. I really like the look of BSON. It's like Google protocol buffers but more flexible. XML is the pleated khakis of the document markup world.

http://www.unicode.org/reports/tr16/

Tony H.
Re: Is there currently a way to access MongoDB from z/OS LE languages?
As far as Perl and EBCDIC goes, the biggest objection was the lack of any system that the Perl guys can use to validate the code with. I had put out a question to the list a while back... looking to run Tomcat and JSPWiki on a z/OS system... and offer up some system time to the Perl guys for testing... Doesn't seem to be a way to do it. Although I have stumbled across a couple of systems that folks have made freely available. Just not sure how they are doing it legally.

Rob
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 31 October 2013 01:35, Timothy Sipples sipp...@sg.ibm.com wrote:

Shmuel Metz writes: ...z/OS does require EBCDIC.

It does not (if referring to ported applications), and repeating a falsehood does not make it any more true. EBCDIC support is required if and only if there is a requirement to operate on/with EBCDIC-encoded data. z/OS does not require an application to support EBCDIC in order to run and run well. There is no requirement to store user data in EBCDIC, though many/most z/OS customers store at least some data in EBCDIC.

Sure, if all your application does is crunch numbers or manipulate bytes. But if it has any interaction with the operating system such as calling its services, referencing external names (dsnames and file names being only two of many), dealing with operator services, etc. etc. etc., you will find it impossible to not deal with EBCDIC.

Precision here is particularly important. I would humbly suggest that those who are misleading members of the open source community about z/OS are not doing anyone any favors.

Indeed!

Tony H.
Re: Is there currently a way to access MongoDB from z/OS LE languages?
I've recently been writing web servers using Lua/Orbit. From the reports I've read there doesn't seem to be any particular problem compiling Lua on z/OS (via make posix). As mentioned previously, if you want to operate on EBCDIC data there might be additional steps you have to take (might), but if that's true it would be true of Lua in general and not only when running on z/OS specifically. Though I did see a report that a commercial firm decided to use Lua on IBM i, and they may have done so with EBCDIC support.

To comment a little on the Perl conversation: within the open source community, Perl's maintainers seem to be particularly hung up on EBCDIC support and not particularly interested in it (to be charitable). Well, OK -- but why impose that requirement when it doesn't exist in reality? If somebody wants to have an EBCDIC extension to Perl -- Perl allows various non-core extensions, which is partly what makes Perl popular in certain domains -- then OK, and that extension can be separately maintained (and would be useful both on and off z/OS, since EBCDIC is not unique to z/OS nor useful only when running on z/OS). But why try to force EBCDIC into Perl mainline itself when z/OS doesn't require EBCDIC?

In my view, those who are interested in encouraging and supporting more open source software on z/OS -- as I am -- need to be a bit more, er, diplomatic. Yes, I understand why many z/OS-centric and z/OS-specific features would be nice to have, including support for operating on EBCDIC data. However, various open source communities have different priorities and interests -- and that's not a z/OS-specific observation. Moreover, something like EBCDIC support should be quite easy to keep well segregated and generalized, via z/OS Unicode Services in particular. So keep it simple when porting and rely on z/OS's support for UNIX, POSIX, Unicode, and other common standards.
Then, later, optionally, plug into a generalized z/OS Unicode Services layer (for example), which *might* involve negotiating over only one or a couple of lines added to the mainline source code for the particular open source project.

Thinking aloud (metaphorically speaking), maybe it would make a lot of sense to have a generalized z/OS-exploiting services layer, with common services that typical open source software users can optionally use but which aren't required to be incorporated into the open source mainline itself. It seems like that sort of approach ought to be very possible. What I'd suggest is agreeing on a ranked list of technically optional but nice-to-have z/OS-exploiting features, then figuring out how to have a common services runtime/library that typical open source software would either automatically tap into (via existing interfaces that open source software already uses, such as file I/O) or that would involve adding at most a line or two to the mainline source.

For example, if we're talking about EBCDIC support, then maybe one way to do that -- thinking aloud again -- is via bog standard open source-friendly file I/O. For example, if the path begins with:

/ebcdic/...

then that would get *externally* trapped and handled at runtime without affecting the application's source code. (It'd be a bit more elegant than this, but you get the idea.) IBM did something vaguely similar to this with early Web serving on z/OS, though in reverse. If the HTML file had a .ascii appended to the file name then the Web server would deliver it as ASCII (well, actually without any character set translation, i.e. binary), while otherwise it would pass through EBCDIC-to-ASCII/Unicode conversion. There are actually already some MVS-aware file I/O services in z/OS UNIX which might already do the trick, or at least which are very close to what's required.
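The /ebcdic proposal reduces to plain path dispatch, which is easy to sketch. Everything below is hypothetical -- the prefix, the names, and the behavior illustrate the proposal, not an existing z/OS facility -- and the actual byte translation would be delegated to iconv() or z/OS Unicode Services:

```c
#include <string.h>

/* Hypothetical sketch of the "/ebcdic/..." idea: a wrapper layer spots
 * the magic prefix, remembers that the stream needs codepage
 * conversion, and hands the real path to the normal open. The byte
 * translation itself (not shown) would go through iconv() or z/OS
 * Unicode Services, outside the application's source code. */
#define EBCDIC_PREFIX "/ebcdic/"

static int wants_ebcdic_conversion(const char *path) {
    return strncmp(path, EBCDIC_PREFIX, sizeof EBCDIC_PREFIX - 1) == 0;
}

/* Strip the prefix but keep the leading '/' of the real path:
 * "/ebcdic/u/data.txt" -> "/u/data.txt". */
static const char *real_path(const char *path) {
    if (wants_ebcdic_conversion(path))
        return path + sizeof EBCDIC_PREFIX - 2;
    return path;
}
```

A wrapper built this way could sit under the C library's open path, so an unmodified open source application handed a /ebcdic/... path would get conversion without any source change -- which is exactly the point of the proposal.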
Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
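[Editor's note] Sipples's /ebcdic/ path-prefix idea above can be sketched in a few lines. This is a hypothetical illustration, not any actual z/OS facility: the `/ebcdic/` prefix convention and the `open_text` wrapper are invented for the sketch, and cp037 stands in for whichever EBCDIC CCSID would really apply (Python ships no IBM-1047 codec).

```python
import codecs
import os
import tempfile

EBCDIC_PREFIX = "/ebcdic/"  # hypothetical path convention from the post
EBCDIC_CODEC = "cp037"      # stand-in EBCDIC codepage

def open_text(path, mode="r"):
    """Open a text file; if the hypothetical /ebcdic/ prefix is present,
    strip it and transparently decode the file's EBCDIC bytes to Unicode,
    so the application's own source code never changes."""
    if path.startswith(EBCDIC_PREFIX):
        real_path = path[len(EBCDIC_PREFIX) - 1:]  # keep the leading slash
        return codecs.open(real_path, mode, encoding=EBCDIC_CODEC)
    return open(path, mode)

# Round trip: write EBCDIC bytes to disk, read them back as ordinary text.
fd, tmp = tempfile.mkstemp()
os.write(fd, "HELLO FROM EBCDIC".encode(EBCDIC_CODEC))
os.close(fd)
with open_text("/ebcdic" + tmp) as f:
    text = f.read()
os.unlink(tmp)
```

As Shmuel notes later in the thread, z/OS UNIX file tagging is the cleaner real-world mechanism; the prefix trick above is only the shape of the idea.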
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 30/10/2013 3:01 PM, Timothy Sipples wrote: I've recently been writing web servers using Lua/Orbit. From the reports I've read there doesn't seem to be any particular problem compiling Lua on z/OS (via make posix). As mentioned previously, if you want to operate on EBCDIC data there might be additional steps you have to take (might), but if that's true it would be true of Lua in general and not only when running on z/OS specifically. Though I did see a report that a commercial firm decided to use Lua on IBM i, and they may have done so with EBCDIC support. Any z/OS tool that doesn't support EBCDIC is a dud from the start. Simple as that. There is no problem building Lua, but you may have issues executing it. make posix will build the basic interpreter and runtime, that's all. Then optimize the code and run a coroutine - ABEND. z/OS C has a different stack layout to other *nix systems. There was a similar problem porting Ruby to z/OS. https://www.ruby-forum.com/topic/114046. Then you have to tackle the regex like string handling routines. And then of course to do anything useful you need packages, and that's where the hard work starts. It's certainly not as simple as running make. Porting packages for web programming, like HTTP protocols, will require EBCDIC work at the socket level to handle those pesky newlines. And then there's a lot of patching the C code for runtime inconsistencies (almost all packages were coded for Linux). It took me a good few days just to port the socket package. There is a considerable amount of work getting it to run smoothly on z/OS. It's taken me a year to build up the runtime and I've got almost 20 years of C/C++ programming experience and have ported a lot of software to z/OS. I've had a lot of interest in Lua, including vendors and several IBM labs. I should be ready to release it soon. 
If I could coach a little bit on the Perl conversation within the open source community, Perl's maintainers seem to be particularly hung up on EBCDIC support and not particularly interested in it (to be charitable). Well, OK -- but why impose that requirement when it doesn't exist in reality? If somebody wants to have an EBCDIC extension to Perl -- Perl allows various non-core extensions, which is partly what makes Perl popular in certain domains -- then OK, and that extension can be separately maintained (and would be useful both on and off z/OS since EBCDIC is not unique to z/OS nor useful only when running on z/OS). But why try to force EBCDIC into Perl mainline itself when z/OS doesn't require EBCDIC? I don't see what the big issue is. Just maintain a separate patch file. In my view, those that are interested in encouraging and supporting more open source software on z/OS -- as I am -- need to be a bit more, er, diplomatic. Yes, I understand why many z/OS-centric and z/OS-specific features would be nice to have, including support for operating on EBCDIC data. However, various open source communities have different priorities and interests -- and that's not a z/OS-specific observation. Moreover, something like EBCDIC support should be quite easy to keep well segregated and generalized, via z/OS Unicode Services in particular. So keep it simple when porting and rely on z/OS's support for UNIX, POSIX, Unicode, and other common standards. Then, later, optionally, plug into a generalized z/OS Unicode Services layer (for example), which *might* involve negotiating over only one or a couple lines added to the mainline source code for the particular open source project. Wishful thinking! And I don't know any z/OS Unix programmer who has anything good to say about the z/OS Unicode Systems Services API. 
Thinking aloud (metaphorically speaking), maybe it would make a lot of sense to have a generalized z/OS-exploiting services layer, with common services that typical open source software users can optionally use but which aren't required to be incorporated into the open source mainline itself. It seems like that sort of approach ought to be very possible. What I'd suggest is agreeing on a ranked list of technically optional but nice-to-have z/OS-exploiting features, then figuring out how to have a common services runtime/library that typical open source software would either automatically tap into (via existing interfaces that open source software already use, such as file I/O) or that would involve adding at most a line or two to the mainline source. For example, if we're talking about EBCDIC support, then maybe one way to do that -- thinking aloud again -- is via bog-standard open source-friendly file I/O. For example, if the path begins with: /ebcdic/... Then that would get *externally* trapped and handled at runtime without affecting the application's source code. (It'd be a bit more elegant than this, but you get the idea.) IBM did something vaguely similar to this with early Web serving on z/OS though in reverse. If the HTML file had a .ascii appended to the file name then the Web server would deliver it as ASCII (well, actually without any character set translation, i.e. binary), while otherwise it would pass through EBCDIC-to-ASCII/Unicode conversion.
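[Editor's note] David's point about "pesky newlines" at the socket level can be made concrete. A minimal sketch (Python ships cp037/cp500 EBCDIC codecs but not IBM-1047, the usual z/OS codepage, so cp037 stands in): HTTP is defined in terms of ASCII octets, so an EBCDIC-side application must translate at the socket boundary and keep CR/LF as wire bytes.

```python
request = "GET /index.html HTTP/1.1"

wire = request.encode("ascii") + b"\r\n"   # what HTTP requires on the wire
local = request.encode("cp037")            # the same text on the EBCDIC side

# The printable text is entirely different bytes in the two encodings...
assert wire[:3] == b"GET"
assert local[:3] == b"\xc7\xc5\xe3"        # 'G', 'E', 'T' in EBCDIC

# ...and even "newline" is not a stable byte: cp037 maps U+000A to 0x25,
# while IBM-1047 (the z/OS default, unavailable here) maps it to 0x15.
assert "\n".encode("cp037") == b"\x25"
```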
Re: Is there currently a way to access MongoDB from z/OS LE languages?
In of5f4879d8.ef7c6ebe-on48257c14.0023d7b0-48257c14.00269...@sg.ibm.com, on 10/30/2013 at 03:01 PM, Timothy Sipples sipp...@sg.ibm.com said: If I could coach a little bit on the Perl conversation within the open source community, Perl's maintainers seem to be particularly hung up on EBCDIC support and not particularly interested in it (to be charitable). The impression that I have is they're perfectly willing to accept the code for EBCDIC support as long as it doesn't break anything; they're not willing or able to write and test that code themselves. Do you have any reason to believe that they would be unwilling to accept code from, e.g., IBM? Well, OK -- but why impose that requirement when it doesn't exist in reality? Why is Saturn the 3rd planet from the Sun? For many, the requirement for EBCDIC support does exist in reality, and a language that does not support it is out of the running. But why try to force EBCDIC into Perl mainline itself when z/OS doesn't require EBCDIC? Again your question presupposes something contrary to fact; z/OS does require EBCDIC. The fact that in some contexts it also supports other character sets does not alter that. something like EBCDIC support should be quite easy to keep well segregated and generalized, via z/OS Unicode Services in particular. No; the Devil is in the details. For example, if the path begins with: /ebcdic/... Bletch! Tagging is a lot cleaner. -- Shmuel (Seymour J.) Metz, SysProg and JOAT ISO position; see http://patriot.net/~shmuel/resume/brief.html We don't care. We don't have to care, we're Congress. (S877: The Shut up and Eat Your spam act of 2003) -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Shmuel Metz writes: ...z/OS does require EBCDIC. It does not (if referring to ported applications), and repeating a falsehood does not make it any more true. EBCDIC support is required if and only if there is a requirement to operate on/with EBCDIC-encoded data. z/OS does not require an application to support EBCDIC in order to run and run well. There is no requirement to store user data in EBCDIC, though many/most z/OS customers store at least some data in EBCDIC. Precision here is particularly important. I would humbly suggest that those who are misleading members of the open source community about z/OS are not doing anyone any favors. Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
You speak with great authority about this Timothy. Do you have any real world experience with open source and porting to z/OS? On 31/10/2013 1:35 PM, Timothy Sipples wrote: Shmuel Metz writes: ...z/OS does require EBCDIC. It does not (if referring to ported applications), and repeating a falsehood does not make it any more true. EBCDIC support is required if and only if there is a requirement to operate on/with EBCDIC-encoded data. z/OS does not require an application to support EBCDIC in order to run and run well. There is no requirement to store user data in EBCDIC, though many/most z/OS customers store at least some data in EBCDIC. Precision here is particularly important. I would humbly suggest that those who are misleading members of the open source community about z/OS are not doing anyone any favors. Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
David Crayford writes: I wonder if there is a market for mainframe legacy applications to access NoSQL data stores? Of course. Case in point: the IBM DB2 Analytics Accelerator. The PureData System for Analytics which powers the IDAA is, as it happens, a NoSQL data store. However, applications keep using SQL to access something NoSQL, including now static SQL. We've perhaps come full circle. :-) I'm not big on the political/religious wars, though, about relational v. non-relational. They're both wonderful in my view, just as airplanes and railway locomotives are both wonderful. You asked about JSON. Sure, no problem. There are many ways. To pick an example, for CICS Transaction Server (as far back as Version 4.2), there's the CICS Transaction Server Feature Pack for Mobile Extensions, available at no additional charge here: http://www.ibm.com/software/htp/cics/mobile/ You weren't specific about IMS Transaction Manager or IMS Database, so I'll try to answer both ways. Of course routes via CICS Transaction Server work if you have CICS TS. If you don't have CICS TS then you could use WebSphere Message Broker for z/OS. Or you could use IBM Worklight in conjunction with the standard IMS-included connectors. IBM Worklight is available for Linux on zEnterprise today, and there's an IBM Statement of Direction to make Worklight available for z/OS as well. Worklight does much more than the use case you describe, but it covers that use case, too. Yet another example is via WebSphere Application Server for z/OS, either as a full WAS instance or via a WebSphere Liberty Profile deployment on z/OS. There are various JSON approaches that work within that context, too, if you would prefer a Java Enterprise Edition (JEE) or JEE subset approach. Still yet another example (depending on what you're doing) is to come in via the z/OS Management Facility interfaces. Anyway, those are just a few examples. There are others. 
Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 29/10/2013 4:47 PM, Timothy Sipples wrote: David Crayford writes: I wonder if there is a market for mainframe legacy applications to access NoSQL data stores? Of course. Case in point: the IBM DB2 Analytics Accelerator. The PureData System for Analytics which powers the IDAA is, as it happens, a NoSQL data store. However, applications keep using SQL to access something NoSQL, including now static SQL. Most MVC web frameworks use an ORM framework. Relational data bases and OO languages don't work well together due to the object-relational impedance mismatch. What's the point of using SQL with a NoSQL data base when you can just serialize data into something like BSON? Adding another layer of complexity is pointless. We've perhaps come full circle. :-) I'm not big on the political/religious wars, though, about relational v. non-relational. They're both wonderful in my view, just as airplanes and railway locomotives are both wonderful. Everything always comes full circle. I was watching music videos with the kids last week and a hip new female rock band came on sporting headbands, kaftans and were laying down some seriously fat 70s rock riffs. Cool! Interestingly, when you look under the covers of the new breed of web servers/frameworks there is something very familiar. node.js, nginx and even data stores like Redis are all driven by asynchronous, non-blocking event loops. They all use a single thread to process thousands of concurrent connections. It reminds me of CICS and the QR-TCB. Even the programming models, node.js callbacks etc are similar to CICS quasi-reentrant programming. Get in/out as quickly as possible and wait for an event. The Reactor pattern is certainly not new but it's been simplified to work well with simple but powerful dynamic programming languages. You asked about JSON. Sure, no problem. There are many ways. 
To pick an example, for CICS Transaction Server (as far back as Version 4.2), there's the CICS Transaction Server Feature Pack for Mobile Extensions, available at no additional charge here: http://www.ibm.com/software/htp/cics/mobile/ I've already read that. It's interesting to see the solution for mapping JSON from static languages like COBOL. I'm sure it works well. Wouldn't it be great if CICS web services could be as simple as Sinatra using JRuby. You weren't specific about IMS Transaction Manager or IMS Database, so I'll try to answer both ways. Of course routes via CICS Transaction Server work if you have CICS TS. If you don't have CICS TS then you could use WebSphere Message Broker for z/OS. Or you could use IBM Worklight in conjunction with the standard IMS-included connectors. IBM Worklight is available for Linux on zEnterprise today, and there's an IBM Statement of Direction to make Worklight available for z/OS as well. Worklight does much more than the use case you describe, but it covers that use case, too. Yet another example is via WebSphere Application Server for z/OS, either as a full WAS instance or via a WebSphere Liberty Profile deployment on z/OS. There are various JSON approaches that work within that context, too, if you would prefer a Java Enterprise Edition (JEE) or JEE subset approach. I've recently been writing web servers using Lua/Orbit. It may not offload to a zIIP but it's just so much easier than WAS/Java. Still yet another example (depending on what you're doing) is to come in via the z/OS Management Facility interfaces. That's a very nice interface for scheduling and monitoring jobs from distributed platforms. Hopefully the first of many. Anyway, those are just a few examples. There are others. 
Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
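[Editor's note] The single-threaded, non-blocking event loop David compares to the CICS QR TCB is easy to sketch with Python's standard selectors module. The socketpair and the upper-casing echo handler are illustrative only; real servers (node.js, nginx, Redis) run the same select-and-dispatch turn forever over thousands of connections.

```python
import selectors
import socket

sel = selectors.DefaultSelector()

def on_readable(conn):
    # Quasi-reentrant style: get in, do a small unit of work, get out.
    data = conn.recv(4096)
    if data:
        conn.sendall(data.upper())

client, server = socket.socketpair()
# Register the server end with the callback to dispatch when it's readable.
sel.register(server, selectors.EVENT_READ, on_readable)

client.sendall(b"hello reactor")
# One turn of the event loop: wait for ready sockets, dispatch callbacks.
for key, _events in sel.select(timeout=1):
    key.data(key.fileobj)

reply = client.recv(4096)
```

A production loop would wrap the `for` in `while True` and register accept/read/write handlers per connection; the single thread never blocks on any one client.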
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Who's using F1? MongoDB is currently valued at $1B and has venture capitalists throwing money at it. Last time I looked Mongo could handle joins and complex data and had a very rich query language. F1 is obviously new and it is not clear how (if at all) Google would release it for non-Google users, but it is a fact that one of the largest big data users in the world felt the need for such a thing. Mongo could handle joins in a forced, non-natural way which they call embedded documents and linking. This is akin to what is done in IMS. Don't misunderstand me, Mongo has its uses, but OLTP is decidedly not one of them. The flatter the data model in Mongo, the better you realize its benefits. Again, for complex data models you would want something better. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
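[Editor's note] Ze'ev's distinction between Mongo's embedding and linking can be shown with plain dictionaries (a sketch, not the actual MongoDB API; all field names and values here are invented for illustration).

```python
# Embedding: child records live inside the parent document, so one read
# fetches the whole hierarchy -- much like an IMS segment tree.
order_embedded = {
    "_id": 1001,
    "customer": {"name": "Acme Corp", "city": "Boston"},
    "lines": [{"part": "A-100", "qty": 5}, {"part": "B-200", "qty": 2}],
}

# Linking: the document holds only a reference, and the *application*,
# not the database, performs the join with a second lookup.
customers = {42: {"name": "Acme Corp", "city": "Boston"}}
order_linked = {"_id": 1002, "customer_id": 42,
                "lines": [{"part": "A-100", "qty": 1}]}

customer = customers[order_linked["customer_id"]]  # the "join", done by hand
```

Flat, embedded documents read in one shot; the moment the model needs real relations, the navigation burden shifts to the application, which is the point being made.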
Re: Is there currently a way to access MongoDB from z/OS LE languages?
This is a very old argument. Hierarchical data bases (HDBs) long antedate relational ones (RDBs), and the deficiencies of HDBs were once well understood. The chief problem with them is that an HDB and applications that use it are not independent. They are unhappy bedfellows. If one is changed, corresponding changes must be made in the other too. A major consequence of this interdependence is that HDBs tend to be application-specific. Each of N applications has its own HDB, one of N. This point made, it must be added that RDBs have not usually realized their promise. Many, perhaps most of them are badly designed. Their promise remains. They can be, sometimes are, used well; and, in particular, a single RDB can serve multiple applications well. Mr Crayford's point that RDBs do not do list processing well is of a different kind. In my own use of DB2, which is heavy, I have not experienced any difficulty manipulating pointers to data returned by queries to it in stacks, lists, BSTs, and the like. The only kinds of processing I ever do are list- and pointer-orient[at]ed. (My chief objection to COBOL has been that until recently it required me to move data instead of pointers to them.) Until I read Mr. Crayford's post it had thus never occurred to me that I should delegate list manipulations to either an HDB or an RDB; and I still do not find this idea attractive. Others may, however, find it attractive or even necessary if their programming skills have atrophied or were never more than rudimentary. Here, as elsewhere: À chacun son goût! John Gilmore, Ashland, MA 01721 - USA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
David Crayford writes: begin extract IMO, programming skills to develop applications should be kepth to the minimal. I would rather get the job done as quickly as possible then show off rubbing two sticks together when I could just use a match. /end extract and here we have an example of rhetoric rather than substance. I have long since developed my own reusable list processing routines, and I do not really believe that applications and systems programming are different in the sense that they require qualitatively different kinds of skills. One uses the skills one has, and if they are deficient the resulting application is deficient too. The chief reason why so little application code is reusable is that it is conceived in haste or by people who know too little. Most flagrantly, the situation we confront online reflects these deficiencies: AP after AP turns out to be insecure because written in radical ignorance of how to make it secure. (The most recent US-CERT vulnerability summary, that for the week of 14 October 2013, listed 58 new high vulnerabilities; and this is a typical weekly count.) Mr. Crayford is of course entitled to his views and practices. I often---but certainly not always---disagree with them, as I too am entitled to do. I do regret that he seldom takes the time to argue for his positions. Something sucks or is going the way of the dodo, and those who disagree with him do so because they wish to show off by rubbing two sticks together. Too often, ad hominem jibes replace substantive argument; and this is a pity because when he does trouble to present his views in detail I find them interesting. John Gilmore, Ashland, MA 01721 - USA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
David Crayford writes: I'm not aware of any previous requirement for a mainframe COBOL program to access a data base running on a different platform. That's fairly common. To pick an example, Oracle offers a product called Oracle Access Manager for CICS that allows COBOL (and other programming language) programs running in CICS to access Oracle databases. I think OAM4CICS also supports applications running outside CICS as it happens. To pick another example, CICS Transaction Server and IMS Transaction Manager support Web Services. COBOL applications are frequently Web Services consumers (clients), accessing remote applications and databases via Web Services protocols. To pick yet another example, it's a very common pattern for COBOL (and other programming language) applications running on z/VSE (inside or outside CICS Transaction Server) to access DB2 (or other databases) running on Linux on zEnterprise (or perhaps also on other platforms). z/VSE started providing standard, no additional charge support for remote database access several years ago. It's called the z/VSE Database Connector (DBCLI), and more information is available here: http://www.ibm.com/systems/z/os/zvse/products/connectors.html#dbcli Ze'ev Atlas asks: Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. Yes, UTF-8 is supported in Enterprise COBOL. To get started, take a look at the following Enterprise COBOL 5.1 functions: ULENGTH, USUBSTR, USUPPLEMENTARY, UVALID, and UWIDTH. Those are all new in Version 5.1. More information here: http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/tasks/tputf8in.html The following page applies to Enterprise COBOL at least as far back as Version 4.1: http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/tasks/tpstr31.html And here's another code sample (applicable to Enterprise COBOL 5.1) which takes a table of Czech composer names (fun!) 
represented in UTF-8, determines their initials, and outputs the results (in EBCDIC): http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/ref/rputf8e.html Thus interacting with MongoDB in UTF-8 from your COBOL applications should be no problem whatsoever. Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
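[Editor's note] The COBOL intrinsic functions Sipples lists (ULENGTH and friends) exist because UTF-8 character counts and byte counts diverge. A quick illustration in Python rather than COBOL, using a Czech name in the spirit of IBM's sample:

```python
name = "Dvořák"
utf8 = name.encode("utf-8")

# Character length vs. byte length -- the distinction ULENGTH exists for:
assert len(name) == 6   # code points
assert len(utf8) == 8   # bytes: ř and á each take two bytes

# Substring operations must count characters, not bytes (cf. USUBSTR).
# Slicing the raw bytes can cut a character in half:
broken = utf8[:4]            # ends midway through ř's two-byte sequence
assert broken[-1] == 0xC5    # first byte of ř (U+0159), orphaned
```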
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 28/10/2013 11:39 AM, Timothy Sipples wrote: David Crayford writes: I'm not aware of any previous requirement for a mainframe COBOL program to access a data base running on a different platform. That's fairly common. To pick an example, Oracle offers a product called Oracle Access Manager for CICS that allows COBOL (and other programming language) programs running in CICS to access Oracle databases. I think OAM4CICS also supports applications running outside CICS as it happens. I wonder if there is a market for mainframe legacy applications to access NoSQL data stores? To pick another example, CICS Transaction Server and IMS Transaction Manager support Web Services. COBOL applications are frequently Web Services consumers (clients), accessing remote applications and databases via Web Services protocols. Do you know of many customers using COBOL web service consumers in production? After all the hullabaloo about SOA a few years ago it seems to have gone a bit quiet. And the protocol that was being suggested, SOAP, is already obsolete. If we take IMS as an example, it has a SOAP gateway and XML adapters for IMS Connect. That's pretty old school in the world of web services these days where REST/JSON is the architecture of choice. Does anybody know how I can call a RESTful web service from an IMS COBOL program? To pick yet another example, it's a very common pattern for COBOL (and other programming language) applications running on z/VSE (inside or outside CICS Transaction Server) to access DB2 (or other databases) running on Linux on zEnterprise (or perhaps also on other platforms). z/VSE started providing standard, no additional charge support for remote database access several years ago. It's called the z/VSE Database Connector (DBCLI), and more information is available here: http://www.ibm.com/systems/z/os/zvse/products/connectors.html#dbcli Ze'ev Atlas asks: Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. 
Yes, UTF-8 is supported in Enterprise COBOL. To get started, take a look at the following Enterprise COBOL 5.1 functions: ULENGTH, USUBSTR, USUPPLEMENTARY, UVALID, and UWIDTH. Those are all new in Version 5.1. More information here: http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/tasks/tputf8in.html The following page applies to Enterprise COBOL at least as far back as Version 4.1: http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/tasks/tpstr31.html And here's another code sample (applicable to Enterprise COBOL 5.1) which takes a table of Czech composer names (fun!) represented in UTF-8, determines their initials, and outputs the results (in EBCDIC): http://pic.dhe.ibm.com/infocenter/pdthelp/v1r1/topic/com.ibm.entcobol.doc_5.1/PGandLR/ref/rputf8e.html Thus interacting with MongoDB in UTF-8 from your COBOL applications should be no problem whatsoever. Timothy Sipples GMU VCT Architect Executive (Based in Singapore) E-Mail: sipp...@sg.ibm.com -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Is SQL really that much better than native APIs? In the case of your typical key/value data store surely get/set is easier than SELECT FROM WHERE/UPDATE SET IN etc. My short answer would be YES! A typical key/value data store may be better handled with get/set. But presenting complex data (think OLTP) in those data stores is pretty much impossible. The hierarchical databases like IMS and IDMS (multi-hierarchies) were an interesting solution that used APIs to navigate the hierarchies [and I did a lot of work in both.] However, ultimately, in any real world application that is beyond what you could handle in a flat, Excel-like store or a typical key/value data store, you find the need for relations... and the relational model. A typical key/value data store with some forced relations may be good for a warehouse type of application. Anything else needs some relational model, and SQL engines are pretty refined and do the navigation for you. My reason for thinking about interfacing MongoDB to COBOL is the fact that COBOL is very well suited to dealing with APIs and the hierarchical model. And I believe that MongoDB has its place as a warehouse engine. Again, even in the Big Data movement there is now a tendency to go back to SQL, hence Google's F1 Database engine. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 27/10/2013 7:44 AM, Ze'ev Atlas wrote: Is SQL really that much better than native APIs? In the case of your typical key/value data store surely get/set is easier than SELECT FROM WHERE/UPDATE SET IN etc. My short answer would be YES! I disagree. One of the reasons for choosing a NoSQL data base is because SQL isn't the right tool for the job. Check out the API reference for Redis, a very popular NoSQL data base http://redis.io/commands. If I want to process a list of items in Redis I can push/pop at the head/tail of a list with LPUSH/LPOP and RPUSH/RPOP. If I want to retrieve the number of elements I call LLEN. It's trivial. Doing that kind of stuff using SQL is non-trivial. SQL is poor for processing simple lists, queues, stacks etc. That kind of use case is meat and potatoes for web based applications and relational data bases suck at it. A typical key/value data store may be better handled with get/set. But presenting complex data (think OLTP) in those data stores is pretty much impossible. The hierarchical databases like IMS and IDMS (multi-hierarchies) were an interesting solution that used APIs to navigate the hierarchies [and I did a lot of work in both.] However, ultimately, in any real world application that is beyond what you could handle in a flat, Excel-like store or a typical key/value data store, you find the need for relations... and the relational model. There are a lot of big companies out there processing complex data in NoSQL data bases. They don't need (or want) a relational model. A typical key/value data store with some forced relations may be good for a warehouse type of application. Anything else needs some relational model, and SQL engines are pretty refined and do the navigation for you. My reason for thinking about interfacing MongoDB to COBOL is the fact that COBOL is very well suited to dealing with APIs and the hierarchical model. And I believe that MongoDB has its place as a warehouse engine. 
Again, even in the Big Data movement there is now a tendency to go back to SQL, hence Google's F1 Database engine. Who's using F1? MongoDB is currently valued at $1B and has venture capitalists throwing money at it. Last time I looked Mongo could handle joins and complex data and had a very rich query language. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
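[Editor's note] David's list-processing example is easy to appreciate side by side. Here collections.deque stands in for a single Redis list key (an in-memory sketch, not a Redis client); the comments show the corresponding Redis commands, and the deque gives the same O(1) push/pop at both ends that the Redis list contract promises.

```python
from collections import deque

tasks = deque()              # stand-in for one Redis list key

tasks.append("job-1")        # RPUSH tasks job-1
tasks.append("job-2")        # RPUSH tasks job-2
tasks.appendleft("urgent")   # LPUSH tasks urgent

assert len(tasks) == 3               # LLEN tasks
assert tasks.popleft() == "urgent"   # LPOP tasks
assert tasks.pop() == "job-2"        # RPOP tasks
```

Expressing the same queue semantics in SQL means an ordering column, SELECT ... ORDER BY ... LIMIT 1, a DELETE, and a transaction around the pair, which is David's point about impedance.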
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 1:51 PM, Rob Schramm wrote: With a JDBC driver and a bit of JAVA code..you could use the COBOL/JAVA procedure BCDBATCH to help tie the two together. Did a quick scan and there appear to be at least a few JDBC drivers. I'm scratching my head as to why a JDBC driver is useful with a NoSQL data base which has a very specific API. Why not just use the MongoDB Java API? Does JDBC provide some kind of value add? Rob Rob Schramm Senior Systems Consultant Imperium Group On Fri, Oct 25, 2013 at 1:18 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianness should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC, which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete Unicode set. The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec: http://bsonspec.org/#/specification. I really like the look of BSON. It's like Google protocol buffers but more flexible. XML is the pleated khakis of the document markup world. http://www.unicode.org/reports/tr16/ Tony H. 
-- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Not sure how to respond.. on the one hand you have an excellent point. On the other hand.. Google jdbc and mongodb.. as well as there being a jdbc link on the mongodb page in addition to the mongodb java connectors. Doesn't really change my intent ... Grab the mongodb java database driver.. (how does jmdbc driver sound???) and couple it with the cobol application code. Rob On Oct 25, 2013 3:03 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 1:51 PM, Rob Schramm wrote: With a JDBC driver and a bit of JAVA code..you could use the COBOL/JAVA procedure BCDBATCH to help tie the two together. Did a quick scan and there appear to be at least a few JDBC drivers. I'm scratching my head as to why a JDBC driver is useful with a NoSQL data base which has a very specific API. Why not just use the MongoDB Java API? Does JDBC provide some kind of value add? Rob Rob Schramm Senior Systems Consultant Imperium Group On Fri, Oct 25, 2013 at 1:18 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC. Which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete UNICODE set.
The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec http://bsonspec.org/#/specification. I really like the look of BSON. It's like google protocol buffers but more flexible. XML is the pleated khakis of the document markup world. http://www.unicode.org/reports/tr16/ Tony H. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
National literals and the NATIONAL-OF and DISPLAY-OF intrinsic functions are available at least back to version 4.1. The limitation of national literals to 80 characters (however that character is defined in the code page selected by the CODEPAGE compiler option) seems like another poor choice to me. Why is the length of literals so limited? 160 characters for ordinary alphanumeric literals and 80 characters for national literals looks like a compiler lexical scan limitation to me, but I could be wrong about that. Peter -Original Message- From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] On Behalf Of Ze'ev Atlas Sent: Friday, October 25, 2013 12:00 AM To: IBM-MAIN@LISTSERV.UA.EDU Subject: Re: Is there currently a way to access MongoDB from z/OS LE languages? Actually, it looks like there is support to UTF-8: ___ You need to do two steps to convert ASCII or EBCDIC data to UTF-8: Use the function NATIONAL-OF to convert the ASCII or EBCDIC string to a national (UTF-16) string. Use the function DISPLAY-OF to convert the national string to UTF-8. ___ This is from Enterprise COBOL for z/OS Version 5.1 documentation and there is N type. ZA -- This message and any attachments are intended only for the use of the addressee and may contain information that is privileged and confidential. If the reader of the message is not the intended recipient or an authorized representative of the intended recipient, you are hereby notified that any dissemination of this communication is strictly prohibited. If you have received this communication in error, please notify us immediately by e-mail and delete the message and any attachments from your system. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Why do you say there is a need for a C layer here? Even without using Object COBOL you can use JNI directly in COBOL. (It's not great fun, but it is doable.) From: Jantje. jan.moeyers...@gfi.be To: IBM-MAIN@LISTSERV.UA.EDU Sent: Friday, October 25, 2013 4:33 AM Subject: Re: Is there currently a way to access MongoDB from z/OS LE languages? On Thu, 24 Oct 2013 22:58:05 -0500, John McKown john.archie.mck...@gmail.com wrote: I'm not sure about the following. I'm up late due to ... well, it doesn't matter. But I am wondering if it would be easier to interface MongoDB (on a remote system such as z/Linux) with a z/OS Java routine. And then interface the Java routine with COBOL. I need to read up on the Java - COBOL communication. It may only be for Object COBOL. Java to COBOL and COBOL to Java can be done through JNI. You will need a C layer to glue the two together but you do not need Object COBOL if you don't want it. Cheers, Jantje. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
I will look carefully at the Java option and JNI, but my inclination (as an old timer) is rather to adapt the C driver. Working directly with C subroutines from COBOL, without a Java layer, seems to me more natural, but again, I am an old timer and I do not really know Java. If I need extensive additional functionality, not available in the driver, then that could be a reason to do Java (and learn that stuff at long last; I love working with languages that I don't know.) Can somebody please point me to the documentation of JNI and interfacing Java and COBOL? I did not yet look at the N type, and the limitation that has been mentioned here (only 80 characters) would be a make or break. If indeed I cannot reasonably deal with (virtually) unlimited UTF-8 strings then I will not even start the porting project. My time and resources are limited! ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
The compiler limitations are for LITERALS, not for variables. Think VALUE clause, or constant strings MOVEd to a variable. Variable sizes are not any more limited for NATIONAL than for regular DISPLAY alphanumeric, AFAICT. HTH Peter -Original Message- From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] On Behalf Of Ze'ev Atlas Sent: Friday, October 25, 2013 12:50 PM To: IBM-MAIN@LISTSERV.UA.EDU Subject: Re: Is there currently a way to access MongoDB from z/OS LE languages? I will look carefully at the Java option and JNI, but my inclination (as an old timer) is to adapt the C driver rather. Working directly with C subroutines from COBOL, without a Java layer seems to me to be more natural, but again, I am an old timer and I do not really know Java. If I need extensive additional functionality, not available in the driver, than that could be a reason to do Java (and learn that stuff at long last; I love working with languages that I don't know.) Can somebody please point me to the documentation of JNI and interfacing Java and COBOL. I did not yet look at the N type and the limitation that have been mentioned here (only 80 characters) would be a make or break. If indeed I cannot reasonably deal with (virtually) unlimited UTF-8 strings then I will not even start the porting project. My time and resources are limited! ZA
-- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
The compiler limitations are for LITERALS, not for variables. Think VALUE clause, or constant strings MOVEd to a variable. That's good news ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 11:13 PM, Rob Schramm wrote: Not sure how to respond.. on the one hand you have an excellent point. On the other hand.. Google jdbc and mongodb.. as well as there being a jdbc link on the mongodb page in addition to the mongodb java connectors. Doesn't really change my intent ... Grab the mongodb java database driver.. (how does jmdbc driver sound???) and couple it with the cobol application code. I understood your original intent Rob. I was just sounding off about JDBC drivers for non-relational data bases. I've never quite grasped why there are so many SQL adapters for non-relational data bases. Even IMS has a Java SQL interface with ODBC and I just don't get it. Is SQL really that much better than native APIs? In the case of your typical key/value data store surely get/set is easier than SELECT FROM WHERE/UPDATE SET IN etc. Rob On Oct 25, 2013 3:03 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 1:51 PM, Rob Schramm wrote: With a JDBC driver and a bit of JAVA code..you could use the COBOL/JAVA procedure BCDBATCH to help tie the two together. Did a quick scan and there appear to be at least a few JDBC drivers. I'm scratching my head as to why a JDBC driver is useful with a NoSQL data base which has a very specific API. Why not just use the MongoDB Java API? Does JDBC provide some kind of value add? Rob Rob Schramm Senior Systems Consultant Imperium Group On Fri, Oct 25, 2013 at 1:18 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project.
I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC. Which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete UNICODE set. The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec http://bsonspec.org/#/specification. I really like the look of BSON. It's like google protocol buffers but more flexible. XML is the pleated khakis of the document markup world. http://www.unicode.org/reports/tr16/ Tony H. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Is there currently a way to access MongoDB from z/OS LE languages?
Hi all Is there currently a way to access MongoDB from z/OS LE languages? ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 10:04 AM, Ze'ev Atlas wrote: Hi all Is there currently a way to access MongoDB from z/OS LE languages? It should be simple enough to build a client. There are many http://docs.mongodb.org/ecosystem/drivers/. Of course, that's for accessing MongoDB running off the mainframe. I'm interested to know why you would want to do that? Porting MongoDB to z/OS is non-trivial as it requires a JavaScript engine, either V8 or SpiderMonkey. Porting either of those is a huge amount of work, although it would be fantastic if somebody could. It requires writing a JIT compiler, so I'm sure IBM has the tech from Java to do that. The current crop of next gen dynamic scripting language JITs like V8 JavaScript or LuaJIT are as fast as Java and not too far off C, and dirt easy to write code in. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 10:29 AM, Ze'ev Atlas wrote: Assuming I use my experience in porting Open Source C libraries to the mainframe and import the MongoDB C driver and compile it successfully, my main issue would then be, as usual, the pesky EBCDIC. When working on the PCRE library, there was already a full EBCDIC implementation, but there is obviously no joy like that in MongoDB. EBCDIC may not be your only problem! What about endianness? I suggest you study the wire protocol if you are serious: http://docs.mongodb.org/meta-driver/latest/legacy/mongodb-wire-protocol/. My question would be, I guess (I am not even sure I express it in the correct terminology), is there a way to tell an LE application (mainly COBOL) to use ASCII or UTF-8 in a native way, or alternatively, to present the ASCII code in EBCDIC. http://pic.dhe.ibm.com/infocenter/zos/v1r13/index.jsp?topic=%2Fcom.ibm.zos.r13.cbcux01%2Fascii.htm ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
It should be simple enough to build a client. There are many http://docs.mongodb.org/ecosystem/drivers/. Of course, that's for accessing MongoDB running off the mainframe. I'm interested to know why you would want to do that? I have no intention of porting the whole engine because I understand the difficulties, but allowing COBOL to access MongoDB on some MongoDB cluster is not different (conceptually) from accessing any other remote database. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
EBCDIC may not be your only problem! What about endianness? I suggest you study the wire protocol if you are serious Thank you for pointing me in the right direction. I will look at the documents you've mentioned about both EBCDIC and endianness and see if it is worth it. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 10:58 AM, Ze'ev Atlas wrote: It should be simple enough to build a client. There are many http://docs.mongodb.org/ecosystem/drivers/. Of course, that's for accessing MongoDB running off the mainframe. I'm interested to know why you would want to do that? I have no intention on porting the whole engine because I understand the difficulties, but allowing COBOL to access MongoDB on some MongoDB cluster is not different (conceptually) from accessing any other remote database. Interesting concept. I'm not aware of any previous requirement for a mainframe COBOL program to access a data base running on a different platform. Just for fun maybe but certainly not in production. It would be very interesting to see how an intrinsically constrained language like COBOL would deal with the dynamic nature of Mongo's JSON-like document objects. It's not much fun accessing Mongo in C let alone COBOL. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
It's not much fun accessing Mongo in C let alone COBOL. Agree My language of choice is Perl, which flows with that stuff, and I am working on my JavaScript skills. However, what drives me is frustration. Whenever I do a mainframe project, I see how much I miss the other side's features and then I look how to bring those features in. I may need to invent a way to represent complex key value structures in COBOL, which is a challenge on its own (unless somebody has already done it.) The simplest solution would be a two dimensional table of Z type strings, but that would not allow in a simple way for hierarchies. I guess I'll have to develop a type and the functionality to handle it. About a previous post, the endianness should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type? If not, then I will abandon the whole project. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
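The hierarchy problem described above (a flat two-dimensional table of strings cannot directly hold nested documents) is often worked around by encoding the nesting path into the key. A Python sketch of the idea, with illustrative names and sample data, showing the (path, value) rows a flat COBOL table could carry:

```python
# Sketch: flatten a nested document into (path, value) string pairs,
# the shape a two-dimensional COBOL table could hold. Hypothetical helper.
def flatten(doc, prefix=""):
    rows = []
    for key, value in doc.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            rows.extend(flatten(value, path))  # recurse into sub-documents
        else:
            rows.append((path, str(value)))
    return rows

rows = flatten({"name": "ZA", "addr": {"city": "NYC", "zip": "10001"}})
print(rows)  # [('name', 'ZA'), ('addr.city', 'NYC'), ('addr.zip', '10001')]
```

The trade-off is that arrays and type information need extra conventions on top of the dotted path, which is roughly where "develop a type and the functionality to handle it" comes in.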
Re: Is there currently a way to access MongoDB from z/OS LE languages?
I'm not sure about the following. I'm up late due to ... well, it doesn't matter. But I am wondering if it would be easier to interface MongoDB (on a remote system such as z/Linux) with a z/OS Java routine. And then interface the Java routine with COBOL. I need to read up on the Java - COBOL communication. It may only be for Object COBOL. Anyway, it's just a weird thought from a person who is sleep deprived. On Thu, Oct 24, 2013 at 10:49 PM, Ze'ev Atlas zatl...@yahoo.com wrote: It's not much fun accessing Mongo in C let alone COBOL. Agree My language of choice is Perl which flaws with that stuff and I am working on my JavaScript skills. However, what drives me is frustration. Whenever I do a mainframe project, I see how much I miss the other side's features and then I look how to bring those features in. I may need to invent a way to represent complex key value structures in COBOL which is a challenge on its own (unless somebody has already done it.) The simplest solution would be a two dimensional table of Z type strings, but that would not allow in a simple way for hierarchies. I guess I'll have to develop a type and the functionality to handle it. About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN -- This is clearly another case of too many mad scientists, and not enough hunchbacks. Maranatha! John McKown -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
Actually, it looks like there is support for UTF-8: ___ You need to do two steps to convert ASCII or EBCDIC data to UTF-8: Use the function NATIONAL-OF to convert the ASCII or EBCDIC string to a national (UTF-16) string. Use the function DISPLAY-OF to convert the national string to UTF-8. ___ This is from the Enterprise COBOL for z/OS Version 5.1 documentation, and there is an N type. ZA -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
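The two-step path quoted above (EBCDIC to national/UTF-16, then UTF-16 to UTF-8) can be mimicked outside COBOL to see what it does. A Python sketch of the same steps, using cp037 as a stand-in EBCDIC code page since Python's standard library has no cp1047 codec:

```python
# "Hello" in EBCDIC code page 037: H=0xC8 e=0x85 l=0x93 l=0x93 o=0x96
ebcdic = bytes([0xC8, 0x85, 0x93, 0x93, 0x96])

# Step 1: EBCDIC -> "national" form, i.e. UTF-16 (NATIONAL-OF in COBOL terms)
national = ebcdic.decode("cp037").encode("utf-16-be")

# Step 2: national (UTF-16) -> UTF-8 (DISPLAY-OF in COBOL terms)
utf8 = national.decode("utf-16-be").encode("utf-8")

print(utf8)  # b'Hello'
```

The intermediate UTF-16 hop mirrors the COBOL functions exactly: there is no direct EBCDIC-to-UTF-8 intrinsic, so the national type is the pivot.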
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC. Which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete UNICODE set. http://www.unicode.org/reports/tr16/ Tony H. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
Re: Is there currently a way to access MongoDB from z/OS LE languages?
On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC. Which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete UNICODE set. The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec http://bsonspec.org/#/specification. I really like the look of BSON. It's like google protocol buffers but more flexible. XML is the pleated khakis of the document markup world. http://www.unicode.org/reports/tr16/ Tony H. -- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
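To make the BSON reference above concrete, here is a minimal sketch that hand-encodes the spec's own {"hello": "world"} example: an int32 little-endian total length, then one 0x02 (UTF-8 string) element, then a NUL terminator. The function names are illustrative, not part of any driver:

```python
import struct

def string_element(key, value):
    # type byte 0x02 = UTF-8 string; the key is a NUL-terminated cstring;
    # the value carries an int32 little-endian length (including its NUL)
    v = value.encode("utf-8") + b"\x00"
    return b"\x02" + key.encode("utf-8") + b"\x00" + struct.pack("<i", len(v)) + v

def document(*elements):
    # the total length is int32 little-endian and counts itself and the final NUL
    body = b"".join(elements)
    return struct.pack("<i", 4 + len(body) + 1) + body + b"\x00"

doc = document(string_element("hello", "world"))
print(doc.hex())  # 160000000268656c6c6f0006000000776f726c640000
```

The little-endian length prefixes are the endianness point raised earlier in the thread, and the mandated UTF-8 in string elements is exactly the EBCDIC obstacle under discussion.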
Re: Is there currently a way to access MongoDB from z/OS LE languages?
With a JDBC driver and a bit of JAVA code..you could use the COBOL/JAVA procedure BCDBATCH to help tie the two together. Did a quick scan and there appear to be at least a few JDBC drivers. Rob Rob Schramm Senior Systems Consultant Imperium Group On Fri, Oct 25, 2013 at 1:18 AM, David Crayford dcrayf...@gmail.com wrote: On 25/10/2013 12:28 PM, Tony Harminc wrote: On 24 October 2013 23:49, Ze'ev Atlas zatl...@yahoo.com wrote: About a previous post, the endianess should not be a big issue to deal with once the two sides of the protocol are well defined. The EBCDIC issue is a make or break issue. MongoDB works decidedly with UTDF-8 and I need COBOL to natively view a string as UTF-8. Does the current incarnation of COBOL (and perhaps PL/I) have a native UTF-8 string type. If not, then I will abandon the whole project. I'm doubtless blowing (or something) into the wind again, but this sounds like a place for UTF-EBCDIC. Which is easily translated to and from UTF-8 if that's what goes on the wire. (I'm assuming your UTDF-8 was just a typo.) Presumably it would be a good start if COBOL could see and manipulate the subset of UTF-EBCDIC that is EBCDIC strings that would live as UTF-8 in the database. Then when COBOL learns to handle UTF-EBCDIC, it could handle the complete UNICODE set. The wire protocol is binary. The UTF-8 requirement for strings is in the BSON spec http://bsonspec.org/#/specification. I really like the look of BSON. It's like google protocol buffers but more flexible. XML is the pleated khakis of the document markup world. http://www.unicode.org/reports/tr16/ Tony H.
-- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN