[U2] [AD] Webinar: Super Easy RESTful Services Generation for MultiValue Databases

2012-10-01 Thread David Peters Bluefinity
Webinar:  Super Easy RESTful Services Generation for MultiValue Databases
Tuesday, October 2, 2012

The latest release of mv.NET provides what is arguably the simplest yet most 
powerful and flexible way of creating RESTful web services against a MultiValue 
database.

Developers are now able to generate industry standard RESTful services without 
having to be a network stack or HTTP guru.  Easy, automated service generation 
combined with fully customizable service content provides the developer with 
the best of both worlds. Moreover, developers can host and deploy these 
services using non-proprietary, industry standard technologies that are able to 
integrate seamlessly with existing enterprise computing infrastructures.

The integration of RESTful service support demonstrates the power and 
flexibility of BlueFinity's entity modelling framework - a framework that sets 
BlueFinity apart from all of the other vendors in the MultiValue market space.

Discover how RESTful services can be created with literally a click of a button 
on any of the major MultiValue databases!  In this free webinar, RESTful 
service generation will be demonstrated on a variety of sample MultiValue 
applications running on different platforms, including Windows 8.


Reserve your space now at:
https://www2.gotomeeting.com/register/445632618

For more information, visit
http://www.bluefinity.com/site/web_services.html

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


[U2] STARTUP file issue with UV11.1 PE version (Linux)

2012-10-01 Thread doug chanco
I recently downloaded UV 11 and when I went to run STARTUP I got a weird
error. Looking at the STARTUP script, I noticed it had a bunch of binary
and other junk at the beginning of the file; I removed all the extra
stuff, saved the file, and it ran just fine.

Has anyone else seen this?  I re-downloaded the zip and still had the
issue.  It was easy enough to resolve, but I thought I would mention it.

Dougc

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] STARTUP file issue with UV11.1 PE version (Linux)

2012-10-01 Thread Brian Leach
Doug

Have you remembered that STARTUP is a cpio archive?

# cpio -iuvcdumB uv.load < STARTUP
./uv.load



___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] STARTUP file issue with UV11.1 PE version (Linux)

2012-10-01 Thread doug chanco
No sir, I did not know that - why would they cpio it anyway?  Not that it
matters, I was just curious; anyway, thanks for the info.

Dougc



___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


[U2] [u2] Parallel processing in Universe

2012-10-01 Thread Wjhonson

What's the largest dataset in the Universe user world?
In terms of number of records.

I'm wondering if we have any potential for utilities that map-reduce.
I suppose you would spawn phantoms but how do they communicate back to the 
master node?
___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread u2ug
pipes




___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread George Gallen
The only thing about a pipe is that once it's closed, I believe it has to be 
re-opened by both ends again. So if point A opens one end, and point B opens 
the other end, once either end closes, it closes for both sides, and both 
sides would have to reopen it again to use it.

To eliminate this, you could have one end open a file, and have the other sides 
do a >> append to that file - just make sure you include some kind of data 
header so the reading side knows which process just wrote the data.
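
As a rough illustration only (not from the original post), the writer side might
look something like this in UniVerse BASIC - the /tmp/pipetest path is reused
from George's later example, and the worker tag and loop are purely
illustrative; truly concurrent appenders would still need some locking
discipline:

* Writer side: append "header|payload" lines to the shared file
WORKER.ID = @USERNO                                 ;* tag each line with this process's user number
OPENSEQ '/tmp/pipetest' TO F.OUT THEN
   SEEK F.OUT, 0, 2 ELSE STOP 'SEEK TO END FAILED'  ;* move to end of file so new lines append
END ELSE
   CREATE F.OUT ELSE STOP 'CANNOT CREATE /tmp/pipetest'
END
FOR I = 1 TO 10
   WRITESEQ WORKER.ID : '|' : 'RESULT.' : I TO F.OUT ELSE STOP 'WRITE FAILED'
NEXT I
CLOSESEQ F.OUT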

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread George Gallen
OPENSEQ '/tmp/pipetest' TO F.PIPE ELSE STOP 'NO PIPE'
LOOP
   READSEQ LINE FROM F.PIPE ELSE CONTINUE
   PRINT LINE
REPEAT
STOP
END

Although, not sure if you might need to sleep a little between the READSEQ's 
ELSE and CONTINUE - it might suck up CPU time when nothing is writing to the file.

Then you could set up a printer in UV that did a "cat - >> /tmp/pipetest"

Now your phantom just needs to print to that printer.

George

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread David Wolverton
So how would a user 'chop up' a file for parallel processing?  Ideally, if
there was a Mod 10001 file (or whatever) it would seem like it would be
'ideal' to assign 2000 groups to 5 phantoms -- but I don't know how to 'start a
BASIC select at Group 2001 or 4001' ...


___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread Robert Houben
Create an index on a dict pointing at the first character of the key, and have 
each phantom take two digits. (0-1, 2-3, 4-5, 6-7, 8-9)
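
As a rough illustration only, a worker phantom following that scheme might look
like this in UniVerse BASIC - the ORDERS file, the KEY.PREFIX dictionary item
(first character of @ID, indexed with CREATE.INDEX/BUILD.INDEX) and the digit
assignment are all hypothetical:

* Each phantom runs with its own share of leading key digits
OPEN 'ORDERS' TO F.ORDERS ELSE STOP 'NO ORDERS FILE'
MY.PREFIXES = '0 1'                       ;* this phantom handles keys starting with 0 or 1
FOR P = 1 TO DCOUNT(MY.PREFIXES, ' ')
   PREFIX = FIELD(MY.PREFIXES, ' ', P)
   SELECTINDEX 'KEY.PREFIX', PREFIX FROM F.ORDERS TO 1
   LOOP
      READNEXT ID FROM 1 ELSE EXIT
      READ REC FROM F.ORDERS, ID THEN
         NULL                             ;* application-specific processing of REC goes here
      END
   REPEAT
NEXT P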

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread David Wolverton
OK - I was trying to create a 'smoother use' of the disk and 'read ahead' -- in
this example the disk would be chattering from the heads moving all over the
place. I was trying to find a way to make this process more 'orderly' -- is
there one?


___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread Wjhonson

The GROUP.STAT.DETAIL command will tell you the keys, in stored order, in each 
group of a hashed file.
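
As a rough illustration only (the file name is hypothetical and the report
layout varies by release), a dispatcher could capture that report from BASIC
and divide the keys it lists among the phantoms:

* Capture GROUP.STAT.DETAIL output and walk its lines
EXECUTE 'GROUP.STAT.DETAIL ORDERS' CAPTURING REPORT
FOR I = 1 TO DCOUNT(REPORT, @FM)
   LINE = REPORT<I>
   PRINT LINE                             ;* a real dispatcher would parse group numbers and keys out of LINE
NEXT I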



___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread u2ug
True - but why would you want it any other way?
Once one end closes it, the process is complete.



___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread David Taylor
Or, let's suppose you wanted to process repetitive segments of one very
large record using the same logic in a separate phantom process for each
segment -- how large a record can be read and processed in Universe?

Dave




___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] [u2] Parallel processing in Universe

2012-10-01 Thread Ross Ferris
If the file were big enough, and already had part files, then I believe that 
you could have a phantom process each of the individual parts. Failing that, 
get an SSD -- relatively cheap, and it will give your processing a reasonable 
kick along!!

Ross Ferris
Stamina Software
Visage  Better by Design!
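
As a rough illustration of that idea (all names are hypothetical), the
controlling process could start one phantom per part of a distributed file,
with each worker then opening and selecting only its own part:

* Kick off one phantom per part file
PARTS = 'ORDERS.P1 ORDERS.P2 ORDERS.P3 ORDERS.P4'   ;* parts of a hypothetical distributed file
FOR P = 1 TO DCOUNT(PARTS, ' ')
   EXECUTE 'PHANTOM RUN BP PROCESS.PART ' : FIELD(PARTS, ' ', P)
NEXT P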



___
U2-Users mailing list
U2-Users@listserver.u2ug.org

Re: [U2] [u2] Parallel processing in Universe (Unclassified)

2012-10-01 Thread HENDERSON MIKE, MR
I have often thought about this - mostly in an idle moment or as a
displacement activity for something less amusing that I ought to be
doing. ;-)


First of all, Universe is already extremely parallel: there's a separate
O/S thread for each TTY and for each phantom, and you can't get more
parallel than that for interactive processing.

So you want more parallelism for your batch processes.
Different applications have different degrees of inherent parallelism.
For example in utility billing systems there is frequently the concept
of a group of premises - based on the old concept of a foot-borne meter
reader with a 'book' of readings to get. Each 'book' can be processed
independently of every other. In payroll, each employee's record can be
processed independently. Other areas of commerce have different
characteristics.

I think that whatever unit of parallelism you settle for, you'd need
three processes: a 'dispatcher' that selects records for processing and
queues them into some structure for processing; a set of 'workers' that
take queued work items, process them, mark them as processed and put the
results in some common store; and a 'monitor' that looks for unprocessed
records and indications of stuck processes, and collates the results for
final output.
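
As a rough illustration of the 'worker' part of that pattern (the WORK.QUEUE
and RESULTS files, the status values and the processing step are all
hypothetical), each worker could claim queued items under a record lock:

OPEN 'WORK.QUEUE' TO F.QUEUE ELSE STOP 'NO WORK.QUEUE'
OPEN 'RESULTS' TO F.RESULTS ELSE STOP 'NO RESULTS'
DONE.ANY = 1
LOOP WHILE DONE.ANY DO
   DONE.ANY = 0
   SELECT F.QUEUE TO 1
   LOOP
      READNEXT ID FROM 1 ELSE EXIT
      * READU waits if another worker already holds the lock;
      * the status check below skips items that are already taken
      READU ITEM FROM F.QUEUE, ID THEN
         IF ITEM<1> = 'QUEUED' THEN
            ITEM<1> = 'RUNNING'
            WRITEU ITEM TO F.QUEUE, ID    ;* keep the lock while the item is worked on
            RESULT = 'processed ' : ID    ;* placeholder for the real work
            WRITE RESULT TO F.RESULTS, ID
            ITEM<1> = 'DONE'
            WRITE ITEM TO F.QUEUE, ID     ;* final write releases the lock
            DONE.ANY = 1
         END ELSE RELEASE F.QUEUE, ID
      END ELSE NULL
   REPEAT
REPEAT
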
I've seen a couple of versions of this, one for electricity billings and
another for overnight batch-processing of report requests, both well
over a decade ago, and neither still in use although their underlying
packages are still being run.

The major issue is that these days the whole entity in the general
commercial world is far more likely to be I/O limited than CPU limited,
and therefore introducing parallelism will be no help at all if the I/O
system is already choked.
Even if the system is currently CPU-limited, multi-threading may not
produce much improvement without very careful design of the record
locking philosophy - introducing parallelism will be no help if all the
threads end up contending serially for one record lock or a small set of
locks.


If you want it to go faster, buy the CPU with the fastest clock you can
get (not the one with the most cores), and put your database on SSD like
Ross said.
The Power7+ chips being announced any day now are rumoured to go to
5GHz+, maybe even more if you have half the cores on the chip disabled.


Regards


Mike


Re: [U2] [u2] Parallel processing in Universe (Unclassified)

2012-10-01 Thread Ross Ferris
Interestingly, I'm currently trying to find a definitive answer/correlation 
between clock speed and performance on a single core/thread on Intel CPUs to 
confirm, or deny, that for grunt batch work a 4C Intel running @ 3.4GHz will 
actually be faster than an 8C running @ 2.7GHz -- the answer isn't as 
straightforward (or as easy to find) as I would have hoped, as even within the 
same family (E5-2600) there can be architectural differences that come into 
play -- and if anyone has a definitive answer, please feel free to share!

Ross Ferris
Stamina Software
Visage  Better by Design!

