I need to do a UniVerse LIST statement that would only populate a column if the
contents met certain criteria.
For example, suppose we have a file with details of telephone usage and that 3
associated multivalued fields contain the date the call was made, the duration
and whether the call was a toll call. Is
Mark
Took me a couple of times reading through the post to understand the issue
..
I think you're going to have to call a subroutine rather than use a LIST.
Brian
-Original Message-
From: u2-users-boun...@listserver.u2ug.org
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of
How about LIST CALLS EMP.NAME EMP.LOCATION WHEN DATE.CALL GE 2012-06-01 AND
DATE.CALL LE 2012-06-30 DURATION TOLL WITH @ID EQ '123456'
For the WHEN to work, DATE.CALL, DURATION and TOLL must be declared as
Multivalued in the DICT and, most likely, they need to be in the same
association.
LIST CALLS '123456'
EMP.NAME EMP.LOCATION
DATE.CALL GE 2012-06-01 AND LE 2012-06-30
DURATION TOLL
?
From mark.hennessey
I need to do a UniVerse LIST statement that would only populate a
column if the contents met certain criteria.
For example, suppose we have a file with
Could you set up an index based on employee number?
Then possibly use that index against employees active for June.
Rich
Hennessey, Mark F. wrote:
I need to do a UniVerse LIST statement that would only populate a column if the
contents met certain criteria.
For example, suppose we have a
I think the answer is to use an i-type dictionary.
You can use WHEN instead of WITH on multivalued columns to limit printing to
just the multi values you want, but then you wouldn't get an output line for
the items where there were no calls in the range.
You could do 2 separate reports, one
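The WITH vs WHEN distinction above can be modeled outside the database. A minimal Python sketch (the record data and field names are hypothetical, and this only simulates RetrieVe's behavior; it is not UniVerse code): WITH keeps whole records where any multivalue matches, while WHEN additionally filters which multivalues print.

```python
# One record with associated multivalued fields, as in the CALLS example.
record = {
    "EMP.NAME": "SMITH",
    "DATE.CALL": ["2012-05-15", "2012-06-10", "2012-06-20", "2012-07-01"],
    "DURATION":  [5, 12, 3, 8],
}

def when_filter(rec, lo, hi):
    """Keep only the multivalues whose date falls in [lo, hi],
    like WHEN DATE.CALL GE lo AND DATE.CALL LE hi."""
    pairs = zip(rec["DATE.CALL"], rec["DURATION"])
    return [(d, dur) for d, dur in pairs if lo <= d <= hi]

june = when_filter(record, "2012-06-01", "2012-06-30")
print(june)  # only the June calls survive; the record itself is still listed
```

As the post notes, the catch is the other direction: a record with no values in the range still appears (with an empty column) under WHEN, but WITH-style record selection would drop it entirely.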
We are getting the message:
Warning: Attempt to extend file GJ/INV beyond the 2 gigabyte file
limit. 32-bit static hashed or dynamic files cannot exceed a file size
of 2 GB. Use a distributed or 64-bit file.
I get why we're getting the message, I'm wondering, what is the
easiest/safest way to
If this were a dynamic file, you would have to create a new file adding
the phrase 64BIT.
I don't know if the phrase is valid for non-dynamic files.
On Mon, 2 Jul 2012 10:20:13 -0500, Holt, Jake wrote:
We are getting the message:
Warning: Attempt to extend file GJ/INV beyond the 2 gigabyte
Personally, what I would do right now is create another file as your
temporary holding place, if you will.
Then select, perhaps on a date range, everything earlier than... 2005 or
whatever, and read it, write it to the new location and delete it from where
it is.
That will solve your
T
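The read-it, write-it, delete-it loop described above looks roughly like this (Python dictionaries stand in for the two hashed files; in UniVerse this would be a BASIC program using SELECT, READ, WRITE and DELETE, and the cutoff date and field name here are assumptions for illustration):

```python
# live and archive stand in for the current file and the holding file.
live = {
    "1001": {"DATE": "2003-04-01", "AMT": 10},
    "1002": {"DATE": "2011-09-15", "AMT": 20},
}
archive = {}
CUTOFF = "2005-01-01"

# Select everything earlier than the cutoff, copy it, then delete it.
for key in [k for k, rec in live.items() if rec["DATE"] < CUTOFF]:
    archive[key] = live[key]   # write to the holding file
    del live[key]              # delete from the live file
```

Writing before deleting matters: if the job dies mid-run, no record has been lost, only duplicated.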
No, that will remove any records that don't have a date in that range.
Same using WHEN.
He wants to keep those records (i.e. the other columns) but have that column
blank.
Brian
-Original Message-
From: u2-users-boun...@listserver.u2ug.org
IBM told us otherwise at the time, they told us we had to buy it through
the var that sold us the licenses.
-Original Message-
From: u2-users-boun...@listserver.u2ug.org
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of Wjhonson
Sent: Saturday, June 30, 2012 11:49 AM
To:
Hi: Could you please tell me how to upload data from my PC to my account.
Thank you.
sudheer
___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users
Ohhh my;
Here come the flames...
... david ...
David L. Wasylenko
President, Pick Professionals, Inc
w) 314 558 1482
d...@pickpro.com
-Original Message-
From: u2-users-boun...@listserver.u2ug.org
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of SUDHEER RAMISETTI
Sent:
Thanks for the comments Doug; it is always appreciated to hear what we can
improve upon and also what we've done right. Glad to hear you can now load
XLr8Tools inside of the U2 DBTools package. Hopefully you can find some
creative ways to take advantage of that :)
As with you, we have been
Copy the file, at the Windows level, to the HOLD file in the directory where
the UniVerse account resides.
You can then access it directly inside UniVerse.
That is the simplest method.
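In script form this is just an OS-level copy into the account's HOLD directory; no conversion is needed at this stage. A sketch (all paths here are hypothetical, built under a temp directory for illustration):

```python
import os
import shutil
import tempfile

# Hypothetical paths: the source file on the PC, and the UniVerse
# account's HOLD directory. Once the file lands there it can be read
# inside the account like any other item.
src = os.path.join(tempfile.gettempdir(), "upload.txt")
account_hold = os.path.join(tempfile.gettempdir(), "ACCOUNT", "HOLD")

os.makedirs(account_hold, exist_ok=True)
with open(src, "w") as f:
    f.write("LINE1\nLINE2\n")

shutil.copy(src, account_hold)  # plain file copy; the filename becomes the item ID
```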
-Original Message-
From: SUDHEER RAMISETTI ramisettisudhe...@gmail.com
To: u2-users
Since we don't have outer joins I would build a work file starting with
selecting all employees, then select the calls and merge them together
in a loop. How you do it depends on how you want to present the data.
If you're happy with showing the calls as multi values you can build the
records
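Lacking an outer join, the work-file merge described above amounts to a left join keyed on employee, with the calls attached as multivalues. A Python sketch of the logic (the data is hypothetical; in UniVerse this would be a BASIC loop writing a work file):

```python
# Start from ALL employees, then merge calls in, so an employee with no
# calls in the range still gets a row -- with an empty CALLS column.
employees = {"123456": "SMITH", "123457": "JONES"}
calls = [("123456", "2012-06-10", 5),
         ("123456", "2012-06-20", 3)]

work = {emp_id: {"NAME": name, "CALLS": []}
        for emp_id, name in employees.items()}

for emp_id, date, duration in calls:
    if emp_id in work:                          # merge into the work file
        work[emp_id]["CALLS"].append((date, duration))
```

Starting from the employee side is what makes it a left join; starting from the calls would silently drop call-free employees, which is exactly the problem the original post is trying to avoid.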
Flames? How about Use AccuTerm.
A better definition of the problem will return better suggestions.
Is this a one-shot or a regular thing?
How many users?
What kind of data?
T
From: David L. Wasylenko
Ohhh my; Here come the flames...
From: SUDHEER RAMISETTI
Hi: Could you please tell
The safest way is, of course, to create a new file and move the data over
to the new file when the file is idle.
On 7/2/12, John Thompson jthompson...@gmail.com wrote:
If this is UniVerse, the RESIZE command might have an option to
convert it. My disclaimer is that I have never tried such a thing.
Personally I don't trust CONCURRENT
It's like tiny bits flying back and forth over my head in a helter-skelter
mishmash
But maybe I'm too old school
Perhaps if they came out and explained exactly how it's doing this, I'd feel
more confident in using that.
-Original Message-
Hi
It depends on the type of data and what you want to do with it.
You can create a directory file (type 19) on UniVerse (DIR on UniData) and
simply copy files into there: each file will be visible as a separate record
in the database, with each line of the record being a separate field. That's
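The one-record-per-OS-file, one-field-per-line mapping of a type 19 (DIR) file can be modeled directly: reading an item is just reading the file and splitting on newlines. A Python sketch of that mapping (the directory and item names are hypothetical; this models the concept, not UniVerse internals):

```python
import os
import tempfile

# In a DIR-type file, each OS file is one record and each line is one field.
dirfile = tempfile.mkdtemp()
with open(os.path.join(dirfile, "ITEM1"), "w") as f:
    f.write("field one\nfield two\nfield three\n")

def read_item(path, item_id):
    """Return the record's fields, i.e. the OS file's lines."""
    with open(os.path.join(path, item_id)) as f:
        return f.read().splitlines()

fields = read_item(dirfile, "ITEM1")
```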
I told them they could purge some records or sit on it till Saturday and
they decided they'd be ok with purging some records...
Thank you both for your help,
Jake
-Original Message-
From: u2-users-boun...@listserver.u2ug.org
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of
I was wondering if anyone had instructions on RESIZE with a dynamic file? For
example I have a file called 'TEST_FILE'
with the following:
01 ANALYZE.FILE TEST_FILE
File name .. TEST_FILE
Pathname ... TEST_FILE
File type .. DYNAMIC
File
Hi Chris,
The whole point of dynamic files is that you don't do RESIZE. The file will
look after itself, automatically responding to
variations in the volume of data.
There are knobs to twiddle but in most cases they can safely be left at their
defaults. A dynamic file will never perform as
Yep. I've been burned on concurrent myself. So I guess I should
rephrase... Only use RESIZE if the file is idle.
On 7/2/12, Wjhonson wjhon...@aol.com wrote:
Personally I don't trust CONCURRENT
It's like tiny bits flying back and forth over my head in a helter-skelter
mishmash
But maybe I'm
The dynamic file I'm working with is below. What do 'overflowed' and 'badly'
refer to under MODULUS? Is the goal of the RESIZE to eliminate that
overflow? Any ideas what I should change to achieve this?
File name .. TEST_FILE
Pathname ... TEST_FILE
File type
Group size appears adequate (although anytime anything hashes into the group(s)
with the largest record [3267b], you'll split: 3267 is 79.8% of 4096, so if you
have a lot of records up in the 3K range, you may want to increase group size
and decrease min modulus accordingly), but the minimum
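The 79.8% figure above is simple arithmetic on the group size, and it matters because dynamic files split a group once its load crosses the split threshold (80% by default on UniVerse). Checking the numbers from the post:

```python
# Largest observed record vs. the group (buffer) size from ANALYZE.FILE.
largest_record = 3267          # bytes
group_size = 4096              # bytes: one 4K group

load = largest_record / group_size * 100
print(round(load, 1))          # 79.8 -- one such record nearly fills a group

# At ~79.8% load from a single record, almost anything else hashing into
# that group pushes it past the 80% split threshold and forces a split.
```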
I guess my main question is regarding the 'overflow' and 'badly' #'s which you
can see when you do an ANALYZE.FILE filename STATISTICS.
Is the goal not to have any overflow #? And what is 'badly'?
After playing around with RESIZE on this file, I was able to come up with the
following:
RESIZE
Hi Chris:
You cannot get away with not resizing dynamic files in my experience. The
files do not split and merge like we are led to believe. The separator is
not used on dynamic files. Your UniVerse file is badly sized. The math
below will get you a reasonably sized file.
Let's do the math:
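The post is truncated before the math, but the usual rule of thumb this kind of sizing exercise uses is: minimum modulus is roughly (record count x average record size) / (group size x target load). With hypothetical figures (the real numbers would come from ANALYZE.FILE ... STATISTICS, and this is not necessarily the poster's exact arithmetic):

```python
# Hypothetical figures for illustration only.
record_count = 500_000
avg_record_size = 300          # bytes, including key and per-record overhead
group_size = 4096              # bytes (a 4K group)
target_load = 0.8              # aim for roughly 80%-full groups

data_bytes = record_count * avg_record_size
usable_per_group = int(group_size * target_load)

# Ceiling division: enough groups that the data fits at the target load.
min_modulus = -(-data_bytes // usable_per_group)
print(min_modulus)
```

Setting MINIMUM.MODULUS near that figure means the file starts out close to its working size instead of splitting its way up from a tiny modulus.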
Chris,
I second the thought that, because of the splitting and merging of groups, it
can be a waste of effort to overwork the sizing of a dynamic file.
One problem with your TEST_FILE below is that the Large Record Size is spec'ed
at less than 50% of the group size. Each record that is larger
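The Large Record check being described reduces to a simple predicate: any record bigger than the LARGE.RECORD setting is held outside its primary group, leaving only a pointer behind, so a threshold spec'ed too low pushes ordinary records out of group. A sketch (the sizes are hypothetical, and the physical storage of oversized records is simplified here):

```python
group_size = 4096
large_record = 1600   # hypothetical: spec'ed below 50% of group size

def stored_out_of_group(record_bytes):
    """True if the record exceeds LARGE.RECORD and so is stored
    outside the primary group, costing an extra I/O to fetch."""
    return record_bytes > large_record

print(stored_out_of_group(3267))   # the 3267-byte record leaves the group
```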