Donald,

Confronted with a similar issue in the past, what I decided to do was:

(1) Construct a variety of cell pools with upper size boundaries on various 
lengths - e.g. 256 bytes, 1K, 2K, 4K, 8K, 16K and 32K
(2) When an element is to be added to the table, determine the correct cell 
pool to use based on its size
(3) Each element is then described by a small element descriptor (another cell 
pool) that includes the address of the cell that contains the data and the cell 
pool ID it came from
(4) When you update/delete a cell, each element descriptor has enough 
self-contained information to release the existing cell back to the cell pool 
it came from

Your code could also keep track of the number of cells in each cell pool and 
report statistics that enable you to tune the PCELLCT and SCELLCT values.
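The scheme above can be sketched as follows (a minimal illustration, not z/OS code: the class names, sizes, and dict-based free lists are my own; on z/OS each size class would be a CPOOL-built pool, and the in-use counters would feed the PCELLCT/SCELLCT tuning):

```python
# Sketch of the size-class cell-pool scheme: fixed size classes,
# a descriptor per element recording which pool its cell came from,
# and per-pool counters for tuning statistics.
import bisect

CELL_SIZES = [256, 1024, 2048, 4096, 8192, 16384, 32768]

class CellPools:
    def __init__(self):
        # one free list and one usage counter per size class
        self.free = {s: [] for s in CELL_SIZES}
        self.in_use = {s: 0 for s in CELL_SIZES}

    def pool_for(self, length):
        # step 2: smallest size class that can hold `length` bytes
        i = bisect.bisect_left(CELL_SIZES, length)
        if i == len(CELL_SIZES):
            raise ValueError("element larger than 32K")
        return CELL_SIZES[i]

    def get(self, data):
        size = self.pool_for(len(data))
        # reuse a freed cell if one is available, else obtain a new one
        cell = self.free[size].pop() if self.free[size] else bytearray(size)
        cell[:len(data)] = data
        self.in_use[size] += 1
        # step 3: descriptor carries the cell and its pool ID,
        # so step 4 (release) is self-contained
        return {"cell": cell, "pool_id": size, "used": len(data)}

    def release(self, desc):
        # step 4: return the cell to the pool it came from
        self.free[desc["pool_id"]].append(desc["cell"])
        self.in_use[desc["pool_id"]] -= 1
```

An update that grows an entry past its size class simply releases the old cell and gets a fresh one from the next pool up.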
  

Rob Scott
Lead Developer
Rocket Software
77 Fourth Avenue . Suite 100 . Waltham . MA 02451-1468 . USA
Tel: +1.781.684.2305
Email: rsc...@rs.com
Web: www.rocketsoftware.com


-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] On Behalf 
Of Bernd Oppolzer
Sent: 21 December 2012 12:40
To: IBM-MAIN@LISTSERV.UA.EDU
Subject: Re: Large table in memory

To suggest a solution, it would be important to know how the elements of the 
table are accessed (by table index, or is there a key element in each table 
element that is searched? Is there a key table where a binary search is done 
for that key, or a tree structure that yields the table index, etc.?), and:

what kind of update activity is done to the table entries; is it possible, for 
example, that short table entries are replaced by longer ones, and how often 
does this occur?

I would suggest a solution where only the really needed parts of the variable 
length strings are stored, so that the storage needed is about 875 * 35,000 
bytes (roughly 30 MB) plus some administration overhead. But how exactly this 
is done depends on your answers to the questions above.
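A minimal sketch of that idea (the names and the one-object-per-entry allocation are illustrative only; a real implementation would suballocate exact-length pieces from one large above-the-bar area): each entry is stored at its actual length, and a fixed-size index of references locates it, so total storage tracks the sum of real lengths rather than the 32K maximum.

```python
# Sketch: store each variable-length entry at its exact length and
# keep a fixed-size index (35,000 slots) of references to the entries.
class CompactTable:
    def __init__(self, max_entries=35000):
        self.index = [None] * max_entries   # fixed-size slot table

    def put(self, slot, data):
        # replacing an entry drops the old copy and stores the new
        # one at its new exact length, whether shorter or longer
        self.index[slot] = bytes(data)

    def get(self, slot):
        return self.index[slot]

    def bytes_used(self):
        # storage is the sum of actual entry lengths, not slots * 32K
        return sum(len(e) for e in self.index if e is not None)
```

With the quoted averages, 35,000 entries of ~875 bytes each come to about 30 MB of data, matching the non-cell-pool figure in the original question.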

Kind regards

Bernd




Am 21.12.2012 13:25, schrieb Donald Likens:
> I have a table with variable length entries that range from 94 bytes to 32K 
> and an average length of 875 bytes. This table has a maximum size of 35,000 
> entries. I am thinking about using cells for this table but am concerned 
> about the impact on the system of getting over 1 gigabyte of storage 
> (35K*32K). I am 
> putting this cell pool above the bar but what about backing this storage with 
> AUX and page faults? Should I be concerned? If I don’t use a cell pool the 
> memory usage is around 30M.
>
> ----------------------------------------------------------------------
> For IBM-MAIN subscribe / signoff / archive access instructions, send 
> email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
>
