Literal Data - 64 K size limitation (OpenInsight)
At 17 APR 2001 11:48:20PM Don Smith wrote:
In Arev 3.12, literal data (a variable) is limited to 64K in length. Does the same limitation apply in OpenInsight? And is there any way to construct a literal data variable exceeding 64K dynamically?
One possibility is to assign the data to a new variable whenever the length exceeds 64K. In that case, is there a way to pass two or more variables containing literal data as parameters and invoke a popup?
Don Smith
At 18 APR 2001 06:13AM Oystein Reigem wrote:
Don,
To store more than 64K in one variable you can use a dimensioned array. Each element can contain 64K. At least I think so. I've never used dimensioned arrays myself.
But I feel certain you can't feed a dimensioned array to a popup, so it's probably a blind alley to you.
What is this big piece of data of yours? A result list? Then I might have the same problem as you. I have large result lists that I want to display in a popup-like fashion. What I have decided to do is the following:
- Instead of the Popup dialog have my own window with an edit table
- At any time display a suitable chunk of data in that edit table
- When the user scrolls outside that chunk of data retrieve more data from SYSLISTS and the queried table and reset the contents of the edit table
- Have some programming that supports retrieval of any chunk of a result list, no matter how many SYSLISTS rows the list spans.
With this technique I can handle result lists of any length.
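For illustration only, the windowing arithmetic behind the second step (display a suitable chunk, fetch a new one when the user scrolls outside it) can be sketched in Python. Function and parameter names here are my own, not from any actual implementation:

```python
def chunk_for_row(row, chunk_size, total_rows):
    """Return (first, last) row numbers (1-based, inclusive) of the
    chunk of the result list that should be loaded into the edit
    table when the user scrolls to `row`."""
    first = ((row - 1) // chunk_size) * chunk_size + 1
    last = min(first + chunk_size - 1, total_rows)
    return first, last
```

When the user scrolls to a row outside the currently loaded (first, last) range, the window would re-read that slice from SYSLISTS and the queried table and reset the edit table's contents.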
I'm not quite there yet. So far I've only displayed dummy data generated programmatically. My fear is that the window will operate sluggishly once I start retrieving data from tables, Xlate-ing symbolics and so on.
- Oystein -
At 18 APR 2001 07:19AM Hank Huizinga wrote:
Our solution was to make our own popup window and populate it on the fly. When fully populated, we make it active and visible. The only other solution I know of is to use a temporary table to store the data and then call the popup with the temporary table as the data source.
At 18 APR 2001 09:13AM Don Miller - C3 Inc. wrote:
Oystein..
You're right about dimensioned arrays and popups. In fact, there are several problems with dimensioned arrays: each element eats a variable in itself, and it's not possible to re-dimension an array on the fly once it's been used. However, it IS possible to treat each element of a dimensioned array as though it were a record. Each element can hold up to 64K and can be parsed into a dynamic array:
DIM FOO(64) ;* allow for 64 elements in the array
* Build each element of FOO to contain an arbitrary number of
* sub-elements delimited by @FM, @VM or whatever, and keep track
* of how many array elements are used to hold the data.
LAST_USED = 15 ;* for example
* Make sure that you open the data file and dictionary.
OPEN "JUNK" TO SRC_FILE ELSE …
OPEN "DICT.JUNK" TO @DICT ELSE …
FOR I = 1 TO LAST_USED
RELOAD:
   REC = FOO(I)
   * Assume you stored the data as @ID:@FM:DATA_REC.
   * Alternatively, you can just store the keys in the array
   * and do your own I/O.
   @ID = REC<1>
   REC = DELETE(REC, 1, 0, 0) ;* to get the rest
   @RECORD = REC
   * You can do any dict symbolics or whatever here,
   * then populate a control with the data. (I like your
   * external scroll bar idea.)
   * To go forward, drop through to the NEXTI label;
   * to go backward, decrement I and GOTO RELOAD.
   * The downside of this is that there might be a lot of overhead.
NEXTI:
NEXT I
My experience is that parsing large hunks of static data with delimiters can be extremely inefficient, since each extraction has to scan the string from the beginning.
Useful? ;)
Don Miller
C3 Inc.
At 18 APR 2001 09:50AM Donald Bakke wrote:
Don,
You've probably already got one of your answers: No, OI has not removed the 64K limitation on variable storage.
Using dimensioned arrays is one way to work around this. Another way we deal with it is to create user-defined properties on the fly. For instance, we have a client with a large parts database. They like to enter the first portion of a part number in a field and have any matching parts auto-fill (like IE does when entering a web address). Therefore we get all the keys (which are also stored in lookup records for speed) and assign them to user-defined properties like this:
For Loop = 1 To NumChunks
   Read PartList From hTable, "PART_IDS":Loop Then
      Set_Property(@Window, "@PART_IDS":Loop, PartList)
   End Else
      * ...error code here
   End
Next Loop

[email protected]
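For illustration only, the underlying chunking idea can be sketched in Python: split one long delimited key list into pieces that each stay under the 64K variable limit. The 65,535-byte constant, the delimiter (chr(254), i.e. @FM), and the function name are all assumptions, not from the post above:

```python
LIMIT = 65535  # assumed 64K variable ceiling

def split_into_chunks(part_ids, delim="\xfe"):
    """Pack ids into delimited strings, each at most LIMIT bytes,
    suitable for storing one per lookup record or property."""
    chunks, current = [], ""
    for pid in part_ids:
        piece = pid if current == "" else current + delim + pid
        if len(piece) > LIMIT and current:
            # adding this id would overflow the chunk: flush it
            chunks.append(current)
            current = pid
        else:
            current = piece
    if current:
        chunks.append(current)
    return chunks
```

Joining the chunks back with the same delimiter reproduces the original list, so nothing is lost by the split.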
At 18 APR 2001 12:21PM Oystein Reigem wrote:
Don M,
I'll try to heed your warning about inefficient handling of delimited data. But I doubt my chunks (the current content of the edit table) will be very large.
Come to think of it - perhaps I'll load the edit table with all the data if I'm certain there's room for everything, and just smallish chunks if not - let's say at most a few hundred records at a time, or a few K.
- Oystein -
At 18 APR 2001 01:54PM Don Miller - C3 Inc. wrote:
Oystein..
That ought to do the trick nicely. You should be able to determine how many rows a given set of criteria would yield. For example:
* Build CRITERIA_FOR_SELECT into a variable that contains the WITH and BY clauses.
RLIST('SELECT SOME_TABLE CRITERIA_FOR_SELECT…)
IF @RECCOUNT THEN
   ROWS2SHOW = @RECCOUNT
   GOSUB SHOW_STUFF
END ELSE
   * Select failed for some reason .. display whatever you want to do about it.
END
RETURN 1 ;* or zero if nothing will follow this

SHOW_STUFF:
BEGIN CASE
   CASE ROWS2SHOW < 1000
      * Just populate the display.
   CASE 1
      * Set @USER0 to show you're in paged mode.
      @USER0 = 1
      * Populate a hunk and keep track of where you are in @USER0.
END CASE
RETURN
You can use either dimensioned arrays or fiddle with the keys to SYSLISTS or whatever. The point about parsing delimited data mostly has to do with multi-valued extraction or the LOCATE opcode. It turns out that a multi-valued XLATE can be pretty fast, but can produce bizarre results. For example, assume a record with 1000 related IDs and you want to get the NAME info for these, where the name field may itself be a symbolic built from 1 to 4 fields or so. Then XX=XLATE('PATIENT',PAT_IDS,'PAT_NAME','X') might have the following consequences:
1. You might blow the 64K limit on a variable if you had a lot of longish names.
2. The indirect calls to build the SYMBOLIC field PAT_NAME in the PATIENT file can impose significant overhead. Depending on how this is coded (Braces vs. @RECORD calls internally), you can use up a lot of string descriptors.
Oh, well you knew all of this anyway. I'm probably just blathering on.
Don Miller
C3 Inc.
At 18 APR 2001 08:12PM Bob Carten, WinWin Solutions Inc. wrote:
You could write page-sized records to the lists file:

listname = 'longlist'
list_id = ''
keys = ''
pagenum = 0
pagesize = 20
done = 0
i = 0
call rlist(myselect, 5, '', '', '')
loop
   readnext key else done = 1
until done
   i += 1
   if mod(i, pagesize) else gosub put_list
   keys<-1> = key
repeat
if len(keys) then
   gosub put_list
end
return pagenum

**
put_list:
   list_id = listname
   if pagenum then
      list_id := '*':pagenum
   end
   write keys on f_syslists, list_id else
      err = 'Unable to write lists rec ...'
      goto error
   end
   keys = ''
   pagenum += 1
return

error:
This lets you put |< < > >| VCR buttons on the form:
- Have the search button perform the whole select.
- A click on any VCR button sets @PAGENUM in the form, then fires a read.
- Code the logic to fill the edit table in the read event.
- The read gets the keys via xlate('SYSLISTS', listname:'*':pagenum-1…., then builds the edit table.
…..
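For illustration only, the paging scheme above can be sketched in Python, with a dict standing in for the SYSLISTS table; every name here is an assumption, not from the post:

```python
def write_pages(keys, listname, pagesize, lists):
    """Store keys as page-sized records: page 0 under the bare list
    name, later pages under "name*1", "name*2", ... Returns the
    number of pages written."""
    pagenum = 0
    for start in range(0, len(keys), pagesize):
        list_id = listname if pagenum == 0 else "%s*%d" % (listname, pagenum)
        lists[list_id] = keys[start:start + pagesize]
        pagenum += 1
    return pagenum

def read_page(listname, pagenum, lists):
    """Fetch one page of keys; the VCR buttons would drive pagenum."""
    list_id = listname if pagenum == 0 else "%s*%d" % (listname, pagenum)
    return lists.get(list_id, [])
```

Each VCR click then costs one small read instead of re-running the select or dragging the whole list into a variable.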
I did something like this for a version of INET_POPUP
hope this helps
Bob
At 19 APR 2001 03:53AM Oystein Reigem wrote:
Bob,
Last time I encountered a similar problem it was more like yours: presenting long lists on the web. Like you I presented the list in chunks with controls/links to other chunks.
But I saved my lists and chunks in a different way. I did consider your technique of saving to tailormade chunk-sized SYSLISTS rows, but ended up with storing the list the standard way and instead established the chunks as a set of pointers into the SYSLISTS rows. (The pointers I stored in a separate SYSLISTS row with a name derived from the name of the original target. If the result list spanned more than one SYSLISTS row I initially made chunks only for the first row, waiting with the rest of the chunking process until really needed. This to keep access time down.)
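For illustration only, the pointer idea could look like this in Python, with a list of key-lists standing in for the stored SYSLISTS rows; all names here are hypothetical:

```python
def build_chunk_pointers(rows, chunk_size):
    """Leave the stored rows as-is and build an index of
    (row_index, offset_within_row) pointers, one per chunk,
    marking where each chunk of the full list starts."""
    pointers = []
    count = 0
    for r, row in enumerate(rows):
        for off in range(len(row)):
            if count % chunk_size == 0:
                pointers.append((r, off))
            count += 1
    return pointers
```

Fetching chunk n is then a matter of following pointer n into the right row, rather than rewriting the whole list in page-sized records.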
This time it's not a web application. But more important is that I want a seamless presentation with no chunks. In the background I will use the chunking routines I developed for the web app, for faster access to any piece of the list. But the user should not be aware of any chunks. Neither will the chunks of data presented in the edit table correspond directly to the "background" chunks.
I don't foresee any problems with plugging in my old chunking routines, except I'm anxious to see how fast and fluid the interface will perform. My current problems are: (1) Keeping everything in synch when the user navigates the edit table (the content, scroll and current cell of the edit table, and the scroll of a separate scroll bar). (2) A weird problem with the edit table's GOTFOCUS running before window CREATE, or alternatively, a malfunctioning of my own brain (see the common discussion list).
- Oystein -
At 20 APR 2001 07:27AM Oystein Reigem wrote:
Don,
I hope I don't have to include symbolics when I display large result lists. But I'm not sure yet.
I might be a little rusty when it comes to symbolics and efficiency. I might have to read up a bit. Thanks for the warning.
- Oystein -
At 24 APR 2001 01:23AM Don Smith wrote:
Hi
Thanks for all your feedback. We agree with you that there is no easy solution to this problem.
We are working on a subroutine to handle this problem, but we have yet to assess the efficiency of handling data by this method.
Will keep you posted.
Regards
DON SMITH