Exporting Arev 3.12 to DBase (AREV Specific)
At 17 FEB 2000 10:55:49AM Ed Bell wrote:
I need to export some Arev 3.12 files to a flat file format such as DBase.
I then want to import them into SQL 7.0. I know that there is an export function in Arev for doing this and I have used it before for some small files. I need to do some rather large files and feel that this is beyond my knowledge of Arev. Does anyone know where I could start looking for a consultant to help with this project? I know there are a good number out there, but I'm not sure where to start looking.
At 17 FEB 2000 11:23AM Matt Sorrell wrote:
Ed,
Quite honestly, if you have used the ASCII export routines for small files, then you already know everything you need to do it for large files; it will just take longer. Provided you are not having to export multivalued data, it is very straightforward to export your data into either a fixed-width or delimited file using the built-in export routines.
Matt Sorrell
At 17 FEB 2000 12:00PM Warren wrote:
As Matt said, it is mainly a matter of time due to the number of records.
However, there are several (perhaps severe) limitations to the ARev export utility:
1) The number of columns that can be exported is limited. This will manifest itself as an ERRMSG 27.4, I believe. There is a limit to the number of entries that the program stack in ARev can handle, and EXPORT can hit this due to the nature of the program. You may need to split the file into several exports, depending on the number of columns you need to export.
2) There is a Y2K bug in date handling. The export program will only handle a "D2x" OCONV on date fields and *always* prepends the century '19' when creating the internal DBase date format. A symbolic workaround is sketched after this list.
3) Multivalued fields are not normalized. You can export them as memo fields with the following limitations: a) The memo field must be defined in the export template with a size of 10; anything else will result in an invalid .DBT file. b) The value/subvalue marks will be embedded in the memo field, although you could use a symbolic to change these to something else, as sketched below.
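As a concrete example, symbolics along these lines (the field positions here are made up) can work around both the date and the delimiter problems by exporting cleaned-up text instead of the raw fields:

    * Date as text with a four-digit year, sidestepping EXPORT's D2x handling
    @ANS = OCONV(@RECORD<7>, 'D4-')

    * Multivalued field as memo text with the marks swapped out
    @ANS = @RECORD<3>
    SWAP @VM WITH '; ' IN @ANS
    SWAP @SVM WITH ', ' IN @ANS

You then put the symbolics in the export template as plain character columns in place of the underlying date and multivalued fields.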
If you still feel you need a consultant, post a wanted notice in the Developer Network section of this website. Mentioning your geographical location would be most welcome.
At 18 FEB 2000 11:00AM Michael Slack wrote:
We have a project at the moment where we are building an application in PowerBuilder to replace one of our Arev 3.12 applications. About a year ago I wrote a process that allowed us to take data from an Arev table and write it to a text file. From there we used SQL to import the data into the PowerBuilder tables.
The process I wrote pulls the dictionary information into a window. From there the user sets values in columns to indicate whether the data is to be put into the text file, at what position, and in what format. Once the conversion settings are done, the next part of the process reads that conversion information so it knows how to build the records in the text file. While it is doing that, if it runs into a piece of data that doesn't match what is expected, it writes an error line to another text file. This allows the user to clean up the data or adjust the conversion settings. Once the text file is built, the user reads it into the target file using an SQL statement.
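The heart of such a process, sketched here with made-up table and field names (this is not the actual program), is just a select loop that extracts fields per the conversion settings and shunts bad data to the error file:

    OPEN 'CUSTOMERS' TO CustFile ELSE STOP
    OutLines = '' ; ErrLines = ''
    SELECT CustFile
    Done = 0
    LOOP
       READNEXT Id ELSE Done = 1
    UNTIL Done DO
       READ Rec FROM CustFile, Id THEN
          Name = Rec<1>
          Balance = Rec<2>
          * data that doesn't match the settings goes to the error file
          IF NUM(Balance) THEN
             OutLines = OutLines : Id : '|' : Name : '|' : Balance : @FM
          END ELSE
             ErrLines = ErrLines : Id : ' bad balance: ' : Balance : @FM
          END
       END
    REPEAT
    * turn field marks into CR/LF and write the DOS text files
    SWAP @FM WITH \0D0A\ IN OutLines
    SWAP @FM WITH \0D0A\ IN ErrLines
    OSWRITE OutLines ON 'CUST.TXT'
    OSWRITE ErrLines ON 'CUSTERR.TXT'

For really large tables you would OSOPEN the DOS file and OSBWRITE it in chunks rather than building one big string in memory.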
Because the process I created was originally meant for one particular project, there may be things that I left out, or coded specifically for that project, that may not quite match your needs. I did try to make it as generic as possible. If you would like a copy of the programs and window that the process uses, let me know.
Michael Slack
E-mail: [email protected]
At 02 MAR 2000 03:32AM Dean Todd, Computer Resource Team - Orlando wrote:
Exporting simple LH structures is easy. But when you start dealing with MV and relational indexes, it quickly becomes apparent that most other DB environments are not up to it. We had to write RBasic routines for MVs and relational indexes, essentially hard coding the info. Wish I could find the code, but it's nearly 4am here and I'm out of gas.
Dean
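For what it's worth, the general shape of that kind of routine (made-up names, not the lost code) is one child row per value, keyed by the parent key plus a sequence number:

    OPEN 'ORDERS' TO OrdFile ELSE STOP
    Out = ''
    SELECT OrdFile
    Done = 0
    LOOP
       READNEXT Id ELSE Done = 1
    UNTIL Done DO
       READ Rec FROM OrdFile, Id THEN
          Items = Rec<3>  ;* the multivalued field
          NumVals = DCOUNT(Items, @VM)
          FOR V = 1 TO NumVals
             * child key = parent key + sequence number
             Out = Out : Id : '|' : V : '|' : Items<1,V> : @FM
          NEXT V
       END
    REPEAT
    SWAP @FM WITH \0D0A\ IN Out
    OSWRITE Out ON 'ORDITEM.TXT'

On the SQL side the child table gets a composite primary key on (parent key, sequence) and a foreign key back to the parent, which is what replaces the relational index.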
At 02 MAR 2000 12:25PM DSig wrote:
Ed,
You have received a lot of good information and help .. let me add just a little bit.
1) Have you mapped out your database? If you haven't, then this is the first thing you want to do. Moving into a 'normalized' world, you are going to kill yourself with performance hits if you do not map out the data first and make rational decisions about the normalization to be done.
2) I think someone mentioned relational indexes. In most cases you can do away with them, as an index on the foreign key will return all the information you need. There are times when you need to make a secondary table for referencing these, but those instances seem to be few and far between .. thankfully
3) Be sure to create/run a routine to examine all the data in each table. Let's face it .. the one big problem with all the flexibility we have in the MV world is that you don't have to do what the dicts say. Quite often there is data in tables that you really didn't expect and that the dicts didn't 'define'. So .. you want a routine which will find the max number of fields, each field's max size, whether there are embedded delimiters (VM, SVM, TM, etc.), whether the field is numeric, and so on; a sketch follows this list. This is critical before you start trying to load data.
4) Export your data into text files. Then use BCP to load your data into your SQL Server db. BCP is very, very fast; an example command line is below.
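Here is a minimal sketch of the kind of audit routine #3 describes (the table name is made up; add whatever checks your data needs):

    OPEN 'CUSTOMERS' TO CustFile ELSE STOP
    MaxFlds = 0 ; MaxLen = '' ; MarkFlds = ''
    SELECT CustFile
    Done = 0
    LOOP
       READNEXT Id ELSE Done = 1
    UNTIL Done DO
       READ Rec FROM CustFile, Id THEN
          NumFlds = DCOUNT(Rec, @FM)
          IF NumFlds GT MaxFlds THEN MaxFlds = NumFlds
          FOR F = 1 TO NumFlds
             Fld = Rec<F>
             OldMax = MaxLen<F> ; IF OldMax = '' THEN OldMax = 0
             IF LEN(Fld) GT OldMax THEN MaxLen<F> = LEN(Fld)
             * note which fields carry embedded delimiters
             IF INDEX(Fld, @VM, 1) OR INDEX(Fld, @SVM, 1) THEN MarkFlds<F> = 1
          NEXT F
       END
    REPEAT
    SWAP @FM WITH ', ' IN MaxLen
    SWAP @FM WITH ', ' IN MarkFlds
    PRINT 'Max fields seen:      ' : MaxFlds
    PRINT 'Max length per field: ' : MaxLen
    PRINT 'Fields with VM/SVM:   ' : MarkFlds

Comparing the output against the dicts is where the surprises show up.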
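And for #4, a typical bcp command line against a pipe-delimited text file looks like this (database, server and password are placeholders):

    bcp MyDb..Customers in CUST.TXT -c -t "|" -r "\n" -S MYSERVER -U sa -P secret

-c keeps everything in character mode so the text file loads as-is, and -t/-r name the field and row terminators you exported with.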
I can't emphasize enough the importance of data analysis (#3) and modeling (#1).
I have moved several systems from the MV world to the relational world, and these are the 2 points which will kill a project.
If you have any questions let me know ..
DSig
At 07 MAR 2000 09:55PM Chris Leenhouts wrote:
There is a new utility called xPort at http://exorsys.com/revelation.htm which resolves the technical and conceptual problems associated with Revelation database exports.