


John Jones wrote:
I believe if you use ZIP to compress the database you can also specify a
maximum file size so it cuts off every X MB. That way, if you transfer the
data and it errors out, you can continue from the bad chunk and not have to
start all over.

And database files will generally compress like crazy.

Or try it in a SAVF using data compression set to *MEDIUM. That usually
shrinks databases by 80+%.
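The split-and-resume idea above can be sketched in plain Python. This is a hypothetical helper, not an IBM i or Info-ZIP tool; the function names and the 100 MB default are my own. Tools like `zip -s` or `split(1)` do the same job on most systems.

```python
import os

def split_file(path, chunk_mb=100):
    """Split `path` into numbered chunks of `chunk_mb` megabytes each.

    Returns the list of chunk file names in order. If a transfer fails,
    only the failed chunk needs to be re-sent, not the whole file.
    """
    chunk_size = chunk_mb * 1024 * 1024
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part = f"{path}.part{index:04d}"
            with open(part, "wb") as dst:
                dst.write(data)
            parts.append(part)
            index += 1
    return parts

def join_files(parts, out_path):
    """Reassemble the chunks (in order) into the original file."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

The same resume-from-the-bad-chunk property is what a split ZIP archive gives you, with compression on top.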



John, I had tried *HIGH and was, shall we say, NOT impressed with the time it was taking, nor the CPU usage; I never did have time to fully finish the library in question. At your suggestion I did try *MEDIUM and was pleased with the results. The 50GB library normally took about 45 minutes for a SAVLIB to a SAVF with compression *YES, and the resulting file was about 16GB. With *MEDIUM it took just under 2 hours to save, but the resulting file was approx. 4GB.

This time frame to save and then transmit the SAVF now works for us... Many thanks!
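Whether the slower *MEDIUM save pays off depends on the link you transmit the SAVF over: you trade extra save time for a much smaller transfer. A quick sketch using the figures from the post (50GB library; ~45 min / 16GB with *YES, ~2 h / 4GB with *MEDIUM); the 10 Mbit/s link speed is purely an assumption for illustration.

```python
def total_time_hours(save_hours, file_gb, link_mbps):
    """Save time plus transfer time for a save file over a given link.

    file_gb is converted to megabits (GB * 8 * 1024) and divided by the
    link speed in Mbit/s to get transfer seconds, then hours.
    """
    transfer_hours = (file_gb * 8 * 1024) / link_mbps / 3600.0
    return save_hours + transfer_hours

# Figures from the post, 50 GB library:
#   DTACPR(*YES):    ~0.75 h save, ~16 GB SAVF (68% reduction)
#   DTACPR(*MEDIUM): ~2 h save,    ~4 GB SAVF  (92% reduction)
# Link speed below is a hypothetical 10 Mbit/s WAN.
t_yes = total_time_hours(0.75, 16, 10)
t_medium = total_time_hours(2.0, 4, 10)
```

On a slow link the 12GB saved in transfer easily outweighs the extra save time; on a fast LAN the cheaper *YES compression can come out ahead, so it is worth running the numbers for your own link.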




This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
