As an aside, some years ago, as I looked at how other platforms handled
subfile-style lists, I noticed that rarely is a list longer than a couple
of hundred rows. Once I wrapped my head around that, I recognized that no
one is going to willingly page through 6000 rows. Ever. That made me
rethink my workflow and start watching what those other platforms do when
millions of records are available. Now, when a subfile is likely to exceed
300 records, I step back and look at inserting a filter to cut down the
records to be processed. Once I changed my attitude about the 9999-row
subfile limit, I found my programs were simpler to use, needed less
training, and had markedly improved response times. Also, writing only
load-all subfiles is just a whole lot less code.
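That filter-first pattern can be sketched roughly as follows. This is a minimal illustration in Python with SQLite rather than RPG/DB2, and the table and column names (orders, ordnum, custname) are hypothetical: count the matching rows first, and only do the load-all when the result set is under the threshold; otherwise send the user back to narrow the filter.

```python
import sqlite3

SUBFILE_LIMIT = 300  # the threshold from the post: past this, filter instead


def load_subfile(conn, name_filter):
    """Return all matching rows for a load-all subfile, or None when the
    result set is too large and the user should narrow the filter first.
    Table and column names here are hypothetical."""
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE custname LIKE ?",
        (name_filter + "%",),
    ).fetchone()
    if count > SUBFILE_LIMIT:
        return None  # too many rows: prompt for a narrower filter
    return conn.execute(
        "SELECT ordnum, custname FROM orders "
        "WHERE custname LIKE ? ORDER BY ordnum",
        (name_filter + "%",),
    ).fetchall()


# demo with an in-memory database of 6000 rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (ordnum INTEGER, custname TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, f"CUST{i % 1000:04d}") for i in range(6000)],
)

print(load_subfile(conn, "") is None)      # no filter, 6000 rows: refuse
print(len(load_subfile(conn, "CUST001")))  # narrowed: small enough to load all
```

The point of counting before loading is that the expensive part (building and transmitting the whole list) never happens for an unusably large result set; the user does the narrowing up front instead of paging.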
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work.