


Pete,

Yes, my parsing could probably use some work, but it's relatively clean now.

The last CSV I posted was 77 MB, and it took a couple of hours to do the whole
thing: reading, data parsing, date conversion, data validation, and updating
another PF.

On the other hand, I was using fgets to read one record at a time (average
size 30-40 bytes), so based on your benchmarks, I'd say I could save a bunch
of time by at least increasing my read buffer size.

And I'll probably end up writing the parsing routine to process this
particular file only, and forgo flexibility.

As a learning experience, this has all been invaluable.

Thanks again for everyone's help,

rick

-------original message---------
I didn't do anything except read the data - no parsing. I was interested in
the different timings of the various methods of reading the contents of the
file. I wasn't proposing to do Rick's job for him. You're right, of course,
that the crux of the matter is the parsing. I suspect that is the area where
the real time savings are to be made.

Pete







This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
