> My experience is like Bill's. If the program is a heavy hitter and you
> can easily keep DIV out of the code, it's worth it.

I just did a benchmark on my system. A multiply takes 5893 nanoseconds, and a divide takes 20931 nanoseconds. This means that if you had code that does ONE MILLION divides, you would waste about 15 seconds. It's definitely not worth making your code harder to read to save 15 seconds per day in some big million-record batch processing job. Furthermore, in an interactive app that doesn't run a million-iteration loop, the time will never be noticed, since it's less than the time it takes to blink. In any application you write where you "save time" by using MULT instead of DIV, I can almost guarantee that there's something else in that app wasting more time than the difference between MULT and DIV.
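For what it's worth, here is the back-of-envelope arithmetic behind that figure, using the per-operation timings quoted above (just a sketch; the exact numbers will obviously vary from machine to machine):

    # Per-operation timings from the benchmark above, in nanoseconds.
    MULT_NS = 5893
    DIV_NS = 20931

    iterations = 1_000_000

    # Extra time spent if every one of those operations is a DIV instead of a MULT.
    extra_ns = (DIV_NS - MULT_NS) * iterations
    print(f"Extra time for {iterations:,} divides: {extra_ns / 1e9:.1f} seconds")
    # -> roughly 15 seconds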