Hello Roberto,
On 14.11.2019 at 18:03, Roberto José Etcheverry Romero <yggdrasil.raiker@xxxxxxxxx> wrote:
When a customer's ASP is more than 60% IFS, with PDFs, JPEGs and so on numbering into the double-digit millions of files, backup becomes pretty slow.
Good point, but this is also true for other platforms. Doing file-level backups with a really large number of (small) files is inherently slow. NFS is just a layer above local filesystems on other machines, so the problem is moved, not really solved.
Besides, why have a 20 core machine with only 3 licensed for i and waste some of those i cores on IFS when you could have a fileserver on the same machine?
Good point. I doubt that the IFS puts much strain on the CPU, though; the strain is mostly on IOPS.
That is the premise I start this question from.
Thanks for supplying this important information.
I've been getting the customer to split the folders by year/month/branch or whatever criteria help to avoid having so many files in a single folder (and to allow saves to NOT save the entire IFS daily, but only the current in-use folder).
Thousands of files in a single hierarchy? While this might work perfectly in QSYS.LIB, it's a very bad idea in anything else I know of: Linux, Windows, other Unices, some greatly outdated and mostly forgotten stuff, and also the IFS, which is functionally derived from UNIX filesystems. So yes, creating a hierarchy that limits the files in one folder to a few hundred helps greatly, no matter which platform.
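To make that concrete, here is a minimal sketch in Python, assuming made-up names (an /archive root, a BRANCH01 branch code, and an archive_path() helper that is not from any real product), of how incoming files could be dropped into a year/month/branch layout so a single folder never holds more than one month of one branch:

from datetime import date
from pathlib import Path

def archive_path(root, branch, filename, when=None):
    # Build root/YYYY/MM/branch/filename so each folder only ever
    # holds one month's worth of files for one branch.
    when = when or date.today()
    return Path(root) / f"{when.year:04d}" / f"{when.month:02d}" / branch / filename

target = archive_path("/archive", "BRANCH01", "invoice_4711.pdf", date(2019, 11, 14))
print(target)  # -> /archive/2019/11/BRANCH01/invoice_4711.pdf
# target.parent.mkdir(parents=True, exist_ok=True)  # create the folder chain
# shutil.move(incoming_file, target)                # then move the new file into place

A daily save then only needs to touch the current year/month subtree instead of the whole IFS.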
Other tips?
Depends on the exact backup strategy, which should be dictated by the allowable time frame for a full restore from backup media. If you elaborate on that, maybe there are tips. :-) (I'm very keen to let people know why they should do backups properly: not to sleep better because they know they have backups, but to have a fast and easy way to bring the data back after a horrible failure. Fast and easy, because when such things happen the stress level is already high, and I would not be in the mood to figure out how exactly I'm supposed to get the data back onto a replacement system. At its best, it must be really easy and foolproof.)
:wq! PoC
PGP-Key: DDD3 4ABF 6413 38DE -
https://www.pocnet.net/poc-key.asc