While I agree it is best to understand the limits before designing a system, in 40 years of coding I have fought a constant battle with limitations imposed by the big OS companies: IBM, Microsoft, and others. The IFS object limit is only one of many. The mother of all limits was Y2K.
That there is a performance hit with more than a few thousand objects in a directory, when the stated maximum is 999,998 objects, kind of bites.
To have to build complex application structures to work around such limits adds more overhead.
I have worked on many applications where IFS use is critical, and just because your particular business might not have that need doesn't mean it is not fairly common.
It's also true that big volumes may be 10 years down the road from when the app is designed.
In my current work we import, generate, and export thousands of documents in a day or two, and have regulatory requirements to keep data for a long time... we work around the limit.
For the record, we have been able to live with the performance of serving docs from directories with 200,000+ objects (and I think some directories have many more).
We have had to be careful when generating a doc (PDF, TXT, CSV) and then immediately moving or FTPing it: either a 2-second DLYJOB, or monitor for a message about the file's existence and retry.
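The retry side of that workaround can be sketched roughly like this (in Python rather than CL, just for illustration; move_when_ready is a hypothetical helper name, and the poll-until-size-is-stable check is my assumption about how to tell the writer has finished):

```python
import os
import shutil
import time

def move_when_ready(src, dest, attempts=5, delay=2.0):
    """Wait until src exists and its size has stopped growing, then move it.

    Mimics the DLYJOB-and-retry approach: poll, sleep, poll again.
    Returns True if the file was moved, False if it never settled.
    """
    last_size = -1
    for _ in range(attempts):
        if os.path.exists(src):
            size = os.path.getsize(src)
            # Two consecutive polls with the same nonzero size: assume done.
            if size == last_size and size > 0:
                shutil.move(src, dest)
                return True
            last_size = size
        time.sleep(delay)
    return False
```

The same shape works whether the final step is a local move or an FTP put; the point is simply not to touch the file on the very same tick it was generated.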
-----Original Message-----
From: MIDRANGE-L [mailto:midrange-l-bounces@xxxxxxxxxxxxxxxxxx] On Behalf Of Joe Pluta
Sent: Thursday, April 11, 2019 11:22 AM
To: midrange-l@xxxxxxxxxxxxxxxxxx
Subject: Re: IFS limits
I agree with that sentiment, John. The last time I had a situation where we had a large number of files in a directory, we started seeing various performance issues. Lists took much longer than simple linear math would have suggested, some utilities couldn't handle the number of files, various commands in QShell had problems, and so on.
Generally speaking, I prefer to limit the number of files to the low thousands. After that, I try to come up with a subdirectory management structure.
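One common shape for that kind of subdirectory management structure (a sketch of the general idea, not necessarily what Joe does; bucketed_path is a hypothetical name) is to hash each filename into a fixed set of buckets, so no single directory grows past a few thousand entries:

```python
import hashlib
import os

def bucketed_path(root, filename, buckets=256):
    """Spread files across a fixed set of subdirectories by hashing the name.

    With 256 buckets, 261,000 files land at roughly 1,000 per directory.
    The same filename always maps to the same bucket, so lookups need no
    index: recompute the hash and you know where the file lives.
    """
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    bucket = "%02x" % (int(digest[:8], 16) % buckets)
    return os.path.join(root, bucket, filename)
```

Date-based subdirectories (one per month or year) are the other common choice, and fit naturally when there is a retention requirement to purge by age.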
On 4/11/2019 10:12 AM, John Yeung wrote:
On Thu, Apr 11, 2019 at 10:08 AM Doug Englander
<denglander@xxxxxxxxxxxxxxxxxxxxxxxx> wrote:
We have one IFS folder with over 261,000 PDFs in it. I am wondering what the limit is so I can be proactive and avoid a problem.
In my opinion, quarter of a million objects in one directory is way,
way, way, way, way too many already. Obviously, it's best if you're
proactive right from the beginning. I have a rule of thumb: If I
stumble upon a directory and I have to wonder if it has too many files
in it, then it has too many files in it.
What counts as "too many" for my sensibilities is so far below any
hard limits that I keep forgetting that there can even BE hard limits,
other than total disk space.
John Y.
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit:
https://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives at
https://archive.midrange.com/midrange-l.
Please contact support@xxxxxxxxxxxx for any subscription related questions.
Help support midrange.com by shopping at amazon.com with our affiliate link:
https://amazon.midrange.com
As an Amazon Associate we earn from qualifying purchases.