Hi Rob,

A number of things to discuss on this one.

The recommendation - it is worth considering what drove the recommendation
to raise the minimum size to 1MB: performance for backup and recovery, as
well as catalog resyncs. Ideally, you really don't want to hit 2 million
.nlo files, as there have been problems in the past with such a large DAOS
catalog. With that said, if you are not having any trouble with backup
and recovery, and the number of .nlo files on NOTES01 is staying somewhat
constant or growing slowly, I wouldn't bother removing those old small .nlo
files. Why mess with something that isn't broken? However, my guess is
that the archive server is going to continue a steady climb, so
increasing the minimum size on that server ASAP would be my recommendation.
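
For reference, the change itself is a single notes.ini setting; the value
below is the 1MB figure from the best-practices wiki, and it only affects
attachments stored after the change:

```
DAOSMinObjSize=1048576
```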

The estimator - The estimator will still work for you. However, to get
realistic data from it, be sure you have
DAOSEST_BUCKETS=16,64,128,256,512,768,1024,2048,3072,4096 set in the
notes.ini, or specify the sizes you prefer. Also, by default the estimator
only reads 50% of the databases and then extrapolates the results from
those. For a more accurate estimate you can use the -p parameter and
specify a higher percentage; of course, the more databases read, the
longer the estimator takes to run. For example, to collect on all
databases: Load DAOSest -o DAOSEstimator_Output.txt -p 100

Pruning - There is a nasty, nasty bug that can delete .nlo files
prematurely. Before you even consider following the technote to change
the minimum size and begin to prune out the small files, be sure you
are at 8.5.3 and have run a resync -force since the upgrade! (If anyone
out there reading this is using DAOS and has not upgraded to 8.5.2 FP4 or
8.5.3 - do it now and run resync -force!!!) If you are not sure whether
you have run a resync -force lately, run it again before you start. The
other issue is that it can be hard to get rid of those small .nlo files:
specifically, you need to get compact to run and complete on all of the
databases. After all, trying to track down which database still holds a
reference to a single .nlo file that should be pruned is very difficult.
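
As a rough sketch, the sequence described above looks like this at the
server console (the mail\ directory is illustrative; verify the exact
syntax against the documentation for your release before running anything):

```
tell daosmgr resync force
load compact mail\ -c
tell daosmgr prune 0
```

The copy-style compact (-c) is what pulls the now-undersized attachments
back into the NSFs; prune then removes the unreferenced .nlo files.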

In summary, will technote 1449358 work? Absolutely - I wrote it and have
successfully worked through it with several customers. Was it
time-consuming? Yes, particularly scheduling and verifying the compacts.
What is another option? Just change the minimum size and only worry about
new attachments. On your mail server the old/small files will go away as
they are archived off the server. As for the archive server, you can
certainly run the compacts and let prune work at its scheduled time;
I'm not familiar with your data retention policies for that server.
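
If you want to gauge how much either approach would reclaim before
committing, a short script can walk a DAOS tree and count the .nlo files
below a proposed minimum. This is a hypothetical helper, not part of the
Domino tooling, and the 1MB default mirrors the wiki recommendation:

```python
import os

def count_small_nlos(daos_root, min_obj_size=1048576):
    """Count .nlo files under daos_root smaller than min_obj_size.

    Returns (small, total): how many .nlo files are below the proposed
    minimum, and how many .nlo files exist overall.
    """
    small, total = 0, 0
    for dirpath, _dirs, files in os.walk(daos_root):
        for name in files:
            if name.lower().endswith(".nlo"):
                total += 1
                if os.path.getsize(os.path.join(dirpath, name)) < min_obj_size:
                    small += 1
    return small, total
```

Run it against each container root (e.g. /notes01/notes/data/DAOS) to see
what fraction of objects a higher DAOSMinObjSize would eventually affect.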

Good luck!
Amy
_________________________________________________________
Amy Hoerle
Kim Greene Consulting, Inc
Senior Consultant
ahoerle@xxxxxxxxxxxxx
507-775-2174 (direct) | 507-367-2888 (corporate)
Office Hours: 8:00 AM - 2:00 PM CST, other times by appointment
http://www.bleedyellow.com/blogs/iLotusDomino
http://www.twitter.com/iLotusDomino




From: rob@xxxxxxxxx
To: domino400@xxxxxxxxxxxx,
Date: 04/23/2012 08:41 AM
Subject: Best practices: DAOS size recommendations
Sent by: domino400-bounces@xxxxxxxxxxxx



I was reading
http://www-10.lotus.com/ldd/dominowiki.nsf/dx/daos-best-practices
and it recommends a minimum DAOS size of 1,048,576.
The recommendations are set to the above to avoid overwhelming the file
system and backup utilities.
We currently have this set to:
DAOSMinObjSize=64000
Our NOTES01 server has 7 'containers' (DAOS/000# directories).
Our ARCHIVE1 server has 40 containers.
Each holds a maximum of 40,000 .nlo files. WRKLNK can't display them all
with option 5 (I wonder if it's the old 9999 subfile limitation? CPDA092).
EDTF '/notes01/notes/data/DAOS/0001' (or option 2 to edit the directory
itself) works just dandy to list them all.

There's a document on how to change your DAOS size:
http://www-01.ibm.com/support/docview.wss?rs=0&context=SSCPNFN&uid=swg21449358&loc=en_US&cs=utf-8&lang=


Will the DAOS estimator work in reverse and tell us what the size of these
NSFs will be after changing to the recommended value?
We also have
- Use LZ1 compression for attachments
- Compress database design
- Compress document data
all turned on.

Comments welcome.

Rob Berendt



This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page.
