Keeping this strictly a business decision: how much overhead are we
actually talking about here? If the business decision is to avoid overhead,
then how much are we truly saving? Or are we clinging to a limitation of
older technology that no longer applies?
I did some time trials.
FIIMJOE   O    E             DISK
D                 ds
D iprod                        25a
D iprodnbr                     25s 0 overlay(iprod)
D theTime         s              z
 /free
   dsply (%char(%timestamp()));   // first dsply
   for iprodnbr = 1 to 50000;
     write IMJOER;                // IIMJOE's record format (name assumed here)
   endfor;
   dsply (%char(%timestamp()));   // second dsply
The file was defined 4 different ways; the program needed no recompiling.
3 runs for each file definition, with CLRPFM between runs. Times are in
seconds (second dsply minus first dsply).
CREATE TABLE ROB.IIMJOE (IPROD CHAR(25) NOT NULL WITH DEFAULT,
  CONSTRAINT ROB.IPROD PRIMARY KEY (IPROD))

CREATE TABLE ROB.IIMJOE (IPROD CHAR(25) NOT NULL WITH DEFAULT)
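For completeness, the same primary key can also be added to the second (keyless) table after the fact, rather than in the CREATE TABLE itself; a minimal sketch, reusing the table, column, and constraint names from the statements above:

```sql
ALTER TABLE ROB.IIMJOE
  ADD CONSTRAINT ROB.IPROD PRIMARY KEY (IPROD)
```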
Granted, I am a little floored by that constraint result. Let's set it
aside for a moment and drop down to the old method of defining a primary
key constraint: the last time trial. As you can see, you INCREASE overhead
by not having a primary key. So, again, what business reason is there for
not having a primary key on the item master? Even if you counter that the
data needs more time trials because of one skewed run, I think it's fair
to say that there is no overhead, performance wise.
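For anyone unfamiliar, by the "old method" I mean the system command route
rather than SQL DDL; a sketch, assuming the same file, field, and
constraint names as above:

```
ADDPFCST FILE(ROB/IIMJOE) TYPE(*PRIKEY) KEY(IPROD) CST(IPROD)
```

Either route ends up with the same primary key access path; the question
the time trials answer is whether maintaining it costs anything on insert.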