  • Subject: Re: Software versioning....
  • From: MacWheel99@xxxxxxx
  • Date: Fri, 9 Mar 2001 14:18:48 EST

>  From:    Domenico.Finucci@Fiditalia.it (Finucci Domenico)
>  
>  Good morning, sirs.
>  I would like to know your opinion and your solutions about the management
>  of software releases: locking sources, releasing them, handling versions,
>  and so on. Do you use home-made solutions or packaged software?
>  Please let me know.
>  Sincerely
>  
>  Domenico Finucci
>  Fiditalia, Milano, 02-4301-2494

Hi

I hope that my reminiscing provides some of the kind of information you are 
looking for, & that as other computer professionals review my history, your 
reaction is something other than "Al, you damn fool ......."

This is highly dependent on the software packages you have at your company, 
the willingness of the software vendors to provide source & the wisdom of your 
employer in paying for access to it, the amount of disk space you have, the 
ability & willingness of the end users to do serious testing, what kind of 
budget is available to enhance the productivity of the computer 
infrastructure staff, how volatile the program changes are & so forth.  I 
have seen articles about packaged systems to support software versioning & I 
wish I had one of those, but I have never had the privilege of using one.  

The kind of thing that would have been most helpful to my career is some tool 
that compares two copies of the same file & lists the differences, similar to 
today's PDM-54.  In fact we have user situations today where this would be of 
immense value.  I have some modification proposals currently on the table & 
we have done stuff like this in the past, where we create a file that has 
selected information in it as of the end of one month, get the same deal at 
the end of another month, and give the users a report showing which items 
changed which values month to month & what the differences come to.
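The month-to-month comparison described above is, at heart, a diff of two keyed snapshots. Here is a minimal sketch in Python - the field names and quantities are hypothetical, not from any actual BPCS file:

```python
# Hypothetical sketch of the month-end comparison described above:
# two snapshots of selected values keyed by item, diffed to show
# which items changed which values & what the differences come to.

def diff_snapshots(old, new):
    """Compare two {item: value} snapshots; return per-item changes."""
    changes = {}
    for item in old.keys() | new.keys():
        before = old.get(item, 0)   # items absent from a snapshot count as 0
        after = new.get(item, 0)
        if before != after:
            changes[item] = (before, after, after - before)
    return changes

# Example: made-up on-hand quantities at two month-ends.
january = {"WIDGET": 100, "GADGET": 40, "SPROCKET": 7}
february = {"WIDGET": 85, "GADGET": 40, "NEWPART": 12}

for item, (before, after, delta) in sorted(diff_snapshots(january, february).items()):
    print(f"{item:10} {before:>6} -> {after:>6}  diff {delta:+}")
```

Items present in only one snapshot are treated as zero on the other side, so additions and deletions show up alongside changed values.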

Although I have had many jobs in my career, they all pale in significance to 
my experiences at Willis Music, where I worked for 16 years, & at my present 
job, which I have had since Autumn 1984.  Both of them are family owned 
businesses in which the MIS staff is extremely small, with me doing most of 
the work needed - programming modifications, managing software upgrades, 
operations work, researching end user challenges, security, the whole bit.  
So fortunately for me, I have rarely been in the position of a programmer 
working only with test stuff, not trusted to touch the production stuff.

At Willis we had a few purchased packages that had NO activity with upgrades 
from the suppliers - it was basically 2,000 programs that I managed, 100% 
home brew (written by me or inherited from other people who had worked 
previously in my position).  This was all on boxes that predate the AS/400 & 
the S/36 - when I left them it was the S/34.  When a program was being 
modified, I kept a copy of the original source, stored under a slightly 
different name, and the new version was developed & tested also under a 
slightly different name.  

Fortunately for my sanity with testing, the vast majority of programs that 
got upgraded were reports & inquiries off of a large data base, so I was able 
to supply users with two menu options - the original version & the test 
version.  All programs had a style that put the program name in a fixed 
location on reports & screens, so anyone could tell from a report or screen 
print whether they were looking at the original or the test version.  The 
users could then come to me with a marked-up print-out showing what needed to 
get moved, inserted, etc.

We did not have the luxury of test files, so when an update was involved, my 
policy was to do testing after everyone had gone home for the evening.  I 
would do a total backup.  I would run some reports showing contents of 
records supposed to be updated, contents of some records supposed to be 
untouched, statistics on # of records in the file, totals of various fields, 
and a hash dump of the file to a spool file.  I also made copies of all the 
files touched by the program to be tested, using a similar name - the old 
variant.

Then I ran the test, then the same kinds of reports showing what is in the 
file, including the hash dump to spool.  Then I renamed the files involved 
using the similar-name new variant, and renamed the old variant back to the 
original names, so that the files agreed with how they were when I made the 
backup.  That way normal evening processing could run while I examined the 
evidence, to verify that the records that changed were the ones the program 
was supposed to change & only those, and that the changes were in keeping 
with how the program was supposed to be doing things.  
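The before/after evidence collected here - record counts, field totals, and a hash over the whole file - amounts to comparing two summaries of the same file. A rough sketch of that idea, with a hypothetical record layout (a `key` field and a `qty` field):

```python
import hashlib

def summarize(records, total_field):
    """Record count, total of one numeric field, & a hash over all records."""
    h = hashlib.sha256()
    total = 0
    for rec in sorted(records, key=lambda r: r["key"]):
        # Hash a canonical rendering of each record so any changed field shows up.
        h.update(repr(sorted(rec.items())).encode())
        total += rec[total_field]
    return {"count": len(records), "total": total, "hash": h.hexdigest()}

# Hypothetical: snapshot before the test run, then after, then compare.
before = [{"key": 1, "qty": 10}, {"key": 2, "qty": 5}]
after  = [{"key": 1, "qty": 8},  {"key": 2, "qty": 5}]   # test updated key 1

s1, s2 = summarize(before, "qty"), summarize(after, "qty")
if s1 != s2:
    print(f"file changed: count {s1['count']} -> {s2['count']}, "
          f"total {s1['total']} -> {s2['total']}")
```

Counts and totals catch gross errors cheaply; the hash catches a change to any field of any record, including ones the totals would miss.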

In some tests I found a program might be messing with records it was not 
supposed to, or fields not supposed to be altered, or missing something it 
was supposed to be doing.  Generally I would find one thing wrong, go to the 
working source & figure out why, and fix that, then return to the evidence & 
look for some other problem, and keep at that until exhaustion.  Then another 
night I would do the test over again, and not declare the software ready for 
the users until I had done a test in which I could not find any problems - 
not to say there were none.

The big problem at Willis with how we managed software modifications was that 
a user would explain to me what they wanted, I would reason with them about 
what was reasonably doable, & we would come to a VERBAL agreement on what I 
was to do, & I would jot down notes after the meeting of my marching orders.  
During software development I usually had to make a few compromises with the 
agreement when I saw where there were some almost insurmountable challenges.  

Meanwhile the user is excited about the new software I am developing, 
EMBELLISHES EXPECTATIONS upon what the VERBAL agreement was, & does not 
realize how far the expectations have drifted from the VERBAL agreement, so 
that whatever I deliver is a disappointment.  I tried several times to get 
management to approve some kind of system of written statements of what it 
was that I was supposed to be doing, signed off by the users, so that this 
could be compared to my deliverables, but they thought that was a waste of 
time, so we never got it.

At my current job, the company was in the middle of a conversion to MAPICS 
when I arrived.  I was never told when it started, but my sense is that 4 
years into the conversion we stopped it, at which time part of the company 
was running on the old homebrew ERP, part was running on MAPICS, and part was 
running in parallel.  We searched for another ERP, decided on BPCS, and there 
was about 6 months in which several accounting departments were running in 
triple ... old homebrew, MAPICS, BPCS.  Thankfully it was not me who had to 
make the decision to do that - what a heck of a responsibility.

Over the years on BPCS there have been a number of evolutions in how we do 
things.  Currently we are on a version that SSA will be dropping support for 
in a year, & I am extremely glad of management's vision in wanting to remain 
on that version.  Management has turned over, so the current folks making 
these kinds of decisions for the most part do not remember the reality when 
we were on a version that was SSA's state-of-the-art: SSA came out with 
several thousand upgrades every six months, these upgrades included lots of 
fatal flaws, & it all needed to be tested to find the flaws.  

Most end users do not have the patience, disposition, or training for heavy 
duty software testing, nor the management support for them to be spending 
significant amounts of their time validating software.  We have many people 
who can learn their jobs & do them well, but are not equipped to be exploring 
all the possibilities of what can go wrong & diagnosing what causes something 
to go wrong.  You really need a totally different and much more expensive 
investment in corporate staff to do a good job of testing releases of new 
software than you need to run a stable product.

Rules of thumb that I now operate under include:

Any Downtime is Undesirable.  Unplanned Downtime is a Mortal Sin.
Users should never have to wait on the computer without a reasonable logical 
explanation.
Crashes & Lockups are Intolerable.  We need to analyse them & make 
modifications to reduce these incidents to virtual non-existence.
Users need access to their data during normal working hours & should have the 
expectation that their data is correct & reliable.
There needs to be continual improvement of the quality of service provided by 
each department to all other corporate departments.

All software changes need to be tested, with the department in charge of the 
affected data approving the implementation of any modifications that update 
their files.  The addition of new reports & inquiries can be handled by 2 
menu options - old & new - but what I have been doing is a menu option that 
adds to the library list a test library with an enhanced version of the 
same-named software.

All software changes need to be trackable, so that if something goes wrong, I 
can back-track what lines of source code were involved in any modification, 
and why they were done.  I have noticed that in some other arenas people do 
not do this, which can play havoc if you have multiple people modifying the 
same source.

I have toyed with putting program documentation separate from the source, 
like in QDOCSRC, especially when there are many objects getting similar work. 
 When I do this, I am careful to put in each source affected some line like 
"See document JITMODS in QDOCSRC of BTESTSRC, section "Shop Order Transfers 
We Don't Want" for an explanation of this modification & listing of all 
objects affected by it."

It is unfortunate that some changes are ready for testing and the users are 
told HOW to test, then they forget, until the problem needing the change 
occurs again & the users treat the occurrence as a new problem.  In my 
research of their problem, I stumble over the fact that there was a fix 
waiting on them to test.  Again I tell them HOW to test this, and history 
repeats on the same problem.

Any implementation of software must be done in such a way that if we discover 
some problem that was not caught by the testing, we can go back to the 
previous version of the software.  This has happened often enough to 
re-emphasize the importance of this policy of protection.

Our library list includes base software from SSA, and various releases of 
enhancements & small fixes that they call BMRs ... the latest version of some 
fix is the first thing in the library list, ahead of earlier stuff.

When we get a fix & are done with the testing phase, it ends up in the 
library list without combining the software, so that if a later problem 
develops we can take the BMR back out of the library list.  But because we 
are so close to the IBM limit on how many libraries can be in our library 
list, after 6 months of no reported problems I move BMRs from their own 
little libraries into a consolidated library of all BMRs that have gone 6 
months or more with no problems.
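The mechanics above - newest fix first in the library list, first match wins, back a bad fix out by removing its library - behave like a search path. A toy model in Python (the library & program names are made up; on the real box this ordering is managed with CL commands such as ADDLIBLE):

```python
# Toy model of OS/400 library-list resolution: the first library in the
# list that contains the object wins.  Names here are hypothetical.

def resolve(library_list, libraries, program):
    """Return the library whose copy of `program` would actually run."""
    for lib in library_list:
        if program in libraries.get(lib, set()):
            return lib
    return None  # object not found anywhere on the list

libraries = {
    "BMR_LATEST": {"ORD500"},                      # newest fix, at the front
    "BMR_OLD":    {"ORD500", "INV300"},            # earlier fixes
    "BPCS_BASE":  {"ORD500", "INV300", "SFC600"},  # base release
}
library_list = ["BMR_LATEST", "BMR_OLD", "BPCS_BASE"]

print(resolve(library_list, libraries, "ORD500"))   # the newest fix wins
# Backing a bad fix out is just removing its library from the list:
print(resolve([l for l in library_list if l != "BMR_LATEST"],
              libraries, "ORD500"))
```

This is also why the library-count limit bites: every un-consolidated BMR library is one more entry on the search path.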

SSA's documentation says that release-2 includes everything that was in 
upgrade-1, but we found some stuff in upgrade-1 that was not in release-2.  
We never found out if it was supposed to go away or if this was an SSA 
oversight.  A tool that would be a big help would be something that can 
compare the contents of two libraries & say "Here are object names that exist 
in this library that do not exist in that library."
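The wished-for comparison tool is essentially a set difference over object names. A minimal sketch, with hypothetical object lists standing in for the two releases:

```python
def compare_libraries(lib_a, lib_b):
    """Object names in lib_a missing from lib_b, & vice versa."""
    return sorted(set(lib_a) - set(lib_b)), sorted(set(lib_b) - set(lib_a))

# Hypothetical object lists for the two releases discussed above.
upgrade_1 = {"ORD500", "INV300", "FIX123"}
release_2 = {"ORD500", "INV300", "SFC600"}

only_in_old, only_in_new = compare_libraries(upgrade_1, release_2)
print("in upgrade-1 but not release-2:", only_in_old)   # ['FIX123']
print("in release-2 but not upgrade-1:", only_in_new)   # ['SFC600']
```

The same two-way difference would have flagged the upgrade-1 objects that quietly vanished from release-2.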

Our library list also includes, in front of each release, copies of programs 
from earlier releases that were broken by those releases.

I could go on, but I think this is enough for a post here for now.

MacWheel99@aol.com (Alister Wm Macintyre) (Al Mac)
AS/400 Data Manager & Programmer for BPCS 405 CD Rel-02 mixed mode (twinax 
interactive & batch) @ http://www.cen-elec.com Central Industries of 
Indiana--->Quality manufacturer of wire harnesses and electrical 
sub-assemblies - fax # 812-424-6838

+---
| This is the Midrange System Mailing List!
| To submit a new message, send your mail to MIDRANGE-L@midrange.com.
| To subscribe to this list send email to MIDRANGE-L-SUB@midrange.com.
| To unsubscribe from this list send email to MIDRANGE-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---
