Hi Jamie!

> The subject is testing and I feel that I have a
> valid reason for promoting - sorry, discussing -
> our range of automated testing tools!

As I mentioned earlier, the physical amount of time required to traverse 5,000 displays, each with say 4 options and 4 more command keys, in all combinations is astronomical. An automated test tool would help take the drudgery out of the chore, but it won't help decide WHAT screens, data, and application flow are 'good enough' to say we're ready to certify that our application works with V5R2.

I'm more curious to see whether anybody on the list has a methodology for ranking a function/screen/option/what have you in terms of its 'criticality'. For instance, if my application uses a calendar function to help schedule service calls, that's pretty critical. If it uses a calendar only as a convenience, it's not as crucial.

So I'm not asking HOW to test an application; rather, I'm asking whether people have a formal scheme to determine what functions get tested.

--buck
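One common formal scheme for this is risk-based prioritization: give each function a business-impact rank and a usage-frequency rank, multiply them into a score, and test from the top down. Here is a minimal sketch of that idea; the function names, ranks, and the simple impact-times-usage scoring rule are all illustrative assumptions, not anything from the original discussion.

```python
# Hypothetical risk-based test prioritization sketch.
# Each function gets a score = business impact x usage frequency
# (both ranked 1-5); higher scores get tested first.
# All entries and weights below are made-up examples.

def risk_score(impact, usage):
    """Combine impact and usage ranks; a higher product means test sooner."""
    return impact * usage

functions = [
    # (name, business impact 1-5, usage frequency 1-5)
    ("calendar: schedule service calls", 5, 5),  # core to the business, used daily
    ("calendar: convenience lookup",     2, 3),  # nice to have
    ("month-end billing run",            5, 2),  # rare but critical
]

# Sort descending by score to get the test order.
ranked = sorted(functions, key=lambda f: risk_score(f[1], f[2]), reverse=True)

for name, impact, usage in ranked:
    print(f"{risk_score(impact, usage):2d}  {name}")
```

Under this toy scoring, the service-call calendar (score 25) outranks the billing run (10) and the convenience lookup (6), which matches the intuition in the post: the same calendar code gets a different priority depending on how critical its use is.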
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].