I think it has already been well articulated that code modernization won't
reduce resource utilization (CPU, memory, storage, etc.). It's more likely
that compute consumption will increase after a modernization.
One exception may be if your OPNQRYF definitions are not properly aligned
with your database indexes. In that case, moving to SQL and letting the SQL
Query Engine (SQE) do the optimization could improve performance a lot.
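To make that contrast concrete, here is a rough sketch (the file, field, and
key names are hypothetical, not from the original post) of a legacy OPNQRYF
selection and an equivalent SQL statement that the SQE can match against an
existing index:

    /* Legacy CL: record selection and sequencing via OPNQRYF        */
    OPNQRYF    FILE((ORDHDR))                                      +
               QRYSLT('CUSTNO *EQ 12345 *AND STATUS *EQ "OPEN"')   +
               KEYFLD((ORDDAT))

    -- Modernized: equivalent SQL; the SQL Query Engine (SQE) can
    -- use an existing index over (CUSTNO, STATUS, ORDDAT) here
    SELECT *
      FROM ORDHDR
     WHERE CUSTNO = 12345 AND STATUS = 'OPEN'
     ORDER BY ORDDAT

If the OPNQRYF selection and key fields don't line up with an access path,
the query engine may build a temporary one on every open; the SQL form gives
the optimizer a better shot at reusing an index you already have.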
The thing I like about the OP is that, at the core, Tommy is asking what
the "value proposition" of a modernization initiative might be.
In addition to an improvement in developer productivity, which has already
been asserted, one other area of focus could be the implementation of an
IBM i infrastructure which reduces or altogether eliminates the need for
Windows resources. Have you considered that?
For example, we implemented a utility which enables web browsers to attach
any number of PC files (documents, images, multi-media, etc.) to
practically any type of IBM i DB record, thus eliminating the need for a
separate document management system - most of which tend to be Windows-based.
Consider which applications you might be running on Windows servers. I
suspect that the company would save money by migrating them to IBM i.
But what sort of infrastructure might be required to do that? Does it help
for me to provoke thought?
This mailing list archive is Copyright 1997-2020 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact