Robert:
I have been following this thread with some interest.
So far, no one has suggested the use of "design reviews" to augment code
reviews.
Before any code is written, gather requirements, do some analysis, and
create an outline (typically using "pseudo-code") to describe the
planned design approach for the changes or enhancements to be made.
Then, review this document with peers, before any actual coding takes
place. Often, newer techniques can be suggested at this early stage,
saving the time otherwise "wasted" on writing "old fashioned" code,
only to have that uncovered in a "code review" with a recommendation
to re-write (at least parts of) that code.
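For instance, the pseudo-code outline for a small enhancement might
look something like the following. (A hypothetical sketch, written as
commented Python rather than RPG or shop pseudo-code, purely for
illustration -- every name and business rule here is made up.)

    # Design outline: apply a customer credit-limit increase.
    # Each numbered step would be reviewed with peers before coding.

    def apply_credit_increase(customer, requested_limit):
        # 1. Validate the request against basic business rules.
        if requested_limit <= customer["credit_limit"]:
            return "rejected: no increase requested"
        # 2. Check payment history (placeholder rule, not final logic).
        if customer["late_payments"] > 2:
            return "referred: manual review required"
        # 3. Apply the change and return the outcome.
        customer["credit_limit"] = requested_limit
        return "approved"

    print(apply_credit_increase(
        {"credit_limit": 5000, "late_payments": 0}, 7500))

A reviewer might challenge step 2 at this stage -- say, by suggesting
a call to an existing credit-check service program instead of a local
counter -- long before any production code exists.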
I have worked at professional software houses where this was standard
practice: first you would document the design and circulate that
design document to the peers who would participate in the "design
reviews." After each design review, if any significant issues turned
up, a (usually much shorter) follow-up review would be scheduled to go
over the amended design. Finally, once the overall design was
approved, coding would commence.
Next, a "code review" is scheduled to review the code (or possibly
detailed pseudo-code) to see if it conforms to "shop standards" and to
ensure it will meet the approved design objectives.
An important philosophy when using design reviews and code reviews is
"_egoless programming_", where the participants do not place any blame
(on the designer or programmer) and everyone agrees that the goal of the
process is to improve overall quality, not to find fault with others'
work. (We are all only human, and humans learn by making mistakes.)
The time invested in documenting the design requires developers to
thoroughly _understand the requirements_, and the design reviews often
turn up oversights (errors of omission), misunderstandings, and errors
of commission (coding errors), thus preventing costly errors further
downstream in the process.
Also, the code reviews are usually much smoother and shorter, since the
design was already reviewed and approved, and everyone has a good idea
of what is expected.
In really large shops, business analysts might be responsible for
gathering and documenting the requirements, and perhaps for doing some
of the initial "design," before handing the work over to "programmer
analysts" to continue the process. Also, in larger organizations,
there may be a separate Quality Assurance team of testers. The
business analysts who defined the requirements could also be involved
in developing "test cases."
(In "test driven design" test cases are developed along with the
requirements, so that during testing, it can be more easily determined
whether the code developed meets the stated requirements, by application
of rigorous testing.)
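For example, a test case captured right alongside the requirement
might look something like this (a hypothetical sketch in Python, for
illustration only -- the requirement and all names are invented):

    import unittest

    # Requirement (stated up front): orders of $100 or more earn a
    # 5% discount; smaller orders earn none.
    def order_discount(total):
        return round(total * 0.05, 2) if total >= 100 else 0.0

    # The tests are written with the requirement, so "does the code
    # meet the requirement?" becomes a mechanical check at test time.
    class TestOrderDiscount(unittest.TestCase):
        def test_discount_applies_at_threshold(self):
            self.assertEqual(order_discount(100.00), 5.00)

        def test_no_discount_below_threshold(self):
            self.assertEqual(order_discount(99.99), 0.0)

    if __name__ == "__main__":
        unittest.main()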
This all goes back to the classic "waterfall" model of software
development, illustrated below:
Requirements gathering
  Analysis
    Design
      Design review
        Code
          Code review
            Testing (Quality Assurance)
                Design "test cases"
                Unit Testing
                Systems Integration Testing
                User Acceptance Testing (if needed, e.g. for
                  "usability" issues, etc.)
                Regression Testing (to ensure existing functionality
                  is not negatively impacted)
              Management Approval
                Implementation into the "live" production
                  environment(s)
Note that from any of these steps, you can iterate back up to any
preceding step for "corrective measures" if problems are found. Thus,
it is an "iterative waterfall" model.
This detailed "waterfall" process model applies more to new
application design and enhancements, but a "scaled down" version of
the above can also be used for "bug fixes," with many of the same
benefits. For example:
Requirements
  Design (and review)
    Code (and review)
      Test
        Implement
At each "step" (indentation) in the above "waterfall" diagram, it is
estimated that the cost to correct any errors at that step are 10 times
(10x) the cost of detecting and correcting that same error in the
previous step. So, you can see that if a problem is not detected until
the Testing phase, it costs many times more than if that same error can
be caught much earlier (e.g. in a design review or code review.) And of
course, it is most expensive when a problem is not detected until after
it is implemented into "live" production use.
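To put rough numbers on that rule of thumb (the base cost of one
"unit" is arbitrary; only the ratios matter):

    # The 10x rule of thumb: a defect that costs one unit to fix at
    # the step where it was introduced costs ten times more for each
    # later step it survives into.
    steps = ["Design review", "Code review", "Testing", "Production"]
    cost = 1
    for step in steps:
        print(f"{step:<15} fix cost: {cost:>5} units")
        cost *= 10

So a defect that slips all the way into production costs on the order
of a thousand times what it would have cost to catch in the design
review.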
Nowadays, when "/Agile Programming/" is all the rage, it seems that,
in the rush to do things faster, many have forgotten (or /never
learned/) the lessons of the past 30+ years in Data Processing (DP),
Management Information Systems (MIS), and Information Technology (IT).
The "waterfall" process may seem to add (unnecessary) "overhead" to a
project but, when done properly, it results in /_improved quality_/
and an overall _/reduction in total elapsed time/_ to get from the top
of the waterfall (Requirements gathering) to the end result
(Implementation), while also helping to ensure that the desired
results are achieved (Quality).
What do you think?
Mark S. Waterbury
> On 11/16/2011 9:47 AM, RNewton@xxxxxxxxxxxxxxxxx wrote:
Thanks, everyone, for your input. I see the majority here are smaller
shops or one-man consulting firms, so I can see why no code reviews
are needed there. Our needs are a little different, however.
We are implementing coding standards in our shop and will be enforcing
those standards with regular code reviews. Our shop has 30+ RPG
developers, each with their own styles and techniques varying from
techniques considered old 15 years ago to some guys counting the days
until 7.1 gets installed to get DB2 ALIAS in externally described data
structures.
We have ever-expanding presentation technologies that our business
logic in RPG must be able to support. If we do not have code reviews,
we will continue to see business logic tied up in interactive programs
rather than pulled out to service programs for reuse from other
interactive applications, web pages, web services, stored procedures,
desktop applications, etc.
Modernizing a shop this size without code reviews and shop standards for
techniques and syntax styling would be impossible (as we have shown over
the years).
Thanks,
Robert Newton
Estes IT
System Architect
804-353-1900 x2256
rnewton@xxxxxxxxxxxxxxxxx