Monday, August 11, 2008

The dilemma of "good enough" in software quality

I recently finished some research on the cost and quality benefits of upfront, automated code and design review (i.e. before and during the development and build process) through static analysis. One of my conclusions was that the case for "good enough" is no longer good enough in delivering software. Some might argue that this viewpoint promotes static analysis as a means of perfecting something - "gilding the lily" - rather than as an essential tool for knowing whether something is "good enough".
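
To make concrete the kind of defect such upfront review catches, here is a minimal sketch of a single static-analysis rule - flagging bare "except:" clauses, a classic silent-failure pattern. The choice of Python, the helper name and the sample source are my illustrative assumptions; the research itself is not tied to any particular tool or language.

```python
import ast

def find_bare_excepts(source: str, filename: str = "<input>"):
    """Return (filename, line) pairs for every bare 'except:' clause."""
    findings = []
    for node in ast.walk(ast.parse(source, filename)):
        # An ExceptHandler whose 'type' is None is a bare "except:",
        # which silently swallows every error, including SystemExit.
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((filename, node.lineno))
    return findings

# Hypothetical sample source, purely for illustration.
sample = '''
def load_config(path):
    try:
        return open(path).read()
    except:
        return None
'''

for fname, line in find_bare_excepts(sample, "config.py"):
    print(f"{fname}:{line}: bare 'except:' hides failures - name the exception")
```

Real analysers apply hundreds of such rules, but the principle is the same: the defect is found by inspecting the code itself, before a single test is run.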

Not so. In many fields it is widely accepted that a rigorous quality or testing regime should not be applied without first understanding what is at stake, what the risks might be in the event of failure and the severity of those risks - and from there deducing the appropriate level of testing techniques, processes and tools to ensure that what is delivered is fit for purpose, or "good enough".

In software delivery, I would argue, too often the desire to deliver something quickly - especially to meet a deadline, however ambitious or unrealistic - overrides the key question of fully determining whether what is being delivered is actually "good enough".

The attitude of "good enough" has been hijacked as an excuse for sloppy attitudes and processes and a "let's suck it and see" mentality. Surely such an attitude cannot continue in the delivery of software that increasingly underpins business-critical applications?

It is, you could say, a problem of managing expectations. A one-star hotel probably offers adequate, "good enough" services for those on a budget, but these would not be sufficient for five-star luxury seekers. The key, though, is that customers of each know what they are getting for their money and whether it is fit for their purpose: there is a quantifiable means of grading what is delivered and matching it to what is expected.

Advances in software technology allow varying levels of sophistication and capability, and enable much more to be achieved. Software is becoming deep-rooted in every aspect of our modern lives, and new business models are being founded on applications and systems developed with many of these new technologies and approaches. If organisations restructure their working practices around applications and systems that rely on the new generation of communication and collaboration technologies, then failure due to poor code or application quality becomes even less acceptable.

The inclusion of rich media and visuals, the push for greater collaboration through the Internet, and unified communications for richer interactive social or work activities mean that any failure in such services would not only create higher levels of frustration - it would reduce productivity more sharply. On top of this, company brands become more easily exposed to damage.

Increasingly, when people buy or use software they expect, if not five-star performance, then performance and quality that is at least "good enough".

How many companies can rigorously examine their testing programmes and say that is what they are delivering?

If you set out to deliver something important, then applying the sloppy, suck-it-and-see "good enough" mentality just doesn't stand up.

What I find incredible after all this time - given the weight of evidence and eminent studies on the cost savings, and the growing complexity and importance of software in our modern lives - is that the "sloppy" mentality and attitude still hold such sway in software delivery processes.

Many organisations don't spend nearly enough effort on improving the quality of the software they produce. More often than not they pay lip service to the concept whilst secretly holding the belief that it is a waste of resources (time, staff and money). As I stated at the start, that belief couldn't be further from the truth. In a future blog and report I will examine and discuss the evidence in more detail.

Clearly there is more to this debate, especially as we place a greater reliance on software and grapple with the growing pressure to release new features and functions more quickly, both to differentiate in the market and to gauge users' reactions and acceptance.

Is it better to just get something out there that is of reasonable quality and worry about more deep-rooted bugs later, since it is a well-known fact that software is never flawless?

This is certainly the option taken by many in the business of delivering software, who have met with varying degrees of success - and, if you are on the receiving end, frustration, disappointment and loss.

In the end it boils down to one's interpretation of "good enough" and the answers to two intriguing questions: who and what should dictate what constitutes "good enough" software quality, and how do you go about governing it?

If you are looking to answer these questions you will need to adopt a combination of strategies:

- You will need the support of good processes. Static analysis is certainly one way of improving those processes efficiently: it is a tried and tested means of improving software quality and predictability and of lowering software costs. It is not, however, the only tool for software quality; it is part of an arsenal of tools and processes needed for a wider, more lasting and cost-effective approach to delivering quality software. (A minimal sketch of how such a check might gate a build appears after this list.)

- You will need a clear vision, and an understanding of the goals (business and technical), to establish whether something is actually "good enough".

- You will need to put in place a connected tools and communication framework to underpin and support your governance process.
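
As referenced in the first point above, here is a minimal sketch of how a static-analysis check might gate a build. It assumes the open-source pyflakes checker is installed and invoked from the command line; the threshold value and the gate function are my hypothetical choices, and any analyser with a similar command-line contract could be substituted.

```python
import subprocess
import sys

MAX_FINDINGS = 0  # the team's agreed, explicit definition of "good enough"

def gate(paths):
    """Run the analyser over 'paths'; return a process exit code."""
    # pyflakes prints one finding per line on stdout and exits non-zero
    # when it reports anything, so we collect and count its output.
    result = subprocess.run(["pyflakes", *paths],
                            capture_output=True, text=True)
    findings = [line for line in result.stdout.splitlines() if line.strip()]
    for finding in findings:
        print(finding)
    if len(findings) > MAX_FINDINGS:
        print(f"Quality gate FAILED: {len(findings)} finding(s) "
              f"exceed the threshold of {MAX_FINDINGS}.")
        return 1
    print("Quality gate passed.")
    return 0

if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("usage: gate.py FILE [FILE ...]")
        sys.exit(2)
    sys.exit(gate(sys.argv[1:]))
```

Run as part of the build pipeline (e.g. python gate.py src/module.py); a non-zero exit fails the build, making the team's definition of "good enough" explicit and enforceable rather than left to chance.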


Comments:
Your research reminds me of a column that I wrote about a decade ago about the "throw it over the wall" mentality prevalent in software development. Developers just don't want to be caught dead doing QA if they can move on to the next exciting thing, and they have little respect for those that do. Some things just never change.
 
Yes, you are right - some things never change. But then, that's probably why we continue to see the same problems and frustrations with delivered software applications and components over and over again.

Good development teams and organisations look to break the cycle. Hard as this might be, I do, however, believe that the benefits, in both the short and the long term, outweigh any upfront investment and resources applied to tackling the issue.

Of course there is a caveat: a carefully considered plan and strategy for making those investments and committing the resources must be followed. As with anything people look to change, a review of the existing environment to identify where the problems lie, along with a clear view of the goals, is important to ensure the most effective outcome.
 