Tag Archives: Quality

Linus Torvalds, Linux, and the Issue of Software Quality

Friend and Maemo/MeeGo bugjar master Stephen Gadsby alerted twitterites yesterday to a Fedora bugzilla flamefest, and at first blush it made for interesting comic relief.  Who doesn’t enjoy a good Internet argument?

But a second read sobered me up quickly.  The bug turned out to be an issue introduced into the crucial (and occasionally controversial) glibc code library by a change that doesn’t appear to have been sufficiently regression-tested.  The stated reason for the change is an execution-speed improvement, but it appears to have come at the expense of pre-emptive error-checking.
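I won’t reproduce the actual glibc change here, but a minimal C sketch shows the class of failure that pre-emptive checking prevents.  The naive_copy routine below is a hypothetical stand-in for a copy optimized on the assumption that its buffers never overlap; memmove is the standard routine that tolerates overlap:

    #include <stdio.h>
    #include <string.h>

    /* A naive forward byte copy, standing in for an optimized routine
     * that assumes source and destination never overlap. */
    static void *naive_copy(void *dst, const void *src, size_t n)
    {
        unsigned char *d = dst;
        const unsigned char *s = src;
        while (n--)
            *d++ = *s++;
        return dst;
    }

    int main(void)
    {
        char a[] = "abcdef";
        char b[] = "abcdef";

        /* Shift four bytes right by two; source and destination overlap. */
        naive_copy(a + 2, a, 4);   /* walks over its own source: "ababab" */
        memmove(b + 2, b, 4);      /* overlap-safe: "ababcd" */

        printf("naive_copy: %s\n", a);
        printf("memmove:    %s\n", b);
        return 0;
    }

The naive version quietly corrupts its own source data mid-copy, while memmove gets it right.  An optimization that drops such a safeguard can win every benchmark and still break real callers.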

Most people aren’t going to care about the technical reasons underlying the discovered bug.  Most will, instead, be concerned with its impact.  And that brings us to my reason for writing today.   Continue reading

Nokia and the Art of Fulfillment

The current analytical buzz about Nokia’s mind- and market-share issues tends to be pessimistic, presupposing that the company has no chance of reclaiming its former glory due to the unwitting tag-team onslaught of Apple and Android.  But this negative assumption arises from ignorance and forgetfulness.

It’s certainly true that customers have a stubborn inclination toward brand loyalty that can be difficult to unseat.  But Nokia was once on the positive side of that equation in areas where it now struggles (or has given up altogether).  What could keep it from returning to that former glory?

Nothing, actually.  Continue reading

A capacity for quality

About eighteen months ago a higher-than-usual employment bonus allowed me to finally purchase a large, flat-screen television.  Had the bonus been even higher I would have gone with a beautiful 40-inch Samsung LCD, but it was priced at four times what I wanted to spend.

So I settled on a very nice 42-inch Philips plasma that had been refurbished, halving its cost to $1000 USD.  Buying it was a gamble, as it came with only a 90-day warranty instead of the one-year coverage of a new TV.  The salesman wanted to add an extended warranty to the purchase but I declined.  He seemed to think I was an idiot, but I had an advantage not shared by the typical TV buyer: a background in electronics.

Continue reading

Reaction vs regulation

A strong free-market advocate I know was extolling the market’s virtues last year even as the dark clouds of economic apocalypse were forming.  Brushing aside the painful lessons of the 1980s Savings and Loan debacle, he was certain that everything would self-correct.

This same free marketeer is now mumbling something about “necessary regulations”.  I thought I had been zapped into Bizarro World when I first heard that blasphemy.

Yet the quality improvement community has long understood the value of regulatory tools.  We give the activities technical names like monitoring, statistical process control and auditing, but the end result is the same: develop your process, implement it, observe it, and then feed what you observe back into it for continual improvement.  This is officially known as “Plan-Do-Check-Act” (PDCA), or in its later refinement, “Plan-Do-Study-Act” (PDSA).
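To make that concrete, here is a minimal C sketch of the “Check” step: derive classic three-sigma control limits from a baseline run, then judge a new measurement against them.  All of the numbers are hypothetical, and a real statistical process control system would use proper control charts rather than this bare calculation:

    #include <math.h>
    #include <stdio.h>

    /* Derive three-sigma control limits from an in-control baseline run. */
    static void control_limits(const double *x, int n, double *lo, double *hi)
    {
        double mean = 0.0, var = 0.0;
        for (int i = 0; i < n; i++)
            mean += x[i];
        mean /= n;
        for (int i = 0; i < n; i++)
            var += (x[i] - mean) * (x[i] - mean);
        double sigma = sqrt(var / (n - 1));
        *lo = mean - 3.0 * sigma;
        *hi = mean + 3.0 * sigma;
    }

    int main(void)
    {
        /* Measurements taken while the process was known to be good. */
        double baseline[] = { 10.1, 9.9, 10.0, 10.2, 9.8,
                              10.1, 9.9, 10.0, 10.1, 9.9 };
        double lo, hi;
        control_limits(baseline, 10, &lo, &hi);

        /* Inside the limits: leave the process alone.
         * Outside: stop and act on the feedback. */
        double sample = 10.9;
        printf("limits [%.2f, %.2f], sample %.2f: %s\n", lo, hi, sample,
               (sample < lo || sample > hi) ? "out of control" : "in control");
        return 0;
    }

The point is the loop, not the arithmetic: measure, compare against limits, and feed the result back into the process.  That is regulation, whatever we prefer to call it.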

Continue reading

Write once, risk many?

As an information management type who firmly believes in the management aspect, I have always been attracted to projects that emphasize consolidating and publishing data in the most efficient ways possible.  My mantra has been “write once, read many”: shorthand for the need to control a single master source that can be replicated as far and wide as needed.  The immediate benefit, of course, is that it gets everyone in an organization “on the same page”.

I have been involved in many business operations where individuals and groups were allowed to control their own versions of master data, and the results can be disastrous: large lots of the wrong product built and shipped is one recurring nightmare that comes to mind.

But simply corralling the data is not good enough, and may in fact cause a different sort of harm.  If false data winds up in the master repository, the system now ensures that more people suffer the same mistakes.  I have seen this in configuration management architectures that do not match the business process.  In one case, a single bad data element accounted for 80% of the defects I discovered.
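The defense is to validate at the single point of entry, so a bad record is rejected before the master can replicate it to everyone.  Here is a minimal C sketch of such a gate; the record fields and limits are hypothetical, not drawn from any real system:

    #include <stdio.h>

    /* Hypothetical master record for a manufactured part. */
    struct part {
        char   id[16];
        double length_mm;
    };

    /* The single gate through which every write to the master must pass.
     * Rejecting bad data here protects all downstream readers at once. */
    static int validate(const struct part *p)
    {
        if (p->id[0] == '\0')
            return 0;   /* every part needs an identifier */
        if (p->length_mm <= 0.0 || p->length_mm > 1000.0)
            return 0;   /* dimension outside any plausible range */
        return 1;
    }

    int main(void)
    {
        struct part good = { "PN-1042",  250.0 };
        struct part bad  = { "PN-1043", -250.0 };  /* sign error at entry */

        printf("%s: %s\n", good.id, validate(&good) ? "accepted" : "rejected");
        printf("%s: %s\n", bad.id,  validate(&bad)  ? "accepted" : "rejected");
        return 0;
    }

One record rejected at the gate is far cheaper than 80% of your defects discovered downstream.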

Continue reading