Thursday, November 25, 2010

Payment problems at the NAB for 2 days


For 2 days the National Australia Bank has had problems processing customer payments and transactions, causing delays to EFTPOS, ATM and other electronic payments.



A similar follow-up failure occurred on 8 December, though the service disruption lasted only about an hour. See http://www.computerworld.com.au/article/370720/nab_electronic_banking_woes_strike_again/?eid=-6787&uid=19517

Friday, October 22, 2010

Certification is failing, do we need Professional Accreditation?


Recently a question was posed by a colleague: "Certification as a concept has lost its meaning... The future as I see it, in line with other professions, is not only to be certified, but accredited by some national body".

With volunteer lifesaving through Surf Lifesaving Australia, for the bronze medallion, for instance:
1. You must demonstrate pool swim proficiency (400m in less than 9 minutes)
2. You must complete the training course (about 5 days)
3. You must pass the written exam
4. You must complete a run, swim (400m ocean swim), run within the time limit
5. You must be assessed for practical components (rescues, lifts, resuscitation, first aid, observation, radio)

Then each year you have to redemonstrate proficiency, including:
1. The pool swim proficiency again
2. A recap on changes to procedures (about 1 - 2 days)
3. A cut-down version of the written proficiency exam
4. The run, swim, run again
5. The practical components again (criteria selected by assessors)

And you must maintain a minimum number of patrol (or water safety) volunteer hours per year.

Last year I did effectively 5 days of training. This year for proficiency, it probably took me another 2 - 3 days. The practical assessment alone was 0.5 days.

It is a lot for doing around 50 volunteer hours per annum, but they are very professional in their approach, which is one reason why we ultimately aspire to Surf Lifesaving Australia's values in our own company culture. Many of the skills are ones I may never have to use, but when I do need them I am confident I will get 90% of it right, and that will make a difference. The other key is that you work as a team, and that way you support each other and remind each other of learned skills; they focus a lot on communication. It comes out in assessment too: they encourage teamwork during assessment, and look for communication and for candidates reminding each other of procedures. That should be a goal in our work also.

So what is reasonable for someone doing this 1000 - 1500 hours per annum in their profession? Obviously if you are doing it every day, the training and assessment are quite different. Is revisiting 5 days of training and assessment each year appropriate? This would exhaust the average training budget industry currently provides (see our industry benchmark).

So why can we expect such a high level of training in lifesaving (which is based on volunteers), yet in a professional discipline, which also has a significant impact on people's lives and livelihoods, skills expectations are not at a similar level?

I think the training provided by SLSA is great. I find the time commitment a big challenge though, and sometimes the reasoning behind procedure changes is confusing.

So yes, I agree about stepping up to accreditation. Foundation level certification as the current bar is way too low, and the way much of the assessment is done is just pathetic. The outcomes are disappointing. With multiple-choice exams done online, I would guess that around 30% of answers come not from the assessee but from some other coach. I have seen this first hand with multiple-choice proficiency exams in other disciplines.

Certification is a step in the right direction, but if it fails to improve skills and competency outcomes, our best intentions may ultimately be threatened: certification will be seen as a farce and a failure, and our intentions will be reversed.

How to get influence???

It needs to be influenced via an industry body like the ACS, or a National Training Authority. Ultimately it needs commitment from employers that it makes a difference to outcomes and productivity.

I think there is a long way to go...

Monday, October 18, 2010

Faulty Speed Cameras Suspended

According to Computerworld, faulty point-to-point speed cameras in Victoria have been suspended after a fault was detected that had produced incorrect readings on nine occasions.

Deputy Commissioner Ken Lay said the cameras would not be switched back on until they had been thoroughly tested by independent assessors.

Sunday, September 26, 2010

Upgrades Cause Delay In Internet Banking


According to Computerworld, Suncorp has admitted that recent upgrades to internet banking have caused delays, while some customers have complained of loss of functionality.

Virgin Glitch Strands Passengers


According to ABC News, a crash of the check-in system at Virgin Blue has stranded thousands of passengers around the country.

Tuesday, September 7, 2010

Maritime Software Sinks


Customer Service Officers claim that a new registration software system for NSW boat owners is a failure (as reported in CIO).

The software has failed to send out licence renewal letters, has sent out incorrect expiry letters, and registration processing time has grown from 5 minutes to 30 minutes.

Wednesday, September 1, 2010

Jonathan "Peli" de Halleux

We have had the pleasure of a visit from Jonathan "Peli" de Halleux of Microsoft Research, who was the keynote for the Test Automation Workshop and gave a presentation to the ACS Special Interest Group on Software Testing in Brisbane.

Peli presented work he and his colleagues have undertaken on Pex and Moles. Pex and Moles are tools which support unit testing and stubbing in .NET applications.

Pex will generate Unit Tests directly from the code under test. Moles is used to dynamically stub out interfaces. Both these tools were quite impressive, and will truly accelerate unit testing activities.

You can try out Pex online in a virtual environment (no installs!) at http://www.pexforfun.com/, otherwise you can use the full version in Visual Studio.
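To give a flavour of the two ideas for readers who have not seen them, here is a rough Python analogue (Pex and Moles themselves are .NET tools used from Visual Studio, so this is not their actual API, and the function and class names below are invented for illustration). The first test exercises each branch of the code under test with concrete inputs, much as Pex generates inputs to cover execution paths; the second stubs out a dependency, which is the role Moles plays for .NET interfaces and static methods.

import unittest
from unittest import mock

def classify_triangle(a, b, c):
    # Code under test: a small function with several branches.
    if a <= 0 or b <= 0 or c <= 0:
        raise ValueError("sides must be positive")
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class TriangleTests(unittest.TestCase):
    def test_branch_covering_inputs(self):
        # Pex derives inputs like these automatically by analysing the code;
        # here they are written out by hand to show the end result.
        self.assertEqual(classify_triangle(2, 2, 2), "equilateral")
        self.assertEqual(classify_triangle(2, 2, 3), "isosceles")
        self.assertEqual(classify_triangle(2, 3, 4), "scalene")
        with self.assertRaises(ValueError):
            classify_triangle(0, 1, 1)

    def test_with_stubbed_dependency(self):
        # Moles detours calls to interfaces or static methods at test time;
        # a mock object gives similar isolation in Python.
        rate_service = mock.Mock()
        rate_service.get_rate.return_value = 1.5
        self.assertEqual(rate_service.get_rate("AUD", "USD"), 1.5)

if __name__ == "__main__":
    unittest.main()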

Saturday, April 3, 2010

How to be "Better" as a professional

I have just read "Better", by Atul Gawande. In his book, Gawande describes examples of surgeons' endeavours to improve their practice. The book is highly engaging, provoking thought with real-world examples and human compassion.

I have taken a strong interest in E-Health in recent years, and I see myself over the next decade or so exploring where my work in Quality Assurance can be applied to the Healthcare sector to reduce adverse outcomes. I can see elements of Total Quality Management and Deming applied in a clinical setting. Indeed, Gawande ponders the question "is medicine a craft or an industry?". It is a question the IT profession has also been asking of itself over the past couple of decades.

I thoroughly recommend this book to those aiming to improve professional practice. Even though it deals with a domain in which I have no clinical experience, only observations from the perspective of IT enablement, I found it inspirational.

In Gawande's afterword, he ponders how a surgeon may make a difference. His five propositions, I believe, are aspirations for any professional. Here they are:

1. Ask an unscripted question
Stepping outside of the professional script, both with the patient and colleagues. Here Gawande sees the value of the human relationship in building upon the professional relationship. After nearly 13 years in business I have customers that I count as close personal friends. Professional service requires trust, and through building the relationship and empathy, we become a little bit more reliant on each other to step up when needed.

2. Don't complain
Gawande points out we all have something to complain about, but doing so doesn't actually advance us, and it depresses us and the team. Look toward the positive. Look for what we can do to prevent these problems in the future.

3. Count something
In this book, Gawande gives numerous examples where measurement of individual and group practice has led to revolutionary advances. Examples he cites are battlefield surgeons in Iraq, Apgar scoring in obstetrics, and the comparison of Cystic Fibrosis centres across the US.

I reflected upon this idea a lot while reading the book. Indeed I am keen for us to measure our practice. In most cases this insight doesn't give me the answers; it leads me to ask more questions, but in the process I think we understand more of what is going on. Indeed I often quote Lord Kelvin: "when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind".

Gawande discusses how the Cystic Fibrosis Foundation is publishing the performance of individual centres, and how this may appear controversial, yet at the same time is essential to get us to do better and become a "positive deviant". I reflect on how this has recently been undertaken here in Australia with schools via the MySchool website.

Recently we have commenced the Software Testing Industry Benchmark, which aims to measure Software Testing practice across organisations. I think this is essential so that we understand our performance in relation to others. However we need to go further in our own company, measuring projects and individuals, not as criticism or punishment, but so that we strive to improve.

4. Write something
Gawande says this is how we contribute, and become part of a larger community. I agree. I find that contribution is a duty, a way to give back to the community and to the profession, whether it be a blog like this, a newsletter article, a quick presentation to colleagues at lunch, or a short idea to the staff mailing list. I find that laying down those thoughts also helps me to better understand what I am proposing, and in the process I gain greater insight.

5. Change
Here Gawande points to three adopter profiles: early adopters, late adopters and skeptics. [For more discussion of adoption profiles see also "Crossing the Chasm", by Geoffrey Moore.] He says we should seek to be early adopters, willing to recognise the inadequacies in what we do and to seek out solutions.

I think the above five points are useful concepts for us as Software Testing professionals to take into consideration too. They are certainly attributes that I personally will aspire to.

Wednesday, March 24, 2010

Business doesn’t see value in $900 million spend on software testing


Industry research recently released has shown that despite significant and increasing proportions of the IT budget being assigned to software testing and quality, businesses are not convinced of the investment value.

Dr Kelvin Ross, CEO of KJ Ross & Associates, who conducted the Software Testing Industry Benchmark study, said: “Part of the problem is the expectation that IT should just get it right. Why should you have to test if development just built it properly in the first place?”

Spending on software testing has increased to 22% of IT project spend. Annually this equates to an Australian enterprise spend estimated at over $900 million.

The proportion of IT project spending on testing will continue to increase. Higher reliability expectations, greater integration, larger projects and increased complexity are putting greater pressure on software quality, and more effort is required to check the software is right.

While Test Managers always want to spend more on quality, they perceive that Project Managers and IT Executives view the current spend as about right. Surprisingly though, the greatest pressure to reduce software testing spending comes from the business. Over 50% of respondents indicated that the business considers spending on testing too high.

The industry benchmark found business knowledge to be amongst the most sought-after skills in the testing team. Yet 30% of test managers found that customers and business users are not happy about being involved in testing. It remains a major challenge to develop and retain business skills in the IT testing team.

The research further showed over 20% of all defects found in testing were problems traced back to poor requirements. Yet these requirements problems are found very late in the project, often causing rework, slippages and delays.

“The key to improving software quality outcomes is engagement of the business”, said Dr. Ross.

“First, the business must be informed that software development is complex, and failures are inevitable. Then the business needs to work with testing and quality more closely to accurately define the user expectations, and trap failures early, thereby reducing downstream costs and project delays.”

The Software Testing Industry Benchmark study provides a detailed analysis of software quality and testing practices within Australian organisations, and covers aspects such as stakeholder perception, budgets, resourcing strategies and costs, adoption of tools and techniques, defect discovery profiles, and key performance measures.


Tuesday, February 9, 2010

400,000 Prius Recalled to Patch ABS Software

Toyota have officially recalled almost 400,000 Toyota Prius and Lexus HS250h vehicles.

An update to the ABS software is required to fix a problem where brakes have been reported to momentarily stop working as the braking system switches between regenerative and conventional braking.

Several accidents, but no deaths, have been attributed to the fault.

Toyota has always had a solid reputation for quality, showing leadership in the automotive sector. They were the pioneers of Lean Manufacturing and just-in-time manufacturing processes.

However this recall follows two other recalls covering 8 million vehicles (for slipping floor mats and sticking accelerator pedals), and has raised a media frenzy.


Thursday, January 28, 2010

MySchool strained under the load


It has been pretty much headline news all day today here: MySchool, which publishes comparative ratings for all Australian schools, has been hit by load issues as it went live today.

It was highly publicised, following the controversial debate regarding school performance comparisons using National Assessment Program Literacy and Numeracy (NAPLAN) test results.

The site suffered availability issues after go-live as the public started to pound the application. While the site was designed for 1.75 million hits per day, it was reported to have experienced 2.5 million hits between 1:00am and 10:00am this morning.
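As a back-of-envelope comparison using only the figures above (and assuming, for simplicity, that the designed daily load is spread evenly across 24 hours, which real traffic never is), the reported morning traffic works out to several times the designed hourly rate:

designed_per_day = 1_750_000
designed_per_hour = designed_per_day / 24             # roughly 73,000 hits per hour

reported_hits = 2_500_000
reported_hours = 9                                     # 1:00am to 10:00am
reported_per_hour = reported_hits / reported_hours     # roughly 278,000 hits per hour

print(round(reported_per_hour / designed_per_hour, 1))  # about 3.8 times the design rate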


Monday, January 4, 2010

Bank of Queensland's EFTPOS rolls over to 2016 rather than 2010

Thanks for pointing to this one as well Alistair (http://watirmelon.com/2010/01/03/software-quality-metrics/).

News.com.au reported that Bank of Queensland EFTPOS machines rolled over to 01 Jan 2016 rather than 01 Jan 2010. Retailers reported losses as they were unable to process electronic transactions.

As one commenter noted - it is a Y2.01K bug!
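One commonly cited explanation for this style of 2010 rollover fault in payment terminals (the Bank of Queensland root cause was not published, so treat this as an assumption rather than the confirmed diagnosis) is a mix-up between binary-coded decimal (BCD) and plain binary when decoding the two-digit year. A minimal Python sketch of that assumption:

def year_as_bcd(two_digit_year):
    # Encode a two-digit year as BCD, e.g. 10 -> 0x10.
    return ((two_digit_year // 10) << 4) | (two_digit_year % 10)

raw = year_as_bcd(10)                                  # the terminal stores 0x10 for "10"
wrong_year = 2000 + raw                                # read as plain binary: 0x10 == 16 -> 2016
right_year = 2000 + ((raw >> 4) * 10 + (raw & 0x0F))   # correct BCD decode -> 2010

print(wrong_year, right_year)                          # 2016 2010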