
Birds love BUGS

Here are the stats. I raised 2500 bugs in 12 years, then moved to a company where I raised a thousand bugs in only 19 months.
Presuming that a typical year has 252 working days, this gives me a rate of about 2.5 bugs per day, or 12 per week (compared to an average of 0.8 per day, or 4 per week, during the previous 12 years).

That means the rate of identified defects has increased by a factor of 3.
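For the curious, here is a quick back-of-the-envelope check of that arithmetic as a small Python sketch (the 252 working days per year are the assumption stated above; everything else follows from the quoted figures):

    # Back-of-the-envelope check of the bug rates quoted above.
    WORKDAYS_PER_YEAR = 252  # assumption from the text

    # Previous job: 2500 bugs in 12 years
    old_rate = 2500 / (12 * WORKDAYS_PER_YEAR)       # ~0.83 bugs per working day

    # Current job: 1000 bugs in 19 months
    new_rate = 1000 / (19 / 12 * WORKDAYS_PER_YEAR)  # ~2.5 bugs per working day

    print(f"old: {old_rate:.1f}/day, {old_rate * 5:.1f}/week")
    print(f"new: {new_rate:.1f}/day, {new_rate * 5:.1f}/week")
    print(f"factor: {new_rate / old_rate:.1f}")      # ~3.0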

What do these numbers tell us about me, the software under test, or the company, and what do they tell us about the developers who introduced these bugs?
Do these numbers really have any meaning at all? Are we allowed to draw a conclusion from these numbers without knowing the context? We don't know which of these bugs were high priority and which weren't. We don't know which bugs were duplicates or false alarms, and which of them look more like they should have been raised as change requests.
We also don't know what the philosophy in the team is. Do we raise every anomaly we see, or do we first talk to the developers and fix issues together before they make it into a bug reporting system? Do we know how many developers are working in the team? How many of them really work 100% in the team, and how many work less, or only sporadically? Also, does management measure the team by the number of bugs introduced, detected, or solved, or by completed user stories? Could the high number of identified issues be a direct effect of better tester training, or are the developers struggling with impediments they can or cannot be held responsible for, so that these bugs are just a logical consequence of those impediments? Are there developers who introduce more bugs than others?

As it is with such numbers: they are important, but they serve only as a basis for further investigation. It is too tempting to use these numbers as they are and draw one's own conclusions without questioning them.

(Source: Simply the Test)

Checking the Cloud

...or "Hi, just wanted to see how my data looks like in the cloud".

...and here is how the first draft looked:

(Source: Simply the Test)

Deserting the Antarctic

Contribution to D. Good luck!

(Source: Simply the Test)

A beautiful Feature

...

(Source: Simply the Test)

End of holiday season

When developers come back from holidays...

(Source: Simply the Test)

ISTQB® releases new Certified Tester Foundation Level 2018 (CTFL) Syllabus

The ISTQB® General Assembly has approved the new 2018 version of the ISTQB® Certified Tester Foundation Level syllabus for general release. Foundation Level is core to the ISTQB® Certified Tester Scheme, providing essential understanding and knowledge to anyone involved in testing. The updates reflect market feedback and the current state of the software testing industry.

The release consists of the ISTQB® Foundation Level 2018 syllabus, an Overview document, Accreditation guidelines, Glossary items and terms, Exam Structure and Rules, and Sample Exams. Read more here.

(Source: ISTQB)

Welcome to Weirdo-Land

I just wanted to draw a classic Porsche 356 Speedster from the Fifties. Usually I am not that into sports cars, but this one is an exception. I watched an interesting report by Jay Leno about a 356 replica built by JPS Motorsports, and since then I can't take my eyes off it.

(Source: Simply the Test)

Baffled sequenceIDs

Sorry folks, this one is an insider. I don't know how to better generalize the cartoon so that it is also funny for those who haven't experienced the exciting moments of our love-hate relationship with sequence IDs. The counter was reset after each deployment, which resulted in duplicate numbering and kept us busy hunting ghost bugs.
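To make the failure mode more tangible, here is a minimal sketch of the problem (hypothetical code, not our actual system): a counter that lives only in process memory starts again at 1 after every deployment.

    # Minimal sketch of the ghost-bug scenario: an in-memory sequence counter
    # is recreated on every deployment/restart and starts counting again at 1.
    class InMemorySequence:
        def __init__(self):
            self._counter = 0

        def next_id(self):
            self._counter += 1
            return self._counter

    seq = InMemorySequence()
    first_batch = [seq.next_id() for _ in range(3)]   # [1, 2, 3]

    seq = InMemorySequence()                          # "deployment" restarts the process
    second_batch = [seq.next_id() for _ in range(3)]  # [1, 2, 3] again

    # Records from different runs now share IDs -> duplicate bug numbers
    assert set(first_batch) == set(second_batch)

A counter persisted outside the process (for example, a database sequence) would have avoided the duplicates.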

(Source: Simply the Test)

No undo in the elevator

A nice test pattern to apply while testing software is to revert all changes performed on one or more objects, either by sending CTRL-Z as often as possible or by using whatever cancel/revert/undo operation is provided. An interesting observation for me is still the fact that, to this day, I have never been in an elevator that offered the possibility to cancel a pressed button... resulting in the experience of real "pain".
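As a rough illustration of the pattern, here is a sketch against a toy object (the Document class below is invented for the example; in practice you would drive the application's real undo mechanism, e.g. by sending CTRL-Z):

    # Undo-until-initial-state test pattern against a toy editable object.
    class Document:
        def __init__(self, text=""):
            self.text = text
            self._history = []

        def edit(self, new_text):
            self._history.append(self.text)  # remember previous state for undo
            self.text = new_text

        def undo(self):
            if self._history:
                self.text = self._history.pop()

    def test_undo_reverts_all_changes():
        doc = Document("initial")
        snapshot = doc.text

        for i in range(10):      # apply a series of changes...
            doc.edit(f"revision {i}")
        for _ in range(100):     # ...then undo as often as possible
            doc.undo()

        assert doc.text == snapshot  # object is back in its initial state

    test_undo_reverts_all_changes()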

(Source: Simply the Test)

Ready to take off – or good enough to go live?

Many, many years back, when I drew the first version of this cartoon on a piece of draft paper, I was inspired by an email sent out to everyone claiming the software was good enough to go live, although it had never gone through proper testing. Only "A." had tested it; and since A. was such a great techie, we felt quite comfortable as long as he had looked into it. This attitude became more and more established, until we stopped reflecting on the fact that even for "A." it was simply not possible to deal with all the bugs that increasingly showed up at customers. We had to change our mindset of relying on exploratory testing only, and therefore set up a team that followed a more planned approach to testing the software. This included studying the varying characteristics of the different customers' most important workflows, and it resulted in testing these workflows with the goal of mitigating the risk that they would be broken in the next release.
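In today's terms, that planned approach boils down to something like one parameterized regression check per key customer workflow. A sketch with invented names and steps (not our actual suite):

    # Risk-based regression checks: one test per key customer workflow.
    # Workflow names and steps are invented for illustration.
    import pytest

    CUSTOMER_WORKFLOWS = {
        "customer_a": ["import_order", "approve", "ship", "invoice"],
        "customer_b": ["import_order", "ship", "invoice"],  # no approval step
    }

    def run_workflow(steps):
        # Placeholder for driving the real application end to end.
        return all(isinstance(step, str) for step in steps)

    @pytest.mark.parametrize("customer,steps", CUSTOMER_WORKFLOWS.items())
    def test_key_workflow_still_works(customer, steps):
        assert run_workflow(steps), f"workflow broken for {customer}"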

(Source: Simply the Test)