Monday, 28 January 2008

Book Review: Software Testing by Ron Patton

This review actually covers the 1st edition, and not the current 2nd edition.
I read this a long time ago - made my notes and have subsequently lost them. So I start again.
My basic memory from the last reading was: "a good book for beginners". So I'll see what a second reading does for me.
[amazon.com][amazon.co.uk]


The first section of the book gives a basic introduction to software testing based on pragmatism.
While part 1 does have a section on the "Realities of Software Testing", the section "What exactly does a software tester do?" gives what I consider unwise advice in the form of:
"The goal of a software tester is to find bugs, find them as early as possible, and make sure they get fixed."
I would not state a definition like this to beginner testers as I know from experience what kind of behaviours it will drive into overly zealous young testers. Certainly our goals include finding bugs, yes preferably as early as possible, but "making sure they get fixed"? When a junior tester takes 'making sure they get fixed' rather than a 'bug advocacy' or 'information providing' role, they end up acting as 'gatekeepers of quality'.
So I hope the beginner tester will not take this definition fully to heart. After making this statement Ron then goes on to temper it slightly with one of his tester attributes stating '(mellowed) perfectionists' (not quite the 'make sure they get fixed' attitude) and then we learn that "Not all bugs you find will be fixed". But a slight tempering does not fully mitigate the repeating of this phrase throughout the book.
I find it entertaining, and yet saddening, that one of my biggest problems with the book stems from the six words "...but making sure they get fixed", but then testing does attract pedantic monsters (not one of Ron Patton's attributes of a good tester).
And so... moving on... I do hope beginner testers learn from the 'Realities of Software Testing' section, as Ron has built it up from experience.
The Testing Fundamentals section provides useful approaches to testing with, and without, specifications, including:
  • Perform high level reviews
  • Build your own 'spec' which tells people what you plan to test
  • Use an ambiguity checklist
  • Don't forget to test error messages
The fairly short dynamic black box section packs a lot into its 26 pages. For example, the data boundary discussion reads as more pragmatic and thorough than most beginning tester books manage, and it provides a small set of classes to look for when hunting other boundaries: slowest/fastest, shortest/longest, empty/full, etc.
I will have no hesitation in recommending this section to a beginner tester as it provides a broad coverage very quickly and I wish I had had it available to me when I started out.
I found part III, 'applying your testing skills', the weakest section. Hopefully testers can generalise from it and apply some of the lists as heuristics. The configuration chapter goes into more detail than most testers will use, but I hope testers can generalise from it and apply the configuration lessons presented here to more than just hardware. The usability chapter will hopefully encourage testers to keep their eyes open as they test for items beyond the specification and requirements. I don't think the overly basic web testing chapter will encourage testers to go off and read something like "How to break web software" [google vid] [amazon.com][amazon.co.uk].
Part 4 consists of two chapters, one on automated testing and the other on beta testing. The automated testing chapter hints at tools that I don't see many testers using (monitoring software), so positive marks for that, and Ron tempts the reader with a tool called "Koko the smart monkey", a little unfairly as the tool doesn't appear downloadable or available anywhere online that I could find. So while the chapter offers possibilities, it provides only a very basic overview of automation. Similarly, I found the beta testing chapter too sparse.
Chapter 16 starts with the important distinction between 'planning your testing' and 'writing your test plan'.
"The ultimate goal of the test planning process is communicating (not recording) the software test team's intent, its expectations, and its understanding of the testing that's to be performed... Of course, the result of the test planning process will be a document of some sort."
This chapter combined with James Bach's Test Plan Building Process and Test Plan Evaluation Model should put a junior tester in a good position to construct a good test plan that communicates their intents and concerns.
The Test Case Chapter (17) provides the splendid advice of
"[test case] level [of detail] is determined by your industry, your company, your project, and your team."
In other words, your context.
"It's unlikely that you'll need to document your test cases down to the greatest level of detail..."
I consider this good advice for testers to take: treat the level of detail required for your documentation as a negotiation with the project to meet its needs.
Chapter 21 embeds a lot of information and web links in the "Career as a Software Tester" chapter, an unfortunate decision when an up-to-date web page would support the reader better, but neither the book's page (on informIT) nor the author seems to have a major up-to-date web presence.
So... I come to the end of the book with mixed feelings. I like parts 1, 2, and 5, but was less keen on parts 3, 4, and 6. Fortunately I consider parts 1, 2, and 5 the most important. I did not see anything in the book that would irreparably damage a beginner tester, so I can recommend Software Testing by Ron Patton to the beginning tester, mainly because of its down-to-earth tone and hints-and-tips approach.
Test the book yourself by reading excerpts on the informIT site.
[amazon.com][amazon.co.uk]

Thursday, 24 January 2008

Meet your USB app launching needs with PStart

I carry a lot of applications around with me on my USB memory stick for use in impromptu testing situations. I need a tool to help me manage them, and I use PStart from Pegtop Software to do that. If you haven't used PStart before, or haven't dabbled in the world of portable software, then this review could help you on your way.


The freeware PStart from Pegtop Software helps me launch applications from my portable drive.
PStart acts as the 'start menu' for my 'on the road' portable application armoury stored on my USB memory stick.
Once loaded, PStart creates an icon in your system tray.
pstartInToolbar
In normal use, a left or right click on the PStart icon will show a popup window, much like the start menu, where you can launch your portable applications.
pstartPopUpMenu
You can create menu items (called groups), sub menu items (groups within groups), links to folders on the disk or portable drive, and links to applications themselves.
PStart can install as a portable application (as documented in the Pegtop FAQ) so no stray files end up on your main system.
I have file links set up to launch Java applications using JVMs installed on the drive, with multiple JVMs installed for varying degrees of compatibility.
All links in PStart get created relative to the PStart application, so it doesn't matter what drive letter your portable drive gets assigned.
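This relative-link behaviour can be illustrated with a small sketch (the paths are invented for illustration, and PStart's actual storage format is not shown here):

```python
# Illustration of why relative links survive drive letter changes:
# resolve a stored relative link against wherever the launcher lives now.
# ntpath applies Windows path rules regardless of the host platform.
import ntpath

launcher_dir = r"E:\PortableApps\PStart"                      # hypothetical
stored_link = r"..\Tools\ResourceHacker\ResHacker.exe"        # hypothetical

# Joining and normalising gives the absolute path on whatever drive
# the stick happens to be mounted as today.
resolved = ntpath.normpath(ntpath.join(launcher_dir, stored_link))
print(resolved)  # E:\PortableApps\Tools\ResourceHacker\ResHacker.exe
```

If the stick mounts as `F:` tomorrow, only `launcher_dir` changes; the stored link still resolves correctly.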

Configuring PStart

Double-click the PStart icon and you get the config screen.
pstartConfig
Add applications by:
  • dragging and dropping the .exe file into the menu hierarchy
  • choosing 'Add File' from the edit menu
  • right clicking and choosing 'Add File' from the popup menu
  • or scanning the drive and having PStart automatically create a list of applications, which you can then organise into groups.
Once you have added a few applications, you can drag the icons around and organise them into groups. The groups show up as menus with sub menus in the popup menu from the system tray.
A simple configuration process means that a little thought about how you want to organise the files into groups will increase your productivity with your portable drive.

What else?

If you forget which group you put an application in then the handy Search function comes in useful.
pstartSearch
A double click on the returned items launches them.
The Notes tab provides a very handy StickyPad of free format notes for reminders that you want to carry around with you.
pstartNotes
Each note can have a long text description, you can give them different colours, and you can even add a time and date to get reminded about them.
pstartNoteEditing
The reminders appear as little floating dialogs in the corner of your screen:
pstartReminder
And the Info tab tells you a little about your portable drive, so you can see whether you require an application purge exercise, or whether you have enough space left to install that cool application you just spotted.
pstartInfoTab
Recommended Next Steps:
  1. Visit Pegtop Software
  2. Download PStart
  3. Sign up for the Pegtop newsletter
  4. Visit the portable apps list on Pegtop

Monday, 21 January 2008

Agile Acceptance Testing

Some notes and links on Agile Acceptance Testing. First the links.
Read the online Acceptance Testing chapter from "Test Driven - Practical TDD and Acceptance TDD for Java Developers" by Lasse Koskela.
Read Brian Marick's article series Agile Testing Directions.
And now the notes:


I haven't yet read "Test Driven - Practical TDD and Acceptance TDD for Java Developers" [amazon.com][amazon.co.uk]. I have, however, read the free online chapters and have already sent the URLs of those chapters to people I work with to introduce them to TDD and Acceptance Testing principles. I will eventually buy the book to study. (I don't want to overload myself by reading this when I haven't yet finished working my way through Agile Java [amazon.com][amazon.co.uk].)
This chapter presents a very readable and detailed guide to the practice of acceptance testing on an Agile project and where and how to do it within the iteration.
Brian Marick wrote an article series called Agile Testing Directions in which he refers to 'acceptance tests' as 'checked examples'.
To me, the phrase 'checked example' implies that acceptance tests represent an agreed 'good enough', 'broad enough' base set of tests that we can reuse. We need to do further investigation to determine our confidence in their 'checkedness' and 'comprehensiveness'. So we supplement these checked examples with additional user testing/exploratory testing/functional automation/unit tests etc. I think Scott Ambler alludes to this in this Dr Dobbs article.
Acceptance tests represent examples from the customer, so we want the customer to understand the test, and we have to write the test in a way that supports us communicating it to the customer. Preferably documented so that the customer can read and understand the test without help, and, in their most idealised form, written using a tool with which the customer can amend and create the tests.
But ignore any fancy abstraction approaches until you get a set of tests that the customer can understand. This allows the customer to agree that the tests implement their basic coverage acceptance examples. Over time you can amend the tests to write them in a more customer friendly way, as you learn more about the domain and the customer's preferred representation style.
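As a sketch of what a customer-readable, table-driven acceptance test might look like before any fancy abstraction (the discount domain and all names here are invented for illustration, not taken from the book or articles):

```python
# A minimal table-driven acceptance test sketch. The examples read as
# plain rows the customer can review and amend; the runner just checks
# each row against the system under test.

def discount(order_total):
    """Stand-in for the system under test: 10% off orders of 100 or more."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

# order_total, expected_discount -- the customer's agreed examples
examples = [
    (99.99, 0.0),
    (100.00, 10.0),
    (250.00, 25.0),
]

for total, expected in examples:
    actual = discount(total)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: order {total} -> discount {actual} (expected {expected})")
```

The point is not the runner, which stays trivial, but that the examples table is something the customer can read, agree to, and extend on their own.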
Then comes the fun part. In addition to examples, identify any risks. Identify things that you think you might have concerns about but don't yet see the value in formalising into an automated acceptance test.  Think through the 'other' questions that you have about the stories because they can help you build your starting charters for your exploration of the implementation.

Saturday, 19 January 2008

5 Exploratory Test Documentation Lessons

eurostarNoteBookPage
While in Stockholm for the EuroSTAR 2007 conference I managed to conduct testing on a public booth, and have collated some simple lessons on exploratory test documentation.


I read James Bach's post on Amateur Penetration Testing a few weeks before going off to Stockholm for the EuroSTAR 2007 conference. While there I managed to recall some of his techniques while using a few of the free test training booths provided by the Stockholm authorities in their fair city.
Michael Bolton gave a talk about his Tester's Notebook, from which I gleaned a few tips on effective notebook usage.
Lessons from both Michael and James led to the production of this post.
While reviewing my notebook pages covering my time in Stockholm, I found my notes on some booth exploration where I found a vulnerability.
I include those notes here to try and illustrate a few lessons about exploratory test documentation.
Lesson one: Develop better handwriting than I have so you can read your notes at a later date.
logNotes
Note: I made these notes @ Eurostar, after I conducted the testing. The title "Eurostar" does not mean that I conducted the testing @ Eurostar itself; it tells me where I sat when I wrote the info. I have not included the name of the venue hosting the booth, just in case the owner of the venue hasn't fixed the problem. I did raise a defect report. I left it in their suggestion box.
Lesson two: write down what you did
This scrawl tells me the order I tried to do things:
I tried to get hold of a PDF and use the download dialog, save dialog, or some other dialog on the screen to access the file system. But no luck: unresponsive PDF links, and I could not find a way to access them (so many unresponsive file types - zip, doc, EVERYTHING seemed locked down - so I stopped trying that attack).
I tried a few shortcut keys that I know, but none of them caused any visible effect that I could figure out how to exploit.
I used the Shift+Alt+PrintScreen key combination that James mentioned in his blog post (which I didn't know about until I read it there), and that created an interesting display, but again nothing that I could figure out how to exploit.
And then.... "E"... well I didn't even finish writing it as a word because a diagram seemed more appropriate.
Lesson three: use diagrams, and don't worry about the formality
graphicLogNotes
This booth had a little icon on the top right which took me to the manufacturer's site - great. I found support forums and manuals there, so I had a quick browse around for any info that could help me, and I read a whole bunch of useful hacking info about config files and key shortcuts I could enable. But first I had to get to the file system, and I had not figured out how to do that.
But wait a minute... the manufacturer has a .exe download link, and when I click on that I get a file save dialog. And as soon as a file browse dialog gets displayed, I can access the file system. And then the opportunity to exploit becomes available. So at that point I reported the vulnerability.
So much for the self promotion of a secure booth manufacturer.
Lesson four: Make notes during the session.
Lesson five: If you don't make notes during the session - make them as soon as you can afterwards.
Fortunately I had a very short testing session and could retain it in memory until I managed to write it down.

Friday, 18 January 2008

Resource Hacking for Beginners

I'm going to introduce you to the Win32 testing tool that I use for looking at application resources, and which I used to find Mercury Screen Recorder secrets.


My first tool of choice for viewing the resources in applications is called...Resource Hacker.
Resource Hacker (TM) is a freeware tool for viewing the resources embedded in a Win32 executable.
I find this to be a handy tool for checking what hidden secrets await me if I test the application hard enough:
  • What error messages have I not triggered?
  • What icons and pictures have I not seen?
  • What general strings have I not encountered?
A handy little tool to have on your USB stick.
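Resource Hacker parses the actual resource tables; as a rough stdlib approximation of the "what strings await me?" question, a 'strings'-style scan pulls printable runs out of any binary. This is a sketch, not a replacement for Resource Hacker, and it won't decode resource structures:

```python
# Crude 'strings'-style scan: extract runs of printable ASCII from a
# binary file. Useful for surfacing error messages and labels you may
# not have triggered yet while testing.
import re
import sys

def extract_strings(data, min_len=5):
    """Return runs of printable ASCII characters at least min_len long."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

if __name__ == "__main__":
    # e.g. python strings_scan.py some_app.exe
    with open(sys.argv[1], "rb") as f:
        for s in extract_strings(f.read()):
            print(s)
```

Short runs of bytes that happen to be printable will produce noise, which is why a minimum length threshold helps.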
I also have the slightly more dangerous XN Resource Editor which, as its name suggests, allows you to edit the resources - I haven't found a good reason to do that when testing, though, so I haven't used it as much.

Wednesday, 16 January 2008

Mercury Quality Center in "It Supports Exploratory Testing" Shock

I can't believe I'm about to promote Mercury Quality Center, particularly in regards to exploratory testing, but here goes...


Mercury appear to have licensed Blueberry Consultants Ltd technology into Mercury Quality Center as part of their Mercury Screen Recorder.
This is great news for any tester with Mercury Quality Center installed at work, as the Screen Recorder is an add-in that you can download from the Add-ins page (a submenu of the Help menu).
I had a quick look at the resource information in the MSR Recorder.exe file and saw a mention of BB FlashBack, so I'm not sure which functional suite has been licensed from Blueberry Consultants. BB FlashBack might just be the default name from which all components are built; certainly it looks like BB TestAssistant to me.
Here is the resource information that I found:

<description>BB FlashBack</description>
<assemblyIdentity
    version="1.0.0.0"
    processorArchitecture="X86"
    name="BB FlashBack"
    type="win32"
/>
The icon sets seem to suggest that it is BB TestAssistant. Below is your chance to play spot the difference.
image image

So how can you use this for exploratory testing?
The default use of the software is to create a defect in Quality Center from the movie.
image
But if you choose to 'Edit the movie before submiting the defect to quality Center' (yes, I know there is a spelling error there, but it is present in the dialog itself), then you get access to a more common BB Flashback/Test Assistant view:
image 
And from here you can save, export, edit, annotate, etc.
I'm happy to have found this, but I'm kicking myself for only discovering it now. So the main reason for blogging this? You have no time to waste if you have Quality Center in-house - start using the Screen Recorder now.
Obviously this is a more expensive way of getting the BB recording technology on your computer but, as I say, if you already have TD installed (I found this using version 9 - I'm not sure whether the version 8 recorder is based on this technology) then you get access to one of the most commonly promoted exploratory test tools.
I installed the add-in from Quality Center using the Add-ins submenu of the Help menu; on the install at work, I had to go to the 'More Addins' section.
Now... at least to the point of helping me record testing sessions... Mercury Quality Center can support me in my exploratory testing sessions.