Archive for May, 2010

What is AU2H? (and why I cared)

May 27, 2010

Agile Up to Here: an experience report

If you haven’t heard the term “Agilistry”, don’t worry: it’s not a new development methodology you have to learn to stay current. But there is a good chance you will be hearing more about it.

Agilistry is the name of a training space in Pleasanton, CA, opened by Agile luminary and long-time software development consultant Elisabeth Hendrickson.  Known for her immersive, practical software development exercises, she created the space for software professionals to learn the “true spirit of Agile software development.”

Last week, I had a chance to see if her studio lived up to her claim of “a place where Agile software development professionals come to sharpen their saws and practice their craft.”

I’ve known Elisabeth since 2000, when she came to Satisfice, the company (and training space) my brother created in 1999 to give testers a chance to practice their craft.  Ten years later (and partly inspired by her experience at Satisfice), she has turned the tables, inviting me to see her space in action.  Actually, I was just one of 11 guests summoned to Pleasanton to see what she had in mind for her workshop idea called “Agile Up to Here” (search #au2h on Twitter for threads).

As Manager for Corporate Intellect here at Quardev, part of my job is to put myself in places that maximize my ability to learn new things about software so we can stay competitive. Agile principles and practices keep coming up on more and more of the projects we are asked to bid on.

When she invited me, my main concern was what value I would add to an Agile workshop.  In my experience, Agile was about programmers doing all of the testing, and I’m not a programmer.  Agile proponents also seem to imply that there are no defined roles for testers because developers cover all the testing through unit and acceptance tests.

I expressed this concern to Elisabeth and she was adamant.  “Not only is exploratory testing part of Agile, it is a crucial component of it. You are required to be here.” That made me feel better.  I trusted Elisabeth because she had demonstrated that, although a fervent fan of Agile, she hadn’t lost her passion for testing.

I’m not a newbie to Agile, but there are tons of people who know a lot more about it than I do. Sure, I’m familiar with the Agile Manifesto and know about story cards, backlogs, refactoring, sprints, Scrum boards, big visible charts, and Test-Driven Development.  I was also a stage producer at the Agile2008 conference in Toronto, hosting the “Questioning Agile” track, and I have worked as a test manager on projects that used facets of Agile.

At Agilistry last week, I was first to arrive (a bag of Seattle coffee in hand to brew for the crew) and found Elisabeth setting up.  There were 7 pairing stations, a big rolling whiteboard, index cards of every color everywhere, a few small couches to sit on, a wall monitor for the Hudson continuous integration server to advertise its results, a small fridge and sink area, a printer, a wireless network… and that was about it.  A pleasant space in Pleasanton, not over-complicated, but resembling what the Agile conventions suggest – no cubes, no walls, maximized for pairing, transparency, and communication.


Leading up to the workshop, there had been a wiki for us to get to know each other, post our bios and expectations, take advantage of the Twitter hashtag (#au2h), etc., but as people arrived, it wasn’t clear to me what our mission was. 

We had our first stand-up – introductions. Everybody was a programmer except for me.  Just as I had feared, I was sure I was going to be made obsolete, but I trusted what Elisabeth had told me: that I was a required component, that I would add value by being there.


Alan Cooper of Cooper Interaction Design, author of The Inmates Are Running the Asylum and About Face, told us the mission: he was a word nut.  For years he had collected homophones – words that sound alike but are spelled differently and mean different things (e.g. ere, air, and heir). He had a website that listed some of his collection, but a lot of it was tucked away on his hard drive.  Furthermore, his site was old – vintage 1997, Web 0.5 (not even 1.0) – and the list was hardcoded HTML.

As Product Owner (not designer), he gave us one main objective: “Get me out of 1997!”

He didn’t elaborate much beyond telling us what homophones were, but he gave enough of an introduction for me to get the gist: we would be building a site for him from scratch in these 5 days. I love challenges like that, especially when they are authentic – a real problem for a real person.  Abstract exercises can be fun, too, but I’d much rather provide value to a real person.

Part facilitator, part host, and part programmer, Elisabeth announced that she would need some help configuring the machines.  In seconds, she got two of the programmer-types to volunteer – Pat Maddox and BJ Clark helped her configure the pairing stations with the tools we needed: Hudson CI, GitHub, RSpec, Ruby on Rails, and Cucumber.

BJ Clark and Pat Maddox
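For readers who haven’t seen that stack: Cucumber describes behavior in plain-language features backed by Ruby step definitions, RSpec exercises the code underneath, Git/GitHub holds the source, and Hudson builds and tests everything on each push. As a rough sketch – and only a sketch; the route, the wording, and the example words here are mine, not the team’s – a step definition for a homophone search might have looked something like this:

    # features/step_definitions/search_steps.rb
    # Hypothetical step definitions; assumes Capybara is driving the app.
    When /^I search for "([^"]*)"$/ do |word|
      visit "/search?q=#{word}"          # invented route, for illustration
    end

    Then /^I should also see "([^"]*)"$/ do |other|
      page.should have_content(other)    # e.g. searching "air" also shows "heir"
    end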

Jeff Patton, an independent consultant and Agile coach, was also in attendance and emerged as a natural ScrumMaster, suggesting that the rest of us meet with Alan to get an idea of the kinds of things he wanted to see in a new site.

Jeff Patton and Alan Cooper

And just like that, without fanfare or ceremony, we broke from our huddle like a team taking the field. 

It felt weird.  No specs, no design docs, no budget, no buy-in, no high-level meetings, no executives, no paperwork to fill out.  Just go and DO.

So as Jeff Patton took the lead to interview Alan Cooper about his ideas for the new site, Dale Emery, Matt Barcomb, Katrina Owen, and I gathered around to listen. Index cards were plentiful and Jeff used them like a sculptor uses clay.

Two hours later, the machines were set up and my group was done talking with Alan – we had enough to get an idea of what he wanted, and the board was full of backlog items.

Storyboard

The standup we had after that was simple.  After a quick status report, Alan gave a brief chalk talk on design, and then we set to work, picking the few stories we’d do for the rest of that day – no bickering, no dissension, no turmoil.  It just flowed.  There was no confusion, no chaos, no tension.  It reminded me of that scene in Apollo 13 where the ground crew had to build a filter out of spare parts.  Yes, there was urgency and energy around the mission, but there was no clumsiness. People worked together; all anyone had to do was say or suggest something, and a natural affinity formed among those who agreed.  Those who wanted to do something different just did it, and found someone to pair with.

Elisabeth, Alan, me (in hat), and Matt Barcomb

What struck me when I paired with Elisabeth was that TDD seemed like hacking.  She would write code, then tests around that code, and the tests would fail.  That was a good thing, she said.  Then she fixed things by trial and error until the tests passed.  She admitted when she was stuck or didn’t know how to do something; she’d just ask the pair next to her for advice, or look it up online or in the API help docs, and a solution would emerge.  But I rolled my eyes, because this was just hacking. She was trying different things, not knowing if they would work. That was TDD?!?  Come on, really?!?

When I questioned Elisabeth about this, she said something that instantly hit me.

“Yes, experimentation is OK with TDD, but it’s not just trying *anything* – it’s thoughtful experimentation.”  In one phrase, Elisabeth caught me judging TDD the same way people attack exploratory testing as reckless “banging on the keys.”  There was a method to her trials, and I didn’t see it because I didn’t know what to look for. It’s much the same way test managers and execs aren’t hip to the language of skills and tactics that testers use when they explore – modeling, conjecturing, observing, branching, backtracking, questioning.  These words describe what many people walking by would call “playing around”, but when the right language is used to describe what exploration really is, it’s more apt to be understood and taken seriously.

Elisabeth, working out code
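To make that rhythm concrete, here is roughly what one of those red-green cycles looked like from where I sat – my reconstruction in RSpec (1.x style, circa 2010), with a HomophoneSet class I’ve invented as a stand-in, not the model the team actually wrote:

    # homophone_set_spec.rb - run with RSpec's "spec" command (1.x, circa 2010)
    # HomophoneSet is an illustrative stand-in, not the team's code.
    class HomophoneSet
      def initialize(words)
        @words = words.map { |w| w.downcase }
      end

      def include?(word)
        @words.include?(word.downcase)
      end
    end

    describe HomophoneSet do
      before(:each) { @set = HomophoneSet.new(%w[ere air heir]) }

      it "recognizes members of the set, regardless of case" do
        @set.include?("Heir").should be_true    # red until the model downcases
      end

      it "rejects words outside the set" do
        @set.include?("hair").should be_false
      end
    end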

Another thing I chided Elisabeth about was how she found a bug and fixed it in about 30 seconds.  The finding and fixing part was cool, but then she took 30 minutes to write TDD tests around it!  I thought that was a waste of effort.  The bug was found and fixed – why spend all that time writing a regression test for such a little thing?!?  Then she explained it to me: it’s not just about regression, it’s the *process* of creating the test that’s important.  The lessons learned in building that test may come in handy later.
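For instance – and this particular bug is invented for illustration; the real ones that day were different – the 30-second fix plus the spec that pins it down might look like:

    # Hypothetical regression spec: reproduce the defect, apply the
    # one-line fix, and keep the spec so the bug can't quietly return.
    class HomophoneSet
      attr_reader :words

      def initialize(words)
        @words = words.map { |w| w.downcase }.uniq   # .uniq is the 30-second fix
      end
    end

    describe HomophoneSet do
      it "does not list the same word twice" do
        HomophoneSet.new(%w[Air air heir]).words.should == %w[air heir]
      end
    end

The spec takes far longer to write than the fix did, but it documents the bug and keeps it fixed.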

Again, I felt sheepish.  Sometimes I go down a rat hole with a test, and it may seem like a waste of time to a stakeholder.  But what I learned from that “wasteful” test stays with me.  That seemed to me to be a big part of Agile development – learning.  In fact, I was happily surprised to learn that when developers do a spate of programming in this trial-and-error way, they call it a learning “spike”.  I liked that.  I have a word for it, too – a “session” – but I didn’t have a word for a smaller period of time, so “spike” is a word I can borrow from them.

Dale Emery, BJ, and Kat Owen

For the first three days, I did not feel that the site or any of its functions were ready for me to test using my favorite testing approach.  I didn’t feel I would have added *value* by testing what was there.  The components were simple, they worked, and testing them in the ways I had in mind didn’t seem worth anyone’s time, even mine.  The risk was low, and it was still under construction anyway.

When developers finished a story and its TDD tests, they would ring a bell, and everyone would want to know what was implemented. That turned out to be an important component of feeling we were providing value – a mini-celebration.  The bell rang more frequently than I had expected.  Progress was very fast, but not sloppy. The confirmatory tests we wrote were passing, but I was ready to try something more sinister to expose risks.

By Wednesday, enough of the pieces were coming together that I felt it would be worth it to the team to see what could be wrong with them. So I started pairing.

First, I paired with Matt on a session to explore risks in the homophone search feature.

Then I paired with Pat on a session to explore risks in how homophone sets were presented.


I got to show exploratory testing in action – questioning, adapting, chartering, note-taking, and learning *outside* of TDD-creation. And the programmers were open and receptive.  I bounced ideas off them, and they bounced ideas off me.  When we found bugs, I was happy, but instead of ringing a bell, all the celebration I needed was to write each one on a red card and put it on the board, making the point Elisabeth knew all along – exploratory testing has an important place in Agile development. And no one complained. On the contrary, they reacted with purpose and curiosity to what I had found.

Work-in-progress board (kanban)

I learned that the synergy of Agile programming and testing was not meant to make testers extinct after all.  It was a means for two important components of development to learn from each other.  In fact, I’d say it was the fun part of the studio environment. It was, as Elisabeth might say, “Agilistry in action.”

Most importantly, in 5 days, we turned this old, 1997 site:

http://www.cooper.com/alan/homonym_list.html

Into this:

http://homophones.heroku.com

“You just have to try it for yourself” is a conversation-stopper. It’s usually said when the person trying to persuade you of something has given up on you.  But if you set aside that baggage and take them up on their invitation, it might be a profound experience.

After what I went through at #au2h, I was honored to have been invited. I wanted the chance to see if Elisabeth’s studio was indeed a place where “Agile software development professionals come to sharpen their saws and practice their craft,” and I left convinced that she had hit a home run in designing the perfect space for exactly these experiences.

Oh, by the way… did you remember that Alan Cooper was Product Owner? If you want to read his lessons from the week, here they are: http://www.cooper.com/journal/2010/05/agile_up_to_here.html

The Truth about Testing?

May 19, 2010

It takes a lot for me to get riled up, but here I am.

Stuart Reid is doing a keynote at EuroSTAR titled “When Passion Obscures The Facts: The Case for Evidence-Based Testing.”

Here are three things he intends to show:

• How testing ‘evangelists’ use their apparent passion to conceal a lack of evidence supporting their claims.
• Which claims are supported by evidence, which are just plain wrong, and which lack real evidence.
• How we should collect metrics to provide evidence to support testing improvements.

To me, these are not articles of scientific inquiry for an honest presentation about the origins and intricacies of controversies in our craft; they are weak opening arguments in a frivolous lawsuit he is bringing against it.

His argument is that there are rival philosophies of testing (called “schools”) that are misleading you about testing (though for what purpose, he does not say).  This talk seems to be about how he will drag these rival, passionate evangelist ne’er-do-wells before the High Council so that he can show how they are obscuring the truth as represented by what he calls “facts” and “evidence”.
 
First, I identify myself as one of the “passionate evangelists” from one of the schools he is taking to task (the Context-Driven School). Second, I consider myself an advocate for the craft and science called “software testing”, and I believe that questions like “is exploratory testing more effective than scripted testing?” need a lot of context taken into account before they can be answered to anyone’s satisfaction.  But for me to claim I had the “facts” about controversial testing topics like this, framed as “evidence” that can transcend years of controversy, would be not only ridiculous but arrogant and insulting.

But he goes on…

“This presentation will identify which claims are supported by valid evidence, which claims disagree with the available evidence, and those claims where there is currently insufficient evidence to reasonably support a claim one way or the other.”

Did you notice which words he chose to accompany the word “evidence”? – “real”, “valid”, “available”, and “insufficient”.

According to whom?  You, dear reader? 

Of course not.  You can’t use these words because you don’t know any better.  You’ve been manipulated.  He hasn’t, thank goodness.

His case depends on convincing you that his evidence – obscured from you by people like me [see his title] – finally lets you sort out six specific software testing controversies that have persisted for years.  How else, other than by showing you his briefcase full of facts, will I and the other Svengali evangelists from rival testing schools be exposed for misleading you about these issues? How else, other than by seeing his evidence, will you be freed once and for all from the polarizing debates we Svengalis perpetuate?

I see Reid as a misguided politician-lawyer who needs a big case to get noticed.  He’s hoping you will not be smart enough to see that any premises (and promises) of “evidence” are subjective.  In other words, they need context — the theme of one of the very schools he says is swaying you. 

Is he really the crime-fighting hero, armed with a briefcase that, once opened, would settle these testing debates between the rival schools that have been misleading and plaguing gentle, innocent, unsuspecting tester-folk for years?

I think it’s more likely that you’re the jury in this case, knowing that software testing is a challenging intellectual process, not a set of absolute truths held in someone’s briefcase waiting to be laid out for you — especially by someone who doesn’t think it is. 

At least, that’s what the “evidence” of his title and abstract shows me.  The main difference between me and Reid is that my school has taught me that evidence, as in a court of law, can be circumstantial.

20 things I’ve done to inspire testers

May 15, 2010

Clarification: I don’t know if these *actually* inspired the testers who have worked for me, but I have indicators that they at least seem to have built good will.

1) Help them midwife their ideas.

2) Catch them doing something cool.

3) Be an example (as in Parimala’s blog about looking for a book).

4) Pretend you’re the new guy and ask them for tips and advice.

5) Tell them your failures and invite them to suggest what you could have done differently.

6) Find a way to “dogfood” the app you’re testing – don’t just have them pretend to be users; find a way for them to actually use it themselves.

7) Ask them what movies or actors inspire them, then care about the answer.

8) Solve one problem for them OR allow them to solve one problem for you.

9) Back them up in a conflict they had with management.

10) Demonstrate testing to them but show your thinking (mistakes, assumptions, etc.) as you test.

11) Have them email someone in our business like Michael Bolton or Lanette Creamer, who have good ideas and love responding to honest questions from colleagues.

12) Pretend that the developer forbade them to test something and see what they would do about that.

13) Hold a friendly competition to see who can find the best bug, or create a flash mob for them to share their ideas or borrow from others (the #parkcalc one last month is a good example).

14) Have them go to a testing conference, and make sure they hang out with other testers AFTER the conference day is done.

15) Allow them a safe space to fail, but also to show their smarts.

16) Invite them to give YOU a brainteaser or puzzle that YOU have to solve in front of them.

17) Pretend that you are shipping tomorrow and see what they would do about it at a time when management may think “we found everything”.

18) Encourage them to participate in a Weekend Tester session, and if they’re shy, just have them lurk.

19) Take an existing bug and don’t tell them what it is, but have them try to reproduce it by pairing up with someone.

20) Remind them of other metaphors – testers as heroes, like the Secret Service, Men in Black, bodyguards, or crime scene detectives.