Jim takes on the Chaos report over its definitions of success and failure. This is something that has bothered me for years. I've been on several "failed" projects over the years that were wildly successful in the eyes of the business and are still running to this day.
I for one am going to take it upon myself to challenge anyone who uses the Chaos report as a basis for any kind of action.
There was a goodly number of people from the QA profession (not surprisingly), but also a surprisingly large number of non-QA-focused people. The majority were involved in some kind of agile effort, directly or indirectly, which is an interesting indicator of the spread of agile. It wasn't that long ago that the large majority were thinking about an agile project, not doing one.
I like to have a lot of audience participation, especially when there is a mix of experience with the topic. After introducing the topic and going over some of the essence of agile, I asked the audience to give me topics they were interested in. Once we had a decent list I asked the audience to prioritize it, so we would talk about the most valuable things first, since we always run out of time.
I got some of the typical questions that a QA audience asks, such as:
- Should everything be tested by the end of the iteration?
- Should testers automate tests (i.e. write code)?
- What about unit testing?
- What about regression testing?
However, the question that got bumped to the top of the list was "How do we build trust?"
This was not a question I was really prepared to discuss in an agile QA context, but upon reflecting a bit since then it does seem quite relevant, especially since I listed Trust as one of the key essences of agile. Additionally there were a few people in the audience that were really struggling with trust between functions (the perils of letting the audience set the agenda).
I'm not sure I gave the greatest of answers at the time, but have been thinking about it since then. I think I've boiled it down to a few things:
To gain trust, you need to give trust.
That is, you can't just demand trust from someone else if you are not willing to take the risk yourself.
To gain trust, you need to deliver on your promises.
In other words, you need to be reliable. In my experience the only way to regain lost trust is to do what you say, over and over again. Which leads to:
Trust is not earned quickly.
Try as we might, we rarely gain trust immediately. It is a long-term project in which we continuously prove ourselves trustworthy. The goal is worthy: the more we trust each other, the more we can eliminate fear, which leads to better cooperation, which leads to better results. Which is what we are all after.
The Aligning talk wasn't as engaging as I'd hoped. Not because Geoff didn't know what he was talking about or wasn't personally engaging; rather, by necessity it was somewhat introductory, and it is something I've been thinking about a lot in my new role as VP.
Geoff's "secret sauce" is:
- Tie development investments specifically to business goals and priorities
- Match the development process to the business processes
- Enable the development staff to deal with the business drivers that drive projects
- Measure development performance against business goals
Geoff pointed out that the business needs to know what its goals are before there can be any alignment.
When talking about business value Geoff identified 3 different types:
- New opportunity
- Staying open
- Cost reduction
Increasing revenue and decreasing costs are types I've dealt with commonly. The cost of staying open was a new way of thinking about certain problems for me. The example he used was the need to upgrade a database server from an older version to a newer version. Typically I've had a hard time arguing the value of such a project. Now I have a way to think about and explain why we might want to do such a thing. One of the audience members rightly pointed out that this is really a risk decision: do we want to stay on the old version even though it is no longer officially supported (one risk), or do we want to upgrade (a different kind of risk)?
The other point that resonates with me is identifying the value of not doing something. I don't think enough businesses think about opportunity cost. For example, the other day I was involved in planning an upcoming meeting where 4 people would discuss whether an extra $400 software license should be purchased. Just looking at the $100/hr burden rate for those employees was enough for me to justify simply purchasing the license. Even more significantly, the opportunity cost for those employees is in the $300/hr range. I'd much rather have those people producing value.
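To make that arithmetic concrete, here is a back-of-the-envelope sketch. The $100/hr burden rate, $300/hr opportunity cost, 4 attendees, and $400 license price come from the discussion above; the one-hour meeting length is my assumption.

```python
# Back-of-the-envelope cost of holding the meeting vs. just buying the license.
# Rates are from the post; the one-hour meeting length is assumed.
ATTENDEES = 4
MEETING_HOURS = 1.0
BURDEN_RATE = 100       # $/hr fully-loaded cost per employee
OPPORTUNITY_RATE = 300  # $/hr of value those employees could be producing instead
LICENSE_COST = 400

burden_cost = ATTENDEES * MEETING_HOURS * BURDEN_RATE
opportunity_cost = ATTENDEES * MEETING_HOURS * OPPORTUNITY_RATE

print(f"Meeting burden cost:      ${burden_cost:,.0f}")
print(f"Meeting opportunity cost: ${opportunity_cost:,.0f}")
print(f"License cost:             ${LICENSE_COST:,.0f}")
```

Even under these conservative assumptions, the burden cost of the meeting alone already matches the license price, and the foregone value is triple it.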
Geoff had a couple of book recommendations:
I also think Software by Numbers is another great resource.
The Extreme Hiring talk was interesting, and I think I'll take away a few ideas. Some of the concepts were (admittedly) specific to Ternary's business model and hiring needs.
Here is an outline of their process:
- Candidate finds job ad and reviews website
- Candidate applies for job
- Technical Assessment
- Technical Phone Interview
- Personal Phone Interview
- In-person interview (day 1)
- In-person interview (day 2)
- Job offer
This by itself is clearly more effort than many companies go through, but the interesting points are that cultural fit and talent are more important than skill, resumes are optional and secondary, candidates must write code, and candidates must work in a team.
Brian put a lot of emphasis on hiring for cultural fit and talent over skills, since you can invest to improve skill, but fit and talent aren't as easily improved. He did admit that skill becomes more important as your ability to invest decreases.
He also talked about having the candidate solve a coding problem and then submit it as a discussion point during the technical phone interview.
This is something I've been experimenting with for a number of years now. My original concept came from Johanna Rothman back in 2003, when she recommended auditioning developers (and testers). I've moved from having candidates pair program with me, to having them program with me in the room as the customer, to sending the problem home with them after they've spent some time on it during the on-site interview. I think I am going to switch to giving it out before the technical phone screen as a way to thin out the crowd (both because of the level of effort required and because it's a clear indicator of the kind of programmer they are).
Something I haven't done in the past that Brian does is elevate essay questions above resumes. He makes candidates respond to six(!) essay questions as well as provide a cover letter. Optionally he allows candidates to include a resume. Clearly this isn't something most HR departments will be comfortable with. The questions are oriented to expose cultural fit and talent. You can see the current set of questions here.
The final unique technique is the group simulation he runs on the first day of in-person interviews. He brings in 4 candidates, gives them a problem to solve, some equipment and infrastructure and lets them go at it. He says that by lunch everyone stops trying to keep up the interview persona and really gets down to it. Brian doesn't use this technique if he is only filling one position, but since he is regularly looking for teams this works well.
Anybody ever use anything like this? Anyone ever gone through an interview like this? Like it? Dislike it?
Due to work reasons day 4 of the conference is day 1 for me. After taking a red-eye I've landed in Boston and managed to get to the SD Best Practices show. The check-in experience was great and I'm busy looking through the sessions to see what would be interesting.
For the rest of the day I'm looking at Extreme Hiring and Aligning Software Development with the Business.
Basically it comes down to two time horizons: the 10-minute and the 30-minute horizons.
I was going to leave the following comment, but his comment system seems to be broken at the moment.
Thanks for the great approach to agile QA. Too many times it seems that QA groups get bogged down in procedures and can't see what they are trying to achieve (hint: it is not producing defect reports).
Indeed it has been "a little while" since my last post, nearly a year in fact. Where did the time go? I have been busy (surprise, surprise).
- I spoke at SAO in Portland
- I was a guest lecturer at WSU Vancouver
- I spoke at XP2007 in Como, Italy
- I was an experience report shepherd for Agile2007
- I agreed to write a book for Addison-Wesley on Agile Anti-patterns
- And I accepted a job as VP of Software Engineering for ISI
I've been giving some thought to changing the tone of my blog for some time now, as I don't have as much time to dedicate to polished entries. I'm going to give it a go for a while and see how it feels.
Here is one of the XP2006 workshop antipatterns the group worked on.
Acceptance testing is either not done or done at the end

- Excessive belief in unit testing
- No testers involved
- Insufficient awareness of how to “agilize” old habits
- No acceptance test criteria
- Excessive time/resources to run
- Lack of hardware/understanding of economic issues
- Perfectionism – 100% must be testable
- Reliance on manual work
- Systems that don’t work
- Loss of trust
- 1,000 unit tests, but the system is broken
- Insist on acceptance test criteria
- Training on iterative testing