Category Archives: Software Craft


Dial the manifesto

Elisabeth Hendrickson posted Why I defined Agile in terms of results on her blog, which got me thinking about the conversation I had last weekend with Alistair Cockburn at XP Days Germany. Though I still owe a more thorough write-up of the conference, I decided to respond to Elisabeth Hendrickson’s blog entry.

During the Open Space session on Saturday, Cockburn introduced to us the idea of treating the value statements in the Agile Manifesto as dials. In total there are eight dials on the meter to consider:

  • Individuals and interactions
  • Processes and tools
  • Working software
  • Comprehensive documentation
  • Customer collaboration
  • Contract negotiation
  • Responding to change
  • Following a plan

Based on the particular context, we tune the amount of individuals and interactions against the amount of processes and tools. Likewise, the remaining six values can be adapted to our particular situation. Indeed, we value all eight of these terms. That means we may sometimes need to create documentation. As Hendrickson pointed out to me at the Agile Testing Days earlier this year, being Agile does not equal being stupid. If there is a need to create thorough documentation so that you are set up for the next round in the Cooperative Game, then you had better create it – now! On the other hand, if you find that your documentation is mostly waste, then you should reflect on how to reduce it or get rid of it completely. But you need to consider your particular situation: the project, the customer, the people involved. That is what good coaches do.

The second thing Cockburn pointed out was the fluent movement up and down among practices, principles and values. During our work we continuously switch from one level of abstraction to another, up and down. Some time ago I thought that this fluent switching among the levels is comparable to skill-set development with Shu-Ha-Ri. The main difference is that you go through the Shu-Ha-Ri stages when learning a new technique, like TDD, Refactoring or facilitating Retrospectives. When working in an Agile team, you switch quickly between values like simplicity or responsibility and practices and principles like TDD, Refactoring or Emergent Design. From time to time you may adapt and adopt some of your principles, practices or even values. Indeed, the Software Craftsmanship Manifesto is just the result of such a value shift. Building on the Agile values, we have put our new values beneath them. This does not mean that we don’t value the statements from the Agile Manifesto, but we found new value statements that we think are worth considering for the next ten years or so.

Getting back to Elisabeth Hendrickson’s write-up, focusing on results is essential in all of the above. When we lose our focus on results, we start to day-dream and seldom get anything productive finished. You can use the dials from the Agile or Software Craftsmanship manifesto to fine-tune your situation, but do this based on the results you have already achieved. The essential thing here is still reflection. Without reflection I constrain myself to self-blindness and No-Problem-Syndrome thinking. This hinders innovation and improvement in your actual work.

My Definition of Done

Over the course of the Agile Testing Days I noticed that the definition of “Done” is troubling for Agile teams. It was raised not only in the tutorial I took, but also in several keynote speeches, and several presentations dealt with it as well. Personally, I was wondering why. During the last three days I thought “Done” simply means that you can make money with it. Sounds like an easy definition, doesn’t it? If your customer or product owner is not willing to pay money for the user story implemented, it’s not “Done”. Period.

But today I came up with another, easier definition of “Done”, which was strikingly simple to realize in the end. For this definition, let me put together several influences from Elisabeth Hendrickson, Adam Goucher, Kent Beck, and many more into a remarkably simple combination.

Elisabeth Hendrickson told me that “Done” simply means implemented, tested (or checked, as Michael Bolton defined it) and explored. Very easy, isn’t it? But there’s more to it. Implemented means it is designed and written in, probably, some programming language, right? Does this make sense? Considering my visit to the Software Craftsmanship Conference in February, I noticed that it seems to be accepted – at least among the craftspeople I met there – that implementation means doing it using Test-Driven Development. There was no discussion or arguing about this; it was simply taken as a fact, and it was done in a TDD style. But, remembering Kent Beck, TDD means Red – Green – Refactor, right? A three-part meme – oh, another one. Great.
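To make Red – Green – Refactor a bit more concrete, here is a minimal sketch in Python; the price-formatting example and all the names in it are mine, not from any of the talks:

```python
# Red: write a failing test first, for a hypothetical price formatter
# that turns an amount in cents into a display string.
def test_formats_cents_as_euros():
    assert format_price(1999) == "19.99 EUR"

# Green: the simplest implementation that makes the test pass.
# Refactor: once the bar is green, name the magic number.
CENTS_PER_EURO = 100

def format_price(cents):
    euros, rest = divmod(cents, CENTS_PER_EURO)
    return f"{euros}.{rest:02d} EUR"

test_formats_cents_as_euros()  # green bar
```

The point is the order of the steps, not the code itself: the test exists and fails before the implementation does.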

The next term in Elisabeth Hendrickson’s definition of “Done” is tested (or checked). Considering current approaches to capturing the customer’s requirements in an Acceptance Test-Driven Development style, I would like tested to mean that we agreed on examples to implement, and that these examples serve as acceptance criteria in an executable manner. So tested here actually means that the code was able to pass the acceptance tests which were agreed upon upfront and were elaborated, maybe by the testers on the project, to also cover corner cases which were missed. On Monday Elisabeth Hendrickson taught me that ATDD can be thought of as Discuss – Develop – Deliver. Gojko Adzic corrected this over Twitter to Describe – Demonstrate – Develop. But what do we have here? I claim that tested (or checked) here refers to the ATDD style of development, which is again definable as a tricolon. So, wrapping up, we have

  • Done is Implemented using TDD, Tested via ATDD and Explored
  • Implemented using TDD is Red – Green – Refactor
  • Tested via ATDD is Discuss – Develop – Deliver

(Forgive me Gojko, but I find Elisabeth’s definition more intuitive to remember.)
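The Discuss – Develop – Deliver idea can be sketched as follows. The discount rule and the figures are illustrative assumptions of mine, not from any of the talks; the point is that the examples agreed with the customer become the executable acceptance criteria:

```python
# Discuss: examples agreed with the customer for an order-discount rule.
# Each pair is (order total, expected discount).
AGREED_EXAMPLES = [
    (50.0, 0.0),    # small orders: no discount
    (100.0, 5.0),   # from 100 on: 5 percent
    (500.0, 25.0),
]

# Develop: the implementation that should satisfy the agreed examples.
def discount(order_total):
    return order_total * 0.05 if order_total >= 100.0 else 0.0

# Deliver: only when every agreed example passes is the story "tested".
def run_acceptance_tests():
    for total, expected in AGREED_EXAMPLES:
        assert discount(total) == expected, (total, expected)
    return "all examples pass"
```

Testers would then extend `AGREED_EXAMPLES` with the corner cases the discussion missed, for instance an order of exactly 99.99.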

Oh, I’ve got one more. Explored clearly refers to Exploratory Testing. It might actually be a coincidence, but Adam Goucher came up with a definition of Exploratory Testing today in a tricolon manner, too: Discover, Decision, Action. Sounds reasonable to me. During Exploratory Testing we discover information about the product which we previously did not know. Based on that information we decide what to do about it. Michael Bolton likes to ask me here: “Problem or not a problem?” so that I can decide what to do next. That decision informs the next test to be executed. After that, we take the next reasonable action based on the information we just found out about our product. To make the terminology more cohesive here, I propose that explored means Discover – Decide – Act.

So, to wrap it up: just as we have Practices, Principles and Values in Agile, and just as we learn in a Shu – Ha – Ri fashion (thank you, Declan Whelan, for saving the Agile Testing Days for me by mentioning it), we can define “Done” in this same manner:

  • Done is Implemented using TDD, Tested via ATDD and Explored using Exploratory Testing
  • Implemented using TDD is Red – Green – Refactor
  • Tested via ATDD is Discuss – Develop – Deliver
  • Explored using Exploratory Testing is Discover – Decide – Act

Responding to Change

During my stay in the US on our vacation this year I was able to collect some notes, reflect on some things in the real world, and occasionally compare them with software development. Today I decided to do a write-up on the “Responding to Change” value of the Agile Manifesto.

While finishing the book The Pragmatic Programmer I was confronted with some concepts that I had also noticed in the real world. Early on in the Design Patterns movement I learned to decouple my code as far as possible to allow change to happen. Craig Larman and Robert Martin got me into this thought process early on.

Being a European, I found quite a bunch of things different in the US from here at home. Here is a list of things I noticed, and I was wondering how hard it would be to change them in our software systems. Luckily, most of these things I was not able to find at all, whereas major variation points like currencies or tax systems had also been thought of. How would your software respond to changing it to the US system? What about selling your software to a European customer? For the testers and checkers among you: would you test that your software supports these? Do you test that your software supports these? Does your software support them? Here’s the list; take the test, make up your mind, and maybe let me participate in your findings.

Currencies There are a bunch of currencies I was confronted with: Euros of course, Deutsche Mark in the past (does anyone know what a “Groschen” is?), US Dollars, pennies, dimes, quarters, cents (does your software support the ¢ sign?), Disney Dollars (you can actually pay with them in Walt Disney World resorts or take them home as gifts for your mates), …

Tax Maps In the US every single state has a different tax model, and even every county can have its own. In Germany there are usually two different tax rates applied: the lower one for food, the higher one for the rest. In Brazil there are 27 states, each with their own tax model. Tax laws seem to be the most complex stuff.

Distances Meters, millimeters, centimeters, feet, inches, miles, kilometers, Bavarian ells, English ells.

Area sizes Square feet vs. square meters.

Volume Litres, cubic meters, gallons, to name just a few.

Temperature How many degrees Celsius is one degree Fahrenheit? Is 90 degrees Fahrenheit hotter than 30 degrees Celsius? What about Kelvin?

Fuel prices How much is 2.59 USD per Gallon in Euros per litre?

Consumption Is a fuel consumption of 28 Miles per gallon the same as 5 litres per 100 kilometers?

Voltages 110V vs. 220V vs. 400V, AC vs. DC.

What did I forget? What hard to change facts do you have to deal with?
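For the unit questions above, a few lines of Python can serve as a starting point; the conversion constants are the standard unit definitions (US liquid gallon, statute mile), and the function names are mine:

```python
# Standard unit definitions.
LITRES_PER_US_GALLON = 3.785411784
KM_PER_MILE = 1.609344

def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

def usd_per_gallon_to_usd_per_litre(price):
    # Left in USD on purpose: the USD/EUR exchange rate changes daily.
    return price / LITRES_PER_US_GALLON

def mpg_to_litres_per_100km(mpg):
    """Convert miles per gallon to litres per 100 kilometers."""
    km_per_litre = mpg * KM_PER_MILE / LITRES_PER_US_GALLON
    return 100.0 / km_per_litre
```

Running these answers some of the questions: 90 degrees Fahrenheit is roughly 32.2 degrees Celsius, so yes, hotter than 30 degrees Celsius; 2.59 USD per gallon is about 0.68 USD per litre; and 28 miles per gallon works out to roughly 8.4 litres per 100 kilometers, so no, it is not the same as 5 litres per 100 kilometers.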