
Saturday, 13 December 2008

Announcing a new family development project

The family Jack is proud to announce a new project, code named Baby Jack 2.0.

My wife and I decided that the Agile methodology would not be appropriate in this instance, so we will be following the traditional Waterfall pattern. Requirements analysis has been completed (boy or girl will be just fine; complete set of appendages is a must). For the Functional Specification and Design stage we were fortunate in having complete blueprints that we could appropriate for ourselves - reverse engineering DNA is beyond my current skillset, even taking into account the potential of C#4.0. Build is now well underway: after 12 weeks we have a complete implementation, and we're just working on scaling the product up and out. We aim for a single delivery in the June timeframe, based on current progress.

An early screenshot is shown below:

[Scan image: BabyJack2-0Scan_12_08_2008]

Tuesday, 7 October 2008

Doing Planning the Agile way

So we're going to use Agile to manage the development of our secret new product. What does that actually entail? I'm not qualified to say for the general case, but I can show and tell how we've been doing it on our project. Honesty reminds me to state that we didn't invent any of these ideas: most of them came from Mike Cohn's excellent books User Stories Applied and Agile Estimating and Planning.

[Image: handlers planning their dog's route round a Dog Agility course]

Telling Stories

My first job, once we decided to go Agile, was to fill up the Product Backlog. This is our wish-list of everything we would like to go into the product at some point, though not necessarily in release one. It contains a whole bunch of User Stories, which are concise descriptions of pieces of functionality that a user would like to be in the software.

There's no IEEE standard for User Stories, and that's a good thing, because they're meant to be written either by the end users of the software, or at least as if the users of the software had written them. How many IEEE standards do you know that can be implemented by your clients?

But don't panic: just because there's no standard, doesn't mean there's no help. We followed along with Mike Cohn's suggestion of writing User Stories in the form "As [some kind of user], I want [something] so that [I get this benefit]". In some cases we went on to record some high-level details about how the feature might work, but nothing more than a few sentences. User Stories are supposed to be placeholders for conversations that we'll have with our users (or pretend users) nearer the time when we implement the feature. I found the acronym INVEST helpful: User Stories should be Independent, Negotiable, Valuable, Estimable, Small, Testable.

In our Backlog we've got stories like "As a User, I want to be able to reset my password myself if I forget it, so that I don't get shouted at by the Administrator" and "As a Sales Manager, I want to be able to issue new license keys to customers so that I can make more sales and get a bigger bonus".

We'd already started along the well-trodden road of writing functional specifications before we were lured in a different direction by Agile, so creating our Product Backlog was mostly reverse engineering: working out what reason a user would have for needing each piece of functionality that we'd specified. This was a useful exercise in itself, as I could make sure that all the features had a reason for being other than "Wow! That would be cool to program."

Another time, I'd probably build up the Backlog starting with a Project Charter, or a high level overview of what we want to achieve, and then using a mind-mapping technique to break this down into stories.

Playing at Estimating

So now we know what our software's going to look like. Can we get it done by next week, as the boss wants? The first step in answering that is deciding how big the project is, and given our Product Backlog, the best way of measuring that is by sizing the individual stories. In fact, we don't even need to calculate an absolute size, just a measurement that ranks stories against each other.

For this reason we chose to measure size in Story Points. This isn't a unit that ISO can help you with; each team will have its own definition of a Story Point, which should emerge naturally through the course of the project. We could have chosen to measure in Ideal Days (days assumed to be without distractions and interruptions), but we again heeded our virtual mentor's advice that this would slow us down, as we'd start to think too much in terms of individual tasks, rather than how a particular story compares in size with others in the list.

The one problem with using abstract units like Story Points is deciding how big one is. We solved that problem by scanning through the list and picking a story that looked pretty small and a story that looked fairly large, and assigning them a 1 and an 8 respectively. We then measured other stories up against these two.

The other thing we agreed on was a sequence of "buckets" for holding stories of different sizes. For small stories, it's relatively easy to agree whether they differ by one or two points of complexity; but as features get bigger, it also becomes more difficult to estimate precise differences between them. So we created buckets with sizes of 1/2, 1, 2, 3, 5, 8, 13, 20, 40, 60, 80, 100 Story Points each (you might recognise that as a kind of rounded Fibonacci sequence); the agreement was that if a story was felt to be too big to fit in one bucket, then it would have to go in the next one up. A story that was felt to be bigger than an 8, for example, would have to be assigned to the 13 bucket.
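The bucket rule is simple enough to sketch in a few lines of code. This is just an illustration of the "if it doesn't fit, it goes up" agreement described above; the function name and example value are my own invention, not anything we actually automated.

```python
# The Story Point buckets we agreed on (a rounded Fibonacci-ish sequence).
BUCKETS = [0.5, 1, 2, 3, 5, 8, 13, 20, 40, 60, 80, 100]

def to_bucket(raw_estimate):
    """Return the smallest bucket that can hold the raw gut-feel estimate."""
    for bucket in BUCKETS:
        if raw_estimate <= bucket:
            return bucket
    raise ValueError("Story too big for any bucket - split it first")

print(to_bucket(9))  # bigger than an 8, so it lands in the 13 bucket
```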

[Image: Planning Poker cards]

With that sorted, we were ready to play Planning Poker. I made my own cards (in Word 2007), each showing the size of one of the buckets, one deck for each developer. If you want to play along, but don't like my cards, you can buy professional decks from a couple of sources. The "game" works like this.

We'd pick a Story and discuss it. We had all the people in the room we needed to make sure that questions about the scope of the story got answered. We then had a moment of quiet contemplation to come to our own individual conclusions about the size of the story, and to pick the appropriate card from our hands. Then, on the count of three, everybody placed their card on the table. If everybody had estimated the same, great: we recorded it in our planning tool (VersionOne). If not, we talked some more. What made Simon give the story a 5, while Roman gave it 1? Then we had another round of cards - or sometimes just negotiated our way to an agreement.

It felt a little strange at first, but soon became quite natural. It's amazing how liberating it is to work by comparing stories, rather than by hacking them into tasks. We had a backlog of about 130 stories, and it took us just under three sessions of 4 hours each to get through the list - not bad for a first go, I thought.

The final thing we did was to triangulate: to go through all the stories that we'd put in a particular bucket and to make sure that they truly belonged there. Was this story packed so full of work that it was flowing over the top of the bucket? Move it up a bucket. What about that one, huddled down in the corner? That would surely fit in the next bucket down?

Self-adjusting estimates

It was tempting, back when we had a Backlog but no sizes assigned to the Stories, to jump straight to the stage of estimating a duration for the project. But that would have bypassed one of the big benefits of using Story Points: self-adjusting estimates.

Imagine you were going on a journey, and you didn't have Google Maps to help you plan. One way of estimating your journey time might be to look at all the cities (or motorway junctions, or other way-points) that you have to pass and guess at the time needed to travel between each. Now suppose you've set off on your journey and travelling the first few stages has taken longer than expected. The children are in the back of the car chorusing "are we there yet?". "No", you say. "How long?" they ask. And what do you tell them? You'd have to work out the journey times for the remaining way-points and apply some kind of scaling factor in order to give them an answer. But you don't do it that way. Do you?

Instead, before you set off, you calculate the total distance. Then, as you're driving, you guess at your average velocity. At any time you can divide the remaining distance by the average velocity to give a fairly good estimate of when you'll arrive, one that, because you've used historical data, automatically factors in things like how overloaded the car is and how bad the traffic has been. If your kids have their lawyers on the phone ready to sue if you don't arrive exactly when stated, you can even use your minimum and maximum velocity to give them a range of times between which you might arrive.

And so it is with Story Points. They say nothing about duration: they are simply a measure of size - like using distance as the first step in estimating journey times. Velocity is a measure of how many Story points you complete in an iteration. Estimating the duration of the project is then as simple as dividing remaining Story Points by velocity, and multiplying up by the iteration length.
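The arithmetic above is worth spelling out. All the numbers here are invented for illustration; only the formula (remaining points divided by velocity, multiplied by the iteration length) comes from the text.

```python
import math

# Invented example figures - not our real project numbers.
remaining_points = 130   # Story Points left in the release plan
velocity = 18            # points completed per iteration, from history
iteration_weeks = 3      # length of one iteration

# Duration = (remaining work / rate of progress) x iteration length.
iterations_left = math.ceil(remaining_points / velocity)
weeks_left = iterations_left * iteration_weeks
print(f"{iterations_left} iterations, roughly {weeks_left} weeks")
```

As velocity is re-measured after each iteration, re-running this calculation keeps the forecast up to date, which is exactly the self-adjusting property described above.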

But we haven't completed an iteration yet, so how do we know what velocity to use? If we had done a similar project using Agile we might be able to apply historical values. This might be the case when we're working on version 2 of our product. But for now, we need to go with a forecast of our velocity.

We started by estimating how many productive hours we would have from all developers during an iteration (of three weeks in our case). Industry veterans reckon on up to 6.5 productive hours in an eight-hour working day, though we're still debating this in our company.

Then we picked a small sample of stories of each size from our backlog, trying to include a mix of those that focused on the UI as well as those that mainly concerned the server. Breaking these stories into tasks gave us an indication as to how long each story would actually take. We made sure to include every kind of task that would be needed to say that the story was really and truly done, including design, documentation, coding, and testing of all flavours. Finally we imagined picking a combination of stories of different sizes such that we could finish each of them in the time we had for an iteration. Adding up the Story Points of the stories in the combination gave us a point estimate of velocity.
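A rough sketch of that forecasting exercise, with entirely made-up numbers (team size, task-hour estimates, and the 6.5 productive hours figure are all assumptions for illustration, not our actual data):

```python
# Hypothetical figures for a three-week iteration.
developers = 2
working_days = 15
productive_hours = developers * working_days * 6.5  # 6.5 h/day heuristic

# (Story Points, estimated task hours) for a sample of broken-down stories.
samples = [(1, 10), (2, 22), (3, 30), (5, 55), (8, 90)]

# Imagine fitting stories into the iteration until the hours run out;
# the points that fit give a point estimate of velocity.
forecast_velocity = 0
hours_left = productive_hours
for points, hours in samples:
    if hours <= hours_left:
        hours_left -= hours
        forecast_velocity += points
print(forecast_velocity)
```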

We would have been foolish if we'd gone forward using just that estimate. So we took advice from Steve McConnell's Cone of Uncertainty (which shows how far out an estimate is likely to be at each stage in a project) and applied a multiplier to our point estimate of velocity to get a minimum and maximum. Since this was getting too much for our little brains to handle, we fired up Excel, and made a pretty spreadsheet of the minimum, expected and maximum number of iterations that we predict the project will take.
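Our spreadsheet boiled down to something like the following. The multipliers here are assumptions loosely inspired by the Cone of Uncertainty (early-stage estimates can be off by a large factor in either direction); the point estimate and backlog size are invented.

```python
import math

remaining_points = 130        # invented backlog size
expected_velocity = 11        # invented point estimate of velocity
low_mult, high_mult = 0.6, 1.6  # assumed uncertainty multipliers

def iterations_needed(velocity):
    """Iterations to burn down the backlog at a given velocity."""
    return math.ceil(remaining_points / velocity)

best = iterations_needed(expected_velocity * high_mult)   # optimistic
expected = iterations_needed(expected_velocity)
worst = iterations_needed(expected_velocity * low_mult)   # pessimistic
print(best, expected, worst)
```

The spread between best and worst is the honest answer to "when will it be done?" at this stage; it narrows as real velocity data replaces the forecast.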

Unexpected Benefit

The final estimate was far larger than we'd expected (who is surprised?). So we needed to pare back the scope. If we were basing estimates on a functional specification we could have cut whole features, but it would have been quite difficult to cut parts of a dialog box. But since we were using Stories, we could remove them from the release plan and then, simply by subtracting their Story Point estimates from the total, see the impact on the overall project.

At the end of all that, I'm happy. My feeling is that it is the most robust estimate and plan of any I've produced, and for once, we'll have a trivial way of reliably updating the plan as we see how we're getting on.

Monday, 22 September 2008

Trying out Agile

As I mentioned a few weeks back, we've started work on a new product. So far, this has entailed much writing of Functional Specifications, many meetings, and very little gathering of momentum. Whilst on holiday back in August, doing my stint of baby-sitting duty outside my daughter's bedroom, giving my wife a break, I spent some time pondering on the best way to run the project, and how we could get things moving.

I'd taken along a couple of books (more on these below) about project management, and they each recommended Agile as a way of managing projects. The more I thought about Agile in the context of our project, the more I felt it was a good idea. Returning to work, it didn't take much to persuade my fellow developer, Roman, that it would be nice to try it out. Our Project Manager came on board with the idea quite quickly, and now the Director sponsoring the project has given his blessing to adopting it.

For those of you who have never considered Agile before, I thought I'd jot down my thoughts about it, and point you in the direction of some reading material that I've found helpful. Any confirmed Agilists, or Anti-Agilists should feel free to chip in where I've got it right or wrong.

What is Agile?

Agile isn't one methodology: it's a whole family of them. Think of it as an old-fashioned tailor's shop, where you can go to choose your cloth and colour, and have a methodology made to measure. Members of the family include DSDM, Crystal Clear, the better known Scrum, and the infamous (in some circles at least) eXtreme Programming (XP). They all vary in formality, terminology, documentality (I thought I'd been clever and coined a word, but it seems I wasn't the first), but at the heart of each is an emphasis on flexibility and feedback.

Flexibility is an attribute we all wish our work possessed at the end of a project when the client decides that what he's got isn't what he wants, even though it is exactly what he signed up to at the beginning. Unfortunately, by that late stage, it's usually too late to accommodate the kind of drastic changes that the client invariably has in mind. An Agile methodology side-steps that problem by delivering the work in increments, making sure that the customer gets the things that he values most early on in the project, and giving him opportunity to change his mind about what is important at each stage as he sees how things are going, and when new ideas strike him.

For many developers, Feedback is what they get at the end of a project in the form of a dressing-down from an irate executive who has just had to explain to a customer why they finally got the software six months late and 100% over budget. The developer would no doubt like to feed back to the executive that the reason for this is that she was asked to commit to a plan made before the requirements were even agreed or developers assigned to the project; to say nothing of the design, meticulously refined during the first few weeks of the project, that had to be thrown out when a key feature was added half-way through. Mention Agile to this kind of executive, and he'll snort, and dismiss it with the put-down that there wouldn't even be a plan if Agile had been followed. Which isn't true. Agilists might only spend a couple of hours scribbling estimates on post-it notes before they dive into coding; but they do that at regular intervals throughout the project, when they know what progress they've made on implementing whole, working, tested features, so can anticipate with increasing confidence how much work remains to do.

Agile works very much on just-in-time, and just-enough principles. Requirements are gathered, often in the form of user stories: short descriptions, in the language of the user, of something that the system needs to do. Enough of these are captured to define the first version of the system, and they include just enough detail for the group of developers on the project to give an order-of-magnitude estimate (often in Story Points) to each Story. At this stage attention turns to Iterations.

Iterations are the building blocks of Agile plans. Whatever you call them (Scrum calls them Sprints and others call them Timeboxes), they are periods of time in which a new, potentially releasable, version of the software is created. Customers prioritise the stories and pick out the ones they'd like for an iteration. Developers unpack each story to discover what tasks are involved and how long each will take, in the process creating a detailed iteration plan. More discussion with the customer fleshes out each story and provides a list of Acceptance tests that will indicate when the feature is done. The team get their heads together to design as much of the architecture as they need for that iteration; maybe the new designs will involve refactoring code they have written before, but they always work with the safety net of fully automated unit tests to protect them.

With each Story coded, tested and documented, the new version of the software is ready for demonstration to the customer. Seeing it might give him new ideas. But no matter; they just go on the list and get prioritised along with everything else. The Developers reflect on the iteration just passed. They review their expectations about how many Iterations the whole release will take, then round they go again on the next cycle [1].

The first agile practices began back in the 1980s when developers and managers realised that for many software projects the Waterfall, plan-and-design-it-all-first, methodology wasn't working. Agile built up its fan base through the 1990s, with Kent Beck introducing Extreme Programming in 1996. In 2001 a group of Agile leaders got together and created the Agile Alliance and formalised the Agile Manifesto which states the principles that unite their different methodologies. Agile now has wide adoption, with big names like Cisco, Google, Microsoft and Yahoo all using it to a greater or lesser extent.

How can I find out more?

  • I first learnt about Agile in Alistair Cockburn's book, Crystal Clear. This describes the Crystal Clear Agile methodology, which is perhaps the one that refugees from Waterfall projects will feel most comfortable with, as it is more moderate than XP. Alistair leaves no area of the project lifecycle untouched, giving advice about how to carry out each part in an Agile manner. He has useful advice on the roles that different people in the project play, and what artifacts each should produce.
  • Johanna Rothman has written the excellent book Manage It! that covers a multitude of Project Management techniques, not just Agile ones. Reading her book, it seems to me that Agile emerges as her favourite way of running projects.
  • VersionOne, the developers of the agile planning tool that we're currently using (because they do a free version!), have put together a useful overview about Agile, Agile 101.
  • Gabrielle Benefield, formerly Director of Agile Development at Yahoo, has co-authored a Scrum Primer.
  • Last of all, I must mention Mike Cohn: I wish I'd discovered his work earlier. On his website, mountaingoatsoftware.com, he has a number of useful articles and presentations about how and why Agile works, and also his blog. I found his Introduction to Agile Estimating and Planning very helpful. I'm currently reading two of his books, User Stories Applied and the more general Agile Estimating and Planning. Both of these books appear within the top 20 of a recent list of the Top 100 Software books, so I'm clearly not the only one who thinks these books are well worth your financial and cognitive investment. They'll be useful whatever flavour of Agile you might be considering. Each has an informative extended case study describing how Agile plays out in practice on a project.
Footnotes
  1. Agilists of other denominations might detect a bias towards XP and Scrum in my description: that's because proponents of these two methodologies seem to be most vocal on the web, and have contributed most to my stack of reading material.