Saturday, September 10, 2005

BDUF, little waterfalls, user stories, and on-site customers

There have been a few discussions on the XP mailing list about BDUF (Big Design Up-Front) vs. Agile/XP methods, and also about the concept of "little waterfalls". Are you really doing iterative/XP-like development, or have you just taken your big waterfall (2 months of writing requirements, 3 months of doing design, 2 months of coding, 1 month of integration testing) and broken it down into iterations where every iteration is a little waterfall - 1 month of writing requirements; many "tasks" where each one is 2 days of doing design, 2 days of coding, 2 days of testing; and then at the end 1 month of integration testing? Are you still doing waterfalls, just smaller ones?

On Implementation we currently do little waterfalls. How much depends on the team, but we are obviously doing it. This was highlighted by the fact that some members recently wanted to go back to doing very detailed designs before entering the "coding" phase. We no longer use Word documents - we now use the wiki. This has given us some benefit in that everyone can see everyone else's comments in real time; we can be more collaborative, because the author/reviewer can have on-line conversations and everyone can see them; etc. However, in this context, what is happening is that people are pretty much writing all of their code on the wiki (or in the actual source and then copy/pasting it to the wiki) but not compiling or running it. Using an XP term - they are getting no feedback. Using another XP term - this is a waste of productivity. In any case, this is what is happening. The "design" is then reviewed, comments are made on standards, style, approaches, etc., and refactoring happens. But again - with no machine feedback. The feedback is all at the human level. Once that is all approved, the person copies the "design" back into their code (or maybe they've been updating the code all along and copying it into their "design") and then they start the "coding" phase.

At that point they are running the code and, for the most part, manually verifying that they have the results they want. This is very much a waterfall approach. In any case - that wasn't what I really wanted to talk about - but it lays the context of "here is where we are now". Another part of this (and what I wanted to discuss) is "requirements gathering" and "writing requirements". Again - we take a traditional waterfall-type approach. We get all the requirements up front, and have them as detailed as possible. Requirements may change or have to be clarified (and the fact that we acknowledge that is a good thing), and ideally (in our current process) those changes will get documented. Everything has to be documented, in detail. The XP thought on this is that it is a waste of time. I'm reading Extreme Programming Installed right now, and it talks about this. It talks about the programmer directly talking to the on-site customer and resolving many of these little issues - the ones that aren't discussed in the user story (which is a pretty high-level statement of some feature that should be implemented - my words). We don't have an on-site customer, but we have an on-site customer representative (the Business Analyst (BA)). The XP thought is that a developer going over and clarifying something with the BA is a "good thing". It is better than email, IRC, using the wiki, etc. - because it is human communication, faster, etc. We currently tend to believe this is a "bad thing". And the following from the book sums this up very well,
"You might also be concerned that this important information about the requirements will get lost somewhere, but remember that you will have specified acceptance tests, and they will surely cover [the requirements that were being discussed]."

At this point a light-bulb goes on for me. I'm trying to reconcile how cool I think XP is - how much I like change, and how much the whole idea of going to user stories appeals to me - with the fact that, at work, I'm pushing for "You need to make sure you get the requirements specification updated", "You need to make sure and add that requirement to the wiki", etc. And this is exactly it - the proven fear (i.e., this has happened over and over again and led to bad things happening) that the information will get lost. That we'll be reviewing the design or code and say "That doesn't meet the requirement" and the programmer will say "Oh - yeah, I talked to the BA and that isn't the real requirement", or a tester will be testing the system and have the same experience.

XP, in part, addresses this by the statement above: "..., but remember that you will have specified acceptance tests,..." And that is why we have the fear. Because we DO NOT have specified acceptance tests. We do a lot of testing and we do a great job of testing... but it is more a waterfall approach to testing. We have unit tests and regression tests. And we have tests written by the testers that are based on the requirements documents (and thus the need for everything to be documented). But we don't have what I believe XP is talking about. We don't have the developers/testers and BA sit down and write the acceptance tests together. And we definitely don't have any form of TDD where the discussion that appears in the book (or happens in "hallway conversations" all the time at work) drives new tests to be written - RIGHT THEN. The book is right. If, as soon as that conversation happened, the people involved sat down together and wrote up the new tests - and then let everyone know those tests existed - and those tests were part of the test suite that automatically runs whenever testing is done - then yes, we wouldn't have to go update the requirements documents, because the tests meet that need.
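To make that concrete, here is a minimal sketch (in Python) of what I mean by capturing a hallway clarification as an executable acceptance test the moment the conversation ends. The discount rule, the dollar amounts, and the function names are all made up for illustration - they stand in for whatever the real requirement is:

```python
# Hypothetical scenario: the user story says "discount large orders",
# and a hallway conversation with the BA clarifies that the discount
# kicks in at $100.00 exactly. Instead of updating a requirements
# document, the developer and BA write this test together, right then,
# and add it to the suite that runs on every build.

def discount_for(order_total):
    """Toy stand-in for the real system under test:
    flat $10 off any order of $100 or more."""
    return 10.0 if order_total >= 100 else 0.0

def test_discount_applies_at_exactly_100():
    # Clarified with the BA: an order of $100.00 exactly DOES qualify.
    assert discount_for(100.00) == 10.0

def test_no_discount_just_below_100():
    assert discount_for(99.99) == 0.0

if __name__ == "__main__":
    test_discount_applies_at_exactly_100()
    test_no_discount_just_below_100()
    print("acceptance tests passed")
```

The test is both the record of the conversation and the proof that the system honors it - nobody has to remember to go update a document.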

And the beauty of this XP practice is that the XP way is so much more efficient. It uses human communication to resolve/clarify - which is faster. It uses an acceptance test to document the clarification (which, given the right framework, etc., is faster than having the BA document it, the developer review it, etc., etc.), and that acceptance test has a dual purpose. Not only does it document the requirement, but it is then used to prove that the system meets the requirement. In our current methodology that is two tasks. The requirement statement documents the requirement - and then some time later (days for sure, but more likely weeks) a tester writes a test to prove that it works. Based on their understanding of what the text says - not of what the actual human said.

The challenge, imo, is to get people who are used to working in this methodology to change. Because most likely they are going to think it is less precise (it isn't written down in a document) and that it takes more time (because it is so easy to just write it up, vs. having to create a test). The truth of the matter, though, is that the test is more precise - the system will either do or not do exactly what it is testing - and that it will take less time, because so much time is wasted when requirements are misunderstood: code is redone, tests are redone, information is lost, etc.

So - as is said over and over in the XP books and discussions - you can do some of XP (and most likely you'll have to adopt it a piece at a time), but the more practices you do, the more "bang for the buck" you'll get. Here is a great occurrence of that. You can do user stories on their own - but they really work when they are combined with acceptance tests - acceptance tests that are written in collaboration with the user, and written as the user story is being worked, not after the fact.

Bottom line: We currently have a problem that our requirements are not always precise enough, and sometimes hallway-conversation information is lost. The solution is NOT to write more detailed requirements. The solution is to use User Stories COUPLED with developing Acceptance Tests - not only up front, but any time more information is learned; Acceptance Tests which reflect that newly acquired information.
