Monday, 3 August 2009

Agile Documentation and Passing Your Audit

Or should this be called "Agile Documentation versus Passing your Audit"? Remember the Agile Manifesto:
"Working software over comprehensive documentation."
(Please do not take this too literally: I understand that the Agile Manifesto is not against documentation, it suggests that we expend effort elsewhere rather than on unnecessary documents that no-one reads).

There's been some discussion lately on the Agile Alliance LinkedIn group about documentation in Agile software development, all of which has been a bit too highbrow for me (a deficiency of concentration on my part), because I want real, concrete answers and useful "do this, not that" advice about how to get documentation out of an Agile software development process that will satisfy our ISO auditor.

So, in the absence of concrete answers, I suppose I'd better start thinking of some for myself...

First of all, in the interest of keeping documentation to a minimum, let's enumerate the things we need it to do for us, from both a development perspective and an audit perspective. From the development perspective, it needs to:
  • Communicate the over-arching vision (perhaps optional, if it's communicated in some other way?).
  • Capture the value requirements so we build the right thing and test functionality against genuine user expectations.
  • Record our progress (during designing, coding and testing) such that we can effectively plan and manage the project.
From our auditor's point of view (and I don't claim to know this inside out: this will be mostly conjecture), there are two main categories of document:
  1. The description of the processes we use to design, develop and test our software.
  2. The instances of various documents for specific projects that illustrate that project's state and history.
The former set of documents is in a much slower state of flux, but it is still in motion (if we practice Kaizen, that is). Let's leave that set of documents out of this discussion and come back to it some other time (although I realise that a process audit is going to look at both sets and ensure that the project documents are a "product of" the process specification, rather than some ad-hoc jumble that bears no resemblance to your promised quality-controlled production system).

The second set of documents tries to capture a living process (which is why it's so hard, and why Agile suggests not wasting effort creating documents that are immediately out of date!), and they overlap considerably with our requirements as software engineers, test engineers, project managers and quality managers. So, to ensure we only do what's necessary, it's sensible to figure out the minimum requirements to pass the audit: if a set of index cards with user stories on them is satisfactory, use it. If your spreadsheet of test cases will do, stick with it. Taken lots of pictures of whiteboards? Check whether that's OK to archive, and do no more than is necessary. There is rarely a genuine need for a part of a quality process to be captured in a full-blown, expensive-to-produce-and-maintain text document, so avoid them where you can.
The big idea for me here is that we're not actually interested in documents per se, but rather in an information repository.
Many of the artifacts I referred to in the previous paragraph are great for capturing a moving target: spreadsheets are meant to be filled out, modified, filtered and transformed. Taking a picture of a whiteboard captures an extremely accurate record, not prone to the scribe's opinion. Index cards are perfect for user stories: they can easily be marked with start/end dates, developer and tester, and moved around to illustrate their current place in the process; many software versions of this popular system now exist. There are two types of information being captured here:
  • The current state of affairs: the current set of features and their acceptance tests; development task breakdown and costings for each; un-started, in progress and "code complete" tasks; the current set of QA tests and which have passed, failed or not been run; the current prioritized bug list.
  • The history of the project: when features were added or removed; re-estimations; when coding tasks were started and finished; particular obstacles and delays; tests run for older builds; time spent bug fixing; records of meetings (minutes, photos, archived flip charts).
The current state of affairs is likely the information we'll use to plan and manage our projects, whereas the history is very much an audit tool. The two are not at odds though: it's easy to allow one to transform into the other or to have the history generated simply as backups or deltas to the current documents. Something I've come to realise more and more of late is that recording the history of your projects is not just a necessary evil to satisfy your auditor, but it is a vital tool in estimating and planning future projects.
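The idea that the history can be generated simply as deltas to the current documents can be sketched in a few lines. This is a hypothetical illustration, not a real tool: the `ProjectLog` class and its field names are invented for the example, but it shows how recording each change before applying it gives you both the planning view (current state) and the audit trail (history) from one update operation.

```python
from datetime import datetime, timezone


class ProjectLog:
    """Toy information repository: current state plus an audit trail of deltas."""

    def __init__(self):
        self.current = {}   # e.g. {"feature-12": "in progress"}
        self.history = []   # append-only list of change records

    def update(self, key, value):
        # Record the delta *before* applying it, so any past state
        # can be reconstructed for an auditor or for re-estimation.
        self.history.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "key": key,
            "old": self.current.get(key),
            "new": value,
        })
        self.current[key] = value


log = ProjectLog()
log.update("feature-12", "un-started")
log.update("feature-12", "in progress")
print(log.current["feature-12"])   # the planning view: "in progress"
print(len(log.history))            # the audit view: 2 recorded deltas
```

In practice the same effect can be had even more cheaply by keeping dated backups of your spreadsheets or card photos; the point is only that history falls out of routine record-keeping rather than being a separate documentation task.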

So, with a good idea of the kinds of information we need to capture and the methods we want to use to manage it, it's not very hard to envision a system that contains all of the information and has different views upon it to allow interactions in the ways required. In fact this is probably yesterday's news, as tools like XPlanner and Version One provide a significant portion of this functionality. The things I'd say are important to capture are:
  • Vision: a paragraph of text; perhaps a few photos of your "vision box".
  • Requirements and design activities: the current feature set, together with user acceptance tests; design meeting records and discussion threads; mock-ups. Include all changes to this, including features that are removed.
  • Estimation: coarse preliminary estimates; any subsequent detailed breakdowns and estimates; the current task breakdown and estimates. Again, include all changes made, including re-estimation.
  • Coding Activity: the current task backlog; current tasks in progress, start/end dates and revision numbers; assigned developers. Record all true effort expended on these tasks, as this gives you the true picture of progress required for ongoing estimation and future project estimation.
  • Testing/QA Activity: the current set of QA tests; features in testing with start/end dates; assigned testers; bugs created, addressed and resolved. Recording both the testing and re-development effort is again a vital activity.
  • Project Management Activity: strategic planning; project meeting records; project progress metrics; project resources; risk assessments. Again, changes to the information should be recorded and it is very important that the nature of strategic decisions can be understood at a later date from an audit perspective.
Imagine all of this data in a flat database, for simplicity's sake, say a single XML file. Now could someone write some XSLT and a bit of CSS to create a very nice looking specification based on the features and their acceptance tests? They most definitely could. The same to give a dashboard overview of development and testing progress? Easy too. The view for a group of developers and tasks, including some schedule predictions? You get my point...
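To make the "different views on one repository" point concrete, here is a minimal sketch in Python rather than XSLT (the XML schema is invented purely for illustration): the same small XML fragment yields both a specification view and a dashboard view, with no hand-maintained document in sight.

```python
import xml.etree.ElementTree as ET

# A tiny, invented schema for illustration only.
PROJECT_XML = """
<project>
  <feature id="F1" status="code complete">
    <title>User login</title>
    <test>Valid credentials grant access</test>
  </feature>
  <feature id="F2" status="in progress">
    <title>Password reset</title>
    <test>Reset email arrives within a minute</test>
  </feature>
</project>
"""

root = ET.fromstring(PROJECT_XML)

# View 1: a specification listing features and their acceptance tests.
for feature in root.findall("feature"):
    print(f"{feature.get('id')}: {feature.findtext('title')}")
    for test in feature.findall("test"):
        print(f"  - {test.text}")

# View 2: a dashboard summary derived from exactly the same data.
features = root.findall("feature")
done = [f for f in features if f.get("status") == "code complete"]
print(f"Progress: {len(done)}/{len(features)} features code complete")
```

An XSLT stylesheet plus CSS would do the same job and render straight in a browser; the mechanism doesn't matter, only that every "document" is a disposable projection of the repository.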

So, capturing the data is the key thing, not the creation of "documents": just make sure the data gets recorded and come up with some way of transforming it into media to suit the various requirements of your internal stakeholders and your auditors.

Well, I've rambled on for a very long time, so I'll leave it at that for now. I'll probably come back to the nature of the process documentation at a later date.
