If you’re an up-and-coming product manager, you’ve probably had discussions about the definition of done, one of the more useful concepts to come out of agile methodologies. It’s nothing fancy. It’s really just a checklist of items, agreed upon by the product and engineering teams (as well as other departments like sales, marketing, and devops), that must be complete before you can consider something done.
This could look something like:
- at least 2 code reviews
- unit / UI / integration / performance tests have run and passed in all lower environments
- passed accessibility requirements
- meets acceptance criteria
- end-user docs updated
- API technical docs updated
- go-to-market teams notified and activations scheduled
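A checklist like this works best when it lives as data, not tribal knowledge. Here’s a minimal sketch of what that could look like in Python — the names and structure are purely illustrative, not any real tool’s API:

```python
# Illustrative sketch: a definition of done as a plain checklist,
# plus a helper that reports which criteria are still open.

DEFINITION_OF_DONE = [
    "at least 2 code reviews",
    "unit / UI / integration / performance tests passed in lower environments",
    "accessibility requirements passed",
    "acceptance criteria met",
    "end-user docs updated",
    "API technical docs updated",
    "go-to-market teams notified and activations scheduled",
]

def is_done(completed: set) -> tuple:
    """Return (done?, gaps): done only if every criterion is checked off."""
    gaps = [item for item in DEFINITION_OF_DONE if item not in completed]
    return (len(gaps) == 0, gaps)

# A backlog item mid-flight: two boxes ticked, five still open.
done, gaps = is_done({
    "at least 2 code reviews",
    "acceptance criteria met",
})
```

The point of writing it down this explicitly is the `gaps` list: when something slips through, you can point at exactly which line was missing — or, as in the story below, which line didn’t exist yet.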
A definition of done helps with planning, reduces arguments about backlog items, and ensures that teams not working directly on an item know when it’s their turn to spring into action.
In many cases, a fully resolved definition of done means we get to ship our code to production! Our clients will fist-pump, rejoice and immediately start jumping on the new feature! Our prospects will stop evaluating competitors and sign their orders!
The problem is, each of these checklist items on our definition of done has no definition of done of its own. We get it twisted: because we put checkmarks in all the boxes, we assume we’re really going to nail this for our customers. Your definition of done is not about that.
There’s so much risk in the software development process, especially if you have thousands of users. If you blindside users with a workflow change without proper docs or communication, they’ll be pissed. If you skip performance testing and introduce a slow blocking query, you’ll create a bad day for your company and your customers alike.
You’ll still fuck up and miss things. That’s a fact. But at least you can point to the definition of done and know where the gaps are and how to improve. You can reduce internal conflicts just by saying, ‘You know our definition of done that we all agreed upon? Turns out, it wasn’t done. There were gaps. We’re fixing it. Here’s what we’re adding.’
Like all things in the world of technology, everything is a work in progress.
Here’s a good example from a previous product I worked on.
We had introduced some new product onboarding walkthroughs. These had passed all definition of done criteria. They’d been rigorously tested. Everyone knew they were launching. Marketing, sales, support — everyone. Except implementation. Somehow, they weren’t explicitly told. Not that big a deal, we thought; these walkthroughs were designed to make their lives easier! Except, unbeknownst to the product team, they used browser automation to configure new client accounts. The new onboarding guides were disrupting this automation without anyone’s knowledge, leading to many false starts on client activation calls.
When our implementation team leaders came barging in, we could point to the definition of done, show where the gap was, and discuss the plan to sharpen it. In this case it was as simple as adding a criterion: notify the implementation team and explicitly log their acknowledgement. We dramatically reduced the time spent arguing and got straight into solutioning.
Even then, your “done” feature is probably 20% done according to your users. This is completely OK. You may have done tremendous due diligence during product discovery, but you never really understand the problem you’re solving until release date, so build into your roadmap some extra effort for v1.1, v2.0, and vNext of the feature. Now that you’ve announced this new feature, you’ll finally get truthful feedback and can move forward with refining. This is the premise behind MVPs to some extent, and it’s a meaty topic worthy of its own deep dive in a future post.