Since posting The Great Agile Hardware Myth last week, I tried to think of some obvious myth that exists in the mainstream; some claim we’ve all made that, without fail, turns out to be absolutely and entirely false. It took a while, but I think I found it. We could have called it The RTL Done Myth, but I chose to call it The 90% Done Myth.
“We’re 90% done. If everything goes well, all that’s left is testing and debugging this last 10%.”

The Gantt chart that says we’re 90% done is lying to us. We know it’s lying to us because it’s lied to us before, many times. We’d rather not use this Gantt chart, but because tracking development progress is so important and we haven’t figured out a better way to do it, we feel like we have no option but to keep trying to believe it.
(NOTE: I think of the 90% Done Myth as being most familiar to RTL and verification engineers only because of my experience. If that’s not you, it may still apply but you’ll have to use your imagination.)
The 90% Done Myth starts with design and verification engineers building their list of tasks, adding them to the project schedule, then working to cross things off the list. For the first few months everything always looks pretty good on the Gantt chart because we’re writing lots of documents and code and getting lots of stuff done.
- Build a schedule… done
- Write a design document… done
- Write a testplan… done
- Review some documents… done
- Write <these> RTL blocks… done
- Write <those> testbench components… done
- Integrate <these> with <those>… done
- Write <some script> for running tests… done
- Testing… in progress
Where code is concerned, “done” usually means done except for most of the testing. There are also a few little things here and there that don’t show up anywhere, but that’s just easy stuff. Get the code done and the Gantt chart expects everything from there to go smoothly.
Secretly we know better, though it’s always hard to lead with the worst-case scenario (a.k.a. the likely scenario): that testing is going to take longer than we first suggested. Sooner or later everyone finds out anyway, when we put the design and testbench together only to find the quality of the code we’ve written is terribly poor. From there, we struggle for months, fixing each bug, hoping it’s the last, only to bump into the next. Going from bug to bug to bug, we never really know where we are, but we always feel like this next bug could be the last. In our status reports we pressure ourselves into suggesting we’re close to being done. Of course that’s not the case until the very end, so we sit in status meetings trying to explain, for the 8th week in a row, why this time we’re finally, definitely 90% done.
Busting the 90% Done Myth for Developers
Trying to explain why you’re still 90% done for the 8th week in a row isn’t fun. You’re stressed and you start doubting yourself but you also still feel like you’re almost there… which is why it’s so hard to break the cycle. It’s time to trust your experience. You’re not almost there. Here are some ideas for avoiding the stress and uncertainty of 90% done.
- Resist the practice of task-based planning and reporting: planning and reporting tasks is what gets us into this mess, so a good place to start is to avoid tasks altogether. Documents don’t count as progress; neither does code written. Without some way to measure the quality of what we’re producing, there’s too much ambiguity in task-based development.
- Use feature-based development: instead of completing tasks, build features. Prioritize them in terms of importance and complete them one at a time.
- Measure progress with passing tests: An objective way to measure progress is to stamp features as DONE when they are documented, completely coded, exhaustively tested, and you’re confident there’ll be no need to come back to them. “Done except for…” is not done.
- Deliver passing tests as soon as possible: To say managers will be nervous about dropping the task-based progress metrics they’ve quite likely used their entire careers is an understatement. The only way I’ve found to calm nerves and build trust in objective progress metrics is to deliver progress as soon as possible, then keep delivering. To be clear, “as soon as possible” is not months from now, nor is it weeks. We’re talking days. Deliver progress before anyone expects it and your boss will see you’re on the right path.
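To make the “DONE means all tests pass” idea concrete, here’s a minimal sketch in Python. The feature names and the way results arrive are hypothetical; in a real flow they’d be parsed from regression logs or a results database.

```python
# Hypothetical feature -> test-result mapping; in practice these results
# would come from a regression run (e.g. parsed simulator logs).
feature_tests = {
    "pkt_rx_basic":  [True, True, True],
    "pkt_rx_errors": [True, False],      # one failing test
    "dma_burst":     [],                 # no tests written yet
}

def feature_status(results):
    """A feature counts as DONE only when it has tests and all of them pass."""
    return "DONE" if results and all(results) else "IN PROGRESS"

for name, results in feature_tests.items():
    print(f"{name:15s} {feature_status(results)}")
```

Note the empty list: a feature with no tests is IN PROGRESS, not DONE. “Done except for testing” never shows up as progress.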
Busting the 90% Done Myth for Managers
You don’t want to see it again… the rosy picture painted by early progress reports transitioning seamlessly into a perpetual cycle of confusing status updates and weekly extensions (i.e. “We’re still 90% done. We just need to <blah>. We should be done next week.”). From there, you’re in panic mode, scrounging for the people, tools and licenses that’ll help get things back on track with a limited budget. Here are some tips that can help you avoid getting stuck at 90% done next time around.
- Insist on objective feature-based metrics: Ask questions like “has this code/feature been exhaustively tested?” or “if we stopped today could we ship this feature?”. Objective feature-based development will help you avoid the ambiguity that comes with task-based estimates and reporting.
- Limit work-in-progress: It’s easy for people to be overwhelmed by long SoC requirements documents. Limiting work-in-progress will condense the problem space in play at a given time and help your team keep its focus. Encourage development in small batches that take a couple weeks to a month or two to complete.
- Observe progress, don’t plan it: Using objective feature-based progress metrics and observing the rate at which your team is coding and testing features will help you build evidence-based delivery estimates. They’ll help you predict major delays, build contingency plans and prep upper management long before the usual panic sets in. An example…
- We’ve got 20 features on our list and the team has only tested (finished) 3 so far. Even though they say they’ll be done on time, if we continue at this pace it looks like we’re going to be late by a couple months. It’s still early so now would be a good time to ask for more people or propose we cut scope to meet our schedule.
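The back-of-the-envelope math in that example can be sketched as a tiny script. The dates, deadline and feature counts below are hypothetical, chosen only to illustrate the extrapolation from observed completion rate.

```python
from datetime import date, timedelta

def projected_finish(total, done, start, today):
    """Extrapolate a finish date from the observed rate of DONE features."""
    rate = done / (today - start).days          # features per day, observed so far
    remaining = total - done
    return today + timedelta(days=round(remaining / rate))

# Hypothetical numbers: 3 of 20 features DONE after 3 weeks,
# against a 12-week schedule.
start = date(2024, 1, 1)
today = date(2024, 1, 22)
deadline = start + timedelta(weeks=12)

finish = projected_finish(20, 3, start, today)
slip_weeks = (finish - deadline).days / 7      # how late the extrapolation says we are
print(f"projected finish: {finish}, slip: {slip_weeks:.0f} weeks")
```

With these numbers the team is on pace to land roughly two months past the deadline, and you know it three weeks in: early enough to add people or cut scope instead of panicking later.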
I’m guessing we’ve all been under the influence of The 90% Done Myth at least once in our careers. It’s fueled a lot of last-minute blazes that make signing off an SoC or ASIC or FPGA or IP or whatever you’re building so stressful. Feature-based development can help bust through The 90% Done Myth and alleviate that stress next time around.