Effectiveness of a real product stream

1st March 2010
Simon Baker

I’ve pulled together some data for the first year of a product stream we created and plotted it as charts for throughput, rework and effectiveness.

The first chart shows the weekly rework. I’ve talked about rework previously so I won’t cover it again here. The blue line indicates the remaining technical debt and the blue bars the repaid technical debt; the pink line indicates the remaining defects and the pink bars the fixed defects. Week by week, defects were fixed as soon as they were discovered, keeping the remaining defect count at zero, and the technical team continuously repaid technical debt to keep the remaining amount of rework small.


The second chart shows the weekly throughput.


The peak at week ending 18/12, with no throughput during the 8 preceding weeks, shows a flush of inventory amounting to 104 cards and corresponds to the alpha release of the product. In the rework chart, a small increase in fixed defects can be observed during the same week. Inventory builds up again for two weeks, while improvements are made to the automated deployment system, until the next peak at week ending 15/01, at which point 48 completed cards are flushed. Releases then occurred every week, and while some variation is visible, the throughput remained stable.

To improve the performance of the product for the beta release during week ending 26/02, 7 technical debt cards were completed. As the system experienced more rigorous use by editorial users, 12 defects were fixed, plus a further 8 the following week due to increased traffic. The official launch was completed in the week ending 18/03. During that week some data inconsistencies were encountered following migration from the old content management system, resulting in 9 defects. In response to traffic loads, the load balancers were tuned with 5 technical debt cards. This effort continued the following week with a further 8 technical debt cards and 7 fixed defects as traffic increased to approximately 180 million page impressions and 3.7 million unique users per month, with an average page weight of 2 MB. Further peaks in technical debt of 20, 16 and 10 cards can be seen during the weeks ending 06/05, 17/06 and 24/06, respectively. This work concentrated on expanding the product, reconfiguring the production environment to support additional channels.

It’s worth recapping effectiveness. Effectiveness measures the product stream’s ability to sustain throughput and minimize failure demand, which frees capacity to focus on meeting value demand. It was inspired by the First-Time-Through (FTT) measurement used in Lean manufacturing, which measures the effectiveness of a cell’s standardized work as the percentage of product made without any need for rework or scrap.

The effectiveness of the product stream is defined as:

Effectiveness = ( Throughput – Rework ) / Throughput


Throughput = number of cards released to production (excluding completed rework)


Rework = the number of technical debt and defect cards in inventory and work-in-process
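To make the calculation concrete, here is a minimal Python sketch of the formula above. The weekly figures in the example are hypothetical, chosen only to illustrate the arithmetic; they are not taken from the charts:

```python
def effectiveness(throughput, rework):
    """Effectiveness = (Throughput - Rework) / Throughput.

    throughput: cards released to production (excluding completed rework)
    rework: technical debt and defect cards in inventory and work-in-process
    """
    if throughput == 0:
        return 0.0  # no releases this week, so nothing to measure
    return (throughput - rework) / throughput

# Hypothetical weekly (throughput, rework) pairs -- illustrative only.
weeks = [(104, 6), (48, 4), (20, 12)]
for throughput, rework in weeks:
    print(f"{effectiveness(throughput, rework):.0%}")  # e.g. 94%, 92%, 40%
```

Note that a week with high throughput can still score poorly if the outstanding rework is large relative to it, which is exactly the behaviour the final chart exposes.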

The final chart shows the weekly effectiveness of the product stream.


The lows at weeks ending 29/01, 25/03 and 01/04 can be attributed to marked dips in throughput: at 29/01, 12 cards were queued as inventory, while at 25/03 and 01/04 there was a small increase in the remaining rework. Clearly, the product stream is most effective when the completed rework is small relative to throughput, yet sufficient to keep the remaining rework small as well.
