This week, I am expanding on five lessons I've learned while designing and consulting on content-related workflows. The first of these: "Data collection and process checking should be used to identify and resolve problems, not find a function or person to blame."
When errors or problems occur, we have a natural inclination to avoid data collection and substitute personal or departmental belief for evidence. Although this behavior is not unique to publishing, it can be more persistent there, in part because we're trained to believe that each issue, each book, or each content project is in some way unique.
Somewhat perplexingly, many data-gathering efforts are truncated by approaches that combine analysis with the presumed answer to the problem (e.g., "We need to show production that they are the reason we are shipping books late"). Even when these efforts gather useful information, other departments can often see the bias, which reduces buy-in.
Workflow improvements take hold when the relevant departments or functions first agree on a clear, ideally simple statement of the problem they want to solve. From there, they can extend that statement and identify the data that would help them understand and assess the causes of the problem.
Systemic problems (the ones most worth solving) are not a function of poor performers. In the vast majority of cases, errors and delays occur because workflows are poorly designed. Even when individual or departmental skills are at issue, they are often the residue of inadequate planning.
While this approach represents an ideal, it's not idealistic. Data that can be shared across a set of functions is foundational: it builds both understanding and the capacity for sustained improvement.
Tomorrow, I'll address the second consideration, "Quality is best supported by processes with fewer handoffs (and over time, fewer inspections to maintain quality)."