Monday, October 30, 2017

“Perfect vs. Good Enough” – Writing Quality in the Online Age - Part 2

This is part 2 of a three-part post examining the issue of “perfection” in content creation in the online age. 

The first part, which I posted on October 10, is a column I wrote in 2001 discussing an event from 1998. (Stay with me here...)

This second part is a column that I wrote in 2009 discussing what had changed since the first column in 2001. Look for the third part, in late November, to revisit the issue of “perfection” in light of emerging trends in 2017.

In this post, I’ll list the core points of the Wired article and some ideas about their impact on technical communication. First, the Wired article…

“…It’s … the latest triumph of what might be called Good Enough tech. Cheap, fast, simple tools are suddenly everywhere. We get our breaking news from blogs, we make spotty long-distance calls on Skype, we watch videos on small computers… The low end has never been riding higher.

So what happened? … technology happened. The world has sped up, become more connected, and … busier. As a result, what consumers want from the products and services they buy is fundamentally changing. We now favor flexibility over high fidelity, convenience over features, quick and dirty over polished. Having it here and now is more important than having it perfect. These changes run so deep and wide, they’re actually altering what we mean when we describe a product as “high quality.”

And it’s… everywhere. As more sectors connect to the digital world… they too are seeing the rise of Good Enough tools … Suddenly, what seemed perfect is anything but, and products that appear mediocre at first glance are often the perfect fit.”

Two examples from the article…

·         MP3, whose audio quality is lower than the CD standard but whose greater file compression lets us cram hundreds of songs into devices the size of a pack of cards.

·         The netbook, with minimal storage and power but which is light, portable, and cheap compared to traditional laptops that have more features, most of which may go totally unused.

The examples offer “flexibility over high fidelity, convenience over features, quick and dirty over polished”, and each has altered its market or created new markets. How might these factors affect technical communication? Here are three ideas – none new but, based on my training and consulting experience, worth repeating:

·         A major change since 2001 is the appearance and partial acceptance of user-generated content for online use. “Let the engineers write the doc” has been a laugh-getter for years within technical communication but the idea keeps coming up for one good reason – the engineers (the subject matter experts) know the material. And their content has now been appearing for years in blogs, wikis, and tweets.

I don’t see user-generated content replacing traditional online documentation/help but extending it. The documentation/help will still contain stable core content but link to user-generated content in blogs or wikis containing new, changeable content. Technical communicators and user-authors form a virtual team. If you create online documentation/help but don’t link it to your company blogs or wikis, take another look.

Similarly, video and animation have been around for years but were rarely used because of the costs and skills required. Lower prices and simpler tools are now putting video and animation into more hands, including users’. The result may be “movies” created quickly using tools like Adobe Captivate, TechSmith Camtasia, or MadCap Flare, or clips from video bloggers. (YouTube may also be a source. You may not find the perfect video there, but there are so many clips on almost any topic that you may find one that’s good enough. The volume of clips provides flexibility, and the material is available quickly, even if the production values may be “dirty”.)

So rather than discount the idea of user-generated content, we should be actively helping to create, organize, use, and distribute it in the first place.

·         Software-driven writing features like templates and style sheets have existed for years but are still not used as often as they should be. One reason is that the settings in these control files are often not quite “right.” Something in your material deviates from a setting in the control files. You could modify the setting, but it’s often easier to set up the non-standard material by hand. The result? Perfect content, but at the expense of the consistency and automation that the control files provide.

Instead, consider setting up your control files to handle your common needs, and ignore or modify other needs that are too difficult or marginal to handle in the control file. For example, you might want a “first-paragraph” style with extra space above for use in hard-copy, but could you use the “body” style instead and live with the “good-enough” result?

So the results may lack the perfection that you got by hand-tweaking the material, but you get the good enough, quick-and-dirty convenience of bringing programmatic control to your writing tasks. (Wired made an interesting point about users coming to accept MP3 quality as the standard rather than the higher quality of CD because they used MP3 more and got used to it. As more and more readers get material online, they may come to accept online style quality as the standard.)
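Put concretely, the “first-paragraph” tradeoff above might look like this in a stylesheet (the selectors and values here are hypothetical, for illustration only):

```css
/* "Perfect": a dedicated first-paragraph style with extra space above,
   applied by hand wherever a section starts in the hard-copy output. */
p.first-paragraph {
    margin-top: 24pt;
}

/* "Good enough": every paragraph uses the one body style. Hard-copy
   spacing is slightly less polished, but there is one less style to
   maintain, apply, and keep consistent. */
p {
    margin-top: 12pt;
}
```

Dropping the extra style is exactly the kind of measured, deliberate deviation from “perfect” that a control file can then enforce automatically.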

·         Finally, consider lowering your writing standards – not to write badly but to change the definition of quality, standardize that definition, and write to it.


Many technical communicators started in hard-copy and moved to online, a transition that involved some hard changes, including:

·         Adding tasks once performed by other people, like editors, to the writer’s workload.

·         The speedup of the work, losing the time we might once have had to get material “perfect.”

·         The appearance of media like blogs and wikis whose need for immediacy runs counter to the idea of perfecting the writing.

Since the old column appeared, I’ve seen more and more technical communicators accept the idea of “good enough.” But many still fight it, which is a losing battle. The field has seen many changes, each fought but not stopped. This is one more. If we fight it, the change will occur but without us. That would be a shame because these “good enough” technologies and methodologies are actually fun and highly challenging.

Wednesday, October 25, 2017

Correction To My Post About Flare 2017 R3

In my post about Flare 2017 R3 on Oct. 24, I said:

"If you copy the content in this topic, paste it into Word, and generate the readability statistics in Word, you’ll get different results. When I tried it, Word gave a readability of 65.1 and a grade level of 7.0, both still excellent but different from Flare... This may be caused by Flare’s using the same “Flesch Reading Ease” and “Flesch-Kincaid Grade Level” algorithms as Word but with different options enabled."

As it turns out, the problem is not one of having different options enabled in Flare vs. Word but rather the fact that Flare and Word interpret what a "sentence" is somewhat differently. So the feature is still as useful as I said it was in yesterday's post, but the Flare and Word results are not directly comparable.
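To see why sentence counting matters, here’s a sketch of the standard Flesch formulas in Python (my choice of language; the syllable counter is a crude heuristic, and the actual parsing rules in Flare and Word are not public as far as I know). The same text scores differently as soon as the sentence-boundary rule changes:

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels;
    # every word gets at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text, sentence_enders=r"[.!?]+"):
    """Return (reading ease, grade level), splitting sentences on
    the sentence_enders pattern. Changing that rule changes both scores."""
    sentences = [s for s in re.split(sentence_enders, text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return ease, grade

sample = ("Click Save. The file is stored. "
          "Note: a colon may or may not end a sentence. Tools differ.")

# Counting only . ! ? as sentence enders vs. also counting the colon
# changes the sentence count, and with it both scores.
print(flesch_scores(sample))
print(flesch_scores(sample, r"[.!?:]+"))
```

Adding the colon as a sentence ender changes the sentence count, which shifts both scores even though the text is identical – the same kind of divergence seen between the Flare and Word results.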

Tuesday, October 24, 2017

Some New Features in MadCap Flare 2017 R3

MadCap released Flare 2017 R3 a few days ago. In this post, I’ll look at two of the new features that I think are most useful.

Text Analysis

In the past, one problem that I had while writing topics was that I couldn’t determine the readability of the topic content. Flare didn’t offer a readability checker. I had to output a Word target and run that through Word’s readability checker. This process worked but was a bit clumsy. The new text analysis feature seems to offer a simple solution to that problem.

Selecting Text Analysis on the Tools ribbon opens the Text Analysis pane, shown below.

I selected the readability scores option for one topic (from the basic training class), shown below, with the results also shown below.

Flare shows good results with a green bar color, fair with yellow, and poor with red. So this topic has a fair reading ease score of 76 and a good grade level score of 3.9. (Both actually excellent.) I can check any content from one topic to an entire project.

Be aware of one thing when using this feature. If you copy the content in this topic, paste it into Word, and generate the readability statistics in Word, you’ll get different results. When I tried it, Word gave a readability of 65.1 and a grade level of 7.0, both still excellent but different from Flare’s results. This may be caused by Flare’s using the same “Flesch Reading Ease” and “Flesch-Kincaid Grade Level” algorithms as Word but with different options enabled. (We can’t yet modify those options in Flare.) This peculiarity aside, I’m delighted to see the text analysis feature because it simplifies my Flare workflow.

Style Inspector

The Style Inspector is a shortcut for performing stylesheet tasks without opening the full Stylesheet Editor. It lets you see what styles your text is using, modify a style’s properties, add new properties to a style, add a comment to a style, and even convert local formatting to a style on the stylesheet.

Selecting Formatting Window in the Styles group on the Home ribbon opens the Formatting pane with the Style Inspector tab selected, as shown below.

In this example, I put the cursor on the topic’s title in the left pane and:

  • The Style Inspector on the right tells me that the title uses h1, that its font-size is 140%, and so on.
  • There are no local style attributes, as indicated by the empty pane at the top.
  • I can add local formatting by clicking the + sign in the top pane, or add another property by clicking the + sign in the lower pane.
  • I can change the value of a property by clicking the ellipsis to the right of that property.
  • I can see which stylesheet controls this style, here “ipswitch_styles.css”, and see the path to that stylesheet by hovering over its name.
  • I can add a comment to a style by clicking any property of the style, then right-clicking the style itself and selecting Add Comment. (Clicking one of the properties first is required.)

All without having to open the Stylesheet Editor. (However, the stylesheet does open in the Stylesheet Editor if you add a property or change a property’s value, since you’ll have to save the stylesheet to register the change.)


I like how the Style Inspector makes it easy to manage my style usage. Personally, I still prefer to go into the full Stylesheet Editor, but using the Style Inspector means I don’t have to. That’s useful if you’re new to styles and find the Stylesheet Editor overwhelming.

I especially like the text analysis feature because it’s completely new and solves a problem – the inability to get readability statistics in previous versions of Flare.

Between these two features, plus Microsoft Excel file import, last action repeat, a thesaurus, and some snazzy new templates, Flare 2017 R3 is a solid and useful release.

Tuesday, October 10, 2017

“Perfect vs. Good Enough” – Writing Quality in the Online Age

Part 1

In August 2009, Wired Magazine published an article titled “The Good Enough Revolution: When Cheap and Simple is Just Fine” by the Wired staff in the Gear column (8/24/09). Its theme – “cheap and simple beats perfect almost every time.” That article reminded me of a column I wrote in 2001 (“‘Perfect vs. Good Enough’ – Writing Quality in the Online Age”) that discussed why technical communicators needed to change our definition of quality for the dot-com era.

On rereading, the 2001 column still seemed relevant. So this column, part 1, presents the core points from that 2001 column. Part 2 will revisit the 2009 column to present the core points of the Wired article and how they might apply to technical communication. Part 3 will revisit the issue of quality in the emerging age of taxonomies and semantic markup.

First, the old 2001 column, with my comments in italics.


I typically get one or two calls per week from prospective clients or people looking for writers with certain skills. Three years ago (1998), I got a call from a dot-com looking for a “content provider.” It was the first time I’d ever heard that title so I laughed and said “So you’re looking for a writer?” and was taken aback when the caller vehemently said “No! We don’t want a writer.”

I asked why. The answer – “… writers get too focused on perfection… we don’t have time for. If we wait until the material is perfect, our competitors will beat us to market. We do not need it perfect; we just need it good enough.”

I mentioned that conversation often. Two people used it as the basis for presentations in the Bleeding Edge stem at the 2001 STC annual conference – one discussing the issue from a writing perspective, the other from a tools perspective. Here, I discuss it from two other perspectives – trends and standards.


Four major trends affect the issue of writing quality:

·         Time-to-market is getting shorter.

·         Editorial positions are being cut back or eliminated in many companies.

·         Single-sourcing is becoming increasingly complex.

Single-sourcing isn’t new. If you used RoboHelp to create WinHelp and hard-copy in 1995, you were single-sourcing. But today’s single-sourcing technologies work best with rigorously structured content. We can no longer get away with “winging it”.

By supporting “good enough” as opposed to “perfect”, isn’t winging it exactly what I am calling for? But it’s not winging it if you write to a standard, just that that standard may call for “good enough.”

·         New competitors are entering our field.

Technical writing was once unglamorous and fairly low-paying. Today, companies are starting to view content – including documentation – as a strategic asset. That shift has attracted consultants looking for new business. But technical writers also want that work.

Outsourcing is a new competitor. Technical writers are upset over the perceived lower quality of outsourced material, and over lost jobs. But consider the business perspective: if outsourced material has 50% of the quality but is written at 25% of the cost, a company may decide that’s a worthwhile tradeoff.

What are the effects of these trends?

·         Shorter time-to-market means less time to write perfectly or fix stylistic inconsistencies. (Without editors, there may be no one to fix or even notice those inconsistencies.) So we need to define the material’s look and style before the project starts. We need standards and consistency at a human level.

·         Increasing single-sourcing complexity means that consistency and simplicity are key to getting our material into a form for re-use. We need standards and consistency at a structure and format level.

·         Consultants often use formal methodologies to do their work and help sell their services. We need standards at the business level.

Defining A “Perfect vs. Good Enough” Standard

Few companies have formal writing standards. Even those companies that do often don’t use them. There seem to be two reasons for this.

·         There’s a lot of creativity and subjectivity in writing, so how do you define “good”?

·         Many writers dislike tools that measure writing quality. This may be due to a reluctance to have a creative process measured by machine, bad experiences with a tool, or antipathy toward a tool’s vendor.

But setting documentation standards can let us do two things:

·         Determine how to change our processes to compete with the new entrants in the “content” field and participate in emerging markets and niches.

·         Define measurable standards to help justify why technical writers should do the work, or at least participate in it.

These standards should do three things:

·         Establish a baseline. What is “perfect”?

·         Define acceptable and measurable deviations from the baseline. Formalizing such deviations – a maximum acceptable percentage of passive voice, for example – will help improve consistency.

·         List and describe tools, especially third-party tools, that let us measure the baseline and deviations.
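As a sketch of how one such deviation could be measured (Python is my choice here; the be-verb heuristic is deliberately crude and purely illustrative, not how any shipping checker works):

```python
import re

# The share of sentences that look passive: a form of "to be" followed
# within two words by something that looks like a past participle. A
# real checker is far more careful; the point is only that a deviation
# from the baseline can be expressed as a number.
BE_FORMS = {"is", "are", "was", "were", "be", "been", "being"}

def passive_ratio(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    passive = 0
    for sentence in sentences:
        words = re.findall(r"[a-z']+", sentence.lower())
        for i, word in enumerate(words):
            if word in BE_FORMS and any(
                    w.endswith(("ed", "en")) for w in words[i + 1:i + 3]):
                passive += 1
                break
    return passive / len(sentences)

text = ("The file was saved by the writer. The writer saved the file. "
        "Errors were flagged during review.")
print(f"{passive_ratio(text):.0%} passive")
```

A style guide could then state the baseline (0% passive) and the acceptable deviation (say, no more than 25%) and check it automatically on every build.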

These standards could be created in two ways.

·         Each company defines its own baseline, deviations, and tools as part of its style guide. However, many companies don’t have the time to do this.

·         An organization, such as the STC, could define a “perfect” baseline standard and make it available to members to use as is or to define their own deviations.


Because of the nature of writing, our profession has always accepted a subjective definition of quality. But changes in the market and technologies are starting to undermine that viewpoint. We’re going to have to confront this issue at some point. Now would be a good time, while we have time to do so thoughtfully and deliberately.

The old column ended here. In part 2, I’ll look at the core points of the Wired article and some ideas about their impact on technical communication.