Wednesday, September 19, 2018

Is Single-Sourcing Dead or Alive – the Debate Continues


I recently wrote a blog post called “Is Single-Sourcing Dead” on the Hyper/Word Services blog (http://hyperword.blogspot.com/) in response to a blog post by Mark Baker called “Time to move to multi-sourcing” (https://everypageispageone.com/2018/04/06/time-to-move-to-multi-sourcing/). Baker responded with a post at https://everypageispageone.com/2018/09/10/is-single-sourcing-dead/.

In this post, I’ll respond to what I think are Baker’s major points. (This debate-by-blog can only go on so long before it overwhelms both of us, so I’m going to propose a live discussion between Baker and me at the STC Spectrum 2019 conference in Rochester, NY. We’ll see where that idea goes.)

Note: My most often-used tool these days is MadCap Flare and many of my answers will be from that perspective. However, I suspect that many of my answers apply to other authoring tools as well.

In some cases, Baker and I are, as he put it at one point, “in violent agreement”. Here’s where I think we disagree.

First, in the big picture – Baker notes that there are problems with the current model of single-sourcing and suggests various alternatives. I agree that there are problems but I think they have straightforward solutions – not necessarily simple ones but straightforward ones. I also think that today’s single-sourcing tools have so much untapped power that it would be a mistake to discard them too early.

Now, more specifically, with Baker’s points quoted before each of my responses.

Re the single source format/single repository issue:

That single source format/single repository model has several significant disadvantages, however. I outlined those in my original post on the subject. But since the single format/repository model was used in part to enable multi-format delivery and content reuse, does that mean that those things are dead if we move away from the single format/repository model?

In a word, no, since they can manifestly be done independent of it. But we have to think seriously about how we do them if we don’t have a single source format and a single repository. Going back to everyone using their own independent desktop tools and storing the files on their own hard drives has all sorts of well documented disadvantages, and is a non-starter if and when you move to an integrated web strategy, semantic content, or Information 4.0. So, if the single source/single format approach isn’t the right one either, we have a legitimate question about how we do multi-format publishing and content reuse going forward.

The single source format and single repository is an ideal and, like most ideals, we’ll never quite reach it. But we may not have to. Flare, and probably similar tools, let us create content in the tool but also take content created in other tools in other formats, mainly Word, and automatically import it into the tool and convert it to the tool’s format. Authors using tools like Word do have to use it minimally correctly – styles rather than local formatting for headings, for example – but that can often be handled with simple training and motivation.

Re the “appropriate tools” issue:

The solution Perlin proposes is simple: Buy the appropriate tools for everyone who needs them.
But there are a couple of problems with this, beyond the unwillingness of companies to pony up the cash. First, these tools are unfamiliar to most of the people who would be asked to use them and they are conceptually more complex than Word. That introduces a training overhead and adds complexity to the writing task every day. And if the contributors don’t use those tools full time, they will forget most of their skills they are trained in.
Giving everyone more complex tools is not really a sustainable strategy, nor is it one that is easy to sell.

My point here is not to buy everyone new, expensive, and unfamiliar tools but instead to buy whatever tool is appropriate. In many cases, authors already have the appropriate tool – Word – and just have to learn how to use it minimally correctly. In other cases, companies may have to buy real single-sourcing tools. Some companies will balk at this, saying that they already have single-sourcing tools in-house, so why buy new ones? But many of those tools were released years ago, no longer meet code standards, and may be minimally supported if at all. I’d argue that it’s a cost-saving in the long run to buy modern tools for the small number of authors in the company who need them.

Re the training issue:

Perlin argues that many current problems with single sourcing arise because writers are not properly trained to use the tools they have. The solution: more training.

I’m not arguing for more training, although that’s often helpful. Instead, I’m arguing for any training. Too often, authors are thrown into a new tool with no training, just some instructions from a former author that may not apply to the current version of the software or current output needs, and that may contain errors.

Re the inappropriate standards issue:

• Templates and embedded prompts get overwritten with content, so the structure is not retained and is not available to guide subsequent writers and editors.

Baker is right about the risk of templates and embedded prompts getting overwritten with content. But one feature of templates in modern tools is that they can be added to the tool interface for re-use. That way, creating new material does not overwrite the templates and prompts.

Re the increasing complexity issue:

Documenting all of your complexity is not a good (or simple) solution. Documenting it does not remove it from the writer’s attention. It is better than not documenting it, but not by much. The writer still has to deal with it, still has to spend time and mental energy on it, and can still make mistakes in following the complex procedures you have documented. Much of this complexity can be factored out using the right structured writing techniques.

Another area in which we agree overall but disagree on the details. Some of the complexities can indeed be factored out using structured writing but some can’t. For example, if you’ve defined fifteen different conditions, when should you use each one? What are the rules for clearly naming new topics, graphics, snippets, etc.? And so on. Documenting your projects isn’t the total answer but not documenting them is an invitation to disaster. (My book “Writing Effective Online Content Project Specifications”, available on Amazon, discusses how to document your projects and presents many unpleasant examples of what can happen when you don’t.)

And the authorial motivation issue:

• With the best will in the world, people can’t comply with a regime that benefits someone else rather than themselves unless they get clear, direct, and immediate feedback, which current tools don’t give them, because the only real feedback is the appearance of the final outputs.

Perfectly true. That’s why, when I show a client how to use some feature that supports single-sourcing, I always emphasize how it will help them. “Remember that white paper you wrote that had fifty subheads formatted using Comic Sans and how you had to change each one individually? How about if I show you how to change all fifty at once by using these things called styles.” Authors don’t always get it right but they’re interested and motivated because the solution is benefiting them and, incidentally, the larger workflow.

• Management oversight can’t ensure compliance in the production phase of a process if it can only perceive compliance in the finished product. Assessing the finished product every time is too time consuming and error prone (you can miss things). And the effectiveness of management oversight decreases exponentially the longer the delay between when the writer writes and when the manager finds the defect.

Also perfectly true. But problems in the finished product can usually be traced back to and solved in the production phase. We’ll never solve all the compliance problems but we can solve a lot of the major ones. In other words, this is a QA problem.

Turning to the broader point of what can take us beyond single-sourcing:

In Perlin’s model, all of the complexity of making single sourcing work is pushed onto the writers. “the more that this conversion and coding can be pushed back upstream to the individual authors … the easier life will be”. Well, if all that work is pushed to the writers, it is not their lives that are being made easier, since all the work and the responsibility is being pushed onto them. If anyone’s life is being made easier, it is the tool builder’s life.

Here we disagree. I’m not saying that we should push “complexity back upstream to the writers”. I’m saying that we should push tasks that improve the workflow back upstream. For example, rather than authors using local/inline formatting in their documents, which then has to be fixed by the information architect, show authors how to use styles from a stylesheet in the first place and, as I noted above, explain how this will benefit the authors. This is a “do it right the first time” approach.
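To make that “do it right the first time” point concrete, here is a minimal sketch – plain standard-library Python, not any authoring tool’s actual import checker, and using hypothetical markup samples – of the kind of audit an information architect could run on contributed HTML to spot local formatting before it enters the workflow:

```python
# A minimal sketch (not any particular tool's API) of the kind of check an
# information architect could run on contributed HTML: flag local/inline
# formatting that should have been a stylesheet class instead.
from html.parser import HTMLParser

class LocalFormattingAudit(HTMLParser):
    """Counts inline style attributes and legacy <font> tags in a fragment."""
    def __init__(self):
        super().__init__()
        self.inline_styles = 0   # style="..." attributes
        self.font_tags = 0       # legacy <font> tags

    def handle_starttag(self, tag, attrs):
        if tag == "font":
            self.font_tags += 1
        if any(name == "style" for name, _ in attrs):
            self.inline_styles += 1

def audit(fragment: str) -> dict:
    parser = LocalFormattingAudit()
    parser.feed(fragment)
    return {"inline_styles": parser.inline_styles,
            "font_tags": parser.font_tags}

# Locally formatted markup (what "use styles" training is meant to prevent):
messy = '<p style="font-family: Comic Sans MS"><font size="5">A subhead</font></p>'
# The same content carried by a stylesheet class instead:
clean = '<p class="subhead">A subhead</p>'

print(audit(messy))  # {'inline_styles': 1, 'font_tags': 1}
print(audit(clean))  # {'inline_styles': 0, 'font_tags': 0}
```

Contributions that score zero on both counts can flow into a single-sourcing project and pick up their look entirely from the project’s stylesheet.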

Today there is a rich collection of tools and standards available (largely created to run the Web, which is to say, to build and deliver content systems). With the right roles defined and the right system design, you can construct an appropriate custom system using these components. People do it every day, and at every scale. 

Baker is perfectly right about this. But somebody has to:
  • Combine those tools into a working system.
  • Understand, promulgate, measure, and enforce those standards.
  • Define the roles.
  • Design the system.

However, each one of these tasks has problems.
  • How to combine the tools into a working system? Combined by whom? The tasks may require code-level skills.
  • Driving the standards is hard. They’re often hard to understand without training – what is the CSS standard and what version should we adopt? What is DITA? (“Darwin Information Typing Architecture” tells us nothing about what DITA actually is.) And so on.
  • Roles can certainly be redefined but doing so can be confusing or sound gratuitous. (Yesterday I was a technical communicator. Today, I’m a content engineer. What’s the difference?) Baker is right that there’s a role for content engineers or information architects but there needs to be meat behind the title.
  • Every company has a system, but it’s often one that’s grown organically over years. It may have problems that everyone knows about and knows how to work around but no one has the time or skills to fix. Designing a new system from scratch is a wonderful opportunity but it takes time.

In summary, Baker and I agree in some ways. Today’s single-sourcing works, even with its problems. It may not be robust enough to carry us into Information 4.0, but few companies in my experience need to worry about that yet. Most companies don’t have the time or resources to completely overturn their current workflows for a somewhat undefined future. That will happen, but iteratively.

Today’s single-sourcing tools have so much untapped power that abandoning them strikes me as a mistake. If people can make better use of those tools without changing the development model, that’s a simpler approach.

Monday, September 17, 2018

A Comment About Flare's Dropdown Link Type


On July 23, I wrote a post in the Hyper/Word Services blog on “A Review of MadCap Flare’s Link Types” at http://hyperword.blogspot.com/. This post was repeated in MadCap’s blog at https://www.madcapsoftware.com/blog/2018/08/30/navigation-best-practices-guide-link-types-madcap-flare/?utm_source=Newsletter&utm_medium=Email&utm_campaign=20180911Newsletter&utm_source=Newsletter&utm_medium=Email&utm_campaign=20180911SepMadCapInsider. In the post, I stated in the Dropdown Drawbacks section that there were “None, in my opinion. However, I’d be interested to hear competing opinions.”

In response, Jana Vacková of ABRA Software in Praha (Prague) wrote:

We might know about one :-) – so if you are interested in it, here might be one. (I fixed a few spelling and grammatical errors but otherwise left the response as is.)

The opened dropdown menu has problem to flow around the side menu and to use all possible width of the page. Let me elaborate it more:

• In our project (= ERP SW on line help) we use dropdowns very often – as we found it very cool :-) for making the long contents more clearly arranged.

• We use TopNavigation skin with TOC menu at the side of the screen (side menu).

• When the section (of our help) is complicated (as our SW is really huge, complex and complicated :-)) the side menu (with TOC) has a lot of items and is “long”.

• So, if the topic inside in this section has dropdowns on the top (in general placed horizontally in the area where the side menu is) and user clicks on one of these dropdowns, the dropdown opens but the width of its body is limited (thanks to side menu). And is limited up to the end of body content (although the body is much longer than the side menu itself). When the body itself is long there is a lot of unused space on the screen and user must scroll down more. If there are wide tables with many columns in such dropdown body, there is a problem :-(.

Have a look below – I mean the unused area under the side menu:

[screenshot: the opened dropdown body stops at the width of the side menu, leaving unused space below the menu]
It could be perfect if the width of dropdown-body could adapt the free space. So, at the bottom of side menu the text would start to flow round.

P.S. We have contacted MC Flare support but they haven’t advised any solution how to adapt this behavior of the width of opened dropdown body in combination with side menu.

Has anyone else encountered this problem and found a solution?



Monday, September 10, 2018

Word processing through the ages

This article was originally published in ISTC Communicator, Autumn 2018 Supplement.

Neil Perlin looks at the impact word processing has had on technical communication and his career.

In February, 1979, I was hired by a computer company called Digital Equipment Corporation to write the user manual for a general ledger accounting package. I have an MBA in accounting and operations management – mathematical process control – from Boston University, so I knew how a general ledger worked.

I wrote the manual by hand, 400 pages, using pencil and paper. We didn’t have word processors, and all typing was done on typewriters by ‘the girls’ in the typing group. (Stay with me…)

I sent the finished manual out for review. Four of my reviewers said I’d gotten it wrong – a general ledger didn’t work the way I described. What they said ran counter to what I’d learned in my MBA program but I assumed that a big computer company would have gotten a waiver on the standard. (I was very young and innocent then…) No word from the fifth reviewer.

So, I threw out the 400 hand-written pages and wrote a new 400-page manual. By hand. Pencil and paper.

When I finished, I sent it out for review. The four reviewers blessed it. However, the fifth, who had been on vacation during the first review pass, called and spent five minutes giving me an epic chewing out.

When he finished, I explained what happened. After he finally stopped laughing, he said ‘Tell me what you wrote the first time.’

I did, and he said ‘That was exactly right. The other reviewers don’t understand accounting. Go ahead and rewrite what you wrote the first time.’ Which I did. By hand. Pencil and paper.

So, I ultimately wrote 1200 pages – by hand, pencil and paper – to get 400, with the last 400 saying the same thing as the first 400.

Many technical communicators from that era have similar ludicrous stories.

The appearance of word processing changed technical communication forever. Stories like mine became things of the past. Things like ‘paste-up’ and ‘carbons’ vanished into history. In this article, I’ll look at how word processing came to be and end with some thoughts about where it may be going.

History


Word processing dates to Gutenberg and movable type. But for this article, I’ll start with the electronic version.

According to a Computer Nostalgia article1, the first units functionally recognisable as word processors appeared in 1964 with IBM’s introduction of the MT/ST (Magnetic Tape/Selectric Typewriter), which added magnetic tape storage to a standard IBM Selectric. Users could store, edit, re-use, and even share documents. But it was still a typewriter – no screen.

People also did word processing on mainframe computers with time-shared terminals. To get a sense of this, see the first page of ‘Word Processing on the Mainframe Computer’, written in 1984 by Sue Varnon2.

The first units with screens – recognisable as modern word-processors – debuted in the early 1970s from companies like Lexitron and Vydec. Wang Laboratories’ CRT-based word processing system, introduced in 1976, became the standard and made Wang the dominant player in the word processing market. These systems were crude compared to today’s. Most had no navigation keys and instead used the e/s/d/x keys on the keyboard. They had no function keys for attributes like boldfacing, which was done by pressing key combinations at the beginning and end of the text to be emboldened. There were no options for fonts or for the other things that we take for granted today.

WYSIWYG displays didn’t exist. Monitors showed text using the system’s default font, and formatting was done by inserting control characters. There’s debate as to when WYSIWYG appeared – some claim that the early Apple Macintosh, with its bitmapped display, made it possible. Others claim that true WYSIWYG – seeing on screen exactly what would print – wasn’t possible until laser printers became affordable enough, and small enough, to fit on a desk.

But they offered the kernel of what we expect in word processors today.

Furthermore, the term ‘word processor’ referred to dedicated machines rather than software running on general purpose PCs. The general-purpose PCs we use today were just emerging. But once they did, the dedicated machines were doomed. Wang went through internal turmoil due to changing markets, management, and strategy and filed for bankruptcy in 1992. (A fragment of the company survived until 2014.) Other companies like Lexitron, Lanier, and Vydec disappeared so thoroughly that Google searches return only fragmentary mentions.

To put this in perspective – and as an interesting piece of cultural sociology (note the reference to “a girl” in the following item) – consider this piece of history from the Computer Nostalgia1 article:

The New York Times, reporting on a 1971 business equipment trade show, said:

The ‘buzz word’ for this year's show was ‘word processing’, or the use of electronic equipment, such as typewriters; procedures and trained personnel to maximize office efficiency. At the IBM exhibition a girl typed on an electronic typewriter. The copy was received on a magnetic tape cassette which accepted corrections, deletions, and additions and then produced a perfect letter for the boss's signature....

These pioneers were replaced by software with almost legendary names – MacWrite, Lotus AmiPro and Manuscript, PC-Write, Electric Pencil, VolksWrite, MultiMate, PeachText, XyWrite, and three that will be more familiar – WordStar, WordPerfect, and Word.

WordStar was the leading application in the early 1980s when CP/M and MS-DOS were competitors. But changes in technology and interface, plus customer service issues, made it falter. WordPerfect took its place as the leading word-processor in the 1980s. But problems with a release for Microsoft Windows gave Microsoft an entrée into the market with Word. Between a smoother introduction and bundling deals that led to Microsoft Office, Word took the lead in the 1990s and has not looked back.

Results


What has this evolution wrought?

  • Word processing has changed how we write, for the worse according to some literary critics. See ‘Has Microsoft Word affected the way we work?’ by John Naughton in the January 14, 2012 issue of The Guardian3 and ‘How Technology Has Changed the Way Authors Write’ by Matthew Kirschenbaum in the July 26, 2016 issue of The New Republic4.

    Personally, I agree that it has changed how I work - for the better. Using a typewriter, changing the material was difficult, often involving White-Out or perhaps even pulling the page out and re-typing it entirely. This made it easy to lose my train of thought. With a word-processor, I can write material, modify it as I go, and easily revert to a previous version. And I can try different wordings to see which is clearer or gives a better readability score. So, overall, and especially after my general ledger user manual fiasco in 1979, I could never give up my word-processor.
  • WYSIWYG authoring is useful but there are periodic arguments about whether it leads authors to focus on formatting content rather than on writing it – appearance over substance. Here’s one example, ‘Word Processors: Stupid and Inefficient’ by Allin Cottrell5.

    Personally, I agree with some of his positions but I think word processing as it currently exists is too entrenched to change in the near future. Also, and interestingly, Cottrell’s position ties in well with the emerging need for content in HTML or XHTML that has no format of its own but that can use multiple stylesheets for single sourcing.

  • WYSIWYG authoring, plus the ability to insert and position graphics electronically, has sharply reduced the role of the graphic designer. That’s not to say that a graphic designer couldn’t do a better job, just that graphic designers are no longer needed.

  • Authoring support tools like spell-checkers and readability analysers in word-processors sharply reduced the role of editors. (When I was at Digital Equipment Corp in 1982, there were, as I recall, about 20 writers supported by a formal editorial group. Today, I’m surprised and pleased if one of my clients has even one editor on staff.)

  • Many managers wanted computers in their offices because computers were cool, but didn’t want to actually use them because typing was considered to be secretarial work. So, some unsung marketing genius coined the term ‘keyboarding’ instead.

  • Typing pools were almost entirely female because management viewed typing as a secretarial function. The advent of word processing caused debate about whether it would perpetuate the typing pool as a so-called ‘pink ghetto’ or open new avenues for advancement for women. My experience from Digital Equipment was the latter. One woman who started as a typist became one of the coordinators of the company’s export control compliance programme.

  • The culture of technical writing changed. In 1980, my department got two word-processors for the writers to share. Soon after, the manager told me that he had offered jobs to two writers, both of whom turned him down on the grounds that ‘technical writers don’t use computers’. 
  • In the same vein, one of the greatest presentations in the STC conference’s Beyond the Bleeding Edge stem, which I started in 1999 and managed until it ended in 2014, was a retrospective look at changes in writing culture by a speaker who showed a video of a presentation he gave in 1980 entitled ‘why technical writers should be allowed to use computers’. It’s one of the funniest but most meaningful presentations I’ve ever seen at a conference. (Why meaningful? Because it examined a huge technical and philosophical shift in technical communication. Why funny? Because, almost on cue, the older attendees looked at each other and said “I remember those days!” while the younger attendees looked at each other and said “No word processors? No way!”)
  • Users of word-processors, primarily Word these days, break all kinds of rules to make sure the document prints well. But these users rarely consider that their documents may have to be converted to HTML or XHTML for use online. So, breaking the rules, often by using local formatting rather than styles, seemed to have no downside but now causes frequent problems.

  • Related to the prior point, management tends to view word-processors as akin to typewriters and thus doesn’t train the users on how to use the tool effectively and correctly. The result is usually chaos.

The Future?


Will today’s word processing powerhouses eventually go extinct? Word processing is so embedded in business and technical communication that it’s hard to imagine, but many once-dominant tools and companies have vanished.

I can think of two things that might change the future of word processing:

  • It’s been said of Word that most people use 5% of its features. The problem is that each person uses a different 5%. So, an interface that users can easily customise, without a consultant to do so, would be a big help.

  • Eliminating typing. A speech-to-text interface, an Alexa of word-processors, may be possible in the future. But the system will have to be smart enough to recognise and remove all the throat clearing and every ‘like’ and ‘you know’. And, since each person’s voice is different, the system will need a lot of training. And AI might be needed to help the system understand when to emphasise a word without the authors having to tell it to do so and break their train of thought.


And the need for word processing as we know it might disappear. An article called ‘Getting The Next Word In’ by Ernie Smith6 from 2016 makes some interesting philosophical points. “The reasons we have traditionally used word processors has slowly been eroded away,” he explained. “LinkedIn is replacing the resume, GitHub is replacing documentation, and blogging (and respective tools) have chipped into journalism. Even documents that are meant to be printed are largely being standardised and automated. Most letters in your physical mailbox today are probably from some bank that generated and printed it without touching Word.”

Perhaps the best indicator of how thoroughly word processing has penetrated the world, especially that of technical communication, is the fact that it’s taken for granted except when we complain about some feature of Word. The wonder that it evoked in 1971 is long gone. And that’s a sign of success.

References


1.     Computer Nostalgia (no date) ‘Computer History. Tracing the History of the Computer – History of Word Processors’ www.computernostalgia.net/articles/HistoryofWordProcessors.htm (accessed July 2018)
2.     Varnon S (1984) ‘Word Processing on the Mainframe Computer’ The Journal of Data Education, Volume 24, 1984 – Issue 2 www.tandfonline.com/doi/abs/10.1080/00220310.1984.11646292 (accessed July 2018)
3.     Naughton J (2012) ‘Has Microsoft Word affected the way we work?’ The Guardian www.theguardian.com/technology/2012/jan/15/microsoft-word-processing-literature-naughton (accessed July 2018)
4.     Kirschenbaum M (2016) ‘How Technology Has Changed the Way Authors Write’ The New Republic https://newrepublic.com/article/135515/technology-changed-way-authors-write (accessed July 2018)
5.     Cottrell A (1999) ‘Word Processors: Stupid and Inefficient’  http://ricardo.ecn.wfu.edu/~cottrell/wp.html (accessed July 2018)
6.     Smith E (2016) ‘Getting The Next Word In’ Tedium. https://tedium.co/2016/10/04/word-processors-future (accessed July 2018)


Related reading


Ashworth M (2017) 'The death of sub-editing' Communicator, Spring 2017: 14-17

Dawson H (2017) 'Industrial revolution in Fleet Street' Communicator, Summer 2017: 26-29

Glossary


AI. AI (artificial intelligence) is the simulation of human intelligence processes by machines, especially computer systems. 
https://searchenterpriseai.techtarget.com/definition/AI-Artificial-Intelligence

Alexa. Alexa is a virtual digital assistant developed by Amazon for its Amazon Echo and Echo Dot line of computing devices.
https://www.webopedia.com/TERM/A/alexa.html

Carbon copy. A carbon copy (or carbons) was the under-copy of a document created when carbon paper was placed between the original and the under-copy during the production of a document. In email, the abbreviation CC indicates those who are to receive a copy of a message addressed primarily to another (CC is the abbreviation of carbon copy).
https://en.wikipedia.org/wiki/Carbon_copy

CP/M. CP/M, which originally stood for Control Program/Monitor and later Control Program for Microcomputers, is a mass-market operating system created for Intel 8080/85-based microcomputers by Gary Kildall of Digital Research, Inc.
https://en.wikipedia.org/wiki/CP/M

GitHub. GitHub is a web-based version-control and collaboration platform for software developers.
https://searchitoperations.techtarget.com/definition/GitHub
https://github.com

HTML. Hypertext Markup Language (HTML) is the standard markup language for creating web pages and web applications.
https://en.wikipedia.org/wiki/HTML

Keyboarding. Entering data by means of a keyboard.

MS-DOS. (Microsoft Disk Operating System). MS‑DOS was the main operating system for IBM PC compatible personal computers during the 1980s and the early 1990s.
https://en.wikipedia.org/wiki/MS-DOS

Paste-up. A document prepared for copying or printing by combining and pasting various sections on a backing.
https://en.wikipedia.org/wiki/Paste_up

Pink ghetto. ‘Pink ghetto’ is a term used to refer to jobs dominated by women. The term was coined in 1983 to describe the limits women have in furthering their careers, since the jobs are often dead-end, stressful and underpaid.
https://en.wikipedia.org/wiki/Pink-collar_worker#Pink_ghetto

White-Out. White-out is a correction fluid. It is an opaque, usually white, fluid applied to paper to mask errors in text. Once dried, it can be written over. It is typically packaged in small bottles, and the lid has an attached brush (or a triangular piece of foam) which dips into the bottle. The brush is used to apply the fluid onto the paper. In the UK, ‘Tipp-Ex’ is used more commonly.
https://en.wikipedia.org/wiki/Correction_fluid

WYSIWYG. WYSIWYG is an acronym for ‘what you see is what you get’.
https://en.wikipedia.org/wiki/WYSIWYG

XHTML. Extensible Hypertext Markup Language (XHTML) is part of the family of XML markup languages. 

Thursday, September 6, 2018

Is Single-Sourcing Dead?

Matthew Dorma of Calgary, AB pointed me to a post by Mark Baker (https://everypageispageone.com/) entitled “Time to move to multi-sourcing” (https://everypageispageone.com/2018/04/06/time-to-move-to-multi-sourcing/). I discussed that post with several people and have thought about the implicit question it poses – is single-sourcing dead – for a while. This post is the result.


NOTE: The first part of the post – Is Single-Sourcing Dead? – discusses problems with single-sourcing in general. The second part – Alternatives to Traditional Single-Sourcing? – briefly addresses the specific points that Baker raises in his original post.

Is single-sourcing dead?

In my opinion, no. The single-sourcing concept – write once, re-use many times in many ways and many places – has some problems. But the basic concept is so useful that I see nothing that can replace it yet.

What are those problems? Can they be fixed, and how?

• Inappropriate tools – Today, single-sourcing is based on using tools like MadCap Flare (note that I do a lot of training and consulting for MadCap) or standards like DITA. However, companies aren’t going to buy those tools for every employee who has to write something; the cost is too high. Instead, most of those employees will use Microsoft Word because companies see Word as being free. And employees can use Word to create print and PDF output – technically single-sourcing, but lacking the flexibility and output options of full-power single-sourcing.

The result? Trying to do single-sourcing using the wrong tools.

The solution? Simple in theory. Identify which employees create material that has to be single-sourced and buy them the appropriate tools.

·       Inappropriate training – In the early days of word processing, specially trained operators did the work. Today, employees get a copy of Word and are left to figure it out on their own, with no training. The results are often inconsistent, with ugly underlying code, but no one cares as long as the document looks good when printed. But if that document has to be imported into a single-sourcing tool like Flare, that ugly code often causes problems that few authors know how to avoid or fix. Authors need at least some training in how to use Word, but few companies offer it.

The same is true in single-sourcing. Authors may have the right tool but are often not trained on how to use it, or on the concepts of single-sourcing. I often meet Flare authors who were given the tool and told to figure it out on their own. Sometimes the results are surprisingly impressive but often the authors are just terribly frustrated.

The result? The best tools are often worthless if authors don’t know how to use them.

The solution? Obvious. Train the authors on the tools. And for subject matter experts upstream who use Word and similar tools to provide content to the single-sourcing authors, provide at least minimal training and support in how to use those tools. How minimal? Two examples…

o   A client in Austin, TX whose authors used Word asked me what those “styles” were. I explained what they were and how to create and use them. The client was ecstatic at the amount of work they could save. From a five-minute discussion…

o   A client in Connecticut was having trouble getting their authors to create consistently structured material. They had defined a structural standard but the authors deviated from it constantly. I explained how to create topic templates that could be added to their authoring tool’s interface. The client’s employees spent about an hour at a whiteboard laying out a template, which I then turned into an electronic template and added to the tool interface in about five minutes.
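To make the template idea concrete, here is a minimal sketch of what such a topic template might look like in a Flare-style XHTML tool. The title, class names, and bracketed prompt text are all invented for illustration:

```xml
<?xml version="1.0" encoding="utf-8"?>
<html xmlns:MadCap="http://www.madcapsoftware.com/Schemas/MadCap.xsd">
  <head>
    <title>Procedure Topic Template</title>
  </head>
  <body>
    <h1>[Name of the procedure]</h1>
    <p class="Intro">[One or two sentences on when to perform this procedure]</p>
    <h2>Before You Begin</h2>
    <p class="Prerequisites">[List any prerequisite tools or permissions here]</p>
    <h2>Steps</h2>
    <ol>
      <li>[First step]</li>
      <li>[Second step]</li>
    </ol>
  </body>
</html>
```

Authors open the template, replace the bracketed prompts with real content, and the structure stays consistent without anyone having to remember the standard.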

·       Inappropriate standards – People often have no standards to follow when it comes to using their tools – no templates for different types of material, or style-usage standards, for example.

The result? People do whatever provides the result they want, even if that causes trouble down the road when it’s time to import the material into a single-sourcing tool or output to a new format.

The solution? Surprisingly simple. Identify authors’ pain points and create standards for them. Better still, embed the standards into the authoring tools as much as possible to make their use automatic. For example, create topic-type templates with embedded prompts – “type the list of tools here” – to guide authors as they write. Or create a stylesheet with clear style names and make it the project’s master stylesheet so that it will be applied automatically to every topic.
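For example, a fragment of a master stylesheet with clear, content-based style names might look like this (the names and formatting are invented for illustration):

```css
/* Styles named for what the content IS, not how it looks,
   so authors can pick the right one without guessing */
p.Note {
  border-left: 3px solid #336699;
  padding-left: 10px;
}
p.StepResult {
  font-style: italic;
}
span.UIElement {
  font-weight: bold;
}
```

Because the stylesheet is the project’s master stylesheet, every topic gets these styles automatically; the standard is embedded in the tool rather than in a document nobody reads.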

Adding standards is surprisingly straightforward. What’s harder is getting authors to use them. That will take training and time and perhaps some management muscle to insist that using the standards is a requirement, but that’s not a new task.

·       Increasing complexity – Single-sourcing requires many tasks beyond just writing the content. Authors have to decide which output is primary in order to decide which features to use, because some features won’t work well, or at all, in other outputs. That means understanding those features. Authors also have to create and assign conditions to control which content appears in which output, define re-usable chunks of content, create stylesheets that behave differently depending on the output, and perhaps define microcontent. And more. All of this must be documented somewhere for reference by the current authors and later ones.

The result? The increasing power of our tools and increasing customer demands are leading to increasingly complex projects that can easily go out of control.

The solution? Again, simple. Document your project. (See my book “Writing Effective Online Content Project Specifications”, available on Amazon, for my suggestions on how to document your projects and what can happen if you don’t.)
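To illustrate the conditions point from “Increasing complexity” above: in a Flare-style XHTML topic, output-specific content is typically marked with condition attributes, roughly like this (the condition names here are invented, and this is a fragment of a topic whose root element declares the MadCap namespace):

```xml
<p>Save your work before continuing.</p>
<p MadCap:conditions="Default.PrintOnly">See Appendix B for the full
  keyboard-shortcut reference.</p>
<p MadCap:conditions="Default.OnlineOnly">Click
  <a href="shortcuts.htm">here</a> for the full keyboard-shortcut
  reference.</p>
```

At build time, each target includes or excludes content based on the conditions it is set to use – exactly the kind of decision that needs to be written down so later authors understand why the markup is there.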

·        Lack of motivation on authors’ parts – Single-sourcing isn’t on most authors’ radar so they have no reason to move from the tools and workflows they know to something new to support some vague goal of single-sourcing.

The result? Authors type their content and make sure it prints well and that’s that.

The solution? Several parts. First, make single-sourcing a job requirement. Second, and crucially, explain why single-sourcing is important to the company and show how it can solve authors’ problems. Without that, authors will do the bare minimum needed to meet the single-sourcing requirement and even skimp on that unless there’s management oversight.

Alternatives to Traditional Single-Sourcing?

What about the “shared pipes” (from Sarah O’Keefe) and “multi-source” (from Alan Porter) models that Baker describes? Each seems to fix some problems of single-sourcing. However, each has to add a complex black box at the center of the process, where the conversion and coding are done. In my view, the more that conversion and coding can be pushed back upstream to the individual authors by giving them templates, stylesheets, and other tools, leaving the black-box central processor to the tool vendor, the easier life will be. There is then no need for a dedicated IT person managing and maintaining a proprietary system that, in my experience, languishes after its initial champions have moved on.

What about the “subject-domain” model that Baker describes? In my view, this model can be handled by creating information-type templates for authors to use. We generally think of templates as specific to types of information/topics, but there’s no reason why templates can’t be applied to specific domains of information as well.

Summary

Single-sourcing isn’t perfect. No authoring model is. But it’s worked well for years and its problems seem to have straightforward solutions. Try those before throwing the single-sourcing baby out with the bath water.