
Content Strategy practices are not hard!

While I haven't been an official content strategist/publisher for that long, I have been web publishing for a long time now. Over the years, I've learned the difference between good practices and bad practices, both from experience and through classes and webinars I've taken. I'd like to think that from all of this I've learned to be a pretty good content strategist and web publisher. Even so, I still don't understand why people find content strategy difficult to understand, and why creating a high standard of quality in content strategy and publishing user-friendly content is so difficult. It frustrates me so much that it makes me want to pull out my hair!

A recent occurrence of this lack of comprehension set off my frustration again. I've experienced this in many places I've worked, but this was just the latest instance to spark my ire. Among the several projects I'm working on at work, one is managed by another web publisher. For this project, we've been assigned to revamp an existing internal website. Par for the course–this is what we do. The project manager was given an outline by the internal client, along with the main content, which included documents to be linked within the pages. That sounds fair enough. Of course, as most technical communicators know, content written or planned by non-technical communicators usually needs some help to make it more user-friendly. In this case, much of the formatting of the content was…less than desirable. In addition to making the outward-facing part of the microsite user-friendly, we also had to make the back end–the organization in the content management system–user-friendly, since the client would be maintaining the site after we were done setting it up. This all sounds like a reasonable task, and a technical communicator would be just the person for it.

However, I found myself frustrated with the process, or rather, with the quality of what was starting to go up. The project manager gave me sections of the website to work on and format. I found it difficult to decipher the client's outline because it was poorly written. Never mind the actual text itself, which wasn't always well written either; I couldn't really touch that. The outline was meant to help the web publishers–the project manager and me–understand how the client wanted the site organized. At a high level, the main outline seemed fine, but in the finer details it fell apart for many sections. I often had to consult the project manager for clarification, as I wasn't supposed to be talking to the client directly, for some reason. Whatever.

The other problem was that nothing was labelled in a way that made sense or was user-friendly on either the front or the back end. I can understand that people have file naming conventions that make sense to themselves. But creating a file name for a document or form that others will use, and not giving the document a title? I don't get that. For example, if the document is a quick reference guide on how to use your Lotus Notes account, then the text on the web page should be something like,

Quick Reference Guide for Lotus Notes

and the file on the back end should be called something like "QuickRefGuide_LotusNotes.pdf," so that users understand what they are downloading. The file shouldn't be called something like "QFC-LN_ver1_01.02.14.pdf". Down the road, someone will look at that downloaded file and wonder what it is. Wouldn't it be easier to title the file appropriately rather than have to open it to find out? I'm sure some would argue something about versioning here, but in our CMS, there seems to be a bad practice of putting up many versions of the same document under different names rather than using the versioning function of the CMS. I use the versioning function extensively on the other sites I work on, so it confuses me that others think it's okay to clutter up the system with many versions of the same file under different file names.

To add to the grief, the client sent everything in zip files, which yielded disorganized folders and files as well. In this instance, the project manager kept the folder structure the client had given us, even when it didn't make sense. When I questioned the project manager, the response was, "The client had them organized that way, so we'll leave it because they'll be maintaining it later." NOO!!! The organization didn't make sense, it didn't follow the client's own outline, and it complicated the back end. I am confident that the client just slapped some folders and files into a zip file and sent it along for us to decipher. I spent the past year cleaning out another department's very large microsite doing just this–giving files more appropriate names and creating a folder system that follows what's on the front end and makes sense to ANYBODY going into the site to find a page or document. And now, when changes need to be made, it's easy to find the appropriate documentation.
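To make that concrete, here's a rough sketch of what I mean by a back end that mirrors the front end. The folder and file names below are hypothetical, not from the actual project; the point is simply that the CMS folders follow the site's own outline, so anyone can find a document by following the same path a visitor would:

    Employee Resources/          (matches the site section)
        Getting Started/
            QuickRefGuide_LotusNotes.pdf
            NewHireChecklist.pdf
        Forms/
            TravelExpenseForm.pdf
        Policies/
            RemoteWorkPolicy.pdf

Someone maintaining the site a year from now should be able to guess where a file lives without opening anything.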

As I worked on the pages I was assigned for this new microsite, it became clear to me that the project manager didn't care. Granted, it's a big project, and we want to get it done quickly. It would be easier to merely cut and paste content into the site and be done, but it's also our responsibility as content strategists and technical communicators to make things easier, more streamlined, and more user-friendly on both the front end and the back end. The mantra for all technical communication is always user advocacy–for all aspects of the project, whether digital or print.

This means that there needs to be attention to detail, so the "copy and paste" method of entering content into a CMS alone is not enough. At one job I was known as the "Table Queen" because the CMS didn't handle tables copied and pasted from Word well, so I usually had to go into the HTML code and fix everything so it displayed correctly–or, if I could, make it display even better. Tables are simple to figure out in HTML, but even so, it was something that other people at that particular job with the title of "web publisher" did not know. (They didn't know HTML at all, so why were they called "web" publishers?) It was important to make the pages look consistent and be organized in a way that would allow users to find information quickly and easily.
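For anyone who hasn't had to do this kind of cleanup, here's a minimal sketch of the sort of markup I'd aim for after stripping out the inline styles and stray tags that pasting from Word tends to leave behind (the table content here is just a made-up example):

    <table>
      <thead>
        <tr>
          <th>Document</th>
          <th>Last updated</th>
        </tr>
      </thead>
      <tbody>
        <tr>
          <td>Quick Reference Guide for Lotus Notes</td>
          <td>January 2014</td>
        </tr>
      </tbody>
    </table>

With clean structure like this, the site's stylesheet can handle the formatting consistently instead of fighting with whatever Word embedded.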

In this project, I've found that the project manager isn't taking the lead in setting the standard for the website. I've been disappointed that this person isn't holding the site to the standards I would expect. It frustrates me, but like I said, it's not the first time I've encountered this reluctance to make a website work.

Do understand that I’m not a perfectionist. I let things slide to a certain point, too, and post things that are “good enough”.  But in the end, it comes down to the foundation of the website. If the foundation and the building blocks aren’t sound, it’s not going to hold up. In content strategy, if the infrastructure of the site isn’t sound, and the content isn’t well defined, then the website will reflect that disorganization.

Content strategy, at its core, is really easy. It's all about organizing information so that it can be easily searched and retrieved. It's about labelling files and folders so that they make sense. Val Swisher's analogy of content strategy being like one's closet still stands at the heart of it. If you can organize your closet and identify the different clothing pieces in order to categorize them, then you understand how to do content strategy. The only difference is that instead of shirts, skirts, pants, and shoes, you have folders of documents, webpages, and multimedia. The method of making sure that users can find those documents, webpages, and multimedia should be streamlined, clear, concise, and user-friendly. As content strategists and user advocates, it's all about making sure that what the audience views looks and reads well, and that content managers can maintain it easily.

Ultimately, when creating a content strategy and setting it up for maintenance, do it correctly now, even if it's time-consuming. If for no other reason, it'll save time and headaches later. It's not difficult. It's just common sense.


Maybe I’m Amazed I met this Tech Comm legend…

“Excuse me, Dr. Corfield, I’m tweeting this event for Adobe today. Would you happen to have a Twitter handle?”

With the apology that he hadn't one, but that he did have a Facebook page, I started a too-short yet lovely pre-event chat with Dr. Charles Corfield, the keynote speaker for the 2013 STC Summit's Adobe Day. In my mind, being the inventor of Adobe Framemaker would easily qualify this tech comm pioneer for the Tech Comm Hall of Fame (if there were such a thing). For me, talking to Dr. Corfield was like talking to the Paul McCartney of tech comm (and that's super high praise coming from a Macca fan like me!). Just as McCartney is unequivocally deemed one of the early pioneers who revolutionized how we listen to rock music today, Corfield helped revolutionize tech comm with his creation of Framemaker, and in the process created what we know as a software standard for technical communication that still holds up today. I loved listening to Dr. Corfield's soft-spoken British accent as he chatted with me briefly about social media and about some of the things he was going to talk about in his presentation. I was truly having a fangirl moment, and I hope I kept my cool during the conversation. Awesome!

Dr. Charles Corfield
The “Father” of Framemaker

Dr. Corfield started his talk by presenting a history of how Framemaker came about. He explained that before Framemaker, computing was still fairly archaic, but workstation computers were starting to become more powerful. As a graduate student at Columbia, he was looking to create software that could take things a step beyond word processing, namely software that could also handle unified pagination and page layouts. Framemaker allowed page layouts and paginated text to work together in a symmetrical flow. The software targeted long documents and other paper output produced by people.

Dr. Corfield pointed out that the first content management problems started to occur as a result, and those issues included the need for internal references such as footnotes, indexes, cross-references, and markers. Framemaker's ability to create indices and update long documentation was–and still is–more powerful than Microsoft Word's even today. He also added the ability to refer to external sources, such as external references and hypertext.

Framemaker created the ability to manage variants of a single document, leading to what we now think of as single-source publishing. Variants would be objects such as variables, conditional text, frozen pagination, and change pages. This yielded a new dilemma. As Corfield posed it, do you send out the fully changed documentation or only the pages that changed, especially with very large documents? With big documents, people would ask, "Well, what changed?" Corfield pointed out that the Boeing 777 project in the 1990s needed IMMENSE documentation, so it needed retrievable databases. The Boeing 777 solution was to use SGML (the predecessor of HTML and XML), which made it the first "web" delivery of documentation. The project used Framemaker with SGML, along with HTML, XML, DITA, and "structure." Framemaker provided server-based generation of documentation.

Shifting his talk a bit, Dr. Corfield started to talk about Framemaker's impact today. He pointed out that the original retina display was actually paper! Sophisticated layouts had to be used to maximize the user experience. The computer came along later to expand on that concept. Displays started out at 72 dpi (dots per inch), which led to crude layouts. Now, retina displays are available at 300 dpi, but we need to re-learn what we did on paper while also including dynamic content from high-resolution video and images. Corfield pointed out that there has been a proliferation of platforms. We have desktops, laptops, smartphones, and tablets running different platforms, such as Unix, DOS, and MacOS, that need different outputs. Technical writing, therefore, is directly impacted by all the different displays and platforms in relation to document authoring. It is a requirement to produce structure and rich layouts for the output. Documentation needs to be able to support dynamic content (video, animation, etc.), and it needs to manage content for consumption on multiple platforms. The good news is that Framemaker can do all that! While there are other tools out there that can also deliver different kinds of output, many still struggle to manage and deliver on these needs the way Framemaker can now. Dr. Corfield is no longer part of Adobe, nor is he part of today's Framemaker product, but he seems happy with where the product has gone since he left it in Adobe's hands.

(I should note that while this was a talk sponsored by Adobe, it really wasn't intended to be a big infomercial for Framemaker, but rather something that puts the concept of tech comm software into perspective–and it happens to be the sponsor's product.)

So, where does this tech comm legend think technology is going next? Corfield thinks that going forward, voice is going to have the biggest impact. He felt that screen real estate is full, and that much of the visual interface is about adding a new widget, then removing a widget. Voice, he continued, eliminates the need to remember keyboard shortcuts–how many keyboard shortcuts does the average user know? Touch screens are a slow way to perform data entry. The impact of voice will be the ability to use visual tips and have voice act as a virtual keyboard. Voice will also impact product documentation, which will require understanding how existing workflows can be modified. Corfield's prediction is that Framemaker, along with other software on the market, will "assimilate" voice, just like everything else.

Since leaving Framemaker, Corfield has been working with a product called SayIt, which uses voice as part of workflow optimization, and he emphasized that voice truly is the next big thing (you heard it here, folks!). When asked about the use of voice technology in practical office settings, Corfield responded that push-to-talk technology helps prevent cross-talk in an office environment. He also pointed out that voice avoids the ergonomic issues, like carpal tunnel syndrome, that come with using a mouse and keyboard. If anything, voice will be more helpful!

On that note, the presentation was over. The long and winding road had ended, but it led to new doors to be opened. 😉

I really enjoyed listening to the history and the thought process behind Framemaker that Dr. Corfield presented. Everything he mentioned made total sense, and it’s to his credit that he had the foresight to think about the next steps in word processing to create a useful tool like Framemaker to help technical writers meet the needs of documentation in the digital age.

There is a certain aura around creative, imaginative, and smart people who make huge differences in our lives, whether it's in music like McCartney or in tech comm software like Corfield. You can't help but be awed in their presence, yet you also understand that they are generally humble people. When you have a chance to meet an individual like that, you want to capture the moment–like having a picture of yourself with that person to prove it happened. To be honest, I was much too shy to ask Dr. Corfield for a photo with me. I felt awkward asking, so I didn't. Heck, I felt awkward asking about his potential Twitter name! Even so, I'm glad I had the opportunity to meet him and hear him speak. He's got my vote as a candidate for the Tech Comm Hall of Fame someday.

(And, Dr. Corfield, if you do ever read this, please feel free to correct anything written here or add any clarification or other commentary below!)


Adobe Day at LavaCon 2012 Roundup!

This post is just a quick summary of the Adobe Day at LavaCon 2012 series from this past week. As you can see, there was so much information that it took six posts to summarize the event!

Being in Portland, Oregon was great. It was my first trip there, and as a native Easterner, my thoughts turned to the pioneer spirit of moving westward in this country. Once there, I saw a hip, young, modern city that keeps looking toward the future. The information I gathered at Adobe Day was endorsement-free, practical information that I can use going forward as a technical communicator. By sharing it, I hope that others in the field will take on that same pioneering spirit to advance what technical communication is all about and bring the field to the next level.

To round up the series, please go to these posts to get the full story of this great event. I hope to go to more events like this in the future!

As I said, I really enjoyed the event and learned so much, not only from listening to all the speakers, but also from talking "shop" with so many renowned enthusiasts and specialists in the technical communications field. I rarely get to do that at home (although it does help to have an e-learning developer in the house who understands me), so this was a chance for me to learn from those who have been doing this for a while and who have not only seen the changes, but are part of the movement to make changes going forward.

I hope you’ve enjoyed this series of blog posts. I still have many more to come–at least one more that is inspired by my trip out to Portland, and I look forward to bringing more curated content and commentary to you!

The autograph from my copy of Sarah O'Keefe's book, Content Strategy 101. Awesome!

Adobe Day Presentations: Part II – Sarah O’Keefe and Content Strategy

Sarah O'Keefe of Scriptorium Publishing

After an energetic first presentation by Scott Abel, second presenter Sarah O’Keefe, author of Content Strategy 101 and founder of Scriptorium Publishing, talked about “Developing a Technical Communication Content Strategy.”

Sarah started by telling us that many companies don't understand the value of technical communication, so technical communicators need to justify their approach. When writing up business cases for these justifications, technical communicators need to include the current situation, recommendations to improve it, the costs associated with those recommendations, and the benefits and risks of taking the recommended actions. If there are regulatory and legal requirements, then there is also a need to build a case for more efficient compliance in order to avoid legal complications.

Sarah expounded on how technical communication departments should talk to management about how technical communication can control costs. She explained that there is a myth that documentation can be done cheaply. She busted that myth by explaining that cheap documentation is actually more expensive: it can be limited in availability, making it useless; it can be hard to understand and out of date; and it may not be translatable into other languages. The costs of bad content include high customer service volume, lost sales, content duplication, huge global costs, and contradictions with marketing communications.

The solution, she said, is efficient development involving the reuse of content–single sourcing and cross-departmental reuse–and only tweaking text that is already available. She stressed that formatting and production are important! Using templates and various structures is helpful, and she encouraged using tools for creating the needed output. Sarah also said that localization is important, and that translations are a needed component of technical communication documentation. All of these can bring costs down significantly! Sarah gave an example of a common obstacle to efficient customer service or tech phone support: a monster-sized PDF that support representatives need to read while providing service on the phone. Having to read a long document while online with a customer is time-consuming and not cost-efficient.

Sarah encouraged technical communicators to collaborate and create better working relationships with other business departments such as tech support, training, and marketing, as technical content can supply those departments with pertinent information and help them streamline it. Technical communication can be used to support sales–read the documentation before you buy! Technical communication content can also help increase visibility by creating searchable, findable, and discoverable documentation, especially for Google search and SEO purposes. Sarah recommended building user communities around technical communication documentation, and making sure that technical communication aligns with business needs.

Sarah goes into greater detail both in her book and on the book's website, which is found at http://www.contentstrategy101.com.

Sarah's presentation was really good, in my opinion, because much of what she explained rang true to my own experience, and as she said, the biggest battle is making management understand the value of a solid content strategy. One of my biggest issues at my last consulting job was exactly the scenario Sarah described: marketing was not taking proper advantage of the technical communication documentation available, nor was it sharing resources and creating reusable content. As a result, in-house documentation was long and overly customized even though much of the information was the same or very similar (needing only a few tweaks), and the sales advisors who needed the information rarely looked at it because it was too long. When I made recommendations about reuse and editing from a technical communication standpoint, I was ignored. Of course, I was only a consultant, and I wasn't privy to the departmental costs, but it did not feel good to know that some of the issues could have been fixed with the kind of collaboration Sarah described. In this respect, I could relate to what she was saying.

As an aside, Sarah is a self-confessed chocoholic, and a fun part of her talk was that she incorporated chocolate production into her presentation. To verify her chocoholic status: I was out with Sarah after the event and caught her in the act of buying more chocolate at one of Portland's chocolate boutiques:

Sarah O’Keefe buying more chocolate for inspiration!

I do think Sarah's message is very clear. Technical communication has a lot of value, especially with structured and reusable content, and as technical communicators, we need to push that agenda to management so that we can provide a bigger service to our clients and companies than they currently realize.

(Sarah–feel free to correct any of my interpretations in the comments below!)

Next post: Adobe Day Presentations: Part III – Joe Welinske and Multi-screen Help Authoring