
Adobe Day Presentations: Part V – Mark Lewis and DITA Metrics

After Val Swisher spoke about being global-ready in today’s tech comm market, the final speaker of the morning, Mark Lewis, took the stage to speak about DITA metrics.

Mark literally wrote the book on how DITA metrics are done, titled DITA Metrics 101. He explained that ROI (return on investment) and business value are being talked about a lot right now in the business and tech comm worlds, so it’s worth having a basic understanding of how DITA metrics work.

Now, I have to admit, I know NOTHING about how any tech comm metrics are done, let alone how DITA metrics are done, so I listened and interpreted the information as best I could. (Mark, if you are reading this, please feel free to correct any information below in the comments!)

Mark began by explaining that content strategy applies to the entire ENTERPRISE of a business, not just technical publications. There are many ways to measure and track content, including through XML. Traditional metrics measured the cost per page, and the type of topic was gauged by guideline hours; for example, a step-by-step procedure would be estimated at four to five hours to write. Traditional metrics looked at the cost of a project by measuring an author’s or a team’s output in pages or publications per month. This approach doesn’t measure the quality of the documents; it is concerned with quantity instead of quality.
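To make that concrete, here is a minimal sketch of the traditional page-based calculation. The numbers are my own made-up illustration, not figures from Mark’s talk:

```python
# Toy illustration of traditional page-based metrics (hypothetical numbers).
HOURS_PER_PROCEDURE = 4.5  # guideline estimate: 4-5 hours per step-by-step procedure
HOURLY_RATE = 60.0         # assumed fully loaded cost per author-hour

def traditional_cost(num_procedures: int) -> float:
    """Project cost measured purely by output volume, ignoring quality and reuse."""
    return num_procedures * HOURS_PER_PROCEDURE * HOURLY_RATE

# A manual containing 40 step-by-step procedures:
print(traditional_cost(40))  # 40 * 4.5 * 60 = 10800.0
```

Notice that nothing in this formula rewards reusing or improving content; writing the same procedure twice simply counts as twice the output.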

Mark referenced several studies on which he based the information in his book, especially a paper from the Center for Information-Development Management titled “Developing Metrics to Justify Resources,” which helped explain how XML-based metrics are more comprehensive. (Thanks, Scott Abel, for retweeting the link to the study!)

XML-based metrics, Mark pointed out, use just enough of the DITA information, concerning themselves with the task, concept, and reference topics within documentation. XML-based metrics can now track the cost of a DITA task topic, showing the relationship between occurrences, cost per element, and total number of hours. The cost of a DITA task topic is lower because referenced topics can be reused, by as much as 50%! For comparison, Mark said you can look at measuring an author by the number of pages produced versus the amount of reusable content in a referenced component. The shift is now toward the percentage of reused content rather than how many pages are produced. Good reuse of content saves money, and ROI goes up as a result!
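As a back-of-the-envelope sketch of that reuse argument (again, my own hypothetical numbers, not Mark’s), here is how a 50% reuse rate might change the cost of a project:

```python
# Toy illustration of reuse-adjusted topic costing (hypothetical numbers).
COST_PER_NEW_TOPIC = 270.0  # e.g., 4.5 hours * $60/hour to write a new task topic
REUSE_COST_FACTOR = 0.10    # assume reusing an existing topic costs ~10% of writing one

def project_cost(total_topics: int, reuse_rate: float) -> float:
    """Cost of a project in which some fraction of topics are reused, not rewritten."""
    new_topics = total_topics * (1 - reuse_rate)
    reused_topics = total_topics * reuse_rate
    return (new_topics * COST_PER_NEW_TOPIC
            + reused_topics * COST_PER_NEW_TOPIC * REUSE_COST_FACTOR)

no_reuse = project_cost(100, 0.0)     # 27000.0
half_reused = project_cost(100, 0.5)  # 13500 + 1350 = 14850.0
print(f"Savings at 50% reuse: ${no_reuse - half_reused:,.0f}")  # $12,150
```

The exact factors will vary by team, but the shape of the math is the point: every reused topic replaces hours of new writing with a much smaller review-and-reference cost.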

Mark introduced another metrics-based measurement, namely the perceived value of documentation as a percentage of the price of a product or of R&D (research and development), as well as the number of page views per visit. Mark warned the audience to be careful of “metrics in isolation,” as they can represent an opportunity loss, such as a missed marketing window. He clarified that page hits are hard to interpret, because a hit could mean either that the reader found what they wanted or that they didn’t want that information; we have no way of knowing for sure. And if technical communicators are not reusing content, projects can actually take longer, producing more cost.

Mark emphasized that through metrics, we can see that reuse of content saves money and time. Productivity measures include looking at future needs, comparing against industry standards, seeing how reuse affects costs, and so on. He suggested looking at the Content Development Life Cycle of a project, and using metrics to determine what reused versus new topics cost in that process. By doing this, the value of technical communication becomes much clearer, and the discipline proves its worth to a company or client.

I have to admit, as I said before, I don’t know or understand a lot about the analytical part of technical communication, but what Mark talked about made sense to me. I always thought that measuring an author’s value by page output rather than by the quality of the writing didn’t make sense. Part of that is because, as a newer technical communicator, I might take a little longer to provide the same quality output as someone more experienced, but that doesn’t mean the quality is any less. So measuring pages per hour never made sense to me. However, if consistency in reusing content is measured across all documentation, then quality is, in a sense, being analyzed, because we can measure how often information is referenced or reused beyond its original context. Using DITA makes a lot of sense in that respect.

More information about DITA metrics can be found on Mark’s website, DITA Metrics 101.

I hope you’ve enjoyed this series on all the Adobe Day presenters. They all contributed a lot of food for thought, and provided great information about how we as technical communicators should start framing our thought processes to produce better quality content and provide value for the work that we do. I gained so much knowledge just in those few hours, and I’m glad that I could share it with you here on TechCommGeekMom.


Adobe Day Presentations: Part I – Scott Abel and Structured Content

Scott Abel
The Content Wrangler

As I mentioned in my first post about Adobe Day, there were several well-known tech comm authorities presenting, and Scott Abel was the first. Scott is the founder and CEO of The Content Wrangler, Inc., and he has a great Twitter feed and blog, if you haven’t read them. Scott’s presentation was called “More than Ever, Why We Need to Create Structured Content.” If you’ve never read Scott’s writing or seen him speak, he is a force to be reckoned with; he definitely has strong opinions drawn from his experience, and he’s not afraid of letting you know what he thinks.

Scott explained that structure is in everything we do, including in nature, so it makes sense that content needs to be structured as well. Structure formalizes a content model and provides authoring guidance. It can enhance the usability of content, provides visual cues, and is the foundation for automatic delivery of content through syndication. Structure makes it possible to efficiently publish to multiple channels, outputs, and devices from a single source, making it a critical component of transactional content and making business process automation possible. By providing structure, it is possible to adapt content and leverage responsive design techniques. Structure also allows us to leverage the power of content management systems to deliver content dynamically, increasingly in real time. By creating structured content, we can move past “persona-ized content” and facilitate innovative reuse of content in known sets of related information, as well as for needs we haven’t yet anticipated.
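As a toy sketch of what single-sourcing from structured content enables (my own simplified model, not an example from Scott’s talk), the same structured topic can feed multiple output channels without any copying and pasting:

```python
# Toy single-sourcing example: one structured topic, multiple outputs (hypothetical model).
topic = {
    "title": "Replace the filter",
    "steps": ["Power off the unit.", "Open the access panel.", "Insert the new filter."],
}

def to_html(t: dict) -> str:
    """Render the topic for a web channel."""
    items = "".join(f"<li>{step}</li>" for step in t["steps"])
    return f"<h1>{t['title']}</h1><ol>{items}</ol>"

def to_plain_text(t: dict) -> str:
    """Render the same topic for a text-only channel."""
    lines = [t["title"]] + [f"{i}. {step}" for i, step in enumerate(t["steps"], 1)]
    return "\n".join(lines)

# Both outputs come from one source, so they can never drift apart.
print(to_html(topic))
print(to_plain_text(topic))
```

Unstructured content, by contrast, bakes one presentation into the source, so every new channel means another manual copy to maintain.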

Scott quoted author and technologist Guy Kawasaki, saying that innovators must allow time for the majority to catch up; new ideas take time to filter through. He followed up with the question, “How much time does it take to adopt?” He answered his own question by explaining that technological innovation is getting faster and the adoption cycle is getting shorter, which is good news. But how long has it taken to adopt structured content? He explained that it’s actually not a new idea; it dates back to 1963, with the Sequential Thematic Organization of Publications, created in the airline industry to standardize airline manuals.

With that in mind, Scott asked where we are today with structured content. He concluded that right now, one of the major challenges is that old ideas are getting in the way, because many technical communicators are still stuck on design concepts created in the print paradigm. Other major challenges include a lack of knowledge and experience, writers making manual updates, a lack of human resource support, and making tools work with configurable content. He did point out that lots of content reuse is going on! His point was that the tools work; it’s the people and processes that are the problem, as Sarah O’Keefe paraphrased him on Twitter.

So when asked why we should be technical communicators, Scott’s response was that as technical communicators, we know how to create structured content. Knowing how to create structured content increases our value and makes us marketable. “Our profession needs to change whether we like it or not,” he concluded, to keep up with these technological changes.

Much of what Scott talked about in his presentation was echoed again in the panel discussion later in the morning. Scott provided a few examples during the presentation showing what unstructured versus structured content looks like, and the differences were very clear. As technical writers, as Scott said, we understand the importance of structured content and how reused content can be used effectively. Those who don’t have that mindset tend to repeat the same processes and make more work for themselves, wasting time and money for a company. We have value, and we need to promote our skills in creating this kind of content organization. I think technical communicators take this ability for granted; by being proactive in showing how we can help create efficient, structured content, we can add value not only to ourselves, but also provide, on a larger scale, a true cost-saving service to our respective companies and clients.

(Scott–if you are reading this, please feel free to clarify anything that I’ve written if I didn’t interpret it quite correctly in the comments.)

Scott later offered his slideshow online, which is available with his permission.

Scott’s talk was a great way to start the morning, and it led smoothly into the next presentation by Sarah O’Keefe, titled “Developing a Technical Communication Content Strategy.”

Next: Adobe Day Presentations: Part II – Sarah O’Keefe and Content Strategy