Posted in Uncategorized

What Does Knitting Have To Do With TechComm and m-Learning?

I’m so glad that this is now my 200th post on this blog! TechCommGeekMom has come a long way since it started out as a class project in grad school, now hasn’t it? For this particular post, I’d like to share my thoughts on something that I’ve been thinking about for the past two weeks or so.  It reveals one of my hobbies to you, but hopefully you’ll like the analogy.

As you’ve seen in the subject line of this post, I’m going to be talking about knitting and how it relates to tech comm and m-learning. Now, I know what you are going to say. Knitting is for grandmothers who make ugly stuff for everyone, stuff you’re obligated to wear when they come over and visit. You couldn’t be more wrong. Knitting has had a huge upsurge in the last ten or so years, and more and more people are adopting it as a hobby. It goes back to the 9-11 attacks, when people were trying to get back to a sense of security and home. I think with these economic times, it’s also a relatively easy and inexpensive hobby to have (unless you are a true diehard like some of us).

Knitters these days make more than clothing items, accessories and toys; knitting has become an art form unto itself. (Ever hear of yarn bombing?) I can’t remember when I started knitting exactly…but I’m guessing it was around 2004 or 2005. I know I was knitting in 2006 when I went to California for a convention for the now-defunct Body Shop At Home business, and met Anita Roddick, as there’s photographic evidence of knitting in my hands during the event. In any case, it gave me a chance to learn something new that spoke a whole other language of its own, had a different vocabulary, and I got to work with all sorts of colors and textures in the process. For someone with sensory integration issues, it’s a great outlet for sight and touch. Even the rhythm of knitting up something has a calming effect, and following patterns forces my brain to focus.

So what does all this have to do with tech comm and m-learning? Well, as I thought about it, there’s definitely an analogy that could be made about the benefits of knitting and how they lend themselves to these topics. All this first came to me after I had spent the day at Adobe Day, and later took a small sojourn into the city of Portland with four fabulous technical communicators who also happen to be knitters. They had invited me to go on a yarn crawl with them (similar to a pub crawl, but in search of high quality yarn instead of libations), and I readily accepted. We only made it to one store, but we had a great time checking out all the high-end yarns and knitting notions available.

Pardon us drooling over the yarn!
Five technical communicators who
are also knitters! We know!
At KnitPurl, Portland, OR, during LavaCon
L to R: Sharon Burton, me, Sarah O’Keefe,
Marcia R. Johnston, and Val Swisher

As I reflected on Adobe Day, one of the big themes of the morning was the idea of using structured content. Without structured content, all of one’s content could fall apart and lose strength. An architecture needs to be created to make it work. Well, knitting is like that. If one doesn’t follow a pattern, and just knits in a freestyle, haphazard manner, then instead of a nice jumper/sweater, one could end up with a garment with no neck hole and three sleeves. Without the structure of a pattern, and without reusing good content (or stitches, or groupings of stitches that describe appropriate methods for structure), the whole thing falls apart. The beauty of reusable content in content management, just as in a knitting pattern, is that content which is well produced, solid, and clearly and concisely understandable produces good results and can be recombined effectively in different instances without losing its meaning. Take a look at a sweater or knit scarf you might have. You’ll find that each stitch makes sense, even when you look at how the patterns of the cuffs and collar differ from those of the sleeves and the body. Yet it all fits together. In my mind, this is how reusable content works. Very tight, well-written content can be reused in different combinations without losing its context and form if done correctly.
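For readers who also work with content management systems, the reuse idea above can be sketched in a few lines of code. This is a toy illustration only; the topic names and text below are invented, not taken from any real system.

```python
# Toy sketch of structured-content reuse: small, self-contained "topics"
# are written once and combined into different deliverables.
# All topic names and text below are invented for illustration.
topics = {
    "intro": "This guide explains the widget.",
    "safety": "Unplug the widget before opening it.",
    "install": "Snap the widget into the mounting bracket.",
}

def assemble(topic_ids):
    """Build a deliverable from an ordered list of reusable topics."""
    return "\n\n".join(topics[t] for t in topic_ids)

# Same topics, recombined into two different outputs without rewriting:
user_guide = assemble(["intro", "install", "safety"])
quick_card = assemble(["safety", "install"])
```

Just as well-formed stitches can be recombined into cuffs, collars, or sleeves, well-formed topics keep their meaning in any deliverable that includes them.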

The other way I thought of the analogy of knitting had to do with how one learns to knit, and how that relates to m-learning. Knitting Fair Isle sweaters, Aran sweaters or lace shawls doesn’t come on the first day of learning how to knit. Heck, I’m still learning how to do all these techniques! It comes with learning a foundation–namely the knit stitch and the purl stitch–and building upon that foundation. Any piece of knitting you see is a matter of thousands of knit and purl combinations. But first, one has to master the simple knit and purl stitches by learning how to gauge the tension between the needles, the yarn and your fingers. Once that is mastered, and the knitter learns to read the “codes” of the knitting language, such as K2, P2, S1 (that’s knit two, purl two, slip one), the real fun begins. Knitters have to pay attention to details in the directions, because knitting can be a long task. Except for tiny baby sweaters or sweaters for dolls or stuffed animals, I don’t know of any sweater that could be hand-knit in a single day, even if it was worked from the time the knitter woke up until the time the knitter went to bed. It just couldn’t happen, even for a fairly experienced knitter. So, each part of the knitted pattern must be learned or read in chunks so the knitter can understand where he or she left off. Talk to me about lace patterns especially, and it’ll make more sense. Each technique takes time to master, and most knitters learn these techniques a little bit at a time. Whether a knitter is self-taught or taught in a conventional learning environment, nobody learns all there is to know about the most advanced knitting techniques on the first day. Just getting knitting and purling down takes a while. It’s an arduous task to learn to knit and knit well, and to be patient enough to see a pattern all the way through.

Just like in m-learning, things need to be learned in small chunks for comprehension. Information has to be short and to the point so that the reader, just like the knitting pattern reader, can take that information, mentally digest it, and then work out how to use it. There is definitely trial and error in both m-learning and knitting; if one doesn’t succeed, it’s possible to go back, re-learn the information and correct the mistake, and in doing so, one retains the information better.

Now, if one happens to be BOTH a technical communicator AND a knitter, then these are easy concepts. Reusing content, breaking down information into smaller portions for better learning retention, and structuring content appropriately and consistently come with both our words and our stitches.

A variety of tools can be used in either case to create the content. For technical communicators, it’s the use of different software tools that helps us achieve our goal. For knitters, different sized needles, different kinds of yarn and other tools can be used in the process. Is there only one way of doing things? Of course not. Is there any single tool that will do the job? Generally, no. This is the beauty of both technical communication and knitting. In the end, the most important tool is the mind, because without the individual mind, creativity and intellect cannot be expressed. When all of these tools and factors work together, it is possible to create a fantastic piece of work. When they don’t, the result can look pretty disastrous.

So, next time you see someone with a pair of knitting needles in their hands, look carefully at the workflow that person is following. You might learn something from it.


Flame Wars need not apply.

I had planned for this post to be something a little more lighthearted, but my plan changed when I received my first insulting comment on this blog. It came in, made accusations that proved the person hadn’t read the blog post carefully, and additionally insulted my relationship with Adobe. I was shaken up by this comment, because it was meant to be insulting, and the criticism was in no way constructive. I was taken aback by it, and when I told my husband about it, he replied, “This is ‘typical’ internet behavior these days… don’t take it personally.” I knew he was right, but still…it truly bothers me. It certainly doesn’t seem like professional behavior.

I choose my words carefully on this blog. No entry is written off the cuff, and I take a lot of time to write and edit each post. I do my best to be as diplomatic as possible when writing, even if I have a very strong opinion about something. I do my best not to insult anyone or anything. I try to dish out constructive criticism when I feel it’s necessary. My intention is to put forward my own thoughts as a new technical communicator who is trying to make her way into the field, and to share ideas that I find interesting or educational. If I curate something from the web through my ScoopIt account, it’s because I found something worthy of sharing with my TechCommGeekMom audience.

This blog started out as a class project in graduate school, and it has taken off to have a life of its own. I don’t claim to be an expert. I don’t claim to be highly experienced. I don’t claim that I am familiar with everything that is related to tech comm. I try to be humble with what I do or don’t know. Yes, I have some knowledge and experience, but if you want to read commentary from someone more experienced who is an “expert” in the field, please, be my guest. You can go elsewhere.

I do write a lot about Adobe on my blog, and I feel that I need to clarify that, because if this one individual is questioning it, perhaps others are as well. My current relationship with Adobe was something that happened to me by surprise. I have always been a fan of Adobe products, even before this association happened. I’ve been using Adobe products for the last 15 or so years. I wrote a case study in grad school supporting Adobe’s business practices with Flash a year ago–well before I ever started this blog. So, when Adobe contacted me several months ago, it was a total shock, really out of nowhere for me. All I did was promote my blog and a post on my blog that called out Adobe and its competitors for making it a little difficult for students to get their hands on tech comm software. I never expected anyone to respond. If MadCap Software, the makers of Flare, had responded the way that Adobe did, I’m sure I would be a Flare advocate right now. Same with the makers of Lectora and Articulate. I’m new, and when I wrote that fateful post, I just knew that these software packages have the same main function, and that I needed to learn this kind of software to get a job. Plain and simple.

Out of the many companies that I named in that blog post, Adobe was the only one that actually responded. As I said, I didn’t expect ANYONE to respond–it was just a fairly well articulated rant, if I do say so myself. Evidently, someone at Adobe thought so too, and wanted to help. Since I already liked their products, how could I not respond favorably to them? When they offered me the chance to do a webinar for their Thought Leadership series, that shocked me as well. What the heck did I have to offer or to say? I’ve been told it was because, as someone new to the TC world, I had a fresh perspective on the field, and it was great to get a new opinion in the mix. From there, Adobe has provided me with opportunities such as sitting in on a conference call previewing products, attending a pre-conference event hosted by them at a major tech comm conference, and promoting my blog to a global audience. Did I ask them to do that? No, not at all. Am I going to take advantage of such opportunities? Well, I would be very stupid not to, especially since it’s still very early in my tech comm career!

Adobe is an advertiser on my page, but they aren’t paying me a salary. I am not employed by Adobe at all. (Although I wish I was! I’d be a great product evangelist!) I would love to have additional advertisers on this blog, as I totally embrace diversity in products and software if it helps get the job done. If Apple, Google, Microsoft, MadCap, Lectora, Articulate, TechSmith or any other software or hardware vendor wants to establish a business partnership to advertise on my blog, I welcome the opportunity! These are among the best of the best, and there are plenty of others out there as well that I’d be happy to include. Adobe happens to be the first to take advantage of my offer there on the right column.

Adobe is like the Doctor from Doctor Who in my life. They came in unexpectedly, have taken me places, and given me opportunities that I would not have had without them, so there is a certain amount of loyalty they’ve earned from me. Is that so wrong in that context? I don’t think so. Unless they do something really ugly and downright horrible to me, I have no reason not to support them, especially in light of them supporting me and this very young blog that’s only 7 months old. They have never told me or asked me what to write on this blog. They have supported my independent thinking. This is not an Adobe blog. Perhaps it leans towards a “fan blog” sometimes, but it’s not solely concentrated on Adobe.

TechCommGeekMom addresses technical communications, m-learning, e-learning and educational technology from my perspective as a new technical communications professional who is trying to make her way into this field and make a difference. While TechCommGeekMom is meant to be a place where I can share my thoughts and concerns, others can as well. Differing opinions are welcome if they are done in a fair and constructive manner. This blog is meant to embrace and discuss the best practices in the tech comm and e-learning fields as they move forward. If you don’t like what you read, that’s your prerogative, and you can go elsewhere. But I’m not going to change how I write or who I am for anyone. I hope that my regular readers, as well as newer readers, will appreciate my position, and embrace it by continuing to visit this blog.

As a mom, I’d like to quote Thumper in the movie Bambi: “If you can’t say something nice, don’t say nothing at all.”


Adobe Day at LavaCon 2012 Roundup!

This post is just a quick summary of the Adobe Day at LavaCon 2012 series from this past week. As you can see, there was so much information that it took six posts to summarize the event!

Being in Portland, Oregon was great. It was my first trip there, and as a native Easterner, my thoughts turned to the pioneer spirit of moving westward in this country. Once there, I found a hip, young, modern city that continues to look toward the future. The information I gathered at Adobe Day was general, endorsement-free, practical information that I can use going forward as a technical communicator, and by sharing it, I hope that others in the field will take on that same pioneering spirit to advance what technical communication is all about, and bring the field to the next level.

To round up the series, please go to these posts to get the full story of this great event. I hope to go to more events like this in the future!

As I said, I really enjoyed the event and learned so much. I enjoyed not only listening to all the speakers, but also talking “shop” with so many people who are renowned enthusiasts and specialists in the technical communications field. I rarely get to do that at home (although it does help to have an e-learning developer in the house who understands me), so this was a chance for me to learn from those who have been doing this for a while and who not only have seen the changes, but are part of the movement to make changes going forward.

I hope you’ve enjoyed this series of blog posts. I still have many more to come–at least one more that is inspired by my trip out to Portland, and I look forward to bringing more curated content and commentary to you!

The autograph from my copy of
Sarah O’Keefe’s book,
Content Strategy 101.
Awesome!

Adobe Day Presentations: Part V – Mark Lewis and DITA Metrics

After Val Swisher spoke about being global ready in today’s tech comm market, the final speaker of the morning, Mark Lewis, took the stage to speak about DITA Metrics.

Mark literally wrote the book about how DITA metrics are done, titled, DITA Metrics 101. Mark explained that ROI (return on investment) and business value are being talked about a lot right now in the business and tech comm worlds, so it’s worth having a basic understanding of how DITA metrics work.

Now, I have to admit, I know NOTHING about how any tech comm metrics are done, let alone how DITA metrics are done, so I listened and interpreted the information as best as I could. (Mark, if you are reading this, please feel free to correct any information below in the comments!)

Mark began by explaining that content strategy applies to the entire ENTERPRISE of a business, not just the technical publications. There are lots of ways to measure tracking through various means, including XML. Traditional metrics involved measuring the cost per page, and the type of topic would be gauged in guideline hours. For example, a document outlining step-by-step procedures would equal four to five hours per write-up of that type of procedure. Traditional metrics looked at the cost of the project through the measure of an author’s or a team’s output of pages or publications per month. They didn’t measure the quality of the documents; they were concerned more with quantity than quality.

Mark referenced several studies on which he based the information in his book, especially a paper by the Center for Information-Development Management, titled “Developing Metrics to Justify Resources,” that helped to explain how XML-based metrics are more comprehensive. (Thanks, Scott Abel, for retweeting the link to the study!)

XML-based metrics, Mark pointed out, use just enough DITA information, concerning themselves instead with the task, concept and reference topics within documentation. XML-based metrics can now track the cost of a DITA task topic, showing the relationship between occurrences, cost per element, and total number of hours. The cost of a DITA task topic is lower because referenced topics can be reused, up to 50%! For comparison, Mark said that you can look at the measurement of an author by measuring the number of pages versus the amount of reusable content in a referenced component. The shift is now toward the percentage of reused content rather than how many pages are being produced. Good reuse of content saves money, and ROI goes up as a result!
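To make the reuse-and-cost relationship concrete, here is a back-of-the-envelope sketch in Python. The hours-per-topic figures and hourly rate are my own invented assumptions, not numbers from Mark’s talk; only the idea (reused topics cost far less than newly written ones, so ROI rises with reuse) comes from the presentation.

```python
# Toy reuse metric: cost of a documentation project as a function of
# how many topics are reused. All numeric assumptions are illustrative.
HOURS_PER_NEW_TOPIC = 4.5     # e.g. a step-by-step task topic written from scratch
HOURS_PER_REUSED_TOPIC = 0.5  # cost of referencing an existing topic instead

def project_cost(total_topics, reuse_rate, hourly_rate=60):
    """Estimate writing cost when a fraction of topics are reused."""
    reused = int(total_topics * reuse_rate)
    new = total_topics - reused
    hours = new * HOURS_PER_NEW_TOPIC + reused * HOURS_PER_REUSED_TOPIC
    return hours * hourly_rate

no_reuse = project_cost(100, 0.0)    # every topic written fresh
half_reuse = project_cost(100, 0.5)  # 50% reuse, the figure Mark cited
savings = 1 - half_reuse / no_reuse  # fraction of cost saved through reuse
```

Even with made-up numbers, the shape of the result is the point: the cost curve drops steeply as the reuse percentage climbs, which is exactly the shift away from pages-per-month measurement that Mark described.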

Mark introduced another metrics-based measurement, namely the perceived value of documents as a percentage of the price of a product or of R&D (research and development), as well as looking at the number of page views per visit. Mark warned the audience to be careful of “metrics in isolation,” as they can hide an opportunity loss or a missed marketing window. He clarified that page hits are hard to interpret, because hit statistics could mean either that the reader found what they wanted, or that they didn’t want that information. We have no way of knowing for sure. If technical communicators are not reusing content, projects can actually last longer, hence producing more cost.

Mark emphasized that through metrics, we can see that reuse of content saves money and time. Productivity measures include looking at future needs, comparing against industry standards, seeing how reuse affects costs, and so on. He suggested looking at the Content Development Life Cycle of a project, and using metrics to determine what reused versus new topics cost in this process. By doing this, the value of technical communication becomes much clearer, and tech comm proves its worth to a company or client.

I have to admit, as I said before, I don’t know or understand a lot about the analytical part of technical communication, but what Mark talked about made sense to me. I always thought that measuring the value of an author based on page output rather than the quality of the writing didn’t make sense. Part of that is because, as a newer technical communicator, I might take a little longer to provide the same quality output as someone who is more experienced, but that doesn’t mean that the quality is any less. So measuring pages per hour didn’t make sense. However, if consistency in reusing content is measured instead across all documentation, then quality is, in a sense, being analyzed, because it can be measured by how often information is referenced or reused outside its original context. Using DITA makes a lot of sense in that respect.

More information about DITA metrics can be found on Mark’s website, DITA Metrics 101.

I hope you’ve enjoyed this series of all the Adobe Day presenters. They all contributed a lot of food for thought, and provided great information about how we as technical communicators should start framing our thought processes to produce better quality content and provide value for the work that we do. I gained so much knowledge just in those few hours, and I’m glad that I could share it with you here on TechCommGeekMom.


Adobe Day Presentations: Part IV – Val Swisher asks, “Are You Global Ready?”

Val Swisher
of Content Rules, Inc.

Following a short break after Joe Welinske’s talk about Multi-screen Help Authoring, Val Swisher took to the stage.

Val is the founder of Content Rules, Inc., and she spoke about eight simple rules for technical communicators to follow to make content global-ready–now! Her specialty is doing translation work, so she knows a thing or two about making content ready for a global market. As she went through each rule, she would explain the impact of the rules and why the rules were in place, although some were self-explanatory.

The rules she listed were as follows:

Rule 1: Not all errors are created equal. Some can cost you thousands of dollars!
This is one of those obvious rules. Taking the time to write content carefully as well as making sure proper editing is done is a necessity. Even one small typo can make a difference.

Rule 2: Creative Writing is a myth. Standardize.
Val’s point with this rule is that superfluous writing is not necessary. Keeping content clear, concise, cogent and correct is especially important in translation, and allows for better reuse of content.

Rule 3: Real copy editors don’t do it without a terminology manager. 
It is vital to use the same terms for certain words, especially for translation purposes. For example, the words “puppy”, “dog”, and “canine” all refer to the same animal, but are clearly different words, even though they essentially mean the same thing. In translation, there are times when this much word variation for a single item isn’t available in another language, so choosing one word as the referential term is recommended. It keeps terminology within the content–especially if reusing content–consistent. Style guides are, unfortunately, not followed as often as they should be. A system is needed to manage terminology and help prevent problems like this example from occurring.
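A terminology manager can be as elaborate as a commercial product, but the core idea, mapping variant terms to one approved term, is simple enough to sketch. The term list below is invented for illustration, using Val’s puppy/dog/canine example:

```python
import re

# Map non-approved variants to a single approved term so that source
# content (and therefore its translation) stays consistent.
# The approved-term choices here are invented for illustration.
TERM_MAP = {
    "puppy": "dog",
    "canine": "dog",
}

def normalize_terms(text):
    """Replace each variant word with its approved term, case-insensitively."""
    def repl(match):
        return TERM_MAP.get(match.group(0).lower(), match.group(0))
    return re.sub(r"[A-Za-z]+", repl, text)

result = normalize_terms("Feed the puppy; the canine is hungry.")
# result == "Feed the dog; the dog is hungry."
```

With one referential term enforced in the source, the translator (or translation memory) only ever has to handle one word for the concept, which is exactly the consistency Val was arguing for.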

Rule 4: Have you got translation memory (a translation database)? Your vendors do. Use it. It keeps content standardized and saves money.
This is another fairly self-explanatory rule. I was not aware, since I’m not in the translation business, that there are such things as translation databases. From what I understand of how it works (and someone please correct me if I’m wrong), when a specific turn of phrase is used in one language, a translation database stores a specific translation for that combination of words in another language. When a translation is done, the database looks for that word combination and translates it accordingly. This, again, allows for consistency in translations between the different language editions of content. As a technical communicator who does translations, Val is saying that if you don’t have such a database in place, you should, because in the long run it will standardize content and save money.
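As I understand it, the heart of a translation memory is a lookup table of previously approved segment translations. Here is a minimal sketch of that idea; the English/French pair is an invented example, and real tools also do fuzzy matching that this sketch omits:

```python
# Minimal translation-memory sketch: exact-match lookup of previously
# translated segments. Real TM systems also support fuzzy matching.
tm = {
    ("en", "fr"): {
        "Press the power button.": "Appuyez sur le bouton d'alimentation.",
    }
}

def translate_segment(segment, src="en", tgt="fr"):
    """Return (translation, True) on an exact match, else (segment, False)."""
    hit = tm.get((src, tgt), {}).get(segment)
    if hit is not None:
        return hit, True   # reuse the stored translation: consistent and free
    return segment, False  # miss: route the segment to a human translator

text, matched = translate_segment("Press the power button.")
```

Every exact hit is a segment that doesn’t need to be re-translated, which is where the standardization and cost savings Val mentioned come from.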

Rule 5: Don’t complain about quality of your tech writers. You agreed to outsource docs to ___ in the 1st place.
Val pointed out that while there are good outsource resources for writing and translation out there, sometimes the quality is not as good as keeping it in house or closer to home, especially if the content is written by someone whose first language is not English. Good quality source material is key! Having good quality source material helps control costs, especially with translation!

Rule 6: If you write flabby copy, even the nicest vendors will email you a bill for localization that will astound you.
Again, this comes back to having quality content in place. Val’s point was that if you write weak content that is difficult to translate because it is not quality content, even the nicest vendors will send you a bill for translation for localization purposes, and the bill will be VERY HIGH. Again, having quality content saves money!

Rule 7: Get rid of extra adjectives and superlative words! Delay this product launch, and there’s no next product launch.
This rule is a strong recommendation about how content should be written. Extra adjectives, adverbs and other superlative words do not enhance the content. Words that have to be rewritten or translated can delay a product going out, and for a client, that can be a deal-breaker. If the launch is delayed because translation took too long, there will be no next product launch to help with. Obviously, that would be bad business.

Rule 8: Translation is a team sport. You want to work alone? Become an accountant.
While this rule elicited a laugh from the audience, it was a point well taken. Teamwork is KEY! Better English source content results when source writers and translators work together.

Val was asked the question at the end of her presentation, “What alternative tools for style guides are on the market?” She responded that there are lots of software tools out there, but to be careful about push technology within those software items.

More information can be found at Val’s website, http://www.contentrules.com  and her free e-book is available by e-mailing her at vals@contentrules.com.

I found this presentation rather fascinating, especially since Val presented it with a sense of humor. But her point was clear. Content needs to be as precise as possible when it will be reused and especially when used in translation for consistency. By following her basic rules, costs can be controlled, and the quality of the content can only get better.

I thought about what it takes to do translation, searching my own memory banks from when I almost minored in French during my undergrad years and had to do translations, to the present day watching my husband translate literature written in German into Spanish for a group he’s been involved with for years, to my own struggles to translate what I want to say to my in-laws into my broken Spanish. Translation is not an easy task, and translating my English thoughts into another language can get tricky because of the turns of phrase and colloquialisms used from area to area. Even in talking to my husband about the topic, he will say that there are different idioms used between Spanish-speaking countries, although the Spanish will still be relatively “standard.” Being from Ecuador, he can still understand someone from Spain, Mexico or Argentina as much as an American can understand someone from the UK, Canada, or Australia. I’ve even found in my own teaching of a business and technical writing course to a corporate group in Asia that English taught globally is not consistent, because the source English comes from different countries, so I have to go and set the record straight. I can certainly appreciate how consistency and choice of words can lead to better quality content and communication in the long term.

The next presentation, and the last in this series: Adobe Day Presentations: Part V – Mark Lewis and DITA Metrics.