This news came out today, and I saw it through the Mashable feed. Google’s AI Assistant is really learning how to interact using natural language in a big way. The future, if it isn’t here already, is coming very soon!
If this is truly working, and I’m guessing it’ll be available to the public soon enough, it’s going to be kicking the back end of Siri and Alexa and Cortana. I’ve used Siri for a while now, and it’s not perfect, but it’s okay–it’s gotten better over the years. Alexa has been a bit of a disappointment to me–Siri can usually do better. With mixed results from those two, I haven’t really ventured into trying Cortana, but I’m willing to bet that it’s still not as developed as the Google Assistant.
How does this affect technical communicators? Big time. From what I can tell, this ties into the chatbots and machine learning that have been talked about recently. But at the same time, it affects how we communicate through rhetoric and voice. Sometimes we take actual speaking for granted, and it’s only when we try to describe something clearly out loud that it becomes difficult. Or sometimes we can write something out well but can’t explain it well in voice. This means that plain, very clear language is going to be essential going forward as we develop content for these AI assistants.
Soon enough, we’ll be talking to HAL or to our starship’s computer with ease.
When going back in time in Star Trek IV, Chief Engineer Scott forgot that there wasn’t AI in the late 1980s.
What do you think about this development? It’s exciting to me–enough to make me want to purchase a Google Assistant! It definitely raises the bar for Apple, Microsoft, and Amazon. Let some healthy competition begin! (And more tech comm jobs with it!) Share your thoughts below.
Anyone who has read this blog for a while knows how much I love Val Swisher of Content Rules. Why? Well, first of all, she’s a lovely person and great friend all around. But that’s beside the point. As a technical communicator, she is one of the foremost experts on content strategy, specializing in global content strategy. I’ve seen Val give presentations at various conferences, and the thing I like about her presentations is that while her topics might be high-level topics, someone like me who is still learning can understand what she is talking about. I never walk away from a Val Swisher presentation without feeling like I absorbed something that I can use in my own work–or at least have a better understanding of how it fits into the content strategy field. I have often credited Val for providing me with the ideas that have helped me get my current job and make an impact there.
So on that note, I was pleased to see that she has written a book on her specialty, Global Content Strategy: A Primer, which is available through The Content Wrangler’s Content Strategy Series published by XML Press. The book can easily be read in an afternoon, yet it’s loaded with information.
If you’ve never had the pleasure of hearing Val’s presentations, this book is a great way to find many of the concepts she covers in one place. She not only provides the basics of what global content strategy is, but also breaks down bigger ideas into simple terms and includes color images as examples, which is a good move. Val explains that globalization is not just about translating content; it also requires being sensitive to and knowledgeable about localization. For example, what works in Portuguese in Portugal doesn’t necessarily work for Portuguese in Brazil. Translations can’t always be made word for word because of idioms and expressions that aren’t universal. Val provides many examples of how this applies not only to text, but to imagery as well.
The book also talks about how the translation process can get complex and bungled without a translation memory database and consistent terminology. Val provides pointers to help global strategists wade through these issues and keep it all straight, covering both what not to do and what the best practices are.
My only criticism of the book, ironically enough, is that it doesn’t seem to be written for a global audience, but rather for an American–or perhaps North American–audience. There’s nothing wrong with that, but what if I were in another country and looking to create a global strategy? Perhaps it’s because, as the Internet has grown, it has seemingly been centered on American English, and by writing for an American audience, Val has written for the group that most needs to become aware of the global audience it has to reach!
I’ve been a team member on a global web project for the last six months or so, and I passed along much of what Val had taught me to the members of my immediate team to show that even though we were working on a North American section of a website, we had to find a balance between global and local content. Sometimes my words were heard, and sometimes they fell on deaf ears. Upon reading this book and revisiting the concepts I’ve heard her present in the past, I do wish it had been given to someone at the top of the global team, so they would understand that translation alone is not enough and that localization makes a big difference. They could have used this book as a reference to better streamline the process and the web project. Many global companies could benefit from reading it to help put their content in perspective.
If you are looking to get acquainted with the concept of global content strategy and what it entails, this is a great resource. There are a lot of details squeezed into this slim volume, all easy to understand, yet you’ll feel a little overwhelmed at first by how much one needs to consider when creating a global strategy. Fortunately, this reference book breaks it down so that it isn’t as overwhelming as it could be, and it helps content strategists think in a more single-sourced, consistent way to provide the best ROI for a project.
Content Rules Inc. was kind enough to extend their invitation to have me blog for them again. This time, it’s on a subject that’s near and dear to their hearts as well as mine.
This article talks about my own experiences in trying to use standardized language. Whether you use standardized language in your personal or professional life, it’s something every writer needs to keep in mind, especially when writing for a global audience, and even more so when writing for a digital format that’s easily accessed through the Internet. It’s not easy to do, but it’s something that should be tucked in the back of every writer’s brain.
I realized as I was writing this post that this would be my 500th post on TechCommGeekMom. Who knew that so much information and thought could accumulate through original posts and curated content? I’m also very close to my all-time 15,000 hits mark (only a few hits away at this writing). I wouldn’t have believed you if you had told me when I started this blog that I’d hit these benchmarks, but of course, I’m going to keep going! I debated what to write for my 500th post–whether to finish my Adobe Day coverage or do something else–and in the end, it seemed fitting to finish that coverage, because in many respects, knowing and writing about the presentation of Scott Abel, aka “The Content Wrangler,” shows how far I’ve come in my tech comm journey, from beginner to covering internationally known presenters.
Scott is one of the most prolific and vocal speakers on the conference circuit when it comes to content, whether it’s content management or other technical communication topics. It also seems like he has written the forewords to many of the best tech comm books out there. He’s everywhere! To boot, he’s an accomplished DJ, and I found myself “bonding” with him over dance remixes and mash-ups while at Lavacon, because I always enjoy it when he posts his own mash-ups or his favorites on Facebook. (I’ll be writing a post about the relationship between tech comm and dance mash-ups in the near future.) He is a person so full of kinetic energy that you wonder when he’s going to explode, but he doesn’t. Even when I saw him at the STC Summit last spring with a bad cold, he was still more on top of his game than a lot of people would be on a good day. Much like my love for all things Val Swisher, my love for all things Scott Abel knows no bounds. He knows how to stir things up at times, but there is no denying that even at his frenetic presentation pace, you learn SO much. I’m lucky that he’s kind enough to be one of my cheerleaders!
Scott Abel checking his files before his presentation
So when it came to thinking of a garden in Portland to use as an analogy to Scott, I had to deviate. In my mind, he’s the Voodoo Doughnuts shop located about four or five blocks away from the Chinese Garden. Scott’s talks always have lines going out the door, and like many of the Voodoo Doughnuts themselves, the unique flavors dispensed open your mind up to new and delicious possibilities and ideas, and you come back wanting more (hence, more long lines!). They are both crazy and sweet at the same time. You can’t beat that combination.
Scott was the keynote speaker for Adobe Day as well as the moderator of the discussion panel later in the event. His talk was titled “Five Revolutionary Technologies Technical Communicators Can’t Afford To Ignore.” If Joe Gollner went fast during his presentation, then Scott went at lightning speed, so my notes below are the highlights.
Scott started by telling us that translation is going to be an important part of automated content going forward. It’s important to understand that the World Wide Web (WWW) is the “land of opportunity”: it can reach a global market full of new consumers. As American users, we forget that 96% of web users are not in the US, and the world doesn’t all speak English. In fact, less than 6% of the global population speaks English well, and even those who speak it don’t necessarily read or write it well.
Scott’s list of the five technologies tech comm can’t ignore was as follows:
1) Automated Translation
Why would we need automated translation? We write for the *worldwide* web. There are over 6,000 languages in the world, so translation is a big deal for global reach and global connection. We need to recognize that content is written for both machines and humans, but we need to write for machines first, as they are the “gatekeepers” of content, such as in search. Everything goes through the machine first. The writing rules we learned in elementary school are no longer sufficient for a world in which language science is needed. We need to examine our content from the vantage point of a rules-processing engine and ensure it’s optimized for machine translation.
2) Automated Transcription
Automated transcription involves software that converts speech to text so machines can use it. Without transcription, content is locked and hidden from view; transcription makes content far more searchable. Scott recommended Koemei as a good transcription tool for video and general transcription, since it can help turn that content into editable text that can then be translated into other languages.
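To make the idea concrete, here’s a minimal sketch of automated transcription in Python using the open-source speech_recognition package (not Koemei, which Scott mentioned; the file name is just a placeholder):

```python
# Minimal sketch of automated transcription (speech-to-text).
# Uses the open-source speech_recognition package, not Koemei;
# "webinar.wav" is a placeholder file name.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("webinar.wav") as source:
    audio = recognizer.record(source)  # read the entire audio file

try:
    # Send the audio to Google's free web speech API for transcription
    transcript = recognizer.recognize_google(audio)
    print(transcript)  # the spoken content is now searchable text
except sr.UnknownValueError:
    print("Speech could not be understood")
```

Once the speech is plain text, it can be indexed for search, edited, and sent through the same translation workflow as any other content.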
3) Terminology Management
Terminology management controls words in a central place, namely the words used the most and used consistently for branding, products, and so on. Terminology management is important for consistency as well as for regulatory reasons. This is an instance where a global content strategist is needed to help standardize processes. It’s best to adopt a terminology management system, such as Scott’s suggestion, Acrolinx (an Adobe partner).
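Here’s a toy sketch of the kind of check a terminology management system automates: flagging non-preferred terms and suggesting the approved alternative. The term list and sample sentence are invented for illustration; a real system like Acrolinx does far more than this.

```python
# Toy terminology checker: flags non-preferred terms and suggests the
# approved alternative. Term list and sample text are invented.
import re

# Approved term -> variants writers should avoid
TERM_RULES = {
    "select": ["click on", "hit", "press on"],
    "dog": ["canine", "pooch", "pup"],
}

def check_terminology(text: str) -> list[str]:
    findings = []
    for preferred, variants in TERM_RULES.items():
        for variant in variants:
            if re.search(rf"\b{re.escape(variant)}\b", text, re.IGNORECASE):
                findings.append(f'Use "{preferred}" instead of "{variant}"')
    return findings

sample = "Hit the OK button to register your pooch."
for finding in check_terminology(sample):
    print(finding)
# Use "select" instead of "hit"
# Use "dog" instead of "pooch"
```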
4) Adaptive Content
Adaptive content is content that is structured and designed to adapt to the needs of your customer; it’s about the substance of the content. Adaptive content adapts to devices, e.g., laptops, GPS units, and smartphones. Customers are demanding exceptional experiences, and it’s up to responsive designers to meet that challenge. Adaptive content makes it possible to publish to multiple platforms and devices because it is content separated from formatting information. By allowing authors to focus on what they do best, adaptive content makes content findable and reusable by others who need it. We need to rethink content; the move to adaptive content involves work, but the ROI (return on investment) can be realized in months instead of years.
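A minimal sketch of that “content separated from formatting” idea, assuming an invented content model: the substance lives in one structured record, and each channel decides how to present it.

```python
# Sketch of adaptive content: the substance lives in a structured,
# format-free record, and each channel renders it differently.
# The content model and render functions are invented for illustration.
from dataclasses import dataclass

@dataclass
class Topic:
    title: str
    summary: str
    steps: list[str]

topic = Topic(
    title="Reset your password",
    summary="Recover access to your account in three steps.",
    steps=["Select Forgot password", "Check your email", "Choose a new password"],
)

def render_web(t: Topic) -> str:
    """Full HTML page for desktop browsers."""
    steps = "".join(f"<li>{s}</li>" for s in t.steps)
    return f"<h1>{t.title}</h1><p>{t.summary}</p><ol>{steps}</ol>"

def render_mobile(t: Topic) -> str:
    """Condensed plain text for a small screen or a voice assistant."""
    return f"{t.title}: " + " -> ".join(t.steps)

print(render_web(topic))
print(render_mobile(topic))
```

The authors write the Topic once; the formatting decisions belong to each output channel.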
5) Component Content Management
Component content management systems focus on storing the content components that are used to assemble documents. Components can be any size, and they can be photos, video, or text. It’s about managing CONTENT, not FILES.
Scott provided these slides as his example to show this:
Structured content, combined with a component content management system, supports personalized content and targeted marketing, which in turn increases response rates. In the end, this process can save money! The key is to remember that all customers are not the same! Reusing content without “copy and paste” methods produces the best results. You can ensure that content is consistent by seeking a content strategist who understands content and is a technologist, and by implementing a component content management system. Scott suggested checking out Astoria Software for a good component content management system.
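To illustrate “managing content, not files,” here’s a toy sketch (my own, not from Scott’s slides) where documents are assembled from shared components rather than copy-pasted text, so fixing a component once updates everything that reuses it.

```python
# Toy illustration of component content management: deliverables are
# assembled from shared components instead of copy-pasted text.
# Component names and content are made up for illustration.
components = {
    "safety_warning": "Warning: disconnect power before servicing.",
    "intro_widget": "The Widget 3000 helps you automate daily tasks.",
    "support_contact": "Need help? Contact support@example.com.",
}

# Each deliverable is just an ordered list of component IDs.
documents = {
    "user_guide": ["intro_widget", "safety_warning", "support_contact"],
    "quick_start": ["intro_widget", "support_contact"],
}

def assemble(doc_name: str) -> str:
    """Build a deliverable by pulling the latest version of each component."""
    return "\n\n".join(components[cid] for cid in documents[doc_name])

# Updating the safety warning once changes every document that reuses it.
components["safety_warning"] = "WARNING: Disconnect power before servicing the unit."
print(assemble("user_guide"))
```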
At this point, Scott’s talk had pretty much finished, but in answering audience questions, he pointed out that there’s a lot more than just these five technologies to watch. He suggested that we should also look out for wireless electricity, flexible surfaces, more wireless devices, wearable computing, and augmented reality. He also said that in order to mature as a discipline, we need to be content craftspeople, content designers, and content engineers. We need to leverage both content and code, and think more like engineers and less like writers and editors. Even websites that are very localized still need to be written with a global audience in mind, which improves the English for native speakers as well. Controlled vocabulary helps all end users!
Scott covered a LOT of information in a short amount of time, and he set the tone for the rest of the session, as the presentations that followed repeated much of the same information. (This is a good thing, because then we know that the information is valid, coming from several experienced technical communicators!)
Scott posted on Twitter that his presentation was available on SlideShare, but I have it below.
And as always–Scott, if I misinterpreted or misquoted any of the information I summarized above, please let us know in the comments!
Val Swisher was the next-to-last speaker at the Adobe Day at Lavacon 2013 event. For those who are regular readers of this blog, you know that my love for all things Val Swisher has no bounds. I’ve always been able to take her easy-to-digest information, absorb it quickly, and relay her knowledge to others. When I looked for a Portland garden to compare her to, I chose Ladd’s Addition Rose Garden. While it’s not as well-known (unlike Val, who is very well-known), this particular park, according to The Rose Garden Store, was one of four rose gardens built on the Ladd estate, designed so that together they form the points of a compass. I often think of Val as my compass, as she has never steered me wrong with her information or with the wisdom and fun that she’s shared with me one-on-one.
Val’s Adobe Day presentation centered on source English terminology in a multi-channel, global world, and how terminology affects structured authoring, translation, and global mobile content. She started the talk by reminding us that historically, we’ve always created content, whether on cave walls, through stenography, on typewriters, or eventually on word processors. In every instance, consistent terminology has been essential for structured authoring and content. Managing terminology is also essential for translation and for reuse. She noted that the prior attitude used to be that the more complicated the writing, the “fancier” the product seemed. Today, that’s definitely not true. She used an example I’ve heard her use before, but it’s so simple that it’s a classic: writing for a pet website. If multiple words meaning dog are used, there can be a problem with reuse, because you can’t reuse content if you use different words.
Here’s the example Val showed.
Val pointed out that it would be an even worse situation if technological or medical terminology were used instead.
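To make the reuse problem concrete, here’s a toy sketch (my own illustration, not from Val’s slides) of how a translation memory sees sentences that mean the same thing but use different words for dog: they don’t match, so each variant has to be translated and paid for again.

```python
# Toy illustration of why inconsistent terminology hurts reuse:
# a translation memory only reuses a translation when the source
# segment matches exactly. Sentences and translations are invented.
translation_memory = {
    "Feed your dog twice a day.": "Füttern Sie Ihren Hund zweimal täglich.",
}

new_segments = [
    "Feed your dog twice a day.",      # exact match: reused for free
    "Feed your canine twice a day.",   # same meaning, new words: paid again
    "Feed your pooch twice a day.",    # same meaning, new words: paid again
]

to_translate = [s for s in new_segments if s not in translation_memory]
print(f"Segments reused: {len(new_segments) - len(to_translate)}")
print(f"Segments to pay for: {len(to_translate)}")
```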
Val continued by saying that when it comes to XML, reuse, and terminology, you cannot realize the gains of structured authoring if you’re not efficient with your words. Consistent terminology is critically important to gaining those opportunities.
Val Swisher explaining how to approach content from a translation perspective.
Translation comes down to three elements: we’re trying to get better, cheaper, and faster translation output. We MUST use technology to push terminology and style/usage rules to content developers. To make translation cheaper, we need fewer words, reused words, and reused sentences. It’s impossible for writers to know, or even to know to look up, every terminology and usage rule, so we MUST automate with technology. For example, “Hitting the button” is not translatable, but “Select OK” is fine! She said, “Say the same thing the same way every time you say it.”
For better translation, quality needs to improve at the source, and meanings need to match for machine translation to be a real possibility. Bad translation comes from the source itself: if the source information is problematic, then the translation will be problematic. The best way to save money and time is to say the same thing, every time, using the same words, and to use shorter sentences. For machine translation, don’t go over 24 words in a sentence.
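Here’s a minimal sketch of a check based on that 24-word guideline. The threshold comes from Val’s talk; the sample text and the code itself are my own illustration, not any real controlled-language tool.

```python
# Minimal sketch of a controlled-language check: flag sentences that
# exceed the 24-word guideline for machine translation. The sample
# text is invented for illustration.
import re

MAX_WORDS = 24  # guideline from the talk for machine-translatable sentences

def long_sentences(text: str) -> list[str]:
    """Return sentences that exceed the word limit."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [s for s in sentences if len(s.split()) > MAX_WORDS]

sample = (
    "Select OK to save your settings. "
    "If you happen to find that the configuration panel does not appear to be "
    "responding in the way that you might normally expect it to respond, restart it."
)

for sentence in long_sentences(sample):
    print(f"Too long ({len(sentence.split())} words): {sentence}")
```

A check like this, pushed to writers as they work, is one small example of using technology to enforce terminology and usage rules automatically.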
Faster translation means content that takes less time to translate, needs fewer in-country reviews, and gets to market more quickly. The keys to delivering global mobile content are responsive design, global mobile apps, careful text selection, and, most important of all, terminology. Val showed this example of how translation in responsive design isn’t working, where the Bosch websites are not exactly in sync:
The mobile website on the left looks nothing like the English language version on the right.
The simpler the design of the website, especially on mobile, the less you have to tweak it. This is especially true where consistent terminology is important, because consistency is needed for structured authoring. Creating truly faster, cheaper, and better translation enables true global responsive design. This is not a simple task; writing simply never is, even when the concepts are complex. Even if you think you’re not translating, your customers are, so the content needs to be very clear. The scary part is that some companies use Google Translate as their translation strategy, which is risky at best. To use something like Google Translate as the translation software, the content had better be tight, clear, and consistent.
One of the things I enjoy about Val Swisher’s presentations is that it all comes down to common sense. She breaks it down into easy, manageable parts for those of us–like me–who might not have thought about how language fits into structured authoring, or about the consequences of not planning content with translation in mind.
I highly recommend checking out Val’s blog for other great insights.
(As always, Val–if you’d like to add or correct anything here, please do in the comments below!)