This is part 5 of a 5-part series about my time at the SXSW Interactive Conference 2017. I’ve consolidated information around five trends and some of the most interesting things I learned about each of them. (Originally sent as a series of emails to a smaller group of associates; RPA thought they were so great that we decided to archive them on our website.)
I couldn’t do a wrap-up of trends from SXSW without touching on chatbots.
When I first went through the list of sessions for SXSW on the app, the number of sessions on chatbots, conversational UI, and artificial intelligence was staggering. You could have easily spent the entire week in sessions on those topics and nothing else.
If you’re speaking at a conference, it’s very good practice to predict a PARADIGM SHIFT. It really makes everyone sit up in their chairs. And if you’re right, and the world does change like you’ve predicted, you’re on the record about it and feel really good about yourself. And if you’re wrong, no one will remember. The darling of the paradigm shifts discussed at SXSW this year was conversational UI. If you think about it, it’s kind of an easy one. Amazon’s Echo was the top-selling product on Amazon during the 2016 holiday season. It seems like just about every new product can either talk to you or be connected to something that can talk to you (“The Internet of Things” – another big topic at the conference). It seems entirely possible that in the very near future we could start using messaging and conversation to access what we need online, rather than traditional search engines and websites. As discussed in yesterday’s email, we are increasingly looking to social to discover, and then we can just ask Alexa to buy it. Who needs the web?
In anticipation of this shift in internet behavior, here’s what you need to know:
- If you think about the progression of computer technology, we went from code to a more user-friendly GUI/windows/icons system in which you didn’t need to know the code language. Now, iPhones and other touch devices have more of a natural user interface: it doesn’t really require an understanding of any kind of language – you just tap what you want. We are starting to see a rise in conversational interfaces (Siri, Alexa, etc.). With each progression, the interface is minimized, reducing the steps between thinking of something and the command being received.
- Artificial intelligence is still used only narrowly in conversation bots. But for these bots to actually be useful, and not just serve very narrow purposes, AI is essential. One speaker referenced this Jean Baudrillard quote: “The sad thing about artificial intelligence is that it lacks artifice and therefore intelligence.” It was said decades ago, but it still holds true today.
- Human language is completely linear yet totally open-ended, which is very difficult for computers to comprehend.
Chatbot best practices:
- Make sure your bot has a clear and understood purpose. It may have a higher bounce rate because of the narrow purpose, but users will understand and appreciate a bot that actually does one thing well versus being a useless novelty
- Understand the limits of the experience that people will push against and develop messaging around them
- Clearly define the path you want a user to take, anticipate that journey, and provide an "out" that people can reach quickly
- Consider a live handoff to a customer service agent
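To make the best practices above concrete, here’s a minimal sketch of a narrowly scoped bot with a clear purpose, an always-available "out," and a live-agent handoff. The pizza-ordering scenario, keywords, and replies are all hypothetical examples, not anything a speaker demonstrated:

```python
def handle_message(text: str) -> str:
    """Route one user message for a hypothetical pizza-ordering bot."""
    lowered = text.lower().strip()
    # Clear, narrow purpose: this bot only takes pizza orders.
    if "order" in lowered or "pizza" in lowered:
        return "Great - what size pizza would you like: small, medium, or large?"
    # A quick "out" the user can always reach.
    if lowered in {"menu", "help", "start over"}:
        return "I can take pizza orders. Say 'order' to begin, or 'agent' for a person."
    # Live handoff when the bot hits its limits or the user asks for a human.
    if "agent" in lowered or "human" in lowered:
        return "Connecting you with a customer service agent now."
    # Honest messaging around the bot's limits instead of faking understanding.
    return "Sorry, I only handle pizza orders. Say 'order' to start or 'agent' for help."
```

The point isn’t the keyword matching (a real bot would use an NLU service), it’s the shape: one job, an escape hatch, and a graceful failure message.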
Some conversational UI principles to guide development of conversation interfaces:
- Recipient design: tailor the construction of your conversation to the recipient
- Minimization: speakers speak efficiently. People naturally try to use the fewest words required. But if someone doesn’t understand, it’s human nature to overcompensate and over-elaborate
- Understand the difference between voice and text: we say “uh, yeah, I guess I’d like Mexican food” (buying time to figure out what we're going to say) versus typing “Mexican.” Text responses also provide opportunities for conveying more information than voice, which takes much longer to communicate
- Consider actions in line with displayed emotions: understand when something is wrong and show empathy. With the Alma conversational interface, when someone says "You're not too smart, are you?" Alma understands the user is annoyed and replies "Now you're hurting my feelings" to try to move toward a more constructive conversation and humanize the experience
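The empathy principle above can be sketched in a few lines: detect cues that the user is frustrated and acknowledge the emotion instead of repeating the same script. The cue list and replies below are illustrative guesses, not Alma's actual logic:

```python
# Hypothetical frustration cues; a production bot would use sentiment analysis.
FRUSTRATION_CUES = ("not too smart", "stupid", "useless", "you don't understand")

def respond(text: str) -> str:
    """Reply to one message, switching tone if the user sounds annoyed."""
    lowered = text.lower()
    if any(cue in lowered for cue in FRUSTRATION_CUES):
        # Acknowledge the emotion to steer back to a constructive exchange.
        return "Now you're hurting my feelings. Let's try again - how can I help?"
    return "Got it. What would you like to do next?"
```

Even this crude keyword check illustrates the design choice: the bot responds to the emotion first, then redirects to the task.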
If you’re still reading these emails, I’m impressed and thankful. Hopefully you’ve found them interesting. Again, there’s plenty more where this came from, so please let me know if you have any questions or want to grab a coffee and geek out on any of this.