Must Know Terms of 2017
(Part 2)

Covered Terms: Virtual Personal Assistants • Red Route • Digital Twins

Terms covered in Part 1: Blockchain • Information Scent • Haptic Feedback Incorporation • Age Responsive Design

 

Virtual Personal Assistants

“How can I help you today?” “Will that be everything?” “Would you like fries with that?” “Would you like to see our specials?” “May I suggest a different color?” “How may I direct your call?” 

As our devices become always-on listeners, and as they grow more intelligent and humanlike, more of us are relying on voice commands to assist us in everyday life. Virtual personal assistants (VPAs) are embedded software programs that respond to your voice or text requests using the context your recorded online history provides.

Siri, Cortana, Alexa, Google Assistant. These are four premier examples of voice-activated VPAs embedded in the most popular devices. As with swiping or touch-screen gestures, voice commands are intended to reduce the time it takes to accomplish a task. One frustration VPAs seek to reduce in a user’s experience is called “app fatigue”: the tedium many smartphone users feel when trying to perform a basic function. For instance, if you want to edit a photo, you may have to swipe through several pages of icons, open a photography folder, find the right icon, then tap it to open. Hardly efficient. It’s a maddening information hierarchy that leaves you hunting for a needle in a haystack, with piles of similar-looking icons indistinguishable from one another without a magnifying glass. Voice commands, in tandem with contextual information about the user, are meant to solve this problem by providing a speedy shortcut to the desired action.
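To make the shortcut idea concrete, here is a minimal, hypothetical sketch of mapping a spoken request straight to an action, skipping the icon hunt. This is a toy keyword matcher, not any vendor’s actual pipeline; the intents and keywords are invented for illustration.

```python
from typing import Optional

# Toy keyword-based intent matcher. Real VPAs use far more sophisticated
# natural language processing; the intents below are invented examples.
INTENTS = {
    "edit photo": ["edit", "photo"],
    "play music": ["play", "song"],
    "set reminder": ["remind"],
}

def match_intent(utterance: str) -> Optional[str]:
    """Return the first intent whose keywords all appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if all(k in words for k in keywords):
            return intent
    return None

# "edit my latest photo" jumps straight to the photo editor, with no
# swiping through pages of icons.
print(match_intent("edit my latest photo"))
```

Real assistants replace the keyword table with statistical language models, but the shortcut principle is the same: one utterance, one action.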

The most exciting possibility VPAs promise, though, is their connection to the cloud and to your local data through contextual learning. What does this mean? It means that Siri (for instance) will not only tell you when your next meeting is, but, by drawing on your online habits, tell you that you can fit in a run beforehand, suggest a haircut based on the last time you went in for a trim, suggest a gift for one of the meeting attendees, and much more. Much of what separates man from machine comes down to recognizing patterns and acting on them in a way that improves one’s life. VPAs are increasingly adept at “knowing what you mean” and, beyond that, at suggesting behaviors that reflect who you are, and who you could be, using vast databases of context (see Deep Learning).
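As a sketch of the contextual-learning idea, the fragment below combines a calendar gap with a simple rule to produce a proactive suggestion. The times, durations, and rule are assumptions for illustration, not how any shipping assistant actually reasons.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: pair a calendar with a simple rule to make a
# proactive suggestion. All data and thresholds here are invented.
def suggest_before(next_meeting: datetime, now: datetime,
                   run_duration: timedelta = timedelta(minutes=45)) -> str:
    gap = next_meeting - now
    # Leave a 15-minute buffer to get cleaned up afterwards.
    if gap > run_duration + timedelta(minutes=15):
        return "You have time for a run before your next meeting."
    return "No time for a run; your meeting starts soon."

now = datetime(2017, 6, 1, 8, 0)
meeting = datetime(2017, 6, 1, 10, 0)
print(suggest_before(meeting, now))
```

A real assistant would learn these rules from your history rather than hard-coding them, but the shape is the same: context in, suggestion out.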

The promise behind these assistants is exciting: smart homes, health trackers, and mobile phones all feed your data into a central repository, where analysis can surface suggestions you might never have thought of on your own.

Of course, as many can attest, as it stands today the hype surrounding these assistants far outpaces the reality.

Even the most basic request to Siri can result in copious swearing at your device, usually to the effect of how *!#@ worthless Siri can be. “Hey Siri, read me the happy hour menu for my favorite sushi spot” or “Hey Siri, play a good song about summertime” yields a dazzlingly incoherent response, usually an offer to search the web for something it has every right to “know” on its own. This tells us that we’ve come to demand more from our devices and expect our software to learn about us. Google and Amazon have implemented VPA personalities more helpfully, leveraging natural language processing, deep learning, and the cloud to make their assistants increasingly indispensable. The exciting possibility is that as billions of users send more data to the network, these assistants get smarter. Using AI and contextual learning, their responses will reflect ever more accurately what the user means now and will want in the future. Apple is lagging in this space but is working to improve this capability as well.

VPAs are also being used to provide customer service in interesting ways. Chatbots, or “engagement bots,” are less frustrating than they used to be, and companies are using them to handle basic customer-relationship tasks. As these bots get smarter, human interaction becomes less crucial to a satisfying customer experience.

We’ll discuss these “engagement bots” further in later posts. Perhaps there is a way to embed this functionality in your own mobile application as users come to expect more from their machines.

 

Red Routes

“Red route” is a term used in London for major roads marked with red lines, signalling to drivers that stopping and parking are prohibited. Essentially, VIP treatment for the busiest corridors. Why? These roads are major arterials requiring a steady flow of traffic; avoiding logjams is the priority.

If you’ve had any experience in project management, the term “critical path” is similar. How do you allow customers to reach their goals just as smoothly? It’s akin to Occam’s razor, or the path of least resistance: how do you create an experience that gets out of the way and lets users intuitively arrive at their destination? Defining these red routes based on user feedback and common behaviors can make your site a pleasure to use, minimizing frustration along the way.
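One data-driven way to find your red routes, sketched below under assumed analytics data: count the most frequent navigation paths in session logs and treat the top paths as the routes to keep friction-free. The page names and sessions are invented for illustration.

```python
from collections import Counter

# Invented session logs: each session is the ordered list of pages visited.
sessions = [
    ["home", "search", "product", "checkout"],
    ["home", "search", "product", "checkout"],
    ["home", "blog", "about"],
    ["home", "search", "product"],
]

# Count identical paths; the most common ones are candidate red routes.
route_counts = Counter(tuple(s) for s in sessions)
red_route, count = route_counts.most_common(1)[0]
print(" -> ".join(red_route), f"({count} sessions)")
```

In practice you would segment by task and user type rather than exact path matches, but the principle holds: find where most traffic flows, then clear that road.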

 

Digital Twins

A digital twin is a digital representation of a physical object in the material world. Prominent examples include smart home products, surveillance devices, architectural structures, and medical trackers: any physical object containing sensors that transmit information back to a software repository, where that data can be analyzed for future action. One of Busse Design’s product examples reflecting this idea is Arlo by Netgear, a smart home system whose devices offer night vision, streaming to the cloud, remote control, and the like. The data can be analyzed by the user to determine the best way to deploy the devices. Ideally the software can make changes or report on the concerns most relevant to the user (How often do my cameras detect motion? At what time of day? Is the motion triggered by human beings or something else? How do I best protect myself given this data?).
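As a minimal sketch of the digital-twin idea, the fragment below collects motion events from cameras into one repository and answers the kinds of questions listed above. The event data and field names are invented for illustration, not Arlo’s actual format.

```python
from collections import Counter
from datetime import datetime

# Invented motion events streamed from cameras into a central repository.
events = [
    {"camera": "front door", "time": datetime(2017, 5, 1, 23, 14), "kind": "person"},
    {"camera": "front door", "time": datetime(2017, 5, 2, 2, 3),   "kind": "animal"},
    {"camera": "driveway",   "time": datetime(2017, 5, 2, 23, 40), "kind": "person"},
]

# How often does each camera detect motion?
per_camera = Counter(e["camera"] for e in events)

# At what time of day, and triggered by what?
per_hour = Counter(e["time"].hour for e in events)
per_kind = Counter(e["kind"] for e in events)

print(per_camera.most_common())
print(per_hour.most_common())
print(per_kind.most_common())
```

From tallies like these, the twin’s software could suggest concrete actions, such as pointing a camera where motion clusters at night.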