Neuron-Inspired Solution for Adopting Virtual Assistants


In my previous post, Contextual Banking in Digital Lifestyle, I discussed how enterprises need to reimagine their services, and the experiences that enable them, to engage digital-native customers who are always connected. I followed up with two such communication channels: Amazon Echo and a Facebook Messenger chatbot. These are but two representative channels in the space of AI-driven virtual assistants. Beyond these, there are numerous chatbots, connected cars, smart-home and mobile devices that offer similar services, all catering to the digital lifestyle of customers.

Enterprises beginning to embrace these communication mediums might choose a couple of the popular ones, such as those from Amazon, Facebook, Microsoft or Google. But it's important to note that this is an emerging field where it is not yet clear who will offer the best solution, with new entrants springing up with innovative and differentiated offerings. While I don't advise enterprises to wait until the leading solution becomes obvious, this uncertainty does require paying attention to how we architect the solution, to ensure our approach is adaptable and extensible.

A notable trait of the current AI-driven virtual assistants is their capability to perform NLP (Natural Language Processing) on either textual conversations, as in a chat, or verbal interactions, as with Alexa or Siri.

The core component in any NLP design is to maintain a list of utterances and build a mapping of these utterances to actions. If you approach each of these devices separately, you will end up proliferating and duplicating the logic to manage the utterances, conversations, and action mappings across various device handlers. This issue will only be exacerbated when you consider the upcoming skills beyond NLP.
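To make the duplication concrete, here is a minimal sketch of the utterance-to-action mapping at the heart of an NLP handler. All utterances and action names are hypothetical; the point is that this table and its lookup logic should live in one place, not be copied into every device handler.

```python
# Minimal sketch of an utterance-to-action mapping. The utterances and
# action names below are illustrative assumptions, not a real catalog.
UTTERANCE_MAP = {
    "what is my balance": "check_balance",
    "show my balance": "check_balance",
    "transfer money": "start_transfer",
}

def map_utterance(utterance: str) -> str:
    """Normalize an utterance and map it to an action command."""
    key = utterance.strip().lower().rstrip("?.!")
    return UTTERANCE_MAP.get(key, "fallback_help")
```

If each channel handler keeps its own copy of a table like this, every new utterance has to be added in several places, and they inevitably drift apart.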

In the future, haptic, augmented and virtual reality will also need to be considered. The point is that the digital lifestyle has moved past the older, human-driven conversations to AI-driven conversations that have the capacity to process touch, text, voice, gestures and visual messages. If you think about it, enterprises are slowly coming to resemble a living organism with multiple sensory organs. If that is the case, then why not look at how our sensory system has evolved and take a page from its design for our solution?

Neuron Structure

"While there are as many as 10,000 specific types of neurons in the human brain, generally speaking, there are three kinds of neurons: motor neurons (for conveying motor information), sensory neurons (for conveying sensory information), and interneurons (which convey information between different types of neurons)."

Proposed Solution

Mimicking the neural system, we can create three types of components, or modules, modeled after the three main types of neurons. Just as with neurons, each module in our architecture is responsible for one part of the overall request/response processing. It's similar to the classical maxim in software design: "do one thing, do it well". The modules pass the request/response flow among themselves just as neurons do through synapses.

[caption id="attachment_320" align="aligncenter" width="672"] A generic, extensible architecture to adopt multiple AI-assisted, sensory devices[/caption]
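The three-module flow can be sketched in a few lines. This is a deliberately simplified illustration, assuming hypothetical class and method names; each module does one thing and hands off to the next, like neurons firing across synapses.

```python
# Sketch of the neuron-inspired pipeline. All names are illustrative
# assumptions; each module owns one stage of request/response handling.

class SensoryModule:
    """Channel-specific I/O (e.g. a chat or voice handler)."""
    def receive(self, raw_input: str) -> str:
        return raw_input.strip()  # syntax only; semantics come later

class ProcessModule:
    """Shared intelligence: interprets utterances into commands."""
    def interpret(self, utterance: str) -> str:
        return "check_balance" if "balance" in utterance.lower() else "help"

class ActionModule:
    """Orchestrates enterprise services for a given command."""
    def execute(self, command: str) -> str:
        return f"executed:{command}"

def handle(raw_input: str) -> str:
    """Pass the request along the chain, one module at a time."""
    utterance = SensoryModule().receive(raw_input)
    command = ProcessModule().interpret(utterance)
    return ActionModule().execute(command)
```

Adding a new channel then means adding one more sensory module, while the process and action modules stay untouched.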

Sensory Modules:

These handle the input and output interactions on each channel. The focus is on end-user communication, more on syntax than semantics, for which they pass the request on to the process modules.

Some devices cover part of the processing themselves. For instance, Echo comes with Alexa's built-in NLP to convert voice to text to utterances, whereas Facebook Messenger is yet to come with its own built-in NLP service. Where possible, bypass the built-in processing and instead leverage the process modules for this function.
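A sensory module for each channel can normalize these differences into one common payload. The adapters below are a sketch with simplified, assumed payload shapes (not the real Alexa or Messenger request formats): one channel arrives pre-parsed, the other as raw text for the process modules to interpret.

```python
# Sketch of two sensory adapters normalizing different channels into a
# common utterance payload. The incoming event shapes are simplified
# assumptions, not the actual Alexa/Messenger request formats.

def from_alexa(event: dict) -> dict:
    # Alexa has already run NLP; we only carry its result forward.
    return {"channel": "alexa", "utterance": event["request"]["intent"]}

def from_messenger(event: dict) -> dict:
    # Messenger delivers raw text; the process modules will run NLP on it.
    return {"channel": "messenger", "utterance": event["message"]["text"]}
```

Everything downstream of the sensory modules then sees one payload shape, regardless of which device originated the request.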

Process Modules:

The intelligence mainly exists in the process modules. The NLP or image processors, and any processors required for other types of content, all belong here. The repositories with rules, mappings, and any other transient or persistent data are part of these modules. Finally, the process modules convert the intentions to commands.
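The intention-to-command step can be as simple as a rules repository lookup. A minimal sketch, with hypothetical intent and command names, and parameters (slots) passed through for the action modules:

```python
# Sketch of a process-module rules repository mapping recognized intents
# to commands. Intent and command names are illustrative assumptions.

RULES = {
    "BalanceIntent": "check_balance",
    "TransferIntent": "start_transfer",
}

def to_command(intent, slots=None):
    """Convert a recognized intention into an actionable command."""
    return {"command": RULES.get(intent, "fallback_help"),
            "params": slots or {}}
```

In practice this repository would be persistent and shared, so every channel benefits from the same rules without duplication.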

Action Modules:

Based on the commands identified by the process modules, the action modules handle the service orchestration to integrate with core services exposed by the enterprise. These service endpoints or APIs could use any protocol, such as REST APIs or web services. The actual mechanics of enterprise system integration are encapsulated in these modules and not visible to the other modules.

For development and testing purposes, the action modules might support service virtualization by integrating with, or building, their own simulated services.
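One way to sketch that: the action module depends only on a service interface, so a simulated implementation can stand in for the real enterprise endpoint during development. Class names, commands and the canned response are all hypothetical.

```python
# Sketch of an action module that can swap a simulated service for the
# real enterprise endpoint. All names and values are illustrative.

class AccountService:
    """Interface; a real integration would call the enterprise API here."""
    def check_balance(self) -> dict:
        raise NotImplementedError("wire up the real service endpoint")

class SimulatedAccountService(AccountService):
    """Virtualized service returning canned responses for tests."""
    def check_balance(self) -> dict:
        return {"balance": 1234.56, "currency": "USD"}

class ActionModule:
    def __init__(self, service: AccountService):
        self.service = service  # integration details hidden from other modules

    def execute(self, command: str) -> dict:
        if command == "check_balance":
            return self.service.check_balance()
        return {"error": f"unknown command: {command}"}
```

Because the other modules only ever see the action module, swapping the simulated service for the real integration requires no changes anywhere else in the pipeline.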

Mahesh Alampalli