Chatbots using AWS & SAS Viya

An Open Solution To Quickly Unlock Innovation

The key driver for innovation in today’s diverse technology landscape is organizational ambidexterity (check out this book for more on that): the ability to explore new avenues while continuing to exploit current strengths. No matter what business you are in, what you sell, or who you serve, a sense of “openness” around strategizing and problem solving is the best first step to ensure we not only maintain our business but expand as new opportunities present themselves. That mindset creates room to explore, to change course and pick up new strategies as quickly as possible. This is especially true when we talk about technology and the investments we make in it. For businesses, nothing is more tempting than a technology that brings choice, ease of use, and accessibility together, allowing them to change and adapt faster than ever before. That is at the heart of what “open technology” really offers: it answers the business need to maximize for success, not just now but in the future, while truly minimizing operational risk. As a result, platforms that offer this combo of choice + ease of use + accessibility emerge as big winners. As a great bonus, such platforms typically open the floodgates of their capabilities to their customers, who can build their own solutions to meet their business needs on top of the vendors’ “core” offerings.

In the rest of this blog, we’ll see, as an example, how we can couple products and services from AWS & SAS (each a leader in its “core” space: cloud infrastructure platform and data science platform, respectively) to build our own little chatbot, Alice. Why? Because chatbots are the new IN thing and everyone wants one! (see below)

Deep Learning, Cognitive Computing & Conversational User Interfaces?!! OK, all hail the mighty chatbots!

Also, as a side note, every time someone says some scary stuff about these things — show them this. Please!

“People worry that computers will get too smart and take over the world, but the real problem is that they’re too stupid and they’ve already taken over the world.” — Pedro Domingos, Master Algorithm

Anyway, back to our stuff. The AWS services we will be using are Lex, Lambda & S3. As for the brain behind the operation, that demand is going to be met by SAS Viya. The bot we are going to build will help answer questions around sales based on customers’ historical transactions we’ve collected, and will help democratize the value of the machine learning (ML) model we built using this data.

This data is sitting in our Viya cluster on AWS, and it’s a small demo dataset: 100K rows × 29 columns.

Table info from our Viya Cluster on our Sample Demo data

We’ve also used an ML model to score this data, classifying each purchase visit as either a casual or a focused visit. In this example, our business cares about focused shoppers and may potentially use this information later for some follow-up marketing activities.

Result subset for transactions scored as focused visits
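To make the later steps concrete, here is a tiny sketch of how the chatbot’s back end might aggregate rows like the ones above. The column names (`state`, `sales`, `predicted_visit_type`) and the values are purely illustrative, not the real 29-column schema; in practice this aggregation happens on Viya, not in local Python.

```python
# Hypothetical subset of the scored transaction table; the schema and
# values below are illustrative only.
scored_rows = [
    {"state": "CA", "sales": 120.50, "predicted_visit_type": "focused"},
    {"state": "CA", "sales": 35.00,  "predicted_visit_type": "casual"},
    {"state": "NY", "sales": 89.99,  "predicted_visit_type": "focused"},
]

def focused_sales_by_state(rows):
    """Total sales from visits the ML model scored as 'focused', per state."""
    totals = {}
    for row in rows:
        if row["predicted_visit_type"] == "focused":
            totals[row["state"]] = totals.get(row["state"], 0.0) + row["sales"]
    return totals

print(focused_sales_by_state(scored_rows))
```

This is exactly the kind of rolled-up answer we’ll want the bot to hand back for a question like “sales in ca”.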

Ok! Now that we know what the data is, let’s see how we can use it to build a chatbot. Clearly, having a dedicated data science platform to manage, govern, and centralize assets is very useful for businesses: their data scientists can build out models and later expose them via REST APIs with just a few clicks. All the versioning, re-training, model replacement, etc. can be easily handled from within this platform. We don’t have to worry about any of that for the chatbot or this blog. We only care about the data and the API being available for us to tap into, so we can build, collaborate, and innovate faster.

The following is the high level architectural blueprint for our chatbot — it uses the previously mentioned technologies to mashup and stitch together a nice user experience on Slack.

What makes the magic happen?

Before going any further, we should also mention that SAS Viya provides its own set of NLP capabilities in its text analytics suite and deep learning toolkit, but the whole point here is to demonstrate the interplay between all of these offerings so we can quickly go from nothing to a working application.

Also, I’ve already written a detailed post on setting up the last mile of the architecture shown above: making the Lambda ↔ Viya communication work seamlessly. Please head over here to see what it takes. That part of the setup is generic and can be reused for a slew of applications, so it’s great to know how!
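For flavor, here is a minimal sketch of the kind of authenticated REST call the Lambda code ends up making to Viya. The host, endpoint path, and payload below are placeholders I’ve made up; the linked post covers the real authentication flow and endpoints.

```python
import json
import urllib.request

# Placeholder host; swap in your own Viya cluster's address.
VIYA_HOST = "https://viya.example.com"

def build_viya_request(token, endpoint, payload):
    """Build an authenticated POST to a Viya REST endpoint.

    The endpoint path and payload shape depend on your setup;
    this only demonstrates the bearer-token pattern.
    """
    return urllib.request.Request(
        VIYA_HOST + endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer " + token,
            "Content-Type": "application/json",
        },
        method="POST",
    )

def call_viya(token, endpoint, payload):
    """Fire the request and return the parsed JSON response."""
    with urllib.request.urlopen(build_viya_request(token, endpoint, payload)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Keeping the request construction separate from the network call makes the Lambda code easy to unit test without touching the cluster.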

Awesome, so you have your application’s code sitting in Lambda, waiting to execute when invoked. But how does that happen? Good question! Let’s take a step back and walk through a sample setup for Lex. Lex helps developers build applications with highly engaging user experiences and lifelike conversational interactions. It shares its underlying core tech with Amazon Alexa for Natural Language Understanding (NLU) and Automatic Speech Recognition (ASR). That’s a nice intro, but basically what happens is that Lex resolves the spoken or written utterance into its core intent and relays the slots required to fulfill that intent over to Lambda. As an example, for our bot Alice, I’ve set up intents and slot types within Lex. I’ve also provided Lex with some sample utterances to help our bot understand language; these are examples of spoken or written utterances that invoke the bot.

Sample Utterances To Invoke an Intent. Colored portions are the slots that help fulfill the intent.

From here, improving the language model over time is handled by Lex.

At this point, when you test the bot, you will see that for a test utterance like “sales in ca”, Lex starts to automatically manage the dialog state like so –

Notice how the slots get filled out as the conversation evolves
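Once the dialog completes, Lex hands Lambda an event carrying the resolved intent and its slots. The sketch below assumes a Lex V1-style event for a hypothetical “GetSales” intent with a “state” slot; your bot’s intent and slot names will differ.

```python
# Illustrative Lex (V1-style) event after slot filling; field values are made up.
sample_event = {
    "messageVersion": "1.0",
    "invocationSource": "FulfillmentCodeHook",
    "currentIntent": {
        "name": "GetSales",
        "slots": {"state": "CA"},
    },
    "sessionAttributes": {},
}

# Lambda pulls the intent and slots straight off the event.
intent = sample_event["currentIntent"]["name"]
slots = sample_event["currentIntent"]["slots"]
print(intent, slots["state"])
```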

Once Lex receives all the slot values necessary to fulfill an intent, it invokes our Lambda function, which is listening for events from Lex. In our case, the incoming events are managed by our designated Lambda handler, from where we can call other functions in a cascading manner.

dispatch calls whatever it needs to on Viya
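Here is a minimal sketch of that handler-plus-dispatch pattern. The intent name “GetSales”, the slot names, and the canned reply are all made up for illustration; in the real bot, the intent handler would call the Viya REST API and format the scored results.

```python
def close(message):
    """Lex-formatted (V1-style) fulfillment response that ends the turn."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }

def get_sales(slots):
    """Hypothetical intent handler: this is where we'd call Viya with the
    slot values and summarize the scored transactions in the reply."""
    return close("Sales for {}: $42,000 (focused visits)".format(slots.get("state")))

def dispatch(event):
    """Route the incoming Lex event to the right intent handler."""
    intent = event["currentIntent"]["name"]
    if intent == "GetSales":
        return get_sales(event["currentIntent"]["slots"])
    raise ValueError("Unhandled intent: " + intent)

def lambda_handler(event, context):
    """Entry point our designated Lambda function exposes to Lex."""
    return dispatch(event)
```

The cascading structure means adding a new intent is just a new handler function plus one branch in `dispatch`.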

From this point on, Viya takes over. Does what it does best. Cranks out some cool shtuff (thanks, Urban Dictionary!). Assuming you’ve configured your Lambda function right, your response will look something like this:

Sample Response Tapping Into a data set scored using a ML model on Viya’s Analytic Engine

Nice! If you’ve gotten this far, all you want to do now is hook it up to a nice-looking front end. We’ll choose Slack, because I personally love it and a lot of people use it. This is where using a framework like Lex helps: it comes with out-of-the-box integrations for Slack, Facebook Messenger, Kik, etc. We still need to set up the bot user authorization and the postback from Lex to Slack, so Slack can talk to Lex.

Before that, we’ll have to set up our Slack app, pick up the secret keys, and use them in Lex to generate the Postback and OAuth URLs. These then need to be picked up and slapped into our Slack app so the bot can interact with Lex. The documentation on how to set this up is plentiful and fairly simple, so I am going to skip it. Head over here to learn more and follow the steps listed there.

Once you complete this step, we’re ready to use our chatbot! The cool thing about using Slack is that you can very easily use your phone’s voice recognition service (the little microphone) to recite voice commands and let your phone automatically convert the speech to text. All you need to do then is hit enter and boom! Alice does the magic again!

Why do chatbots like these matter? This is something we often think about, as it matters to any solution: the ROI. Here’s some quick ninja math, some food for thought. Suppose the chatbot can offload 1.5 hrs of daily effort for 3 data scientists, keeping them off grunt work that can easily be automated and letting them focus on high-value efforts. Assuming a fairly low nominal cost of $133/hr per data scientist, those 1.5 hrs come to roughly $200 per person per day. The potential upside could then look something like this: $200 per person * 3 people * 20 days per month * 3 (for 24-hr uptime instead of standard 8-hr availability) * 12 months a year = $432,000/yr. Now, this is oversimplifying things, yes, but the point stands tall despite all those approximations: the productivity gains for an organization are substantial. The exercise is really to help us appreciate the value this type of technology might provide to your organization.
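The same ninja math, spelled out as code so the assumptions are easy to tweak:

```python
# Back-of-the-envelope ROI estimate; every input here is an assumption.
hourly_rate = 133             # nominal cost per data scientist, $/hr
hours_saved_per_day = 1.5     # daily grunt work offloaded per person
people = 3
days_per_month = 20
availability_multiplier = 3   # 24-hr uptime vs. a standard 8-hr workday
months = 12

daily_saving_per_person = round(hourly_rate * hours_saved_per_day)  # ~$200
annual_upside = (daily_saving_per_person * people * days_per_month
                 * availability_multiplier * months)
print(annual_upside)  # 432000
```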

Hopefully you found this useful enough to help you get thinking about how chatbots can be made useful within your organization — especially when there are analytic workloads or complex decision making involved.

Connect with Sathish on LinkedIn

Source: Deep Learning on Medium