Machine Learning – the basics

Hey there!

So you're a programmer, wanting to master machine learning?

You're at the right place.

Let's get started.

So, what exactly is machine learning?

As programmers, we have been writing some pretty complex algorithms that make our computers look really smart. So how do we understand machine learning from a programmer's perspective?

Machine learning and programming achieve the same purpose, in that both make computers do useful work for us. Their approaches, however, are exact opposites of each other.

While programming is about setting the rules that produce a particular output from inputs, machine learning is about finding those rules from existing outputs and working inwards.

While programming is intrinsic logic directly added by a programmer, machine learning is extrinsic: existing output data massages a malleable function into shape, so that this function becomes the “new program” which can be put to use to find outputs for new, previously unseen inputs.

And what's deep learning?

It's very tempting to help a function (which works backwards from output data) with what we know about the domain of the problem.

For example, when predicting the prices of houses in a city, we would naturally make use of our knowledge that a house's location in the city, its area, etc. have a direct bearing on its price. These are called the features of our function, which has to find the importance of each feature in predicting the output by working backwards.

Machine learning is what we call this data-driven shaping of the solution function (or program) when the features, or the broader input variables, are set by us.
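
As a minimal sketch of this data-driven shaping (my own toy example, using scikit-learn and made-up numbers purely for illustration), here a pricing function is fitted from historical data instead of being hand-coded:

[python]
# toy illustration: let data shape the "program" instead of writing the rules
from sklearn.linear_model import LinearRegression

# historical data: [area in sq.ft, distance from city center in km] vs price
X = [[1000, 5], [1500, 3], [800, 10], [2000, 2]]
y = [300000, 500000, 180000, 700000]

# working backwards: the data decides how much each feature matters
model = LinearRegression().fit(X, y)

# the shaped function is the "new program"; use it on an unseen input
print(model.predict([[1200, 4]]))
[/python]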

When we have enough historical data (of inputs vs outputs), we have the luxury of letting the function decide for itself which inputs are actually features (i.e. have a say in the output). This is like a larger program that uses subroutines to filter the input signals so that only the relevant ones enter the main routine; only here, the main routine and the subroutines are all shaped by many iterations of this inward massaging by the historical data. Such a system, which does automated feature selection and predicts the output from the features so found, is known as a deep learning system.

Deep learning systems naturally have more stacks of variables layered between input and output, so that the earlier layers can filter or amplify the input signals as needed.
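
For a flavour of what those stacked layers look like in code, here is a minimal sketch using Keras (my choice of library for illustration; the layer sizes and input width are arbitrary):

[python]
# hypothetical sketch: variables stacked in layers between input and output
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(20,)),  # earlier layers filter/amplify the raw inputs
    tf.keras.layers.Dense(8, activation="relu"),                      # deeper layers combine the surviving signals
    tf.keras.layers.Dense(1)                                          # final output, e.g. a predicted price
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X, y, epochs=10)  # historical data shapes all the layers together
[/python]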

Fixing cqlsh Import Error

Problem:

Running cqlsh exits with this error:

[bash]
from cqlshlib import cql3handling, cqlhandling, pylexotron, sslhandling, copy
ImportError: No module named cqlshlib
[/bash]

Fix:

Set the right Python path:

[bash]
# locate the cqlshlib module
find /usr/lib/ -name cqlshlib
# assume you got the result as /usr/lib/python2.7/dist-packages/cqlshlib
# point PYTHONPATH at the directory containing cqlshlib
export PYTHONPATH=/usr/lib/python2.7/dist-packages/
[/bash]
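
To confirm the fix worked (assuming Python 2.7, as in the path above), check that the module is now importable and rerun cqlsh:

[bash]
# confirm cqlshlib is importable with the new PYTHONPATH
python -c "import cqlshlib; print(cqlshlib.__file__)"
cqlsh
[/bash]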

Planning conversational Automation – Here's what you should look for!

dhee । தீ । ধী । ధీ । धी । ಧೀ । ധീ । ઘી । ଢ଼ୀ

Are you planning to bring in conversational automation using AI to make your processes more cost-effective and consistent? There are certain things I would like to share with you from my few years of experience on-boarding customers to conversational AI platforms. These can be used as high-level checkpoints to save ourselves from getting into avoidable activity traps or creating bots that fail to attract customer interest. The goal, thus, is to gain a high ROI on the time and money you invest in adopting AI-supported processes.

Identifying the right platform or vendor

This one decision will determine how easy or difficult your overall journey of adopting a virtual conversational agent is going to be. Here are the questions that can help in choosing the right AI partner:

Do they own the NLP and dialogue automation stack?

NLP is the fundamental layer every conversational bot depends on. Having an in-house stack for these functions gives the vendor a high level of adaptability in supporting your use cases.

You might already be aware that a virtual agent can usually be trained for only about 60-70% of user queries and their variations at the time it is released to the public. As usage increases and the gaps in training are identified, the bot achieves higher and higher accuracy through a systematic retraining process employed by your vendor. Many times, such remediation will require a change to, or retraining of, the underlying NLP and NLU modules too. Unless your vendor has the capability to do that, you might have to live with a lot of gaps in your conversational automation.

Can they support the languages your customers would wish to converse in?

Conversational AI technology is language dependent. Almost every attempt to make it language independent using a translation layer above a “single language stack” has been shoddy when it comes to user interactions in the non-main language.

You should definitely go for a vendor who has NLP, NLU and dialogue-automation capabilities for each of the natural languages you wish to support for your customers.

How seamless or sticky is the on-boarding process?

Does your vendor need you to annotate your data before they can automate your process? Do they ask your engineers to learn their stack for creating intent automation?

While doing things yourself might be exciting when you begin (and might even work if yours is a small business with a single user intent), it can soon grow into a pain, especially as your use cases grow and your processes change.

It is better to work with a vendor who will understand the user-facing processes you want to automate, and from then on will do all the bot training and preparation without your active involvement. Also, designing a flexible dialog is an art, and it's always better that someone more experienced with the automation stack works on it, rather than you having to train some of your employees for the same.

How many channels of interaction can they support?

Although you might start off with conversational automation as a chatbot on your website or app, it's very likely you will quickly want the same automation to work with your customers on other channels, like Facebook Messenger, WhatsApp, VoIP (or telephone calls), smart IoT devices, etc.

The vendor you choose should be able to provide these value-added channels seamlessly as and when you are ready to absorb them into your business processes.

Do they have continuous learning employed for your bot?

Most providers do have a continuous learning process for their conversational bots. This is very important for the user experience of your bot to improve with time. Nonetheless, it's always better to make sure your prospective vendor has such a process, and even better, that they provide it free of cost (as part of the sales contract).

Do they provide statistics of bot usage?

It's important that you be able to see the bot usage, the number of human interventions, the intent-wise distribution, etc. to understand the impact the bot is making on your business. The vendor should ideally provide you with a detailed analytics portal where you can see the vitals and key performance indicators of your bot. Depending on your business, the KPIs may be total sales generated, total appointments booked, etc.

Can a human supervisor intervene when needed?

When we release an AI to serve your customers, it's always possible that a user will have a query the bot is not trained to answer. It's good, from a user-experience perspective, if the AI can at this point transfer the conversation to a human supervisor, who can answer this query (or a series of related queries) and then hand the conversation back to the AI.

Additionally, it's desirable that users be able to talk to a human supervisor whenever they demand so.

Can they support visual interaction?

If you want to engage your customers via a visual medium too, your vendor should additionally have an avatar for the bot, and a decent amount of machine-vision and OCR capability. The more realistic and lip-synced the avatar is, the better for a holistic user experience.

How robust are their infrastructure and live support?

These are general tech-maturity concerns like scalability with respect to concurrency and content, guaranteed uptime, support for live upgrades, etc.

A cumulative score based on all the checkpoints mentioned above will help reduce the risk of choosing an unsuitable AI vendor.

Now that we have discussed what to look for in a conversational AI vendor, here are some observations on how to reap the best returns from the bots we deploy.

Effective bots: Start early! Start small!

It's important that once you have decided to improve your processes using conversational AI, you release the bot to your users quickly, to get them accustomed to it early on, and introduce automations in small increments.

Let AI begin as a process catalyst

It works better if the AI starts off as an enabler or a catalyst that improves your existing processes, rather than taking over the whole process as such.

For example, assume you are in the insurance domain and are switching to an AI-based process flow for on-boarding new customers. This should ideally start off as a process that just collects the lead details and updates your sales platform. Initially, the leads generated by the bot should be taken up and on-boarded by a sales executive. This is better than the bot going all the way up to collecting some 50 to 100 information points from the end user and performing the on-boarding all by itself.

Such an approach helps your customers get accustomed to, and appreciate, the bot in short, useful interactions. Long processes can quickly dry up the users' interest, especially if they are using the bot for the first time, causing them to disengage and leave.

Quick wins first, to gain the user's trust in the AI

It's important that when you begin, your focus is on providing several quick wins to the end user via the bot. A quick successful interaction is one where the user needs to provide just 2 to 5 data points to the bot and the task gets done. A few such examples are finding the nearest branch, the status of a consignment, the NAV of a fund, the total value of a SIP, fixing an appointment with an adviser, etc.

More such short use cases will result in the AI being easily adopted and trusted by your end users, preparing them for the lengthier and higher-value processes that you should automate down the line.

Iterative releases of higher-value automations

Once the users are comfortable in their interactions with the bot, it's time to surprise them with higher-value automations, e.g. making a premium payment, or adding a nominee to their insurance plan. A new automation per fortnight or month is a nice pace for maintaining user interest and optimum engagement with all released automations.

Iterative releases of higher-value automations in this manner also give us enough time to reallocate the human resources freed from the mundane activities thus automated.

Is your AI learning?

After the bot owns all the automations you had planned for it, you still have to monitor its usage data to make sure things work as you envisioned. Particularly important is the number of human interventions/escalations that happen. Ideally this number should come down with time as the bot learns from the gaps, and the bot should not repeat previous misses after each of its learning cycles.

Conclusion

We have briefly discussed all the important aspects to consider while zeroing in on the right AI platform to automate your business.

These pointers will definitely help set the right expectations for those among us who are planning to adopt AI-based automations. Feel free to let me know in the comments if I have missed anything.

Also: if you work for an Indian company and wish to engage your customers using automations in Indian languages (including “Indian English”), feel free to inbox me. Dhee.AI, which is developed by my company DheeYantra, might fit just right for you.

Fix for – W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/xenial/InRelease Temporary failure resolving ‘archive.ubuntu.com’

Problem: apt-get update gives an error:

[bash]

W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/xenial/InRelease Temporary failure resolving ‘archive.ubuntu.com’

[/bash]

Reason: The name server is not configured correctly.

Solution: Use Google's nameserver as below:

[bash]

# point the system at Google's public DNS (this overwrites /etc/resolv.conf)
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf > /dev/null

[/bash]
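
To quickly confirm that name resolution works again (nslookup is my suggestion here; it comes from the dnsutils package and may need installing first):

[bash]
# verify the host name now resolves via the new nameserver
nslookup archive.ubuntu.com
[/bash]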


apt-get should work now.


View logs and debug a docker service


To debug why a container within a Docker stack didn't start, these commands will help:

[bash]

# list all services in the stack
docker service ls
# note the id of your problematic service

# list all tasks/containers of this service
docker service ps <problematic-service-id-from-above>
# note the id of the most recent container

# inspect that container for details on why it failed
docker inspect <container-id-from-above-step>

[/bash]
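
If you also want to see the actual logs of the service (supported on Docker 17.05 and later), `docker service logs` helps:

[bash]
# show the last lines of the service's logs
docker service logs --tail 100 <problematic-service-id-from-above>
[/bash]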

How to install Docker Compose on Ubuntu 18.04 LTS

Here's how to install docker-compose on Ubuntu 18.04 LTS:

Open a terminal and run these commands:

[bash]

sudo curl -L "https://github.com/docker/compose/releases/download/1.24.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose

sudo chmod +x /usr/local/bin/docker-compose

[/bash]

Now verify the installation:

[bash]

docker-compose --version

[/bash]

This should print the version of Docker Compose that was installed.

GCloud gives warning: `docker-credential-gcloud` not in system PATH

Problem:

On an Ubuntu VM, Docker pulls from a GCloud private repository give the error:

[bash]

Error response from daemon: unauthorized: You don't have the needed permissions to perform this operation, and you may have invalid credentials. To authenticate your request, follow the steps in: https://cloud.google.com/container-registry/docs/advanced-authentication

[/bash]

After this, to reconfigure auth, the following command is issued:

[bash]

gcloud auth configure-docker

[/bash]

This gives the error:

[bash]
WARNING: `docker-credential-gcloud` not in system PATH.
gcloud's Docker credential helper can be configured but it will not work until this is corrected.
gcloud credential helpers already registered correctly.
[/bash]

Solution:

Issue these commands in a terminal:

[bash]

sudo su

# download the docker-credential-gcr helper and place it on the PATH
curl -fsSL "https://github.com/GoogleCloudPlatform/docker-credential-gcr/releases/download/v1.5.0/docker-credential-gcr_linux_amd64-1.5.0.tar.gz" | tar xz --to-stdout ./docker-credential-gcr > /usr/bin/docker-credential-gcr && chmod +x /usr/bin/docker-credential-gcr

# register the helper as Docker's credential helper
docker-credential-gcr configure-docker

# retry the pull
docker pull <your-image>

[/bash]

The docker pulls should work now.


Spring Boot and Hibernate second-level cache giving NoCacheRegionFactoryAvailableException

Problem:

Setting the properties below in application.properties,

[bash]
hibernate.cache.use_second_level_cache=true
hibernate.cache.hazelcast.use_native_client=true
hibernate.cache.region.factory_class=com.hazelcast.hibernate.HazelcastCacheRegionFactory
[/bash]

gives the error:

[bash]

org.hibernate.cache.NoCacheRegionFactoryAvailableException: Second-level cache is used in the
application, but property hibernate.cache.region.factory_class is not given, please either
disable second level cache or set correct region factory class name to property
hibernate.cache.region.factory_class (and make sure the second level cache provider,
hibernate-infinispan, for example, is available in the classpath).

[/bash]

Solution:

Move the above settings from application.properties to a hibernate.properties file on the classpath. Hibernate reads that file directly, whereas Spring Boot does not forward bare hibernate.* keys from application.properties to Hibernate.
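
Alternatively (an equivalent I believe works, since Spring Boot forwards any key under the spring.jpa.properties. prefix to the JPA provider), you can keep the settings in application.properties with that prefix:

[bash]
spring.jpa.properties.hibernate.cache.use_second_level_cache=true
spring.jpa.properties.hibernate.cache.hazelcast.use_native_client=true
spring.jpa.properties.hibernate.cache.region.factory_class=com.hazelcast.hibernate.HazelcastCacheRegionFactory
[/bash]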

🙂