Chatbot Build and Demo

How I got started with chatbots

This was a project I did while at the Partnership for Drug-Free Kids. It culminated in a proof-of-concept chatbot and a video demonstration.

A little historical background

When I was in college, one of my computer science teachers said that all computers do the same thing: they take in some data, do some processing on it, and return a result. At first I didn’t buy it, but as I thought about it, I warmed up to the idea. To be sure, it was a very high-level statement. How can your game of Angry Birds and a program running a climate model for early Mars have anything to do with each other? Yet the computer in your phone and the computer at some NASA lab are doing the same thing. For Angry Birds, the input is the information gathered as you touch your screen; for the Mars climate simulation, it is most likely gigabytes of data drawn from spacecraft measurements, known climate models, and physics. While your tap on the screen produces an instant result, the climate modeling could take hours or more. But effectively, both are processing data and returning a result.

The same is true when you use a graphical user interface: data is taken in, processed according to the rules of the program and the processor, and the output is delivered back to you via the screen. This is the way we have thought of computers since the 1980s.

Since I grew up in the 1980s, I can remember when the word “computer” conjured up images of huge room-filling machines with tape drives and punch cards. But before that, science fiction treated us to computers you could simply talk to, in movies like 2001: A Space Odyssey and TV shows like Star Trek.

On to website design

One of the problems I encountered in web design, right from when I started in 1997, was that we had no idea what the user wanted to do. Since the sites I was working on were for electronics engineers, the domain of discourse was limited to engineering, but what were they there for? Did they want to browse the latest issue of the magazine? Did they need to solve a problem, and if so, in what area?

To address this, we (and everyone else) started to organize the information on the site in some predictable way. We would have menus with submenus in some kind of logical hierarchy. Presumably the site visitors would mentally organize the information in much the same way. If not, confusion would reign.

For electronics engineers, this was pretty straightforward. However, not every site deals with such orderly content. That was the situation when I arrived at the Partnership for Drug-Free Kids. I remember sitting with one of the editors and listening to her explain the different sections of the site to me. I asked what the menu item “Get Information” was for. She explained that it was where parents could get information.

That seemed a little vague to me, and I think everyone realized the weakness in naming a menu “Get Information”. At one point we experimented with an online chat module for the site, but it was a bit buggy.

The chat experiment got me thinking. The support staff, who were supposed to help parents dealing with a teen who was using drugs, were often answering questions about where to find things on the site. Having taken a class on artificial intelligence in college, I realized that if we could hook up an A.I. agent with chat, people could be guided to the correct parts of the site without having to call anyone or use search.

Learning that it’s already been done

When I started to dig into the problem, I quickly learned that it was already being worked on (so much for my great idea). As it turned out, at that very point in time companies were getting involved with this technology, most notably IBM with Watson, but there were many others.

I found a group in NYC that held meetups about chatbots and asked my boss about it. She had never heard of the technology, and I tried (and failed) to sell her on the idea.

Going Rogue

My verbal description didn’t sell the idea of researching chatbots, so I decided to do it on my own. My boss’s reluctance was understandable; we’ve been living with websites for 20 years now, and it’s hard to imagine that something better could be found.

At the beginning of this story I mentioned my college class where the teacher said all computers do the same thing. It’s the interface that is different. Early on it was punch cards, then magnetic tape, then keyboards and printer output, then screens, then graphical user interfaces, and now voice and text. The progression seems lost on people because the interfaces look so different. But essentially we are still supplying data, the computer is processing it, and it is returning a result.

To create my chatbot I learned to use IBM Watson Assistant, Node.js, Heroku, and Twilio. It took a lot of trial and error, and even though online tutorials were really helpful, no single one had the answer I needed.
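For anyone curious how those pieces fit together, here is a rough sketch, not my original code, of the kind of Node.js webhook that wires them up: Twilio posts each incoming text message to the app, the app forwards the text to Watson Assistant, and the assistant’s reply is texted back. The environment variables, the assistant ID, and the one-session-per-phone-number bookkeeping are placeholders you would adapt to your own setup.

// chatbot-webhook.js -- sketch: relay Twilio SMS messages to Watson Assistant
const express = require('express');
const { MessagingResponse } = require('twilio').twiml;
const AssistantV2 = require('ibm-watson/assistant/v2');
const { IamAuthenticator } = require('ibm-watson/auth');

const assistant = new AssistantV2({
  version: '2021-06-14',
  authenticator: new IamAuthenticator({ apikey: process.env.WATSON_APIKEY }),
  serviceUrl: process.env.WATSON_URL,
});
const ASSISTANT_ID = process.env.WATSON_ASSISTANT_ID;

// Keep one Watson session per phone number so each conversation has context.
const sessions = new Map();
async function getSessionId(phone) {
  if (!sessions.has(phone)) {
    const { result } = await assistant.createSession({ assistantId: ASSISTANT_ID });
    sessions.set(phone, result.session_id);
  }
  return sessions.get(phone);
}

const app = express();
app.use(express.urlencoded({ extended: false })); // Twilio posts form-encoded data

// Twilio calls this URL whenever a text message arrives.
app.post('/sms', async (req, res) => {
  const twiml = new MessagingResponse();
  try {
    const sessionId = await getSessionId(req.body.From);
    const { result } = await assistant.message({
      assistantId: ASSISTANT_ID,
      sessionId,
      input: { message_type: 'text', text: req.body.Body },
    });
    // Collect the assistant's text responses and send them back as one SMS.
    const reply = (result.output.generic || [])
      .filter((r) => r.response_type === 'text')
      .map((r) => r.text)
      .join('\n');
    twiml.message(reply || "Sorry, I didn't catch that.");
  } catch (err) {
    twiml.message('Something went wrong on my end. Please try again.');
  }
  res.type('text/xml').send(twiml.toString());
});

// Heroku supplies the port through the PORT environment variable.
app.listen(process.env.PORT || 3000);

Heroku’s role in this picture is simply to host the webhook at a public URL that Twilio can reach; the conversation logic itself lives in the Watson Assistant dialog you build in IBM Cloud.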

After building the chatbot, I recorded a few demos and included them in a larger demo pitching the idea of chat to the Partnership.

Below is a clip from that longer video on chatbots.