Deploying chatbots? Consider the ethics

Everyone who has ever tried their hand at acquiring a loan online knows how easy it is, especially with help from a chatbot: 

Customer: Hello! I’d like some money please. 

Bank chatbot: Greetings! My name is Simon and I have money. 

Customer: Excellent, Simon. Please deposit as much as you’d like into my account.

Bank chatbot: Bleep bloop. Is there anything else I can help you with today? 

If only! But if your customers have tried this lending route, with your bank or another, they have probably dealt with artificial intelligence along the way. If your community bank is thinking of rolling out an online loan application program, or any more complex product offering, you will likely have to make choices about a digital helper and how it interacts with customers. If you already have one in place on your website, have you checked in with it lately? Has it recently developed a desire to become a board member? The bots, after all, are getting alarmingly ambitious.

You may have heard recently that an engineer at Google, Blake Lemoine, announced he believes AI developed by Google has achieved a level of sentience. From a Washington Post article on the subject: 

“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” said Lemoine.

Lemoine was fired for going public with his assertion, but the episode launched a round of discussion in the tech world about AI and the ethics of its use, including in the fintech sector. More than one software company is in the business of providing chatbots specifically for financial applications. For most banks today, this largely means the bot acts like a receptionist, routing inquiries to the proper human. The bot itself is not empowered to make any real decisions about money, but the largest financial institutions are already deploying bots as a sales force. (These banks also use software to determine creditworthiness.) There’s no reason these technologies won’t merge into something like a full-blown digital loan officer, if they haven’t already. 

That’s not going to mix very well with the community banking model, but I’d predict there will come a time when even the smallest bank’s website has some sort of chatbot built in. Whether or not that bot has any real power, the ethics of AI is something a bank must consider before deploying its own “Simon.” 

The first rule of chatbots, and it has been around for as long as I can remember, is to make it very clear that the customer is not dealing with a human. I recently chatted with a non-finance company (which I won’t name), and I honestly couldn’t tell whether a real person was on the other end. Whatever I was chatting with did not offer up a name. So I messed with it. I asked absurd questions, like “Would you rather fight one enormous duck or 100 tiny ones?” 

It responded like a human in a call center might very well respond to absurdity. “I’m sorry, I just don’t understand. Is there anything else I can help you with?”

It was polite, as if its manager might be observing the interaction. Then, to see what happens, I cursed at it. Repeatedly. At that point, it finally gave up and offered me an email address for support. 

I will never know whether a human was typing at the other end. I would know for certain if it had just told me so upfront, either way. Bankers must absolutely adopt this golden rule. The feeling of having to question the humanity of a customer service rep is called distrust, and I can’t think of a worse look for a bank. 

In this column, I often ask industry experts how community banks can square remote, digital life with their historically successful model of living in and loving the communities they directly serve. The responses uniformly agree that, as far as anyone can see, human interaction in banking will remain at a premium. But chatbots do have a place in community banking, as long as customers know exactly what they are dealing with. Transparency is the key to this expanding space. And if one day you find your chatbot measuring a C-suite office space, don’t be alarmed. If it asks for a vacation, however …