Dennis Mortensen, CEO of x.ai, on the Present and Future of Artificial Intelligence

In writing What to Do When Machines Do Everything we met with lots of different people at the forefront of developing and deploying machine intelligence. One of the most interesting was Dennis Mortensen, the founder and CEO of x.ai, a company developing an artificial-intelligence-driven personal assistant (called Amy) for scheduling meetings. Space prohibited us from using much of the long discussion we had with Dennis, but here is a condensed version of our conversation. x.ai is a company to watch, and Dennis – and Amy – are folks to get to know.

Ben Pring
How did you come into the artificial intelligence world? What was your background?

Dennis Mortensen
My background is in both computer science and entrepreneurship, which has given me a very intimate relationship with data over the last 20 years. Any engineered agent, or AI, is built upon a given data set that you’ve assembled. Most people don’t understand or appreciate this; it’s the root of everything that will come out of this next paradigm in software.

The last decade has been one of making nifty little apps. But the real paradigm shift is trading helpful apps for certain verticals where we can hand over the job in full. Take what we do at x.ai, for example. It’s not that I don’t know how to set up a meeting, or that it’s hard. But what if I didn’t have to do it at all?

We’ll see things change from being assisted to being done in full. Doing jobs in full almost excludes traditional software as we know it. Take the self-driving car. If you tell that car to go from Wall Street to 42nd Street, you need to know with 100 percent certainty: “Are you now doing this or am I?”

Ben Pring
How close to that paradigm shift do you think we are?

Dennis Mortensen
For x.ai, we are very close to the point of having it solved for this one very narrow domain of scheduling meetings.

Ben Pring
Have you ever run into objections from people who say, “I hate meetings. You’re going to make meetings easier. You’re going to make my day-to-day life worse because I have to be in more meetings, not fewer”?

Dennis Mortensen
We certainly hear that. Here’s my defense: Ask one of the most senior guys you know with a human personal assistant, “The day you hired him or her, did you get more or fewer meetings?” He probably ended up with fewer meetings because he now had a defender, a gatekeeper.

So, if we do a job well here, why wouldn’t we create an agent that does exactly that?

Ben Pring
Could it also be seen as a democratization of luxury?

Dennis Mortensen
That’s exactly how we advertise the product, which is democratizing the idea of the personal assistant. It’s no longer a luxury for the few; it’s for everybody.

There are many things that used to be a luxury, right? Twenty-five years ago, the cell phone was a luxury for only the most aggressive corporate leaders. Now any kid older than seven can have an iPhone. We can now democratize personal assistants for knowledge workers.

Ben Pring
Do you imagine your meeting-scheduling software (Amy) will take on other tasks like travel, or will your software just interface with other agents developed for those tasks?

Dennis Mortensen
We have focused on setting up meetings. No more, no less. But we would like to have the information so we can serve you in the best way possible. So, if you ask Amy to set up a meeting in Miami, I don’t want you to come back to Amy and say, “Amy, I arrived at 9:00 am for a 10:00 am meeting. Please do a better job.” For Amy to do a better job, she would need to say, “I need access to your travel agent.” You should be able to grant her access to your travel “agent,” and the bots can coordinate both meeting and travel schedules.

If we think of software as we’ve done in the past, everything is, by the very definition of being different software, incompatible. No two pieces of software are written in the same way. But I think the future is one not of traditional software and apps, but one of agents that don’t need to be told, “Go to this street, step left, four steps forward.” Instead, you tell them in natural language, “I want you to do this.” Then Amy can communicate with other bots in natural language, as well.

Ben Pring
It’s funny you mention bots communicating in natural language. I’m sure you saw the new Atlas video from Boston Dynamics, where the guy pushes it over? I showed that to my wife and her knee-jerk reaction was, “That horrible man speaks so nastily to the robot.”

Dennis Mortensen
I think that video was brilliant for many reasons, but first and foremost because of what you’ve just described: it evokes some sort of emotion in almost everybody. We see it at x.ai every day. Not as dramatic as what you’ve seen in the video, but because we picked natural language as the way we communicate with our clients and guests, people naturally speak to Amy very politely. “Amy, would you be so kind as to set something up between me and Ben?” Or, “Amy, so sorry. Can we push this to early next week, please?” Sometimes, five days later, users figure out, “Oh, this was a machine,” and they have that ‘aha’ moment.

Ben Pring
You’re saying that we innately retain our humanity, even when we’re dealing with things we know not to be human?

Dennis Mortensen
Yes. For many reasons, we tried to build empathy into the system so it can participate at our level. So, if we reschedule a meeting this morning, and then an hour later have to reschedule it again for another day, there’s a certain point in that conversation when the whole thing takes an unfortunate turn: the point when you start to think I’m just being rude.

Amy needs to know this is not cool. She can’t just be robotic to the extent that a meeting is a meeting is a meeting and we can do this 10 times. She needs to understand that right now, things are a little bit delicate. She needs to make sure to communicate with Ben in a way that makes it clear: “I’m so sorry, but we have to move this to early next week.”

Ben Pring
One phrase we thought about in writing What to Do When Machines Do Everything is what we call the intimacy paradox: that, in a way, the software could act more humanly than people do. Wouldn’t it be an incredible gestalt moment to go from seeing the world mediated through the intimacy of a person, to something that’s equally or more intimate through a virtual software agent?

Dennis Mortensen
I agree. I think one of the very first decisions that anybody in this new paradigm must make is whether or not to humanize your technology. This is one of those very black-and-white decisions; you cannot choose something in the middle. Microsoft, for example, decided to humanize Cortana; that is part of its agent strategy. Google decided not to; that’s part of its strategy. You can’t be in the middle.

Ben Pring
When you talk to your family, friends, civilians, people in your travels, how do you talk about this fear of the robots? This fear of their impact on jobs and the reality of displacement and substitution as opposed to the promise of new roles?

Dennis Mortensen
I’m not sure anybody really has the answer to that, but I am extremely optimistic about the future because I’m very confident that, overall, people enjoy their job, company, pay, and location, yet they do things every day that they dislike. I think a lot of the automation will impact those dislikes.

To me, that seems like a happier place than the one typically described, where we should fear automation because of job displacement, 20 million people out of work, and a deepening of the inequality that is already the root of all sorts of evil.

Think about the traditional job functions in software that we all know and recognize. Many companies put together a team of product people; front-end, back-end, and middleware engineers; UI and UX designers; and information architecture experts – all to create a single app. There’s a whole set of well-defined positions there.

But in the case of x.ai and Amy, who owns that position? It’s not a UI guy. There aren’t any pixels to push around. It’s not a UX guy, because there’s no choice between a radio button and a checkbox. Sure, there’s information architecture, but there isn’t a flow from one screen to the next, so it’s not quite your traditional role. So, who designs the agent? In this example, we’ve created the role of an AI Interaction Designer, whose job it is to create the dialogue and persona of Amy. We’ve been positively forced to create a whole new set of jobs. This, we think, is the future: sure, some jobs go away, many are changed, and lots are created. Amy is a helper, not a destroyer.