Elaine Lee leads product design for AI-enabled assistants on eBay’s New Product Development team. We interviewed her about the challenges of designing an AI-powered interface.
My first question is going to be straightforward: How did you get into conversational design?
I actually fell into this space. I’m currently at eBay, but a few years ago I was at a startup where we built an office-assistant bot in Slack. It was part human and part bot.
What was the functionality of this office assistant Slackbot?
The bot would book, buy, and order anything legal for teams on Slack. We started with just us humans behind the bot, but we quickly learned that wasn’t scalable. We needed to automate it. So we started looking at patterns. What were people asking for? What information did we need from them to meet their requests? We wanted to provide a consistent experience, always asking the right questions.
What does a designer do in this space? What are you focused on?
This space is pretty new. Both people using bots and people building bots are trying to figure it out.
I ask myself things like: How is this type of interaction useful for people? What problems are we solving? Why should people use the bot instead of what they’re used to today? And is that what people want, to converse with a bot? And how?
This isn’t siloed to a designer’s role, but I’m learning how best to solve people’s problems with a conversational bot or assistant.
Personally, I’d like to figure out how more designers can help make AI smarter. What information does our AI need to become smarter? What kind of experiences do we need to design to get that data?
When I first started working in this space, I didn’t really understand how AI worked, and I didn’t know how the role of a designer fit in. I was making things while questioning whether what I was doing counted as design. In the end, it didn’t matter.
I had the opportunity to use whatever skills I had to build out a concept that turned into a product. And I’m doing more of that at eBay now.
Is the definition of design different when you design conversations?
It’s still design and UX. But, the way you think about the problem space will be different. The output may be more focused on natural language than visuals, but visual design is just as important.
We’re working on a shopping assistant bot that you can talk to on your Google Home and in Messenger. You ask it for something and it’ll do the research to find you the best deal on products from eBay. I’m constantly thinking about how we can get the information our bot needs to become smarter without asking the user to do too much work. What are the easy, quick questions the bot can ask that will yield relevant results?
May I ask more about your background? You mentioned you started in advertising…
Yes. I started in advertising, but my undergrad was in Psychology and Social Behavior. I thought I was going to become a doctor. Then…I knew I wasn’t going to become a doctor. I decided to go into advertising, which has a lot to do with psychology. So I got my Master’s in advertising, then went to New York for work.
I started as an interactive art director, and art directors work very closely with copywriters. When I left the agency and ad world to go in-house as a product designer, I had to write my own copy. The majority of it was UX writing. When I had to write more creatively and with personality, I would think: What would my past copywriters say? I’ve learned a lot about communicating with words from working with writers.
What is something you’ve learned while working on the AI-assistant team at eBay?
People speak to bots like they are typing in a search box. This is specific to chat interfaces. We text our friends in sentences and long phrases, and now we are mixing a bot into an interface where you’re used to speaking fluidly to people. The hope is that this chatting behavior carries over to the bot, but that’s not necessarily the case. If you try to talk to a bot in natural language and it doesn’t understand you, you may learn to speak to it like a machine, using just a few keywords. We want to move away from the search-box paradigm. The value in the assistant is understanding intent, which comes from more expressive, full thoughts.
I have a feeling that you are going to say — But…
But *laughs* of course, there’s still a lot for the bot to learn to be as good as a human: to remember what you said the other day and apply it to what you’re saying right now, to understand the nuances of your word choices.
The tone is central.
Absolutely. When I was working at the startup, we were tweaking the tone every day. If it’s too human, the end user will expect it to be as smart as a human, and they will keep pushing the system to see how smart it is. Someone once asked us to tell them a thousand jokes. We didn’t have time for that. So we asked ourselves: What if we make it sound more like a bot?
How did that play out?
It changed the user behavior. Setting people’s expectations is crucial to engagement: an expectation of what the bot can understand and deliver on.
Take eBay ShopBot, what I’m working on, for example. It’s an e-commerce bot. Shopping is what we’re great at. It’s not a general bot where we want people to ask it anything, especially things unrelated to shopping. Whatever the user value is that you’ve defined, focus on that.
You’ve mentioned some of them. What are some other questions you ask yourself while designing the conversation?
Some of them are:
- How do we build a trusting relationship between the bot and people?
- How do we get closer to anticipating the user’s next move?
- What does the AI need to know to deliver on what people are asking for?
- How do we model the way we humans communicate with each other in the bot? This drives our tone of voice and how we talk.
What advice would you give to designers who are moving from graphical to conversational design?
Use what you have, not just in design. Be a product thinker. Be familiar with how different chat and voice platforms work technically. Read the documentation, even if you’re not building on their platform.
By the way, I don’t think we’re ever going to move to a world without visuals. People are stimulated by visuals. That’s not going away.
How often do you redesign in the chatbot realm?
I keep changing my mind about what the experience can be, based on what I see in the data and usage. But regardless, I want to change things all the time, in everything I create. It’s hard not to think: Oh, I have a better idea! Even if it’s a bad idea. Who knows.
The space is so new. You are helping create it by taking risks and pushing. That can be discouraging at times, but if you believe in it, you’ll keep iterating.
I have to ask, given your vantage point at eBay: Will people ever purchase consistently from a chatbot or a voice assistant?
First, a caveat: voice assistants have different hurdles than chatbots when it comes to purchases, if you’re talking about using only voice for the whole shopping journey.
That said, I feel this space is a lot like mobile when it began. People did their research on mobile, then checked out on a desktop. Now it’s common to convert on mobile. I expect a similar trajectory for this space once people begin to trust it more.
If you enjoyed this interview, be sure to check out Elaine’s Medium for some of her great original content. Also, head over to Messenger and try the eBay ShopBot yourself. For you Google Home owners, simply start a conversation with the bot by saying “Hey Google, let me talk eBay.”
Elaine is the first in Botsociety’s Design The Future interview series. Stay tuned each Monday as we continue to explore the evolving world of conversational design. Have someone special in mind you’d like us to interview? Leave your comments below!