
Early in my career—back in the stone age before computers and smartphones—I worked in environments where memos were a primary means of communication. Sure, my colleagues and I could talk face-to-face, but the culture of the time was to memorialize much of our interaction in writing.

Believe it or not, there were some advantages in what now seems such an archaic practice. Unlike texts and emails—where one tap of the “send” button can fill you with instant regret—the old-fashioned memo provided a cushion of safety, a chance to reconsider.

My wife, who in those days was my go-to source of emotional intelligence, taught me an important lesson. After drafting a message that had any chance at all of being misinterpreted, she said, I should put it in my desk drawer and let it percolate overnight. Then, in the morning, if I honestly thought the message accurately conveyed my views with an appropriate dose of the friendly factor, I could consider sending it.

More than once I either redrafted or entirely discarded a message I had written earlier.

That simple habit, along with other disciplined practices, helped me appreciate the importance of emotional intelligence.

There’s no doubt that emotional intelligence is critical to the success of human interactions. Interestingly, machines are now starting to do some of the work for us.

Rana el Kaliouby is at the forefront of this fascinating science. She’s a pioneer in artificial emotional intelligence (Emotion AI) as well as CEO and co-founder of Affectiva, the acclaimed AI startup spun off from the MIT Media Lab.

Rana grew up in Cairo where she was raised—as she says—to be a “nice Egyptian girl.” She defied the cultural expectations of her upbringing by earning a Ph.D. at Cambridge. Then she broke barriers in the field of technology—which is overwhelmingly white and male—as a young Muslim woman who landed a TED talk and a spot on Fortune’s 40 Under 40 list.

Her book is Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity By Bringing Emotional Intelligence to Technology.

Rodger Dean Duncan: You say that without emotions we can’t make smart decisions. That’s a surprising view coming from someone regarded as an expert in artificial intelligence (AI). How did you come to reconcile the two?

Rana el Kaliouby: I grew up in a household of technologists. My dad would always bring home the newest gadgets for us to play with. I was always less interested in the gadget itself, and more intrigued by the social dynamics of technology—how it brought us together as a family.

As I eventually went on to study computer science, my interest was once again in the human side of technology—how it changes our dynamics and interactions. In my quest to build intelligent machines, I studied a lot of literature around intelligence. One thing that stood out was the research highlighting the importance of emotional intelligence as a predictor of success for people. I asked myself, “Why should computers be any different? If we want tech to be effective, shouldn’t it have emotional intelligence too?” That’s how I arrived at the conclusion that AI needs not only IQ, but EQ.

Duncan: Some people are still suspicious—or even fearful—of AI. While championing your own pioneering work in this area, how do you empathize with the doubters?


el Kaliouby: I’ve staked my career on the idea that AI will change society for the better. But at the end of the day, I’m a consumer like anyone else. I have two kids, and I think about things from that lens. What kind of future do I want for them, and how can AI help? On the flip side, where should we draw the line? That conviction to ensure the outcome is a positive one—for my kids’ sake, and others’—is the reason I do what I do.

To that point, one reason I wrote Girl Decoded was to help consumers be part of the conversation about AI and to make the subject accessible to people outside of the tech world. As with fair trade products, consumers need a better understanding of AI—the benefits and the potential risks—to have a voice in shaping how it will manifest in our future.

Duncan: You’ve developed technology that helps clients optimize brand content and media spend by measuring consumers’ emotional responses to videos, ads, and TV shows. In layman’s terms, how does that technology work?

el Kaliouby: Media analytics is a key market for Affectiva’s technology. Emotions influence everything we do, and insight from Emotion AI gives brands a sense of how they’re connecting with target audiences on an emotionally resonant level. The way it works is simple. Research panelists need only internet connectivity and a standard web camera to participate. Once they opt in to using the Emotion AI software, they can turn on their camera before being shown an ad or video content. As they watch, Emotion AI measures their moment-by-moment facial expressions of emotion and aggregates the results in a dashboard. We’ve seen that these insights tie directly to outcomes such as brand recall, sales lift, purchase intent and virality. So it’s a really effective tool. As such, 25% of the Fortune Global 500 companies use our tech to understand consumer engagement and optimize content and media spend accordingly.

Duncan: How do you address fears about potential misuse of personal information (such as people’s faces) that AI systems can track and synthesize?

el Kaliouby: We recognize that people’s emotions are extremely personal. So from day one, we’ve had strong core values around the ethical development and deployment of AI. A big piece of that is our commitment to data privacy, and requiring clear, upfront opt-in and consent from anyone who will interact with the technology. We do not license our technology for use cases like security, surveillance or law enforcement where there is no opportunity for clear opt-in, and people’s privacy is at risk.

Furthermore, our software does not identify people who use it. The technology analyzes the human face but does not identify the individual. There are also some applications, such as automotive, where data is processed locally on the fly, rather than requiring it to be sent to the cloud.

While these are important practices for us as a company, this is a dialogue that needs to happen industry-wide. We believe in the need for thoughtful regulation on the development and deployment of AI, and we are part of industry organizations that advance AI ethics, including the Partnership on AI, the World Economic Forum’s Forum of Young Global Leaders, and the Technology and Public Purpose (TAPP) Project, which is based at the Belfer Center at the Harvard Kennedy School.

Duncan: We now live in an emotion-blind cyber world where so much “communication” is done via texting and email. How can your work bring emotional intelligence to our machines?

el Kaliouby: When we communicate face-to-face, we share so much more than just the words we say. We emote through our facial expressions, vocal intonations, gestures and other subtle cues. But the digital world was not designed to take that into consideration. All of the nuances and richness of our emotions are lost in cyberspace, and that leaves a huge gap when it comes to empathy online.

I believe we need to redesign technology to give it artificial emotional intelligence (Emotion AI). It may seem counterintuitive to turn to something artificial to keep us human, but the potential is powerful. Imagine if, before you hit send on a tweet or text, your phone could say, “You seem really angry right now. Are you sure you want to send this, or should you come back to it later?” There are many applications for this technology, such as conversational interfaces, mental health, automotive, education, media analytics and more. And to me, the most exciting part is that this will not only transform the way people interact with their devices but also, by extension, enable more meaningful and empathetic connections between people.

Duncan: What do you envision as the role of AI in conditions like autism and Parkinson’s?

el Kaliouby: When I was pursuing my post-doc at the MIT Media Lab, the very first application that we explored for Emotion AI was for autism. We built smart glasses to help kids on the autism spectrum learn important social, emotional and cognitive skills. Today, we work with a company, Brain Power, that’s continuing this work, and I’m incredibly proud to partner with them.

Parkinson’s is another area with a lot of potential. Diagnosing Parkinson’s has its limitations, as it’s often reliant on in-clinic medical examination and the presence of symptoms that do not occur until late stages of the disease. However, several years ago, Affectiva was approached by a scientist and entrepreneur, Erin Smith—who was 15 at the time, believe it or not. She believed she could use Emotion AI to build a diagnostic tool that would detect early signs of Parkinson’s, based on changes in people’s facial expressions and other external manifestations of the disease. Today, Erin’s technology—powered by Emotion AI—is in clinical trials. I’m so excited to see how it develops.

Duncan: Major social events—like the Covid-19 pandemic and racial unrest—tend to underscore the role of leadership. What have recent events taught us about the importance of leading with empathy in times of crisis?

el Kaliouby: Traditionally, people have evaluated leaders based on hard skills and qualifications. Softer skills like empathy and vulnerability were not only de-emphasized, they were often seen as signs of weakness or inaction, especially for women in leadership positions.

But we’ve reached a moment of reckoning, where empathy is crucial to get companies, communities and individuals through immense change and challenge. I hope this shifts the perception that empathetic leaders are weak and proves that you can be empathetic while still being assertive and highly effective.

Duncan: Leadership commentators often talk about the need for “vulnerability.” What kinds of behaviors demonstrate vulnerability, and what effect does that have on a leader’s effectiveness?

el Kaliouby: I believe wholeheartedly that when you’re vulnerable in sharing your thoughts or experiences, it encourages others to do the same. That unlocks a true connection between people and can unite individuals behind a shared mission. That’s what being a leader is all about.

Leaders can demonstrate vulnerability in small actions they do each day. It sounds cliché, but I bring my whole self to work—talking about my kids, my personal challenges—to show others they can, too. I don’t shy away from talking about business challenges either. Transparency makes our team more united, determined and invested in our collective success. It may not always be glamorous, but leaders simply can’t create that sense of connection with their walls up.

Duncan: In your own life journey, what experiences have you had that have helped you along the way?

el Kaliouby: Throughout my life, I’ve been lucky to have mentors whose openness and willingness to connect with me shaped not only the trajectory of my career, but the trajectory of my life. I would not be where I am today (both in the US, and an entrepreneur) were it not for people who took a chance on me and encouraged me to pursue my wildest ambitions. It all starts with human connection, and I feel a conviction to pay that forward.