Are you uncomfortable with AI?

As I type, it’s not even noon, and technology has positively impacted my life. 

A machine perfectly brewed my coffee. Fresh, clean water flowed from the tap into my glass. My six-mile round-trip drive to the school was effortless, and I didn’t have to brush out my car or feed it any oats when I returned. 

You get the point. Every day, we rely on technological advances that we take for granted. 

And while that’s a good thing, some of our modern technological advances also make us uncomfortable. 

Because they are often used to manipulate us, not to serve the greater good of humanity.

At a TEDxSonomaCounty talk in 2019, Dr. Radhika Dirks used this example:

“About two years ago, Facebook divided all of its users into two groups. The one that they call the negative feed. If you were part of that group, all you were exposed to every day was extremely negative, down, depressing stories. . . The other group is constantly exposed to wonderful stories – positive feed, so they only see amazing vacations, kids rocking it in school, fancy cars, etc. 

This has profound psychological consequences. In the negative feed, we wonder, ‘What is wrong with the world?’  Whereas our friends (in the positive feed) wonder, ‘What is wrong with me? Why don’t happy things happen to me?’

This profound psychological manipulation has severe consequences on our individual psychological states. 

Now, we have regulations in place for this. Anytime anyone does massive social experimentation on humans, you have to go through a bunch of approvals. There are checks and balances in place, but Facebook was simply trying to maximize one variable–time spent on the page.

Social consequences, your emotional state, my emotional state, be damned.  This is what happens when what we are building no longer is connected to the questions that matter to us. 

It seems that when technology left the humanities, it left humanity.”

I love that last line, “When technology left the humanities, it left humanity.” 

As I reflect on her insights, I am left with three questions:

  1. How is my current use of technology impacting my friends, family, and coworkers? 
  2. When we consider adding new technology to our work environment, how will it affect my coworkers, their relationships with one another, and our customers?
  3. Should we eliminate certain technologies to raise our level of humanity?


I don’t have clear answers to these questions, but I do believe that we need to take full responsibility for our use of technology, no matter how addictive its makers try to make it. 

After all, leaders worth following take responsibility for themselves, their coworkers, and their mission. 


One More Thing

Dr. Radhika Dirks will be a keynote speaker at Leadercast: Human Intelligence. Radhika is an artificial intelligence expert. She has been named one of Forbes’ 30 Women in AI to Watch and one of Deloitte’s top women in the AI world. She co-founded XLabs, a moonshot factory for solving huge problems like the cure for cancer. 



Brian Rutherford

Brian Rutherford is Director of Content and Product Strategy for Leadercast. Brian has been telling stories professionally for twenty-five years. Stories that inspire people to see themselves and the world differently. Stories that challenge people to take meaningful action in the world.
