“The oldest and strongest emotion of mankind is fear, and the oldest and strongest kind of fear is fear of the unknown” – H.P. Lovecraft (1927).
The thing about fearing the unknown is that it’s a guessing game. You don’t know what scares you or why. The fear of science, or of scientific advances that can affect generations to come, is no exception. You might identify a dozen reasons to doubt science and half a dozen worst-case scenarios. Still, for most of you, these “facts” are merely speculations, a guessing game.
And that’s OK. It’s more than OK; it’s as natural as the need to pee each time you take a dump. Fear and anxiety as central emotions help us survive.
So, if fear is fundamental for survival, how can we advance if we’re always scared of the unknown? How do we develop if we constantly doubt new technologies, job environments, or projects?
Like most things in life, new unknowns bring both good and evil, and we need to balance our fears and curiosities to function in our world. The same applies to science, including gene editing, artificial intelligence (AI), and robots. A balanced view of these novelties allows us to develop them without losing our skepticism.
When you’re confronted with new stuff – and that can be everyday events like a presentation or interview – I invite you to rationalize your fears and follow through. A step in this process is to ask yourself, “What am I scared of, and why?”
Before we look into that, let’s look at two examples of science-related fears you might have experienced in some way or another.
You love science and scientific advances
Yes, we know you love science and scientific advances. We all hail and praise discoveries that have the potential to improve our lives. Our love–hate relationship with CRISPR and robots is a prime example of that.
CRISPR-based gene editing became so accessible, cheap, and easy to use that almost anyone could visualize its future benefits: saving lives, treating diseases, and improving the environment. CRISPR experts worldwide proclaimed their excitement about the tool’s potential through the media megaphones, and we all shared the good news.
Similarly, I remember our joy when we heard, for the first time, a real-life story about robots that could wait tables – waiter robots. They could take an order (say, a hotdog), return to the bar, and bring you the right dish. [Let me tell you that, living in Amsterdam, such a story brings tears to my eyes. Bringing me the right order, you say…? That’s amazing stuff.]
We were stunned by both advances, even though we knew they were still in their infancy. We know, for example, that CRISPR has several limitations in human therapy. We also know that the robot waiter would mess up the order if the customer shifted slightly in their seat. But that’s the exciting part. It can only get better from here!
…and then you fear science
As scientific advances like CRISPR and robots made headlines, we shared the positive prospects, followed expert researchers in TV studios, and smiled while eating our premade schnitzel in front of the screen. But once the positive hype expired, and once (it seemed) all the experts had given their two cents, we suddenly found ourselves at a turning point: CRISPR and robots became the villains.
Our storytelling brains started creating the darkest futuristic CRISPR scenes – the apocalyptic ones. The ones where CRISPR gene editing creates super soldiers and designer babies who are faster, stronger, and more intelligent than we are.
We realized that these human-like robot freaks have the potential to develop, get a mind of their own, and eventually outperform us. Easy for robots; they’re tireless. While we lazy humans need to stop whatever we’re doing to eat and sleep, a robot will stay on the field practicing any task without breaks.
You suddenly realize these freaks will outperform us eventually, whether in football, intelligence, or… war…
What once was our little self-created pet started resembling Cujo with every magazine we opened, every panel discussion we watched, and every post we liked.
We feared our ability to play god, rebuild the Tower of Babel, or recreate Icarus’ wings. But the darkest predictions took the fear one step further. These featured some sort of warfare against you, your family, friends, and society; think bioterrorism and robot domination.
Perhaps you’ve already read that CRISPR is a potential weapon of mass destruction – if not, check it out here. Maybe you’ve heard prominent scientists and tech nerds warning us about the dangerous development of future AI. It feels like we’re entering a dark era.
The reason you fear new science
The question remains, why do you develop such a love–hate relationship with new advances in science? Why are they so exciting yet, at the same time, so scary? As you can imagine, the answers to these questions are rather complicated and depend on the individual and the group. Still, at a high level, let’s look into some of the potential reasons you (or we) fear new scientific advances and the unknown.
Evolution and the individual
Humans have survived and thrived partly because of our fear of the unknown, whether primal, like the fear of falling, or acquired, like the fear of spiders. These fears are all there for a reason: survival.
Throughout evolution, humans and other animals have developed a sense of caution when confronted with unknowns. We’ve survived because we’ve developed these “right amounts” of fear that make us skeptically approach unknowns and identify them as possible threats. In fact, this default vigilance mode is related to the so-called negativity bias, meaning that we tend to focus on negative information rather than positive.
That’s the default first approach to unknowns, but we quickly acclimate to the unknown once we’ve figured it doesn’t harm us. That is, we reduce our levels of fear once we learn about the unknown, once it becomes less unknown. [Yes, that’s where we’re heading.]
And according to animal studies, fear levels may also depend on individual genetic traits. In these experiments, rats showed lower fear responses to unknowns during early development if their mothers had high levels of stress regulation.
So, the fear of unknowns, such as scientific advances, is a normal response that has allowed humans and other animals to survive through the generations. You don’t fear gene editing and AI robots per se – they intrigue you. Instead, you fear their development into something uncontrollable that might wipe us out (not the robot waiters, of course) – not necessarily harming you personally, but your species. The unknown, dark future of new things scares you.
The hierarchical (selfish) reason
Then, humans became superior to everything. Our self-imposed superiority in nature might partly explain our fears of unknown science and technology, especially when they can collapse our hierarchical structure or advantage.
What do I mean by that? Well, our superior self-image must have intensified once we began domesticating animals and became farmers. “Sit, Fido!! Good boy!” It developed further with slavery. By now, humans are no longer part of nature. Instead, we play nature like a game of chess.
Now, what happens to that superiority and status if the animals, slaves, or other groups we oppress start outperforming us?
We create robots and laugh when they behave like humans, picking up hotdogs and whatnot. We develop medical or scientific tools and praise their effectiveness. Still, our amusement has limitations, and we cannot accept merely being part of nature, among robots or super babies (who will have grown older by then). No, we need to rule it, and a loss of status would ruin humankind.
The cultural reason
The Japanese approach robots differently than “Westerners”. They don’t fear the rise of robots; they embrace it.
According to some Asian cultural roots, humans are merely a part of nature and not superior creatures. According to The Science of Storytelling by Will Storr, Asian cultures tend to be less individualistic, apparent in their story structures that are less self-focused and more group-focused or other-focused.
I believe this view releases the self from all responsibilities and, importantly, our perceived superiority to some extent. It’s pretty relaxing.
Whenever Hollywood depicts robots or genetically modified creatures, the shit hits the fan almost without exception (WALL-E being the rare exception). One superhero must save us from the terrors that would otherwise lead us to the apocalypse.
Like it or not, exposure to these stories and cultures affects your view of new technology. Stories are compelling and can change our opinions about foreign things, even though we know they are fictional. Somehow, they make the unknown more known, and for us in the “West”, the message about gene editing and robots seems to bring bad news, which is not the case in Japan.
How can you reduce your fears of new science (and unknowns in life)?
You will probably always remain cautious of new things. It’s inevitable and, to some extent, critical for your survival. But you can start approaching your fears of scientific advances and technology differently to reduce your anxiety levels.
Make the unknown more known
Keep being skeptical; it’s great that we keep a critical eye on old and new advances. But that doesn’t prevent you from being skeptical about your skepticism.
This is especially true when your starting point is based on no or limited knowledge. We recently published a meme on Instagram and Facebook about the so-called Dunning–Kruger effect, a cognitive bias whereby limited knowledge leads you to overestimate your abilities. The meme was related to COVID-19 but also applies to new and old science, including CRISPR, vaccines, the environment, and robots.
The more knowledge you get, the better you can estimate how new technology may or may not be used in the future. Read peer-reviewed papers and well-respected websites, watch presentations, or attend conferences.
The point is that sound knowledge is king. Awareness filters out all the disturbing noise coming from fear-mongers and conspiracy theorists.
The same can be said about your everyday life. The better prepared you are for an interview, competition, or presentation, the less energy you’ll spend trying to predict the future (usually a dark one). You’ll be cooler and more assertive. Sure, just as with science, there’s a risk that things go wrong, but at least you’ll know it then; it’s not guesswork anymore.
Know that predictions you hear might be biased
Take a step back and remind yourself that reality is relative and that your fears of an unknown future are products of your surroundings. Just because your environment has an oppressive, aggressive, or conquering mindset doesn’t mean it directly translates to others.
For example, “The development of full artificial intelligence could spell the end of the human race.” – Stephen Hawking.
Nothing but respect for the late Stephen Hawking and his great work. Still, his belief that robots could end human life is based mainly on how we behave and perceive the world (emphasis on we). Our societies start wars, pollute the environment, and drive species extinct, but does that necessarily apply to any type of superhuman being? Are they predisposed to act the same way we do?
Note that we do the same with alien theories. Ever heard someone claiming, “Once they arrive, they’ll be so advanced that they’ll dominate us”?
You could just as easily think that if robots, genetically modified babies, and aliens are so advanced, maybe they’ve “switched off” their primitive brains. They would’ve figured out how to live harmoniously and sustainably with nature and other beings.
Our fears of the unknown are ultimately merely reflections of ourselves. They’re projections of our flaws and what we need to improve, a mirror or lesson about ourselves. The dangers we see in the future reveal our societies’ weaknesses. So, instead of freezing and expecting the worst, we can use our fears of biology, robots, and aliens to fix our surroundings. Maybe that will bring us positive vibes.
Next time we’ll make it shorter, I think. Maybe about how to think like a scientist. Or how to dress like one? Anyway, until then, stay safe!
This Post Has 5 Comments
I was not aware that there is something like “fear of science”, thanks for sharing.
keep posting such a nice article.
Well, fear of unknown things is quite common, I guess.
Thanks for your comment!
Sure, people fear all kinds of scientific stuff, including gene editing (GMOs), vaccines, and robots, usually when they don’t know enough about the topic. Once we overcome the information gap, we often start seeing things differently – more rationally. I’m planning a post about GMOs as well (a lot of hysteria there).
Anyway, glad you enjoyed it! Hope to see you back!
“…we quickly acclimatize to the unknown once we’ve figured that it doesn’t harm us.”
True. But what if we see evidence that it IS harming us? This would be the case with a lot of things people are suspicious of. Getting more information is good, but we need to be aware of the biases in the information, as you mention in your other way to reduce fear of the unknown. So it really becomes an examination of 1) the science and 2) various authors’ motivations. What is the reason the information is being put forth, what is the funding source, what is the larger picture – these kinds of questions.
Thanks for the comment! I think that you’re absolutely correct and that your point is worth repeating. I also mention in several parts of the text that we should never lose our skepticism and never accept harmful science. For example,
“Keep being skeptical, for sure, it’s a great thing that we keep a critical eye on old and new advances. They need to benefit us without causing harm.” It’s actually crucial that we question everything.
It’s indeed a hard topic to wrap into one blog post, in part, because there are different types of unknowns. That’s for a future post.
In this case, robots and gene editing are not harmful by themselves; they are scientific tools. A common belief regarding CRISPR, for example, is that we’ll be making mutant super-baby soldiers in the near future. However, if people knew how unrealistic that currently is, they’d probably be calmer. That’s not to say the tools cannot be abused, because they can, for sure. And that’s more than a good reason to keep a critical eye on science and its applications.
Thanks again for your comment. Nice!