The Decay Of Information: Actions For A Knowledgeable Future

In “The Half-Life of Facts: Why Everything We Know Has an Expiration Date”, Samuel Arbesman suggests that facts decay over time until they are no longer true, or at least no longer complete.

Decay of information. Photo by Darya Sannikova

Information, according to Arbesman, has a predictable half-life: the time it takes for half of it to be replaced or disproved. Over time, one body of information replaces another. As we gain more information and a better understanding of the world, we also discover more. Sometimes new findings contradict what we thought we knew, sometimes they reveal nuances in old knowledge, and sometimes they open up a whole domain we did not know existed.

This concept of decay is observed in all sorts of areas where the quantity or strength of something decreases over time. Radioactive material has a half-life, as do drugs in the body, marketing campaigns, biological processes and many other phenomena.

The impact of this accelerating decay of information is highly significant. Imagine you had studied engineering about a hundred years ago. It would have taken about 35 years for half of that knowledge to be superseded; long enough to make it to retirement unscathed. By 1960, it took a mere 10 years to disprove half of the information. Today, half of what we learn is no longer relevant within 18 months. Let that sink in for a moment. Half of what you learn in your first year of university or on the job will be outdated by graduation day or by the time you are promoted.
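The arithmetic behind these claims is ordinary exponential decay: after t years, the fraction of knowledge still current is (1/2)^(t/T), where T is the half-life. A minimal sketch, using the illustrative half-life figures from the paragraph above (not measured data):

```python
def fraction_still_current(years: float, half_life_years: float) -> float:
    """Fraction of a body of information still valid after `years`,
    assuming exponential decay with the given half-life."""
    return 0.5 ** (years / half_life_years)

# Engineering circa 1920: half-life ~35 years.
print(round(fraction_still_current(35, 35), 2))   # 0.5 -> half gone only at retirement
# Circa 1960: half-life ~10 years.
print(round(fraction_still_current(4, 10), 2))    # ~0.76 left after a 4-year degree
# Today: half-life ~1.5 years (18 months).
print(round(fraction_still_current(4, 1.5), 2))   # ~0.16 left after a 4-year degree
```

The last line is the sobering one: under an 18-month half-life, only about a sixth of a four-year curriculum would still be current at graduation.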

“Research by Philip Davis shows that while most forms of media have a half-life measured in days or even hours, 97 percent of academic papers have a half-life longer than a year. Engineering papers have a slightly shorter half-life than other fields of research, with double the average (6 percent) having a half-life of under a year. Health and medical publications have the shortest overall half-life: two to three years. Physics, mathematics, and humanities publications have the longest half-lives: two to four years.”

If we continue on the path of only learning new information, very soon, if not already, we will be unable to keep up with the speed of new facts. Throw into this mix the advent of artificial intelligence, machine learning and deep learning, and it is unlikely that Homo sapiens will win the race of learning at speed against the formidable machine. Mental breakdown, restlessness, insomnia, drug abuse, addictions and the inability to focus are symptoms of trying to cope with an overload of information as we grapple with the complexity of today’s life.

What then? Is all lost for humankind? That depends.

According to the World Economic Forum, analytical thinking, critical thinking and creativity are top skills for today’s workplace. One lesson of the Covid-19 pandemic is the importance of communication, collaboration and co-creation: when we do these well, we can overcome momentous obstacles. At the same time, the pandemic has highlighted how we need to adapt and upskill all these social capacities in a virtual setting, and relearn some of them in a physical environment.

Despite the reluctance some of us felt about using remote working software, technology isn’t the problem. How and where we work requires us to drop old habits and learn new ones. Often we know that we need to change, and yet knowing it is not always enough to start the change.

Neuroscience is constantly progressing, discovering new facts about how we learn; however, accepting new knowledge that questions our old beliefs is hard. For example, did you know that memory improves if we drink coffee after we learn, not before?

We cannot avert the decay of information but we can learn smarter, faster.

Original posting, DutchCham Magazine, 2022, Issue 6

Oscar Venhuis

“I’m a Dutch-Korean artist who works and lives on Lamma Island in Hong Kong.”

https://www.oscarvenhuis.com