
Humans, Tech, and AI – A Historical Context

  • Writer: Niv Nissenson
  • Jun 15
  • 2 min read

Updated: Jul 22


Editorial

As much as I admire AI and technology, my true passion has always been history. In my first general history course at university, while we were studying France in the Early Middle Ages, I remember several students — myself included — expressing surprise at the behavior of many French nobles. Their actions didn't strike us as particularly loyal, Christian, or consistent with our preconceptions of people living in a highly religious and structured society.


Our professor chuckled at our naïveté and said: “Do you think humans weren’t human in the 8th century?” That comment stuck with me. Ever since, whenever someone asked me why I studied history, my answer was simple: history is really about learning humanity — how humans act under different circumstances, many of which echo throughout time. In short: while geography, culture, context, and technology all change, human nature is the constant.


Now, as we stand at the edge of an unprecedented technological revolution, I find myself applying that same historical lens to our current moment. Humans are never truly content with what they have. We have an inherent, often uncontrollable, drive to improve our fortunes. Whether explained by Maslow’s hierarchy of needs or other psychological frameworks, it’s this drive that pulled us out of caves and into civilizations.


Technological progress is not linear. Throughout history, there have been periods of stagnation — even regression. After the collapse of the Western Roman Empire in the 5th century, many technological capabilities were lost. Moreover, progress is not uniform across domains. Had the transistor never been invented, humanity might have made far greater advances in analog technologies, leading us down entirely different technological trajectories.


To illustrate this further: humans developed X-ray imaging more than 40 years before penicillin was first used as a medical treatment — and the discovery of penicillin itself was accidental. Had we not stumbled upon it, our approach to infection control might have taken a different route, producing breakthroughs we will never know.


Once humans mastered hunting, gathering, and basic shelter, the next great leap was the Agricultural Revolution over 10,000 years ago. It gave rise to settlements, states, and eventually civilizations. From there, the human story became one of solving problems and increasing efficiency — in food production, transportation, medicine, communication, and beyond. The domestication of wheat enabled stable food supplies and long-term storage. The invention of horseback riding allowed humans to travel farther and carry more, making one rider more efficient than several unmounted people.


The printing press, invented around 1440, didn't create knowledge — but it made its replication exponentially more efficient. It created incentives to generate content that could now reach vastly more people. This, in turn, sparked a dramatic rise in literacy and had far-reaching cultural and societal consequences.


AI, at first glance, is another tool of efficiency — but it’s also something more. In my view, it is the first human invention that can, in some sense, think. That distinction is significant. And it’s a subject I will explore further in the next editorial.


Global GDP over the last 1,000+ years, per the Maddison Project (source: Bond Meeker AI Trends report)
