The year is 1988. A burgeoning misspent youth unfolds. I’ve found my way to the local Working Men’s Club, and the sports I seem most likely destined to excel in are all indoors. Specifically, inside a pub. The Pub Olympics.
My face contorts as I trigger a neurological synapse chain aimed at deducting 37 from 501. Unfortunately, cobwebs impede the progress of the signals within my 14-year-old brain. I seem perfectly adept at chalking up the answer, but working out what it should be in the first place takes longer than it should. Significantly longer. Equally, working out the double and treble combinations that reduce a number to zero while remaining within the rules of darts takes me far longer than it takes my adult opponents. But why? Is it because they drink beer? Does beer make them more powerful somehow?
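For the uninitiated: a leg of 501 must finish on a double, and the puzzle is finding two or three darts that land exactly on zero. The mental arithmetic my teenage brain laboured over can be sketched as a brute-force search (a hypothetical illustration, not how any darts player actually thinks):

```python
# A sketch of the darts checkout problem: find a sequence of up to
# three darts that reduces the score to exactly zero, finishing on
# a double, per standard 501 rules.
from itertools import product

# Every scoreable dart: singles (including the 25 outer bull),
# doubles (D25 is the 50-point bullseye), and trebles.
SINGLES = list(range(1, 21)) + [25]
DARTS = [("S", n, n) for n in SINGLES]
DARTS += [("D", n, 2 * n) for n in SINGLES]
DARTS += [("T", n, 3 * n) for n in range(1, 21)]

def checkout(score, max_darts=3):
    """Return a dart sequence summing to `score` and ending on a
    double, or None if no such finish exists."""
    for n in range(1, max_darts + 1):
        for combo in product(DARTS, repeat=n):
            if sum(d[2] for d in combo) == score and combo[-1][0] == "D":
                return [f"{mark}{value}" for mark, value, _ in combo]
    return None

print(checkout(37))    # → ['S1', 'D18']: single 1, then double 18
print(checkout(170))   # → ['T20', 'T20', 'D25']: the maximum finish
print(checkout(169))   # → None: 169 is a "bogey number", impossible
```

The search returns the first valid combination it encounters rather than a tactically sensible one; a real player also weighs which misses leave a recoverable score.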
“It’s because all you young-uns use calculators in schools nowadays. Never in my day!” they retort.
To be honest, there seems to be a point here. The tool that enables more complicated calculations, and the exploration of more advanced mathematical methods, provides advantages in one area while seeming to reduce ability in another. Namely, the mental dexterity of manual arithmetic.
Remember The Karate Kid? Painting the fence and waxing the cars built up the “muscle memory” for Daniel-san’s blocking techniques.
You see it played out behind the tills too. The older generation stands perplexed at the delay as the young cashier figures out the correct change. The brain’s plasticity turns its attention to new neurological tasks at the expense of the old, and the undernourished areas atrophy as a result.
This is generational cognitive evolution at work, and it’s determined by our environment and the tools that we use.
We now know, to some degree, how the brain operates, thanks to neuroscience (aided by the advent of fMRI scanning). Stroke victims can recover abilities by learning new tasks that forge new pathways and avoid those that have suffered damage. For example, learning an instrument can aid the articulation of speech, as both functions involve neural channels in and around Broca’s area (associated with speech control). So we are aware of the brain’s plasticity: how it can be developed, and equally how it can deteriorate and take its associated functions with it.
Over millennia of technological developments our cognitive processes are enhanced for the benefit of future generations. Here are a few notable historical examples:
Prior to the Arabic translation movement championed by the Arab philosopher al-Kindi, Europeans were limited to the deeply flawed arithmetic available through Roman numerals. But, from the early centuries CE onward, Indian mathematics had a perfectly workable place-value structure with zero, similar to what is in place today.
This subsequent paradigm shift in mathematical method (reaching Europe around the 10th Century CE) paved the way for accurate geometry and navigation, for instance. It changed the way our brains interacted with certain problems, and the outcomes proved profoundly positive for mankind in several fields.
The abacus, once invented and then mastered, allowed experts to picture numerical calculations mentally in a different and more constructive way, enabling faster calculation. Because this process demands visual representation, new areas of the brain are recruited for the task, and so the human race develops its cognitive methodology.
Maps are constructed from the knowledge of multiple cartographers, allowing users to orienteer or navigate to where they need to be without possessing the knowledge of the cartographer(s) who produced the map; they need only develop map-reading skills. Mankind thus becomes more adept at exploration, building upon the combined intelligence of its forebears.
These are all complementary cognitive artifacts that aid human flourishing and progression.
Going back to the start of this blog: arguably the calculator is complementary, but also competitive. It opens up new possibilities, but perhaps at the expense of others.
But what about future technology? Is it mainly good for us, or is it mainly bad for us? As technology seeps into every aspect of our lives, will we become lazy and exacerbate the situation? Will this make us into drone like followers of fashion? Will we become techno-zombies playing video games, and having algorithms determine the books we read, films we watch, music we listen to etc.? All this based on what someone like us has read, watched, or listened to! Will we just passively sacrifice our freedoms for the sake of convenience? Are we getting utopia mixed up with dystopia?
(Actually, that all sounds a bit like marriage!!)
Let’s step back a little. I drive an automatic car, and worry that I’ll forget how to drive a manual one. To combat this I occasionally drive the wife’s car, but keep stalling it because I forget to put the clutch down when braking. Should I worry, or let it go? (I mean the concern, not the clutch!)
When cars eventually drive themselves, will we even be able to manually intervene? Will we lose that skill completely? Forget how to drive? Become unqualified to drive? Eventually, will we be prohibited from driving for the sake of public safety? And if so, is there actually a problem here?
The case I am making here is not about future shock per se; it’s not even based on an aversion to change. Equally, I am not adopting some Luddite view of technology, worrying about it replacing specific functions (trades, for instance) with more efficient and productive methods. Technological progress for our overall betterment should never be impeded.
Rather, the point is: how do we harness the positive whilst addressing the negative? How do we embrace the complementary cognitive artifacts, and compensate somehow for the competitive ones?
In an age of typing and even voice recognition we still (rightly) teach our children handwriting skills. Will this soon change? It doesn’t seem intuitively right, does it? There’s a reluctance there: a sense that we may be throwing the baby out with the bathwater. Losing something else important..?
Very recent studies on Alzheimer’s disease suggest that learning something new (e.g. a foreign language) at an older age can in some way preserve neural functioning, potentially combating the disease or slowing its onset in some cases. These are early days of research (at a correlation, not causation, stage), but could a similar tactic protect us all more generally from any negative impacts of this onslaught of technology?
Should we all pay attention to the processes that we are replacing (perhaps inadvertently), and adopt new strategies to preserve our human cognitive abilities? Is this actually a more immediate threat worthy of concern now, even more so than our fears of the Technological Singularity? The two fears may well become interrelated if our powers to react and resist are destined to be “dumbed down” over time.
Should we keep one eye at least on what we are replacing and make sure that our degrees of freedom remain intact? Perhaps go outside, sit in a field somewhere and think about it..
Wax on. Wax off…