So many people forget that the human brain behaves like a muscle, and like any muscle it deteriorates when it isn't exercised. Some argue that the brain is the most important "muscle" in the body, yet there is a trend of it being exercised less and less. Over the last 10 years I have watched a steady drift towards people thinking less, and while some of that I believe comes down to choice, other parts I believe come down to our surroundings.
As someone who has loved computer games since I was a child, ranging from Nebulus (which I never managed to complete) to the vast landscapes of the Fallout series, it's safe to say they have always played a large part in my life. What started as puzzles and pattern learning became a love of virtual exploration (and all of the tasks/challenges that come with it). I am forever reminded (by those who don't appreciate my love for Fallout) that, as Jumanji put it so elegantly, "A game for those who seek to find, a way to leave their world behind". In fairness I do see their point, as spending time in a virtual world when I could be out exploring the physical one does seem like an escape from reality.
In today's world, however, the gaming landscape has changed somewhat, and diverged overall. Exploratory games (including FPS, RPG, and MMO titles) still exist and have expanded, while simple games (especially those on mobile devices) have also expanded. The latter have settled into a common pattern of simple, repetitive tasks for virtual rewards/praise. So many of these games have no requirement for deep thought or observation, and even the requirement for skill is debatable at best. These games are more about feeling positive and winning a reward without real mental effort. As with everything, doing this for extended periods seems counter-productive, and an easy way for the human brain to waste away. I recall seeing an image of a "gaming farm" in Asia, where workers spend 12 hours (or more) per day tapping on multiple mobile phones (usually close to 100) whenever a new task needs to be started, earning virtual currency to be sold for real currency. Aside from the horrible nature of this work, the task at hand becomes purely reactive, requiring no skill or complex thought, just a tap in response to a flash on a screen. In no way can this be good for the worker, or the worker's mind.
One of my first thoughts around this (many years ago) was whether the quality of music (or, more specifically, the lack of it) played a part in cognitive decline. As a child/teenager I was always fascinated by how many instruments I could hear within a song when listening to a CD (through a good pair of headphones or speakers). When MP3 downloads became a thing, their initial quality was poor (to match the limited download bandwidth available over dial-up internet connections). I recall how tinny the audio would sound, and how a song would be riddled with compression artefacts that made for an overall unpleasant experience. With lower-quality audio came lower-quality listening, as there was less to pick out within the audio stream. The end result was that while the audio was easier to listen to, it was less involved for the brain, and therefore less exercise for the all-important muscle.
Some could (and likely would) argue that this frees a person to spend more time thinking about other tasks, but in a lot of cases I would wager that music is used as a form of relief, and multitasking isn't really the goal. With streaming music services finally bringing quality up to the level it should be (lossless tiers are finally appearing), it will be interesting to see how many people take this up and make the most of it. While streaming media is finally catching up, audio devices sadly are not. Cheap headphones and speakers are exactly that, with an industry view that most consumers can't tell the difference, due in part to most people never having heard better in the first place.
Similar to music, video (be it television, DVD, or streaming services) is another facet of digital life that we have slimmed down to make more deliverable, at the cost of visual quality/fidelity. In the days of VHS and Betamax, a television was nothing like the size of the devices we have today. While the screens were small, the analogue encoding of the media meant that even at a laughable resolution of 576i (or lower) there were no compression artefacts to be seen. The fidelity was as clear as it could be for the screen in question. With the advent of digital (be that DVB broadcast or DVD), the signal became compressed (to fit the technology and regulations of the time), resulting in significantly less fidelity (especially in scenes with heavy motion or colour). While things slowly improved over time, the size of televisions increased faster than the bandwidth/codecs available. Watching an old DVD on a large screen shows just how low-resolution and compressed the footage is. With this lack of fidelity, the human brain has less to absorb/process. No longer does the brain spot a peculiar flower within a hedge, as the hedge is just a green blur on the screen.
It's easy for one to say that there is no difference, but I'd challenge anyone to watch the Blu-ray version of Planet Earth (or similar) on a very large, high-resolution screen and state that the quality is the same as the DVD version (it simply isn't). Even the opening scene of the aforementioned show is a testament to the fidelity, showing the planet in such detail that you start to look at the different continents to see what stands out the most (rather than seeing a green blur on a blue background). With 8K TVs becoming readily available, the content once again hasn't caught up. Screens with resolutions so high that the human eye can't resolve any more detail are hampered by a lack of native content, leaving the viewer once again looking at a lesser image. I recall an email conversation with Netflix about how their 4K test videos didn't have a suitably high bit-rate, as you could see compression artefacts at multiple points within them. Sadly, the content always lags behind the devices, and the human brain isn't taxed as much.
Memory (or the lack of it)
Where would the world be without some form of smart device in our pockets, keeping track of everything for us? I am guilty of this myself, relying on my device to keep my schedule and store all of the details of my contacts. Gone are the days when I could remember the phone number of each of my friends and family, replaced by an ageing memory and an over-reliance on technology. The convenience is exactly that, convenience, but it says nothing of what we lose in the process. Some people argue that by not storing data like phone numbers in our heads we leave space for other (more important) things; for the majority, however, that free space is typically consumed by the trivial and less important.
In my youth I could remember IP addresses for years at a time, with no issues of recollection. Now, I struggle to remember the day of the week. While part of this is a result of age, I admit part of it comes down to the convenience of technology. My to-do lists and personal notes are great, and sync across all of my devices (keeping them safe), but it does feel like using technology to remember things means my brain is used less, and so remembers less. I always wonder whether an fMRI of my brain taken each year of my life would show significant differences over the years as I switched to relying on technology more.
To me, one of the biggest changes I have observed over the last 10 years is how things are presented to us in ways that simplify thinking. Choice still exists, but once a choice is made you can sit back and let things happen. Take reality TV as an example, which has grown almost exponentially in both production and consumption. As someone close to me put it, "I watch it because I don't have to think about it". Most shows are geared towards not having to think, not having to contemplate, simply providing a form of distraction in the name of entertainment. While a break for the mind is good (everyone needs downtime), I do wonder if spending most evenings watching reality TV (the highlights of someone else's life) is actually healthy.
Someone once told me that they don't like to spend time idle, because then they have to think about things, and they don't like that. It's a statement that has always intrigued me, and one whose ramifications I often ponder. To me, thinking about my actions of the day, my actions of the past, and my potential future actions helps ground me. As someone who doesn't like to make mistakes, remembering past ones is something I hold in high regard. I sometimes question whether, as a society, we are moving towards removing thinking outside the times when we must, and what impact that has on who we are (and our brains).
With technology improving and higher-fidelity content becoming the norm, I do ponder in which direction the cognitive abilities of people will travel. Technology has become so intertwined with our daily lives that I believe we lose a part of ourselves to it, especially our cognitive ability. While technology can (and should) be used as an aid, I believe it's important that it doesn't become a replacement for our native abilities. Time will tell...