By: Dr Hannah Marston, The Open University
This blog is one in a series on the Future of Ageing, published in the lead-up to the ILC-UK Future of Ageing conference on the 24th November. To register to attend this conference, click here.
Technology has become an integral part of people’s lives since the turn of the 21st Century.
As revellers celebrated the new millennium at the stroke of midnight on 31 December 1999, media and technologists feared a widespread computer meltdown: many legacy systems stored years as only two digits, and so could not distinguish the year 2000 from 1900 (a failure mode sketched below). In the event, this wasn't the case, and the world witnessed and celebrated a smooth transition into a new Century.
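To make the concern concrete, here is a minimal Python sketch (not from the original post; the function name and values are illustrative) of the kind of two-digit year arithmetic that legacy systems relied on:

```python
# A minimal sketch of the "millennium bug": legacy systems often stored
# years as two digits, so arithmetic that worked throughout the 1900s
# broke once the calendar rolled over to 2000.

def age_with_two_digit_years(birth_yy: int, current_yy: int) -> int:
    """Compute an age the way a legacy system might, using two-digit years."""
    return current_yy - birth_yy

# Someone born in 1960 (stored as 60), checked in 1999 (stored as 99):
print(age_with_two_digit_years(60, 99))  # 39 -- correct

# The same check run in 2000 (year stored as 00):
print(age_with_two_digit_years(60, 0))   # -60 -- the feared failure
```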
Throughout the 20th Century there were many innovations, developments and historical events. One of these was the technology revolution, in particular digital games, which were initially accessible in public spaces such as public houses (Simon, 2009), in contrast to the dedicated arcades frequented by those who wanted to play Pong[1] or Space Invaders. As developments advanced, digital games became a feature within homes, and by the 1980s many children and young people were playing a variety of games on dedicated consoles while witnessing the battle between Nintendo and Sega throughout the late 1980s and into the 1990s (Herman, 2001).
Fast-forward to the 21st Century and technology developments have been phenomenal, not only in the games industry, where digital game hardware and software have facilitated alternative approaches to game engagement, from motion sensors (Nintendo Wii) to gesture and speech recognition (Microsoft Kinect[2]), but also in wearable technology (e.g. Fitbit[3]/Jawbone[4]), enabling people to record their daily activities through step counts, sleep patterns and heart rate. The enhancement of mobile phones and tablets means that people now carry their own version of a computer (the smartphone) around with them, personalising its capabilities by downloading 'apps' created by developers for both utility and fun. Some of the most common 'apps' include WhatsApp, Fitbit, radio stations, news (e.g. BBC News), Facebook, Skype, Instagram, Twitter and Flashlight. Additionally, there are work-based 'apps' available, such as Adobe and Dropbox. Some apps require a small fee, while others are free.
Over recent years, research has continued to explore the benefits of technology in the areas of health rehabilitation (Marston & Smith, 2012; Hall et al., 2012; Bleakley et al., 2013), quality of life and well-being (Harley et al., 2010), people's preferences for digital game content (Marston, 2013; de Schutter, 2010; de Schutter & Vanden Abeele, 2008) and the monitoring of one's daily activity (e.g. via wearable technology). Although excellent work continues in these areas nationally and internationally, there are facets that still need to be addressed, such as the cost implications of owning technology and the purpose and benefits of having these technologies in one's life. What happens when the technology doesn't work? Where does a person go for advice? These are just some of the questions that need to be explored and investigated further.
What does the future of technology hold for people in the 21st Century? If we reflect on the latter part of the 20th Century, the movie industry provided viewers with films such as Back to the Future I, II and III, The Matrix, Minority Report, I, Robot, and 2001: A Space Odyssey[5]. The film Back to the Future II predicted that by 2015 society would feature flying cars, hoverboards, food/pizza cooked in a 'Black and Decker Hydrator', self-lacing trainers, air-drying clothes, brain-powered video games and hologram advertisements[6] [7]. These films provided viewers not only with entertainment and a narrative but also with a perspective on the future. One of the most famous books of the 20th Century is George Orwell's '1984' (Orwell, 2004). For some, what Orwell wrote has come, or is coming, to fruition. For example:
“He took his scribbling pad on his knee and pushed back his chair so as to get as far away from the telescreen as possible. To keep your face expressionless was not difficult, and even your breathing could be controlled, with an effort: but you could not control the beating of your heart, and the telescreen was quite delicate enough to pick it up” (Orwell, 2004; p. 67).
One emerging trend is for people to use technology to track their own health, often called the Quantified Self movement. News articles are already discussing the possibility of 'implants' (Tech Times, 2015) or 'tattoos' (Hurst, 2013) which record a person's health data. Is this the future for people of the 21st Century? Will we be encouraged by our healthcare professionals to undergo a minor procedure (i.e. an implant or tattoo) which in turn doubles up as a form of identification and/or passport? If we require medical treatment, be it through the NHS or an alternative provider, might scanning the implant or tattoo not only record and update our health records and procedures, but also trigger another segment of our lives: banking? The financial implications of such procedures could include money being deducted from our bank accounts to pay for health treatment or other activities. It is already possible for some users of Apple[8] and Android technology to pay for small purchases in a similar way to contactless payment. Further, Starbucks offers an app which enables customers to credit their Starbucks account, which is then debited when making purchases.
It is beyond the scope of this article, but across all forms of technology and data collection, security, privacy and the sharing of information need to be considered: who has access to the data, how is it stored, and what safeguards, now and in the future, will prevent data from being stolen and sold on?
Technology and its use within society are exciting, and research is showing positive benefits. Over the last 15 years the development, deployment and take-up of technology has been phenomenal, and it is difficult to predict what people will be using in the next decade, or the next 50 years. However, the take-up of technology among tomorrow's older adults may differ from that of the current ageing population, because they will be so-called 'digital natives' rather than 'digital immigrants' (Prensky, 2001). One thing is almost certain: those who have grown up with technology may face a different wave of challenges to technology engagement than their predecessors.
Dr Hannah Marston
Research Fellow, The Open University's Faculty of Mathematics, Computing and Technology
Dr Hannah Marston is a Research Fellow in the Health & Wellbeing Priority Research Area at The Open University. In 2017, Hannah undertook an exploratory project called Technology4YoungAdults, exploring the impact of ICT on young adults and their attitudes towards it.