Experts at WISE Summit 2019 discuss how technology plays a role in child development
With increasing demands on young minds to be adept with technology, what is the right age for a child to be introduced to the digital world?
“We have evolved over millennia to be good learners, but our environments haven’t evolved to keep up. A study showed that in the US in the 70s, children started watching television at an average age of four years; today it’s four months,” said Dimitri Christakis, Director of the Center for Child Health, Behavior and Development, University of Washington and Seattle Children's Hospital.
“The typical pre-school child spends 4.5 hours on gadgets today. The ‘benefit’ of technology is how it structures childhood: the apps on our phones and tablets are designed to essentially baby-sit children so that caregivers can walk away. Children need laps more than apps.”
Christakis was part of a panel titled ‘How can we harness neuroscience to optimize holistic learning?’, in which leading experts shared insights into the applications of neuroscience for better learning. The session took place on the first day of the WISE Summit 2019.
The session aimed to deepen understanding of brain development and how learning happens; how to best enhance learning across the various stages of development; and how educators can most productively interpret new knowledge to support diverse individual learners.
“Video games are a great way to learn but they aren’t holistic. For example, when children play with each other, they can choose to break the rules. They can’t break rules in a video game,” said François Taddei, Director, The Center for Research and Interdisciplinarity. “Interacting with other humans also allows emotional capabilities to grow.”
All panelists agreed that play is a fantastic method of learning as it is more holistic.
“One of the critical differences between real-world and digital applications is that digital apps are static: children can only do what the app tells them to do,” said Janet Rafner, Director of Learning, ScienceAtHome, Center for Hybrid Intelligence.
“In the real world, they have the ability to use tools to innovate and to think outside the box, something that digital apps don’t offer. When you deal with a complex problem, the learning that goes on is that much more varied. However, technology these days is about learning simple problems, simple games, simple math, and so children don’t have the opportunity to develop complex learning skills through these apps.”
According to Taddei, all children are scientists, and supportive environments need to be provided to encourage holistic learning.
“If you compare humans to other animals, we are not fantastic in many areas, but one element we are good at is our ability to learn from others; but this should be done in the right environment. Learning can be enhanced by play – it’s a very efficient method,” he said.
“Environments that allow children to stay motivated, focused, and inspired while receiving feedback are a positive dimension of learning well. But if children are constantly distracted, then that’s a poor way to learn, and technology is a good example of this. We have to learn to switch off phones and machines, and allow children to learn from humans. This is where we need to unlearn and relearn.”
Panelists also addressed the question of training parents and caregivers to raise children without becoming too dependent on technology. If caregivers are addicted to their smartphones or tablets, they pointed out, the consequences are twofold: the quality of caregiving suffers, and the child also becomes addicted to the technology they are exposed to.
“It’s not always possible to choose between the digital and real world. The critical aspect is trust, and this includes everyone around the child,” said Randa Grob-Zakhary, Executive Director, Insights for Education.