On the 10th of May, the London School of Economics (LSE) hosted a symposium on families and “screen time”. The event addressed the critical issue of reaching parents with resources which will empower them to better accompany their children in navigating the Internet: taking full advantage of opportunities, developing resilience, and minimizing risks which could lead to harm. Much of the debate centered on the tricky tension between the need to present clear, simple messages and guidance while at the same time addressing the various parent “profiles” and their differing needs depending on parenting style, socio-economic status or education level.
One take-away is that even though messages like the “2×2” rule from the American Academy of Pediatrics are well known, such guidance is also wildly simplistic: it makes no distinction between “screen time”, “screen context” and “screen content”. In essence, not all “screen time” is equal, and playing a game with your family or doing homework is not the same as watching YouTube alone in your room.
Screen time: still a relevant metric?
As was pointed out by Angharad Rudkin from the University of Southampton, while we do not know a lot about the effects of online media and screens on children’s cognitive development, we do know about some of the “physical” effects, such as the risks of obesity or sleep disorders. Even if technologies such as Augmented/Virtual Reality may enhance physical activity, physical side effects may remain (exposure to “blue light”, eye problems, sleep disorders…), which means that even if a distinction needs to be made between “screen time”, “screen context” and “screen content”, it will still be necessary to exert some control over screen time. In a not-so-distant future, when screens become wearables much along the lines of Google Glass, how will this affect recommendations on “screen time”?
What’s good for children?
Another issue of contention was how to distinguish between what is “good” or “bad” for children. Some participants pointed out that parents don’t want to be told what is “good” for their child, and that parents could instead rely on peer-to-peer support to sort out what is good and bad by looking at other parents’ recommendations, reviews etc. One of COFACE’s French members, UNAF, runs such a platform, which enables parents to review apps, websites or content for children (panelparents.fr). However, the platform is heavily filtered and monitored to ensure that there is no conflict of interest in the reviews and, especially, that parents use a consistent set of criteria to assess apps, websites or content.
As is customary on the Internet, any development which empowers users can also be used against them. User reviews and peer-to-peer support have proven very useful, but at the same time, business models around reputation management have emerged, allowing content providers or developers to pay for positive reviews. Peer-to-peer recommendations and support also raise issues of “trust”. How can you be sure that the “parent” on the other side of the screen isn’t working for the app or website he or she is openly promoting or recommending? Furthermore, what are the credentials of the parents posting recommendations? It could very well be that the most vocal parents online are not necessarily the best placed to objectively review online content/services. So while peer-to-peer networks will always be helpful, they are certainly not a panacea and need to be accompanied by measures to mitigate the problems raised above.
Classification systems and information given to parents are another way to sort content/services. There have been many advances in the standardization of classification, notably via the PEGI rating system and initiatives such as the MIRACLE project, which aims at making existing rating systems and age labels interoperable. In this respect, the role of regulators is key: only with the right regulatory “push” will the industry come together and agree on a common standard for classification. But classification systems also have their limits. PEGI, for instance, provides age recommendations along with content-related pictograms alerting parents to such things as violent or sexually explicit content. In essence, this classification system warns only about risks but says nothing about opportunities.
One idea would be to further investigate the effects of online content/services on children’s cognitive development. How does a game affect “delayed gratification” or “locus of control”? While it may prove very challenging to come up with scientifically sound and accurate answers to these questions, it is essential that we move forward: the “serious game” industry is booming, and video game developers and online service providers do not hesitate to prominently display (often unsubstantiated) claims about the educational value or benefits of their products.
While we may never come up with a definitive answer on what’s good for children, what is clear is that the private sector doesn’t hesitate to bombard parents with its take on what’s good for their kids. Therefore, arguably, even a mediocre recommendation to parents from independent sources such as academia, civil society or NGOs will still be better than leaving the field clear for claims driven by commercial interest.
The POSCON project and network is a good place to start.
Follow the money
The prevailing business model on the Internet now relies on users’ time. The more time a user spends on a service, a piece of content, a game or an app, the more revenue he/she generates via exposure to advertising, via the exploitation/sale of the data the user generates, or via in-app purchases of virtual goods.
At the same time, the Internet is seen by many as a providential tool, a great equalizer which allows for the realization of a number of core human rights, including freedom of speech. Many stakeholders, including governments and civil society, argue that we should simply apply a ‘laissez-faire’ philosophy to the Internet and it will become a space of freedom. It is certainly tempting to see it in such a light, as it allows all actors to sit back and watch and, especially, to shield themselves from advancing politically delicate recommendations.
Unfortunately, the Internet is being transformed by a combination of factors, including algorithms and online business models.
Algorithms increasingly shape what people see online, enhancing the visibility of certain content inside a social network’s newsfeed or a search engine’s results page, thus inevitably skewing a user’s online experience. While there is no way around sorting content given the sheer volume of the Internet, the methods for sorting it (how algorithms are designed) do raise many concerns. Which criteria are used to “boost” the visibility of content? Is there any consideration of “quality”, or of whether the content is “positive”? Such considerations are especially important for services which are “designed” for children, such as YouTube. The videos prominently displayed as recommendations on the app’s home screen do not land there “by accident”.
Online business models which rely on users’ time to generate revenue also contribute to corrupting online content. Any content producer looking to make money will seek to maximize the time users spend on their content/service. In extreme cases, this gives rise to click-baiting techniques, which rely on catchy pictures/videos/headlines to entice users to click and be redirected to pages filled with advertising. More importantly, whenever a content/service provider has to make an editorial choice about content, optimizing viewer statistics, click-through rates, bounce rates or stickiness will be a top priority, often at the expense of “quality” considerations.
Some would argue that quality goes hand in hand with a website’s stickiness or viewer statistics, but this is highly unlikely. One study found, for instance, that brands talk like 10-year-olds on Facebook, mainly in order to maximize user interaction and reach. If producing and posting a “funny cat” video will likely generate 5 million views while an educational video about nature will generate 500,000 views, which video will end up being produced? How will such logic impact creativity? How would a modern-day Shakespeare fare under a business model which seeks first and foremost to appeal to a mass audience, as opposed to pursuing art for art’s sake?
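The logic above can be made concrete with a toy sketch (this is purely illustrative and not any real platform’s algorithm; the figures reuse the hypothetical numbers from the paragraph). A ranker that scores purely on expected engagement will systematically surface the “funny cat” video over the educational one, because sheer reach dominates any per-viewer quality signal:

```python
# Toy illustration only: a hypothetical engagement-maximizing ranker,
# NOT any real platform's recommendation algorithm.

def rank_by_engagement(videos):
    """Sort videos by expected total watch-minutes (views x average minutes watched)."""
    return sorted(videos, key=lambda v: v["views"] * v["avg_minutes"], reverse=True)

catalogue = [
    {"title": "Funny cat compilation", "views": 5_000_000, "avg_minutes": 2.0},
    {"title": "Educational nature documentary", "views": 500_000, "avg_minutes": 8.0},
]

for video in rank_by_engagement(catalogue):
    print(video["title"])
# The cat video ranks first (10,000,000 watch-minutes vs 4,000,000),
# even though the documentary holds each viewer four times as long.
```

The point of the sketch is that no term in the scoring function measures educational value or “positive” content; as long as the objective is time spent, quality considerations are structurally invisible to the ranking.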
Again, when seeking to minimize risks and enhance opportunities for children online, one cannot ignore these realities.
If we want children to truly benefit from online opportunities, we need to take a closer look at who/what gets the spotlight on the Internet, or in other words, who is making “editorial decisions”. Many would chuckle at this very idea, since the Internet is now supposed to be a level playing field, with user-generated content taking over and “editorial decisions” limited to the censorship of content which violates terms of service. But as we have discussed above, algorithms and the prevailing online business model have a massive influence on who/what gets the spotlight.
Digital parenting or just parenting?
Is there such a thing as “digital” parenting? This was another question raised and discussed by a number of participants. Parenting does spill over into the “online” world, since social skills, sexuality education, a healthy balance in children’s activities, social and emotional learning, and values such as respect can all be transposed to online settings.
At the same time, the online world offers “new” sets of challenges. Children may encounter certain “risks” or “inappropriate content” at a much earlier age online than offline, pornography being the most obvious example, which means parenting clearly needs to “adapt” to this new reality. Also, not all “traditional” parenting can be transposed to online settings. For instance, bullying and cyberbullying are clearly different and require adapted responses: cyberbullying is 24/7, there is a greater chance that the perpetrator(s) remain anonymous, and the “signs” are harder to spot (a black eye vs. a nasty message/comment). So while a parent might know how to react to bullying (identifying the perpetrator, contacting the relevant authority such as a head teacher or the staff of a sports club), this does not necessarily apply to cyberbullying. If the perpetrator is an identifiable classmate, does the teacher/school have the authority to intervene if the cyberbullying occurred outside of school premises/school hours?
The fox and the crow all over again
“If all parents had access to appropriate resources, advice and guidance about online risks and opportunities, then children’s online experiences would be optimal and all problems would be solved.” Of course, no one would dare voice such a claim outright, but some come close. Private companies have every reason to promote such an idea, as it is one of the most powerful arguments for delaying any policy or regulatory measures.
To provide a useful analogy: financial service providers argue that financial literacy should be the focus in preventing over-indebtedness, and agro-business argues that informing people about healthy eating habits should be the priority in tackling obesity and chronic disease… ignoring, of course, the fact that both industries engage in wildly counter-educational advertising campaigns enticing consumers to act impulsively (take out credit to get the vacation you rightfully deserve) or to fulfil their need for social recognition via food (drink a Fanta and you’ll be popular with your friends).
Private companies essentially resort to the tactic the fox employed to get the cheese out of the crow’s mouth. By flattering consumers/users as resourceful, smart and informed, companies can better manipulate them into forfeiting control over their data, consenting to unfair terms of service, or handing over large sums of money through unethical business models such as “free to play” or “in-app purchases”.
The same logic prevails in the issue of advertising to children. Advertisers happily provide “educational material” via programmes like MediaSmart to children and flatter children’s intelligence and resilience to advertising only to overwhelm them with more insidious advertising techniques.
That being said, education is always a necessity in and of itself, regardless of any specific policy objectives, but a balance must be struck between the need to educate/inform/empower and the necessity to protect and shape the environment to be as conducive as possible to positive experiences for all. Education should never be a substitute for necessary policy/regulatory measures.
A provocative metaphor would be the following: should we put more focus on training people to avoid mines in a minefield, or on demining the field?
The Internet of Penguins?
All of this boils down to a simple yet very complex question: what kind of Internet do we want? The Internet is said to belong to “no one”, and in the eyes of many it is the “ultimate” incarnation of a public good. At the same time, throughout the Internet’s history, there have been many threats to its openness. At one point, it wasn’t even certain whether users would be able to use web browsers for free: before Microsoft released Internet Explorer at no charge, browsers were commercial products (see Netscape’s history). The Internet has been and still is at risk of government control, be it via censorship, blocking and filtering in countries like China or, to a lesser extent, Turkey, or via control of its governance (the US has had disproportionate control over Internet governance).
Nowadays, it is at risk of corporate capture: via the “de facto” monopolistic tendencies of online platforms like YouTube, Facebook and Amazon, via the selective display and filtering of information (control over discoverability) by search engines like Google, or via control over its core infrastructure (the Internet backbone) by telecommunications giants or even companies like Facebook (see their internet.org project).
As a society, we need to decide collectively how much of the Internet we want to “keep” as a public good, independent of commercial interest: websites like Wikipedia, Open Source software, not-for-profit initiatives like the popular coding platform for kids, Scratch, and many more. To make a comparison, we can think of the Internet as a planet which is slowly being populated. How much of the planet will be covered with shopping malls, and how much with public parks and nature reserves? Do we want public parks to be crowd-funded and maintained by the community as a whole (like Wikipedia), or do we want to privatize them, letting private companies manage these spaces while posting billboards on park benches and stamping adverts on trees?
Looking at piracy statistics and user behaviour online, the unformulated answer seems quite clear: users consider the Internet predominantly a public good and expect to enjoy all of its content for “free”, a behaviour which has caused massive disruption to the business models of industries such as music labels. A debate is currently raging around access to knowledge and academic papers, sparked by the emergence of the search engine Sci-Hub. All of these developments point to various possible scenarios: a stubborn pursuit of current intellectual property and copyright law, with right holders engaged in an eternal witch hunt against piracy; the development of alternative business models, such as unlimited subscription-based services like Netflix; or even the generalization of concepts like the “universal salary”, currently being experimented with in Finland, which would allow people to work for the community without having to worry about their pay check.
All of this seems far from the debate about “child safety”, but it will have a much deeper impact on children’s online experience than the addition of a new safety feature or reporting button on their favorite (and often only) social network.
For more information about the event, visit the LSE blog
On the 22nd and 23rd of September, the German and Polish Safer Internet Centres jointly held another edition of their major Safer Internet Conference in Warsaw, Poland. The conference revolved around several key topics such as privacy, sexuality, risky content, data ethics, cyberbullying and inappropriate online behaviours. Many key issues were touched upon, such as:
- The growing use and exploitation of private data as a business model, and the dawn of private data as a currency to pay for online content.
- The impact of sexting and exposure to sexually explicit content on children’s and young people’s sexuality, which can also border on cyberbullying.
- How to secure your right to privacy online, with concrete tips to keep your data safe from unethical uses.
COFACE, represented by Martin Schmalzried, sat on a panel discussion dedicated to challenges and visions concerning child safety online in the present and in the future. The three main points addressed by COFACE were related to cyberbullying.
How to reach out to parents who are unaware of cyberbullying?
Some parents will always be left out and feel helpless when it comes to dealing with cyberbullying. As with other topics such as sexuality, not all parents feel ready to discuss certain subjects, for a variety of reasons. Schools and teachers remain the best way to ensure common knowledge and awareness of issues such as cyberbullying: that is why universal schooling was set up in the first place, to level the playing field and give each child the same chances in life through education. However, this is no reason to give up on parents, and we should always try to reach out to them to make them feel more concerned and involved in issues such as cyberbullying. Examples include:
- Organising parents’ evenings in schools or through organisations such as family associations.
- Presenting them with easy tools and steps to protect their children online.
- Information campaigns via magazines and newsletters from family associations, or the provision of easy tools and multimedia resources such as those delivered by the #DeleteCyberbullying project.
How do you explain the difference in awareness about cyberbullying between EU countries?
It has much to do with cultural differences and the environment. In some countries, such as the Scandinavian ones, topics like sexuality, violence or gender roles are openly discussed by the wider public, while in others, such as the southern Member States, these topics remain much more “taboo”. Such cultural differences, among many other factors, may explain the differences in attitudes towards an issue like cyberbullying. For instance, on COFACE’s awareness-raising video about cyberbullying, we have received many comments implying that cyberbullying is not such a tragic issue; after all, it’s “just” a few online words that you can easily ignore, especially if “you are a man”.
Parents often don’t come to parent evenings at school. How can they be encouraged to come?
There are many strategies for securing parents’ participation, but we would like to put the focus on work-life balance. Parents and teachers lead busy lives. With both parents working, there is little time left for parenting, personal activities, social activities and household responsibilities. A better work-life balance would give parents more time to attend parent evenings and get more involved in their parenting, including digital parenting. COFACE carried out a full campaign on work-life balance last year.
Are you a worried parent, fearing your child may be cyberbullied or may be cyberbullying someone?
Or a teacher who wants to explore the topic of cyberbullying in class?
Are you a teenager who has received some nasty text messages or witnessed cyberbullying?
Download our free, interactive app, which contains:
– An interactive quiz for teenagers, parents and teachers that displays customized feedback based on the responses given and redirects the user to the most relevant information sources, material or help in case the user has experienced cyberbullying.
– A quiz to test your knowledge about cyberbullying and the internet in general, with the possibility to share your score on Facebook and get more information about cyberbullying.
– A “one touch” button for help in case the user is in need of direct assistance.
– An awareness-raising video embedded in the app (English) or on YouTube (multiple languages available).
– A survey for teachers to help better understand their experience and expectations regarding cyberbullying.
– A section with more information about the project and the app.
Read more: goo.gl/9dLqhL
The European Parents Association (EPA) held a conference in Lisbon on 4 April on “Challenges for parents in the digital age”. The aim of the conference was to explore the various challenges for teenagers, parents and grandparents alike in an increasingly connected and digitalized world. Some of these challenges include: social networking and privacy, cyberbullying, the “digital divide” between generations, children’s rights, education in the digital age, digital skills and entrepreneurship, parent training and democracy. COFACE was invited to present our #DeleteCyberbullying project and children’s rights online during one of the afternoon workshops.
The key message of the presentation was that some children’s rights online are better implemented than others. While the right to freedom of speech is relatively well protected online, including for children, other rights, such as Article 17 of the UNCRC on the role of mass media and access to quality content, or Article 31 on the right to rest and leisure appropriate to the age of the child along with the right to participate fully in cultural and artistic life, are far from being fulfilled.
The internet has indeed enabled children to express themselves quite freely via a plethora of online services (social networks, chat forums, websites, video platforms etc). At the same time, their exposure to inappropriate content has grown steadily and the various stakeholders responsible for online content (especially governments and online content/service providers) have not done enough to ensure that children have access to “information and material from a diversity of national and international sources, especially those aimed at the promotion of his or her social, spiritual and moral well-being and physical and mental health” (to quote the UNCRC).
Additionally, children have very little influence on “cultural and artistic life” online. They are seen as present and future consumers, targeted by advertising and encouraged to spend money online via games or other activities where their participation is most often passive rather than proactive.
For more information about the EPA conference and COFACE’s presentation of children’s rights online and cyberbullying, please visit the EPA conference article here