We’ve all heard the stories about kids spending too much time with their devices or, worse, spending that time doing things online that they probably shouldn’t be doing. It’s been an issue since I started writing about kids and tech in the early 90s, but with the proliferation of phones and tablets — sometimes now getting into the tiny hands of toddlers — the issue has the attention of the media, policy makers and the public. And, as AP writer Michael Liedtke recently reported, even tech luminaries like Instagram co-founder Kevin Systrom worry about their kids’ use of technology. “Obviously, like anything — whether it’s food, or drink — moderation is key,” Systrom told the AP. Liedtke also interviewed Urban Airship executive Mike Herrick, who “sees his 13-year-old daughter getting lost in her smartphone and wonders: Is technology messing with children’s brains, even as it enlightens and empowers them in ways that weren’t possible when his generation grew up?”
The message about the potential overuse and misuse of technology has reached the campuses of major tech companies including Google, Facebook, Apple, Amazon and Microsoft — all of which have made some moves to help parents moderate their kids’ (and their own) use of tech and to provide internet safety programs, either directly or in partnership with non-profit groups and schools.
I serve as CEO of one of these internet safety organizations — ConnectSafely.org — which has worked with (and received financial support from) Facebook, Google, Microsoft, Snapchat and other companies to provide resources for parents, educators, teens, seniors and others on how best to manage the tech onslaught. And all of these companies, to some degree, have launched programs or products aimed at helping parents guide their children’s use of tech.
These efforts fall into two categories:
- Actual products and services for children that come with parental controls and limits on the content available to kids
- Educational programs aimed at kids or families
Why are companies doing this?
I’ll get into the specifics of these products and services shortly, but first a word on motivation. Skeptics have claimed that these efforts are an attempt to entangle children in the companies’ web, reaching not only the teens who have traditionally used these services but even younger children who are now being targeted by some products and educational programs. The goal, some say, is to profit from young children, get them hooked on a brand and eventually migrate them to other services once they reach an appropriate age. There are also charges that these companies offer safety education merely as a band-aid to mitigate, but not prevent, the alleged harm that they do.
While it’s tempting to latch on to these and other criticisms, it’s also important to understand the dynamics inside these companies, where employees and even executives wrestle with the unintended consequences of their products and services. There is always tension between wanting to expand offerings and needing to make them safe and appropriate. There is also tension between wanting to serve younger children and finding ways to do so appropriately and in compliance with federal laws such as the Children’s Online Privacy Protection Act (COPPA). Also, many of the employees of these companies are parents themselves, so there is a growing and sincere awareness of the need to provide products that are useful but also appropriate and safe. Plus, these companies operate under a spotlight, so even if they didn’t have their own concerns, they are smart enough to know that how they market to children will be noticed by the media and policy makers.
There is also the reality that, whether the companies encourage it or not, younger children are going to get their hands on their products, often with the knowledge or even cooperation of their parents. It’s an older study, but I have no reason to believe that the findings of danah boyd and other researchers from 2012 don’t still hold true for users of kid-tempting services like YouTube, Snapchat and Instagram. Their study looked at Facebook, which these days is not so popular with young children, but the concept remains. The researchers found that there were millions of children under 13 on Facebook at the time, that “Almost three-quarters (74%) of parents whose child is on Facebook and who reported a minimum age knew that their child was on Facebook below what they believed the minimum age to be,” and that most parents actually helped their kid get on Facebook. Again, Facebook may no longer be so tempting for young children, but other services are.
In the New York Times, Judy Ketteler asks “When Is a Child Instagram-Ready?” and admits that she helped her son set up a Google account to post YouTube videos about “flippers.” Quoting experts, she recommends that parents have conversations with their kids about tech use, including “what’s appropriate for them to post.” Her article also recommends that parents set up an agreement with their kids. ConnectSafely has a page with Family Contracts & Pledges for young children, teens and parents.
The same is true for YouTube, where many children, along with a high percentage of teens, go to look at videos even though the rules prohibit anyone under 13 from using the site. That reality is what prompted YouTube to launch YouTube Kids, so that children could find age-appropriate content without bumping into content that’s not suited for young children.
As co-founder and CEO of ConnectSafely, I’ve been working with many of these companies for years, and I can say with absolute certainty that they all employ people who spend most of their time thinking about how to make sure their products are safe and appropriate. That doesn’t mean these employees always get their way. As I said, there will always be tensions within companies between forces that try to expand offerings and others who try to apply the brakes.
Products and services
Products like Facebook’s Messenger Kids, Amazon’s FreeTime and the kids’ versions of its Fire tablet and Echo speaker, and Google’s YouTube Kids and Family Link are often the ones that get the most attention, including from critics who argue that they’re designed to hook kids and bind them to a brand, or that they may not have the necessary privacy or content controls. But lost in these criticisms is the reality that these services can provide welcome entertainment and educational value to their users and offer parents an alternative to products that may not be suitable for kids. It’s arguably like allowing kids to watch TV shows tailored to the needs and interests of children vs. shows designed for older audiences.
Google’s Family Link enables young children to use a smartphone, with a companion app that empowers parents to control what kids can do with their phone. The app allows parents to set up, manage and monitor a Google account and Android device for their child so that young children can take advantage of the benefits of having a smartphone or tablet. There are numerous controls, including the ability to set limits on apps, such as how long kids can use them. SafeSearch is on by default, and parents can set screen time limits, including the amount of time or the time(s) when a child can and can’t use the device. (The device can always send and receive calls.)
Google has also built screen time controls into the latest version of Android. The Digital Wellbeing controls, which are available to users of all ages, enable the user to view a dashboard with a daily or hourly view of the time spent on the phone, how frequently they use different apps and how many notifications they get. They can also use it to reduce interruptions and set a bedtime schedule, which, among other things, turns the display to grayscale. Apple has launched a similar set of features in its latest version of iOS.
Google’s YouTube Kids app provides access to a subset of YouTube deemed suitable for children. There was some initial criticism of the app for allowing blatantly commercial content to slip in, so the company added curation: human moderators now review content before it’s accessible on the app.
Facebook’s Messenger Kids gives kids ages 6 to 12 the ability to engage in conversations as well as exchange messages, videos and images with parent-approved friends and family. The app also allows kids to send photos, videos and text messages to approved adults, who receive the messages on their Facebook Messenger app. Kids also have access to a pre-approved library of stickers, GIFs, masks, frames and drawing tools to decorate their content. There is no advertising on the app and Facebook says that it won’t migrate kids to other products as they get older. There are numerous parental controls, including requiring parents to approve any new contacts.
Amazon sells special kid-friendly versions of the company’s Fire tablets and Echo Dot speakers that offer parental controls, lots of extra kid content and even a special “no questions asked” two-year warranty to protect against child-induced damage. Amazon also offers FreeTime Unlimited, a monthly subscription with thousands of content titles for children ages 3 to 12. Here’s my interview with Kurt Beidler, who heads up Amazon FreeTime and the kids’ product division.
Microsoft has parental control tools for many of its products, including Windows and Xbox. Its Microsoft Family service offers features like screen time limits, activity reporting, location sharing and content restrictions across Microsoft products.
Family education programs
As I mentioned, all of these companies offer consumer education programs, including programs aimed at parents and children. My nonprofit, ConnectSafely, has helped develop programs offered by Google, Facebook, Snap and other companies.
Google’s Be Internet Awesome “teaches kids the fundamentals of digital citizenship and safety so they can explore the online world with confidence,” according to Google. The program consists of a curriculum, an educators’ guide and an interactive game, Interland, that kids can play to learn basic lessons on privacy, safety and security, including real vs. fake content, sharing with care, ways to “secure your secrets” and kindness. A Family Link Guide for Parents provides tips on screen time, content consumption and managing settings. Google also offers programs to educators, including hands-on workshops with participation from ConnectSafely.
As I mentioned, these companies also provide support for non-profit education programs, including ConnectSafely, the Family Online Safety Institute, Childnet International and other groups based throughout the world. They also have their own safety portals, including these from Facebook, Google, Snapchat, Twitter and Instagram.
It’s all about family communication
While I applaud any sincere effort by tech companies to provide policies and services that make their products safer and more age-appropriate, the most powerful forces when it comes to protecting children remain parents, along with the internal controls that children develop to protect themselves. Companies can create software to try to protect children, but as I’ve said for more than 30 years, the best protection software runs on the computer between their ears. Even if a Facebook or a Google could prevent kids from inappropriately using its services, it can’t protect kids from other apps and other dangers that will confront them in life. But parents can, and so can the kids themselves. It takes education and it takes communication, and it’s not a one-shot deal. You have to bring up these issues from time to time, not as a lecture but as a conversation.