Larry Magid: Internet Governance Forum focuses on protecting kids

I traveled to Lillestrøm, Norway, in late June for the annual United Nations Internet Governance Forum (IGF). Unlike many UN gatherings that primarily involve government officials, the IGF is a multistakeholder event, bringing together representatives from governments, universities, advocacy groups and the tech industry from around the world. I attended on behalf of ConnectSafely, the nonprofit internet safety organization where I serve as CEO.

Although it doesn’t pass resolutions or make binding decisions, the IGF serves as a space for dialogue and collaboration. As someone who has long worked at the intersection of technology, policy and online safety, I find it an invaluable opportunity to share insights, contribute to the conversation and learn from policymakers, researchers and fellow advocates.

Algorithm impact

I was excited about a panel titled “Securing Child Safety in the Age of the Algorithms,” an important topic that deserves serious consideration. But the first speaker, Leanda Barrington-Leach of the 5Rights Foundation, painted an overly pessimistic picture, arguing that algorithm-driven platforms are not just risky for children but downright engineered to harm them. She alleged a slippery slope where a child can go from a “simple search for slime to porn in just a single click, or from trampolining to pro-anorexia in just three clicks, and nudge to self-harm in 15 clicks.” I suppose that might be possible, but I couldn’t replicate any of those paths when I tried using Google.

She wasn’t alone in sounding the alarm. The panel featured representatives from UNICEF, the European Commission and senior government officials from Norway and Sierra Leone, some of whom laid out a sobering portrait of what they view as a public health crisis: Children drawn into dangerous digital spaces by algorithms designed to maximize engagement, often at the expense of well-being.

Concerns lack nuance

To be fair, some of the concerns raised are legitimate. There’s no denying that children are exposed to inappropriate content and that features like infinite scroll and autoplay can lead to overuse. Algorithms, designed to serve up content the system believes you want, can reinforce bad habits and encourage repetitive consumption. But they can also enhance discovery, making platforms more engaging and useful. As I listened, I couldn’t help but feel that the narrative lacked nuance and gave parents and children too little credit. I’m not nearly as pessimistic as some of the panelists.

The reality is that most kids, especially teens, are savvy enough to avoid these pitfalls, and many parents use parental control tools or enforce family rules to help their children steer clear of these dangers. Although users may not have full control, there are often ways to tweak the algorithms, as we point out in ConnectSafely’s new guide, Taking Control of Your Instagram Feed.

Yes, bad things can happen, but the vast majority of young people don’t have horrific experiences online. Unfortunately, many people encounter annoying scams, and teens can be inundated with images of seemingly perfect lives and beautiful people, which, if internalized, can lead to the trap of “compare and despair.”

Dwelling only on the possible but relatively unlikely horrific outcomes would be like a pediatrician focusing on rare life-threatening diseases rather than common childhood illnesses.

It’s not a perfect metaphor, but using online services, including social media, is a bit like participating in sports. They offer significant benefits but come with inherent risks. Millions of children play sports with overwhelmingly positive outcomes, despite the occasional scraped knee or, in rare and tragic cases, serious injuries and even death. A panel on bicycle safety could dwell on horrific accidents or highlight the physical and mental benefits of cycling along with common-sense precautions like wearing helmets and watching out for cars.

Youth sports organizations work hard to make games as safe as possible, and tech companies should be held to the same standard. Although I agree there is more to do, including reworking algorithms that are designed to keep people online longer, I can say from direct experience that the safety teams at Meta, TikTok, Snap, Discord, Roblox and other companies that work with ConnectSafely are constantly seeking ways to make their platforms safer for young users.

Industry weighs in

The panel did include representatives from TikTok and Roblox who, as you might expect, took a different tone.

Christine Grahn, head of public policy for TikTok Europe, described TikTok’s “safety-by-design” approach, agreeing that many features for minors should be off by default. She pointed out that teen accounts are private by default and that teens under 16 can’t access direct messaging or group chats, and their videos won’t appear in the For You feed. ConnectSafely’s Parent’s Guide to TikTok describes the safeguards for teens.

Emily Yu, Roblox’s director of innovation, emphasized that “safety is at the heart of everything we pretty much do at Roblox,” noting that every new product feature is evaluated through a safety-by-design lens. She highlighted the company’s recently announced “robust parental controls,” including screen time limits and content labeling to help parents better understand and manage the experiences available to their children. “Parents have awareness as to what an experience holds,” she explained, “and they can obviously permit or not permit their child from entering that experience.” Yu also addressed the role of algorithms on the platform, saying Roblox focuses more on “discoverability rather than limiting the content that is seen by the child based on personalization.” You can learn more in ConnectSafely’s recently updated Parent’s Guide to Roblox.

UNICEF’s Thomas Davin compared the harms of algorithms to those of tobacco and alcohol, invoking neuroplasticity, screen addiction and even the erosion of truth. It’s an argument I’ve heard before, but I’d argue it’s not the full picture. Yes, there are teens who overuse TikTok and other platforms, but there are also many who use them to learn new skills, express creativity and engage in activism.

Davin also correctly pointed out that “We have a risk of children feeling less and less able to have voice and agency on how those technologies affect them, impact them, and maybe direct some of what they have access to or what they can say.”

Cultural erasure

One of the takeaways from this and other panels was the concern, voiced especially by participants from outside North America and Europe, about how social media platforms are dominated by U.S. and European interests. Sierra Leone’s Minister of Science, Technology and Innovation, Salima Bah, reminded attendees that “a significant portion of internet traffic in Sierra Leone flows through TikTok,” and expressed concern that algorithmic systems too often fail to reflect African identities. She warned of cultural erasure when platform design decisions are made without grounding in local context.

Youth participation

One of the panel’s most encouraging themes was the call for meaningful youth participation in digital governance. Both TikTok and Roblox highlighted their global youth councils, which provide input on product and policy decisions. Yet, in a telling irony, not a single child or teen was present in the room.

Larry Magid is a tech journalist and internet safety activist. Contact him at [email protected].
