When she was in fifth grade, Scarlett Goddard Strahan started to worry about getting wrinkles.

By the time she turned 10, Scarlett and her friends were spending hours on TikTok and YouTube watching influencers tout products for achieving today’s beauty aesthetic: a dewy, “glowy,” flawless complexion. Scarlett developed an elaborate skin care routine with facial cleansers, mists, hydrating masks and moisturizers.

One night, Scarlett’s skin began to burn intensely and erupted in blisters. Heavy use of adult-strength products had wreaked havoc on her skin. Months later, patches of tiny bumps remain on Scarlett’s face, and her cheeks turn red in the sun.

“I didn’t want to get wrinkles and look old,” says Scarlett, who recently turned 11. “If I had known my life would be so affected by this, I never would have put these things on my face.”

The skin care obsession offers a window into the role social media plays in the lives of today’s youth and how it shapes the ideals and insecurities of girls in particular. Girls are experiencing high levels of sadness and hopelessness. Whether social media exposure causes or simply correlates with mental health problems is up for debate. But to older teens and young adults, it’s clear: Extended time on social media has been bad for them, period.

  • greenskye@lemm.ee · 18 days ago

    Which is also a problem because we can’t have adult spaces either. Every time someone tries, they get shut down or all attempts to keep kids out are fruitless. At this point I think everyone would benefit from robust ways of enforcing age limits online.

    Personally I think this needs to be at the device level. You can register a device as child, teen, or adult. Every website can query the device age group (one possible site-side check is sketched below). The device age is set by a process that verifies ID through a trusted party. Only that party knows your identity; everyone else simply knows your age group. Child and teen devices would be tied to an adult account, and only that account holder could override or update the classification (or a valid adult ID would work too).

    Then it would put liability on the parent for allowing their kids access to adult content. Websites that don't check for this info, or that abuse it, can be shut down.
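
    A minimal sketch of how the website-side query could look, assuming a hypothetical `navigator.deviceAgeGroup` API (no such browser API exists today; the names and shape are illustrative only):

    ```typescript
    // Hypothetical only: assumes the device exposes its verified age bracket
    // via navigator.deviceAgeGroup() and never reveals the user's identity.
    type AgeGroup = "child" | "teen" | "adult";

    async function getDeviceAgeGroup(): Promise<AgeGroup | null> {
      // Feature-detect the assumed API; unverified devices return null.
      const nav = navigator as Navigator & {
        deviceAgeGroup?: () => Promise<AgeGroup>;
      };
      if (typeof nav.deviceAgeGroup !== "function") return null;
      return nav.deviceAgeGroup();
    }

    async function mayShowAdultContent(): Promise<boolean> {
      // Treat child, teen, and unverified devices identically:
      // only an explicitly verified adult device is served adult content.
      return (await getDeviceAgeGroup()) === "adult";
    }
    ```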

    • ForgotAboutDre@lemmy.world · 18 days ago

      No, the people targeting children in the adverts and entertainment should face criminal prosecution.

      They know they’re targeting children, they want to target children, and they already use methods to get around what protections are in place.

      Google have expressly told advertisers that they can target children if they go after unknown users.

      The only people watching most of the content are children and the mentally handicapped. Most adults would find it too annoying. The people creating it know this. Prime drinks are an example of this: the groups associated with it regularly discuss topics and use humour that is inappropriate for children, often playing with sexist, racist and intolerant themes. They wanted to sell alcoholic drinks under their branding, but realised there was no market for it because most of their viewers are under 12.