Facebook says it is not dead. Facebook also wants you to know that it is not just for “old people,” as young people have been saying for years.
Now, with the biggest thorn in its side — TikTok — facing heightened government scrutiny amid growing tensions between the U.S. and China, Facebook could, perhaps, position itself as a viable, domestic-bred alternative.
There’s just one problem: young adults like Devin Walsh have moved on.

“I don’t even remember the last time I logged in. It must have been years ago,” said Walsh, 24, who lives in Manhattan and works in public relations.
Instead, she checks Instagram, which is also owned by Facebook parent company Meta, about five or six times a day. Then there’s TikTok, of course, where she spends about an hour each day scrolling, letting the algorithm find things “I didn’t even know I was interested in.”
Walsh can’t imagine a world in which Facebook, which she joined when she was in 6th grade, becomes a regular part of her life again.
“It’s the branding, right? When I think of Facebook, I think ugh, like cheugy, older people, like parents posting pictures of their kids, random status updates and also people fighting about political issues,” Walsh said, using the Gen Z term for things that are definitely not cool.
The once-cool social media platform born before the iPhone is approaching two decades in existence. For those who came of age around the time Mark Zuckerberg launched thefacebook.com from his Harvard dorm room in 2004, it’s been inextricably baked into daily life — even if it’s somewhat faded into the background over the years.

Facebook faces a particularly odd challenge. Today, 3 billion people check it each month. That’s more than a third of the world’s population. And 2 billion log in every day. Yet after two decades, it still finds itself in a battle for relevance, and for its future.
For younger generations, whether they signed up in middle school or are in middle school now, it’s decidedly not the place to be. Without this trend-setting demographic, Facebook, still the main source of revenue for parent company Meta, risks fading into the background — utilitarian but boring, like email.
It wasn’t always like this. For nearly a decade, Facebook was the place to be, the cultural touchstone, the thing constantly referenced in daily conversations and late-night TV, its founding even the subject of a Hollywood movie. Rival MySpace, which launched only a year earlier, quickly became outdated as the cool kids flocked to Facebook. It didn’t help MySpace’s fate that it was sold to stodgy old News Corp. in 2005.
“It was this weird combination…no one knew how technology worked, but in order to have a MySpace, we all needed to become mini coders. It was so stressful,” said Moira Gaynor, 28. “Maybe that’s even why Facebook took off. Because compared to MySpace it was this beautiful, integrated, wonderful engagement area that we didn’t have before and we really craved after struggling with MySpace for so long.”
Positioning himself as a visionary, Zuckerberg refused to sell Facebook and pushed his company through the mobile revolution. While some rivals emerged — remember Orkut? — they generally petered out as Facebook soared, seemingly unstoppable despite scandals over user privacy and a failure to address hate speech and misinformation adequately. It reached a billion daily users in 2015.
Debra Aho Williamson, an analyst with Insider Intelligence who’s followed Facebook since its early days, notes that the site’s younger users have been dwindling but doesn’t see Facebook going anywhere, at least not any time soon.
“The fact that we are talking about Facebook being 20 years old, I think that is a testament of what Mark developed when he was in college. It’s pretty incredible,” she said. “It is still a very powerful platform around the world.”
AOL was once powerful too, but its user base has aged, and now an aol.com email address is little more than a punchline about technologically illiterate people of a certain age.
Tom Alison, who serves as the head of Facebook (Zuckerberg’s title is now Meta CEO), sounded optimistic when he outlined the platform’s plans to lure in young adults in an interview with The Associated Press.
“We used to have a team at Facebook that was focused on younger cohorts, or maybe there was a project or two that was dedicated to coming up with new ideas,” Alison said. “And about two years ago we said no — our entire product line needs to change and evolve and adapt to the needs of the young adults.”
He calls it the era of “social discovery.”
“It’s very much motivated by what we see the next generation wanting from social media. The simple way that I like to describe it is we want Facebook to be the place where you can connect with the people you know, the people you want to know and the people that you should know,” Alison said.
Artificial intelligence is central to this plan. Just as TikTok uses its AI-driven algorithm to show people videos they didn’t know they wanted to see, Facebook is hoping to harness its own powerful technology to win back the hearts and eyeballs of young adults. Reels, the TikTok-like videos Facebook and Instagram users are bombarded with when they log into either app, are also key. And, of course, private messaging.
“What we are seeing is more people wanting to share reels, discuss reels, and we’re starting to integrate messaging features back into the app to again allow Facebook to be a place where not only do you discover great things that are relevant to you, but you share and you discuss those with people,” Alison said.
-
In February 2023, two cases were argued before the Supreme Court that could change how social media is used forever. The first case, Twitter v. Taamneh, asks the justices to decide whether a social media company's general knowledge that terrorist content is on its site is enough to establish that the company aided and abetted terrorist actions. This case challenges how Section 2333 of the Anti-Terrorism Act is currently being interpreted.
The second case, Gonzalez v. Google, challenges protections provided by Section 230 of the Communications Decency Act, one of the few laws that directly address free speech online. The case asks whether social media companies can be held liable for the information hosted on their sites and, more specifically, the information their algorithms recommend to users.
Decisions for both cases are expected to be handed down sometime during the summer of 2023. In advance of those announcements, Stacker spoke with Aaron Mackey of the Electronic Frontier Foundation and consulted government sources to investigate how Supreme Court decisions and changes to Section 230 might impact social media companies, their users, and people's ability to disseminate sensitive information.
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
That sentence from Section 230 sums up the gist of the law, which was passed in 1996. Essentially, the host of a website online—be it a small blog or tech giant—cannot be held liable for the content published on that site by other people or entities. For example, if an individual posts a libelous comment on Facebook, that individual can be sued for libel, but Facebook's parent company, Meta, cannot.
Currently, this protection of information hosts allows for increased free speech online. Website hosts, such as social media companies, don't have to worry about being legally liable for their users' actions, giving users more leeway with the type of content they can post. This not only protects people's ability to use social media as they please but, more consequentially, encourages the widespread sharing of important information that may be sensitive. This could include first-person accounts of protests, reporting about acts of terrorism, discussions about police brutality, and much more.
EFF's Aaron Mackey did not mince words on the swiftness and surety of the immediate impact, telling Stacker, "The rules of the internet will be written by lawyers concerned about liability, rather than by the platforms trying to provide diverse and new ways to allow users to create content."
This shift in both perspective and legal exposure around content curation could have widespread impacts on the end-user experience. According to Mackey, this might include more posts being taken down, increased content warnings or flags, delays in posts becoming public so they can be reviewed in depth, and a large share of the type of content that currently lives on social media not being allowed at all.
If protections for social media companies wane, they will likely change their moderation tactics to filter out any information that may cause a lawsuit. As a result, it may become harder to share information about sensitive topics.
As Mackey explained: "[We're] talking about specifically news events, journalism, basic sharing of real-world, horrible things that have happened. We're talking about the ability of human rights organizations to document atrocities because all of those could be construed as … distributing the speech of a terrorist organization or otherwise aiding and abetting [instead of] having their message come across that someone is reflecting and documenting the reality of the fact that there are terrorist organizations in our world and that they commit horrible acts of terror."
Though it's impossible to know the exact outcome of a decision before it is made, Mackey explained that the liability for content may fall on the moderators of specific groups on sites like Reddit and Mastodon. In response to Elon Musk's purchase of Twitter, many users have migrated to such platforms to build their communities outside of Twitter's volatility, causing an unexpected burst of growth. However, because these platforms are not centralized, moderators receive no training or compensation for their work.
Moreover, in the wake of decreased protections, these moderators could be sued if inappropriate or potentially dangerous content was found on their servers. "What are the incentives to even continue being a moderator for Reddit or to spin up a server on Mastodon?" Mackey posited. "I think there are legitimate questions about whether someone will feel like it's worth their risk to do that without taking steps and having resources that make [the platforms] look more like a centralized, large-scale platform."