BERLIN — Assia Gorban was 7 years old when the Germans occupied her hometown of Mogilev-Podolsky in Ukraine. The Jewish girl and her family were first imprisoned in a ghetto on the outskirts of town and later forced onto a cattle car that took them to the Pechora concentration camp in 1941.

Michele Tantussi, Associated Press
Holocaust survivor Assia Gorban, left, and her granddaughter Ruth Gorban pose during an interview with The Associated Press on April 3 in Berlin, Germany.
After several failed attempts, Gorban, her mother and her younger brother managed to escape in 1942. They spent the rest of the war living under false identities until they were liberated in 1944.
Sitting in her apartment in Berlin, where she still lives on her own at 89, Gorban vividly remembers the horrific details of her time in the camp and of hiding from the Nazis, who wanted to kill her because she was Jewish.
She likes to share her memories with her granddaughter, 19-year-old Ruth Gorban, a university student, who also lives in Berlin and visits her frequently at home.
“My grandmother is amazing,” said Ruth, sitting next to Gorban on the couch. “I even invited her to my school, so that everyone in my class could hear from her personally about the Holocaust.”
Both Assia and Ruth also participated in the new digital campaign called “Our Holocaust Story: A Pledge to Remember,” which was launched by the New York-based Conference on Jewish Material Claims Against Germany, also referred to as the Claims Conference.

Michele Tantussi, Associated Press
Holocaust survivor Assia Gorban shows a picture of her family during an interview with The Associated Press on April 3 in Berlin, Germany. Over 100 Holocaust survivors and their descendants are participating in a new social media campaign that illustrates the importance of passing on the Holocaust survivors’ testimonies as their numbers dwindle.
Six million Jews, along with people from other persecuted groups, were murdered by the Nazis and their henchmen during the Holocaust.
Today, about 240,000 survivors are still alive, living in Europe, Israel, the U.S. and elsewhere.
The campaign by the Claims Conference features survivors and their descendants from around the globe and illustrates the importance of passing on the Holocaust survivors’ testimonies to younger family members as the number of survivors dwindles.
“We are doing this new social media campaign because survivors are dying,” said Greg Schneider, the executive vice president of the Claims Conference.
“The stories that they hold, the wisdom and knowledge that they can share is too important, too vital for society, particularly in these challenging times, to let it die with them,” Schneider said in a phone interview from New York with The Associated Press.
More than 100 Holocaust survivors and their families are participating in the campaign, all of whom will be featured in posts across the Claims Conference’s social media platforms every week throughout the year. Survivor stories will be shared on Facebook, Instagram, Twitter, and TikTok, using the hashtag #OurHolocaustStory.
“When we see a Holocaust survivor with their family members, it sends a powerful message — they didn’t just survive the Holocaust, they went on to live, to build a family, a family that would not exist if they had not survived,” Schneider added.
Assia Gorban was liberated by the Soviet Union’s Red Army in 1944. She later moved to Moscow, where she became a schoolteacher. While she loved the Russian capital, especially its vibrant cultural scene, she and her husband decided to emigrate to Germany in 1992, seeking more financial stability and following her son, who had moved there earlier.
Even at her age, Gorban is an active member of Berlin’s Jewish community, volunteering weekly at the Jewish nursing home and talking to high school students about her life.
“I enjoy speaking in school and helping old people at the nursing home — it keeps me fit,” Gorban said with a cheeky smile, blissfully ignoring the fact that she’s turning 90 in August.
One reason Ruth Gorban decided to participate in the campaign with her grandmother was her concern about the reemergence of antisemitism in Germany and elsewhere.

Michele Tantussi, Associated Press
Holocaust survivor Assia Gorban's granddaughter Ruth Gorban shows her necklace with a Star of David during an interview with The Associated Press on April 3 in Berlin, Germany.
Pulling her necklace with a Star of David pendant from underneath her sweater, the young woman with long dark hair explained that she prefers to hide it when she’s in public.
“Berlin has a reputation for tolerance and diversity — but when it comes to the acceptance of Jews, that’s unfortunately not true,” she said.
Still, hearing from her grandmother about the Holocaust made Ruth Gorban very much aware of her own Jewishness.
“I’m proud to be Jewish,” she said. “It’s a beautiful religion and I will definitely pass it on to my children when I’m a mother one day.”
Holocaust survivors, descendants join forces on social media
Anna Moneymaker // Getty Images
In February 2023, two cases were argued before the Supreme Court that could change how social media is used forever. The first, Twitter v. Taamneh, asks the justices to decide whether a social media company's general knowledge that terrorist content is on its site is enough to establish that the company aided and abetted terrorist acts. The case challenges how Section 2333 of the Anti-Terrorism Act is currently interpreted.
The second case, Gonzalez v. Google, challenges protections provided by Section 230 of the Communications Decency Act, one of the few laws that directly address free speech online. The case asks whether social media companies can be held liable for the information hosted on their sites and, more specifically, the information their algorithms recommend to users.
Decisions in both cases are expected to be handed down sometime during the summer of 2023. Ahead of those rulings, Stacker spoke with Aaron Mackey of the Electronic Frontier Foundation and consulted government sources to examine how the Supreme Court's decisions and changes to Section 230 might affect social media companies, their users, and people's ability to share sensitive information.

Chinnapong // Shutterstock
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
That sentence from Section 230 sums up the gist of the law, which was passed in 1996. Essentially, the host of a website online—be it a small blog or tech giant—cannot be held liable for the content published on that site by other people or entities. For example, if an individual posts a libelous comment on Facebook, that individual can be sued for libel, but Facebook's parent company, Meta, cannot.
Currently, this protection for information hosts allows for greater free speech online. Website hosts, such as social media companies, don't have to worry about being held legally liable for their users' actions, which gives users more leeway in the type of content they can post. This not only protects people's ability to use social media as they please but, more consequentially, encourages the widespread sharing of important information that may be sensitive. That could include first-person accounts of protests, reporting on acts of terrorism, discussions of police brutality, and much more.
JIM WATSON/AFP // Getty Images
EFF's Aaron Mackey did not mince words about how swift and certain the immediate impact would be if those protections were rolled back, telling Stacker, "The rules of the internet will be written by lawyers concerned about liability, rather than by the platforms trying to provide diverse and new ways to allow users to create content."
This shift in perspective, as well as legal bearing, concerning content curation could have widespread impacts on the end-user experience. According to Mackey, this might include more posts being taken down, increased content warnings or flags, delays in posts becoming public so they can be reviewed in-depth, and a large share of the type of content that currently lives on social media not being allowed at all.
JIM WATSON/AFP // Getty Images
If protections for social media companies wane, they will likely change their moderation tactics to filter out any information that may cause a lawsuit. As a result, it may become harder to share information about sensitive topics.
As Mackey explained: "[We're] talking about specifically news events, journalism, basic sharing of real-world, horrible things that have happened. We're talking about the ability of human rights organizations to document atrocities because all of those could be construed as … distributing the speech of a terrorist organization or otherwise aiding and abetting [instead of] having their message come across that someone is reflecting and documenting the reality of the fact that there are terrorist organizations in our world and that they commit horrible acts of terror."
chrisdorney // Shutterstock
Though it's impossible to know the exact outcome of a decision before it is made, Mackey explained that the liability for content may fall on the moderators of specific groups on sites like Reddit and Mastodon. In response to Elon Musk's purchase of Twitter, many users have migrated to such platforms to build their communities outside of Twitter's volatility, causing an unexpected burst of growth. However, because these platforms are not centralized, moderators receive no training or compensation for their work.
Moreover, in the wake of decreased protections, these moderators could be sued if inappropriate or potentially dangerous content was found on their servers. "What are the incentives to even continue being a moderator for Reddit or to spin up a server on Mastodon?" Mackey posited. "I think there are legitimate questions about whether someone will feel like it's worth their risk to do that without taking steps and having resources that make [the platforms] look more like a centralized, large-scale platform."