Get Jacob Cohen removed from Syracuse University for multiple rape allegations

"Remember, we were all born with wings. In times of doubt spread them. "
-Anita Kanitz

“Every woman is a sister of another woman born within or outside her race. Women have the power to unite the world for they have one link to the universe--their caring and loving hearts.”
― Princess Maleiha Bajunaid Candao

There has never been any doubt that a group of thoughtful, brave, and strong women and girls can change the world; indeed, it is the only thing that ever has. We women and girls are strong and can change the world for the better, because without us nothing works. It must be our ultimate goal that female babies, children, girls, and women can live fearless, free, and self-determined lives anywhere in the world. It cannot be that, in the 21st century, men continue to hold power over their lives, their bodies, their freedom, their education, their occupation, and their birth control. It cannot be that girls and women worldwide are still told, through brainwashing, that they can only be happy with men and daily intercourse in their lives. The media report daily where all of this leads: femicide, infanticide, domestic and sexual violence, rape culture, mass, gang, and war rapes, misogyny, FGM, forced and child marriages, forced prostitution, trafficking in women and girls, sexual slavery, acid attacks, honor killings, dowry murders, widow murders, witch hunts, sex murders, domestic murders, sexual and sadistic stalking, slavery, forced veiling, catcalling, sexual harassment, hate speech and persecution in the name of man-made religions, verbal violence, hate speech and sexism in society and the media, victim blaming, lynching, lashing, executions (stoning and burning alive) and imprisonment of female victims, even children, and the compulsion to marry, to have sexual intercourse, to bear children, and to give up education and careers because of men.

Unfortunately, for 16 years I have been affected by cyberstalking, cyberbullying, identity theft on the Internet, hate speech and defamation on social media, and constant hacker attacks. The culprit is a psychopathic, woman-hating former tenant, together with his friends, relatives, and acquaintances. These people make a sport of chasing and harassing women and girls on the Internet. My female friends, acquaintances, and colleagues were also affected. I was even sent real rape pornography, sadistic BDSM pornography, and rape and death threats. These people must finally receive long prison sentences. It cannot be that such heinous criminals are allowed to carry on.

A mirror of the worldwide situation: misogyny on social media:

The Unsafety Net: How Social Media Turned Against Women

Under the banner of free speech, companies like Facebook, Twitter, and YouTube have been host to rape videos and revenge porn—which makes female users feel anything but free!!!

In December 2012, an Icelandic woman named Thorlaug Agustsdottir discovered a Facebook group called “Men are better than women.” One image she found there, Thorlaug wrote to us this summer in an email, “was of a young woman naked chained to pipes or an oven in what looked like a concrete basement, all bruised and bloody. She looked with a horrible broken look at whoever was taking the pic of her curled up naked.” Thorlaug wrote an outraged post about it on her own Facebook page.
Before long, a user at “Men are better than women” posted an image of Thorlaug’s face, altered to appear bloody and bruised. Under the image, someone commented, “Women are like grass, they need to be beaten/cut regularly.” Another wrote: “You just need to be raped.” Thorlaug reported the image and comments to Facebook and requested that the site remove them.

“We reviewed the photo you reported,” came Facebook’s auto reply, “but found it does not violate Facebook’s Community Standards on hate speech, which includes posts or photos that attack a person based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability, or medical condition.”

Instead, the Facebook screeners labeled the content “Controversial Humor.” Thorlaug saw nothing funny about it. She worried the threats were real.
Some 50 other users sent their own requests on her behalf. All received the same reply. Eventually, on New Year’s Eve, Thorlaug called the local press, and the story spread from there. Only then was the image removed.

In January 2013, Wired published a critical account of Facebook’s response to these complaints. A company spokesman contacted the publication immediately to explain that Facebook screeners had mishandled the case, conceding that Thorlaug’s photo “should have been taken down when it was reported to us.” According to the spokesman, the company tries to address complaints about images on a case-by-case basis within 72 hours, but with millions of reports to review every day, “it’s not easy to keep up with requests.” The spokesman, anonymous to Wired readers, added, “We apologize for the mistake.”
If, as the communications philosopher Marshall McLuhan famously said, television brought the brutality of war into people’s living rooms, the Internet today is bringing violence against women out of it. Once largely hidden from view, this brutality is now being exposed in unprecedented ways. In the words of Anne Collier, co-director of ConnectSafely.org and co-chair of the Obama administration’s Online Safety and Technology Working Group, “We are in the middle of a global free speech experiment.” On the one hand, these online images and words are bringing awareness to a longstanding problem. On the other hand, the amplification of these ideas over social media networks is validating and spreading pathology.

In other words, social media is more symptom than disease: A 2013 report from the World Health Organization called violence against women “a global health problem of epidemic proportion,” from domestic abuse, stalking, and street harassment to sex trafficking, rape, and murder. This epidemic is thriving in the petri dish of social media.

While some of the aggression against women online occurs between people who know one another, and is unquestionably illegal, most of it happens between strangers. Earlier this year, Pacific Standard published a long story by Amanda Hess about an online stalker who set up a Twitter account specifically to send her death threats.

Across websites and social media platforms, everyday sexist comments exist along a spectrum that also includes illicit sexual surveillance, “creepshots,” extortion, doxxing, stalking, malicious impersonation, threats, and rape videos and photographs. The explosive use of the Internet to conduct human trafficking also has a place on this spectrum, given that three-quarters of trafficked people are girls and women.

A report, “Misogyny on Twitter,” released by the research and policy organization Demos in June 2014, found more than 6 million instances of the word “slut” or “whore” used in English on Twitter between December 26, 2013, and February 9, 2014. (The words “bitch” and “cunt” were not measured.) An estimated 20 percent of the tweets in the study appeared, to the researchers, to be threatening. An example: "@XXX @XXX You stupid ugly fucking slut I’ll go to your flat and cut your fucking head off you inbred whore."

In the summer of 2014, at VidCon, an annual nationwide convention held in Southern California, women vloggers shared an astonishing number of examples. The violent threats posted beneath YouTube videos, they observed, are pushing women off of this and other platforms in disproportionate numbers. When Anita Sarkeesian launched a Kickstarter to help fund a feminist video series called Tropes vs. Women, she became the focus of a massive and violently misogynistic cybermob. Among the many forms of harassment she endured was a game in which thousands of players “won” by virtually bludgeoning her face. In late August, she contacted the police and had to leave her home after receiving a series of serious violent online threats.

Danielle Keats Citron, law professor at the University of Maryland and author of the recently released book Hate Crimes in Cyberspace, explained, “Time and time again, these women have no idea often who it is attacking them. A cybermob jumps on board, and one can imagine that the only thing the attackers know about the victim is that she’s female.” Looking at 1,606 cases of “revenge porn,” where explicit photographs are distributed without consent, Citron found that 90 percent of targets were women. Another study she cited found that 70 percent of female gamers chose to play as male characters rather than contend with sexual harassment.

This type of harassment also fills the comment sections of popular websites. In August 2014, employees of the largely female-staffed website Jezebel published an open letter to the site’s parent company, Gawker, detailing the professional, physical, and emotional costs of having to look at the pornographic GIFs maliciously populating the site’s comment sections every day. “It’s like playing whack-a-mole with a sociopathic Hydra,” they wrote, insisting that Gawker develop tools for blocking and tracking IP addresses. They added, “It’s impacting our ability to do our jobs.”

For some, the costs are higher. In 2010, 12-year-old Amanda Todd bared her chest while chatting online with a person who’d assured her that he was a boy, but was in fact a grown man with a history of pedophilia. For the next two years, Amanda and her mother, Carol Todd, were unable to stop anonymous users from posting that image on sexually explicit pages. A Facebook page, labeled “Controversial Humor,” used Amanda’s name and image—and the names and images of other girls—without consent. In October 2012, Amanda committed suicide, posting a YouTube video that explained her harassment and her decision. In April 2014, Dutch officials announced that they had arrested a 35-year-old man suspected to have used the Internet to extort dozens of girls, including Amanda, in Canada, the United Kingdom, and the United States. The suspect now faces charges of child pornography, extortion, criminal harassment, and Internet luring.

Almost immediately after Amanda shared her original image, altered versions appeared on pages, and videos proliferated. One of the pages was filled with pictures of naked pre-pubescent girls, encouraging them to drink bleach and die. While she appreciates the many online tributes honoring her daughter, Carol Todd is haunted by “suicide humor” and pornographic content now forever linked to her daughter’s image. There are web pages dedicated to what is now called “Todding.” One of them features a photograph of a young woman hanging.

Meanwhile, extortion of other victims continues. In an increasing number of countries, rapists are now filming their rapes on cell phones so they can blackmail victims out of reporting the crimes. In August, after a 16-year-old Indian girl was gang-raped, she explained, “I was afraid. While I was being raped, another man pointed a gun and recorded me with his cellphone camera. He said he will upload the film on the Net if I tell my family or the police.”

In Pakistan, the group Bytes for All—an organization that previously sued the government for censoring YouTube videos—released a study showing that social media and mobile tech are causing real harm to women in the country. Gul Bukhari, the report’s author, told Reuters, “These technologies are helping to increase violence against women, not just mirroring it.”

In June 2014, a 16-year-old girl named Jada was drugged and raped at a party in Texas. Partygoers posted a photo of her lying unconscious, one leg bent back. Soon, other Internet users had turned it into a meme, mocking her pose and using the hashtag #jadapose. Kasari Govender, executive director of the Vancouver-based West Coast Legal Education and Action Fund (LEAF), calls this kind of behavior “cybermisogyny.” “Cyberbullying,” she says, “has become this term that’s often thrown around with little understanding. We think it’s important to name the forces that are motivating this in order to figure out how to address it.”

In an unusually bold act, Jada responded by speaking publicly about her rape and the online abuse that followed. Supporters soon took to the Internet in her defense. “There’s no point in hiding,” she told a television reporter. “Everybody has already seen my face and my body, but that’s not what I am and who I am. I’m just angry.”

After Facebook removed Thorlaug’s altered image and the rape threats, she felt relieved, but she was angry too. “These errors are going to manifest again,” she told Wired, “if there isn’t clear enough policy.”

Yet, at the time of Thorlaug’s report, Facebook did have a clear policy. Its detailed Community Standards for speech, often considered the industry’s gold standard, were bolstered by reporting tools that allowed users to report offensive content, and Thorlaug had used these tools as instructed. But serious errors were still manifesting regularly.

Not long after Thorlaug’s struggle to remove her image, a Facebook user posted a video documenting the gang rape of a woman by the side of a road in Malaysia. The six minutes of graphic footage were live for more than three weeks, during which Facebook moderators declined repeated requests for removal.

Around the same time, another Icelandic woman, Hildur Lilliendahl Viggósdóttir, decided to draw attention to similar problems by creating a page called “Men who hate women,” where she reposted examples of misogyny she found elsewhere on Facebook. Her page was suspended four times—not because of its offensive content, but because she was reposting images without written permission. Meanwhile, the original postings—graphically depicting rape and glorifying the physical abuse of women—remained on Facebook. As activists had been noting for years, pages like these were allowed by Facebook to remain under the category of “humor.” Other “humor” pages live at the time had names like “I kill bitches like you,” “Domestic Violence: Don’t Make Me Tell You Twice,” “I Love the Rape Van,” and “Raping Babies Because You’re Fucking Fearless.”

For online harassers, this is often an overt goal: to silence female community members, whether through sexual slurs or outright threats. It’s little surprise that the Internet has become a powerful tool in intimate partner violence: A 2012 survey conducted by the National Network to End Domestic Violence (NNEDV) found that 89 percent of local domestic violence programs reported victims who were experiencing technology-enabled abuse, often across multiple platforms.

For their part, social media companies often express commitment to user safety, but downplay their influence on the broader culture. Administrators repeatedly explain that their companies, while very concerned with protecting users, are not in the business of policing free speech. As Twitter co-founder Biz Stone phrased it in a post titled “Tweets Must Flow,” “We strive not to remove Tweets on the basis of their content.” The company’s guidelines encourage readers to unfollow the offensive party and “express your feelings [to a trusted friend] so you can move on.”

None of this was of much help to Caroline Criado-Perez, a British journalist and feminist who helped get a picture of Jane Austen on the £10 banknote. The day the Bank of England made the announcement, Criado-Perez began receiving more than 50 violent threats per hour on Twitter. “The immediate impact was that I couldn’t eat or sleep,” she told The Guardian in 2013. She asked Twitter to find some way to stop the threats, but at the time the company offered no mechanism for reporting abuse. Since then, the company has released a reporting button, but its usefulness is extremely limited: It requires that every tweet be reported separately, a cumbersome process that gives the user no way of explaining that she is a target of ongoing harassment. (The system currently provides no field for comments.)

For that reason, when social media companies fail to respond to complaints and requests, victims of online harassment frequently turn to individuals who can publicize their cases. Trista Hendren, an Oregon-based blogger, became an advocate for other women after readers from Iceland, Egypt, Australia, India, Lebanon, and the UK began asking her to write about their experiences. “I was overwhelmed,” she told us. In December 2012, Hendren and several collaborators created a Facebook page called RapeBook where users could flag and report offensive content that the company had refused to take down.

By April 2013, people were using RapeBook to post pictures of women and pre-pubescent girls being raped or beaten. Some days, Hendren received more than 500 anonymous, explicitly violent comments—“I will skull-fuck your children,” for instance. Facebook users tracked down and posted her address, her children’s names, and her phone number and started to call her.

By that time, Hendren had abandoned any hope that using Facebook’s reporting mechanisms could help her. She was able, however, to work directly with a Facebook moderator to address the threats and criminal content. She found that the company sincerely wanted to help. Their representatives discussed the posts with her on a case-by-case basis, but more violent and threatening posts kept coming, and much of the content she considered graphic and abusive was allowed to remain.

Facebook’s people, she said, told her they didn’t consider the threats to her and her family credible or legitimate. Hendren, however, was concerned enough to contact the police and the FBI. The FBI started an investigation; meanwhile Hendren, physically and emotionally spent, suspended her Facebook account. “I was the sickest I have ever been,” she said. “It was really disgusting work. We just began to think, ‘Why are we devoting all our efforts on a volunteer basis to do work that Facebook—with billions of dollars—should be taking care of?’”

Hendren contacted Soraya Chemaly, who continued to press Facebook directly. At the same time, Chemaly and Laura Bates, founder of the Everyday Sexism Project, also began comparing notes on what readers were sending them. Bates was struck by surprising ad placements. At the time, a photo captioned “The bitch didn’t know when to shut up” appeared alongside ads for Dove and iTunes. “Domestic Violence: Don’t Make Me Tell You Twice”—a page filled with photos of women beaten, bruised, and bleeding—was populated by ads for Facebook COO Sheryl Sandberg’s new bestselling book, Lean In: Women, Work, and the Will to Lead.

In early May, Bates decided to tweet at one of these companies. “Hi @Finnair here’s your ad on another domestic violence page—will you stop advertising with Facebook?” Finnair responded immediately: “It is totally against our values and policies. Thanks @r2ph! @everydaysexism Could you send us the URL please so that we can take action?”

Chemaly, Bates, and Jaclyn Friedman, the executive director of Women, Action, and Media, a media justice advocacy group, joined forces and launched a social media campaign designed to attract advertisers’ attention. The ultimate goal was to press Facebook to recognize explicit violence against women as a violation of its own prohibitions against hate speech, graphic violence, and harassment. Within a day of beginning the campaign, 160 organizations and corporations had co-signed a public letter, and in less than a week, more than 60,000 tweets were shared using the campaign’s #FBrape hashtag. Nissan was the first company to pull its advertising dollars from Facebook altogether. More than 15 others soon followed. The letter emphasized that Facebook’s refusal to take down content that glorified and trivialized graphic rape and domestic violence was actually hampering free expression—it was “marginaliz[ing] girls and women, sidelin[ing] our experiences and concerns, and contribut[ing] to violence against us.”

When Emily Bazelon, author of a book and a March 2013 Atlantic story about Internet bullies, visited Facebook’s headquarters, the young men she saw working as moderators were spending roughly 30 seconds assessing each reported post, working through millions of reports a week. Outsourced speech moderation has become a booming industry. Like Facebook’s own moderation process, the operations of these companies are opaque by design.

In late June 2014, the U.S. Supreme Court announced it would hear the case of Anthony Elonis, a man charged on five counts of transmitting threats, who was imprisoned after threatening to kill his wife on Facebook. Elonis insists that his Facebook posts were not real threats but protected speech. Tara Elonis, his estranged wife—possibly aware that most female murder victims are killed by intimate partners—said that there was nothing unthreatening about her husband’s Facebook posts and that they forced her to take necessary, costly precautions. “If I only knew then what I know now,” read one, “I would have smothered your ass with a pillow, dumped your body in the back seat, dropped you off in Toad Creek, and made it look like a rape and murder.”

In late August 2014, Drew Curtis, founder of the content aggregator FARK, announced that the company had added “misogyny” to its moderation guidelines. FARK no longer allows rape jokes or threats. It also prohibits posts that call groups of women "whores" or "sluts," or suggest that a woman who suffered a crime was somehow asking for it. In a note to readers, Curtis wrote, “This represents enough of a departure from pretty much how every other large Internet community operates that I figure an announcement is necessary.” Responding in Slate, Amanda Hess praised FARK’s new policy but also pointed out its limitations: just underneath his announcement, users posted dozens of comments about rape, whores, and “boobies.”

What kind of world is that, and what kind of world is it for all females?

Anita Kanitz, Stuttgart, Germany