
Decision Maker


Dear Mrs. Heath,

I am deeply sorry for the tragedy you and your family are experiencing. From the video you shared, it's clear that your husband was a caring and noble man and that you loved each other very much. I'm a product manager at Facebook on the team that works on memorialization, and I wanted to reach out to let you know that I'm sorry that your husband's account being memorialized was surprising and upsetting. Our intention is to offer support in times of loss, not to add to the grief that people experience.

As you now know, we offer a legacy option to give people a choice about what they'd like to have happen to their accounts after they pass away. If someone chooses a legacy contact, then after the account is memorialized, the legacy contact will be able to care for the account in a few specific ways, like changing the profile picture and pinning a post at the top of the Timeline. The legacy contact is not able to log into the account, see private messages, or post as the person.

When a person's Timeline is memorialized (regardless of whether or not there is a legacy contact for it), it stays on Facebook as a place to come together to honor and celebrate the person's life. Friends can continue to see all the photos and posts they could see previously, and they can continue to post to the Timeline to share thoughts and memories, if they were able to do so before. No one can log in to the account.

We do not add a legacy contact posthumously in cases when someone did not select one themselves. Even when there are next of kin, Facebook is not in a position to know what the person would have wanted, and we have a responsibility to respect the choices and the privacy of each account holder, even after they have died. Similarly, we do not un-memorialize accounts, as part of protecting the privacy and the security of the accounts of people who have passed away. Please know that these decisions are not made lightly; this is an area that is incredibly sensitive, and the team that develops our policies thinks carefully about how to meet the needs of our global community.

Kind regards,
Vanessa

To Julie and everyone who signed the petition,

We place great value on feedback from our community, and we first want to thank all of you for taking the time to share your thoughts with us.

Terrorists, their propaganda, and those who praise them have no place on Facebook. Our Community Standards (https://www.facebook.com/communitystandards/) are very clear on this point, and we do everything in our power to ensure that terrorists and terrorist groups cannot use our site.

Our goal is to let every one of our users express themselves and share the things that matter to them with the people they care about. For that to be possible, our users need to know that we are working hard to keep them safe on Facebook. That is why we have clear, strict rules, the Community Standards, that define what is unacceptable on our platform. These rules explicitly ban all pro-terrorist content.

Our most powerful tool remains our especially active and vigilant community of 1.5 billion people. With billions of new posts shared on Facebook every day, we give our users simple reporting tools alongside every piece of content visible on Facebook so they can alert us to anything that strikes them as inappropriate. And that is what they do. When content is reported to us, it is reviewed by our teams of experts, specifically trained for this task and available 24 hours a day, 7 days a week. For France, we have a French-speaking team. These teams naturally treat terrorism-related content as a top priority. We remove as quickly as possible any content, person, or group spreading hateful or violent speech or praising terrorist groups or acts. Furthermore, when we remove pro-terrorist content or accounts, our teams use dedicated tools to detect other associated accounts carrying similar messages.

When tragic events take place in the world, as with the Paris attacks, we mobilize additional resources so that we can act quickly against any content that violates the rules of our service. For example, during the attacks of November 13, we quickly reached out to the relevant government officials, NGOs, and media to obtain real-time information and be in a position to act fast. Many of our employees, particularly our French speakers, worked in shifts day and night to keep up with the sharp rise in reports our community was sending us during those difficult moments.

This is complex work, and we are aware that we sometimes make mistakes. We are therefore constantly looking for ways to improve our responsiveness and accuracy in these situations. We have strengthened our team, especially its language capabilities, so that we can respond to crises around the world faster and more effectively.

As part of this effort, we have deepened our exchanges with experts who can advise us on these issues, and we follow events around the world very closely. We maintain a close dialogue with NGOs, industry partners, academics, and government officials on the best ways to fight terrorism and terrorist propaganda on the web. As many governments and academics have pointed out, it is often difficult to identify new terrorist groups or individuals because they operate in a constantly shifting landscape. We do our best to monitor the emergence of these groups by maintaining close relationships with the most recognized experts in the field and by listening carefully to feedback from our community.

Whenever dramatic events occur, people use Facebook to share their reactions. Posts from around the world often express frustration and despair, but also empathy and a desire to help. Our community uses Facebook to share difficult news, but also to support one another, express solidarity, and organize support for victims and other vulnerable people. For example, after the Charlie Hebdo terrorist attacks in Paris, we saw a multitude of people use Facebook to organize offline events to show their solidarity against terrorism. More than 7 million people answered "yes" to take part in those gatherings.

Naturally, when people discuss events like these to denounce or condemn them, or to express their solidarity or their horror, they may sometimes share shocking content. It is revolting to see the photo of a refugee child lying lifeless on a beach. Yet that same image can mobilize people and move them to take action to protect other refugees. Many people in unstable regions endure horrific acts that are often never captured on camera. Facebook gives these people a voice, and we want to protect that voice. If Facebook blocked all shocking content, we would prevent the media, charities, and many others from bearing witness to what is happening in the world, from driving change, and from supporting victims. For this reason, in certain cases, we allow our users to comment on this type of event and to share certain violent or shocking photos, on the sole condition that they clearly express an intent to raise awareness and condemn the violence. In every case, we remove violent images posted to promote or glorify violence itself or those who inflict it. Our goal is to protect the sharing that plays such an important part in the fight against terrorism, while protecting our service from abuse by those who want to glorify violence and terror.

We know that this is a grave and important subject, one that stirs debate and concerns us all. Our goal at Facebook is to keep safe everyone who uses our services and the internet legitimately, whether to share the content they care about or to fight hatred and violence by sharing content that condemns these barbaric acts, against the tiny minority who glorify hatred and violence. Rest assured that, faced with this major challenge, our teams are fully mobilized and doing everything in their power to protect this special place for conversation from hateful, violent, or abusive speech.

Kind regards,
Monika Bickert
Head of Global Content Policy, Facebook

To Julie and everyone who signed the petition,

We value feedback from our community and want to thank you for taking the time to share your thoughts with us.

There is no place on Facebook for terrorists, terrorist propaganda or the praising of terror. Our Community Standards (https://www.facebook.com/communitystandards) make this clear, and we work aggressively to ensure that we do not have terrorists or terror groups using the site.

Our goal is to give people a place to share and connect with one another. For that to happen, they need to know we are working to keep them safe on Facebook. That's why we have strict rules that outline what is not acceptable. This includes our clear ban on anything that promotes terrorism.

The best tool we have to keep terrorist content off Facebook is our vigilant community of more than 1.5 billion people, who are very good at letting us know when something is not right. There are billions of new posts on Facebook every day, so we make it easy for people to flag content for us, and they do. Every piece of content on Facebook can be reported to our teams directly through the site. When content is reported to us, it is reviewed by a highly trained global team with expertise in dozens of languages. The team reviews reports around the clock and prioritizes any terrorism-related reports for immediate review. We remove anyone or any group who has a violent mission or who has engaged in acts of terrorism. We also remove any content that expresses support for these groups or their actions. And we don't stop there: when we find terrorist-related material, we look for and remove associated violating content as well.

When a crisis happens anywhere in the world, we organize our employees and, if necessary, shift resources to ensure that we are able to respond quickly to any violating content on the site. For instance, in the wake of the recent attacks in Paris, we reached out immediately to NGOs, media, and government officials to get the latest information so that we were prepared to act quickly. Many of our employees, especially our French speakers, worked around the clock to respond to the spike in reports from our community. This is not an easy job; we know we can make mistakes, and we are always working to improve our responsiveness and accuracy. We have expanded our team and increased our language capabilities so that we can respond to crises around the world faster and more effectively.

As part of this effort, we have expanded our engagement with experts and follow world events closely. We remain in close contact with NGOs, industry partners, academics, and government officials about the best ways to keep Facebook free of terrorists and terror-promoting content. As governments and academics have pointed out, it is often hard to identify new terror groups and individuals because the landscape is constantly changing. We do our best to monitor emerging groups or trends by maintaining relationships with experts in the field and listening closely to our community.

Every time there is a terror attack, people come to Facebook to share their reactions. These posts from people around the world often express frustration and despair, but also empathy and a desire to help. Our community uses Facebook to share devastating news, but also to console one another, express solidarity, and mobilize support for victims and other vulnerable people. For instance, after the Charlie Hebdo attacks in Paris, we saw many people use Facebook to plan offline events to stand in solidarity against terrorism.

Of course, when people talk about these events for good reasons, they sometimes share upsetting content. It is horrifying to see a photograph of a refugee child lying lifeless on a beach. At the same time, that image may mobilize people to take action to help other refugees. Many people in volatile regions are suffering unspeakable horrors that fall outside the reach of media cameras. Facebook provides these people a voice, and we want to protect that voice. If Facebook blocked all upsetting content, we would inevitably block the media, charities and others from reporting on what is happening in the world. We would also inevitably interfere with calls for positive change and support for victims. For this reason, we allow people to discuss these events and share some types of violent images, but only if they are clearly doing so to raise awareness or condemn violence. However, we remove any graphic images shared to promote or glorify violence or the perpetrators of violence.

We understand that this is an important subject that concerns all of us. Our goal at Facebook is to protect the sharing and connecting that is such an important part of responding to terrorism, while preventing our service from being abused by those who would glorify violence or terror. Our teams are committed and doing everything possible to keep Facebook safe.

Yours sincerely,
Monika Bickert
Head of Global Product Policy

Hello,

We saw that you had signed the petition asking Facebook to 'stop political blocking' and wanted to reach out to you directly to explain how we operate. We hope this will bring more clarity to this issue. We first want to reiterate that we strongly support people being able to engage in political debate on Facebook, and we recognize the importance of these debates to people, especially in times of conflict.

We have a single set of Community Standards around the world that help keep Facebook safe for everyone. These standards prohibit content that contains nudity, hate speech, specific threats of violence or bullying. (For more details visit www.facebook.com/communitystandards.) When content is reported to us, we investigate to see if it breaks our standards, and if it does, we take it down. It doesn't matter if something is reported once or 100 times; we only remove content that goes against these standards. We have learnt over the years that people can run campaigns to generate many reports about content they don't like, and we have designed our systems to prevent this from leading to the removal of acceptable content.

All of our reports are handled in global centres where multilingual, highly trained teams deal with requests from all over the world in an impartial way. Within these teams we have people who speak both Russian and Ukrainian who can help give specific language context. And we have quality control systems in place to ensure that reports are decided on correctly according to the common standards. Of course, with more than 1.4 billion people around the world using Facebook, there will be the rare occasion where our teams don't get it right. When this happens and we are alerted to it, we will quickly work to restore or remove content as appropriate.

When we remove a piece of content, we notify the person who posted it with an explanation of why it broke our rules, so they know not to post such content again. If people repeatedly post content that breaks our standards, we may block them from posting content for a number of days. As a last resort, if people seem to be ignoring the standards altogether, we sometimes have to suspend their accounts. When people raise questions about our decisions, we will look at them again.

We have reviewed the decisions made on the accounts in the petition and others that were brought to our attention. We have found that the majority of actions were correctly taken because of content that breached our standards. Some accounts repeatedly posted violating content, leading to them being temporarily blocked from posting new content. We also found a small number of accounts where we had incorrectly removed content. In each case, this was due to language that appeared to be hate speech but was being used in an ironic way. In these cases, we have restored the content and would like to apologise for any inconvenience caused. We hope people will recognize that it can sometimes be hard to make decisions when we see negative terms being used about national groups; we usually get this right, but we will make mistakes from time to time. We will continue to review the decisions we have made to ensure that they are as accurate as possible. We would also encourage people to read our Community Standards to minimize the risk of there being problems with content they have posted.

This is a big challenge, and one that we are working on every day to keep improving the experience people have using our service. We'd like to thank you for taking the time to share your feedback with us. For more information please visit: https://www.facebook.com/communitystandards

Yours sincerely,
Thomas Myrup Kristensen
Director of Policy, Facebook, Nordics, Eastern Europe and Russia

We’ve heard from our community that listing “feeling fat” as an option for status updates could reinforce negative body image, particularly for people struggling with eating disorders. So we’re going to remove “feeling fat” from the list of options. We’ll continue to listen to feedback as we think about ways to help people express themselves on Facebook.

On Facebook, auto-play settings are easily customizable on both mobile and desktop; we want to make sure that people have control over when videos auto-play in News Feed. If you don't want videos to auto-play, you can turn this feature off in Settings on both mobile and desktop. On mobile, there is also an option to have videos auto-play only when you are on WiFi, which does not use mobile data. See the links below for instructions on how to adjust your auto-play settings.

On mobile: https://www.facebook.com/help/mobile-touch/633446180035470
On desktop: https://www.facebook.com/help/1406493312950827

We also want to make sure that videos consume as little data as possible. If you scroll past an auto-play video in News Feed, the data used is minimal. If you stop to watch a video for a longer period of time, you will use more data.