"Facebook tries to block tool aimed at promoting transparency around political ads
The standoff comes as campaigns ramp up spending ahead of Election Day.
Facebook has told researchers at New York University to stop using a digital tool that tracks how people are targeted with political ads ahead of the Nov. 3 election.
The demand, sent last week and confirmed by NYU on Friday, centers on the academics' use of a web browser plug-in that gives Facebook users a way to share specific political ads they are seeing on the site.
Political advertisers primarily target their ads to specific demographic groups, so the NYU tool, which collects roughly 16,000 ads each week, allows researchers to see how campaigns and other groups are crafting messages to voters based on race, age, location or other criteria.
In its notice to NYU, Facebook said that the use of the plug-in broke the company's terms of service, and ordered the academics to stop using the tool by Nov. 30 or face "additional enforcement action." NYU said it will not take it down.
"We're not going to comply with it," Laura Edelson, an NYU researcher who is part of the project, told POLITICO. "What we are doing is perfectly legal and is in the public interest."
The standoff comes as social media giants are facing increased scrutiny over how political operatives are using their platforms to reach voters online, often in messages that use people's personal data to pinpoint them with individualized ads.
The plug-in is part of a broader project from the university aimed at archiving all political ads from official campaigns and partisan groups that have spent hundreds of millions of dollars, collectively, to promote Republican and Democratic candidates in the 2020 election.
In that separate relationship with Facebook, NYU researchers can collect some granular information on political ads. The system went down for several hours on Oct. 22 because of a technical glitch, though Facebook and researchers said the blackout was not related to the academics' plug-in.
Edelson, the NYU researcher, said her tool is particularly crucial to understanding how different political groups are targeting voters in the 2020 election, which is being fought online more than previous presidential elections because of the Covid-19 pandemic.
"When things get through the cracks and vulnerable communities are targeted with harmful content, they have a right to know," she said. "People are concerned about how they are being targeted, and Facebook has not done a good job in providing that level of transparency."
More here:
www.politico.com
Good for them for not taking it down. At the very least, they can keep it up until after the election before possibly facing penalties. The way Facebook has let advertisers target communities matters here: in addition to admittedly throttling left-leaning news, the company has helped the right target its political ads.
With Zuck's Blessing, Facebook Quietly Stymied Traffic to Left-Leaning News Outlets: Report
"When Facebook tweaked its newsfeed algorithm in 2017 to reduce the visibility of political news, the companyās engineers intentionally designed the system to disproportionately impact left-leaning outlets, effectively choking off their traffic in the process.
According to a Wall Street Journal report this week, Facebook bigwigs at the time were concerned about how these changes would affect right-leaning news outlets and wanted to avoid adding fuel to critics' argument that the platform has an anti-conservative bias. However, in its attempt to appear unbiased, the company evidently overcorrected (which it has a history of doing). Facebook's engineers overhauled the update to affect left-leaning sites more than previously planned, and CEO Mark Zuckerberg himself OK'd the redesign, sources told the Journal. The changes weren't aimed at any particular outlet, the company later said."
gizmodo.com
It is one thing for them to do it; it is another to lie about it. It is their private business, and they can do what they want with their site, but they should be transparent about it rather than lie to people about being unbiased.