AP: 267 million Facebook users' names, phone numbers, and user IDs found on the open internet.

hanimmal

Well-Known Member
https://apnews.com/article/business-media-social-media-fce118b1adfef8f6c51518f71465dd4b
NEW YORK (AP) — Details from more than 500 million Facebook users have been found available on a website for hackers.

The information appears to be several years old, but it is another example of the vast amount of information collected by Facebook and other social media sites, and the limits to how secure that information is.

The availability of the data set was first reported by Business Insider. According to that publication, it has information from 106 countries including phone numbers, Facebook IDs, full names, locations, birthdates, and email addresses.

Facebook has been grappling with data security issues for years. In 2018, the social media giant disabled a feature that allowed users to search for one another via phone number following revelations that the political firm Cambridge Analytica had accessed information on up to 87 million Facebook users without their knowledge or consent.

In December 2019, a Ukrainian security researcher reported finding a database with the names, phone numbers and unique user IDs of more than 267 million Facebook users — nearly all U.S.-based — on the open internet. It is unclear if the current data dump is related to this database.

“This is old data that was previously reported on in 2019,” the Menlo Park, California-based company said in a statement. “We found and fixed this issue in August 2019.”
 

hanimmal

Well-Known Member
https://www.washingtonpost.com/politics/2021/08/05/technology-202-facebook-shuttered-crucial-tool-oversight-lawmakers-say-it-just-made-their-jobs-harder/
At a high-stakes hearing just after the 2020 election, Sen. Amy Klobuchar (D-Minn.) grilled Mark Zuckerberg about data that showed Facebook mislabeled tens of thousands of political ads. The information came from a tool developed by researchers at New York University, which for over a year has been one of the preeminent tools for shining light on the notoriously opaque tech giant.

Facebook had just threatened to shut off the project, now called the Ad Observatory, and Klobuchar was bewildered. “Why would you not support this project?” she asked Zuckerberg.

Her frustration grew this week after Facebook cut off the researchers’ access to ad data, essentially killing the tool, which journalists and academics have widely used to track Facebook’s lucrative and powerful digital advertising business.

The project’s researchers have regularly briefed staffers and lawmakers in the House and Senate and officials at agencies such as the Federal Trade Commission, said one of the project members, Laura Edelson.

“The NYU team provided valuable data about advertising on Facebook — data that Facebook has consistently refused to provide,” said Jonathan Mayer, an assistant professor at Princeton who previously served as a tech and policy adviser to then-Sen. Kamala D. Harris.

The social networking giant announced in a blog post Tuesday that it had disabled the researchers’ accounts, saying it took the step to “protect people’s privacy” and to comply with a settlement it struck with the FTC over allegations that it violated users’ privacy.

The decision marks a setback for policymakers who have relied on the data to scrutinize the tech giant’s practices, including during the 2020 campaign, when the project helped shed light on shadowy ad campaigns and spending by key figures.


Sen. Mark R. Warner (D-Va.) told The Technology 202 that “time and again” independent researchers have led the way in “illuminating the ways in which online advertising has become a key vector for online scams, political misinformation and voter suppression campaigns by a wide range of bad actors.”

He added: “In all of these cases, independent researchers were identifying misuse of these platforms well ahead of the platforms themselves — driving the policy discussion and public debate.”

Lawmakers have fiercely defended the value of the project’s data, which has been referenced in congressional letters, hearings and lawmakers’ public remarks.

During the run-up to the 2020 elections, leaders on the House Energy and Commerce Committee wrote to Zuckerberg objecting to the company’s earlier attempt to shutter the program, calling the data “crucial for holding both advertisers and Facebook accountable.” In his own written questioning of Zuckerberg after the election, Sen. Richard Blumenthal (D-Conn.) said the project “enables accountability.”

Rep. Frank Pallone Jr. (D-N.J.), who chairs the Energy Committee and is leading an investigation into misinformation across social media platforms, took strong exception to Facebook’s decision on Wednesday.

“These are the actions of a company that clearly has something to hide about how dangerous misinformation and disinformation is spreading on its platform,” he told The Technology 202.

And the censure has been bipartisan.

“America was born on research and innovation, and Silicon Valley in particular benefitted from such efforts,” said Sen. Marsha Blackburn (R-Tenn.). “It's extremely concerning that Facebook is seeking to censor and block those who try to learn more about their practices.”

Edelson, one of the project’s lead researchers, said Facebook’s decision has not only kneecapped her team’s ability to hold the company accountable, but also its ability to share findings with officials in Washington.

“It’s just gotten a lot harder for me to make data available to the public and to lawmakers and regulators, and that was something I regularly did,” said Edelson, a PhD candidate at NYU, calling it “a blow to oversight.”

Her frequent consultations included working with Klobuchar on her legislation to set standards around political ad disclosures online, which Facebook has backed.

“As we face threats to our democracy, we need more transparency from online platforms, not less,” Klobuchar said in a statement, adding that she was “deeply troubled by the news.”

Edelson said Facebook’s decision to shut the researchers down came swiftly after they notified the company that they were looking into the disinformation that precipitated the attack on the Capitol on Jan. 6.

She said they had planned a “case study” that, among other things, would look into how calls for violence by partisan media may have stoked outrage on Facebook ahead of the riot.

But when they went to pull up the data, they found something odd: “We noticed that there was a lot of content that was missing.” Edelson said she then contacted Facebook.

“They said, ‘Yeah, it's a bug. Do you have any more examples?’ And I sent them some more examples. And that was the last contact I had with them,” she said.

Facebook spokesman Andy Stone said in a statement that the company “gave NYU ample time to come into compliance with our terms, and outlined that we would take an enforcement action if they failed to come into compliance.”

He added, “Any insinuation that this was an abrupt removal of access or retaliation does not comport with reality.”
Lawmakers on Capitol Hill have themselves been probing the role of Facebook and other social media sites in the Jan. 6 riot. Now they’ll have one less key tool in their bag.
 

hanimmal

Well-Known Member
https://www.washingtonpost.com/technology/2021/08/29/facebook-privacy-monopoly/
Megan Borovicka joined Facebook in 2013 and then forgot she even had an account. But Facebook never forgot about her.

The 42-year-old Oakland, Calif., lawyer never picked any “friends,” posted any status updates, liked any photos or even opened the Facebook app on her phone. Yet over the last decade, Facebook has used an invisible data vacuum to suction up very specific details about her life — from her brand of underwear to where she received her paycheck.

“It’s a strange feeling,” Borovicka told me, after I showed her what Facebook knew about her. She paused, looking at a string of shopping data from one Christmas when she was stuck with a sick kid while her husband went to Macy’s. “Why do they need to know that?” she said. “I thought if I’m not using Facebook, I wouldn’t be in its orbit.”

Facebook has become too big to escape. We’re rightly becoming more skeptical of Big Tech monopolies, and that should include the sheer volume of data they collect.


Earlier this month, the Federal Trade Commission filed an updated antitrust lawsuit against Facebook, arguing the company needs to be broken up.

Some 69 percent of American adults now have Facebook accounts, according to Pew Research. The next most popular social network, Instagram, is also owned by Facebook. So are messaging services WhatsApp and Messenger.

How does Facebook’s bigness hurt you and me? As Borovicka and I learned, Facebook takes a toll on your privacy — but perhaps not in the way you expect. It isn’t just the Facebook app that’s gobbling up your information. Facebook is so big, it has convinced millions of other businesses, apps and websites to also snoop on its behalf. Even when you’re not actively using Facebook. Even when you’re not online. Even, perhaps, if you’ve never had a Facebook account.

Here’s how it works: Facebook provides its business partners tracking software they embed in apps, websites and loyalty programs. Any business or group that needs to do digital advertising has little choice but to feed your activities into Facebook’s vacuum: your grocer, politicians and, yes, even the paywall page for this newspaper’s website. Behind the scenes, Facebook takes in this data and tries to match it up to your account. It sits under your name in a part of your profile your friends can’t see, but Facebook uses to shape your experience online.
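The matching step described above can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual code; partner-side trackers of this kind commonly report hashed identifiers that the platform then looks up against accounts it already knows. Every name, function, and value below is made up:

```python
import hashlib

def normalize_and_hash(email):
    """Hash a normalized email, the way pixel-style trackers commonly report identifiers."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# The platform's side: accounts it already knows, keyed by hashed email.
# (Hypothetical data for illustration.)
known_accounts = {
    normalize_and_hash("megan@example.com"): "account_1234",
}

def match_event(event):
    """Attach a partner-reported event to an account, if the hash matches one on file."""
    return known_accounts.get(event["hashed_email"])

# A partner site reports a shopping event. Normalization means the match
# succeeds even though the visitor typed her email with different casing.
event = {
    "hashed_email": normalize_and_hash("Megan@Example.com "),
    "action": "purchase",
    "item": "rocking chair",
}
print(match_event(event))  # prints "account_1234"
```

The key point, consistent with the article: the business never sends the platform a raw browsing log with your name on it. It sends an event plus an identifier, and the matching happens on the platform's side, invisibly to the user.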

Among the 100 most popular smartphone apps, you can find Facebook software in 61 of them, app research firm Sensor Tower told me. Facebook also has trackers in about 25 percent of websites, according to privacy software maker Ghostery.

I tried my own version of Borovicka’s experience by cutting Facebook and Instagram out of my life for two weeks and then tallying who sent it my data. But unlike her, I left its apps on my phone — untouched for a while, but present. (Below I’ll show you how I unearthed what Facebook knows, and how you can see it for yourself.)

While I was gone, Facebook got a notice when I opened Hulu to watch TV. Facebook knew when I went shopping for paint, a rocking chair and fancy beans. Facebook learned I read the websites What To Expect, Lullaby Trust and Happiest Baby. There’s no surprising Facebook when you’re expecting a baby.

Over two weeks, Facebook tracked me on at least 95 different apps, websites and businesses, and those are just the ones I know about. It was as if Facebook had hired a private eye to prepare a dossier about my life.


The making of a monopoly
How did Facebook get too big to escape?

Facebook has had so many different privacy screw-ups, you can be forgiven for not remembering which ones were happening a decade ago.

Starting around 2010, researchers noticed Facebook was placing software on websites that weren’t Facebook.com. Back then, sites were including widgets that let you “like” or share content to Facebook without ever leaving. Privacy advocates were worried Facebook was using data it collected to track everything we do online.

In response, Facebook said in very clear terms that it wasn’t using the Web data to track people. It said it only used the data for advertising purposes when you actively clicked on a widget to share some content with friends. That was enough of a promise to keep many, including me, from jumping ship to some other up-and-coming social network, like Google+ or Orkut.

But in 2014, Facebook reversed course. It announced it would begin allowing its advertisers to target us based on the websites we visit. Facebook was giving itself permission to track everything we do beyond its happy blue walls. (“Facebook is following you,” I warned in a technology column I wrote at the time.)

Our shift to using mobile phone apps increased Facebook’s reach. Its form of tracking also broke new ground: While many companies were using browser cookies, which could be easily cleared or blocked, Facebook tied what it learned to real identities — the names on our Facebook profiles.

Today, Facebook says it uses the data it gets from other companies to target its members with ads and make recommendations for things like groups and events. It forbids businesses from passing along what it considers “sensitive” information about our lives. But that hardly stops its giant data vacuum.

Legal scholar Dina Srinivasan, whose ideas about privacy shaped the government lawsuit against Facebook, tells me the 2014 switcheroo was the moment it became clear the social network had monopoly power over consumers.

“It’s a farce that consumers are happy with surveillance in return for a free product,” says Srinivasan, a former advertising executive. Everyone has different expectations about privacy, but in democracies people tend to agree broad surveillance is bad.

So then how does Facebook get away with doing it? Because we don’t have a choice. As the FTC wrote in its lawsuit: “Without meaningful competition, Facebook has been able to provide lower levels of service quality on privacy and data protection than it would have to provide in a competitive market.”

By the time Facebook switched on external tracking in 2014, Facebook owned onetime rival Instagram, and arch-nemesis Google had killed off its alternative social network Orkut. Facebook knew it could raise its price — making a grab for a lot more data.

Lawmakers have typically been concerned about the ways monopolies harm consumers, such as unilaterally raising prices. But just because Facebook is free to us doesn’t mean it can’t act like a monopoly. Srinivasan says we should think of Facebook’s cost as our data, and scrutinize the power it has to set its own price.

One way to measure it: In 2013, the average American’s data was worth about $19 per year in advertising sales to Facebook, according to its financial statements. In 2020, your data was worth $164 per year.

And what about Facebook’s assertion this isn’t a monopoly problem? “Monopoly has always been concerned with price and quality. Facebook knows that,” says Srinivasan.



Article continues below.
 

hanimmal

Well-Known Member
Rest of the above story about Facebook selling us out.
How high is your privacy price?
If you’ve got an active Facebook account, you can see some of what it knows in a special section Facebook added to its website and app in 2020 called “off-Facebook activity.” (You can click here to access yours if you’re logged in.) It introduced this after that awkward visit CEO Mark Zuckerberg paid to Congress where he told lawmakers you’re “in control” of your data 45 times and, rightly so, nobody believed him.

Your off-Facebook activity screen will only cover the last two years worth of surveillance. But it showed Borovicka and me that Facebook is an equal-opportunity spy — it collects data on hibernating users and Instagram addicts alike. It also doesn’t matter if, like Borovicka, you had joined Facebook at a time when it had a different privacy policy.

This invisible surveillance system doesn’t require you to click “like” or use a “login with Facebook” button. You don’t necessarily have to be logged in to the Facebook app or website on your phone — companies can report other identifying information to Facebook like your email to help it figure out who you are. (That said, the more devices where you’re using Facebook, the more opportunities you give the social network to match up your activity with your identity.)

For example, during my experiment, Facebook got multiple notices from the meal delivery app DoorDash, indicating I opened the app at 11:09, 11:11 and 11:15 on different mornings. I guess I’m a creature of habit — and that’s exactly the point: Now Facebook knows exactly the best time to advertise lunch to me.

If you’ve only got an Instagram account, Facebook also tracks you — but the photo-sharing app offers no mechanism to download this data.

Borovicka even tried using the force of law to access the data about her 13-year-old daughter, who only has an account with Instagram. The California Consumer Privacy Act, which went into effect in 2020, gives any resident the right to access and delete their personal data. Borovicka sent Instagram a legal request on behalf of her daughter. The company never sent her the data.

And what if you’ve never had a Facebook account at all? It may still be watching.

Borovicka tried to use that same California privacy law, known as CCPA, to view the data about her 11-year-old son, who has never had an account on Facebook or on Instagram.

Facebook replied it wouldn’t comply with the access request because her son doesn’t have an account it could use to verify his identity. But if he had an account, he’d be giving Facebook the right to collect his data. “It feels like you’re trapped in some kind of logic circle,” said Borovicka.

In its emailed reply, Facebook acknowledged it could still be collecting the boy’s personal information. It said: “When a person visits a site or app that uses one or more of these Facebook services, these sites and apps may send us information regardless of whether the person has a Facebook profile.”

Facebook told me it does not use this nonmember data to “create profiles” of people or target ads at them. But it doesn’t claim to delete the data — or preclude other uses for it.

How we survive the surveillance apocalypse

“What are the harms I’m not realizing?” Borovicka asked me. Facebook makes it sound so low stakes — like all this data is just about making ads for pants more interesting.

My answer: Ever get the feeling your phone is listening in on your conversations? This is what’s really going on.

Think about data as power. I, for one, don’t like giving businesses a leg up on my dreams and fears to hyper-target their sales pitches. Facebook can also use your life to nudge you into groups that shape your thinking on politics or even the coronavirus vaccine. It can use your life to train its artificial intelligence, analyze web and mobile app use, or whatever new purpose it might dream up in a decade.

At stake in all that personal information is the ability to manipulate our economy, our society and even our democracy.

Just try to hide

So what can you do to stop Facebook from following you? The whole problem with monopolies is that they leave consumers with few good choices.

The off-Facebook-activity disclosure does give you the option to “disconnect” some of this data from your account. That term, which only a lawyer’s mother could love, does not mean that Facebook deletes your data — it means Facebook just no longer uses it to target you with ads all over the Internet. (And, of course, this only works if you have an account.)

There’s little you can do to stop Facebook from collecting your information in the first place.

You can try to make changes to your computer and phone to block some of Facebook’s tracking altogether (see this list of technical steps I take on my own devices), but that takes commitment and expertise because the tactics keep changing.

At the moment, the most effective pushback against the privacy price of Facebook is coming from technology giants Apple and Google, which have their own monopoly powers over smartphone operating systems and web browsers.

Earlier this year, Apple began giving iPhone owners the ability to tell apps not to track them, cutting back on some of Facebook’s ability to get data from apps that don’t also know your email address or phone number. Google has promised a similar setting for Android phones, but both of these are a partial solution, at best.

In response to these sorts of moves, Facebook recently announced it is exploring what it calls “privacy-enhancing technologies” to target ads using less personally identifiable information. But so far, those remain experiments.

So what about Borovicka?

“My instinct is to get completely off Facebook,” she told me when I asked what she was going to do next. “But I feel torn because it’s one thing to make these choices for myself, but another for my teenagers.”

The truth remains, no other online services come close to matching Facebook and Instagram’s ability to connect you with people you know, or might want to know.

For the vast majority of Americans, quitting Facebook is no more likely to happen in 2021 than it was a decade ago. We’re just stuck with its ever-higher price.
 

schuylaar

Well-Known Member
I'm not sure about them being the devil, but their abrupt changes in the year after Snowden landed in Russia and their military started its attack on our democracies make me really wonder if Zuck is not a Russian stooge.
that would qualify him as Satan. he also accepts Rubles.. you just need an app on your phone. Is The Zuck taking Bitcoin? talk about a sellout to his country. I can't wait until he ends up with a bullet in his head or gets pancreatic cancer and refuses to eat anything but carrots.
 

potroastV2

Well-Known Member
Rest of the above story about Facebook selling us out.
Why? You gave us a link to this article, and posted most of it. If we want to read the rest, we can click on the link.

In response to your alarmist article, I have a facebook page that I rarely use. However, I use Firefox on both my devices, and I configure it to NOT allow tracking.


:mrgreen:
 

hanimmal

Well-Known Member
that would qualify him as Satan. he also accepts Rubles.. you just need an app on your phone. Is The Zuck taking Bitcoin? talk about a sellout to his country. I can't wait until he ends up with a bullet in his head or gets pancreatic cancer and refuses to eat anything but carrots.
lol Zuck came up with a whole new crypto currency that he controlled. What could possibly go wrong?


https://apnews.com/article/bitcoin-social-platforms-business-financial-markets-technology-366328036c7246f2871f4ab76bf1379a
NEW YORK (AP) — Facebook is unveiling a digital currency called Libra as the company seeks to make its ads more valuable by enabling smoother transactions and payments online, particularly among those without credit cards or bank accounts.

Libra will use the same security and record-keeping principles as Bitcoin, the most popular digital currency system today. But unlike Bitcoin, Libra is backed by several traditional financial companies, including PayPal, Visa and Mastercard, and will base its value on multiple real-world currencies such as the U.S. dollar and the euro. Libra also faces additional scrutiny over privacy, given Facebook’s poor record on the matter.

Here’s a look at Libra and other cryptocurrencies.

WHAT’S A CRYPTOCURRENCY ANYWAY?

It’s a form of digital cash that uses encryption technology to make it secure. Cryptocurrencies exist not as physical bills or coins but rather as lines of digitally signed computer code. Records are typically kept on ledgers known as blockchain.
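The ledger idea mentioned above can be shown with a toy hash chain. This is a minimal sketch of the general blockchain concept, not any real currency's implementation (real systems add digital signatures, consensus rules, and much more); it only demonstrates how each record commits to the one before it, which is what makes the history tamper-evident:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    """Add a block that records the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain):
    """Check every link; tampering with an earlier block breaks all later links."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))  # prints True

chain[0]["transactions"][0]["amount"] = 500  # try to rewrite history
print(verify(chain))  # prints False
```

Because block 2 stores the hash of block 1, changing any past transaction changes that hash and the verification fails, which is the property that makes such a ledger hard to quietly falsify.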

People can store their cryptocurrency stashes in virtual wallets that resemble online bank accounts. Facebook is developing a wallet app for Libra; others will be able to as well.

As with other cryptocurrencies, people will be able to buy and sell libras on exchanges for traditional currencies. It’s not clear what fees, if any, consumers will have to pay for such transfers, although Facebook says they should be low.

WHY NOT USE BITCOIN?

Although Bitcoin has gotten a lot of attention, it isn’t widely used. For one thing, its value fluctuates wildly, meaning that $100 in bitcoins today might be worth $300 a month from now — or $2.50. Only a handful of merchants accept bitcoins as payments.

Facebook is hoping to keep the libra’s value stable by tying it closely to established currencies. Unlike most other cryptocurrencies, the Libra will be backed by real-world bank deposits and government securities in a number of leading currencies.
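The basket peg the article describes comes down to simple arithmetic. The weights and exchange rates below are entirely made up (Libra's actual basket composition was never finalized); the point is only that one unit's dollar value would track a weighted average of several currencies rather than floating freely the way Bitcoin does:

```python
# Hypothetical basket weights; Libra never published final figures.
basket_weights = {"USD": 0.50, "EUR": 0.25, "JPY": 0.15, "GBP": 0.10}

def basket_value_usd(fx_rates_to_usd):
    """Dollar value of one basket unit, given each currency's USD exchange rate."""
    return sum(weight * fx_rates_to_usd[ccy]
               for ccy, weight in basket_weights.items())

# Made-up illustrative rates.
rates = {"USD": 1.0, "EUR": 1.10, "JPY": 0.009, "GBP": 1.30}
print(basket_value_usd(rates))  # roughly 0.906 dollars with these made-up numbers
```

A 10 percent swing in any one currency moves the unit's value only by that currency's weight times 10 percent, which is why a basket-backed coin would be far more stable against any single currency than an unbacked one.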

Facebook is also recruiting partners ahead of time. Lyft, Uber and Spotify already have joined the Libra group. They will likely accept libras when the system launches. They’ll also help fund, build and govern the system. That’ll make Libra less of a free-for-all than Bitcoin. Facebook says Libra will embrace regulation, but it isn’t providing many details on how.

With most cryptocurrencies, including Bitcoin, anyone can lend computing power to verify transactions and to prevent anyone spending the same digital coin twice. With Libra, the verifications will initially be managed by its founding companies, such as Facebook and PayPal. Facebook believes the closed approach will mean better security.

ARE CRYPTOCURRENCIES ANONYMOUS?

Although it’s possible to trace bitcoins and some other cryptocurrencies as they are spent, owners of accounts behind the transactions aren’t necessarily known. That makes such currencies a favorite among certain cybercriminals. But it is sometimes possible to tie cryptocurrency transactions to a real person who has cashed out digital coinage into a traditional currency.

And if someone spends libras while logged onto Facebook, it’s theoretically possible Facebook could tie it back to a real person.

Facebook says it won’t use Libra data to target ads, but may share data “to keep people safe, comply with the law, and provide basic functionality.” Facebook is creating a subsidiary, Calibra, to try to keep the operations separate.

GETTING STARTED

Libra is scheduled to launch publicly in the first half of next year. Whether consumers will embrace it is another matter. Discounts potentially offered by Uber and other partners might be enough to get people to at least try the system. But many people find it easy enough to pay for goods and services online with credit and debit cards.

There could be greater appeal among people who don’t have bank accounts. Libra could open up e-commerce to them.

Though Libra could be a way for Facebook to drive spending when people interact with Facebook ads, the company says the currency will be independent and won’t require a Facebook account to use.
Why? You gave us a link to this article, and posted most of it. If we want to read the rest, we can click on the link.

In response to your alarmist article, I have a facebook page that I rarely use. However, I use Firefox on both my devices, and I configure it to NOT allow tracking.


:mrgreen:
Why post the rest of it? In case people wanted to read it here that are not as savvy as you are with your privacy.
 

schuylaar

Well-Known Member
lol Zuck came up with a whole new crypto currency that he controlled. What could possibly go wrong?


https://apnews.com/article/bitcoin-social-platforms-business-financial-markets-technology-366328036c7246f2871f4ab76bf1379a



Why post the rest of it? In case people wanted to read it here that are not as savvy as you are with your privacy.
some people can't afford the price to subscribe and it is appreciated by them. nothing worse than reading part of the story and then hitting a paywall.

there's a term for it: selfless, or unselfish, when people share without expecting something in return.
 
Last edited:

schuylaar

Well-Known Member
Why? You gave us a link to this article, and posted most of it. If we want to read the rest, we can click on the link.

In response to your alarmist article, I have a facebook page that I rarely use. However, I use Firefox on both my devices, and I configure it to NOT allow tracking.


:mrgreen:
Democracy Dies in Darkness -Washington Post

especially when there's a paywall.
 

hanimmal

Well-Known Member
https://www.rawstory.com/disinformation-on-facebook/
A number of Facebook posts from the period around the Jan. 6 insurrection went missing from the social media service.

The posts -- which covered anything from personal updates to violent incitement -- disappeared from the Crowdtangle transparency tool that Facebook uses to allow researchers to track what users are saying on the platform, reported Politico.

"If Facebook knew about this, and just didn't tell anyone, I think researchers should be pretty concerned about that fact," said Laura Edelson, a New York University academic who was part of the team, which included researchers from Université Grenoble Alpes, that discovered the missing data.

The lost posts have been unavailable since at least May; the company told Politico they were accidentally removed due to a data limit on its technical transparency tools, but said the error had been fixed.

Lawmakers asked the social media giant Friday to turn over internal documents and data linked to the U.S. Capitol riots, including information about how election misinformation spread on the site.

The academics who found the problem believe that tens of thousands of posts from the days before and after the riots were still missing, but Facebook said all the original posts were available directly through the site, and a spokesperson said about 80 percent of the posts flagged by researchers should not have been available on Crowdtangle because they had been deleted or made private.

"Researchers do assume that they are getting all the public content from Facebook pages that are indexed by Crowdtangle," said Edelson, whose account was suspended last month for her separate work around political ads. "Those assumptions have been violated in this case."

Facebook is currently dismantling its Crowdtangle team after the tool was repeatedly used to trace how extremist content and misinformation spread across that site and the company-owned Instagram platform, and the tech company was forced to release statistics that showed COVID-19 misinformation was still some of the most popular content on the platform.
 

hanimmal

Well-Known Member
https://www.washingtonpost.com/technology/2021/09/16/facebook-files-internal-research-harms/
Facebook knew that teen girls on Instagram reported in large numbers that the app was hurting their body image and mental health. It knew that its content moderation systems suffered from an indefensible double standard in which celebrities were treated far differently than the average user. It knew that a 2018 change to its news feed software, intended to promote “meaningful interactions,” ended up promoting outrageous and divisive political content.


Only Facebook knows the extent of its misinformation problem. And it’s not sharing, even with the White House.

Facebook did not respond to a request for comment Thursday. In the past, it has responded to criticism over the role of its organizational structure by downplaying the role of any given executive and explaining that big decisions at the company receive input from multiple teams.

From its early days, Facebook has employed data scientists across various teams to study the effects of its products, and taken their findings seriously at the highest levels. In 2008, for instance, CEO Mark Zuckerberg signed off on the introduction of a “like” button only after its data scientists found in a test that it made users more likely to interact with one another’s posts, a story recounted by longtime Facebook executive Andrew Bosworth in a 2010 Quora post. In 2015, members of the company’s news feed ranking team explained to me how they rely on a dizzying array of surveys, focus groups and A/B tests to measure the impacts of any proposed change to the algorithm along multiple dimensions. Most of those findings were never publicized, but they factored heavily in the company’s decisions about which changes to implement.

More recently, Facebook has tasked its data scientists and multiple integrity and safety teams across the company with investigating questions about its products’ influence on things like global affairs, the flow of political information and users’ well-being. In at least a few cases, their findings have informed key product decisions. The 2018 Facebook news feed change around “meaningful interactions,” for one, was justified partly by appeal to research that found interacting with friends on social media was better for people’s mental health than passively watching videos.

Yet a pattern has emerged in which findings that implicate core Facebook features or systems, or which would require costly or politically dicey interventions, are reportedly brushed aside by top executives, and come out only when leaked to the media by frustrated employees or former employees.

For instance, the New York Times reported in 2018 that Facebook’s security team had uncovered evidence of Russian interference ahead of the 2016 U.S. election, but that Chief Operating Officer Sheryl Sandberg and Vice President of Global Public Policy Joel Kaplan had opted to keep it secret for fear of the political fallout. In February 2020, The Washington Post reported that an internal investigation following the 2016 election, called “Project P,” had identified a slew of accounts that had peddled viral fake news stories in the run-up to Donald Trump’s victory, but only a few were disabled after Kaplan warned of conservative backlash.

In September 2020, BuzzFeed obtained a memo written by former Facebook data scientist Sophie Zhang, making the case that the company habitually ignored or delayed action on fake accounts interfering in elections around the world. In July 2021, MIT Technology Review detailed how the company pulled the plug on efforts by its artificial intelligence team to address misinformation, out of concern that they would hurt user engagement and growth. Just last month, the company admitted that it had shelved a planned transparency report showing that its most shared link over a three-month period was an article casting doubt on the safety of coronavirus vaccines.

Facebook says post that cast doubt on covid-19 vaccine was most popular on the platform from January through March

Kaplan, a former Republican operative, is a recurring figure in many of these accounts. His current and former bosses, Nick Clegg and Elliot Schrage, respectively, also surface at times, albeit less often. They, in turn, report to Sandberg, who is Zuckerberg’s right hand.

Part of the issue, insiders say, may be the scope of these executives’ roles. As policy chief, Kaplan has input into decisions about how to apply Facebook’s rules, while also overseeing its relations with political leaders in D.C. — a mandate that all but ensures political considerations shape the platform’s policy choices. Clegg, meanwhile, oversees both policy and communications, weighing not only politics but PR concerns in evaluating which policies to pursue.

In contrast, Twitter’s then-vice president of global communications, Brandon Borrman, told me in 2020 that his company sends decisions about content enforcement, trust and safety, such as the call to fact-check one of Trump’s tweets for the first time, up a chain of command that is separate from its political and public relations divisions. Borrman said that he and the company’s top government relations executive were briefed on the decision only after CEO Jack Dorsey had accepted the trust and safety team’s recommendation.

Alex Stamos, Facebook’s former chief security officer who struggled to publicize his team’s findings on Russian election interference, has argued Facebook’s organizational structure helps to explain why all kinds of well-intentioned internal studies and projects at the company never see daylight. (Stamos now researches cybersecurity at the Stanford Internet Observatory.)

“I keep talking about how organizational design is a huge problem at Facebook,” Stamos tweeted Wednesday, after the third report in the Journal’s Facebook Files series. “In these cases, the unified product policy/government affairs structure and the isolation of people who care in dedicated Integrity teams are the problem. And Zuck.”

The last line of that tweet is a reference, of course, to Zuckerberg, who emerges in Sheera Frenkel and Cecilia Kang’s recent book “An Ugly Truth: Inside Facebook’s Battle for Domination” as the driving force behind a company culture that has long prioritized growth and dominance over concerns of societal harms.

Facebook’s strategy: Avert disaster, apologize and keep growing

Sandberg, for her part, is portrayed in the same book as averse to confrontation and unable or unwilling to stand up to Zuckerberg and Kaplan on pivotal decisions. Her private conference room at Facebook’s headquarters long bore a sign that said “Only good news,” according to numerous reports — a credo that may go a long way toward explaining why uncomfortable internal research findings struggle to find an audience.

Screen Shot 2021-09-16 at 11.07.08 AM.png
 

hanimmal

Well-Known Member
https://www.rawstory.com/trump-facebook-2655063882/
Screen Shot 2021-09-20 at 10.30.31 AM.png
Facebook founder Mark Zuckerberg agreed to push conservatism on his platform as part of an agreement with the Trump administration, according to a new book.

Venture capitalist Peter Thiel told a confidant that he and Zuckerberg met with the former president, Jared Kushner and their spouses at the White House in 2019, where the Facebook founder promised not to fact check political speech if the administration agreed not to impose heavy-handed regulations, according to excerpts from The Contrarian: Peter Thiel and Silicon Valley's Pursuit of Power published by New York Magazine.

"Facebook had long seen itself as a government unto itself," wrote author Max Chafkin, "now, thanks to the understanding brokered by Thiel, the site would push what the Thiel confidant called 'state-sanctioned conservatism.'"

Zuckerberg denied the deal, calling the idea "pretty ridiculous," but the social media platform allowed Trump posts seemingly calling for violence against Black Lives Matter protesters that Twitter removed, and the company mostly ignored calls to limit "Stop the Steal" groups after the former president's election loss.

Trump remains suspended from Facebook for two years following the Jan. 6 insurrection.
 

hanimmal

Well-Known Member
https://www.rawstory.com/facebook-boost-post/
Screen Shot 2021-09-22 at 10.57.55 AM.png
Facebook is aggressively trying to reshape its image after years of criticism for violating users' privacy, and allowing disinformation and hate speech.

In the past, Facebook and CEO Mark Zuckerberg have publicly apologized for the social media platform's major missteps, such as allowing Russian interference on the site during the 2016 presidential election, according to a report from the New York Times.

Recently, however, Zuckerberg and other executives decided to shift their strategy and go on the offensive, denying responsibility for things like COVID-19 vaccine disinformation, and blocking access to data that allowed academics and journalists to study how the platform worked.

Zuckerberg also recently signed off on Project Amplify, an initiative in which Facebook has been using its News Feed — its most important digital real estate — to artificially boost positive stories about, well, itself.

"The idea was that pushing pro-Facebook news items — some of them written by the company — would improve its image in the eyes of its users, three people with knowledge of the effort said," according to the NYT report. "But the move was sensitive because Facebook had not previously positioned the News Feed as a place where it burnished its own reputation. Several executives at the meeting were shocked by the proposal, one attendee said."

According to the NYT, Zuckerberg wanted to recast himself as an innovator and distance himself from scandals, so executives came up with a strategy of focusing his Facebook and Instagram posts on new products.

"Rather than addressing corporate controversies, Mr. Zuckerberg's posts have recently featured a video of himself riding across a lake carrying an American flag, with messages about new virtual reality and hardware devices," the NYT reports. "Once the tests began, Facebook used a system known as Quick Promotes to place stories about people and organizations that used the social network into users' News Feeds, they said. People essentially see posts with a Facebook logo that link to stories and websites published by the company and from third-party local news sites. One story pushed 'Facebook's Latest Innovations for 2021' and discussed how it was achieving '100 percent renewable energy for our global operations.'"

Read the full story here.
 

hanimmal

Well-Known Member
https://apnews.com/article/coronavirus-pandemic-lifestyle-technology-sports-business-43cc5e3bc9fbdd6d2d2f425c117e4f0a
Screen Shot 2021-09-25 at 8.41.48 AM.png
BRUSSELS (AP) — It’s the premier martial arts group in Europe for right-wing extremists. German authorities have twice banned their signature tournament. But Kampf der Nibelungen, or Battle of the Nibelungs, still thrives on Facebook, where organizers maintain multiple pages, as well as on Instagram and YouTube, which they use to spread their ideology, draw in recruits and make money through ticket sales and branded merchandise.

The Battle of the Nibelungs — a reference to a classic heroic epic much loved by the Nazis — is one of dozens of far-right groups that continue to leverage mainstream social media for profit, despite Facebook’s and other platforms’ repeated pledges to purge themselves of extremism.

All told, there are at least 54 Facebook profiles belonging to 39 entities that the German government and civil society groups have flagged as extremist, according to research shared with The Associated Press by the Counter Extremism Project, a non-profit policy and advocacy group formed to combat extremism. The groups have nearly 268,000 subscribers and friends on Facebook alone.

CEP also found 39 related Instagram profiles, 16 Twitter profiles and 34 YouTube channels, which have gotten over 9.5 million views. Nearly 60% of the profiles were explicitly aimed at making money, displaying prominent links to online shops or photos promoting merchandise.

Click on the big blue “view shop” button on the Erik & Sons Facebook page and you can buy a T-shirt that says, “My favorite color is white,” for 20 euros ($23). Deutsches Warenhaus offers “Refugees not welcome” stickers for just 2.50 euros ($3) and Aryan Brotherhood tube scarves with skull faces for 5.88 euros ($7). The Facebook feed of OPOS Records promotes new music and merchandise, including “True Aggression,” “Pride & Dignity,” and “One Family” T-shirts. The brand, which stands for “One People One Struggle,” also links to its online shop from Twitter and Instagram.

——


EDITOR’S NOTE: This story is part of a collaboration between The Associated Press and the PBS series FRONTLINE that examines challenges to the ideas and institutions of traditional U.S. and European democracy.

—-

The people and organizations in CEP’s dataset are a who’s who of Germany’s far-right music and combat sports scenes. “They are the ones who build the infrastructure where people meet, make money, enjoy music and recruit,” said Alexander Ritzmann, the lead researcher on the project. “It’s most likely not the guys I’ve highlighted who will commit violent crimes. They’re too smart. They build the narratives and foster the activities of this milieu where violence then appears.”

CEP said it focused on groups that want to overthrow liberal democratic institutions and norms such as freedom of the press, protection of minorities and universal human dignity, and believe that the white race is under siege and needs to be preserved, with violence if necessary. None has been banned, but almost all have been described in German intelligence reports as extremist, CEP said.

On Facebook the groups seem harmless. They avoid blatant violations of platform rules, such as using hate speech or posting swastikas, which is generally illegal in Germany.

By carefully toeing the line of propriety, these key architects of Germany’s far-right use the power of mainstream social media to promote festivals, fashion brands, music labels and mixed martial arts tournaments that can generate millions in sales and connect like-minded thinkers from around the world.

But simply cutting off such groups could have unintended, damaging consequences.

“We don’t want to head down a path where we are telling sites they should remove people based on who they are but not what they do on the site,” said David Greene, civil liberties director at the Electronic Frontier Foundation in San Francisco.

Giving platforms wide latitude to sanction organizations deemed undesirable could give repressive governments leverage to eliminate their critics. “That can have really serious human rights concerns,” he said. “The history of content moderation has shown us that it’s almost always to the disadvantage of marginalized and powerless people.”

German authorities banned the Battle of the Nibelungs event in 2019, on the grounds that it was not actually about sports, but instead was grooming fighters with combat skills for political struggle.

In 2020, as the coronavirus raged, organizers planned to stream the event online — using Instagram, among other places, to promote the webcast. A few weeks before the planned event, however, over a hundred black-clad police in balaclavas broke up a gathering at a motorcycle club in Magdeburg, where fights were being filmed for the broadcast, and hauled off the boxing ring, according to local media reports.

The Battle of the Nibelungs is a “central point of contact” for right-wing extremists, according to German government intelligence reports. The organization has been explicit about its political goals — namely to fight against the “rotting” liberal democratic order — and has drawn adherents from across Europe as well as the United States.

Members of a California white supremacist street fighting club called the Rise Above Movement, and its founder, Robert Rundo, have attended the Nibelungs tournament. In 2018 at least four Rise Above members were arrested on rioting charges for taking their combat training to the streets at the Unite the Right rally in Charlottesville, Virginia. A number of Battle of the Nibelungs alums have landed in prison, including for manslaughter, assault and attacks on migrants.

Screen Shot 2021-09-25 at 8.49.42 AM.png

Screen Shot 2021-09-25 at 8.47.24 AM.png

Thorsten Hindrichs, an expert in Germany’s far-right music scene who teaches at the Johannes Gutenberg University of Mainz, said there’s a danger that the apparently harmless appearance of Germany’s right-wing music heavyweights on Facebook and Twitter, which they mostly use to promote their brands, could help normalize the image of extremists.

Extreme right concerts in Germany were drawing around 2 million euros ($2.3 million) a year in revenue before the coronavirus pandemic, he estimated, not counting sales of CDs and branded merchandise. He said kicking extremist music groups off Facebook is unlikely to hit sales too hard, as there are other platforms they can turn to, like Telegram and Gab, to reach their followers. “Right-wing extremists aren’t stupid. They will always find ways to promote their stuff,” he said.

None of these groups’ activity on mainstream platforms is obviously illegal, though it may violate Facebook guidelines that bar “dangerous individuals and organizations” that advocate or engage in violence online or offline. Facebook says it doesn’t allow praise or support of Nazism, white supremacy, white nationalism or white separatism and bars people and groups that adhere to such “hate ideologies.”

Last week, Facebook removed almost 150 accounts and pages linked to the German anti-lockdown Querdenken movement, under a new “social harm” policy, which targets groups that spread misinformation or incite violence but didn’t fit into the platform’s existing categories of bad actors.

But how these evolving rules will be applied remains murky and contested.

“If you do something wrong on the platform, it’s easier for a platform to justify an account suspension than to just throw someone out because of their ideology. That would be more difficult with respect to human rights,” said Daniel Holznagel, a Berlin judge who used to work for the German federal government on hate speech issues and also contributed to CEP’s report. “It’s a foundation of our Western society and human rights that our legal regimes do not sanction an idea, an ideology, a thought.”

In the meantime, there’s news from the folks at the Battle of the Nibelungs. “Starting today you can also dress your smallest ones with us,” reads a June post on their Facebook feed. The new line of kids wear includes a shell-pink T-shirt for girls, priced at 13.90 euros ($16). A child pictured wearing the boy version, in black, already has boxing gloves on.
 

CCGNZ

Well-Known Member
I'm not on Facebook, creepy and waaay too much drama; maybe my Polish, German, and Hungarian heritage makes me extremely averse to putting myself out there, so to speak. Not only that, but Mark Zuckerberg? Does he look 100% human to you? Looks kind of alienish to me. Anyway, I'm pretty convinced that the ulterior motives of this Co. creep me out. I have only one toe in the bathtub of the digital world and I'm more than OK w/ that. ccguns
 

Three Berries

Well-Known Member
The Facebook app is a data miner whether you are using it or not. Facebook is just the old CIA LifeLog project. The day FB started was the day LifeLog ended.

 