35 Internal Code Words Facebook Uses to Talk About Its Users and Tools


Photo: Kirill Kudryavtsev (Getty Images)

One of the most surreal parts of going through the mountain of documents captured from inside Facebook’s walls by whistleblower Frances Haugen is seeing the terms employees use when discussing some of the company’s most sensitive products and systems. Many of these names (CORGI! Yoda!) sound deceptively cute, while others sound more… sinister.

While the terms themselves are interesting simply because they’re used internally, they also offer key insights into Facebook’s inner machinations and the way the company thinks about the issues we’ve all come to know and loathe. For the most part, these definitions are pulled from an internal glossary used by the company’s (now disbanded) Civic Integrity team, which was part of Haugen’s disclosures made to Congress and the Securities and Exchange Commission. Gizmodo, along with dozens of other news organizations, obtained redacted versions of these documents.

There are other terms in here, too, that don’t appear in the glossary but do appear frequently in some of the other documents provided to us. These, Gizmodo was able to define with the help of a former Facebook employee who spoke to us on the condition that they not be named.

With all of that out of the way, let’s get to those terms!

1. CORGI

This refers to a complex mathematical model that Facebook’s researchers came up with internally in order to find “clusters of users” that might be operating in an inauthentic way, like users who might be commenting a bit too frequently on one another’s posts. Based on this model (which, yes, is spelled like the dog breed), these researchers could identify likely bots and bad actors.

2. Bonjovi

Employees used this internal investigation tool to track, among other things, accounts on Instagram and Facebook that might be engaged in human trafficking. According to the internal documents we were provided, Bonjovi could be used to track a potential trafficker’s on-platform search activity, as well as a history of the profiles that trafficker was viewing, as a way to suss out who their potential victims might be.

3. H1/H2/H3

Because Facebook is such a massive operation with countless lines of code running at any given time, the company needs to push out any updates in a series of stages. The first stage is H1, which deploys code to a set of internal, Facebook-specific servers only accessible to the company’s engineers. If that deployment goes off without a hitch, the code gets pushed out to H2, where a “few thousand machines serve a small fraction of real-world users,” according to a Facebook research paper. If that code works as expected, it gets pushed to H3: full deployment across all of Facebook’s servers.
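For the curious, the staged-rollout logic described above can be sketched in a few lines. This is a minimal illustration only: the stage names come from Facebook’s research paper, but the deploy and health-check hooks are hypothetical stand-ins, not Facebook’s actual tooling.

```python
# Minimal sketch of an H1 -> H2 -> H3 staged rollout.
# Stage names are from the article; everything else is illustrative.

STAGES = ["H1", "H2", "H3"]  # internal servers -> small user slice -> full fleet

def roll_out(build, deploy, healthy):
    """Push `build` one stage at a time, halting at the first failed check."""
    completed = []
    for stage in STAGES:
        deploy(build, stage)
        if not healthy(build, stage):
            break  # a broken build never gets promoted to the next stage
        completed.append(stage)
    return completed

# Example: a build that passes H1 but fails its health check in H2
stages_reached = roll_out(
    build="v1.2.3",
    deploy=lambda build, stage: None,           # no-op deploy for the demo
    healthy=lambda build, stage: stage != "H2", # pretend H2 breaks
)
print(stages_reached)  # ['H1']
```

The point of the pattern is the early exit: a bad build only ever reaches engineers (or a small user slice), never the whole fleet.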

4. DCI

“Destructive conflict,” or DCI, is a label generated inside the company that’s meant to flag “uncivil” or abusive conversations between users in an automated way.

5. Eat Your Veggies

The protocol that Facebook employees are expected to follow around “major [or] sensitive” updates to people’s News Feeds.

6. Blame Tool

This is an internal tool Facebook’s researchers can use on a given “bad post” to see what kind of on-platform triggers caused it to bubble up in a person’s feed.

7. Blackhole

Another internal tool used by researchers to blacklist any URLs, domains, or IP addresses that are associated with spam or otherwise icky content. Blackhole attaches different labels (like “OK,” “BAD,” or “IFFY”) to each of these elements, and each label has a corresponding effect on how that URL/domain/IP can be seen and shared across Facebook as a whole.
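A label-to-policy lookup like this is a common pattern, and it can be sketched simply. To be clear about what’s real here: the label names (“OK,” “IFFY,” “BAD”) come from the documents, but the specific visibility/sharing rules attached to each label below are invented for illustration.

```python
# Hypothetical Blackhole-style labeling. Label names are from the article;
# the policies mapped to them are assumptions for illustration only.

POLICIES = {
    "OK":   {"visible": True,  "shareable": True},
    "IFFY": {"visible": True,  "shareable": False},  # e.g. visible but unsharable
    "BAD":  {"visible": False, "shareable": False},  # fully blocked
}

# A toy blocklist mapping domains to labels
blocklist = {"spam.example.com": "BAD", "clickbait.example.net": "IFFY"}

def policy_for(domain):
    """Look up a domain's label and return (label, policy). Unlisted domains
    default to "OK"."""
    label = blocklist.get(domain, "OK")
    return label, POLICIES[label]

print(policy_for("spam.example.com"))  # ('BAD', {'visible': False, 'shareable': False})
print(policy_for("example.org"))       # ('OK', {'visible': True, 'shareable': True})
```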

8. FUSS

This stands for “Facebook Unified Signal Sharing/Feed Unified Scoring System.” Internally, the Integrity team would classify posts on people’s newsfeeds under different FUSS categories depending on the “quality” of a given entity. Low-quality posts were labeled “FUSS Red,” “borderline” content was labeled “FUSS Yellow,” and regular, high-quality posts were “FUSS Green.” The research team also ran an experiment known as “FUSS Black,” which was their attempt to filter out as much Red and Yellow content from a given feed as possible.

9. Hex

The team’s internal term for “human exploitation,” or human trafficking.

10. Banhammer

A tool used internally to remove all of the likes or follows from a given Facebook user, or group of Facebook users. One use case brought up internally for the Banhammer was stripping out all the likes/follows from a user after they’d been banned from the platform.

11. Yoda

An in-house text-processing tool used to sift through people’s posts at scale. Supposedly named after the funny-talking green alien man of the same name.

12. VPV

This stands for “Viewport Views.” It’s a pretty foundational metric that Facebook employees use to calculate how often a piece of content (a post, a video, someone’s Story) was actually viewed by a given number of Facebook users. “Viewport,” in this case, refers to your laptop or phone screen. One viewport view = one entity, fully loaded, on that screen and in front of your eyeballs.
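The “fully loaded and on screen” rule is the whole metric, and it can be sketched in a few lines. The event fields used below are hypothetical; the article only establishes that a view counts when the content has fully loaded inside the viewport.

```python
# Illustrative VPV counter: an impression counts only if the content
# fully loaded while inside the viewport. Event fields are hypothetical.

def count_vpv(events):
    """Count viewport views among a list of impression events."""
    return sum(1 for e in events if e["fully_loaded"] and e["in_viewport"])

events = [
    {"item": "post_1", "fully_loaded": True,  "in_viewport": True},   # counts
    {"item": "post_2", "fully_loaded": False, "in_viewport": True},   # still loading
    {"item": "post_3", "fully_loaded": True,  "in_viewport": False},  # scrolled past
]
print(count_vpv(events))  # 1
```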

13. USI

“Unwanted social interactions,” or USI, can include harassing messages, unwanted Friend Requests, or really any kind of reach-out from another Facebook user that a given user doesn’t like.

14. TRIPS

Stands for “Tracking Reach of Integrity Problems.” TRIPS was a foundational internal survey meant to measure what users think of the content they’re seeing on the platform. TRIPS tracks the prevalence of hate speech and harassment that users came across, but also the content that the Integrity team determined might be of “civic value.” At the end of the day, this kind of tracking is “meant to improve the quality of civic conversations” on the platform.

15. SUMA

“Single User Multiple Accounts,” or SUMA, refers to sockpuppet accounts used to manipulate conversations on Facebook. ‘Nuff said.

16. Shield

This is the internal program that either adds speedbumps to any efforts to crack down on a particular piece of content, or completely prevents any attempts to crack down on that content. Shield was specifically implemented for Facebook pages belonging to celebrities or public figures, for example, in order to prevent one of their algorithms from automatically pulling one of their posts—a move that would undoubtedly spell a PR nightmare for Facebook, if that public figure happened to notice.

17. SEV

Short for “Site Event,” SEV is what the company calls a platform-wide issue that affects overall Facebook service. Think the recent 6-hour outage of Facebook, Instagram, and WhatsApp.

17. ROPS/RODS

These acronyms refer to “Repeat Offender Pages” and “Repeat Offender Domains,” meaning said page or domain committed at least three offenses (or platform violations) during a 90-day period.

18. (P)rating

Another complicated mathematical model! This one’s used to predict how the company’s in-house team of professional News Feed Raters would rank the content on a given feed. The example given in Facebook’s internal glossary is “how good” a particular story might be, on a 1-5 rating scale.

19. Orb

An in-house search tool specifically geared toward sniffing out spam attacks on the Facebook platform.

20. Bouncer

An internal tool that the Integrity team used in order to crack down on “relatively small” lists of pages or people. Because we’re talking about a company of Facebook’s scale, “small” in this case means “on the order of thousands,” according to an internal doc.

21. Blue

How researchers refer to the main Facebook app, which is… blue. “Blue time,” which is one example they give, refers to the total amount of time someone spends on said blue app. Makes sense!

22. Magnet User

This is the term the Integrity team used when talking about a given Facebook user who’s “hyper-engaged” with bad content.

23. ACDC

The algorithm that classifies the clusters of groups produced by the company’s other algorithms. Confusing, right? In this case, it just means that if one algorithm catches a bunch of (potentially sockpuppet-y) accounts sharing a single URL, ACDC is the algorithm that classifies this cluster of (spammy) accounts as sharing that single URL.

24. Faux-tire

Literally fake satire. The glossary defines the term as “material meant to misinform/push propaganda” while actively portraying itself as satire, in order to weasel out of the company’s fact-checking systems. For an idea of what this kind of content looks like, look no further than Alex Jones’s attorneys, who famously described Infowars as an outlet specializing in “humor, bombasity, sarcasm [and] wit.”

25. NFX

The acronym the company uses internally when referring to the steps Facebook users take in order to report bad stuff cropping up in their feed. Stands for “Negative Feedback eXperience” (yes, really).

26. NCII

Stands for “Non-Consensual Intimate Imagery.” This is colloquially known as revenge porn, though that term is considered inaccurate and harmful to victims.

27. HERO

HERO refers to an internal “High-Risk Early Review Operations” program that’s meant to predict which posts might go viral across the platform on any given day. It’s used to catch potentially harmful viral-posts-in-the-making before they actually go viral.

28. NSFA

Depending on the context, this can refer to one of two acronyms describing one of two kinds of content: “Not Safe For All” (meaning the content isn’t family-friendly), or “Not Safe For Ads,” meaning the content violates something in Facebook’s policies for advertisers.

29. MSI

Short for “Meaningful Social Interactions.” Internally, this is what employees called “the goal metric” for people’s News Feeds. The company’s definition of “meaningful” is a bit of a moving target; other internal documents note that the different pieces that make up MSI change frequently, the same way a person’s understanding of what “meaningful” means might change over time.

As of 2020, the Integrity team was using metrics like the number of Likes and reshares a post received, along with the number of sticker comments (yes, sticker comments) people left under a given post, to gauge its meaningfulness.
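A composite metric like this is typically a weighted sum of its component signals. The signals below (Likes, reshares, sticker comments) are the ones named in the documents; the weights are invented purely for illustration, since the actual MSI weights aren’t public and reportedly change often.

```python
# Hypothetical MSI-style score: signals are from the article,
# weights are invented for illustration.

WEIGHTS = {"likes": 1.0, "reshares": 5.0, "sticker_comments": 2.0}

def msi_score(post):
    """Weighted sum of a post's interaction counts (missing counts = 0)."""
    return sum(WEIGHTS[signal] * post.get(signal, 0) for signal in WEIGHTS)

post = {"likes": 10, "reshares": 2, "sticker_comments": 3}
print(msi_score(post))  # 26.0
```

Changing the weights re-ranks every feed, which is one way to read the documents’ note that MSI’s components “change frequently.”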

29. MAD

Short for “Mark As Disturbing,” MAD refers to content that might be reported by users (and flagged by Facebook’s content moderators) for, you guessed it, being “disturbing.” Frequent offenders, according to other Facebook documents we’ve reviewed, include “borderline nudity,” “gross medical videos or wounds,” and content that tries “minimizing or denying [the] Holocaust.” (It’s unclear whether Mark Zuckerberg signed off on that last one.)

30. WAP

I know what you’re thinking. But in this context, it stands for “Weekly Average People,” or the number of Facebook users who take some kind of action in a given week. Which brings us to…

31. NAWI

AKA “Non-Abusive WAP Impact.” An internal score that Facebook’s researchers use when monitoring so-called “non-abusive” accounts, meaning they were flagged for whatever reason but were confirmed to be “benign” after review.

32. VICN

This stands for “Violence Inducing Conspiracy Network.” The definition for what these “networks” look like is yet another moving target. Here’s how one of the leaked docs talks about VICNs, in the context of two major Facebook groups (since removed) that were tied to the January 6 Capitol riots:

Over time, we have also increasingly found that not all harmful networks are so clear cut. With VICNs, we learned that not all harmful coordination is driven by an organization with command and control. With Stop the Steal and Patriot Party, we were able to designate actions for coordinating harm, though our interventions stopped short of treating either as a full network.

In this instance, it took a while for the company’s researchers to catch onto the fact that they were actually dealing with a “full network” of pages and people dedicated to an “adversarial harmful movement.” At least the documents show that these teams learned something from the event.

33. CAU

Short for “Care About Us,” CAU is used as a measure of the affinity Facebook’s users feel toward the platform (i.e., how much they feel Facebook “cares about us”). Facebook even has an internal task force devoted to boosting people’s CAU: the Protect and Care team. Products this team worked on include Facebook’s suicide and self-injury prevention tools, which internal documents note were the result of a close collaboration “with a team of experts in suicide prevention and policy.” The same document notes that these experts also helped introduce people working on CAU “to people who’ve experienced suicidal thoughts or attempted suicide,” so they could take those experiences into account when designing these tools.

Regardless of your opinion on Facebook as a whole (and whether the company actually cares about any of us), just know that this team does important work that we should all be grateful for.

(The National Suicide Prevention Hotline in the U.S. is available 24 hours a day at 1-800-273-8255. A list of international suicide hotlines can be found here.)

34. BTG

Short for “Break The Glass.” This is a term used when talking about a hellish event (like the aforementioned Capitol riot) that necessitates harsher moderation practices than the company’s standard tactics. Banning Stop The Steal groups, for example, was internally referred to as part of Facebook’s “BTG Response.”

35. Cannibalism 

A grisly-sounding term for what’s pretty common in the business world: growing one product at the expense of another. You can see a pretty clear example with Instagram, which has been cannibalizing users from Facebook’s big blue app for the last couple of years.


The above list is just a brief selection of the terms Facebook uses internally. For a bigger picture, see the full glossary provided by Haugen below. (This document has been redacted to exclude the names of employees and external researchers who are not part of Facebook leadership.)

This story is based on Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations, including Gizmodo, the New York Times, Politico, the Atlantic, Wired, the Verge, CNN, and dozens of other outlets.

https://gizmodo.com/35-secret-code-words-facebook-uses-to-talk-about-its-us-1847988690