As pictured above, the “About This Page” popup really only displays broad location data about the page in question. Users are then meant to take this context into account when choosing whether or not to believe information from the source.
How will this help? — Facebook hopes that providing location data about high-reach Pages and Instagram accounts will help users judge the credibility of those sources for themselves.
The obvious example here is political information: if you click on a page and see it's based in Russia, you'll probably decide not to trust it. This might be helpful in limited circumstances.
Nowhere near enough — This increased transparency is severely limited in its ability to stop misinformation from reaching users.
For one thing, the feature is ripe for misuse. If an organization spreading misinformation knows its location is being used as a credibility signal, it would likely find a way to base its accounts in the United States. That could easily persuade readers the information is more credible — the opposite of the feature's intended effect.
Even if the feature works as intended, it does little to address Facebook's broader misinformation problem. In fact, that problem seems to be getting worse, with reports cropping up that coronavirus misinformation is slipping through the platform's content filters. Concerted efforts to spread misinformation — like the many anti-quarantine groups linked to one gun-rights group — are somehow allowed to continue operating, even after their origins have been brought to Facebook's attention.
If Facebook wants to stop misinformation from spreading through its enormous user base, it's going to need changes far more sweeping than some simple location transparency. Probably best not to hold our breath while we wait for that to happen.