Given Facebook’s domination of social media in Papua New Guinea, it was concerning that researchers found strong indications of organised, politically motivated activity using inauthentic accounts to impersonate incumbent politicians
WASHINGTON DC - How many Facebook accounts and pages claim to belong to Papua New Guinea’s prime minister James Marape?
Between 20 and 35, depending on the point in time and your definition; none of them verified by the platform.
Plenty of other suspicious and unverified channels, encompassing both Facebook Pages and Accounts, also purport to have official associations.
Such sources of potential confusion, misinformation or disinformation are worrying at any time, and particularly so as PNG’s election campaign gathers pace in a country where social media has wide and growing reach.
A survey by Datareportal last year found that a remarkable 97% of the approximately 930,000 mobile phone users in PNG – roughly one seventh of the total population – are active on Facebook.
So it is crucial to understand the potential for malign attempts to influence the electoral outcome as has occurred in other countries.
Meta, the conglomerate that now runs Facebook and other popular platforms including WhatsApp, states on its Approach to Elections transparency page that, “we stop millions of fake accounts every day before they are even created”.
For PNG, it appears the net is not as finely woven. The company’s recent adversarial threat report into ‘inauthentic behaviour’ made no mention of PNG elections.
DT Institute and DT Global studied Facebook activity linked to nine incumbent MPs in PNG in the run-up to the elections. The work forms part of a focus on technical areas, given that PNG is a priority country in the implementation of the United States’ 2020 Global Fragility Act.
Our study, ‘Characterising Pre-Election Facebook Activity of Papua New Guinean Politicians’, set out to assess how the events and technological change of the last five years might have affected the information landscape in PNG.
Mid-project, we changed our methodology to begin collecting and tracking data on inauthentic accounts.
For each channel we recorded the creation date, the date we reported it to Facebook, its number of friends or followers, and the indicators of inauthenticity, building a larger database that will be made public at a later date.
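The tracking described above can be sketched as a simple record structure. The field names and example values here are illustrative assumptions, not the study’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class SuspiciousChannel:
    """One tracked Facebook page or account (illustrative schema only)."""
    name: str                         # display name used by the channel
    created: date                     # account/page creation date
    reported: Optional[date] = None   # date the channel was reported to Facebook
    followers: int = 0                # friend or follower count when recorded
    indicators: List[str] = field(default_factory=list)  # e.g. reused images


# A hypothetical record of the kind the study describes
record = SuspiciousChannel(
    name="James Marape (unverified)",
    created=date(2022, 3, 14),
    reported=date(2022, 3, 20),
    followers=210,
    indicators=["new-account warning", "reused open-source profile image"],
)
```

A structure like this makes it straightforward to sort channels by creation date and spot the bursts of near-simultaneous account creation discussed below.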
One category of channels was linked to financial scams, carrying suspicious links to ‘get a grant’ (translated from French) or to start a WhatsApp chat offering financial services or weight-loss solutions.
Some of these used phishing tactics such as slight misspellings of politicians’ names (e.g., Pawes instead of Powes Parkop for the Port Moresby governor, Peter Oneil instead of O’Neill for the former prime minister), offering cash and giveaway bonuses in exchange for contact details.
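As an illustration of how such near-miss names could be flagged automatically – this is a hypothetical check, not the study’s method – a simple similarity test using Python’s standard library can catch names that nearly, but not exactly, match a known politician:

```python
from difflib import SequenceMatcher

# Known politicians' names (a small illustrative list)
KNOWN_NAMES = ["Powes Parkop", "Peter O'Neill", "James Marape"]


def closest_match(candidate: str, threshold: float = 0.8):
    """Return the known name a candidate appears to typosquat, or None.

    A high similarity score combined with a non-exact match is the
    typosquatting signal: 'Pawes Parkop' is suspicious precisely
    because it is almost, but not quite, 'Powes Parkop'.
    """
    best, best_ratio = None, 0.0
    for name in KNOWN_NAMES:
        ratio = SequenceMatcher(None, candidate.lower(), name.lower()).ratio()
        if ratio > best_ratio:
            best, best_ratio = name, ratio
    if best_ratio >= threshold and candidate.lower() != best.lower():
        return best  # probable impersonation of this politician
    return None


print(closest_match("Pawes Parkop"))  # flags 'Powes Parkop'
print(closest_match("Peter Oneil"))   # flags "Peter O'Neill"
```

An exact match returns None, since the concern here is accounts that are deliberately close to, but distinguishable from, the real name.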
Others connected to larger networks of fake accounts, largely francophone, tagging each other in posts such as, ‘Who wants to be my next millionaire’.
Beyond the phishing, though, we found strong indications of organised, politically motivated activity using inauthentic accounts to impersonate Papua New Guinean incumbent MPs.
The key indicators that the efforts were organised centred on the frequency and similarity of account creation behaviours, including:

- Packs of similarly configured pages created in quick succession, appearing faster than we could add them to our database. Newly set up channels used the names and likenesses of PNG MPs and carried Facebook’s ‘New Facebook User’ warning, although it is unclear how long such a warning is maintained. We could see roughly four new accounts being created daily in March and April, though we suspect the actual number and frequency were higher.

- Shared use of the same set of open-source images as profile and cover pictures, sometimes in distinctively distorted proportions.
We also concluded that the actor(s) were actively seeking to interact with voters, friend-requesting PNG nationals and testing different combinations of images in profile and cover photos to gauge success in gaining a foothold in the information landscape – and some were successful.
One now-removed account purporting to represent Powes Parkop amassed 3,600 friends.
Another, claiming to represent the prime minister, using images of Marape with Australia’s former foreign minister Julie Bishop, built a network of more than 200 friends in less than two weeks before being taken down.
While we could also see these accounts being taken down reasonably quickly, we recorded more than 45 suspicious accounts in total over the course of the study.
At last check, about 20 of these were still visible.
It is concerning that Facebook did not appear to focus significantly on inauthentic activity purporting to be electoral candidates in PNG in the six months leading up to a pivotal election. The platform’s influence there is more significant than ever before, and it has previously been rebuked for ineffective moderation in PNG.
Among the accounts using PM Marape’s likeness, we also found accounts that had been hacked, with their screen name and profile pictures changed (thereby avoiding the ‘newly created account’ warning), and one that had originally been established with the name and likeness of the prime minister of Kuwait and was later changed to represent PM Marape.
The hacking capability, and the use of the same account to impersonate a political actor on the other side of the world, point strongly to direction by an actor with a particular interest in global political affairs. These accounts have since been taken down following our reports.
Australia’s former defence minister Peter Dutton has publicly shared intelligence advice that foreign interference in PNG ‘is at a record level’.
Given Facebook’s domination of the social media landscape in PNG, the concerns raised by this small sample of data gathered in our study are clear: Facebook’s resourcing of interference prevention on the platform ahead of PNG’s 2022 election appears unfocussed and underwhelming.
This has left the election of a nation with low levels of cyber safety and digital literacy vulnerable at a critical time for its democracy.
Caitlyn McKenzie joined DT Global as a strategic engagement adviser after a series of development aid and other roles within the Australian government. Dr Ben Connable is the senior research advisor to DT Institute and adjunct professor of security studies at Georgetown University.
Link here to download the DT Institute study of incumbent MPs’ Facebook activity in the run-up to the election.