Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh

As Bangladesh gears up for its upcoming parliamentary elections, a study by Digitally Right has laid bare significant shortcomings in Meta’s political advertising enforcement. The study, titled “Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh,” published today (Monday), finds that numerous active political ads lacking the required disclaimers managed to slip through Meta’s detection systems.

The study also finds issues of over-enforcement, with non-political ads being inaccurately categorized as political, disproportionately affecting commercial entities. The mislabeled ads, ranging from product promotions to employment services, raise questions about the efficacy of Meta’s ad classification algorithms.

It also reveals instances of incomplete or vague information in disclaimers provided by advertisers that fall short of Meta’s transparency standards, hinting at potential gaps in the platform’s verification processes and leaving users in the dark about who funds political advertisements.

Political advertisements significantly shape democratic processes by influencing public perceptions of political systems and leadership, and they are the most dominant form of communication between candidates and voters before elections. Social media has further transformed electoral campaigns, enabling political actors to reach vast audiences through advertisements.

In January 2023, Bangladesh had 44.7 million social media users, representing 34.5% of the population aged 18 and above, according to Data Reportal. Among these users, Facebook accounted for the largest share, with 43.25 million. Among the major platforms, only Meta, through its Ad Library, offers disclosures on ads related to elections, politics, and social issues, which this study terms political ads. The Ad Library makes it possible to scrutinize the identity of advertisers, the amounts spent, and the content of such ads.

The study analyzed detected political ads and the related disclaimer information available in the Meta Ad Library over a one-year period, and ran keyword searches to identify active but undetected political ads.
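The detected-ads side of this method can be approximated against Meta’s Ad Library API. The sketch below is a minimal illustration, not the study’s actual pipeline: it assumes an approved Ad Library API access token, the parameter and field names (including `bylines` for the “paid for by” disclaimer) follow Meta’s public documentation but should be treated as illustrative, and the Bangla keywords are hypothetical examples.

```python
# Minimal sketch of pulling detected political ads for Bangladesh from the
# Meta Ad Library API. The access token and keywords are placeholders.
import requests

AD_ARCHIVE_URL = "https://graph.facebook.com/v18.0/ads_archive"
ACCESS_TOKEN = "YOUR_AD_LIBRARY_ACCESS_TOKEN"  # requires Ad Library API approval
KEYWORDS = ["নির্বাচন", "ভোট"]  # hypothetical examples: "election", "vote"

def fetch_political_ads(keyword: str) -> list[dict]:
    """Return political/issue ads reaching Bangladesh that match a keyword."""
    params = {
        "search_terms": keyword,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": '["BD"]',
        "ad_active_status": "ALL",
        # `bylines` carries the "paid for by" disclaimer text
        "fields": "id,page_name,bylines,ad_creative_bodies,"
                  "ad_delivery_start_time,impressions,spend",
        "limit": 100,
        "access_token": ACCESS_TOKEN,
    }
    resp = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])

if __name__ == "__main__":
    for kw in KEYWORDS:
        ads = fetch_political_ads(kw)
        print(f"{kw}: {len(ads)} detected political ads")
```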

Political but Undetected

Under-enforcement undermines the very purpose of political ad transparency: when detection fails, significant portions of political advertising on the platform go unreported and unlabeled.

The study finds 50 active ads with clear political messages, almost half of which came straight from political figures and parties, highlighting under-enforcement. Over 90% of the undetected ads prominently displayed political party names or political figures in their text, and 72% included photos of political leaders or party symbols. These ads were found through keyword searches run on just four days within the research period.

Screenshots from the report

According to the findings, identical ads were identified as political when shared by certain pages but not when posted by others. In at least five cases, political demands featured on ostensibly unrelated pages bypassed detection. The study also finds delays in identifying and flagging political ads, with instances of ads running for prolonged periods without disclaimers. For instance, an ad conveying a political message remained undetected for a staggering 372 days despite accumulating a million impressions.

Advertisers Evading Transparency

The study analyzed 314 “paid for by” disclaimers available in the Ad Library and found that nine advertisers, including a mayoral election candidate and two politicians, didn’t submit any required transparency information yet were allowed to display political ads.

Meta’s advertising guidelines require advertisers to provide phone numbers, email addresses, websites, and location addresses that are functional and correct at all times. However, the study finds that 80% of the 314 disclaimers contained ambiguous or insufficient address information, with 47% listing only a district name. Only 17% had complete and operational addresses, and 58% used a Facebook page URL in lieu of a website.

Screenshots from the report

“Paid For By” disclaimers are at the core of ad transparency. They are crucial for users because they reveal who is funding and supporting a particular advertisement, allowing voters to understand the interests and affiliations behind the messages they encounter. The study’s findings imply that once disclaimer information is provided, Meta makes inadequate effort to verify its “functionality” and “correctness”.
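As a minimal sketch, and not a description of the study’s or Meta’s actual checks, the snippet below shows one way such “functionality” could be spot-checked: classify whether a disclaimer’s website field is merely a Facebook page URL and test whether it responds. The field names and the sample record are hypothetical.

```python
# Spot-check sketch for disclaimer "website" fields (hypothetical schema).
import re
import requests

def check_website(url: str) -> dict:
    """Classify a disclaimer's website field and test whether it responds."""
    is_facebook_page = bool(re.match(r"https?://(www\.)?facebook\.com/", url))
    try:
        reachable = requests.head(url, timeout=10, allow_redirects=True).status_code < 400
    except requests.RequestException:
        reachable = False
    return {
        "url": url,
        "facebook_page_instead_of_website": is_facebook_page,
        "reachable": reachable,
    }

if __name__ == "__main__":
    disclaimer = {  # hypothetical disclaimer record
        "paid_for_by": "Example Advertiser",
        "website": "https://www.facebook.com/examplepage",
    }
    print(check_website(disclaimer["website"]))
```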

Non-Political but Detected

The study analyzed 1,420 advertisement samples from the Ad Library posted by pages in non-political categories (i.e., commercial, news and media, and other categories) and found that about 25% of them were incorrectly detected as political. The highest rate of false positives (ads erroneously flagged as political), at 43%, occurred among commercial pages, making that category the most adversely affected by over-enforcement.
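The per-category breakdown is a simple rate calculation. The sketch below assumes the sampled ads were hand-labelled and exported to a CSV with hypothetical columns `page_category` and `flagged_political`; the file name and column names are illustrative.

```python
# Compute overall and per-category false-positive rates from a labelled sample.
import pandas as pd

df = pd.read_csv("nonpolitical_page_ads_sample.csv")  # hypothetical file

# False positives: ads from non-political pages that Meta detected as political.
overall = df["flagged_political"].mean()
per_category = (
    df.groupby("page_category")["flagged_political"]
      .mean()
      .sort_values(ascending=False)
)

print(f"Overall false-positive rate: {overall:.1%}")
print(per_category.map("{:.1%}".format))
```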

Screenshots from the report

The study identified mis-detections where ads from commercial pages owned by political figures were inaccurately labeled as political. Even simple product promotions from companies owned by political figures faced incorrect categorization.

Ads promoting the sale of guidebooks, textbooks, and novels, as well as services related to employment, studying abroad, and visa applications, were detected as political, seemingly because they were treated as touching on social issues. This highlights the need for a more precise definition of social issues for Bangladesh.

Commercial pages encountered keyword-related problems: seemingly innocuous terms like ‘Minister’ triggered the misclassification of electronic appliance advertisements and marriage matchmaking services as political ads.

Keywords such as ‘winner’ and references to specific events further contributed to the misclassification of ads, underscoring the complexities in accurately categorizing content on the platform.
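To illustrate this failure mode, the toy classifier below flags any ad containing a term from a fixed keyword list. It is not Meta’s actual system; the keyword list and sample ad texts are invented. It simply shows how a brand name like ‘Minister’ or a promotional word like ‘winner’ is enough to mislabel a commercial ad when context is ignored.

```python
# Toy illustration (not Meta's classifier) of naive keyword-based detection.
POLITICAL_KEYWORDS = {"minister", "election", "vote", "winner"}  # invented list

def naive_is_political(ad_text: str) -> bool:
    """Flag an ad as political if any token matches the keyword list."""
    tokens = {token.strip(".,!?").lower() for token in ad_text.split()}
    return bool(tokens & POLITICAL_KEYWORDS)

sample_ads = [
    "Minister brand refrigerators now at 20% discount",  # appliance promotion
    "Be the lucky winner of our Eid shopping raffle",     # retail promotion
    "Vote for our candidate in the upcoming election",    # genuinely political
]
for ad in sample_ads:
    print(naive_is_political(ad), "-", ad)
# All three ads are flagged, although only the last one is political;
# context-aware review is needed to tell a brand name from a cabinet post.
```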

Recommendations

The study’s findings underline crucial areas requiring attention and improvement to ensure transparency in online political advertising in Bangladesh.

Recommendations from the study, informed by expert interviews, highlight the need for Election Commission-mandated regulations, regular audits of keywords by digital platforms, and stronger collaboration among stakeholders for effective oversight.

Here are some of the key recommendations:

  • Social media platforms such as Meta must conduct regular audits of keywords tailored to the Bangladeshi context to ensure accurate classification of political ads, and should collaborate with stakeholders, including the Election Commission (EC), election watchdogs, researchers, and civil society, for context-specific insights and evaluation of the effectiveness of these audits.
  • To foster transparency, the Election Commission or the government must require all social media platforms to disclose political ad data specific to Bangladesh, along with policies on what information should be made available and how.
  • To clarify regulations for online advertising and campaigning in Bangladesh, the Election Commission (EC) guidelines must explicitly include provisions for overseeing online political campaigns and allow parties to self-disclose all of their official pages.
  • Meta should collaborate with local stakeholders to establish and publicly disclose a comprehensive list of relevant social issues specific to Bangladesh.
  • Meta must display and archive not only political ads but all ads uniformly across all jurisdictions, as it does in the EU to comply with the Digital Services Act.
  • Social media platforms, including Meta, must invest in more human reviewers with specific knowledge of the local political landscape and language to ensure the effective review of political ads.

This study is an integral component of Digitally Right’s ongoing commitment to exploring the intersection of technology and its impact on society. Digitally Right, a research-focused company at the convergence of media, technology, and rights, provides essential insights and solutions to media, civil society, and industry to help them adapt to a rapidly changing information ecosystem.

The full report is available here.

Call for Applications: Digital Verification Fellowship 2023

Digitally Right, under its Dismislab project, is excited to launch the call for applications for the Digital Verification Fellowship 2023. This fellowship is open to graduate students and early-career journalists based in Bangladesh.

This program is a 3-month paid fellowship, with certificates, for individuals dedicated to advancing the understanding of online disinformation research and fact-checking within and outside newsrooms. It provides a unique platform to gain hands-on experience, collaborate with experts, and contribute to tackling mis/disinformation. High performers will also have the prospect of working full-time for Dismislab, a bilingual information lab working on online verification and media research.

Key Goals of the Fellowship:

  1. Apprehend the Global and Local Digital Information Ecosystem: Dive deep into the world of digital information to understand its global and local dynamics.
  2. Understand Information Disorder: Explore the various facets of information disorder, including disinformation, misinformation, and malinformation.
  3. Grasp Verification of Online Information: Master the art of verifying digital information using the latest tools and techniques.
  4. Use the Latest Tools and Techniques to Verify Mis/Disinformation: Develop practical skills to identify and counteract mis- and disinformation in digital content.
  5. Institute a Culture of Fact-Checking in Organizations: Work on projects and initiatives to promote fact-checking and accuracy in media organizations.

Important Dates:

  •  Application Deadline: Oct 26, 2023
  •  Fellowship Start Date: Nov 15, 2023
  •  Fellowship End Date: Feb 15, 2024

To apply for the Digital Verification Fellowship 2023, please click here

For inquiries or more information, please contact: editor@dismislab.com

Call for Applications: Bangladesh Tech Policy Fellowship 2023

Digitally Right is excited to launch the call for applications for The Bangladesh Tech Policy Fellowship 2023. This program is open to young and mid-career professionals in academia, law, journalism, tech and civic advocacy organizations.

In an era where digital and physical realities are intertwined, the influence of technology policies on our daily lives is profound. From government legislation to platform policies to international principles, these policies shape how we express ourselves, access information, conduct business, and innovate. The fellowship offers an opportunity to explore the impact of technology on our lives and the resulting implications for policies and policymakers.

The Fellowship is designed to empower participants with the knowledge and skills needed to understand global and local trends in technology policy. It includes in-depth training, expert mentoring, and the opportunity to conduct research on a policy issue. Participants will contribute to the discourse by producing research papers, policy briefs, or articles, and will receive a stipend to cover research expenses.

Be part of a transformative journey!  

Fellowship Topics

Applicants interested in this fellowship will be required to submit research proposals as a crucial part of the application process. The excellence of these proposals will play a vital role in the selection of the most qualified candidates. While there is a wide array of tech policy topics to consider, this fellowship concentrates on four primary areas of focus.

  • Online Safety and Security
  • Content Governance and Free Expression
  • Equal and Open Internet Access 
  • Privacy and Data Protection

What’s in it for you?

  • Attend a 4-day in-person workshop focusing on tech policies.
  • Participate in 3 virtual in-depth sessions with global experts.
  • Produce and publish research papers/articles under expert mentoring.
  • Showcase your work to an esteemed audience.
  • Become part of a regional network of tech policy enthusiasts.
  • Receive a modest stipend throughout the 5-month program.

Eligibility

The Tech Policy Fellowship is designed for individuals who envision professional development opportunities in the fields of tech policy, digital rights advocacy, and civic technology. We encourage applications from any Bangladeshi who has an interest in these areas, and the selection process will primarily consider the quality of the research proposal and the applicant’s personal commitment. However, preference will be given to candidates who meet the following criteria:

  • Early and mid-career professionals.
  • Backgrounds in law, journalism, academia, or civic advocacy are highly desirable.
  • Age within the range of 26 to 35 years.
  • Demonstrated relevance of the program to the candidate’s professional work.

Eight exceptional candidates will be chosen based on the strength of their applications. Notifications regarding the selection will be sent out by November 7th.

Apply now! 

If you find yourself intrigued, possess a compelling research idea, and believe that participating in this fellowship can have a meaningful impact on both your career and the broader domain of tech policy, we strongly encourage you to submit an application. 

📆 Application Deadline: November 1, 2023

🔗 Registration Link:  https://shorturl.at/deWY3

Disinformation Risk Assessment: The Online News Market in Bangladesh

Digitally Right has partnered with the Global Disinformation Index (GDI) to launch a pioneering report titled “Disinformation Risk Assessment: The Online News Market in Bangladesh”. The report provides insights on the dangers of disinformation in Bangladesh’s media industry, based on a study of 33 news domains.

The report was unveiled during an online event on March 28th, 2023, which included presentations from the Research Director at the Global Disinformation Index, and Digitally Right’s Founder, Miraj Ahmed Chowdhury. The event featured Shafiqul Alam, Bureau Chief at AFP, Dhaka, Ayesha Kabir, Head of English, Prothom Alo, Talat Mamun, Executive Director, Channel 24, and Saiful Alam Chowdhury, Associate Professor, Dhaka University, who shared their insights on the report.

The report presents GDI’s findings on disinformation risks in the media market of Bangladesh, based on a study of 33 news domains, and aims to provide an overview of the media market as a whole and its strengths and vulnerabilities.

The assessment found that all 33 domains had a medium to high risk of disinforming their users, including respected sites known for their independent news coverage. Sixteen sites had a high disinformation risk rating, and half of the sample had a medium risk rating. However, no site performed so poorly as to earn a maximum risk rating.

According to the findings, the main source of disinformation risk on Bangladeshi media sites is the lack of transparent operational checks and balances. While all sites scored strongly in presenting unbiased, neutral, and accurate articles, 28 sites had no form of accuracy policy on their websites. Most sites lacked policies for editorial checks and balances, including post-publication corrections, comment moderation, byline information, fact-checking, and sourcing, as well as clarity on funding and ownership structures.

The report highlights operational shortcomings that hamper trust and transparency in the media industry. Its recommendations urge news sites to adopt, and make transparent, universal standards of good journalistic practice, such as publishing beneficial ownership and funding information, maintaining a corrections policy, and publishing byline and sourcing policies.

The findings show that many of the operational issues afflicting Bangladeshi websites can easily be fixed by adopting, and making transparent, the universal standards of good journalistic practice agreed upon by the Journalism Trust Initiative.

This risk-rating framework for Bangladesh provides crucial information to help decision-makers stem the tide of money that incentivizes and sustains disinformation, and the report aims to encourage greater transparency and accountability in Bangladesh’s media landscape.

Download the report here.

Report Launch: Disinformation Risks and Bangladesh Online News Market

Digitally Right and the Global Disinformation Index (GDI) are launching a study into the disinformation risks on digital news platforms in Bangladesh. The study assesses editorial and operational transparency across 33 Bangladeshi news domains, creating an index that media outlets and advertisers can use to defund disinformation.

The virtual launch event for the report is scheduled for 10:30 am on March 28th, 2023, and it will be held on Zoom. The event will feature an expert panel of media editors, researchers, and representatives from GDI and Digitally Right. To register for the event, please visit the following link: https://zoom.us/webinar/register/WN_0hTiIud-RHelA54ep1C0pw

The expansion of news into the online world has exposed the industry to new risks of disinformation, which can be financially rewarding for news websites. Disinformation can have harmful consequences, such as disrupting society’s shared sense of accepted facts and undermining public health and safety.

To combat ad-funded disinformation, GDI provides independent, trusted, and neutral ratings of news domains’ risk of disinforming their readers. These ratings can be used by advertisers, ad tech companies, and platforms to redirect their online ad spending in line with their brand safety and disinformation risk-mitigation strategies.

In 2022, GDI rated the disinformation risks of news sites in 20 countries around the world. Digitally Right, a Dhaka-based media and technology research company, collaborated with GDI to conduct the online news market assessment in Bangladesh.

We invite you to the launch event of the report titled “Disinformation Risk Assessment: The Online News Market in Bangladesh,” and a panel discussion that focuses on the risk of disinformation, trust in media, and their impact on media sustainability. 

EngageMedia and Digitally Right Co-Host Asia-Pacific Digital Rights Forum Solidarity Event in Dhaka

Digitally Right, in partnership with EngageMedia, co-hosted a solidarity event of the Asia-Pacific Digital Rights Forum on January 14, 2023, in Dhaka. The event was attended by around 30 academics, journalists, lawyers, feminists, researchers, and activists from different communities.

It coincided with four other simultaneous solidarity events in Bangkok, Jakarta, Kuala Lumpur, and Manila that aimed to provide spaces for changemakers to meet each other and build stronger regional solidarity.

The Dhaka event featured sessions on online media freedom, hate speech, online gender-based violence (OGBV), and a focus group discussion (FGD) on the use of open and secure technology. Toward the end of the day, the group shared their learnings with their peers in the four other solidarity events through a virtual video call.

The event started with a session on online media freedom and censorship, which tackled issues ranging from the key actors involved in restricting free speech to discussions on reasonable restrictions to speech without hindering fundamental rights.

The second session on online hate speech and OGBV opened discussions on feminist politics, social justice movements, and backlash against gender equality in the digital space. Participants also talked about the reality and impact of online hate speech and gender-based violence in the Global South, especially in Bangladesh. According to a November 2021 survey, cases of OGBV increased during the pandemic, with 63.51 percent of 359 respondents saying they faced online violence.

The solidarity event concluded with an FGD to assess the awareness of civil society organizations, human rights defenders, and digital rights advocates in Bangladesh regarding online threats and surveillance. Participants discussed their understanding of ways to mitigate these threats using open and secure technology – such as by using virtual private networks, encrypted password managers, and secure messaging apps. The discussions also scrutinized the reasons behind the use or non-use of such technologies and how changemakers can work in more secure, ethical, and sustainable ways.