AI and Human Rights: Concerns Raised at Draft National AI Policy Workshop

On April 29, Digitally Right, in collaboration with ICNL, held a workshop focused on legislative issues titled “Understanding the Draft National AI Policy.” The session was attended by 17 participants, including representatives from media outlets, environmental organizations, human rights defenders, and gender activists. The workshop aimed to raise awareness, strengthen advocates’ capacity, encourage in-depth discussions, and develop strategies to influence changes to the Draft Policy. Shabnam Mojtahedi, Legal Advisor for Digital Rights at ICNL, delivered a presentation on the Draft AI Policy and its implications, followed by a discussion on strategies for engaging with the government on the issue. Participants also shared their individual concerns regarding the Policy.

During the workshop, ICNL’s presentation highlighted common concerns about AI and its potential impact on human rights. Shabnam then discussed best practices for consulting and engaging with the government on AI-related issues. She also pointed out that the lack of a globally accepted definition of AI has resulted in the Draft Policy being vague and broad. Since no jurisdiction has established standalone AI laws, there are no clear precedents for treating AI as a separate legal issue.

Participants at the workshop agreed that the government should base the AI Policy on international best practices, with civil society organizations (CSOs) playing a key role in bridging the gap between the government and global partners. They emphasized the need for effective and diverse engagement in the policy process. It was noted that the Draft Policy lacked impact assessment, monitoring, and evaluation mechanisms, which need to be addressed. While national security carve-outs are common in AI policies worldwide, participants stressed the importance of strong oversight in this area to prevent data abuse.

Following the presentation and Q&A session, participants held an open discussion to identify strategies for influencing the government to amend the Draft AI Policy. Three strategies emerged. The first is to file a Right to Information (RTI) request, seeking details on which CSO representatives were involved in the consultation and drafting process, and to challenge the legality of the Policy, as it is required by law to be drafted in Bangla rather than English. The second strategy involves rallying public support to delay the Policy’s implementation, eventually pushing for its repeal on the grounds that it is not functional. Some participants suggested pursuing both strategies simultaneously.

The third strategy proposed is the creation of a national alliance consisting of CSOs and AI experts. This alliance would engage with grassroots communities, gather their feedback, and present it to international forums.

Digitally Right Hosts First Digital Safety Training of 2024 Under GIF Project

Digitally Right hosted its first Digital Safety Training under the fourth year of the Greater Internet Freedom (GIF) project, a global initiative supported by USAID, on February 10-11, 2024. As part of this effort, Digitally Right collaborated with Internews and EngageMedia to implement the program in Bangladesh. The training was attended by 12 journalists (8 men and 4 women), who spent two days learning critical skills for navigating the digital space securely.

The training began with an ice-breaking and introductory session, followed by an overview of the internet, its functioning, and the associated risks. Participants voiced their security concerns, including fears of being surveilled or having their locations tracked through digital devices. These concerns set the stage for the next session, which focused on information security, both online and offline.

After each session, participants practiced using the relevant tools, allowing them to apply what they had learned. The session on tools for safely capturing photos and videos was of particular interest to the group, marking the conclusion of the first day of training.

The second day started with a recap of the previous day's lessons. Participants then learned about securing communication over both email and mobile phones. They found the mobile phone security training more relevant, however, as they generally do not use email for confidential information.

Participants then took part in a group activity where they were presented with a scenario and tasked with creating an action plan using the tools they had learned. Following this, they were introduced to the risks associated with digital footprints and instructed on how to protect their privacy while browsing the internet. The session ended with a review of the entire training.

Tech Policy Fellowship 2023: Meet the Fellows

Eight young professionals from law, academia, media, and civil society organizations in Bangladesh have joined Digitally Right in its flagship annual Tech Policy Fellowship program of 2023. The diverse cohort of fellows will explore a range of issues arising from technology and related policies that affect different communities.

The six-month program provides a unique opportunity to gain a deeper understanding of technology-related policies in Bangladesh and the region, along with their implications for everyday life. The goal is to develop the next generation of advocates to shape the legal and policy domain in Bangladesh for a more open and free internet space.

The fellowship includes residential training, deep-dive sessions, mentoring, and research production on a contemporary issue related to technology and society. It also facilitates engagement and networking with the global and regional digital rights community. Access Now, an international non-profit promoting digital rights worldwide, is a knowledge partner in this fellowship program.

The fellowship program commenced in January of this year with a five-day residential training in Kathmandu, Nepal, in collaboration with Access Now. Representatives and experts from six organizations in civil society, technology, and academia joined the fellows to discuss the legal and policy environment shaping internet governance in Bangladesh and beyond.

Each fellow will pursue research under this fellowship program in the next couple of months and will have the opportunity to share their work with the digital rights community in the region and globally.

Meet the Fellows

Dilshad Hossain Dodul, a Senior Lecturer in the Media Studies and Journalism Department of the University of Liberal Arts, has a broad research focus on digital media, migration, and political communication. With a background in Mass Communication and Journalism, including international exposure through the Erasmus Plus Scholarship, Dodul has worked on impactful projects promoting digital awareness and safety, particularly among teenagers and female journalists.

Nazia Sharmin, a Course Contract Faculty at BRAC University, focuses on Digital Sociology, emphasizing socio-cultural aspects of misinformation. With a master’s degree from Stockholm University, Nazia believes that effective tech policies are crucial for defending society against the threats posed by misinformation.

Nowzin Khan, a graduate from BRAC University’s School of Law, is currently pursuing a master’s degree in Peace and Conflict at Dhaka University. An independent researcher with a keen interest in the convergence of technology, governance, and human rights, Nowzin aspires to contribute to comprehensive and unbiased tech policies through active engagement in the Tech Policy Fellowship.

Nusrat Jahan Nishat currently serves as an Advocacy Manager at ANTAR, Dhaka. With a background in law, she has actively participated in Moot Court Competitions and contributed to human development training programs, workshops, and human rights-related research. Nusrat aims to explore various facets of digital technology and policy issues during her fellowship.

Rashad Ahamad is a journalist with over 13 years of experience reporting on labor and human rights. Currently working at ‘New Age,’ Rashad has covered a wide range of topics, including migration, disinformation, health, climate, and labor rights. His commitment to public-interest reporting has earned him awards and fellowships, and he remains interested in the intersection of data, labor rights, and technology.

Shoeb Abdullah is a fact-checker, researcher, and digital rights advocate. Currently engaged with Internews, Shoeb has spent nearly five years combating disinformation and advancing digital rights. His expertise includes internet freedom, digital safety, and anti-internet-shutdown activism.

Suhadha Akter Afrin, a staff reporter at Prothom Alo, is a technology journalism enthusiast covering ICT and digital security issues. Actively following upcoming digital laws and policies, she is keen to understand their impact on society. Suhadha aims to keep people informed about developments in the digital sector.

Suparna Roy, a legal professional dedicated to justice and equality, serves as a Legal Expert to ensure basic human rights for SOGIESC and marginalized communities in Bangladesh. Beyond her professional roles, Suparna is a passionate human rights activist advocating for gender equality and societal transformation.

Research Papers from the Tech Policy Fellowship 2022

With the rapid rise of the internet and the adoption of new technologies in our everyday lives, technology-related laws and policies are emerging at home and across the world. To understand the trends in these policies and their role and impact on society, Digitally Right launched the Bangladesh Tech Policy Fellowship 2022. The programme aimed to empower citizens with the essential skills and knowledge to become advocates for a free and open digital space.

The fellowship spanned six months, during which fellows from diverse backgrounds in academia, law, media, and civic advocacy engaged in thorough training and virtual deep dives, and produced research papers under expert supervision.

In their research, the fellows focused on different areas of concern. One paper examined how today's youth interact with the internet and how aware they are of the rights and liabilities that come with it, while another looked at how Rohingya refugees use the internet and the difficulties they face in doing so. A third found that existing election laws are insufficient to address the risks stemming from the digital space.

Two of our fellows assessed the impact of the draft Data Protection Bill, which would make data localization mandatory, on internet-driven businesses. Another timely paper flags concerns around Facebook's Bangla content moderation, given the language's ethno-linguistic diversity.

We hope this publication and the reports by our fellows contribute to the existing knowledge and inform arguments, discussions, and debates for future actions in the tech policy space.

Read the papers here.

Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh

As Bangladesh gears up for its upcoming parliamentary elections, a study by Digitally Right has laid bare significant shortcomings in Meta’s political advertising enforcement. The study, titled “Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh,” published today (Monday), finds that various active political ads lacking required disclaimers managed to slip through Meta’s detection systems.

The study also finds issues of over-enforcement, with non-political ads facing inaccurate categorization, disproportionately affecting commercial entities. The inadvertent mislabeling, ranging from product promotions to employment services, prompts questions about the efficacy of Meta’s ad classification algorithms.

It also reveals instances of incomplete or vague information in the disclaimers provided by advertisers, which falls short of Meta’s transparency standards and hints at potential gaps in the platform’s verification processes, leaving users in the dark about the sources funding political advertisements.

Political advertisements significantly impact democratic processes by shaping public perceptions of political systems and leadership, and they function as the most dominant form of communication between voters and candidates before elections. Social media, in turn, has transformed electoral campaigns, enabling political actors to reach vast audiences through advertisements.

In January 2023, Bangladesh had 44.7 million social media users, representing 34.5% of the population aged 18 and above, as reported by Data Reportal. Among these users, Facebook stood out with the highest number, accounting for 43.25 million. Among the major platforms, only Meta, in its Ad Library, offers disclosures on ads related to elections, politics, and social issues, which this study terms political ads. This allows scrutiny of the identity of advertisers, the amounts spent, and the content of such ads.

The study analyzed detected political ads and the related disclaimer information available in the Meta Ad Library over a one-year period, and ran keyword searches to identify active but undetected political ads.

Political but Undetected

This under-enforcement undermines the very purpose of political ad transparency: when detection fails, significant portions of political advertising on the platform go unrecorded.

The study finds 50 active ads with clear political messages; almost half came straight from political figures and parties, highlighting under-enforcement. Over 90% of the undetected ads prominently displayed political party names or political figures in the text, and 72% included photos of political leaders or party symbols. These ads were found through keyword searches conducted over just four days within the research period.

Screenshots from the report

According to the findings, identical ads were flagged as political when shared by certain pages but not when posted by others. In at least five cases, political demands featured on ostensibly unrelated pages bypassed detection. The study also finds delayed responses in identifying and flagging political ads, with instances of ads running for prolonged periods without disclaimers. One ad conveying a political message, for instance, remained undetected for a staggering 372 days despite accumulating a million impressions.

Advertisers Evading Transparency

The study analyzed 314 “paid for by” disclaimers available in the Ad Library and found that nine advertisers, including a mayoral election candidate and two politicians, didn’t submit any required transparency information yet were allowed to display political ads.

Meta’s advertising guidelines require advertisers to provide phone numbers, email addresses, websites, and location addresses that are functional and correct at all times. However, the study finds that 80% of the 314 disclaimers had ambiguous or insufficient address information, with 47% using only the district name. Only 17% had complete and operational addresses, and 58% used a Facebook page URL in lieu of a website.

Screenshots from the report

“Paid for by” disclaimers are at the core of ad transparency and crucial for users: they provide essential information about who is funding and supporting a particular advertisement, allowing voters to understand the interests and affiliations behind the messages they encounter. The study’s findings imply that once disclaimer information is provided, Meta makes inadequate effort to verify its “functionality” and “correctness.”

Non-Political but Detected

This study analyzed 1,420 advertisement samples from the Ad Library that were posted by pages in the non-political category and found that about 25% of the advertisements from non-political pages (i.e., commercial, news and media, and other categories) were incorrectly detected as political. The highest rate of false positives (i.e., ads erroneously identified), at 43%, occurred in the ‘commercial’ pages category, indicating that it was the most adversely impacted by over-enforcement.

Screenshots from the report

The study identified mis-detections where ads from commercial pages owned by political figures were inaccurately labeled as political. Even simple product promotions from companies owned by political figures faced incorrect categorization.

Ads promoting the sale of guidebooks, textbooks, and novels, as well as services related to employment opportunities, studying abroad, and visa applications, were detected as political, seemingly because they were deemed to touch on social issues. This highlights the need for a more precise classification of social issues for Bangladesh.

Commercial pages encountered challenges stemming from keyword-related issues, where seemingly innocuous terms like ‘Minister’ triggered the misclassification of electronic appliance advertisements and marriage matchmaking services as political ads.

Keywords such as ‘winner’ and references to specific events further contributed to the misclassification of ads, underscoring the complexities in accurately categorizing content on the platform.
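To illustrate why context-blind keyword matching produces false positives of this kind, here is a minimal, hypothetical sketch in Python. The keyword list and example ads are invented for illustration only; Meta's actual classifier is not public and is certainly far more sophisticated than this.

```python
# Hypothetical, simplified keyword-based ad classifier.
# The keyword list and ad texts are illustrative inventions,
# not Meta's actual (non-public) detection logic.
POLITICAL_KEYWORDS = {"minister", "winner", "election", "vote"}

def is_flagged_political(ad_text: str) -> bool:
    """Flag an ad as political if any keyword appears, ignoring context."""
    words = ad_text.lower().split()
    return any(word.strip(".,!?'\"") in POLITICAL_KEYWORDS for word in words)

# A plainly commercial appliance ad is misclassified because "Minister"
# (here a brand name) matches a political keyword:
print(is_flagged_political("Minister brand refrigerators now 20% off!"))  # True

# Meanwhile, a political message phrased without any listed keyword
# slips through undetected:
print(is_flagged_political("Join our rally for change this Friday"))  # False
```

The sketch shows both failure modes the study documents: over-enforcement when an innocuous term matches, and under-enforcement when political content avoids the keyword list.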

Recommendations

The study’s findings underline crucial areas requiring attention and improvement to ensure transparency in online political advertising in Bangladesh.

Recommendations from the study, informed by expert interviews, highlight the need for Election Commission-mandated regulations, regular audits of keywords by digital platforms, and stronger collaboration among stakeholders for effective oversight.

Here are some of the key recommendations:

  • Social media platforms, such as Meta, must conduct regular audits on keywords tailored to the Bangladeshi context to ensure accurate classifications of political ads and collaborate with stakeholders, including the Election Commission (EC), election watchdogs, researchers, and civil society for context-specific insights and evaluations for the effectiveness of these audits.
  • To foster transparency, the Election Commission or the government must require all social media platforms to disclose Bangladesh-specific political ad data, along with policies on what information should be made available and how.
  • To clarify regulations for online advertising and campaigning in Bangladesh, the Election Commission (EC) guidelines must explicitly include provisions for overseeing online political campaigns and allowing parties to self-disclose all their official pages.
  • Meta should collaborate with local stakeholders to establish and publicly disclose a comprehensive list of relevant social issues specific to Bangladesh.
  • Meta must display and archive all ads, not just political ones, uniformly across all jurisdictions, as it already does in the EU to comply with the Digital Services Act.
  • Social media platforms, including Meta, must invest in more human reviewers with specific knowledge of the local political landscape and language to ensure the effective review of political ads.

This study is an integral component of Digitally Right’s ongoing commitment to exploring the intersection of technology and its impact on society. Digitally Right, a research-focused company at the convergence of media, technology, and rights, provides essential insights and solutions to media, civil society, and industry to help them adapt to a rapidly changing information ecosystem.

The full report is available here.