by digiright | Jul 3, 2024 | Digital Security, Our work, Updates
On 8 June 2024, Digitally Right organized a workshop titled “Digital Security Workshop: Mitigating Online Gender-Based Violence.” The event is part of the Greater Internet Freedom (GIF) project, a USAID-supported program that is the largest global effort dedicated to advancing internet freedom. Digitally Right partners with Internews and EngageMedia to implement GIF in Bangladesh.
The workshop was attended by 16 participants, including representatives of various CSOs and gender rights activists. Ten of the 16 belong to the Cyber Support for Women and Children platform, an alliance of 13 organizations that safeguards women and children from online violence, which made the workshop highly relevant and useful for the group. The other six participants belong to the OGNIE Foundation, a gender-focused non-profit organization based in Bangladesh.
The workshop started with an ice-breaking session and took a participant-led approach, with the participants themselves setting the discussion points. Through an interactive activity, they teamed up to identify the issues women, in their experience, commonly face online and to pinpoint the topics they wanted to learn about during the workshop. The exercise revealed a concerning range of threats women face online, including image-based abuse (“revenge porn”), whether through intimate pictures and videos leaked without consent or photos doctored to appear obscene, as well as hacked social media accounts, identity theft, bullying, and doxing.
Based on the threats identified, the participants dove into a session on the risks associated with some common online practices and how to mitigate them. This was followed by a session on password management to prevent social media account hacking, the importance of activating two-factor authentication, encrypted communication options, and how to identify and avoid phishing attempts.
The workshop also included sessions on reporting and documenting harassing and bullying comments online, as well as a session on how various apps track our activities on the phone and how to stop them. The last session of the workshop consisted of a group activity in which the participants were divided into groups of four and given four different scenarios covering revenge porn, cyberbullying, identity theft, and account hacking. Each group worked out how it would handle its scenario and presented the solution to everyone else, helping participants connect and exchange ideas and strategies. The workshop concluded with a summary of the day’s events.
by digiright | May 27, 2024 | Our work, Updates
Following its launch on May 5, 2024, Digital Safety School successfully held its first two-day Digital Hygiene Training on 18-19 May. The training was attended by 13 journalists stationed in different parts of Bangladesh and started with the basic do’s and don’ts of maintaining digital safety.
On the first day, the participants learned about account and browser security, data encryption tools, how to transfer and delete data securely, and best practices for capturing photos and videos on the ground. The second day covered device and communication security, identifying malware and phishing attempts, and the use of fact-checking tools to identify fake images and videos. After each session, the participants had the opportunity to practice the tools they had learned about under the supervision of our security expert, which gave them more confidence in their ability to use the tools by themselves later on.
When the training concluded, one journalist expressed their gratitude for the opportunity and shared, “During Covid-19, I would constantly watch videos on social media to learn about safe digital practices and although I was able to learn about the risks of the digital space, I was none the wiser about safe practices. By attending this training, I am now aware of the risks as well as how to protect myself from the threats.”
Another participant shared, “During the session on using fact-checking tools to identify fake images and videos, I thought that it would not be useful for me. But once the session ended I realized that the session was relevant for me and I did not even realize how much I needed it.”
by digiright | May 5, 2024 | Our work, Updates
Digitally Right is delighted to launch Digital Safety School. The objective of the school is to empower individuals and organizations with the skills to navigate current and emerging digital threats. Through this initiative, Digitally Right aims to support civil society, the media, and the private sector.
The school offers one free-access service and two on-demand services. It will organize a monthly two-day free-access training that interested candidates can apply for, administered by Digitally Right’s expert team. The school will also offer on-demand services such as training packages tailored to the needs of different organizations and digital safety audits for small to mid-sized organizations. In the near future, the school aims to launch a resource center providing tailored tips and advice on emerging digital trends and threats.
If you are a journalist, an editor, or part of the broader civil society, please apply for our free-access training using the form. The training is expected to be held in the third week of each month, and candidates will be selected on a rotational basis.
The Digital Safety School is a collaborative initiative, and we welcome partnerships from organizations that share our vision of a safer digital future. For inquiries regarding collaboration opportunities, please reach out to us via email at dss@digitallyright.org.
For further information and to stay updated on the latest developments, explore our website at https://digitalsafetyschool.com.
Join us on this journey towards a safer and more secure digital future.
by digiright | Feb 12, 2024 | Our work, Updates
Eight young professionals from law, academia, media, and civil society organizations in Bangladesh have joined Digitally Right’s flagship annual Tech Policy Fellowship program for 2023. The diverse cohort of fellows will explore various issues arising from technology and related policies that affect different communities.
The six-month program provides a unique opportunity to gain a deeper understanding of technology-related policies in Bangladesh and the region, along with their implications for everyday life. The goal is to develop the next generation of advocates to shape the legal and policy domain in Bangladesh for a more open and free internet space.
The fellowship includes residential training, deep-dive sessions, mentoring, and research production on a contemporary issue related to technology and society. It also facilitates engagement and networking with the global and regional digital rights community. Access Now, an international non-profit promoting digital rights worldwide, is a knowledge partner in this fellowship program.
The fellowship program commenced in January of this year with a five-day residential training in Kathmandu, Nepal, in collaboration with Access Now. Representatives and experts from six organizations in civil society, technology, and academia joined the fellows to discuss the legal and policy environment shaping internet governance in Bangladesh and beyond.
Each fellow will pursue research under this fellowship program in the next couple of months and will have the opportunity to share their work with the digital rights community in the region and globally.
Meet the Fellows
Dilshad Hossain Dodul, a Senior Lecturer in the Media Studies and Journalism Department of the University of Liberal Arts, has a broad research focus on digital media, migration, and political communication. With a background in Mass Communication and Journalism, including international exposure through the Erasmus Plus Scholarship, Dodul has worked on impactful projects promoting digital awareness and safety, particularly among teenagers and female journalists.
Nazia Sharmin, a Course Contract Faculty at BRAC University, focuses on Digital Sociology, emphasizing socio-cultural aspects of misinformation. With a master’s degree from Stockholm University, Nazia believes that effective tech policies are crucial for defending society against the threats posed by misinformation.
Nowzin Khan, a graduate from BRAC University’s School of Law, is currently pursuing a master’s degree in Peace and Conflict at Dhaka University. An independent researcher with a keen interest in the convergence of technology, governance, and human rights, Nowzin aspires to contribute to comprehensive and unbiased tech policies through active engagement in the Tech Policy Fellowship.
Nusrat Jahan Nishat currently serves as an Advocacy Manager at ANTAR, Dhaka. With a background in law, she has actively participated in Moot Court Competitions and contributed to human development training programs, workshops, and human rights-related research. Nusrat aims to explore various facets of digital technology and policy issues during her fellowship.
Rashad Ahamad is a journalist with over 13 years of experience reporting on labor and human rights. Currently working at ‘New Age,’ Rashad has covered a wide range of topics, including migration, disinformation, health, climate, and labor rights. His commitment to public-interest reporting has earned him awards and fellowships, and he remains interested in the intersection of data, labor rights, and technology.
Shoeb Abdullah is a fact-checker, researcher, and digital rights advocate. Currently engaged with Internews, Shoeb has spent nearly five years combating disinformation and advancing digital rights. His expertise includes internet freedom, digital safety, and anti-internet-shutdown activism.
Suhadha Akter Afrin, a staff reporter at Prothom Alo, is a technology journalism enthusiast covering ICT and digital security issues. Actively following upcoming digital laws and policies, she is keen to understand their impact on society. Suhadha aims to keep people informed about developments in the digital sector.
Suparna Roy, a legal professional dedicated to justice and equality, serves as a Legal Expert to ensure basic human rights for SOGIESC and marginalized communities in Bangladesh. Beyond her professional roles, Suparna is a passionate human rights activist advocating for gender equality and societal transformation.
by digiright | Dec 28, 2023 | Our work, Research
With the rapid rise of the internet and the adoption of new technologies in our everyday lives, we see different technology-related laws and policies emerge at home and across the world. To understand the trends in these policies and their role and impact on society, Digitally Right launched the Bangladesh Tech Policy Fellowship 2022. The programme aimed to equip citizens with the essential skills and knowledge to become advocates for a free and open digital space.
The fellowship spanned six months, during which fellows from diverse backgrounds in academia, law, media, and civic advocacy took part in intensive training and virtual deep-dives and produced research papers under expert supervision.
In their research, the fellows focused on different areas of concern. For instance, one paper examined how today’s youth interact with the internet and how aware they are of the rights and liabilities that come with it, while another focused on how Rohingya refugees are using the internet and the difficulties they face in doing so. Another finding was that the existing election laws are not sufficient to deal with the risks stemming from the digital space.
Two of our fellows assessed the impact of the draft Data Protection Bill, which is set to make data localization mandatory, on internet-driven businesses. Another timely piece of research flags concerns around Facebook’s Bangla content moderation given the language’s ethno-linguistic diversity.
We hope this publication and the reports by our fellows contribute to the existing knowledge and inform arguments, discussions, and debates for future actions in the tech policy space.
Read the papers here.
by digiright | Dec 25, 2023 | Our work, Research
As Bangladesh gears up for its upcoming parliamentary elections, a study by Digitally Right has laid bare significant shortcomings in Meta’s political advertising enforcement. The study, titled “Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh,” published today (Monday), finds that various active political ads lacking the required disclaimers managed to slip through Meta’s detection systems.
The study also finds issues of over-enforcement, with non-political ads being inaccurately categorized, disproportionately affecting commercial entities. The inadvertent mislabeling, which ranged from product promotions to employment services, raises questions about the efficacy of Meta’s ad classification algorithms.
It also reveals instances of incomplete or vague information in advertiser-provided disclaimers that fall short of Meta’s transparency standards, hinting at potential gaps in the platform’s verification processes and leaving users in the dark about the sources funding political advertisements.
Political advertisements significantly affect democratic processes by shaping public perceptions of political systems and leadership, and they function as the dominant form of communication between voters and candidates before elections. Social media, in turn, has transformed electoral campaigns, enabling political actors to engage a vast audience through advertisements.
In January 2023, Bangladesh had 44.7 million social media users, representing 34.5% of the population aged 18 and above, as reported by Data Reportal. Among these users, Facebook stood out with the highest number, accounting for 43.25 million. Among the major platforms, only Meta, through its Ad Library, offers disclosures on ads related to elections, politics, and social issues, which this study terms political ads. The Ad Library makes it possible to scrutinize the identity of advertisers, the amounts spent, and the content of such ads.
The study analyzed detected political ads and the related disclaimer information available in the Meta Ad Library over a one-year period, and ran keyword searches to identify active but undetected political ads.
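The report does not detail the exact tooling behind this analysis, but the detected-ad side of such a review can be scripted against Meta’s public Ad Library API (the Graph API’s ads_archive endpoint). Below is a minimal sketch under that assumption: the access token is a placeholder, the API version and the Bangla keyword are illustrative, and field names such as bylines (the “paid for by” text) follow Meta’s published documentation.

```python
# Minimal sketch: pulling Bangladesh-targeted political/issue ads that match a keyword
# from the Meta Ad Library API (Graph API "ads_archive" endpoint).
# Note: the Ad Library API requires identity confirmation and a valid access token.
import requests

ACCESS_TOKEN = "YOUR_AD_LIBRARY_ACCESS_TOKEN"  # placeholder
API_URL = "https://graph.facebook.com/v19.0/ads_archive"  # version is illustrative

params = {
    "access_token": ACCESS_TOKEN,
    "ad_type": "POLITICAL_AND_ISSUE_ADS",  # ads Meta has classified as political/issue
    "ad_reached_countries": '["BD"]',      # limit results to Bangladesh
    "ad_active_status": "ALL",
    "search_terms": "নির্বাচন",            # example keyword: "election" in Bangla
    "fields": ",".join([
        "id",
        "page_name",
        "ad_creative_bodies",
        "ad_delivery_start_time",
        "bylines",       # the "paid for by" disclaimer text
        "impressions",
        "spend",
    ]),
    "limit": 100,
}

resp = requests.get(API_URL, params=params, timeout=30)
resp.raise_for_status()

# Print a quick overview of each returned ad: who ran it, who paid, and when it started.
for ad in resp.json().get("data", []):
    print(ad.get("page_name"), "|", ad.get("bylines"), "|", ad.get("ad_delivery_start_time"))
```

Undetected political ads, by contrast, cannot be retrieved this way, since the API only returns ads Meta has already classified; identifying those requires keyword searches on the platform itself, as the study did.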
Political but Undetected
Under-enforcement undermines the entire purpose of political ad transparency: when the detection system fails, significant portions of political advertising on the platform go undisclosed.
The study found 50 active ads with clear political messages, almost half of which came directly from political figures and parties, highlighting under-enforcement. Over 90% of the undetected ads prominently displayed political party names or political figures in the text, and 72% included photos of political leaders or party symbols. These ads were found through keyword searches run on just four days within the research period.
Screenshots from the report
According to the findings, identical ads were identified as political when shared by certain pages but not when posted by others. Political demands featured on ostensibly unrelated pages bypassed detection in at least five cases. The study also finds a delayed response in identifying and flagging political ads, with instances of ads running for prolonged periods without disclaimers. For instance, one ad conveying a political message remained undetected for a staggering 372 days despite accumulating a million impressions.
Advertisers Evading Transparency
The study analyzed 314 “paid for by” disclaimers available in the Ad Library and found that nine advertisers, including a mayoral election candidate and two politicians, didn’t submit any required transparency information yet were allowed to display political ads.
Meta’s advertising guidelines require advertisers to provide phone numbers, email addresses, websites, and location addresses that are functional and correct at all times. However, the study finds that 80% of the 314 disclaimers had ambiguous or insufficient address information, with 47% using only the district name. Only 17% had complete and operational addresses, and 58% used a Facebook page URL in lieu of a website.
Screenshots from the report
“Paid for by” disclaimers are at the core of ad transparency and crucial for users because they provide essential information about who is funding and supporting a particular advertisement, allowing voters to understand the interests and affiliations behind the messages they encounter. The study’s findings imply that once disclaimer information is provided, Meta makes inadequate effort to verify the “functionality” and “correctness” of the information in those disclaimers.
Non-Political but Detected
This study analyzed 1,420 advertisement samples from the Ad Library posted by pages in non-political categories (i.e., commercial, news and media, and other) and found that about 25% of these ads were incorrectly detected as political. The highest rate of false positives (ads erroneously identified as political), at 43%, occurred in the ‘commercial’ page category, indicating that it was the most adversely affected by over-enforcement.
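In other words, the over-enforcement figure is a per-category false positive rate: the share of ads from each non-political page category that Meta nonetheless flagged as political. The short illustration below, using hypothetical sample data rather than the study’s dataset, shows how such rates can be computed.

```python
# Illustrative only: computing over-enforcement (false positive) rates per page category
# from a hypothetical labeled sample of ads posted by non-political pages.
import pandas as pd

# Each row records the ad's page category and whether Meta's system flagged it as political.
ads = pd.DataFrame({
    "page_category": ["commercial", "commercial", "news_and_media", "other", "commercial"],
    "flagged_political": [True, False, False, True, True],
})

# False positive rate per category = flagged ads / all ads from that (non-political) category.
fp_rate = (
    ads.groupby("page_category")["flagged_political"]
       .mean()
       .mul(100)
       .round(1)
       .rename("false_positive_rate_%")
)
print(fp_rate)
```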
Screenshots from the report
The study identified mis-detections where ads from commercial pages owned by political figures were inaccurately labeled as political. Even simple product promotions from companies owned by political figures faced incorrect categorization.
Ads promoting the sale of guidebooks, textbooks, and novels, as well as services related to employment opportunities, studying abroad, and visa applications, were detected as political, seemingly because they were considered to touch on a social issue, highlighting the need for a more precise classification of social issues for Bangladesh.
Commercial pages encountered challenges stemming from keyword-related issues, where seemingly innocuous terms like ‘Minister’ triggered the misclassification of electronic appliance advertisements and marriage matchmaking services as political ads.
Keywords such as ‘winner’ and references to specific events further contributed to the misclassification of ads, underscoring the complexities in accurately categorizing content on the platform.
Recommendations
The study’s findings underline crucial areas requiring attention and improvement to ensure transparency in online political advertising in Bangladesh.
Recommendations from the study, informed by expert interviews, highlight the necessity for Election Commission mandated regulations, regular audits of keywords by digital platforms, and stronger collaboration between various stakeholders for effective oversight.
Here are some of the key recommendations:
- Social media platforms such as Meta must conduct regular audits of keywords tailored to the Bangladeshi context to ensure accurate classification of political ads, and collaborate with stakeholders, including the Election Commission (EC), election watchdogs, researchers, and civil society, for context-specific insights and evaluation of the effectiveness of these audits.
- To foster transparency, the Election Commission or the government must mandate that all social media platforms disclose political ads data specific to Bangladesh, along with policies on what information should be made available and how.
- To clarify regulations for online advertising and campaigning in Bangladesh, the Election Commission (EC) guidelines must explicitly include provisions for overseeing online political campaigns and allow parties to self-disclose all their official pages.
- Meta should collaborate with local stakeholders to establish and publicly disclose a comprehensive list of relevant social issues specific to Bangladesh.
- Meta must display and archive not only political ads but all ads uniformly across all jurisdictions, as it does in the EU to comply with the Digital Services Act.
- Social media platforms, including Meta, must invest in more human reviewers with specific knowledge of the local political landscape and language to ensure the effective review of political ads.
This study is an integral component of Digitally Right’s ongoing commitment to exploring the intersection of technology and its impact on society. Digitally Right, a research-focused company at the convergence of media, technology, and rights, provides essential insights and solutions to media, civil society, and industry to help them adapt to a rapidly changing information ecosystem.
The full report is available here.