Call for Applications: Bangladesh Tech Policy Fellowship 2024

Digitally Right is excited to launch the Bangladesh Tech Policy Fellowship 2024. This program is open to young and mid-career professionals in academia, law, journalism, technology, and civic advocacy organizations.

As our reliance on the internet continues to rise, the need for responsible and ethical tech policies cannot be denied. This fellowship allows participants to explore the impact of evolving digital technologies and how the policies that govern them shape their use.

The Fellowship is designed to empower participants with the knowledge and skills needed to understand global and local trends in technology policy. It includes in-depth training, expert mentoring, and the opportunity to conduct research on a policy issue. Participants will contribute to the discourse by producing research papers, policy briefs, or articles, and will receive a stipend to cover research expenses.

Be part of a transformative journey!  

Fellowship Themes

Applicants interested in this fellowship will be required to submit research proposals, which will play a crucial part in selecting the most qualified candidates. While there is a wide array of tech policy topics to consider, this year the fellowship will focus on accountable digital governance in the following areas:

  • Privacy and Personal Data Protection
  • Online Content Governance and Free Expression 
  • Technology-Facilitated Gender-Based Violence
  • Open and Equitable Access to the Internet
  • Fair E-commerce and Consumer Protection

What’s in it for you?

  • A 4-day in-person workshop.
  • 3 virtual in-depth sessions.
  • Publish research papers under expert mentoring.
  • Showcase your work to an esteemed audience.
  • Become part of a regional network of tech policy enthusiasts.
  • Receive a modest stipend for your work.

Eligibility

The Tech Policy Fellowship is designed for individuals seeking professional development opportunities in the fields of tech policy, digital rights advocacy, and civic technology. We encourage applications from individuals at any Bangladeshi institute or organization with an interest in these areas, and the selection process will primarily consider the quality of the research proposal and the applicant’s personal commitment. However, preference will be given to candidates who meet the following criteria:

  • Early and mid-career professionals with a minimum of three to five years of working experience.
  • Backgrounds in law, journalism, academia, or civic advocacy are highly desirable.
  • Age within the range of 26 to 35 years.
  • Demonstrated relevance of the program to the candidate’s professional work.

Eight exceptional candidates will be chosen based on the strength of their applications. Notifications regarding the selection will be sent out by November 15th.

Timeline

November 2024: Application and Selection

Week 1-2: Application submission period

Week 3-4: Candidate selection and notification of acceptance

December 2024: Program Kick-off and Workshop

Week 1: Program introduction and orientation (virtual session)

Week 2: Mentor meeting and setting research goals

Week 3: 4-day in-person workshop on tech policies (lectures, group discussions, practical sessions)

Week 4: Begin research development (literature review, data collection)

January 2025: Virtual Sessions and Research Progress

Week 5: First virtual session with global experts (research design and methodology)

Week 6-7: Research development, refine research questions

Week 8: Second virtual session (focus on ethical considerations and regional policy issues)

February 2025: Research Writing and Feedback

Week 9-10: Draft research paper, mentor feedback on first draft

Week 11: Peer review session with fellow participants

Week 12: Revise and refine research paper based on feedback

March 2025: Finalization and Showcase

Week 13: Third virtual session (analysis and final recommendations)

Week 14: Final revisions of the research paper

Week 15-16: Showcase your work to an esteemed audience and submit research for publication

Apply now! 

If you find yourself intrigued, possess a compelling research idea, and believe that participating in this fellowship can have a meaningful impact on both your career and the broader domain of tech policy, we strongly encourage you to submit an application. 

📆 Application Deadline: November 10, 2024

🔗 Registration Link:  https://forms.gle/rVg2gX2HRxDCabaK8 

DRAPAC 2024 Highlights: Digitally Right Navigates AI, Disinformation, and Data Protection Challenges

EngageMedia hosted the 2024 edition of the Digital Rights in the Asia-Pacific (DRAPAC) Assembly in Taipei, Taiwan, from August 18 to 19, 2024. The gathering brought together civil society representatives, journalists, activists, tech experts, and others from all over the world, with the aim of forging alliances in response to evolving digital threats in the Asia-Pacific. The two-day event featured a diverse mix of plenary sessions, capacity-building workshops, digital security labs, and solidarity events.

Of the 43 sessions at the assembly, Digitally Right presented in four and facilitated one, spanning two plenary sessions, two workshops, and a roundtable discussion. Three representatives from Digitally Right took part in these five sessions: Miraj Chowdhury, Managing Director; Tohidul Islam, Research Officer; and Aditi Zaman, Programme Officer.

Miraj Chowdhury speaking at the opening plenary on harnessing digital rights resilience in the Asia-Pacific

On the first day of the event, Digitally Right was part of four sessions, including the opening plenary on harnessing digital rights resilience in the Asia-Pacific, where Miraj Chowdhury spoke alongside representatives from Nepal, Sri Lanka, and Cambodia. He then facilitated a roundtable discussion among civil society organizations (CSOs) on enhancing their involvement in the governance of artificial intelligence (AI) technologies. The aim of this discussion was to create a roadmap for a more inclusive and transparent AI governance framework. Some participants argued that AI governance policies should be framed ethically, with accountability mechanisms in place, while others believed that existing laws should be applied to AI governance rather than drafting new ones. The consensus, however, was that whatever policies and laws are used, the governance of AI should be multistakeholder.

Digitally Right also ran two separate workshops on the first day. The first was a workshop on how social media fuels disinformation and what can be done to prevent or minimize it. During this session, Tohidul Islam presented his research titled “Misinformation On YouTube: High Profits, Low Moderation”. The research was presented as a regional case study, illustrating how both content creators and YouTube capitalize on the proliferation of misinformation on the platform for financial gain. The study found that about 30% of 700 unique Bangla fact-checked misinformation videos, excluding Shorts, displayed advertisements, thereby generating profit for the platform. The presentation gave participants a clear perspective on exactly how monetization on social media platforms encourages misinformation. After the presentation, the participants were divided into four groups for a breakout session in which they developed strategies to address the issue. An interesting observation from this breakout session was that the participants, who came from different parts of Asia, could not agree on a single strategy. For instance, while some thought government control over social media governance would be helpful, others were against it. The workshop garnered overwhelmingly positive responses from the attendees.

Tohidul Islam presenting his research on how social media fuels disinformation

In the meantime, Miraj Chowdhury was running another workshop, which aimed to engage women- and youth-led digital rights communities across South Asia in sharing their ideas for inclusive digital technology advocacy. Mr. Chowdhury was a panelist during this session, where he explored tech policy awareness among women and youth in South Asia and discussed the unique challenges identified in research conducted in India, Bangladesh, Sri Lanka, and Nepal, of which Digitally Right is a part. ImpactNet, a platform designed to bridge collaboration gaps, was introduced during this session, and participants were given a walkthrough of its use. They were encouraged to contribute and provide input on how to maximize the use of ImpactNet.

Aditi Zaman speaking at a panel discussion on personal data protection governance in the age of AI

On the second day of DRAPAC, Digitally Right took part in a panel discussion on personal data protection governance in the age of artificial intelligence. Aditi Zaman was a panelist in this session along with representatives from the Philippines, Cambodia, and the Maldives. The session discussed existing data protection legal frameworks and recent developments in national AI strategies within South Asia. The representatives from the Maldives and Cambodia shed light on how their countries handle data protection in the absence of a specific data protection regulation. In contrast, both the Philippines and Bangladesh have regulations addressing both personal data protection and AI. Ms. Zaman discussed the recent changes and challenges of Bangladesh’s Draft Personal Data Protection Act and its potential impact on the upcoming Draft National AI Policy, particularly within the country’s new political context.

Dismislab’s New Study Reveals YouTube Running Ads on Misinformation Videos

YouTube, a dominant platform in Bangladesh, significantly influences news consumption and entertainment, but concerns about its role in spreading and monetizing misinformation persist. A study by Dismislab, Digitally Right’s disinformation research unit, identified 700 unique Bangla misinformation videos fact-checked by independent organizations and still present on YouTube as of March 2024.

The study, titled “Misinformation on YouTube: High Profits, Low Moderation”, shows that about 30% of these misinformation videos, excluding Shorts, displayed advertisements, thereby generating profit for the platform and posing reputational risks for the advertisers. Ads were seen on 165 videos, which accumulated 37 million views and featured ads from 83 different brands, one-third of them foreign companies targeting the Bangladeshi audience. 16.5% of the channels posting these videos were YouTube-verified; these included known media outlets but were mostly content creators across genres such as entertainment, education, and sports, often posing as news providers.

Misinformation primarily centered around political (25%), religious, sports, and disaster-related topics, with some channels repeatedly spreading false information. Researchers reported all 700 videos to YouTube, with only a fraction (25 out of 700) of reported videos receiving action, such as removal or age restrictions, highlighting gaps in YouTube’s enforcement of its own policies.

The following are the key issues with moderation and policies that are identified in this research:

  • YouTube’s policies have limitations, as they often prove vague and inadequate.
  • The policies provide some examples but often state that violations are not limited to these instances, without specifying what else is not permitted, leaving the moderation process unclear.
  • It is not always necessary to remove all misinformation; however, users should be made aware of potential false or misleading claims. Other platforms, such as Facebook and Twitter, identify misinformation based on user reports or third-party fact-checking organizations. YouTube does not do this extensively.
  • YouTube’s automated systems often fail to detect a variety of misinformation that violates its policies. Furthermore, these techniques are unable to reliably detect the same false content on other channels, even after it has been banned or removed in response to community reports.

The research also observed what actions YouTube took against the reported content. While many videos explicitly violate policies and others are less clear-cut, the reporting was carried out mainly to understand how YouTube reviews user-reported content. Advertisers and experts interviewed for this research expressed disappointment over ad placements on misinformation content, emphasizing the urgent need for YouTube to enhance its moderation capabilities and provide better transparency and control options.

Policy Brief: AI Policy and Governance in Bangladesh

Artificial Intelligence (AI) technologies hold immense potential to transform societies, economies, and governance systems. However, the rapid advancement and deployment of AI also raise complex ethical, legal, and social challenges, particularly in safeguarding human rights.

Bangladesh, recognizing the importance of AI, has drafted a National AI Policy aimed at harnessing its potential while addressing associated challenges. With this potential in mind, the Policy aims to deploy AI across various sectors to boost the economy and strengthen national security, while putting in place pre-emptive measures against some of AI’s inherent risks. This policy brief critically examines the draft policy, digging into key concerns, highlighting ambiguities, and outlining strategies for advocating inclusive and transparent AI governance.

Policy Brief Series: Promoting Digital Rights in Bangladesh

Promoting Digital Rights in Bangladesh is the first part of a four-part policy brief series initiated by Digitally Right. The primary objective of this initiative is to identify key digital rights priorities in the country, support civil society, industry and the government with relevant knowledge and insights, and promote collaborative, multi-stakeholder discussions.

As the initial installment of the series, this brief provides a broad overview of digital rights priorities, the contemporary policy landscape, needs, gaps, and potential areas for engagement through consultation with stakeholders, including local civil society and international organizations, along with comprehensive desk research. Subsequent briefs in the series will cover specific priority issues, examining content governance, internet access and shutdowns, and privacy and data protection.

Safeguarding Journalists: Insights from Digitally Right’s Digital Hygiene Trainings

As part of the Digital Safety School initiative, Digitally Right hosted two Digital Hygiene Trainings on June 22-23 and July 13-14, 2024. Co-sponsored by the Swiss Embassy in Dhaka, these sessions equipped participants with essential tools and strategies to enhance their online safety and digital well-being.

The target audience for both trainings was journalists working outside Dhaka, either as regional correspondents for national news organizations or at local news outlets. The first training was attended by 12 participants (3 female, 9 male) from various news and media organizations, and the second by 10 participants (3 female, 7 male). These sessions were particularly pertinent for this group, who face some of the most serious digital threats and security risks because of their line of work.

Both trainings followed a similar agenda and covered the same discussion topics. The first day began with an ice-breaking session in which participants introduced themselves and shared the digital risks or threats they had encountered in their work. They were also asked which digital safety methods or tools they use, to give a better understanding of their skill level.

After assessing each participant’s risk level and digital security knowledge, the training sessions began with an overview of common digital safety practices. This was followed by a session on online safety, which covered secure browsing techniques and ways to protect social media and internet service accounts. Participants also learned how to safeguard offline information and were introduced to data encryption tools for managing specific risks. In the afternoon, they explored tools for securely storing, transferring, and permanently deleting data, and the final session focused on best practices for capturing photos and videos—crucial skills for field reporters.

On the final day of the first training, two representatives from the Swiss Embassy attended, and participants shared feedback on what they had learned and how they planned to apply it in their work. The second training’s final day began with a recap of the previous day’s topics. Sessions then covered best practices for device and communication security, followed by lessons on identifying and protecting against phishing and malware. A fact-checking expert led the last session, teaching participants how to verify fake images and videos using advanced search and fact-checking tools. The training ended with a summary of key topics and a Q&A session.

Impact of the Training

Journalists face significant digital challenges and security risks, worsened by restrictive laws in the country. Since its inception, Digitally Right’s Digital Safety School has made an impact in supporting journalists. The initiative aims to hold monthly training sessions for at-risk journalists across Bangladesh, depending on available funding. Candidates apply for each training session, and participants are selected based on these applications. Each training is tailored to the specific needs of the group; for instance, journalists outside Dhaka face greater risks and have fewer resources, so their sessions differ from those for journalists in the capital. In the current context, such training has become essential for journalists nationwide.

When asked about the training, one participant expressed their gratitude and shared, “Journalists who are working outside the Capital inherently are at more risk. Therefore, this training will be highly beneficial. I am very pleased with the training.” 

One participant added, “Another colleague of mine had previously attended a digital safety training with Digitally Right and learned the use of the Tella app. While investigating a voting scam he had used the Tella app to take pictures at the polling station and when law enforcement coerced him to show the pictures in his gallery, they could not find anything. This not only saved him from facing severe consequences but he was also able to use the pictures taken using Tella on his story. I have learned a lot from this training myself and hope these learnings help me out in the future the same way it helped my colleague.”