Internet Rules Workshop 2025: Understanding digital rights & policies in South Asia

🦋 Workshop overview

Agenda

UTC Converter: Please click this link to check your local time.
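If the converter link is unavailable, session times can also be converted offline. A minimal sketch using Python's standard zoneinfo module (the Asia/Kolkata zone and the placeholder date are assumptions; substitute your own):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

# Day 1 Session 1 starts at 05:00 UTC; the date below is a placeholder.
start_utc = datetime(2025, 1, 1, 5, 0, tzinfo=timezone.utc)

# Convert to a local zone (Asia/Kolkata is only an example).
local = start_utc.astimezone(ZoneInfo("Asia/Kolkata"))
print(local.strftime("%H:%M %Z"))  # 05:00 UTC is 10:30 IST
```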

Internet Rules - Agenda SA.png

Zoom link to the workshop and instructions for using Zoom

Our Zoom space

Our workshop will take place at: https://apc-org.zoom.us/j/89259454824


Tips for participation in Zoom sessions

How to use Zoom

Join the workshop session by opening the link in your browser, a desktop client or a mobile app. You will be redirected to the Zoom room's waiting room. Once the host adds you to the meeting, you will see the main screen.

zoom-main-window-.png

Configure Audio

If your audio is not working, please check your settings.

For audio, click the up arrow next to the Unmute/Mute button and check that your Microphone and Speaker are correctly configured.

zoom-audio-settings-button.png

To test the Speaker and Microphone, click Audio Settings in the same menu, then click the Test Speaker and Microphone link.

bbb-audio-test.png

In the window that opens you can test the Speaker and Microphone and adjust the volume.

zoom-audio-settings.png

Configure Video

If your video is not working when you click Start Video, please check your settings.

Click on the up arrow next to Start Video and see if Camera is selected.

zoom-video-sharing-button.png

If you would like to examine the settings in more detail, click the last option, Video Settings..., and the following window will open.

zoom-video-settings.png

Here you can define how your video is presented. We recommend selecting the option Hide Non-video Participants so that you see only the participants who have their cameras enabled.

Chat

Zoom enables you to participate in public chat and also to send private messages to specific participants. To send a private message, select the participant in the dropdown field, which is set to Everyone (public chat) by default.

zoom-chat.png

Share screen...

If you need to share your screen, a portion of the screen, computer audio or a second camera, click the green Share Screen icon and select the window, program, computer sound or second camera in the Basic or Advanced tab.

zoom-share-screen.png

zoom-select-a-window.png

If you select the wrong window you can always stop sharing and then start sharing the intended one.

Organisers' contact details

The team can be reached at asia-sej@apc.org

 

🐬 Workshop Sessions

Day 1 Session 1: Introduction to ICT landscape and frameworks

Time: 5:00 UTC - 6:30 UTC

Objective: This session invites participants to explore the regional ICT and digital rights landscape, develop an understanding of key frameworks and structures, and examine the powers and processes that create them. It will also help participants to identify the structures of governance and regulation, as well as recognise the opportunities and challenges that arise across different sectors and countries in South Asia as a result of these regulations. 

Session plan:

The session will aim to cover questions such as:

*Reading materials are hyperlinked, please click the text to access

Reading Materials:

📌 Presentation Slide

Day 1 Session 2: Access and inclusion

Time: 7:00 UTC - 8:30 UTC

Objective: The focus of this session is to develop participants’ understanding of meaningful access and the regulatory frameworks that enable Internet connectivity. It will also invite participants to examine the digital divide, its effects on marginalised groups, and policies and initiatives designed to promote inclusion. This session will further incorporate case studies in the region that illustrate how these issues play out in practice, enabling them to connect theoretical perspectives with lived realities.

Session plan:

The session will aim to cover questions such as:

*Reading materials are hyperlinked, please click the text to access

Reading Materials:

📌 Presentation Slide

Day 1 Session 3: Group exercise briefing

Time: 9:00 UTC - 10:00 UTC

Objective: Participants will come together in this session to engage with three themes of digital rights: access, freedom of expression, and privacy. Using relevant case studies, they will be divided into breakout groups and tasked with 1) analysing the issues and 2) designing advocacy strategies targeting actors such as governments, the public, or platforms. Conducted as a group activity, the session allows participants to work on a case study collaboratively with a small group of peers.

Exercise sheets

Group 1 - Case Study on Access and Inclusion

Group 2 - Case Study on Freedom of Expression

Group 3 - Case Study on Privacy and Surveillance

Day 2 Session 1: Freedom of expression

Time: 5:00 UTC - 6:30 UTC

Objective: This session is aimed at exploring the laws protecting freedom of expression across the South Asian region, while also examining the restrictions placed on this right within different legal systems. Participants will look closely at laws addressing hate speech, sedition, blasphemy, and defamation in their regional contexts. The session will further provide a brief historical overview of these laws and restrictions, tracing how they continue to surface in both offline and online spaces today.

Session plan:

The session will aim to cover questions such as:

*Reading materials are hyperlinked, please click the text to access

Reading Materials:

📌 Presentation Slide

Day 2 Session 2: Privacy, surveillance and data protection

Time: 7:00 UTC - 8:30 UTC

Objective: In this session, participants will learn to examine the different facets of privacy in the digital age and the regulations designed to protect these rights. They will investigate the impact of emerging technologies on privacy by considering the various forms of surveillance currently deployed in society alongside the enabling legal frameworks. This session will further focus on the rise of surveillance-related policies and their implications for the protection and promotion of digital rights.

Session plan:

The session will aim to cover questions such as:

*Reading materials are hyperlinked, please click the text to access

Reading Materials:

📌 Presentation Slide

 

Day 2 Session 3: Group work check-in

Time: 9:00 UTC - 10:00 UTC

Objective: This session provides an opportunity for participants to continue their group work and reflect on the case studies from their own experiences, through challenges they have encountered in their work and approaches they have employed. They will also engage in a feedback process with facilitators, allowing them to refine their approaches through this exchange.

Exercise sheets

Group 1 - Case Study on Access and Inclusion

Group 2 - Case Study on Freedom of Expression

Group 3 - Case Study on Privacy and Surveillance

Day 3 Session 1: Gender and vulnerable groups

Time: 5:00 UTC - 6:30 UTC

Objective: This unit invites participants to examine the differentiated impacts of ICT policies on vulnerable groups, with particular attention to the special provisions that seek to protect gender and other marginalised communities in digital spaces. It also encourages reflection on how vulnerable groups participate in policymaking processes and how rights-based, inclusive strategies can strengthen equity in the ICT environment.

Session plan:

The session will aim to cover questions such as:

*Reading materials are hyperlinked, please click the text to access

Reading Materials:

📌 Presentation Slide

Day 3 Session 2: Group work session

Time: 7:00 UTC - 7:45 UTC

Objective: During this collaborative block, participants will work in their breakout groups to refine their advocacy strategies by integrating insights from earlier sessions. Facilitators will offer guidance and feedback to help participants consolidate their approaches and strengthen their overall outcomes.

Day 3 Session 3: Final presentation and closing

Time: 8:00 UTC - 9:30 UTC

Objective: In this concluding session, participants present the outcomes of their group work and reflect on the lessons learned throughout the program. Each group will deliver a presentation, followed by feedback and questions from the facilitation team and peers. The session will also include a collective reflection on commonalities and differences across regions and countries. 

🐳 Resource persons' profiles

Resource persons' profiles

Apar - Final.png

Nandini - Final.png

Prasanth - Final.png

Ashwini - Final.png

Seerat - Final.png

🌊 Documentation

Day 1 Report

ICT landscape and frameworks | 5:00 - 6:30 UTC

👤 Apar Gupta

Session details:

This first session invites participants to explore the regional ICT and digital rights landscape, develop an understanding of key frameworks and structures, and examine the powers and processes that create them. It will also help participants to identify the structures of governance and regulation, as well as recognise the opportunities and challenges that arise across different sectors and countries in South Asia as a result of these regulations. 

Key Themes from the presentation: 

Platform Governance 

In the past decade, the number of internet connections in India has grown sharply, currently exceeding 970 million, alongside an increase of 300-400 million users on Instagram, YouTube and WhatsApp. Access to data is unequal across socioeconomic status, with some people owning multiple connections and others having no access at all.

As these platforms are now core public infrastructure for speech, news and association, they also govern who is heard, who is silenced, and how information travels. States increasingly exercise power through the platforms, via tactics such as notice-and-takedown, blocking, and shadow banning. However, there is a lack of transparency about these rules and how censorship is implemented. Such governance choices affect civil society and social movements, including elections, protests, minority speech and journalistic investigations.

Core Concepts: 

The Indian Framework: From Safe Harbor to Delegated Censorship

Censorship

Towards Rights-Respecting Platform Governance

Notes from the discussion:  
Government initiatives and regulations, safe harbor framework

What has changed in how the government looks at technology, and what has shifted in the last 7 years or so?

The large growth of telecommunications infrastructure has prompted strong interest in regulating content and types of technologies. The Indian government has used the Safe Harbor framework as a core regulatory principle, whereas there is a global movement towards deregulation to encourage innovation and allow more private industry to thrive.

Under the Safe Harbor framework, the state refrains from imposing heavy regulation on how a business operates, but it can still fasten liability on a private company that does not comply with content takedown requests. While it may not be ideal, it is still important for this framework to be protected.

The Digital India Act plays a large role in pushing for digitisation, yet there are no clear statutory provisions. There are 'guardrails' acting through executive notification, but, unlike in the EU, there are no fully fleshed-out regulations.

How do we balance states' use of broad-based language to allow regulation of emerging technology and the chilling effect that it produces, if we maintain the ‘Safe Harbor’ framework? (especially to curb Big Tech)

It is important to see and frame the platforms as businesses whose primary interest is gathering data for private corporations, no longer acting as passive intermediaries. Many problems arise when platforms have full control over operations and content distribution.

The issue of broad-based language stems from a larger Rule of Law problem: constitutional frameworks and institutions were largely captured by elites, and even that has since been shifted by global populism.

I would double down on the transparency commitment, because at the very least it should allow researchers, academics and CSOs to know more, forming the basis for public pressure.

Q: What other infrastructure regulations beyond online platforms should we look out for? And how do we see them impacting rights in terms of the current thinking? What are the trends happening in the region, and what should we prepare in order to push for rights?

It is important to acknowledge that many issues are dominated and steered by technology. 

Platform regulations and content moderation

Content moderation that was once supported by thousands of workers across the Global South has been dismantled as major platforms cut costs, leaving a gap in context-sensitive moderation, especially for non-English content. By extension, this also affects information integrity. How can regulatory structures navigate this with rights-based models that balance accountability with freedom of expression?

As the platforms have the power to control the visibility of content based on their own interests, it is harder to see where the line is drawn and what kind of information they deliver to users, with the intent to prolong usage time. Content moderation is important, but alone it does not address issues such as fascism and populism that can shape the platforms' initiatives.

Content moderation systems in the Global South which are used to mitigate harm may cause extreme polarisation by amplifying polarising opinions to get more engagement. 

Certain issues could also be degraded or desensitised over time, re: Palestine.

Some platforms, like Signal, follow human rights principles to a certain degree. Ultimately, there needs to be a shift in thinking towards distributing across diverse platforms, rather than being tied to one platform.

Do you think it is useful to compare the models of state control over these platforms, and over cross-border information flows (e.g. banning accounts of users from other countries from view within a country), to Digital Rights Management (DRM) 20 years ago? Media companies pushed tech companies to create and impose DRM on everything, recreating the control they had over physical-world media in digital spaces, and a decade later became even more controlling than in the physical world. Do you see parallels? We also see strange invasions of digital privacy hidden within tax law. Do you think this kind of sideways attack on digital privacy and other digital rights is harder to fight?

Certain tech industries have also broken the DRM, e.g. Google Books, which scanned every publication and pushed against intellectual property. We need to acknowledge the distinction between media business concerns and tech companies, in which DRM is being bundled with cheap data and subscription services.

However, piracy is also on the rise as users are looking for content not offered.

The control which tech companies have over people today is immense. It is not only a strategy of locking people into their platforms, but also the ability of larger tech companies to invest in smaller startups and venture capital, monopolising any new services. Some 15-20 companies exert a high degree of control over our technology, with the current AI companies becoming ever more concentrated as their perceived value rises.

Alternative Strategies for Safe, Accessible and Rights-Affirming Access

Between deregulation by neoliberal privatisation and regulation by the government, what should be the solution to balance the rights-based regulatory model?

The government is expected to serve the public and constitutional interests. Unfortunately, the global trend around the Rule of Law creates tension between the larger government's interests and individual officials who tend to act in their own interests, without any transparency.

There needs to be a full institutionalised regulatory framework, with regulators, that is insulated and autonomous from the political space so that the expertise can grow and develop their own mandate. That regulatory body of large transnational companies has to be in the public domain for public scrutiny and access, as most large private companies are not necessarily champions of free expression. Similar to the government, it's good to have that tension. 

The internet is historically different in that there are global efforts to ensure multi-stakeholder governance. How was this model adopted for the internet, why does it need to find a place at the national level, and what are the challenges in the countries of our region?

Certain multilateral processes around internet governance are available (e.g. UN IGF) to acknowledge recommendations from civil society and other stakeholders, giving them the opportunity to shape the agenda or provide any feedback on themes and decisions to be made. However, most states would opt for state-centric multilateral approaches due to distrust of civil society. 

What do alternative models of ownership, development, and innovation look like - models that build community-owned or decentralised pockets of the internet that challenge extractivist and capitalist business models?

And in contexts where accountability mechanisms are failing because states themselves are perpetrators of digital repression and violence, what strategies or governance frameworks could support these alternatives to remain safe, accessible, and rights-affirming?


Access and inclusion | 7:00 - 8:30 UTC

👤 Nandini Chami

Session details:

The focus of this session is to develop participants’ understanding of meaningful access and the regulatory frameworks that enable Internet connectivity. It will also invite participants to examine the digital divide, its effects on marginalised groups, and policies and initiatives designed to promote inclusion. This session will further incorporate case studies from the region that illustrate how these issues play out in practice, enabling participants to connect theoretical perspectives with lived realities.

Key points from the presentation:

Mentimeter:

  1. Addressing the divides in meaningful access will ensure digital inclusion
    1. Not sure: we are battling other issues such as poverty, fascism, etc.
  2. South Asian economies have other priorities to address before the quest for AI transformation
    1. I disagree: we should not treat other priorities as distinct; they can be fought in parallel, since these issues are structurally interconnected.
  3. In South Asia, what is the biggest challenge to effecting digital inclusion?
    1. Political will, poverty, digital literacy and knowledge

South Asia is home to 2 billion people and has emerged as one of the world’s fastest-growing regions during the first two decades of the 21st century, as demonstrated by GDP growth rates. The second decade saw higher economic growth in Bangladesh and India, with jobs affected by AI development.

Coverage gap: Afghanistan and Pakistan have a coverage gap of over 10%. Usage gap (the percentage of the population who live within the footprint of a broadband network but do not use the internet): 42%.

Mobile internet subscribers: the number of unique users that have used internet services on a mobile device that they own or have primary use of at the end of the year

Note that a subscription means a connection is available, but not necessarily that it is actually used.
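The coverage-gap and usage-gap figures above follow from simple definitions; a sketch of the arithmetic, using hypothetical population figures chosen only so the percentages land near those cited:

```python
# Sketch of the coverage-gap / usage-gap arithmetic described above.
# All figures are hypothetical, for illustration only.
population = 100_000_000
covered = 88_000_000          # live within a broadband network footprint
using_internet = 46_000_000   # actually use the internet (all are covered)

coverage_gap = (population - covered) / population    # not covered at all
usage_gap = (covered - using_internet) / population   # covered but not using

print(f"coverage gap: {coverage_gap:.0%}")  # 12%
print(f"usage gap: {usage_gap:.0%}")        # 42%
```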

South Asia’s mobile use landscape

Gaps in affordability - South Asia has a huge gender gap in affordability.

Gaps in usage - there is no significant rural-urban gap, but usage differs between men and women.

Gender divide: headlines typically focus on women’s usage of the internet, and we often hear about mobile bans by local government agencies. But even more than overt gatekeeping, there are systemic barriers created by patriarchal gender norms.

Women use the internet mainly to speak to family; they use cheaper phones, so access is limited; digital skills and time constraints also play a role.

One FGD examined how young women aged 18-25 navigate household patriarchies to gain access to and use of the internet. Participants described the approach as playing ‘Cinderella’ and maintaining an angel image: using social media but rarely posting, rarely accepting requests from male users, and balancing perceptions to perform how the family should be portrayed.

Online GBV against women and minorities is growing, especially with the rise of synthetic media. In a 2025 study by Equality Now, Indian law enforcement agencies described the process of getting social media companies to remove abusive content (e.g. ‘revenge porn’) as “opaque, resource-intensive, inconsistent and often ineffective”.

This often ends up policing and surveilling women, and suppresses women’s self-expression and sexual expression.

Access is NOT equal to digital inclusion on empowering terms

Labour exploitation: the gig economy (drivers, delivery workers) and platformisation of informal employment

The AI sweatshop (in the India region) - data labellers train AI models. These ghost workers are a well-educated workforce, many with advanced STEM university education, who failed to find appropriate jobs in their actual fields.

Commodification of data

India - biometric identification programme (Aadhaar numbers). The approach is being exported to other countries, including Sri Lanka.

The citizen became congruent with the customer; the national population is reborn as a “Total Addressable Market”. A January 2025 amendment to the Aadhaar Act (2016) allows private-sector services to use the system to promote ease of living. The narrative is that private innovation delivered through state-funded digital infrastructure empowers the poor and develops the nation, but this enables predatory data profiling by the market on a scale never seen before.

Representational Injustice

Concluding thoughts: Institutional governance deficits that perpetuate digital exclusion

Notes from the discussion:
On Artificial Intelligence and AI transformation

How do you define AI transformation?

Referring to UN Trade and Development’s definition of AI, the purpose of the technology is a structural transformation of the economy. In AI transformation, technology services are introduced into the tertiary and private sectors, with increased use of knowledge in the economic structure.

However, AI transformation is not magic: it does not necessarily leap through all the bottlenecks you have. If you add no value to the AI transformation, or are simply giving up your data, then you are pushed lower in the value chain.

The proliferation of AI brings not only hype (and the illusion of ‘inclusion’ - everyone can now ‘write’, everyone can now ‘draw’) but also a possible lack of solidarity. In AI labour displacement, the first instinct may not be to protect one another but to use the tools to replace our collaborators: we wish away designers, writers, project coordinators. Yes, AI lowers the barrier to working outside your lane, and that could mean more overlap between disciplines, but individual work is connected to the whole, and when accelerated by automation this only makes the turbulence worse and the course corrections more violent.

It is tantalising to see AI, such as the Copilot revolution, make work seem dispensable. However, if the internet increasingly fills with AI and chatbots, and the transmission of information by way of AI gets worse, there will come a point of collective response to stop using it. There is also increasing unionisation where people’s jobs are affected by AI.

Q: Regarding copyright and content, is this a case of closing the barn door after the horses have already left, given that the data has already been stored and used by Big Tech even if the laws improve afterwards?

I have never run into a situation where cognitive work can be extracted from the person, as the mind is not extractable. Generative AI is putting this to the test, but that is where there is hope: to recognise this as work, not as a fiction created by enterprises.

Another strategy is to reclaim copyright laws as content creators, and training standards. 

In the case of data, one useful thing is that if we build data models, most businesses want access to continuously learn social relations. This requires continuous data collection, which requires a lot of work. This can be a way of ‘rescuing the horse’. 

There is also debate on what kind of data could be commodified and what shouldn’t be. We should recognise that data is a social commons, but the rights of traditional and indigenous knowledge holders should be acknowledged and involved. We should work on recognising the boundaries. Collective licensing of our data is a move we can prioritise.

Access to Connection and Data Control

In the previous session, we discussed the slow unionisation, especially among gig workers. What pathways do you see to work through this divide?

Even within this paradigm of the digital divide, there are varying degrees of what access and usage mean. In South Asia, not many have even a basic level of connectivity. One person may be connected but not understand the complexities of how their data can be exploited; another may not be able to remain connected at all. We know that corporations’ concern is profitability. Could you tell us more about the alternatives, such as responding with community work? You cannot prioritise one over the other.

Even if people are not connected, the government has moved many systems online, especially in delivering services, which can lead to life-and-death situations: e.g. a hospital computer may not accurately recognise a patient’s beneficiary status or actual needs. We need the right to a non-tech approach to many foundational services.

Secondly, we have abandoned the agenda of public access. In India, we do have a broadband fibre connection program. But it is not like a water pipe program - how do you make the connection meaningful? How do you make and deliver programs that can benefit many? 15 years ago, we were working on efforts to get connected. However, once mobile connection comes up, we tend to think of the mobile connection as a complete replacement for broadband internet, when these are two different infrastructures. 

Because of the epidemic of misinformation, there is renewed debate about access, which we can seize as an opportunity to educate and talk about digital citizenship literacy and public access models. Many APC members are running such models, for example in digital empowerment. We need a concerted response to lobby for policy that can scale up these experiments.

Data and Labour Rights

Some 6-8 years ago, IT for Change did a report looking at how datafication in Indian ports such as Mumbai had turned port workers into Amazon-warehouse-style tracked workers: their speed of movement was tracked with GPS, and they had to complete tasks in a gamified but deeply punitive and unreasonable way. Amazon warehouse 101, really, but in a PPP port. To me this kind of example is a great way to show the horrors of data collection. However, I also see that the horror of this work model reflects equally perverse and inhumane work conditions in the informal economy.

This is a way to recognise the importance of labour rights in countering this. Do you know of campaigns/organisations/initiatives dealing specifically with the labour conditions of datafication and gamified work?

In June 2025, the ILO committed to international standards on gig work, promising decent work in the platform economy. In India, because of unions like the Indian Federation of App-based Transport Workers (IFAT) and the All India Gig Workers Union (AIGWU), several legislatures have passed legislation, including one that makes algorithmic work-management systems auditable by labour commissioners. These conversations on transparency and accountability for the trackers we use are happening.

Even in the EU and the UK, unions have brought Uber under scrutiny. That is probably the way forward by which we can contest.

Certain apps and software also track how much time you spend on your devices, including keyboard and mouse activity. Sometimes software developers do not see themselves as affected by these issues. I wonder if we can build solidarity through this?

Collective licensing of our data and workers’ data is something we need more investment in.

Bossware - there is a lot of civil society organising against this. The Business and Human Rights Resource Centre and the Investor Alliance for Human Rights (not India-specific, though) have been doing a fair bit of work on this.

Q: Some South Asian countries have passed, or are in the process of passing, data protection laws. Do any of them address these challenges on data protection and data rights? If not, in what ways can data protection legislation address these problems?

If we look at the data protection frameworks emerging in the nation, the problem is that they try to create a baseline by releasing personal data into a publicly commodified market. This is not just true of our legislation, but also of the gold standard, the EU-Data Privacy Framework (EU-DPF), which does not recognise anonymised data. If data is alienated on the basis of free and informed consent and then aggregated through anonymisation, it can effectively operate as a Human Rights Free Zone.

The question of harm does not stop with harm erosion. Through downstream profiling and downstream capture of public value, the private sector has taken property. Unless you have compensatory mechanisms, how can you reclaim the public value of the data? The legislation does not answer the question of benefit sharing. What complicates this now, going by the experience of the African region, is the pressure under which India and Bangladesh have been negotiating with the US: digital trade becomes a bargain in which the US is allowed free rein in your market. That is one of the issues: we cannot yet imagine data as a resource; we are only able to talk about personal data rights.

We are given the illusion that we can opt out of giving our data for the public and personal good. But there is also the question of actually having that option and choice. In the context of inclusion and access, as much as we want everyone to be included, does that include the option to meaningfully opt out?

I think if you look at European legislation, especially the Digital Services Act, there is an idea that users cannot lose the right to select services. 

This is a fascinating way of thinking about GDPR. Do you have links to specific material on this for diving deeper into this perspective?

Final Thoughts

Suggested reading: the Internet Feminist Report that APC created.

Day 2 Report

Freedom of expression and privacy | 5:00 - 6:30 UTC

👤 Prasanth Sugathan

 

Session details:

This session is aimed at exploring the laws protecting freedom of expression across the South Asian region, while also examining the restrictions placed on this right within different legal systems. Participants will look closely at laws addressing hate speech, sedition, blasphemy, and defamation in their regional contexts. The session will further provide a brief historical overview of these laws and restrictions, tracing how they continue to surface in both offline and online spaces today.

Notes: 

Most state governments were initially not adept at regulating online media and platforms, as opposed to physical media, because the former were created and owned by private and independent actors. Over time, governments were able to exercise control through legislative frameworks, partnerships and monitoring. As a result, the online space, once promised as a site of independent ownership and free speech, has been filtered and its users' privacy compromised. 

 

By comparing the perspectives of various international and national human rights laws, there is an opportunity to challenge legislation and the disparity in how private actors/platforms treat the Global South versus the Global North. Participants can also learn from the success stories of other countries. 

 

International Human Rights Legislations

The session acknowledges that the right to Freedom of Expression (FoE) is a fundamental human right recognised internationally through these channels: 

National FoE Legislations and Restrictions in South Asia: 

Constitutional/legal provisions generally ensure that an individual should have freedom of speech and expression. Different states have different levels of provisions on privacy, such as: 

Fundamentally, FoE restrictions are meant to serve the public good and reduce harmful behaviours, and must be proportionate and necessary while serving legitimate aims (as per Article 19(3) of the ICCPR). Unfortunately, the definitions and implementation of these restrictions are often disproportionate, most often targeting voices of dissent such as journalists and whistleblowers reporting on politically affiliated actors. 

 

Such restriction categories are: 

 

 

Many social media platforms do not act appropriately in restricting harmful content, which has often led to the state taking such actions online and even offline. Such issues and methods include: 

Such actions can cause a chilling effect, reducing free speech and instilling fear in people expressing their opinions. 

 

Three case studies in India: Safe harbour & intermediary protections

Safe harbour is mandated by Section 230 of the Communications Decency Act in the US. 

 

Case 1, 2008, India: Avnish Bajaj vs State (Delhi High Court, 2008; SC, 2012).

Bajaj was arrested as he landed in India over an obscene MMS clip sold via his own platform, baazee.com.

Case 2, 2011, India: Shreya Singhal v Union of India and connected cases. 

In 2011, Facebook and social media were not as widely used as they are now, and the law was imposed on various individuals for vague reasons and activities, such as liking a post deemed unlawful, or having online e-commerce activity tracked and used as evidence. The judgment struck down Section 66A of the IT Act for criminalising "offensive" speech, ruling it unconstitutional because its ambiguous wording could lead to a chilling effect on free speech. 

Case 3, 2021: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules

These rules were tabled to regulate the proliferation of misinformation on WhatsApp, especially by tracking the source of forwarded messages. Other measures include proactive filtering and a 24-hour takedown requirement for illegal content. 

 

This coincides with the Blocking Rules (2009), under which the government has the power to direct a platform (WhatsApp, Facebook or others) to take down unlawful content. Although there is a provision to send notice to the affected party, the emergency provision is often used without any notice. The confidentiality provision also creates ambiguity, as the affected party does not receive the reasoning for the action. 

 

In 2021, a FOSS engineer who maintains various FOSS domains and platforms, including a Diaspora pod and a Matrix instance, filed a writ petition challenging the IT Rules 2021 on the following grounds:

 

The terms used in Rule 3(1) are vague, making it uncertain what is prohibited or permitted. The rules force intermediaries to censor and restrict free speech, or lose "safe harbour" protection under the IT Act, 2000. They also impact the right to privacy (Article 21), including the right to encryption, since they aim to introduce traceability and break end-to-end encryption.

 

Such rules can have a disproportionate impact on small intermediaries or platforms, especially alternative or open source platforms run by small entities that may not have the capacity to meet such compliance requirements or conduct thorough content tracing.

 

The writ petition is currently pending in Delhi High Court.

 

Blocking Instances

Mass blocking mostly occurs during protests or large-scale dissent against politically affiliated actors or the state. 

Grievance Appellate Committee (GAC)

The GAC handles appeals from users dissatisfied with decisions made by the Grievance Officers (GOs) of social media platforms and other online intermediaries. Users can approach the GAC if they receive no response from a GO, and GAC decisions can be challenged in court.

However, the GAC is not an independent body and is still largely governed and regulated by the state. There are no independent mechanisms, and there is a severe lack of transparency in its structure and appointment processes. Some complaints are also only taken up once the platform responds with its defence. 

 

Private actors and semi-private platforms’ accountability 

Copyright infringement claims can be used as a loophole and excuse for indiscriminate takedowns of unwanted content. However, platforms often do not follow the law or the jurisdiction. For copyright, the DMCA requires a notification and counter-notification process in which the rights holder must produce proof. In cases filed in a defective manner, the affected user may never receive a notice. Independent journalists and smaller content creators do not have the resources to fight such takedowns.

 

Platforms also do not take accountability for removing harmful content, even after reports of hate speech or OGBV are filed. The platform's own interests may go against the public good. 

 

Organising and Mobilising on Platforms

Emerging digital spaces such as Discord, with its gated servers, have become more central to grassroots mobilisation, as in Nepal, where online communities drove political change through the platform. Youths did most of the moderating and mobilising, using it as a communications channel to organise publicly.

 

However, such platforms are a double-edged sword given their lack of regulation and high number of users: users are free to post any content, some of which may be unlawful or incite violence against other users. Anonymity also does not guarantee security, as some actors or officials may be surveilling. The complexity of accessing Discord or other banned platforms, which requires a certain level of digital and technological literacy, makes them inaccessible to many communities and limits organising and mobilising. For example, only urban, digitally literate youths knew how to access Discord via VPN. 

 

In Bangladesh, Facebook was being used for organising. 

 

What is one provision in your country's law affecting free speech that you would like to see amended or deleted? 

 

India: 

Bangladesh: 



Privacy, surveillance and data protection | 7:00 - 8:30 UTC

👤 Ashwini Natesan

Session details:

In this session, participants will learn to examine the different facets of individual privacy in the digital age and the regulations designed to protect these rights. They will investigate the impact of emerging technologies on decisional privacy by considering the various forms of surveillance currently deployed in society alongside the enabling legal frameworks. This session will further focus on the rise of surveillance-related policies and their implications for the protection and promotion of digital rights.

Definitions of Privacy

Participants' answers to what privacy means to them: freedom from prying eyes; control over my information; being unnoticed; freedom; being myself; human rights; civilised society; non-interference; personal space; protection.

 

Historically, the right to privacy was first defined as "the right to be let alone" (Warren and Brandeis, 1890 Harvard Law Review article). For over a century, privacy law scholars have laboured to define the elusive concept of privacy. 

 

The notion of "control" acts as a common denominator, reducing the definition of privacy to: the control we have over information about and relating to ourselves.

A Taxonomy of Privacy, Solove (2006), identified many rights that can be classified under the umbrella of privacy, recognising 16 harmful activities under the rubric of privacy and further classifying them into four groups. The field of privacy law has expanded to encompass a broad range of information-based harms, from consumer manipulation to algorithmic bias.

 

Privacy in essence goes beyond data, as it affects physical life. Every person should have autonomy over information about their body, identity and physical space. For this discussion, and based on the statutes and legislation available, privacy is narrowly defined under the banner of data protection. 

 

Four Stages of Information and Data Management

 

A "data subject" is an individual whose data is being collected. There are four stages of processing that the data can undergo: 

Privacy Rights in the Digital Age

Privacy rights in the digital age are commonly understood as the right and expectation of individuals to control the collection, use, and sharing of their personal information (data, communications, conduct) in the digital realm. Not just secrecy, but autonomy and control over one’s digital self.

Privacy is often thought of as an individual interest that does not reach into the public good. In this sense, privacy is often pitted against other rights and freedoms, and more broadly against "social values" such as free speech, security, innovation, efficiency and transparency. 

 

This view is narrow and does not capture privacy as a social value, in at least two ways:

The right to privacy aims to preserve human dignity and autonomy, which are and should be non-negotiable. The right to make decisions is currently being influenced by parties who do not have our best interests in mind.

Contextual Understanding of Privacy

In the South Asian context, there is a lack of actual data protection laws, unlike in the Global North. Legislatively and culturally, privacy is not seen as a priority in most legislation, and is sometimes ranked lower than other general rights. It has assumed a secondary role compared to issues such as national security.

As noted by Anuvind, privacy is often framed as antithetical to certain positive actions, especially crime prevention, as in "nothing to hide means nothing to fear". Rather than centring the discourse on the "misuse" of the right to privacy, this framing is often used as a reason not to have a right to privacy in the first place. 

However, the debate itself is flawed because it is mostly framed from the point of view of someone else needing access to that information, rather than the need and right of individuals to protect their own data. There is also no clarity on why other issues should take priority, when all these cases intersect. 

 

"Consent" is rarely informed. Most laws provide that personal data can be collected provided the individual consents to its processing. But terms and conditions (T&Cs) afford no real choice, so the flaw is that consent is assumed to be a catch-all process. 

 

Behavioural experience shows that many users do not have time to go through the T&Cs, and in some cases have no choice but to accept them if there is a strong need to use the service. 

 

A recent example is how most platforms' terms of service automatically entail that all user content will be used to train LLMs, with opt-out made difficult and data scraped without consent. This feeds into the question of how much access entities, corporations, individuals and groups should be allowed to have to users. Suggestions for what user-empowering consent mechanisms should look like:

 

Surveillance by private actors and companies 

Big Tech companies thrive on users' data, as evidenced by targeted ads based on data collection and profiling. Companies collect vast amounts of user data via their digital footprint, which could include but is not limited to: 

This data is further integrated into the surveillance mechanism, where it is used to create detailed profiles for targeted advertising and market manipulation, often without the user's full comprehension or meaningful consent ("consent fatigue").

 

Surveillance capitalism, a term coined by Harvard professor Shoshana Zuboff, explains it as: 

The relationship between State and Private Actors 

State Surveillance and Control 

Case: India - Fundamental right - Justice K.S Puttaswamy (Retd) & Anr vs Union of India

Impact on Freedoms (Expression, Assembly, Association)

Chilling effect on FoE: 

Can there be lawful and justified surveillance?

For those who answer yes, surveillance must satisfy the principles of: 

Private actors can be held accountable by having a strong data protection law in place, followed by strong enforcement and implementation. Since 2018, EU GDPR enforcement, particularly against Big Tech, has been very encouraging in showing that individuals' rights can be realised through the judiciary and other mechanisms. 

 

Concluding Thoughts

Privacy is essential for autonomy and democratic rights. With the spread of biometric ID systems, South Asia faces growing concerns over surveillance. This can be mitigated by strong independent oversight, grounded in a strong data protection act. 

Civil society should also play a bigger role in safeguarding the rights. 

As individuals, we need to constantly ask: 

Reflection questions: 



Group Discussion - privacy and surveillance: Group 3 - Case Study on Privacy and Surveillance_South Asia




 

🌊 Documentation

Day 3 Report

(To be updated)

🐋 Workshop policies & governing rules


Code of conduct and principles of participation

The Association for Progressive Communications is committed to providing a safe and welcoming environment for discussing issues related to its community. The APC Community comprises members of the network, all APC staff and team and its larger network of partners, friends and allies. 

The code of conduct and ground rules apply to this meeting, all APC hosted events, conference-related social events, such as parties or gatherings at restaurants or bars and spaces, and includes our mailing lists, wikis, platforms, websites and any other spaces that APC hosts, both online and offline. Participants are responsible for knowing and abiding by these guidelines. In this event, the code applies to anyone who is part of the event, which includes organisers, resource persons and participants. 

All APC meetings, virtual and physical meetings, are intended to be SAFE spaces and we ask participants to be guided by the following:

  • Be respectful
  • Listen actively
  • Be respectful of others’ views even when you disagree
  • Be collaborative
  • Recognise diversity
  • Respect the privacy of participants
  • Be aware of language diversity
  • Handle disagreement constructively
  • Act fairly, honestly and in good faith with other participants

It is vital that discussions include and acknowledge a diversity of opinions and experiences, and that the community does not tolerate harassment of any kind. 

We expect the members of the APC community to treat one another with respect and to acknowledge that everyone can make a valuable contribution. We may not always agree, but the space and conversation must always have openness to positions that may not be aligned or in agreement. Frustration cannot turn into a personal attack. It's important to remember that a community where people feel uncomfortable or threatened is not a productive one, and that the meeting conduct and ground rules are anchored in the APC values we have all committed to uphold. It is our collective responsibility to ensure that we create a safe, creative, productive and welcoming space that can hold us in all of our diversity.

We will take action in response to harassment related to gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, nationality, caste, ethnicity or religion. APC does not tolerate harassment of participants in any form.

Definitions

Harassment

includes, but is not limited to:

  • Offensive comments related to gender, gender identity and expression, sexual orientation, disability, mental illness, neuro(a)typicality, physical appearance, body size, race, caste, ethnicity or religion.
  • Unwelcome comments regarding a person’s lifestyle choices and practices, including those related to food, health, parenting, drugs, and employment.
  • Physical contact and simulated physical contact without consent or after a request to stop.
  • Deliberate intimidation.
  • Sustained disruption of discussion.
  • Continued one-on-one communication after requests to cease.
  • Sexual harassment 
Sexual harassment

is a broad term. For the purposes of this event it is defined as:

Any unwelcome sexual advance in the form of words, images, gestures or physical contact in physical, digital or communication spaces which may reasonably be expected, or be perceived, to cause distress, intimidation, fear, humiliation, or harm to another. The term also covers any request for a sexual favour, or a threat of a sexual nature. Sexual harassment may occur in any space, including the workplace. This includes activities of face-to-face meetings, virtual meetings and digital communication of all kinds. It can be a one-time incident or a series of incidents. Sexual harassment may be unintended, deliberate, or coercive. Sexual harassment may occur both within formal working hours and spaces, and outside these. Men, women, non-binary, transitioning and transgender individuals may be victims or offenders.

Sexual harassment may result in discrimination, and it may create a hostile working environment. Other forms of behaviour which cause discrimination, fear, and/or a hostile working environment may be implicated in sexual harassment, such as harassment based on race, gender, sexuality, national origin, physical appearance, age, ancestry, disability, economic disparity, nationality, or religious or spiritual beliefs. APC recognises that APC's staff members, partners and event participants are from diverse contexts, and that sexual harassment experiences are embedded within the cultural, social, historical and personal contexts.

Sexual harassment should not be confused with unintentional careless communication in a diverse working environment, or with our efforts to create a working culture which is open to conversations on sexuality and human rights.

We understand that the impact of sexual harassment on APC's working culture can be highly destructive, and we understand the harmful impact of sexual harassment on any person’s work, mind and body.

Examples of sexual harassment include (but are not limited to):
  • Gratuitous or off-topic sexual images or behaviour in spaces where they are not appropriate.
  • Unwelcome sharing of sexualised content in visual, audio or text form
  • Deliberate stalking, following or intimidation, online and/or offline
  • Harassing photography, video or audio recording
  • Inappropriate and/or unwanted physical contact
  • Unwelcome sexual attention, in any form of communication
  • Requests for sexual favours, verbal or physical contact of a sexual nature in exchange for an opportunity
  • Threats, either explicit or implicit, to withdraw an opportunity or resources unless sexual contact and/or communication is permitted
  • Advocating for, or encouraging, any of the above behaviour.

If you believe you have been harassed, or notice that someone else is being harassed, or have any other concerns, you are encouraged to raise your concerns in confidence to the Event Incidents Team.

APC commits that each case will be considered, and concrete actions will be taken as appropriate.

Please refer to APC’s sexual harassment policy for how APC responds to sexual harassment.

Code of Conduct and Anti-Harassment Policy Response Process

If you are being harassed by a member of the community or a participant or organiser at the workshop, or have any other concerns, please contact a member of the Event Incidents Team.

When a complaint is made or an incident occurs that breaches this code, the Event Incidents Team will confidentially review and respond to any participant who has experienced harassment or inappropriate behaviour.

If the person who is harassing you is on the Event Incidents Team, they will recuse themselves from handling your incident. If the person who is harassing you is a member of the organising team, they will not receive treatment any different from that of other participants in the handling of the complaint.

We will try to respond to complaints as promptly as we can. These steps will be taken once you make a complaint:

  1. One or more members of the Event Incidents team will discuss the issue with you.
  2. They may take notes, with your consent, of what you say.
  3. One or more members of the Events Incidents team will separately speak with the person(s) against whom the complaint is lodged.
  4. The process will involve attaining resolution while ensuring safety, dignity and respect for everyone involved.

If a participant engages in harassing behaviour, the Event Incidents Team may take any action they deem appropriate, up to and including expulsion from all APC spaces during the event, and identification of the participant as a harasser to other APC members or the general public. The Event Incidents Team will prioritise marginalised people’s safety over privileged people’s comfort.

Any member of the Event Incidents Team can be contacted with any questions or concerns participants may have throughout the duration of an APC event. Anonymous complaints can be reported to the team via email.

Names and contact information of the Event Incidents Team are as follows:

🧊 Others


Feedback

Your feedback matters! Please take a few minutes to complete the post-workshop survey.

 


Zoom Backgrounds

Please download the Zoom background from this link.

South Asia Spotify Playlist

Let’s exchange our favourite music! Feel free to drop your favourite songs into this playlist - we’ll play them throughout the break.

🎶 Click here for the playlist