Public Safety: A Case for the National Assembly to Regulate Content Moderation of Social Media Platforms


By Sarjo Barrow, Esq.

To create a safe environment, the government must act to protect society at large, not delegate that responsibility to profit-making companies in the hope that they will do the right thing. That is why parliamentarians are citizens first and public servants second. As fate would have it, each represents a specific section of society in the Assembly. Yet, with all the recent happenings in their respective communities, I have not heard any member announce or declare that they would utilize the newly created technocrat office in the Assembly to bring legislation regulating an area that is sowing the seeds of discord in society: the content moderation practices of social media platforms.

Since 2016, The Gambia has experienced a resurgence in the use of online media platforms. Initially, the government panicked and threatened to pass legislation criminalizing insults against specific public figures. Ordinarily, most Gambians would identify with the public policy behind such a proposal: our shared culture and history taught us to respect grey hair, and our society frowns upon insult. The quickest way to incite violence in The Gambia is to use explicit language against a respected figure. Still, I was against the proposal because of our collective experience during the dictatorship, the lack of security sector reforms, and the government's abuse of power to stifle dissent.

Notwithstanding, citizen journalists are on the rise in The Gambia. Significantly, de facto journalism has become the new hustle: anyone with a smartphone or a personal computer (PC) can create an online media platform to solicit, promote, or spread information, including intentionally defamatory statements, cyberbullying or stalking, child abuse, or even the recruitment of children into extremist activity such as terrorism. This raises the question of why the government and Members of the National Assembly have failed to legislate on online media platforms with all deliberate speed.

Recently, I have argued that the passage of the National Assembly Service (NAS) Act did not bring any meaningful change to the citizens, only to the members and the beneficiaries of the Act. I have not heard a National Assembly Member introduce a bill through the newly created in-house technocrat office. Indeed, first and foremost, the job of an assembly member is making laws. Going by this standard, I believe every member's scorecard is a big fat zero. Debating bills drafted by the Attorney General's Chambers does not count here. Of course, parliament has relied on the AG's office for support because of the ostensible inadequacies of the members. To support their contention that they are a co-equal branch in our constitutional democracy, they passed the NAS Act to help them build capacity, yet they have failed to utilize the office.

In the United States, Section 230 of the Communications Decency Act has been the foundation for governing expression on digital platforms. Congress passed this provision in 1996, when online presence was surging. The goal of Section 230 was to protect online platforms from liability for the third-party content they distribute, though Congress has since failed to re-address the existing and emerging policy issues raised by technological change online. Equally, in 2000, the European Union adopted the Electronic Commerce Directive. Like Section 230, the eDirective protects online platforms from liability for the passive retransmission of third-party content. Unlike the United States, however, the EU has revisited the issue and passed the Digital Services Act (DSA) in 2022. Although the DSA left the eDirective undisturbed, it established duties of care for online platforms. Key among these is an expansive duty of care for the most prominent platforms, requiring disclosure and transparency around both algorithmic and human content moderation.

Although I am not aware of any law in The Gambia that mirrors Section 230, the eDirective, or the DSA, I think the National Assembly must act now to protect citizens from the harmful effects of unregulated social media content. I do not have all the answers, or even the best ideas, for this complex area of law where free speech interacts with the state's police power to protect its citizens. Still, the time is now to start a conversation and avoid further damage to the fabric of our society.

Like the United States and the EU, I believe The Gambia, too, should provide immunity for the third-party content that online platforms host, with qualifications. Interestingly, the concept of online hosting has changed since the days of AOL. The qualifications I recommend are:

  • Incentivize platforms. The mere fact that an individual can use a smartphone or a PC to create an online platform should not automatically insulate that platform from liability for third-party content. Under this approach, online platforms that purposefully promote, solicit, or facilitate criminal activity (cyberbullying, stalking, child abuse, terrorism, or other unlawful conduct), or that are willfully blind to criminal behavior by third parties on their platform, should not receive the benefit of immunity for hosting third-party content. Like the "Good Samaritan" immunity of Section 230, The Gambia should limit this immunity to platforms that do not endanger citizens in the first place.
  • Promote competition in a free market. Immunity should not extend to antitrust claims or violations of competition laws. Foreign companies largely dominate the Gambian economy, and the same may prove true of online media. The monetization of the internet requires that large companies not hide behind immunity in antitrust cases, where liability is based on harm to competition, not on the third-party content (speech).
  • Promote transparency. Like the EU DSA, the law should create a "notice-and-action" rule: if a platform receives notice asserting unlawful content, it must promptly assess the claim and take appropriate action. Moreover, for large platforms, the law should require an ex ante effort to evaluate the risks "stemming from the design, functioning, and use of their services" and to deploy the necessary means to mitigate the systemic risks identified. However, to avoid a heckler's veto or a chilling effect on free speech, this notice requirement should be limited to unlawful criminal conduct (such as stalking, child sex abuse, and terrorism).
  • Notice liability. Platforms with actual knowledge or notice of criminal or unlawful material on their services that take no action should not be entitled to immunity for hosting third-party content. Indeed, Internet Service Providers are not treated as "publishers or speakers" of content provided by third parties. As traditional tort law has recognized, intermediary liability for publicizing the speech of third parties varies with the publisher's status. Newspapers and book publishers, for example, are generally held strictly liable for defamatory material they publish, as if they were the speaker. Distributors, such as libraries and newsstands, are held responsible only if they knew or should have known the content was unlawful. And accessories, such as printing presses, are generally not held liable for defamation. This balance is required to protect citizens, especially the vulnerable and unsuspecting, from the dangers of unregulated media platforms.

NOTE ABOUT THE AUTHOR: The author's practice focuses on constitutional law, national security, and human and civil rights litigation.
