THE CIRCULAR

Opinion: If Hitler Had Twitter: The Public Sphere and the Curse of Social Media

Photo by Tracy Le Blanc for pexels.com (edited by Joy Asemota on Canva)

In 2005, Mark Zuckerberg wanted Facebook to “move fast and break things.” Facebook did move fast, toppling Myspace, Friendster, and the like, and swallowing up WhatsApp and Instagram. History says the then Harvard sophomore was a resolute and dedicated visionary. But history leaves little to no clue as to whether Zuckerberg envisioned that Facebook would break things so badly it would one day be dubbed the “greatest propaganda machine in history.”

It is 2024. Facebook (and Twitter, YouTube, Google, and Alphabet) have broken things so thoroughly that Chris Hughes, Facebook’s co-founder and Zuckerberg’s friend from their humble beginnings, has argued that, to escape the conundrum of fake news, cyber hate, conspiracy theories, and the resulting, brewing collective anger, global power brokers need to break up Facebook. Like Hughes, renowned comedian (common-sense activist, really) Sacha Baron Cohen thinks that the social media behemoths need to take a regulatory hit in order to rid the public sphere of the immense power of trolls and bots, and of damning, limiting social media algorithms.

But how did we get here? How did we become such slaves to likes and retweets that we let objective truth die? How did we delegitimize knowledge and erode scientific consensus? How did we contaminate our consciousness and let trolls and bots desecrate our sacred public sphere? Well, our greatest goof was letting GAFA (Google, Apple, Facebook, and Amazon) be our ultimate gaffer. We handed the reins of our democracy to the Silicon Six, the handful of executives behind Google, Alphabet, Facebook, YouTube, and Twitter.

Photo by Harrison Haines for pexels

Of course, social media companies have become an integral part of our democracy. With social media, the fourth estate of the realm – the mass media – was redefined and, to some degree, rejuvenated. Social media also brought to the fore a taste of absolute freedom, which we have bedecked in axioms like freedom of speech, freedom of expression, diversity of ideas, and the free flow of information. The fifth estate of the realm – the public sphere – was also reinvigorated by the advent of social media, as society broke free from the reins of a consolidated and colonized media ownership system that dictated what, and how, the public sphere thought.

Unlike social media, the traditional mass media is subject to regulation by power brokers, and rightly so. Democratic settings blossom when the media plays its role in informing the public, shaping and curating public opinion, and holding power brokers accountable, especially in the face of potential media manipulation and misinformation. Governmental regulations help to ensure a diverse and pluralistic media landscape, as well as protect against media monopolies that can undermine democracy. Regulations promote media diversity by preventing media conglomerates from dominating the market, given that media concentration poses a risk to democracy: it limits diverse viewpoints and allows powerful interests to control the flow of information.

In a democratic society, informed citizens are essential for active participation in public affairs and making well-informed decisions. The media acts as a “crucial intermediary” between the public and the government, providing vital information to citizens. Unchecked media can succumb to biases, misinformation, and sensationalism, thereby distorting facts and manipulating public opinion. By implementing regulations, governments can enforce standards of accuracy, fairness, and impartiality, thus safeguarding the flow of reliable information to the public.

Governmental regulations also exist to protect the media from itself.  Media organizations driven solely by profit motives may overlook important issues affecting the public or prioritize sensational stories over matters with significant social impact. By establishing regulatory frameworks, governments can prevent media organizations from neglecting public interest in pursuit of profit. One notable example is the regulation of broadcasting frequencies and licenses.

Governments allocate limited public resources, such as frequencies, to media organizations, ensuring fair competition and diversity of voices. This ensures that media outlets represent a range of perspectives, providing citizens with a holistic view of public affairs. Without sufficient regulation, powerful media conglomerates may dominate the market, stifling competition, limiting diverse viewpoints, and compromising public interest in the process.

It is understandable, then, to expect that social media – the biggest element of the 21st-century public sphere – should be subject to some regulation, given its proven propensity for unstructured participation that has produced loose, leaderless networks amid extreme information abundance, unreliable information, censorship, privacy concerns, and the absence of critical discussion. Social media should be regulated so that it contributes to the healthy public sphere it so handsomely benefits from. Social media companies should be made to carry an obligation to ensure that the discourse on their platforms is conducive to a healthy democracy.

Social Media Regulation: How Far, How Well?

Photo by Pixabay

To be fair, social media platforms have either succumbed, tried to succumb, or merely pretended to succumb to the demands for regulation. The Silicon Six’s platforms – Google, Alphabet, Facebook, YouTube, and Twitter (now X) – have created a semblance of self-regulation. Proposals to regulate social media have continued to gain traction, though these proposals suffer from stark deficiencies.

One of the main criticisms of proposals to regulate social media is the lack of clarity and specificity in defining key terms and determining the scope of regulations. For instance, while the regulation of hate speech is often highlighted, the term itself lacks a universally accepted definition. This ambiguity creates challenges in implementing and enforcing regulations consistently across different platforms and jurisdictions, potentially hindering their effectiveness.

Also, regulating social media poses the risk of encroaching upon the fundamental right to freedom of speech. While it is essential to combat hate speech and misinformation, there is a concern that excessively strict regulations could unintentionally suppress legitimate speech and limit open dialogue. Striking a balance between protecting individuals from harm and preserving freedom of expression remains a significant challenge in designing effective regulatory frameworks.

Proposals to regulate social media often involve increased monitoring and content moderation, potentially leading to overreach and bias. As governments or regulatory bodies gain more control over platforms, there is a risk of censorship and content suppression that aligns with their own interests. Moreover, automated content moderation algorithms can inadvertently favor certain perspectives or misidentify innocent content as violating community guidelines, leading to unintended censorship and stifling of voices.

Regulatory proposals need to address the challenges of enforcing regulations across borders and ensuring accountability for both individuals and platforms. Social media platforms operate globally, making it difficult to enforce regulations consistently across jurisdictions with different legal systems. Additionally, holding platforms accountable for their content moderation decisions and addressing appeals or grievances effectively is a complex task that requires careful consideration and oversight.

Proposals to regulate social media must also consider the unintended consequences that may arise: introducing strict regulations may push problematic content and conversations to less regulated platforms, making it harder to monitor and address issues effectively. Additionally, the increased burden of compliance and moderation may disproportionately affect smaller platforms and new entrants, stifling competition and innovation in the industry. It is crucial to assess these risks carefully and ensure that regulations do not inadvertently exacerbate the very issues they seek to address.

Regulating social media platforms also presents significant technological challenges. Machine learning algorithms used for content moderation may not always be accurate in identifying problematic content, leading to both false positives and false negatives. The dynamic nature of social media, with vast amounts of user-generated content posted every second, further complicates the effectiveness of automated moderation techniques. Regulatory proposals should account for the limitations of existing technologies and foster collaboration between policymakers, researchers, and industry to develop more advanced and nuanced approaches.
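To make the false-positive/false-negative problem concrete, here is a deliberately naive keyword filter in Python. It is a toy sketch for illustration only (the blocklist, example posts, and labels are invented, and no real platform moderates this crudely), but it shows how the same rule can block innocent speech while letting genuinely harmful speech slip through.

```python
# Deliberately naive keyword filter, sketched only to illustrate why automated
# moderation yields both false positives and false negatives. The blocklist,
# posts, and labels are invented; this is not how any real platform works.

BLOCKLIST = {"attack", "destroy"}  # hypothetical "harmful" keywords

def naive_flag(post: str) -> bool:
    """Flag a post if it contains any blocklisted keyword."""
    words = {word.strip(".,!?'\"").lower() for word in post.split()}
    return bool(words & BLOCKLIST)

examples = [
    ("We will attack this problem together at the hackathon.", False),  # benign
    ("Time to destroy their reputation with fabricated rumours.", True),  # harmful
    ("They deserve to be driven out of the country.", True),  # harmful, no keyword
]

for text, actually_harmful in examples:
    flagged = naive_flag(text)
    if flagged and not actually_harmful:
        verdict = "false positive (innocent post blocked)"
    elif not flagged and actually_harmful:
        verdict = "false negative (harmful post slips through)"
    else:
        verdict = "handled correctly"
    print(f"{verdict}: {text}")
```

Real systems replace the blocklist with statistical models, but the trade-off persists: tightening the filter produces more false positives, while loosening it produces more false negatives.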

Social media platforms operate globally, and regulations designed at a national level may struggle to have a significant impact without international coordination. Cross-border cooperation is necessary to address challenges such as cross-platform hate speech, coordinated misinformation campaigns, and privacy breaches. Proposals must account for the need to establish international norms, standards, and frameworks for regulating social media to ensure a comprehensive and coordinated approach.

A notable social media regulatory proposal is content moderation. Content regulation on social media aims to prevent the dissemination of illegal and harmful material, such as hate speech, incitement to violence, and misinformation. Content moderation is a complex task because it requires platforms to balance safeguarding users and preserving freedom of expression. The primary benefit of content regulation lies in its potential to reduce the spread of harmful content, thus protecting users and often the most vulnerable groups in society.

However, content regulation is not without its pitfalls. Critics argue that it may lead to over-censorship and the suppression of free speech. The delegation of moderation responsibilities to private companies creates “free speech chokepoints,” where these entities have the power to shape public discourse. Another challenge arises from the global nature of social media platforms, where a one-size-fits-all approach to regulation may not be sensitive to cultural differences and contextual nuance.

In regard to algorithm transparency, social media companies are called upon to disclose how their systems curate and recommend content to users. One argument in favor of such transparency is that it could lead to greater accountability, enabling scrutiny from regulators and the public. Such scrutiny could then ensure that algorithms do not unwittingly promote harmful or biased content. Nonetheless, the push for algorithm transparency is often countered by concerns over intellectual property and trade secrets. Social media companies argue that revealing the details of their algorithms could undermine their competitive advantage. There is also the practical concern that transparency alone does not equate to understandability; the algorithms governing content on social media are highly complex, and simply disclosing them might not lead to increased public understanding due to their technical nature.

Moreover, even with full transparency, the dynamic and self-learning nature of algorithms makes their outcomes unpredictable and often inscrutable even to their creators, adding another layer to the challenge. This has given rise to the term “interpretative flexibility”, which highlights the fact that the same data can lead to different interpretations, making the demand for transparency a somewhat uncertain solution.
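A minimal sketch may help show why disclosure alone falls short. Suppose a platform published a simple engagement-weighted ranking rule like the hypothetical one below (the weights, signals, and example posts are all invented for illustration): the formula would be fully visible, yet the behaviour people actually care about would still hinge on how the weights were learned and on engagement signals that shift by the minute.

```python
# A toy, fully "transparent" ranking rule. It illustrates the point above:
# even with the formula disclosed, interpreting outcomes requires knowing how
# the weights were learned and how the signals arise. All values are invented.

weights = {"likes": 0.4, "shares": 1.5, "comments": 0.8, "recency": 2.0}

def score(post: dict) -> float:
    """Rank a post by a weighted sum of its engagement signals."""
    return sum(weights[signal] * post.get(signal, 0.0) for signal in weights)

posts = [
    {"label": "measured news report", "likes": 120, "shares": 10, "comments": 15, "recency": 0.2},
    {"label": "outrage-bait rumour", "likes": 60, "shares": 80, "comments": 90, "recency": 0.9},
]

for post in sorted(posts, key=score, reverse=True):
    print(f"{post['label']}: score = {score(post):.1f}")

# The rumour outranks the report, and the disclosed formula alone cannot say
# whether that reflects genuine interest, coordinated bots, or the chosen weights.
```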

Independent oversight, characterized by external bodies responsible for reviewing and monitoring social media platform practices, is often considered a means to ensure accountability and transparency. However, it is important to note its potential limitations. Independent oversight bodies may lack the specialized knowledge and technical expertise required to comprehensively assess the complex operational aspects of social media platforms. This could limit their effectiveness in identifying and addressing emerging issues such as algorithmic bias or the spread of harmful content.

There is also a risk that independent oversight bodies may be influenced or co-opted by the very platforms they are meant to regulate. Lobbying efforts, industry capture, or employment cycles may compromise their independence and integrity, thus undermining their effectiveness in holding social media platforms accountable.

Anti-trust measures, which are intended to promote competition and prevent the excessive concentration of power, have been proposed as a means to regulate social media platforms. While these measures can be appropriate in certain contexts, they also present challenges and drawbacks. Determining the appropriate market boundaries within the rapidly evolving digital landscape is a complex task. Social media platforms often offer a wide range of services, making it challenging to establish clear market areas for anti-trust regulation.

This ambiguity can hinder the effective application of anti-trust measures and potentially lead to unintended consequences.  Also, overly restrictive anti-trust measures may inadvertently stifle innovation and impede competition. By focusing solely on breaking up dominant players, there is a risk of inhibiting the arrival of new and innovative platforms that could offer better alternatives to address concerns related to privacy and moderation.

Social media regulation seems to be double-edged. Yet it is necessary, lest more “Jews be thrown down the well” with a Hitler just tweeting. We let social media become our collective error, and we fell to the whims of fake engagement and manipulative algorithms. We can help the public sphere regain its sanity. We just have to choke and bridle the monster we have made – social media.
