Published On: February 9th, 2024 | 10.6 min read

Introduction

Humour can be found all over social media, playing a mix of positive and negative roles. It is highly effective in boosting engagement, growing popularity, and building connections, allowing users to enjoy the moment and relax. A humorous touch often also becomes a catalyst for a positive brand perception, especially when the amusement aligns seamlessly with the brand’s profile and strategy. Nevertheless, jokes can go wrong, offend people, and damage reputations when they are inappropriate or misinterpreted, so a delicate balance is crucial. How can one use jokes online, harnessing their potential without offending the audience or minorities? Let’s delve into practical approaches for leveraging humour in the digital landscape.

What is the Role of Humour in Social Media?

Laugh-inducing material on digital platforms is typically user- or brand-generated, reflecting a cultural shift and the adaptability of individuals and businesses in leveraging light-hearted moments to enhance online interactions and create an enjoyable virtual experience. Funny content shared by social media enthusiasts, such as jokes, witty comments, comical memes, or entertaining amateur videos, helps swiftly capture attention and grow users’ interest, playing a pivotal role in the success of digital services and making them broadly recognised as entertaining and attractive. A humorous style that stands out may become integral to a community’s language and identity, provided it maintains appropriateness and cultural sensitivity. The demand for entertaining content is evident in various statistics:

Instagram data from 2021 reveals an astounding one million memes being shared daily. (Source: Instagram)
76% of Gen Z enjoys comedic content on TikTok, reflected in top hashtags like #funny, #comedy, and #meme. (Source: Screenshot Media)
Funny content results in the highest engagement rates on social media. (Source: Buffer)

From a business perspective, on the other hand, humorous content serves as an effective and dynamic way to present ideas, products, or services through attractive multimedia, quizzes, memes, playful copywriting, interactive challenges, or brand campaigns. Funny brand content offers higher noticeability, greater likability, and stronger engagement, presenting immense potential for virality and organic growth in brand awareness. Its strengths lie in building deeper connections and cultivating trust among consumers, who increasingly turn to this channel during leisure time. However, it is crucial to approach business humour with care. It must align with the brand’s overall strategy, be timely, and, most importantly, be appropriate for the specific audience and platform. Striking the right balance between enjoyment and sensitivity is essential for businesses to maximise the positive impact of amusing content in the dynamic landscape of social media.

Brand Humour Use Case:

Xbox is known for its interactive and playful social media presence, often using humour to connect with its audience, especially on platforms like Twitter. This includes sharing gaming-culture memes, engaging in light-hearted banter with other gaming companies, and creating humorous content around products and announcements. This approach helps Xbox showcase its brand personality and resonate with the gaming community, making its social media channels more enjoyable and relatable for its audience. (Source: Xbox.com, X.com)

The pivotal role of humour, and the need to incorporate it across diverse platforms and industries, is well reflected in statistics. One notable source is the global industry report* conducted post-pandemic by Oracle and Gretchen Rubin, which found, for instance, that:

91% of interviewees prefer brands with a sense of humour.
75% of respondents would follow a brand that shares humour on social media.
78% of survey participants believe brands can do more to deliver happiness to customers.

How Complex is Moderating Humour?

The complexity of humour moderation lies in its reliance on context, intention, language transformation, evolving trends, and cultural subtleties, making it a sophisticated and challenging task. Several key areas contribute to the intricacy of the oversight process and make it difficult to formulate universally applicable moderation guidelines:

  • Individual Bias: Humour is highly subjective and varies widely among individuals and cultures. The same content can be amusing to one person while offensive to another, adding an interpretive layer that complicates moderation.
  • Context Dependency: Jokes often rely on context. Without a proper understanding of the circumstances, a seemingly inoffensive statement could be misunderstood and unnecessarily deleted, limiting free speech, or mistakenly allowed, potentially harming users.
  • Cultural Sensitivity: Humour is deeply rooted in cultural nuances, and what may be amusing in one culture might be deemed inappropriate or offensive in another.
  • Evolution of Language: Funny content often involves wordplay, sarcasm, or irony. As language transforms rapidly, a harmless turn of phrase today might take on a different meaning or be perceived differently tomorrow.
  • Dynamic Nature: Humour trends, formats, and memes change quickly, so moderation guidelines tuned to today’s jokes can become outdated just as fast.
  • Intent vs. Impact: A well-intentioned joke may still have unintended consequences or offend certain groups, as the author’s intent is subject to interpretation, and the impact on the audience may differ.

How does Humour Impact Marginalised Groups?

While humour holds the potential to foster inclusion, empower individuals, and challenge societal norms, it also carries the risk of perpetuating unfair stereotypes and contributing to a divisive or discriminatory environment. Marginalised groups illustrate the dual-edged nature of humour’s impact: within these communities it can work as a coping mechanism, allowing individuals to deal with challenges, navigate difficult situations, and foster a sense of unity and mutual support. Conversely, external jokes or memes directed at these groups may reinforce stereotypes, deepen divisions, and promote discrimination. Their consequences sometimes extend beyond the online realm, manifesting in real-world impacts and further widening societal gaps. Vulnerable groups across the globe include racial and ethnic minorities, LGBTQ+ communities, people with disabilities, religious minorities, women and gender minorities, socioeconomically disadvantaged individuals, and indigenous populations.

Pictures of women representing minority groups that are vulnerable to inappropriate humour.

How Can Platforms Better Moderate Humour?

Considering the complexity and specificity of humorous content subjected to moderation, it becomes clear that platforms need a properly constructed strategy aligned with their needs, challenges, and obligations. This entails a multifaceted approach that acknowledges the nuanced nature of jokes and combines proactive and reactive measures, technology, a human touch, and user engagement. Below are the key considerations and strategies for effective humour moderation:

Icons presenting the components of a compelling humour moderation strategy.

1. Methodology

Integrating diverse moderation methods is crucial for digital platforms to achieve optimal efficiency, accuracy, and adaptability. Among the most critical concepts are proactive content review and reactive monitoring. While the first method involves using human moderators for pre-publication assessments, intercepting potentially harmful content before it reaches the audience, the second one employs post-publication review mechanisms that blend human oversight with real-time tools, user reporting, and adherence to community guidelines.
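To make the distinction concrete, here is a minimal sketch in Python, assuming hypothetical names and no particular platform API: a proactive path screens a post before publication, while a reactive path relies on post-publication reports to trigger a re-review.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    published: bool = False
    reports: list[str] = field(default_factory=list)  # user IDs of reporters

def proactive_review(post: Post, moderator_approves) -> bool:
    """Pre-publication check: a human moderator screens the post before it goes live."""
    post.published = moderator_approves(post.text)
    return post.published

def reactive_review(post: Post, report_threshold: int = 3) -> bool:
    """Post-publication check: the post stays visible until user reports trigger a re-review."""
    return len(post.reports) >= report_threshold  # True means escalate to a human moderator
```

In practice the two paths complement each other: proactive review catches clearly harmful material before it spreads, while reactive review keeps the volume of pre-publication work manageable.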

2. Balancing Human and Technological Elements

Effectively navigating the dynamic social media space in terms of humour necessitates a fusion of advanced technological solutions and unique human qualities. This combined approach is instrumental in creating a secure and inclusive space for all users. On the technology side, it is worth embedding cutting-edge tools and filters capable of nuanced content detection and real-time automated screening. Automation technologies, including machine learning algorithms, are employed to handle the sheer volume of user-generated content. However, training them to recognise jokes or memes accurately and distinguish them from harmful content can be challenging due to the subjective nature of humour. This is where humans come into play, bringing strengths that AI-driven tools currently struggle to replicate: empathy, intuition, emotional intelligence, broad experience, cultural awareness, and fluency in evolving trends, enabling them to discern subtle intent, anticipate the potential impact on audiences, and respect diverse perspectives.
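As an illustration of how such a blend might be wired together, the sketch below assumes a generic harm classifier (any callable returning a probability; the thresholds are arbitrary placeholders, not a recommendation): clear-cut cases are handled automatically, while ambiguous humour such as sarcasm or in-group jokes is escalated to a human moderator.

```python
from enum import Enum

class Decision(Enum):
    PUBLISH = "publish"
    BLOCK = "block"
    HUMAN_REVIEW = "human_review"

def moderate(text: str, classifier, block_threshold: float = 0.9,
             allow_threshold: float = 0.1) -> Decision:
    """Route content by automated harm score, deferring ambiguous humour to humans.

    `classifier` is any callable returning the probability that `text` is harmful;
    the exact model behind it is an implementation choice.
    """
    harm_score = classifier(text)
    if harm_score >= block_threshold:
        return Decision.BLOCK         # clearly harmful: intercept before it spreads
    if harm_score <= allow_threshold:
        return Decision.PUBLISH       # clearly benign: no human time needed
    return Decision.HUMAN_REVIEW      # sarcasm, irony, in-jokes: let a person decide

# Stand-in classifiers used purely for demonstration.
print(moderate("harmless pun about cats", lambda t: 0.05))  # Decision.PUBLISH
print(moderate("borderline sarcastic jab", lambda t: 0.5))  # Decision.HUMAN_REVIEW
```

The thresholds effectively set how much work reaches the human team, so platforms typically tune them against review capacity and the cost of mistakes in each direction.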

According to a study from Loughborough University, automated sentiment analysis tools have immense potential to assist human moderators in identifying and categorising humorous expressions within user-generated content. This automation can aid in efficiently and consistently applying content moderation policies, especially on platforms with many user interactions. (Source: Tech Xplore)

3. User Engagement Strategies

Involving users in collective moderation helps maintain a dynamic and responsive content environment in which amusing material is widely shared. This can be achieved by encouraging individuals to report or flag inappropriate jokes and to use rating and voting systems that contribute to content visibility and removal. Nevertheless, it may introduce challenges such as false positives, where benign content is mistakenly flagged as offensive. Moderation teams can mitigate this risk by continuously refining reporting mechanisms and implementing robust validation processes. Another key issue is communicating transparently with users about the platform’s content moderation policies: clearly articulating the guidelines for humorous content, addressing user concerns, and providing insight into the rationale behind moderation decisions.
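One way to picture the validation step mentioned above is to weight reports by each reporter’s historical accuracy and escalate a flagged joke only when enough independent, trusted reports accumulate. The reliability scores, names, and threshold below are purely illustrative.

```python
from collections import defaultdict

# Illustrative reporter reliability scores, imagined as learned from how often a
# user's past reports were upheld by moderators (values are made up for the example).
reporter_reliability = defaultdict(lambda: 0.5)
reporter_reliability.update({"alice": 0.9, "bob": 0.3})

def should_escalate(reports: list[str], weight_threshold: float = 1.5) -> bool:
    """Escalate a flagged post only when enough trusted, independent reports accumulate.

    Weighting reports by reliability dampens false positives from a single user
    mass-flagging benign jokes, while still letting genuine concerns surface quickly.
    """
    unique_reporters = set(reports)
    total_weight = sum(reporter_reliability[user] for user in unique_reporters)
    return total_weight >= weight_threshold

# Duplicate reports from one user do not count twice, and a reliable reporter
# can tip a borderline case into the human review queue.
print(should_escalate(["bob", "bob", "carol"]))    # False (0.3 + 0.5 = 0.8)
print(should_escalate(["bob", "carol", "alice"]))  # True  (0.3 + 0.5 + 0.9 = 1.7)
```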

4. Training and Development

Fostering a continuous learning environment where moderators stay informed about evolving language trends, cultural shifts, and emerging humour nuances guarantees that they are well-equipped to handle the dynamic nature of the content on social media platforms. Additionally, providing cultural sensitivity training for employees is a valuable strategy to enhance their understanding of diverse contexts, reduce the risk of misinterpretation, and ensure content aligns with varied perspectives. Regularly assessing processes and making improvements based on feedback, emerging trends, and evolving user expectations is also essential for maintaining an effective and responsive moderation approach.

5. Collaboration with Humour Experts

Collaborating with external cultural consultants and humour experts enhances the platform’s ability to navigate diverse joke styles. By seeking insights from leaders in the field, digital services can gain valuable perspectives on specific cultural references, potential sensitivities, and the ever-changing landscape of humour.

Facebook Use Case

In Q3 2022, Facebook reported that the number of actions taken against hate speech-related content fell from 13.5 million to 10.6 million after it enhanced the accuracy of its AI technology. The improvement involved learning from user appeals, enabling better recognition of humorous remarks between friends and a more nuanced assessment of words that might be offensive in one context but not in another. (Source: Meta.com)

What are the Regulatory Challenges with Humour?

Addressing regulatory challenges in humour moderation involves deftly navigating the complex terrain of evolving legal and ethical considerations. The regulatory environment places great emphasis on digital privacy, transparency, and accountability on social media, aiming to create a secure space that is free from virtual abuse yet still infused with fun and entertainment. Balancing regulatory compliance with the playful nature of humour is crucial for cultivating a responsible and enjoyable online environment.

Recent global regulations are reshaping the digital landscape, emphasising the importance of online safety, which is closely tied to humour management. For instance, in the UK, the Online Safety Act holds social media platforms accountable, requiring the prompt handling of online harms and prioritising the safety of children. On a broader scale, initiatives like the Digital Services Act and the Code of Practice on Disinformation underscore the European Union’s commitment to a secure, fair, and transparent online space free from misleading content and fake news. The trend extends globally, with the United States and various Asian countries implementing measures along similar lines. These efforts highlight the worldwide push to enact legislation that addresses online safety challenges and protects users, and they underscore the importance of finding a middle ground between compliance and a vibrant, humorous, and safe online community. Staying informed about new regulations and adjusting moderation practices accordingly is essential for navigating this legal landscape.

Another ongoing challenge lies in balancing freedom of expression with preventing the spread of harmful or offensive humorous material. Defining the boundaries of acceptable behaviour while respecting users’ right to express themselves requires constant attention and consideration, and content moderators play a crucial role in achieving this delicate equilibrium.

Conclusions: What is Worth Remembering?

Introducing a comic touch on social media is worthwhile and beneficial for all parties: platforms, brands, and users. However, it should be approached thoughtfully, with the necessary precautions to avoid unintended harm. This involves closely monitoring shared content to prevent or remove potentially offensive material while smartly balancing freedom of speech and user well-being. As humour often relies on cultural references and context, moderation teams must understand the nuances and accurately assess whether content is appropriate. Modern technology is also pivotal in this endeavour, providing greater efficiency and accuracy. Combining the strengths of human moderation with cutting-edge tools ensures a harmonious online environment that fosters creativity, expression, and positive interactions.

A smartphone with diverse applications, including social media apps.

* Source: Global Report: 45% of People Have Not Felt True Happiness for More Than Two Years by Oracle Fusion Cloud CX and Gretchen Rubin, New York Times bestselling author and podcaster.


