Social media’s failings have made regulation almost inevitable
The GameStop saga has thrown light, once again, on the power of social media. Having driven up the stock price of a loss-making video game retailer, the two-million-strong subreddit r/WallStreetBets has shown that, with enough will, a single group can take on even the powerful world of hedge funds, and make itself some money in the process.
But this, like Twitter’s blocking of former US president Donald Trump’s account, raises questions about control. Who controls the public discourse in our society: the people, the platforms, or the government? And who should have that control? It is now undeniable that the public voice can overwhelm even well-established institutions, so the arms race is on to control the biggest share of that voice.
It is not the first time that social media has posed difficult ethical questions. But in our current era, when the socio-political, cultural and economic dust has been kicked into the air, the need to understand this technology, from both a personal and a professional standpoint, has become more urgent. For people locked down and isolated, social platforms have become a vital way to connect with friends and share information about the pandemic; their use for news consumption surged globally last year. But they have also allowed misinformation to spread, and emboldened fringe groups whose ideas have no sound basis.
Our dilemmas can’t be solved by reference to traditional ethical ideals. Absolute libertarians want beliefs to be aired without censorship. Government authoritarians propose strict regulation. Others look for common ground, attempting to smooth the raw edges of social media by imposing a moral framework on users’ online behaviour. There is no one-size-fits-all solution.
The reality for marketers like me is that one of our best tools can cause real-world harm if used in the wrong way. Algorithms that promote a race towards extremes always have the potential to cause damage. But there’s no doubt that social media can be a force for good. Facebook is a powerful fundraising tool, raising millions through initiatives like its ‘Season for Giving’. Instagram has brought mindfulness classes, breathing workshops and at-home fitness sessions to tens of thousands, for free. And how many businesses were started or saved by social last year, either through monetised content or by selling products direct to consumers? Setting aside the high-profile documentaries warning people of social media’s harms, I find it hard to ignore the positive everyday impact it also has.
Regulation is not the answer to the problem of ensuring social has a positive impact. Placed in the hands of the government, it will begin to lean one way or the other; it will only introduce more bias and diminish the public voice. People with a cause should be allowed to stand up and shout about what matters to them, without needing to twist the arm of mainstream media to get on board. But the urgent need to resolve the ethical problems social media has thrown up over the past year makes it more likely by the day that the government will step in.
At the individual level, many of the negative effects of social can be offset by bearing in mind the potential real-world outcome of every post, no matter how innocuous it may seem to the poster. As accidental gatekeepers on social, standing at the intersection of people and platform, marketing agencies have a responsibility to lead by example. We run social media campaigns all over the world, and this appreciation of real-world outcomes now forms the basis of our agency’s approach.
But in a wider, more systemic way, there are two main solutions. First, the platforms should do more to incentivise proper use. YouTube and TikTok have already incentivised creators through dedicated funds, effectively rewarding those who use these channels in a prosocial, useful way.
Second, agencies, organisations and individuals can work together to find common ground, embedding a set of unwritten but agreed-upon rules for social use, a kind of ‘common law’, with which a large majority of users are broadly happy.
This will light the way for effective self-regulation by the platforms themselves and deter government regulators, whose interventions will only make platforms more dangerous.