GUEST COMMENT Managing the rise of criminal and abusive user generated content

Some 92% of shoppers trust recommendations from individuals (even strangers) over a brand

The amount of User Generated Content (UGC) on internet retailing sites has grown incredibly over the past five years.

Consumers use UGC, especially reviews, to make purchasing decisions, with research showing that 92% of people trust recommendations from individuals (even strangers) over brands. A positive or negative review can therefore make a huge difference to a business, with consumers effectively acting as an extension of its marketing function.

With such an increase, perhaps inevitably, there has also been a substantial rise in the amount of criminal, fraudulent or offensive posts. This can cause companies real issues in terms of reputation and financial loss unless they are able to quickly identify and remove such posts.

However, with the sheer amount of UGC now appearing on sites, moderation has become a crucial but resource-heavy role. We have seen the likes of Google commit to employing 10,000 people specifically to moderate YouTube videos. This came after severe criticism from both the public and advertisers over the amount of highly offensive content that remained on the site.

Such a commitment, whilst admirable, was a necessity in this case. The continuing loss of advertisers simply could not go on, and Google had to be seen to be solving this high-profile issue. Most organisations, of course, do not have the budget to deploy such a sizeable workforce, yet every retail website that allows UGC will have to deal with a percentage of negative content. Adding to this challenge is the increasing amount of regulation being introduced by governments to crack down on negative, offensive and criminal UGC.

At the beginning of 2018 the German Government introduced regulation that gives websites operating in the German market 24 hours to identify and remove UGC related to ‘hate speech’. The law is initially restricted to sites with over two million members, but those that fail to act face fines of up to €50m. This would seem to be only the beginning of a wave of regulation that will sweep across Europe and the rest of the world, and it is likely to affect smaller sites too as efforts to combat offensive UGC continue.

So, both in terms of reputation and adherence to the latest regulations, the moderation of UGC is now a crucial element of running any online retailer or marketplace. Unless they have almost unlimited resources, companies have to find ways of making this process efficient.

This is where Artificial Intelligence (AI) can play such an important role. It is a hugely efficient way of quickly identifying potentially abusive or inappropriate content, allowing it to be taken down before it impacts end users. AI is constantly learning, so the process becomes increasingly efficient and effective over time. However, without the input of human expertise, AI on its own can only play a limited role. Pairing AI technology with experts who understand the language used and the changing ways criminals exploit sites strikes a good balance: legitimate posts are approved and published in a timely manner, while abusive or criminal posts are weeded out and removed quickly.
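To make that balance concrete, the sketch below shows one simplified way such a hybrid workflow can be structured: content the model scores as clearly safe is published automatically, content scored as clearly abusive is removed, and everything in between is escalated to a human moderator. The thresholds, keyword-based scorer and class names are illustrative assumptions for this example only, not a description of any vendor's actual system; in production the score would come from a trained text classifier.

from dataclasses import dataclass

BLOCK_THRESHOLD = 0.9    # auto-remove above this score (assumed value)
APPROVE_THRESHOLD = 0.2  # auto-approve below this score (assumed value)

# Hypothetical keyword list standing in for a real ML model's output.
FLAGGED_TERMS = {"scam", "counterfeit", "fake"}

@dataclass
class Post:
    post_id: int
    text: str

def abuse_score(post: Post) -> float:
    """Placeholder scorer: fraction of words matching flagged terms."""
    words = post.text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in FLAGGED_TERMS)
    return hits / len(words)

def triage(post: Post) -> str:
    """Route a post: auto-approve, auto-remove, or escalate to a human."""
    score = abuse_score(post)
    if score >= BLOCK_THRESHOLD:
        return "removed"
    if score <= APPROVE_THRESHOLD:
        return "approved"
    return "human_review"   # ambiguous cases go to a moderator queue

if __name__ == "__main__":
    for p in [Post(1, "Great product, arrived quickly"),
              Post(2, "This is a counterfeit fake scam")]:
        print(p.post_id, triage(p))

The design point the sketch illustrates is that the thresholds control the trade-off the article describes: the wider the band between them, the more posts reach human reviewers, and the tighter it is, the more the system decides on its own.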

Until recently, AI has been prohibitively expensive for all but the biggest companies: the financial investment needed has meant smaller companies have been unable to take advantage of the technology. Furthermore, providing AI with the huge data sets needed for it to act independently can be near impossible for smaller-volume sites. However, there are now off-the-shelf solutions ready for retailers to use, pre-loaded with AI modules built to handle the most common content moderation challenges for online marketplaces. These can be put into action immediately to combat criminal and abusive posts.

Undoubtedly, UGC can provide huge benefits for online retailers, and it is of course the crucial element of online marketplaces. The independent third-party endorsement of customers recommending goods is unbeatable marketing for retailers, and the rise of UGC has made online marketplaces a viable business proposition. The inevitable rise of criminal and abusive UGC, however, means that companies have to manage it carefully and effectively. With the regulatory landscape changing all the time and the perpetual threat to reputation posed by inappropriate UGC appearing and remaining on site, action must be taken. Using technology alongside human expertise is increasingly seen as the most effective and efficient way of moderating content.

Author: Patrik Frisk is chief executive officer at Besedo

Image credit: Fotolia
