Compliance
Give UK Regulator More Teeth Over Online Scams – PIMFA

The UK's proposed Online Safety Bill, which is designed to remove fraudulent and harmful content from social media and the internet, is controversial. A UK wealth industry group wants the sector's main regulator to be able to direct Ofcom, the body overseeing the communications field, to take such content down.
A wealth management industry group wants the UK's regulator, the
Financial Conduct Authority, to be given power under the new
Online Safety Bill to direct the authorities to remove fraudulent
content from the internet.
The Personal Investment Management & Financial Advice
Association, or PIMFA, called on the FCA to direct Ofcom to
remove such material in a statement issued yesterday.
The Bill, which is designed to protect users of online services,
has already been slammed as opening the way to censorship. The UK
wealth manager Quilter has said it must be used to
remove financial scams from websites.
PIMFA made its recommendations about how the law should work when
giving evidence to MPs scrutinising the Online Safety
Bill.
Tim Fassam, director of government relations and policy at PIMFA,
urged an amendment to the Bill that would see partner regulators
such as the FCA provide strategic support to Ofcom to prevent
harm to financial services consumers.
While the Bill deals very specifically with fraud and breaches of
the Financial Services and Markets Act, PIMFA said it is unclear
how Ofcom will ensure that it has the expertise needed to
identify breaches.
Fassam pointed to the case of London Capital & Finance, a
regulated firm that was able to introduce harm into the market
through the sale of unregulated, speculative mini-bonds, promoted
heavily through advertising that offered significant returns in a
low interest rate economy. If the FCA were able to act swiftly
through Ofcom to block adverts of this nature, it could
significantly reduce the risk of harm to consumers, PIMFA said.
He said that PIMFA was also supporting an amendment to the Bill
proposed by the consumer group Which? to ensure that search
engines have the same duty of care as social media websites to
eliminate fraudulent adverts on their platforms.
According to the government’s website: “The Bill introduces
new rules for firms which host user-generated content, i.e. those
which allow users to post their own content online or interact
with each other, and for search engines, which will have tailored
duties focused on minimising the presentation of harmful search
results to users. Those platforms which fail to protect people
will need to answer to the regulator and could face fines of up
to ten per cent of their revenues or, in the most serious cases,
being blocked.”
Civil liberties campaigners argue that the law, while designed to
protect the public, creates the risk of censorship because the
definition of "harm" in certain cases is ambiguous.
As previously reported, Matthew Lesh, head of public policy at
the Institute of Economic Affairs, a UK think tank, has said the
Bill is dangerous.
"Companies will be required to remove anything that could
potentially be illegal, from ‘hate speech’ to emotionally
distressing content – under the threat of
multi-billion-pound fines. This will empower the easily offended
and malicious actors to push for removal of [free] speech,” he
said.