
Live Reporting

Paul Seddon, Richard Morris and Johanna Howitt

All times stated are UK

  1. That's all from us

    BBC Politics

    Thanks for following along with us today. Goodbye.

  2. Ferdinand: I had to tell my kids about the monkey emoji

    Video content

    Video caption: Rio Ferdinand: 'I have to explain the monkey emoji to my kids'
  3. What was said at the committee

    • Former England captain Rio Ferdinand told MPs online hate is "normalising racism". When there are "no repercussions" for racists online, he said "people are going to think it's normal".
    • Ferdinand described needing to explain to his children why people had posted banana and monkey emojis on his social media posts. He said he has seen members of his own family "disintegrate" after seeing racist abuse online.
    • FA director Edleen John said mechanisms are needed to prevent people creating an account, sending abuse, and then deleting the account.
    • She said harmful content should be defined by Parliament, not in the boardrooms of social media companies.
    • Imran Ahmed from the Centre for Countering Digital Hate said social media companies are "failing to act on anti-vaccine lies", failed to act on racism directed at sportspeople and failed to act on "misinformation" about US elections.
    • He said he found it "remarkable" that social media companies are still able to escape culpability for the content on their platforms. "Self-regulation is over - it's got to be over," he said.
    • Nancy Kelley from Stonewall said attitudes towards LGBT people have improved over the last 50 years, but said social media can "radicalise" people's views, and "enables" a minority of people to spread abuse.
    • Sanjay Bhandari, the chair of Kick It Out, said the government needs to force tech firms to set up a code of practice around the reach of anonymous accounts.
    • Danny Stone from the Antisemitism Policy Trust said there are "various reasons" why people might be anonymous online, but appropriate middle measures are needed to allow some people to keep anonymous accounts.
  4. Facebook: "We know we have more to do"


    In response to suggestions earlier from Imran Ahmed that Facebook "doesn't want to look for harm on their site to avoid then having to become responsible for it", a spokesperson for the company said:

    “Facebook has long called for new rules to set high standards across the internet.

    "We already have strict policies against harmful content on our platforms, but regulations are needed so that private companies aren’t making so many important decisions alone.

    "While we know we have more to do, our industry-leading transparency reports show we are removing more harmful content before anyone reports it to us.”

  5. Online companies are in for "a bit of a shock"


    The penultimate question comes from Suzanne Webb, who asks the panel if they think online platforms understand the scale of the work they will need to do once the bill comes into law.

    Nancy Kelley replies: "I think they know the scale of the problem better than anyone sitting in this room. Whether they know the scale of the work required for them to address the problem, I'm not sure."

    Danny Stone says: "I think they are having a bit of an internal debate about it. I think some probably understand and know what's coming."

    But he adds: "I think that, in some cases, they are going to get a bit of a shock."

    He ends by saying he thinks that the government's Online Safety Bill is a "great piece of legislation trying to set an international standard - that has to be welcomed."

  6. Final questions on offensive material

    Baroness Kidron asks Danny Stone for his views on offensive material and the rules set by different social media providers.

    Danny Stone says the BBFC - which classifies films and video content - may sometimes ban a film in the UK, but it will most likely still be available somewhere online, in dark corners of the internet; the point is to limit the audience for a harmful film.

    This, he says, is how he views regulating abuse on popular social media websites: there should be a minimum standard, set by Parliament, "not Mark Zuckerberg".

  7. MP talks about personal experience of online abuse

    Image caption: John Nicolson

    The SNP's John Nicolson asks the panel how seriously social media platforms take the issue of online abuse.

    Nancy Kelley from Stonewall says she thinks platforms are genuinely concerned at a "generalised level" - but this isn't always reflected in how they regulate content on their sites.

    Mr Nicolson introduces a story of his own - saying that as a gay man he was once called a "greasy bender" on Twitter.

    "Most people would accept it crosses a line" he says.

    But he adds that when he reported the abuse, the firm told him the post did not breach its community guidelines. He says he went back and forth "two or three times" with the company, "but absolutely nothing happened".

    "They know I am a Member of Parliament, they know I sit on this committee. If they are prepared to behave with such utter disdain to somebody with a wee bit of influence, can you imagine the disdain for the average LGBT person."

    Nancy Kelley says: "We don't have to imagine it - we can see it."

  8. Removing anonymity may not be a solution

    Conservative Dean Russell asks if organisations see removing anonymity as the easiest way to tackle "cross-pollination" of hatred from different groups.

    Danny Stone says "this isn't really addressed" in the bill at the moment. He says social media platforms should be legally liable where data to identify a social media user cannot be used.

    There are "various reasons" why people might want to keep anonymity, he says, but there should be an appropriate middle measure to allow some people to keep anonymous accounts where it is necessary.

    Nancy Kelley says these platforms can identify anonymous abusive accounts and delete them - they just don't.

    She says Stonewall has "deep concerns" for LGBT people around the world if social media companies insist on people using real names.

    "Even in very liberal, accepting environments, [publishing names] increases the danger to our community", she tells MPs.

    She says 90% of LGBT social media users in Egypt, Lebanon and Iran are reluctant to share their names even in private messages on social media, for fear of legal repercussions.

    She says in Egypt in particular, authorities are known to use online sting operations where they try to find gay men using social media so they can arrest them.

  9. Do social media platforms reflect or amplify abuse?

    Conservative MP Dean Russell asks the panel whether social media firms are reflecting an increase in abuse within society, or contributing to it.

    Danny Stone, director at the Antisemitism Policy Trust, says platforms "facilitate the spread" of hatred.

    Nancy Kelley from Stonewall says it is important not to "confuse" the situation online with wider attitudes towards LGBT people, which she says have improved over the last 50 years.

    But she says social media can "radicalise" people's views, and "enables" a minority of people to spread abuse.

  10. Witness backs powers for regulator to demand abuse data

    The panel is asked about the issue of getting information about the scale of abuse from social media firms.

    Nancy Kelley from Stonewall says there is an "absence of high-quality studies" - although some studies do exist. She says the regulator should be able to require firms to hand over relevant data.

    Danny Stone from the Antisemitism Policy Trust says there is "some data out there" - and points to some data collected by the European Union.

    He adds, though, that he has spent ten years trying to persuade one social media firm to give him a PowerPoint slide it uses in training sessions - and although it has been promised, he is yet to receive it.

  11. What are the differences between online and offline abuse?

    Image caption: Nancy Kelley, the CEO of Stonewall

    Nancy Kelley, chief executive of Stonewall, says that existing legislation does not really account for abuse online; much of the legislation against homophobic and transphobic abuse, she says, was made for workplaces and face-to-face interactions.

    She says introducing duties of care for social media companies is something Stonewall is fully behind.

    She adds that the LGBT community suffers direct insults, forced outing and the publication of where people live (otherwise known as "doxing"). Some people may have to move house or change jobs as a result.

    Danny Stone MBE, director of the Antisemitism Policy Trust, says offline and online abuse is "similarly connected".

    He says there have been anti-Semitic murders, and anti-Semitic manifestos shared online by extremists. He adds that he does not post any pictures of his children online as there is a chance someone might want to hurt them because of the work he does.

    Smaller social platforms can disseminate false information about LGBT people in the same way they can share anti-Semitic misinformation, Ms Kelley adds.

    Image caption: Danny Stone MBE, Director of the Antisemitism Policy Trust
  12. Questions turn to anti-LGBT and anti-Semitic online hate


    The committee is now taking evidence from the final panel.

    Nancy Kelley is the chief executive of Stonewall, and Danny Stone MBE is director of the Antisemitism Policy Trust.

  13. Analysis

    Dealing with hateful posters: can the law stop abusive comments?

    Jane Wakefield

    Technology reporter

    Sanjay Bhandari, the chair of Kick It Out, suggested that one of the key things the government can do to force tech firms to deal with racist abuse on their platforms is to set up a code of practice around the reach of anonymous accounts.

    The bill currently deals primarily with content moderation - how the tech platforms deal with hate speech after it has been posted.

    But MPs also asked whether ways to make those behind the abuse more accountable should be a key part of the legislation.

    All the panellists agreed that there is a lack of data about who the hateful posters are, and that this is something the regulator needs to audit.

    But Mr Bhandari also raised one of the biggest issues facing regulators when he told MPs that "if I was being continuously punched in the face, then arresting someone is good but I also want to stop them punching me in the first place."

    How to stop people posting hateful comments is, sadly, probably a job beyond the reach of both Ofcom and the tech firms.

  14. This legislation can be a "blueprint" for other countries


    Edleen John says the international football community is "looking at us, and the opportunity we have to put in place the first piece of legislation that can become a blueprint for other organisations and other countries as well".

    Chair of the committee Damian Collins ends this section of the hearing by asking Ms John what discussions are being held at executive level with football's governing bodies such as Uefa and Fifa.

    She says: "We absolutely have conversations right to the top of the organisations around discrimination, around online abuse, around sanctions and around what we can do as a collective to make it clear that this isn't something that we want in our game, and our game is for all."

  15. Regulator should define harmful content, says campaigner

    Conservative peer Lord Gilbert goes back to the issue of the responsibility the bill places on social media companies to remove content that is legal but "harmful".

    He asks whether "harmful" should be defined in the bill, or whether the definition of what is illegal should simply be widened instead.

    Sanjay Bhandari, chair of anti-racism group Kick It Out, says he doesn't favour either approach.

    He says if what is "harmful" is defined in the bill, then it will have to be continually updated with new legislation to reflect new forms of abuse.

    He says instead, a regulator - in this case, Ofcom - should be given powers to make the decision, reflecting contemporary social views on morals.

  16. What about homophobia in football?


    John Nicolson asks about homophobia in football.

    Rio Ferdinand says he recently met a footballer who wanted to come out as gay, but whose lawyer had advised him not to because of the pressure and emotional toll it could take on him.

    Mr Nicolson asks what message it sends to young gay disadvantaged teenagers if a wealthy footballer cannot come out as gay.

    Rio Ferdinand says "the amount of eyeballs" and press attention "is so much bigger" than in other sports.

  17. Racists are empowered seeing others online - Ferdinand


    Rio Ferdinand continues, saying that previously racism had "got better" but had never "gone away". But now racists are "empowered" by seeing others do the same online.

    Sanjay Bhandari says that social media companies need to be compelled to share all data that they have available on an account.

    Edleen John says that quite often, data shared by social media companies is not complete, and their own investigations will point to a completely different type of person sharing the abuse.

  18. Who would throw a banana or send a monkey emoji?

    SNP MP John Nicolson asks the panel who would throw a banana or send monkey emojis at footballers.

    Rio Ferdinand replies:

    "Sometimes it's young school boys ranging ages 13, 14 years old, sometimes younger. Or maybe an estate agent or a banker.

    "Really it is from all different types. You can't just pinpoint one type of person from a certain particular background and say that is the actual stereotypical racist who's putting this abuse online or at stadiums or wherever. It's a very varied demographic demographic of people.

    John Nicolson asks what lessons can be learned about how to tackle it.

    Ferdinand says the landscape is very different now.

    "I think with social media, we've never been in this in this area, this time is very different.

    He says anonymity online is "the big difference".

    "The fact that you can be anonymous online, gives you a certain amount of power - it enables you to puff your chest out, and be able to really say what you really feel, especially compounded with the fact that there's no repercussions, at the moment".

    Image caption: In 2018, Arsenal striker Pierre-Emerick Aubameyang had a banana skin thrown in his direction as he celebrated scoring in front of Tottenham fans at the Emirates Stadium.
  19. MPs not boardrooms should define 'harmful' content

    Edleen John says she backs one of the key elements in the bill - that firms should not just be responsible for taking down posts that are illegal, but also "harmful".

    She says she recognises "how difficult it is" to define precisely what content this should cover.

    But she adds that Parliament is a "better place to define this" than the boardrooms of social media companies.

  20. Priority to hold social media firms to account, says witness from FA

    Labour peer Lord Knight asks the panel whether the bill should "do more" on education.

    Edleen John from the Football Association says education is "absolutely one part of it" and will be important in trying to combat racism in the younger generations.

    But she says the "responsibility at the moment" should be on holding social media firms to account.

    She says this includes making sure the platforms do not "amplify" the racism present online - and that education is not used as an excuse to deviate from this aim.