Tech Tent: fake news, algorithms and listening gadgets

Rory Cellan-Jones

In an era of fake news and alternative facts, how can scientists re-engage with the public and make sure they are respected and understood? That's a big theme at the Cheltenham Science Festival from which this week's edition of Tech Tent comes.

We also discuss the growing importance of algorithms in our lives - and ask whether we should be worried that our gadgets are listening to us.

Science under siege

The hottest ticket at the Cheltenham Science Festival was for a debate on the fake news phenomenon hosted by the comedian, and former theoretical physics student, Dara Ó Briain.

He told a hilarious story about a fake news report of his death after a car driven by his chauffeur plunged into a Dublin ravine - who knew that city's geography was so perilous?

But the debate swiftly turned serious when Nasa's former chief scientist Ellen Stofan pointed out that a decline in faith in science in areas like vaccine safety and climate change could have profound consequences for the planet.

Expanding on this in an interview for Tech Tent, Dr Stofan told me these were life and death issues, and that public misconceptions about, and mistrust of, scientists were a matter of real concern.

Image caption: President Donald Trump controversially pulled out of the Paris climate agreement (image: Getty Images)

She'd found President Trump's decision to withdraw from the Paris Agreement on climate "extremely disappointing", and was worried that the public was not aware that all the major scientific institutions were agreed that climate change was real and was caused by humans.

And she blamed the internet - and the major tech companies - for spreading misinformation: "There's actually a profit motive in spreading misinformation...people try to find information that reinforces their opinion - and on the internet you can find just about any information."

She said that, in trying to appear impartial, the tech firms had helped those who were trying to spread untruths, and they needed to think carefully about putting such information out there without making its veracity clear to people.

Do algorithms know best?

"A computer making a decision on our behalf for us - or with us." I had asked Jeni Tennison for a simple definition of an algorithm - and the chief executive of the Open Data Institute came up with a pretty good one.

Computers absorbing a torrent of data from social media and the internet of things are being programmed to make ever more decisions - from what you pay to insure your car to where you should go for dinner. But now more questions are being asked about how these algorithms work and whether they are always good for us.
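To make that definition concrete, here is a deliberately simplified sketch in Python of the kind of decision-making Jeni Tennison describes - a few data points in, a decision out. Every input, weight and threshold below is invented for illustration and bears no relation to how any real insurer prices a policy.

```python
# A hypothetical, toy "algorithm" that turns a handful of data points
# about a driver into a car insurance quote. The numbers are made up.

def quote_car_premium(driver_age: int, annual_mileage: int, past_claims: int) -> float:
    """Return a toy insurance premium based on three data points."""
    base = 300.0                      # arbitrary starting price
    if driver_age < 25:
        base *= 1.5                   # younger drivers treated as riskier
    base += 0.01 * annual_mileage     # more miles, higher premium
    base += 150.0 * past_claims       # each past claim adds a surcharge
    return round(base, 2)

print(quote_car_premium(driver_age=22, annual_mileage=8000, past_claims=1))  # 680.0
```

The point is not the arithmetic but the pattern: once rules like these are encoded and run automatically at scale, the assumptions baked into them quietly shape the decisions people receive.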

Image caption: There's an algorithm for that (image: Getty Images)

Jeni Tennison conceded there was a danger that they were built to reflect a Silicon Valley mindset - and said programmers everywhere needed to sit back and think about the effect on our lives.

She was debating algorithms with Hetan Shah of the Royal Statistical Society. He is largely optimistic, seeing these data-driven recipes as helping in areas such as poverty reduction, for instance by mapping crop yields and predicting famines.

But he sees a real risk of biased algorithms feeding off biased data: "The rules that have worked so far have been stretched to breaking point in a world where your fridge and your car are passing round data very quickly."
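Shah's worry about biased data can be illustrated with a toy example: if the historical records an algorithm learns from are skewed, its decisions simply reproduce that skew. The figures and groups below are invented purely to show the mechanism, not drawn from any real system.

```python
# Toy illustration of bias feeding through from data to decisions.
# The "historical" approval records are made up: group A was approved far
# more often than group B for otherwise similar applicants.
historical_approvals = {"A": [1, 1, 1, 1, 0], "B": [1, 0, 0, 0, 0]}

# A naive algorithm that just learns each group's past approval rate
# and applies it to new applicants bakes the old bias in.
approval_rate = {
    group: sum(outcomes) / len(outcomes)
    for group, outcomes in historical_approvals.items()
}

def approve(group: str, threshold: float = 0.5) -> bool:
    """Approve an applicant if their group's historical rate clears the bar."""
    return approval_rate[group] >= threshold

print(approve("A"), approve("B"))  # True False - the past skew becomes the rule
```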

He suggests that a group of wise people could start thinking about issues such as whether a driverless car's algorithm should always favour the safety of its driver in the event of an accident.

Are your gadgets listening?

This week Apple unveiled a wireless smart speaker, the HomePod, a rather belated rival to the Amazon Echo and the Google Home, though one which claims superior audio quality.

But yet again it seems we are being asked to invite a listening device into our homes.

At a debate on listening gadgets in Cheltenham I found an audience pretty wary of this phenomenon - one man asked what Orwell would have thought of a world where we actually paid to be spied on.

Media caption: Watch - a closer look at Apple's HomePod

Audio signal specialist Professor Mark Plumbley from the University of Surrey explained that the devices were generally only recording once they had been alerted by a wake word such as "Alexa" - but after that your voice data was heading off into the cloud to be processed.
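A rough sketch of the pattern Prof Plumbley describes is below: audio is checked locally and discarded until a wake word is detected, and only the speech captured after that point leaves the home. This is a simplified illustration of the general approach, not the actual code running on an Echo or HomePod, and the wake-word detector and cloud call are stand-ins.

```python
# Hypothetical sketch of wake-word gating on a smart speaker.

WAKE_WORDS = {"alexa", "hey siri", "ok google"}

def heard_wake_word(audio_chunk: str) -> bool:
    """Stand-in for an on-device wake-word detector (here, naive text matching)."""
    return any(word in audio_chunk.lower() for word in WAKE_WORDS)

def send_to_cloud(audio_chunk: str) -> str:
    """Stand-in for uploading captured speech to a cloud speech service."""
    return f"[cloud transcribes: {audio_chunk!r}]"

def run_speaker(stream):
    listening = False
    for chunk in stream:
        if not listening:
            if heard_wake_word(chunk):
                listening = True         # wake word heard: start capturing
            # otherwise the chunk is dropped on the device, never uploaded
        else:
            print(send_to_cloud(chunk))  # everything after the wake word leaves the home
            listening = False            # return to passive listening

run_speaker(["what's for dinner?", "Alexa", "play the news", "nothing else"])
```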

Security expert Dr Jason Nurse from Oxford University said we needed to ask how long these various services were holding recordings of what we said when talking to our devices. But his real concern was what happened if they were taken over by criminals: "If hackers got control of these devices then they could record all the time - and that's a pretty significant concern."

Mark Plumbley is sceptical about the theory held by many that all kinds of online services are eavesdropping on us and then sending us adverts reflecting our overheard conversations.

But he cautions that all kinds of apps now ask for permission to use the smartphone's microphone and we should be cautious before agreeing.

Voice control has been the hot new trend in home gadgets for quite a while, and if the spate of TV adverts for the Amazon Echo and Google Home is anything to go by, the tech industry is convinced they are a hit with consumers.

But maybe it is time to think more carefully about just who could be listening in when we talk to Alexa or Siri.