The Disruptors: How this Norwich start-up gave a voice to Eastern Daily Press’s archive
Faced with the huge task of modernising a newspaper archive, chatbot builder ubisend had to teach robots how to read the news – and sound like humans. In the latest in The Disruptors series, chief technology officer Joe Dixon explains how they did it.
Tasked with bringing the Eastern Daily Press's newspaper archive into the 21st century, local start-up ubisend has created a program that allows current and archival stories to be read aloud by virtual assistants like Alexa and Google Assistant.
The Local Recall project means local readers can listen to any article they desire from the archive - which is particularly useful for those living with sight loss.
Speaking in the latest video in The Disruptors series, ubisend's Joe Dixon explained how they did it.
What makes Local Recall a disruptive project?
There's always been a way to access old newspapers, albeit through online portals and PDFs, and in the last few years, a few publishers have even made their online editions available via Facebook Messenger.
Local Recall is unique because it joins both experiences through a new and innovative medium: voice.
As a chatbot builder, how do you ensure a 'robot' sounds as natural as possible?
We use character recognition and natural language algorithms to test the chatbot text, and then make any manual fixes on a case-by-case basis.
In the case of Google Assistant, we encountered an issue when it began reading articles aloud: the assistant would say "Now reading…" followed by the name of the article. Unfortunately, Google Assistant was saying the verb 'reading' using the pronunciation of the city Reading!
These are the sort of tweaks we continuously make to ensure the pronunciation is as natural as possible.
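For readers curious how a fix like this works in practice, here is a minimal, hypothetical sketch (not ubisend's actual code): voice assistants such as Google Assistant accept SSML markup, and a `<sub alias="…">` tag can force the intended pronunciation of an ambiguous word before the text is sent to the assistant.

```python
# Illustrative sketch of an SSML pronunciation fix, assuming a simple
# word-by-word substitution table. The alias spelling "reeding" is a
# hypothetical phonetic respelling to force the verb pronunciation.

PRONUNCIATION_FIXES = {
    # written form -> SSML forcing the intended pronunciation
    "reading": '<sub alias="reeding">reading</sub>',
}

def to_ssml(text: str) -> str:
    """Wrap plain text in SSML, applying manual pronunciation fixes."""
    words = []
    for word in text.split():
        words.append(PRONUNCIATION_FIXES.get(word.lower(), word))
    return "<speak>" + " ".join(words) + "</speak>"

print(to_ssml("Now reading the latest article"))
# -> <speak>Now <sub alias="reeding">reading</sub> the latest article</speak>
```

In a real pipeline the substitution table would grow case by case, exactly as Dixon describes: each mispronunciation spotted in testing gets its own manual entry.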
How important are local speech patterns for a project like this?
It's certainly something we keep in mind. We worked to build [the chatbot's] local geographical understanding, such as making sure it knows what 'Great Yarmouth' is or who 'Alan Partridge' is.
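A simple way to picture this kind of local awareness is a gazetteer lookup. The sketch below is an assumed illustration, not ubisend's implementation: a hand-built table of local names and types, matched against the user's query.

```python
# Hypothetical sketch of local-entity awareness via a gazetteer lookup.
# The entity table and type labels are illustrative assumptions.

LOCAL_ENTITIES = {
    "great yarmouth": "place",
    "alan partridge": "person",
    "norwich": "place",
}

def tag_local_entities(query: str):
    """Return (entity, type) pairs found in a user query."""
    q = query.lower()
    return [(name, etype) for name, etype in LOCAL_ENTITIES.items() if name in q]

print(tag_local_entities("Any stories about Alan Partridge in Norwich?"))
# -> [('alan partridge', 'person'), ('norwich', 'place')]
```

A production chatbot would use a proper entity-recognition model rather than substring matching, but the principle is the same: the system needs to be told what matters locally.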
What does this project mean personally to you?
I grew up hearing the stories we now bring back to life through the Local Recall chatbots.
Getting to work with a local institution as well as with Google, the epitome of tech giants, is a dream come true.