For the avoidance of doubt, I would like to make it clear from the outset that this article has been written by a genuine, bona fide human being and not by a robot.

While this comment may appear to be no more than a slightly glib opening to a piece on artificial intelligence (AI), it is actually rooted in fact.

In October the UK's Press Association announced that it would follow in the footsteps of US news giant Associated Press by recruiting robot reporters to produce news content within the next few months.

The use of natural language generation engines to produce news content neatly encapsulates many of the current controversies linked to the increasing use of AI.

In particular, it highlights many of the issues that surround the potential impact of AI on traditional job roles.

Proponents of the new technology argue that news content can now be generated with much greater efficiency and accuracy.

A single bot named Xiaomingbot produced over 450 articles on the 2016 Olympics for Chinese news syndication service Toutiao, whilst the Washington Post relied heavily on its in-house automated storyteller, Heliograf, for its own coverage of the 15-day event.

So, what has happened to those journalists whose time would previously have been spent in producing content that is now automated?

Associated Press has stated that automation has not displaced any reporters but has instead allowed them to redirect their focus, to think more critically about the bigger picture and to produce content which examines the nuances behind the numbers.

Put bluntly, the humans can get on with producing quality, insightful journalism whilst the computers take care of the drudge work.

In the short term, this looks like a win-win situation but one has to question whether this is a sustainable model in the longer term.

As the use of automated content becomes standard industry practice and more professional reporters produce an increasing volume of quality content, will we reach a saturation point beyond which there is simply no further demand or outlet for high quality human-generated content?

This saturation point would, for most professional journalists, no doubt represent the tipping point: the moment at which AI ceases to be a positive resource to be relied on and instead becomes, at best, a career redefining hurdle and, at worst, a career destroying hindrance.

As consumers of automatically generated news coverage, we are generally entirely unaware that the article we are reading has not been written by a human. Reg Chua, executive editor at Thomson Reuters (another proponent of the technology), reports that in blind testing, automated content actually came out as more readable than human-generated content.

Whilst this is testament to the high standard of the existing technology, there is no dispute that, at present, the scope of that technology is limited.

The current generation of robot reporters is, in fact, simply software that processes specific data sets to generate a fairly narrow range of essentially standardised reports on topics such as sports and finance.
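
To illustrate the point, the sketch below shows the kind of template-driven approach such systems broadly rely on: structured data in, boilerplate sentences out. It is a deliberately simplified toy in Python, with invented data fields and wording, and is not the actual implementation of Heliograf, Xiaomingbot or any commercial natural language generation engine.

```python
# A minimal sketch of template-based report generation from structured data.
# Illustrative only: the data fields and sentence templates are invented for
# this example and do not reflect any real news organisation's system.

from dataclasses import dataclass


@dataclass
class MatchResult:
    home_team: str
    away_team: str
    home_score: int
    away_score: int
    venue: str


def write_report(result: MatchResult) -> str:
    """Fill a fixed sentence template with values from one data record."""
    if result.home_score > result.away_score:
        headline = f"{result.home_team} beat {result.away_team}"
    elif result.home_score < result.away_score:
        headline = f"{result.away_team} win away at {result.home_team}"
    else:
        headline = f"{result.home_team} and {result.away_team} draw"
    return f"{headline} {result.home_score}-{result.away_score} at {result.venue}."


if __name__ == "__main__":
    print(write_report(MatchResult("City", "Rovers", 2, 1, "the Riverside")))
    # -> "City beat Rovers 2-1 at the Riverside."
```

The output is fluent but formulaic, which is precisely why such systems cope well with sports results and earnings figures yet cannot produce the analytical journalism described above.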

Concerned journalists may take some comfort from the recent experience of Microsoft's chatbot, Tay. Tay, launched on Twitter earlier this year, was taken offline within 24 hours after posting a variety of tweets containing racist and sexist content, promoting drug-taking and denying the Holocaust. It subsequently made a brief return only to melt down and start tweeting out of control, spamming its 210,000 followers with messages reading, somewhat ironically, 'you are too fast, please take a rest...'

According to recent research by Oxford University and Deloitte, about 35% of current jobs in the UK are at risk of computerisation over the next twenty years.

Interestingly, journalism scored a relatively low 8% likelihood of automation, placing 285th on a list of 366 professions considered.

The clergy came in at position 341. The job deemed most likely to be automated is that of telephone salesperson.

Those looking to guarantee long-term job security are advised to start looking for management opportunities in the hospitality sector. My own job, solicitor, came with a reassuringly low 3.5% likelihood of automation.

For now, at least, I can feel some degree of job security as I return to considering who I might sue when my client is defamed by a robo-scribe whilst my lawbot gets busy dealing with my contract drafting.