Automated news writing programs have begun gaining popularity in newsroom settings mainly because such programs are able to write breaking news stories at an incredibly fast pace. The appeal: Faster breaking news production can mean getting ahead in online search results and readership, not to mention automated journalism costs less than paying a real person to write a news story.
However, despite robot journalism’s initial appeal, there are 10 good reasons (and probably many more) why it can’t replace real writing by real people.
1. It can’t make an emotional connection. The best news stories, whether feature length or a short paragraph in the Sunday paper, are those that make us feel something. The news that leaves a lasting impact is the kind that makes you laugh, feel sorrow or even feel angry.
Robot journalism, no matter how complex an algorithm is used, will never be able to emulate the nature of human emotions and feelings. Real reporters will always win there.
2. It can’t be creative. Another reason human writers will remain a constant is that we are able to go out, research and hunt down a good story. Automated news software can only scour the internet for information that already exists. If robot journalists were the norm at every news organization, no one would ever learn anything new.
3. It can’t infer meaning. Inferring meaning from related pieces of news is one of the most common parts of our jobs. Where an automated news source might report that event A led to event B, a real journalist might see that both events are telling signs of a larger cultural shift or trend.
4. It could be offensive. While I’m sure most automated journalist algorithms would include a list of words not to use, a robot journalist still has the potential to offend readers on the basis of context, idioms and seemingly implied meanings the algorithm might not be aware of.
Obviously, all organizations that choose to have automated systems do some of their writing should have that writing edited before publication, but, as many of us know, sometimes things just get overlooked.
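The weakness described above is easy to see in miniature. Here is a hypothetical sketch (not any real system’s filter) of a banned-word list: it catches exact matches only, so an offensive idiom or implication sails straight through.

```python
# Hypothetical banned-word filter, for illustration only.
BANNED_WORDS = {"slur_a", "slur_b"}  # placeholder entries, not real terms

def passes_word_filter(text: str) -> bool:
    """Return True if no banned word appears as a standalone token."""
    tokens = text.lower().split()
    return not any(token.strip(".,!?") in BANNED_WORDS for token in tokens)

# A headline can clear the filter while still offending in context:
headline = "Local residents finally go back where they came from"
print(passes_word_filter(headline))  # True: no banned word, yet the idiom
                                     # carries an implication no list can catch
```

The filter answers only “does a forbidden string appear?”, which is exactly the gap between word-level screening and the contextual judgment a human editor applies.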
5. It can’t be subtle. Because robot journalists lack emotions, they also lack the ability to be subtle when dealing with an emotionally sensitive issue. News stories dealing with death, severe disease and other emotionally sensitive material require a certain amount of tact in reporting. It’s unlikely that an automated news program can convey such information as effectively as a human reporter.
6. It could be hacked. As with all other technological innovations, automated news software certainly has the potential to be hacked. This could mean someone getting a hold of your next breaking story before it goes live or someone being able to download your employees’ and sources’ personal information.
7. It could pull in compromised files. Equally as bad as getting hacked is unwittingly downloading unsafe content, files or images. If any of these automated news writing systems have to download press releases, PDF files or spreadsheets to gather their information, they run the risk of compromising the server they are hosted on.
8. It could use unreliable sources. While it’s likely that robot reporting software uses a list of criteria in its algorithms to determine what qualifies as a worthy news source, that’s not to say that unreliable information couldn’t slip through. For example, if a robot journalist scans a satirical news story in a well-respected publication, would it know that the story is satire? Or would it cite the story as an accurate source? A human would likely be able to tell the difference very quickly.
9. It can’t clearly separate news from ads. As the SPJ’s Code of Ethics makes clear, journalists should be able to distinguish between what is news and what is an advertisement. However, the line between news and ads can often become blurred, especially with the advent of content marketing. For example, would you say that these news releases are ads or news? What would a computer program say?
10. It can’t be held accountable. Another ethical responsibility of journalists is that we are expected to hold ourselves accountable for our actions, our coverage of news stories and any mistakes that occur therein. A robot journalist, however, can’t assume accountability for anything because, at the end of the day, it has no agency of its own; the people who create and program it do.
Some have suggested that allowing automated news programs to take over the more mundane news tasks will free up more time for human journalists to do their jobs better. Ideally, that would be true. However, the implementation of this new technology into the journalistic field seems worrisome at best.
Only time will tell how robots will impact news writing, but one thing is for sure: Human journalists will certainly be difficult to best with an algorithm.