Automated journalism is entering newsrooms
While automated journalism is still in its early stages of development, it will undoubtedly become common in the near future.
The idea behind automated journalism is that an algorithm can effectively parse a large amount of data and transform it into a coherently written piece, based solely on deductions from the information provided.
At the moment, it’s hard to predict the impact of machine-based news reporting on the public — aside from some qualitative speculation concerning its effect on news media conglomerates and their organizational hierarchy.
The one area that concerns me is the field’s impact on the First Amendment and how artificial intelligence would be punished in the event of an “error.”
What exactly are the parameters of a machine’s freedom to formulate ideas? Video games are protected under the First Amendment. The reason behind this protection is that “like protected books, plays, and movies, they communicate ideas through familiar literary devices and features distinctive to the medium.”
This legal reasoning could almost perfectly be applied to a news-generating machine. However, the output from these programs may force their creators to answer to a court of law someday.
If this legal gray area is left alone, a programmer or data-entry clerk (or multiple) may face an insurmountable amount of legal fees due to a lawsuit resulting from the algorithm’s reasoning pattern or data input.
One of the worst outcomes in this scenario would be a conviction in a defamation suit. The best would still be dismal: paying legal fees to set a precedent in an unexplored legal realm.
Although actual malice would be hard to prove, even in most hypothetical cases those running the websites hosting such articles would have immunity. The financial burden, then, wouldn’t fall upon the machines in the first place.
I tend to lean toward the argument that an automated process would propel journalistic integrity and credibility, especially in the areas of financial analysis and Wall Street news. Recently, Norman Pearlstine, once the top editor for Time, The Wall Street Journal and Bloomberg, announced that he would be joining the startup Money.net, a low-cost alternative to financial news and data analysis that is heavily invested in machine-generated news.
It’s obvious that there is a serious monetary incentive to bring this technology to market-makers.
While I’m skeptical of tampering with First Amendment precedents, I strongly believe that legislation could prevent many future problems concerning the fourth estate.
Litigation would tie down this new technology and potentially harm the innovators of Silicon Valley. We are in the early stages of this phenomenon, and I think it would be best to be proactive about the inevitability of machine-generated news rather than wait for the issues to reach an appeals court.
Opinion columnist Nicholas Bell is an MBA graduate student and can be reached at [email protected]