Creators of artificial intelligence (AI)-fuelled applications should pay for the news and content used to improve their products, according to the CEO of News Corp Australia.
In an April 2 editorial in The Australian, Michael Miller called for “creators of original journalism and content” to avoid the past mistakes that “decimated their industries” by allowing tech companies to profit from using their stories and information without compensation.
Chatbots are software programs that ingest news, data and other information to produce responses to queries that mimic written or spoken human speech, the most notable of which is ChatGPT, developed by AI firm OpenAI.
According to Miller, the rapid rise of generative AI represents another move by powerful digital companies to develop “a new pot of gold to maximize revenues and profit by taking the creative content of others without remunerating them for their original work.”
Using OpenAI as an example, Miller claimed the company “quickly established a business” worth $30 billion by “using others’ original content and creativity without remuneration and attribution.”
The Australian federal government implemented the News Media Bargaining Code in 2021, which obliges tech platforms in Australia to pay news publishers for the news content made available or linked on their platforms.
Miller says similar laws are needed for AI, so that all content creators are appropriately compensated for their work.
“Creators deserve to be rewarded for their original work being used by AI engines which are raiding the style and tone of not only journalists but (to name a few) musicians, authors, poets, historians, painters, filmmakers and photographers.”
More than 2,600 tech leaders and researchers recently signed an open letter urging a temporary pause on further AI development, fearing “profound risks to society and humanity.”
Meanwhile, Italy’s watchdog in charge of data protection announced a temporary block of ChatGPT and opened an investigation over suspected breaches of data privacy rules.
Miller believes content creators and AI companies can both benefit from an agreement, rather than outright blocks or bans on the tech.
I respect the concerns but am not gonna sign this. LLMs won’t become AGIs. They do pose societal risks, as do many things. They also have great potential for good. Social pressure for slowing R&D should be reserved for bioweapons and nukes etc. not complex cases like this.
— Ben Goertzel (@bengoertzel) March 29, 2023
He wrote that with “appropriate guardrails,” AI has the potential to become a valuable journalistic resource. It can assist in creating content, “gather facts faster,” help to publish on multiple platforms and could accelerate video production.
The crypto industry is also starting to see more projects using AI, though adoption is still in its early stages.
Miller believes AI engines face a risk to their future success if they can’t convince the public that their information is trustworthy and credible, adding that “to achieve this they will have to fairly compensate those who provide the substance for their success.”