
AI will pollute the Internet

Big Tech will doubtless baulk at it, but it is time to tax the bots

Andrew Orlowski Published 20.02.23, 03:54 AM
Representational image (Shutterstock)

Recently, Google invited the world to marvel at how clever its latest artificial intelligence (AI) software is, but it all went horribly wrong. Overnight, more than $100bn was wiped off the market value of Google’s parent company, Alphabet, as a result.

Interestingly, the software actually did OK. The haphazard answers spewed out by Google’s AI chatbot, Bard, were just as poor as those generated by the software it sought to eclipse: ChatGPT.


In just a few weeks, OpenAI’s chatbot has captured the imagination of the media and created a minor speculative bubble in the tech world. Microsoft has begun introducing this automatic text generator into its search engine and its Office products, such as Word.

“Microsoft and OpenAI have introduced pretty much the same product, with worse flaws, but were met with adulation as opposed to derision,” says former Nomura Securities analyst Richard Windsor, wryly.

One reason is that journalists and analysts are not really dispassionate: they see what they want to see. Much like the audience at an illusionist’s show, they become part of the performance and, in a sense, complete the magic trick. Later participants can’t believe how easily they were bamboozled when the misdirection was so obvious. Critics who were thrilled by ChatGPT in December simply saw through the sleight of hand when the same trick was performed by Google.

ChatGPT faithfully churns out pastiche text based on statistical models of material it has ingested. Similar services churn out amazing pictures too. In reality, it has no idea of what it’s doing. Windsor accurately calls it “a clever innovation capable of creating the illusion of sentience, easily shattered with even a small dose of reality”.
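To make that claim concrete, here is a toy sketch: a tiny bigram Markov chain in Python, an illustration of the principle rather than anything resembling OpenAI’s actual models. It “writes” by sampling which word tends to follow the previous one in the text it was fed, which is why the output can read fluently while the program understands nothing.

```python
# Toy illustration only: a bigram Markov chain, not OpenAI's actual model.
# It "writes" by looking up which words followed the previous word in its
# training text and picking one at random: statistical pastiche, no meaning.
import random
from collections import defaultdict

def train_bigrams(corpus: str) -> dict:
    """Record, for every word, the words that follow it in the training text."""
    words = corpus.split()
    table = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        table[prev].append(nxt)
    return table

def generate(table: dict, start: str, length: int = 12) -> str:
    """Emit fluent-looking pastiche by repeatedly sampling a likely next word."""
    word, output = start, [start]
    for _ in range(length):
        followers = table.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = "the bot writes text the bot has read before and the text sounds fluent"
print(generate(train_bigrams(corpus), "the"))
```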

These bots are not only very expensive to run, but they are also creating downstream social costs to which their makers seem oblivious.

In 2023, our digital world is awash with spam and fakes, and we must navigate vast amounts of derivative and unoriginal material. Why would we want to unleash a tool that creates even more rubbish? Ah, say the AI advocates — that’s missing the point.

ChatGPT will be useful because so many of our everyday exchanges already follow a predictable pattern or template, and the bot recognises these patterns and mimics them. An example often cited is generating the covering letter for a resume, which is assuredly not the place for a job applicant to get funky and creative. But generating this filler may not be necessary at all. Does the resume need a covering letter? Probably not. Could that two-page analysis be condensed into three bullet points? Very probably. No wonder the waffling classes, such as futurists and consultants, are fascinated by it.

France’s elite university Sciences Po became the latest institution to ban students from using ChatGPT to generate their essays without transparent referencing. “The sanctions for use of the software may go as far as exclusion from the institution, or even from French higher education as a whole,” an administrator warned. If schools and universities need to, they’ll disable Word, or uninstall it altogether — a consequence that the market has not yet factored into Microsoft’s share price.

Institutions are already thinking about how to redesign courses to reduce the amount of home learning, where cheating is easy, and rely more on in-hall exams. That isn’t cheap.

So, this is pollution with a cost. Fortunately, economists know exactly what to do: such a cost is what they call an externality. In 1920, the Cambridge economist Arthur Pigou, a winner of the Adam Smith Prize, introduced the idea of taxing these externalities so that the polluter pays.

Paul Sanders, a digital music entrepreneur who, like Pigou, approaches the problem from a classical liberal perspective, puts it this way: “The external costs from ChatGPT are the teachers’ time and effort, and pupils’ lack of learning caused by cheating at school, incorrect advice followed, and the extra efforts human creators need to compete. A tax is the simplest and most effective way to deal with gains and losses that fall to those outside of a transaction.”
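Pigou’s remedy can be stated in one line. The notation below is a sketch with illustrative symbols, not figures from the article: the corrective tax is set to the marginal external cost, so the bot-maker’s private cost rises to the full social cost.

```latex
% Pigouvian correction (illustrative symbols, not figures from the article).
% q = quantity of bot-generated output; the optimal tax t* equals the marginal
% external cost MEC at the optimum, so the polluter internalises the damage.
\[
  MSC(q) \;=\; \underbrace{MPC(q)}_{\text{private cost to the AI firm}}
  \;+\; \underbrace{MEC(q)}_{\text{cost borne by teachers, readers, creators}},
  \qquad t^{*} \;=\; MEC(q^{*}).
\]
```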

Big Tech will doubtless baulk at such a proposition. The industry is unable to acknowledge the pollution it creates. But we have an elegant and liberal solution to such digital effluence: it’s time to tax the bots.

THE DAILY TELEGRAPH
