After stoking anticipation for a new and improved GPT-4, OpenAI launched the language model a week earlier than announced, on 14 March 2023. The software is available to ChatGPT Plus members, who pay a monthly subscription of $20, and developers may join a waitlist to access the application programming interface.
The inner workings of an efficient technology are invisible. Hence, users may have unknowingly interacted with GPT-4 via Bing Chat, an AI-powered chatbot. Yusuf Mehdi, Microsoft’s head of consumer marketing, blogged, “[w]e are happy to confirm that the new Bing is running on GPT-4 [...] If you’ve used the new Bing in preview at any time in the last six weeks, you’ve already had an early look at the power of OpenAI’s latest model”.
In a notable collaboration between artificial intelligence and education, the popular language-learning platform Duolingo has integrated GPT-4 into its operations. “Duolingo Max” is a new subscription tier that personalizes a user’s learning experience through “Explain My Answer” and “Roleplay”. The former lets the learner open a dialogue with the AI about their answer, and the latter familiarizes the learner with real-life conversations.
Described as “advanced” and “sophisticated” by OpenAI, GPT-4 can take in and generate up to 25,000 words, roughly eight times more than its predecessor. Known for its ability to interpret images in addition to text, the software can express logical ideas about pictures. If GPT-4 is the human our corporations want us to be, with eyes to react to images, ears to interpret videos, and a “brain” to retrieve information, where does it live?
OpenAI worked with Microsoft to build a supercomputer in the Azure cloud in 2020. Microsoft Azure, a platform that stores and shares computing resources over the internet, was used to train and “keep” GPT-4. Microsoft funneled a billion dollars into the supercomputer, a machine with more than 285,000 cores. To put that in context, a normal consumer computer has between two and four cores.
Simon Portegies Zwart, an astrophysicist at Leiden University, tells Physics World, “if I run a supercomputer that takes as much energy as 10,000 households then who am I to tell my children, or other people, they shouldn’t shower for 20 minutes?”. Training a large natural language processing (NLP) model can produce about 626,155 pounds of carbon dioxide, as Strubell et al. found in a study covered by the MIT Technology Review. That is about as much carbon as 1,100 people might produce in a year.
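To see what the comparison above implies, here is a quick back-of-the-envelope sketch. Both input numbers come from the figures quoted in this article; the division is the only added step, so treat the result as the article’s implied per-person footprint, not an independent estimate:

```python
# Back-of-the-envelope check of the emissions comparison quoted above.
# Both inputs are the article's figures; only the division is new.

TRAINING_EMISSIONS_LBS = 626_155  # Strubell et al.: training one large NLP model
PEOPLE_EQUIVALENT = 1_100         # the article's yearly per-person comparison

implied_lbs_per_person = TRAINING_EMISSIONS_LBS / PEOPLE_EQUIVALENT
print(f"Implied annual footprint per person: {implied_lbs_per_person:,.0f} lbs")
# → Implied annual footprint per person: 569 lbs
```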
Aside from artificial intelligence’s capacity to maximize productivity, is software that can distinguish red from blue worth the environmental damage? OpenAI has been sneaky about integrating GPT-4 into platforms we interact with daily, but has our technology become so unobtrusive that we forget it takes up space?