Nvidia CEO Jensen Huang spoke at the World Government Summit, where he made a controversial statement regarding the future of computer science education.
For decades the received wisdom from those in the higher echelons of tech companies has been that learning how to program is an essential skill that more children should be taught. Jensen Huang, CEO of Nvidia, has turned all of that on its head by claiming that learning how to code is no longer necessary thanks to AI tools.
Huang argued that by using natural language prompts on artificial intelligence models like ChatGPT, humans will be able to easily generate working code for whatever use-case is needed, thus allowing humans to focus on other areas such as biology, education, or farming.
The wider tech community largely reacted with incredulity, with many tech professionals taking to Reddit to express their opinions. Some were cynical, believing the statement from Huang was self-serving in light of Nvidia’s heavy involvement with the AI market. One Reddit user called Veighnerg said: “Of course he is going to tell people to not learn coding. It will increase dependence on AI-powered stuff and his products will sell more.”
Huang goes against prevailing wisdom
Another user who worked in education explained that Huang’s words went against the prevailing sentiment in tech education, saying: “There has surprisingly been a shift in the importance of teaching Assembly. We participate in several security-focused computer science programs where reverse engineering malware and faulty code has really brought assembly back into the realm of relevant knowledge.”
Assembly language is low-level code that is often quite hard for humans to read, as it sits very 'close to the metal', but it can produce programs that run more efficiently than those written in high-level languages such as Java, Python, or C#.
It was also pointed out that using AI tools like ChatGPT will ultimately have diminishing returns as the AI models will be forced to train on data output by other AI models. Reddit user Dizekat explained: “Once AI-generated code pollutes their training datasets, it’ll start declining. It can only damage the training dataset by diluting original data with an imperfect copy.”