Juan Andres Guerrero-Saade’s specialty is picking apart malicious software to see how it attacks computers.
It’s a relatively obscure cybersecurity field, which is why last month he hosted a weeklong seminar at Johns Hopkins University where he taught students the complicated practice of reverse engineering malware.
Several of the students had little to no coding background, but he was confident a new tool would make it less of a challenge: He told the students to sign up for ChatGPT.
“Programming languages are languages,” Guerrero-Saade, an adjunct lecturer at Johns Hopkins, said, referring to what the ChatGPT software does. “So it has become an amazing tool for prototyping things, for getting very quick, boilerplate code.”
ChatGPT opened up to the public in November and quickly gained millions of users who reveled in its uncanny ability to mimic nearly any style of writing, from Seinfeld scripts and limericks to religious texts and Shakespearean sonnets.
And while there’s been plenty of speculation about its ability to disrupt writing jobs, some computer scientists are now wondering if its most immediate impact will be on people whose jobs were once thought of as “futureproof.” YouTube and TikTok are already rife with videos of people showing how they’ve found ways to have ChatGPT perform tasks that once required a hefty dose of coding ability, from building entire websites to scraping information from the internet.
“The hottest new programming language is English,” tweeted Andrej Karpathy, a former senior director of artificial intelligence at Tesla and a founding member of OpenAI.
ChatGPT’s ability to mimic a particular author or style comes from the fact that developers trained it on the readily available and public information spread across the internet, which includes vast repositories of published computer code and discussions of how to troubleshoot it. That gives ChatGPT and GitHub Copilot, a similar program designed specifically for coding, a rich foundation on how to complete all sorts of programming tasks, said Grady Booch, the chief scientist for software engineering at IBM.
“They’ve got an open book — they’ve got the internet at their disposal,” Booch said. “They’ve probably found answers to questions that have already been answered. So it becomes easier, faster.”
That won’t put professional programmers out of a job in the immediate future, but it’s speeding them up, Booch said. Even before ChatGPT, coders who ran into a problem often used Google to look for a solution.
“It doesn’t change the way I do business. But it kind of speeds things up for me,” he said. “It’s not revolutionary. It’s evolutionary.”
David Yue and two other engineers beat out around 300 programmers last week in a San Francisco competition to build the most interesting AI software program. His team’s project, titled “GPT is all you need for backend,” used the chatbot to automatically build some of the necessary but not particularly unique parts of how apps work.
Yue said that while software engineers have been building those kinds of tools for years, the speed at which they have recently taken off has taken him by surprise.
“I think there was no doubt about the inevitability. But absolutely the speed at which it happened is quite surprising,” he said.
ChatGPT and related technologies are not perfect. They can introduce coding errors, and some have questioned whether the code they generate is secure. But as long as they have human minders with some programming expertise, that may not be a major problem. Siddharth Garg, a professor of computer engineering at New York University, said he and his colleagues recently completed a first-of-its-kind study in which he gave a coding assignment to groups of students but allowed only some of them to use ChatGPT or Copilot to help.
“We didn’t see a substantial difference in the incidence of security bugs in human origin code versus code that is generated by Copilot or ChatGPT,” Garg said.
“Yes, there are security bugs, but humans also produce security bugs. At least we didn’t notice a significant difference.”
What does all this mean for the many people who learned to code in hopes that they would be in a lucrative profession? Not everyone is pessimistic about their future.
“Generative AI can automatically generate code, making it easier to create software, and amplifying the power of a software engineer,” wrote Hadi Partovi, CEO of the tech education nonprofit Code.org, as part of a lengthy Twitter thread about the topic. “This will accelerate the creation of (and demand for) software, and more people will become software engineers,” he concluded.