I am beginning to perceive a pattern, where people outside the STEM fields see AI as this revolutionary technology that is going to upend everything, while the people inside the STEM fields have an attitude more akin to "here's another tool for the toolbox; it will improve your productivity in some ways, and occasionally give you a terrible headache."
There was a similar dichotomy of attitudes when microcomputers were becoming popular, but we didn't get a utopian paradise, and the machines didn't take everyone's jobs or enslave us. Some jobs went away, some were created, some changed; but overall, we're still in perdition, just with some aspects running orders of magnitude faster.
I don't write academic papers, so automating that process does nothing for me. However, I do occasionally read papers from electronic engineering academia, and in that context my interest is whether the paper contains information I need to solve a problem. The form of the paper affects its readability, but the underlying research is what determines its utility.
In its current incarnation, OpenAI is obviously and painfully incapable of producing academic papers of any value to me (see my travelling salesman example, above). An AI that could synthesize existing papers to produce novel conclusions might have its uses, but the results would have to have practical relevance. There are an infinite number of true yet meaningless papers it could construct, which means someone who knew what they were doing would have to present a problem space for it to synthesize over, or sift through the mud to find the gold nuggets.
An AI that could, on its own, determine what kinds of papers someone in industry would find useful, and then synthesize those (effectively, an autonomous research engine) would be verging into singularity territory, and that would be pretty cool.
Get ready for rocket scientists that drank their way through grad school getting jobs at NASA and SpaceX.
I thought all rocket scientists drank their way through grad school?
But in practical terms, AI isn't going to provide a vector for students to cheat their way to a degree, because nearly everyone in the domain will know what the tools are capable of.
There will be periods of transition, to be sure, but the assignments will just change form. Instead of "perform research X, and write a paper on it", it becomes "use RocketAI to model research X, and generate a paper from your results using PaperWriter-2.1".