@ct_bergstrom The cynic in me suspects that they are very well aware of this, but since basically the only thing LLMs can do is generate plausible text, they deliberately redefined the goal of science as something to which "generating plausible text" seems like the answer.
Wouldn't be the first time that the LLM crowd (or the cryptocurrency crowd, with which it has a lot of overlap, for that matter) has pulled something like this. Just redefine the problem so that you are the solution.