Taylorism is a management philosophy based on using scientific optimization to maximize labor productivity and economic efficiency.

Here's the result of making the false Taylorist assumption that the output of scientific research is scientific papers—the more, faster, and cheaper, the better.

Papers are not the output of scientific research in the way that cars are the output of automobile manufacturing.

Papers are merely a vehicle through which a portion of the output of research is shared.

We confuse the two at our peril.

The entire idea of outsourcing the scientific ecosystem to LLMs — as described below — is a concept error that I can scarcely begin to get my head around.

sakana.ai/ai-scientist/

@ct_bergstrom The cynic in me suspects that they are very well aware of this, but since basically the only thing LLMs can do is generate plausible text, they deliberately redefined the goal of science as something to which "generating plausible text" seems like the answer.

It wouldn't be the first time that the LLM crowd (or the cryptocurrency crowd, with which it has a lot of overlap) has pulled something like this: just redefine the problem so that you are the solution.
