Why I may ‘hire’ AI instead of a graduate student (science.org)

by doener 110 comments 117 points


[−] __bjoernd 61d ago
Over here in Germany, professors' job is "research and teaching". According to the internet, the author's university is a publicly funded university as well. I can see how AI can make you faster on the research side, but you give up 100% of the teaching/developing people part.

As a taxpayer, I am very concerned when the people I fund with my taxes to do a job unilaterally declare they are no longer going to do half of it.

[−] fasterik 61d ago
Teaching and research should be decoupled. Professors are hired and granted tenure primarily based on their ability to produce original research. The skillsets are different; often good researchers are bad teachers, and good teachers are bad researchers.
[−] Molitor5901 61d ago
My experience at a research university, albeit 12 years ago, was that many of my professors loathed teaching. Some openly expressed disdain for the time they wasted teaching when they could be researching. I think a better framework in the future would be to have researchers, and lecturers/teaching professors separate. One is to teach, the other is to research.
[−] MITSardine 60d ago
In the classic division, "teaching" consists in giving undergraduate classes, and "research" consists in the whole spectrum between working all on your own and managing a PhD factory (3+ students a year).

So this article really isn't saying anything controversial on the strictly ontological side of things; in fact, it's already a relatively common stance to prefer supervising few students (or, more rarely, none at all).

This researcher is saying "when I consider hiring someone as a workhorse, I might prefer AI instead"; what's the harm in that? Too many PhD students are used as disposable cheap labor, seeing little personal growth in their PhD journey and being generally neglected and abused.

[−] cherryteastain 61d ago
Teaching an undergraduate class or even a graduate class is still teaching. The author does not say he won't do that anymore.

The problem is about the fresh talent pipeline for researchers (i.e. PhDs). In many ways, elementary school and a Master's degree are more alike than a Master's and a PhD: in both elementary school and the Master's you're learning prior art with clearly defined exam/project assessments and no expectation of producing something truly novel, while a PhD is all about discovering something nobody has uncovered before. So calling this a problem of not wanting to teach isn't quite right.

IMO, the article is rather highlighting a different problem: the former problem in this area was that only a tiny sliver of the best engineering/CS undergrads wanted to go into research given the far more lucrative industry careers, and now the supply part of that market is about to vanish too due to agentic AI. This will basically kill the concept of an academic career as we know it, and the point of the article is that we need to find a different model of advancing and funding science.

[−] rob_c 60d ago
You've clearly never worked in the academic sector. Calm down. Most researchers are hyper-focused on their research productivity because at times 90%+ of their time is consumed by teaching for months at a time. This is an almost universal constant for all decent institutes globally. Taking on extra cheap labor in the form of grad students used to be the only way to cope, but every single time this turns into onboarding someone for months to get weeks of work out of them. Great when you can hide it in the other side of your job, but most of the time you can't...
[−] dude250711 61d ago
[dead]
[−] SirHumphrey 61d ago
Students are not only workers, they are also disciples of your work and, once forced to read it, will likely use it in the future even when they leave your lab.

Even from a purely egoistic standpoint, replacing students with AI is shooting yourself in the foot in the long term.

[−] CraftingLinks 61d ago
It says a lot about US academic culture that they think in terms of hiring. There is an important educational commitment requirement to the role of professor, at least in Europe. Hiring is for the betterment of your own goals and almost orthogonal to the educational mission. A lot of unethical practices find their root in this schizophrenic mission statement of doing professional competitive scientific research and, at the same time, educating graduates.
[−] lkm0 61d ago
This sounds misguided. In the little experience I have, I've seen models get basic knowledge so absolutely wrong that giving them any sort of independence will not result in publications that positively impact a professor's reputation, or contribute to science. At least, the reviews and papers I've read that had AI content did not give me the impression that we should have more of this. And they require much more supervision, with the added issue that they cannot learn in the long term through your interactions, and without the enjoyment of teaching something to someone. They're really good at finding papers, though. Perhaps because navigating search engines has become a pain. Maybe this will change in the future, but saying you're tempted right now is like saying you're tempted to replace your HPC cluster with quantum computers. It's a bit early.
[−] BitsAndObjects 61d ago
10 years from now, the people that stopped hiring novices and juniors are going to be deeply regretting their past decisions. The people that kept hiring are going to be working with their newly-promoted-to-senior colleagues and be making significantly more progress than those that didn’t keep hiring.
[−] NoPicklez 61d ago

> The issue is not whether my students are valuable. In the long run, they are invaluable. The issue is that their value emerges slowly, whereas AI delivers immediate returns.

This is like the classic "I'll do it myself because it's quicker".

In the current environment, and likely more so into the future, those that hire and develop graduates' skills are going to be looked at more favorably. Furthermore, the people you work with and coach become peers and typically help build a network of people who can go to bat for you.

It's tricky, and it's a balancing act, and I appreciate that AI is becoming the easier, quicker, less-fuss option.

[−] 0xpgm 61d ago
To beat to death a well-known quote:

You may be able to go fast with AI, but you can only go far with humans.

[−] Peroni 61d ago

> In the process, they may bypass the valuable experience of struggling through early tasks and learning from their mistakes. Students, I worry, could simply become an intermediary between the raw idea and the AI’s output.

Even if all AI progress grinds to a permanent halt today, there's already enough utility in its current capability to force these questions. As a result, how we train and educate graduates and young people needs to change.

I have no doubt you need actual experience to be able to ensure AI output is at a production standard, but if we accept that reality, then a shift in how we educate and train young people could make an enormous difference in ensuring employers still see value in hiring people with no real commercial work experience.

[−] heavyset_go 61d ago
It's interesting to see the ways AI mania and psychosis manifest.
[−] servo_sausage 61d ago
This is something that will have to be solved through the way research is funded.

At least for publicly funded work, it was always an assumption that you would need students to hit some goal, so by funding it you would get both the outcome and more people skilled in that field. If the scope of what one team/senior can handle has grown with AI, we will either need explicit staff numbers as a requirement, or a bigger scope to the point where the AI can't handle it.

Or we find that AI can do so much the whole system implodes...

[−] csvm 61d ago
Why not hire a graduate and empower them to use AI? It's much better to interface with an actual human who will then go and do the work using all the AI tools at their disposal.
[−] noobermin 60d ago
Even the author eventually admits the problem is fundamentally academic funding and the stupid publishing culture we have. I wonder whether, if the author had had the courage to actually expound on that instead of leaving it as a trifling mention at the end, it would have been published as a cute little op-ed in Science in the first place.
[−] ttanveer 61d ago
It's interesting that this dilemma (of getting quick and easy wins) is occurring at multiple levels. Even as a junior researcher, it's often tempting to hand off the actual thinking and reasoning about one's research to AI (e.g. blindly accepting AI code) to quickly make 'progress'.

Apparently the same question is being asked at different levels and abstractions...

[−] throwaw12 61d ago
Just like DEI and sustainability efforts, I predict we will see new initiatives for forced hiring of juniors.

Implementations can differ (e.g. a required ratio of interns to total headcount, and so on), but it is time for governments to intervene and force corporations to train people. Humans are a resource for the government; it needs to develop that resource to thrive.

[−] peyton 61d ago

> I’m not sure where that will leave students who start with no research experience.

What is wrong with this guy? Of course he knows where that will leave those students. Why did he even choose to be in the business of developing people? Nobody forced him. Anyway, the ladders were pulled up in 2020–2021.

[−] ares623 61d ago
It's one thing to be forced to use the damn things, but this guy gives it very serious thought, much more than others I've seen and known, even writes a science.org article about it, and ultimately chooses wrong.
[−] oncallthrow 61d ago
This is morally wrong and it should be embarrassing to publish such an article
[−] teekert 61d ago
We measure scientific output as the number of publications. And that is the cause of BS like this.

These institutions have a duty to educate humanity. PhDs are also supposed to be able to help the public understand complicated science. To guide ethical decisions.

But no, we measure the number of papers, and not even their quality (very well).

It's all a matter of incentive alignment: what gets measured gets done. The state of academic science is sad in most places, this contemplation by OP being a case in point.

[−] akomtu 61d ago
In other words "why I prefer machines over humans and how I've replaced my moral compass with productivity metrics".
[−] eeixlk 61d ago
To drive your car, to build your software, to run the government. But not to change the oil in your car. Ya ok.
[−] kiwih 60d ago
I am an academic at a big Australian university. I can't disagree more with this article. Students are not just machines that convert ideas to papers (and nor should faculty), they are one of the true joys of academia in the first place.

Unfortunately, as other comments here have pointed out, the incentive structure for academia right now is misaligned, leading some faculty to focus just on publish or perish. This is a huge shame on so many counts.

My hope and also slight expectation from the AI era is that academic writing becomes so commodified that we see a total devaluing of papers as a metric, even in prestige venues. Perhaps this would help the community find better things to focus on.

[−] tigermafia 61d ago
Reading this article makes me happy for the graduates that are spared from working for an absolute imbecile.
[−] kang 61d ago
The title is not relevant to the article, not even for a single line. The author straight-up assumes and never answers the 'why'. I came here to give the Lady Lovelace argument to Turing: you would NEVER hire an AI instead of a student unless you're making directionless slop. You can share goals, but not the vision, and the mission is something else again. AI learns from experience; humans are needed to build that experience thanks to their extremely large 'context windows', going as deep as the constant evolution of DNA (as long as it serves human-centric goals, which circles back to the mission part).

The article really is about "education seems directionless without economic goals", and again as comments have pointed out, it only seems so.