Another positive externality is that academia furnishes a source of labor for peer review. The vast majority of labor, both for reviewing papers and for organizing conferences, comes from researchers in academia. Industry researchers are not in a cultural milieu where reviewing (or serving on a program committee, if you're in ML) is expected. Of course, whether peer review provides societal value is now being questioned. I've previously argued that it does have value, in that it increases recall (sensitivity) to new ideas, particularly from people without prior fame or prestigious affiliations: https://calvinmccarter.substack.com/p/peer-review-worsens-precision-but-improves-recall
Well, this is certainly thought-provoking and needed, speaking as one of the "burn it all to the ground" guys. Ironically, it's far better written and more accessible than an academic paper would be. In no particular order:
- It's not really the case that companies don't train people from scratch, is it? Historically that was the only way people got trained, as the vast majority never went to university. Many people still do. Apprenticeship schemes are found widely throughout Europe. And I should say that in my own field of computing, most people teach themselves. CS degrees are often worthless at actually teaching programming from scratch, so self-teaching carries the load. It's not at all clear that universities are required for training.
- International community?! Conferences, international meetups and international collaboration are hardly academic! Why do you think only academia does this? The average programmer has been to way more industry conferences than academic, assuming they go to such events. I've been to both types but industry conferences were far bigger, more useful and in fact more international. The software industry routinely organizes hundreds of thousands of international collaborations in (open source) projects. Other industries also work through international orgs, often standards or research related (e.g. MPEG). This argument is very weak.
- Generating open knowledge. I think this is one of the best arguments. Though patents do the same thing, with the difference that the peer reviewers are actually incentivized properly. The main problem with patents is that because patent rights are overly strong and penalties (in the USA) triple for wilful infringement, most people in industry don't read them. They might be useful, but you're better off independently reinventing what's in them for financial/legal reasons. But then again most people don't read academic papers either.
- Cross discipline collaboration. No, this is silly. Academics are notorious for totally failing at cross-discipline collaboration, to the extent that corporate labs routinely smoke them just by fixing this one thing. Look at how easily DeepMind crushes entire academic research fields through the magic of pairing up AI researchers with researchers in other fields. Academic fraud involving stats and programming tasks is absolutely notorious, often rooted in cockups due to a refusal to hire people with the right skills. Why? Because those skills are more valuable than theirs are, and they hate the idea of spending their grant money on expensive programmer/data scientist types rather than more people inside their own field, so they wing it and end up publishing nonsense. This is not an argument to keep academia.
- Obscure talent pools / orthogonal incentives. Quantum computing isn't a good example here because IIUC it's mostly driven by corp labs at IBM, Google and some startups. I don't think academia has done much to develop talent there, and anyway the promise of the tech hasn't panned out. But the wider argument you make is that we need systems that aren't optimized by capitalism because it might optimize away something useful. The problem with this nice sounding idea is that it leaves academia without any way to prune the genuinely and offensively useless. This is one of the biggest weaknesses of the whole enterprise and why so many people are turning against it. The refusal to defund useless or outright harmful work has led to academia incubating dangerously extreme and idiotic ideologies. It also floods the literature with garbage that nobody has the time to wade through, which obviates quite a few of the other suggested advantages.
I am one of the relatively rare types who works in a corporate research lab, and so I read scientific papers as part of my job. I've done this for years. I mostly avoid academic papers these days because I've learned the hard way that the ROI just isn't there. I wasted years of my life studying a purely academic sub-field that corporate labs were ignoring, because I believed the claims made by the academics writing the papers. Then one day I went to an obscure conference with them and learned that the way they presented their work in their talks was very different to how they presented it in papers and press releases. Major problems were revealed that somehow hadn't made it into the actual published works, and after I got some of them drunk one of the most celebrated researchers told me they would never actually use their own research for anything! What a fool I was. There were good reasons nobody tried to apply their output after all. And yet this sub-field churns out hundreds of papers a year. Other sub-fields are hardly better. The best papers come from other big corporate labs, and that's where the time is best spent. Or just, y'know, doing research and pushing the boundaries forward a little bit at a time. Blue sky stuff is rarely the right way to go.
It's a fine essay by conventional standards, but it rubs me the wrong way.
How about an alternative way of thinking about it: do most of the things you recommend, but conceptualize academia etc. as corporations with a monopoly position that cannot be dislodged without substantial power, with the goal of establishing entirely new competitor institutions that do not have all the (arguably unavoidable, in their case) failings *and staying power*[1] of academia.
[1] Due to exploitation of well known human cognitive shortcomings, many of which were put there in the first place by academia.
> Taking ideas seriously. With its roots in philosophy, academia genuinely values ideas for their novelty and depth rather than just their production value.
A problem: this is only true to the degree that it is true. You'd be hard pressed to even find a philosopher who can take ideas *really* seriously these days, so enamoured are Westerners with their cultural "facts".
> There are few other places where people will grapple seriously with other people’s ideas.
If it's in the course curriculum, maybe... but try getting people to think about a genuinely novel idea and see how well that goes.
> Academics will actually dig into the minutiae of ideas and the why behind them. (This could be because of genuine curiosity or the desire to be right but either way ideas get explored in ways they wouldn’t otherwise.)
It could also be a nice-sounding story that isn't actually true in any sort of absolute sense... which is what it is.
I'd like to add another consideration. Society has a need for a transition from childhood to adulthood. Sure, sending kids off to college is an expensive way of doing that, but coming-of-age rituals are often quite expensive in a society.
Moreover, in our modern world, where people are expected to live on their own and get a job outside of a family business, you need a period of transition where young people can learn to live independently with lower stakes, not to mention the incredible social benefits offered by colleges (meeting friends and often life partners). College has performed a neat trick in getting families to pay for what is an incredibly valuable social experience under the guise of useful training (which it does provide some of).
And if we have some institution that serves a similarly formative role in people's lives, it is going to be the recipient of substantial support later in life from the people who went there.
Ultimately, we are incredibly lucky that we managed to create an institution with substantial positive externalities to fulfill these roles. In past ages it was often the military that did this for young men. In our current world, that role would likely be taken on by corporations (e.g., Google would have its dorms and such for its young engineer program, as would many other companies).
It's an amazingly lucky thing that this role is filled in our society by a relatively benevolent organization with positive externalities.