Some Thoughts on “Academic Training”


I’ve long said that university education starts to make a lot more sense if you look at it as a precursor to academic training. Historically, there have really been two major types of undergraduate university training, in my opinion: the liberal arts type of education, meant to turn rich people into cultured members of society (some of whom then went on to pursue academic training and scholarly activities, because they were rich and could afford to do so), and the more specialized type, meant to make the student literate enough in the major foundational ideas of the field to pursue additional training at the graduate level. If I remember correctly, universities functioning more like businesses is relatively new, as is the idea of university being a place for vocational training, as is the idea that most adults should get a university degree to be employable. (Also of interest: see “credential inflation.”)

Undergraduate courses at a typical research university are taught either by academic researchers, who on average have no training or interest in pedagogy and little to no non-academic work experience, or by academically trained lecturers, who, on average, may be interested in pedagogy (but are almost never trained in it) and typically have dubious (if any) non-academic work experience. The fact that so few of the people teaching courses in undergrad have an industry background seems to hark back to this idea of undergrad being a training ground for grad school, and thus for “academic training”. (That being said, the core sequences of mandatory courses I’ve experienced in my program do not seem to be particularly good training for grad school, and I also happen to think they are pretty terrible as training for the real world, which sort of implies that I think my degree has failed as a whole… but that’s a subject for a whole other blog post.) In most cases, education at the graduate level literally is academic training, with the exception of professional or other sorts of purely course-based degrees. Unfortunately, one of the things I am starting to realize about academic training is that almost everything about it seems designed to be terrifyingly unclear, and I have no idea whether this sort of thing is meant to be seen as a feature or a bug.

I am neither a graduate student nor a researcher, but I have participated in undergraduate research opportunities and completed graduate-level courses, and a defining characteristic of all of those experiences has been having no idea what is going on. I feel like I’m always flailing around in confusion, like I’m supposed to just absorb what I need to know by osmosis, and I wonder if that’s normal: if the admissions process and the courses and the training and the research and everything around grad school are just intended to filter for intense autodidacts who can handle a large amount of ambiguity. To an uninitiated person like me, academic activities feel largely like swimming in a pit of the unknown with only vague guidance, if any at all, and this is true even in relatively low stakes activities such as courses.

To illustrate this, let’s talk about the final course project from the (graduate-level) data science algorithms course I took in the fall. The project requirements were incredibly vague: pick some research paper to investigate (how?), write a six-page research report about it (in what way?), and present a 15-minute seminar to the class (at what level of detail?). Sure, on the surface it sounds reasonable, if you totally ignore the fact that the average computer science undergraduate at my university finishes their degree without ever writing a formal report or essay and without ever doing a solo presentation. Most of us finish our degrees with no idea what the expectations are, but when we become grad students we’re all of a sudden supposed to intuit what a good project looks like and pull it off with a high level of proficiency.

What frustrates me the most about that course project is that even after attending the professor’s office hours and asking him several questions about it, I still had close to no idea what the expectations were, so I decided to do my best and move on. Clearly, most of the other students in the course had no idea what the expected standard was either, since we all gave wildly different presentations. But it turns out that the professor did, in fact, have a good idea of what he wanted: the rubric he used to grade us was quite detailed and enlightening, and alongside the rubric and our feedback he also included a video on how to give good talks. I really would have liked to see those before starting the project, not after the course was over. This sort of thing makes me suspect that at least some of the vagueness was intentional, even if it was due to subconscious thoughts like “they’re grad students – they should be able to figure it out.” But I don’t think everyone in the course figured it out, and presentations are hard even under the best of circumstances. Knowing a bit more about the rubric and what was expected of us in the presentations would have set everyone up for success.

Of course, I totally get that leaving projects more open-ended allows for more creativity on the part of the students; it allows for pleasant surprises and stops students from self-censoring their work to fit their narrow interpretation of what the professor has said. It shifts the focus away from maximizing grades (which, sadly, is what I feel the undergrad experience has devolved into) towards the discovery and learning process, which is what you probably want in a degree that is training people to eventually do independent research. And it’s more fun this way. There is something very freeing about being able to take your work in whatever direction you want, and I love that. This does come with the caveat that there are boundaries you have to stay within (and in courses like this, you don’t quite know where those boundaries even are), but as long as you produce good work, profs are usually happy to let you bend the rules. In fact, in my experience, the better the end product you come up with, the more you can get away with bending (or even ignoring, at times) the rules and requirements.

The problem I have with open-endedness executed in this way is that I feel like there’s some subtle elitism snuck in there too. You can leave a project open-ended while still giving students some guidance, but my professor didn’t do this. It feels like there’s this attitude of expecting people to prove their competence before any effort is ever spent on helping or properly advising them. We all did the project based on our personal experience and understanding of how to produce a report and presentation, and our grades reflected the level of ability we entered the class with. We only got advice, feedback, and resources when it was already too late to improve. This is what I mean when I say that I wonder if academic training is designed to filter for intense autodidacts who can attain the required level of competence largely on their own.

The research supervisor I had last summer was extremely hands-off. He was very kind, and very available, and I learned a lot of new math from him, but when it came to the nuts and bolts of how to actually do my research, that was entirely up to me to figure out. I complained about this to a senior academic once, who responded with something along the lines of “if we knew what we were doing, it wouldn’t be research!” (Classic response from an academic, lol.) I find comments like this frustrating because while they’re technically true, they manage to completely miss the point. Of course research is about wading into the unknown! I’m not complaining about hanging out in the unknown: I’m complaining about drifting around in the unknown with no tools or equipment or direction. I spent the summer comparing myself to a prodigy who was, according to my professor, already performing above the level of most grad students (he said something about how he usually has to hand-hold grad students a little bit and walk them through things, but this guy figured it out on his own!), which sucked, because no matter what you do, you’re going to look stupid next to a prodigy. It felt like it was bad to have to ask how to do things, since clearly the idea was that you should just magically have good intuition. And sometimes, when I did ask how to do things, I got non-answers like “I’m sure you don’t need my help for this.” (Which was a frustrating answer, since I did need the help… but it’s fine, I found help in other ways. Sort of.)

I did end up figuring it out, I think. I’ve worked on so many independent projects that I’m pretty used to having to learn things on my own. I’m an artist; we’re insane autodidacts almost by definition. I think there are a lot of similarities between academic behaviours and artist behaviours: the need for curiosity, the constant drive to learn and innovate, the need for self-discipline and initiative. But one thing I think academic spaces do that artist spaces don’t, or at least not with quite the same zeal, is ruthlessly filter out people they don’t see as having the innate skill and work ethic and personality type to succeed. What I guess I’m saying, in a sense, is that academia is extremely inaccessible to people who don’t have a ton of endurance, already highly developed skills, and the ability to navigate mountains of hidden curricula. I know I’m not the first person to point this out; there are all sorts of conversations happening about the inaccessibility of academia. But now I’ve had first-hand experience, and it is an issue that is sort of on my mind.

As I write this, I’m extremely conflicted about what my actual opinion on all of this is. I realize that I’ve been extremely ungenerous to academics (and to the people I just talked about) in my characterization thus far. For what it’s worth, I think I’ve done fine at surviving my “academic training” so far. I think I killed it in my course project, and it seems like my supervisor thought my research project went okay. It’s not that I’m saying these things because I don’t think I can succeed in the current system; I do think, though, that the system screws some people over. Like in any competitive field, it’s normal for the system to be designed so that some people can’t make it. The question then becomes whether there are people being screwed over who shouldn’t be.

While I’ve just complained about academia being inaccessible and this sort of “sink or swim” mentality being bad for students, I’m not entirely convinced that the inaccessibility is a bad thing. The goal of academic training is to produce highly competent experts, but not everyone can be an expert, and not everyone wants to be an expert. Cutting-edge research literally means going as far as possible in your chosen subarea of your field. Why should a professor invest time, money, and resources in a weaker student, who is unlikely to make it, instead of investing in a skilled and motivated student who has already proven they can figure things out on their own? If you give two people the same resources, the more competent person will almost certainly be able to do more with them. If the goal is to produce the most competent experts, investing only in those who have already proven themselves is a reasonable strategy, especially since time, money, and energy are finite (and often quite limited) resources.

Another thing to consider is that research is, by definition, not accessible. You are going to be doing new things that no one understands yet. To do that, you are going to have to look through a whole bunch of documents very few people understand, possibly replicate their results, possibly invent new techniques, and so on. Then, you’re going to have to convince a bunch of other experts that whatever results you came up with aren’t completely insane. If you can’t even handle trying to intuit what a good report and presentation look like, will you really be able to handle the challenges of research?

There is also, I think, something to be said for building your own tools rather than using tools someone else handed you – sure, it takes longer, but they’re custom-made and they presumably fit you better, and it guarantees that you actually understand how the tools work. I think this, right here, is the best argument for a very hands-off advising style. Assuming you manage to learn things, the lessons stick, and you feel complete ownership over your learning. You get to know yourself, and what works for you, and you don’t need to work on building independence later on, because you’re already working on it. I learned so much about time management, project management, problem solving, programming, math, and more from my research internship, all on my own, because I had no choice, and that experience has served me way more than I ever expected it to. As much as I complained about it, and while I might have made more concrete research progress if I had been hand-held a bit more, I don’t know if I would have developed as many of those skills in such depth. I definitely wouldn’t have gotten the chance to explore my own ideas and draw my own conclusions as much. As a tactic for forcing me to learn, it was brutally effective.

The flipside is that amateur toolsmiths often make very flawed tools, which is why being too hands-off is probably bad. It’s better to know that the tools need to be fixed sooner rather than later – and more importantly, it’s crucial to know why they need to be fixed. That “why” is probably much more efficient to learn from a supervisor than to spend months figuring out on your own, which can leave you with wasted time and habits you need to unlearn. (I can think of an example of that in my own experience – but I also wonder if that time was really wasted, since now I understand that problem much more in depth. See what I mean?? There are pros and cons all the way down.)

So while I’m not convinced that the system we currently have isn’t working as intended, my concern is that we’re excluding a whole group of people who could grow into amazing scholars if they were given just a little bit more guidance at the start. Self-sufficiency is a skill, which means it can be learned, but I don’t think scarring people by throwing them into the deep end and seeing who swims is always the best way to cultivate that skill in students. The current state of undergraduate education is that there is a ton of scaffolding built in all over the place – undergraduate courses give students lots of guidance and instructions on how to do things (or, at least, a lot of mine do), especially in required courses. Regardless of what we might think about whether or not that’s a good idea, the reality is that most people need some time to adjust to no longer having that scaffolding. I feel like once you hit grad courses, the whole support system has been torn down and thrown away, and we expect students to adjust to that immediately.

Would it really kill us if that scaffolding were removed more slowly? If, instead of throwing it away all at once, we tore it down in stages? It seems to me like we could just allow students who don’t need the scaffolding to ignore it. And I feel like making this sort of change would lead to more diversity in the types of people who end up as academics, which has to be a good thing.

But what do I know, I’m just an undergrad.