Friday, May 4, 2018

What is the game university for?

I did not go to game design school -- not that I had a choice, since only a handful of game design programs existed when I attended college. Like many in the industry, I'm "self-taught" -- which is to say, I relied on informal learning from a network of creative communities and random online tutorials. Today, I teach in one of the better-rated and better-funded game design programs in the world. With my self-taught background, I'm often suspicious of the idea of formal technical education. So here's my experience with teaching game development for the past 5-6 years:

Students, parents, and industry developers often loudly imply that we, as universities, are never doing enough to prepare students for the "real world". This criticism exists alongside the skyrocketing cost of US higher education, the perceived ivory-tower elitism of academia, and a societal shift toward privileging "practical" "hard" skills like science and engineering over "useless" "soft" skills like literature or ethics. This anxiety is understandable, but it also plays into a very politically conservative vision of universities: that we exist only to train a productive and compliant workforce.

Danette Beatty recently tweeted something that seems very reasonable and actionable, and her thoughtful thread started a long, important conversation on generalism vs. specialization and how we ought to teach game development... but a lot of people don't read beyond the first tweet in a Twitter thread, so I'd like to get into why "set your students up with the skills to actually get jobs that are in demand in the industry" has been complicated for me as a game dev teacher.

I'm going to define a generalist as "someone who has basic competency in diverse skills" and can thus execute an entire project themselves. They may not be the best artist or programmer or designer -- but they can make basic art assets, prototype game systems with code, and communicate an intended experience at a presentation or meeting. Generalists often have specializations, but they're also not scared of picking up new skills or filling in for another role.

Can the university (or the industry) predict which tools and workflows will be relevant in 5-10 years? An entire generation of students learned Torque3D, Flash, and Macromedia Director. How marketable are those skills now? Many design departments have learned painfully, from experience, that teaching too closely to the tools usually doesn't serve students well.

Different parts of the world have different jobs. In NYC, there's basically one AAA studio, and most indie studios staff a few people at most. I know many students who go on to do games-adjacent work in advertising / journalism / medical / education / film and TV, fields that often seek generalist interactive developers. These other industries don't know or care how you profiled UE4's GPU memory stats, or how you can sculpt amazing rocks in ZBrush... they just want you to make an app or game, like some sort of general contractor.

(To be clear: we're happy when our students enter AAA, but we're also proud of students when they find any other job or figure out any other living. If games are art / culture, then that means studying game design should be generally useful for a life that's adjacent-to-games or outside-of-games too. Must all English majors become novelists and poets? Well, I didn't.)

Also, why isn't the game industry generalizing its junior positions and making itself more accessible? It's fun to blame schools for everything, but maybe some of the fault here lies with studios. If the game industry can't fill its absurd "temporary contract: junior graphics programmer with 20 years of C++ experience and 4 shipped titles" openings, then maybe it should change how it hires for junior positions and adjust its studio structure? Instead of expecting colleges to completely train its workforce, maybe the industry should invest much more in outreach and training. In most industries, this is the baseline expectation: entry-level hires are trained and mentored on the job, and in exchange for that investment, the hire stays at the company and makes it worth their while. But for that to work, you'd need an industry with low turnover and low burnout. Some question whether the game industry can even support its existing specialists.

Games education is not a monolith!!! Each school should be allowed to specialize in its own vision of game design. At NYU Game Center, we do not train C++ engine programmers or 3D gun artists; we teach design-first and generalist fluency. Other games programs at USC, UCSC, UCLA, CMU, SCAD, Digipen, Guildhall, LCC, Concordia, ENJMIN, KADK, ITU Copenhagen, TH Koeln, ZHdK, Uppsala, RMIT, etc. all have different focuses too. If you want to critique post-secondary games education, that's great and we want to hear your feedback, but try to be more specific about who you mean.

What is the purpose of post-secondary education? People can't even agree on this basic question. Historically, universities started as a way to train clergy / scholars / government officials. In the US, we developed a Deweyan tradition of the "liberal arts", meant to prepare well-rounded citizens to critically engage in democracy -- as generalists in life itself! And Paulo Freire famously argued that education should nurture revolutionary political consciousness. How does any of that fit with other academic priorities like athletics programs, military aerospace research, and teaching teens how to code? You could also argue that "college" functions mostly as a rite of passage into the upper middle class / elite, where the main goal is to make the "right kind" of friends and social networks, and it doesn't even matter what you study.

The real world does not consider game development to be part of the real world. Practical technical skills in game development are considered impractical by most of society. How does this prepare students for the real world? Most people would argue that "training students in technical skills for a niche hyper-competitive industry with very few accessible junior positions, while industry practice and trends drastically change every 5-10 years" is definitely not the real world. Those same people would also think teaching game development is obscenely irresponsible of us as educators, and that we should stop trying to push game design as a distinct discipline.

Of course we think the naysayers are wrong, and we're still trying to listen to the game industry's needs. However, we need to do it our own way -- a way that balances academic / institutional norms with dreams of a better future, while also hopefully helping students live their best possible lives... lives that revolve around much more than just the game industry's current demands for specific job openings scattered across fewer than a dozen cities around the world.

It's complicated, there are many obligations in many directions, and the best course of action is more ambiguous than you'd think.

TLDR: every school worries about this a lot

Again, I just wanted to speak to my own experience and perspective as someone who teaches game development. I also encourage you to check out Brendan Keogh's take, as well as Innes McKendrick's thread, for more on the idea of industry placement and what generalism means.