I am a zillennial: born on the border between Millennials and Generation Z, too early for TikTok, too late to qualify for a pandemic stimulus check. And look, someone has to say it: computer labs are antiquated.
My early years at school involved moving to and from computer labs throughout the week, my class following the painted lines of our elementary school and struggling to stay quiet like children do. But by the time I graduated from high school in 2016, the role of the computer lab had diminished, its uses relegated to standardized testing and Adobe or Autodesk subscriptions. The history of the computer lab is tied to gaming as much as it is to personal computing, and this transition has impacted not only what we play, but also how we play.
I used my first school computer in kindergarten, and individual desktop computers would remain a standard feature in my classrooms. I mostly don’t remember what I did on that first desktop. There were educational games and lots of coloring (I loved that). Then, in my early digital years in elementary school, computer games became something we played after school. World of Sand, a kind of chemistry simulation where pixelated elements stream from the top of the screen and interact with each other in the Petri dish of your desktop window, was a favorite. Space Invaders was a mainstay, and in fifth grade we took turns playing Cube Runner and other Flash games over lunch in our classroom, which by then had three desktop computers in the back.
Smartphones weren’t quite a thing yet, mind you. My fifth-grade teacher had a recently released iPhone that didn’t yet have a model number, but my classmates and I had flip and slide phones, if any. What most of us owned were portable game consoles – the Nintendo DS, a popular pastime on the way to summer camp. It was the era of Diamond and Pearl, of sharing a Mario Kart DS cartridge via DS Download Play, of GameSharks from Walmart. The DS wasn’t quite ubiquitous yet, though. It was expensive enough that only kids who already played games had the handheld – and unlike home consoles, it was less likely to blur the normative lines of who was considered a “gamer”. Or, as the case may be, a “player”.
In the computer lab, however, anyone could play. And when I started middle school in 2009, I often found myself in the library when there was nothing else to do. Kids gathered around tables with their friends; others sat in a corner of the library where a row of computers wrapped around itself. The games, and the unblocked sites where you could find them, spread by sight and ear. I want to point out that this was a communal play space, where people saw what you were playing – and that you played at all. The games were almost all browser-based, mostly puzzles and platformers. The more tech-savvy among us, however, figured out how to set up multiplayer matches in games like the Flash-based Scorched Earth clone Tanks and the unofficial 3D light-cycle game Armagetron Advanced over LAN connections.
Nostalgically (and ignoring all the other parts of middle school), it’s an almost idyllic gaming memory. Long before any of us knew what ludonarrative dissonance was, when Minecraft was an incipient sandbox and “the cloud” was a concept hard to grasp. But in seventh grade, I stopped going to the library in the morning.
Sometime in sixth grade I got an iPod Touch. A few of my friends had them at the time – first- and second-generation devices with accelerometers, headphone jacks, and 8GB of storage. They became portable windows to the internet and portable gaming machines we could play in the morning while waiting in a hallway, for five minutes before class, while waiting for the last bell to ring, or on the bus during a field trip.
Many of my classmates’ families had more or less fancy desktop computers at home, but my house didn’t have Wi-Fi until I was in eighth grade, which meant I had to download Crazy Penguin Catapult or the classic indie shmup Space DEADBEEF at home over a wired connection in my family’s office. The spotty Wi-Fi at school, the public library, or the mall wasn’t going to cut it.
Eventually the iPod Touch went mainstream, and kids at my middle school even started bringing iPhones to class. And because of their versatile nature, these devices were embraced by people who, as in the computer lab, weren’t already playing games. While Apple and an endless parade of former industry executives with mobile publishing startups positioned the device as a threat to major publishers like Sony and Nintendo, that competition never really materialized. That’s partly because it was never about “gamers”, but about all the other people who play games anyway. And much like the arrival of consoles in the home after the decline of arcades, the iPod Touch saw gaming move once again from a communal environment – the computer lab – to an individual experience on school buses or in algebra class, stuffed discreetly into backpacks.
The iPod Touch, described by Kotaku’s Ari Notis as the “herald” of the mobile gaming revolution, presaged the medium’s explosion over the past decade. But it was ultimately a transitional device. Over the next few years it would be quickly supplanted by the even more versatile smartphone, and my school frequented its computer labs less and less. Personal computers were simply an expectation for homework by the time I entered high school, and my state even made virtual classes a required part of our curriculum.
As the computer lab was lost, so were many of its games: forgotten websites, shuttered servers, unsupported technology. The same thing would happen to the iPod Touch’s library when the App Store dropped support for 32-bit applications. In May, Apple even announced that it would stop production of the device. If the iPod Touch helped kill the computer lab, or at least its cultural impact on gaming, we are now a generation removed from that schism. But there is an addendum to the computer lab’s role in education and gaming.
You may have gathered that I grew up in an affluent region of the United States; iPhones in middle schoolers’ hands in 2010 probably gave that away. The adoption of new technologies at school and at home is strongly determined by wealth, and I was surrounded by it in the South Florida suburbs where my parents moved in the ’90s.
In college, I volunteered with a youth literacy program in Parramore, a historically isolated neighborhood of Orlando, Florida. I worked with elementary and middle school students at the neighborhood’s only school, serving younger kids who were squarely Gen Z. It was the era of the rise of the Nintendo Switch and of Fortnite, when for many families a laptop may have replaced both the desktop computer and perhaps even the TV at home.
I worked across the street from their school and visited the cafeteria after class one day to pick up our students. Orlando’s public school system couldn’t have been more different from the one back home. I don’t know what their computer labs were like – whether their school still has them, or ever did – but many of the students had touchscreen laptops provided by the school, on which they did their homework, texted each other and, of course, played games (or watched others play, as it were). Huddled around cafeteria tables, they traded the workarounds that unlocked games not available elsewhere. And while these were even more individualized experiences, these students could now easily play with each other online, stay in touch at home, and maybe even find a virtual connection in comment sections and chats.
In this sense, the spirit of the computer lab lives on.