Remember the Game Boy print ad? Released back when most teenagers still read magazines, it showed a skeleton clutching a game console with the warning “Don’t forget to eat.” Electronic game playing had achieved the status of an addiction, so compelling that one was in danger of forgetting basic needs. In my family it wasn’t quite that bad, though on numerous occasions I had to disable my son’s computer by removing the keyboard when he refused to stop gaming. When computer use was unrestricted, my son would arise at noon, grab a stash of chips, raisin bread, and orange juice, and then hunker down in front of his machine all day. He might wander into the kitchen at eight thirty in the evening asking when dinner would be ready, only to learn that he had ignored the call to come and eat two hours earlier.
Curious, I asked what drove him to play computer games for hours. He claimed to be building a reputation as the leader of a StarCraft clan. He was deluding himself, I thought. Who is going to be impressed by that? But I just nodded as he continued to defend how this demonstrated leadership ability. Then he wrote about this experience in his college essay, pointing out that a number of universities had competing StarCraft teams – UC Berkeley even had a class designed around the game.[1] Due in part to family rules, my son spent more time on homework than on gaming, and he did manage to earn a modest engineering scholarship at a local university. He is currently working toward a degree, hoping to turn an obsession into a career in game development.
Competition against others is a major force in most on-line games. According to my son, most gamers feel that if you are not playing against a live opponent, you’re not really playing a game. What happens to those who lose this competition when games are required for classes? When it comes to games for learning, everyone wants to know whether they can be designed to reproduce the addictive qualities of on-line games. I want to know whether it is even wise to do so.