Fake video game physics

Before I did physics, I made video games. That "before" is rapidly receding into the past: for context, I was about half as many years old. My tool of choice in those dial-up years was an engine called Game Maker, which offers both drag-and-drop logic and an internal programming language. My games were mostly prototypes: sometimes a few levels, sometimes just a core mechanic or a visual effect, with heavy use of stock or misappropriated graphics and sounds. Those early games were only shown to friends or in an online community, and many are by now lost to digital amnesia and erosion.

Video game development is now a respectable, professional, well-paid job for those who want to do it full-time. At the entry level, it often serves as a way to learn coding, algorithms, and the basics of computer science. Integrating logic, images, and sounds into an interactive experience is exhilarating. Certainly because of my experience with Game Maker, I didn't have much trouble picking up more "serious" programming languages later on. But here I want to discuss instead what I learned (or rather ignorantly rediscovered) about physics by making my games.

Science and games have fundamentally different selective pressures. Science as a human practice is directly linked to measurable reality: we construct theories and confront them with experiments and data until they agree. But in games, we want to design something that looks realistic without necessarily being real. More often than not, we want something that is cool, and to hell with realism. At the same time, unlike animated movies, games need to run the logic and the graphics in real time, which imposes resource limits on our design ideas.

One of the first problems we face in the physics classroom is projectile motion. We are taught that any thrown object moves along a parabola. So naturally, when your game character jumps, or throws something, they need to fly along a parabola, right? When you set out to implement that logic in game code, you inadvertently invent a numerical integration scheme (Euler method for the win) and figure out how to keep track of collisions with the ground and walls. But as you tinker with parameters such as the initial velocity and the gravitational acceleration, you realize that different motion creates a very different feel for the game, especially when the movement is triggered by the player. A jump can feel sharp or floaty, depending on how much time the character spends at the top of the arc and how much control you have in the air. More profound game designers figured out that a jumping character doesn't even need to follow a parabola, and hard-coded other trajectories in service of the feeling. The character's movement is bound by rules, but only those inside the screen, not outside of it.
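The accidental Euler scheme can be sketched in a few lines. This is a minimal reconstruction, not the original Game Maker code, and the constants are made-up tuning knobs of exactly the kind you end up fiddling with:

```python
# Euler integration of a jump, the way a per-frame "step event" would run it.
# GRAVITY and JUMP_SPEED are invented tuning parameters, not physical values.
GRAVITY = 0.5       # pixels per frame^2 (note: not 9.8 - it's whatever feels right)
JUMP_SPEED = -10.0  # negative is up, since screen y grows downward
FLOOR_Y = 0.0

y, vy = FLOOR_Y, JUMP_SPEED  # the jump begins
trajectory = []
while True:
    vy += GRAVITY       # Euler step: update velocity first...
    y += vy             # ...then position, once per frame
    if y >= FLOOR_Y:    # crude collision check with the ground
        y = FLOOR_Y
        break
    trajectory.append(y)

# Airtime is roughly 2*|JUMP_SPEED|/GRAVITY frames; changing either constant
# independently is what makes the jump feel "sharp" or "floaty".
```

Making gravity weaker while the character holds the jump button, for instance, is one of those non-parabolic hacks that players read as "air control".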

When coding video games, one needs to deal with a lot of applied geometry, such as checking for collisions and computing angles and intersections. Multiple times I had to bug my mom about sines and cosines, since my school curriculum hadn't gotten there yet. The tour de force of geometry was a lighting engine that computed the shadows from multiple objects by considering the distances and relative heights of the objects and the light sources: street lamps, handheld flashlights, the glare of gunshots. The light fields were overlaid on top of each other and then served as a filter for looking at the objects underneath. Beyond geometry, this taught me a healthy dose of ray optics, blurring, and color mixing.
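To give a flavor of the trigonometry involved, here is a hypothetical fragment of such a shadow caster (the function and its coordinates are my illustration, not the original engine): the shadow points away from the light, and similar triangles relate its length to the heights of the light and the object.

```python
import math

def shadow_offset(light_x, light_y, light_h, obj_x, obj_y, obj_h):
    """Where to draw an object's shadow, given a light source above the ground.
    All names and the geometry convention are illustrative assumptions."""
    # direction from the light to the object - the sines and cosines in question
    angle = math.atan2(obj_y - light_y, obj_x - light_x)
    dist = math.hypot(obj_x - light_x, obj_y - light_y)
    # similar triangles: the shadow stretches until the light ray hits the ground;
    # a light barely taller than the object throws a very long shadow
    length = dist * obj_h / max(light_h - obj_h, 1e-9)
    return length * math.cos(angle), length * math.sin(angle)

# a lamp at the origin, 10 units up, casting the shadow of a height-5 crate
dx, dy = shadow_offset(0, 0, 10, 3, 4, 5)
```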

Next to the exact algorithmic computations of geometry, I started playing with pseudorandom number generation. I wasn't familiar with the mathematical modeling of random events. I didn't even know that there are different probability distributions - just whatever the rand() function spits out. But players can't actually tell the difference between distributions if you tweak them carefully enough. For example, that pretty rendered light of mine - what if the lamp is flickering? I let the lamp be in two states: on and off. Upon entering a state, I set a countdown timer to a constant plus a random number; when the timer runs out, the lamp switches state and the timer is set anew. Mathematically, this is very close to telegraph noise, but the human brain can't pick out a definite periodic on/off pattern, and the flickering light gains an emergent emotional property - it becomes spooky. Incidentally, earlier this week the world learned that flickering lights in video games are not random at all - precisely because someone stared at them long enough to pick up the longer periodic pattern.
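The flickering logic above fits in a few lines. A sketch under assumed constants (the frame counts are my guesses for illustration):

```python
import random

MIN_FRAMES = 5     # the constant part of each on/off interval (assumed)
EXTRA_FRAMES = 20  # the random part on top of it (assumed)

def flicker(total_frames, rng=random.Random(42)):
    """Simulate the lamp for total_frames frames; returns the on/off trace.
    Fixed-plus-random dwell times make this a random telegraph signal."""
    on, timer, trace = True, 0, []
    for _ in range(total_frames):
        if timer <= 0:
            on = not on                                        # switch state...
            timer = MIN_FRAMES + rng.randrange(EXTRA_FRAMES)   # ...and rearm the countdown
        timer -= 1
        trace.append(on)
    return trace

trace = flicker(300)
```

The minimum dwell time keeps the lamp from strobing, while the random part destroys any period the eye could lock onto.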

After temporal stochasticity, I got interested in the spatial kind, perhaps anticipating my later interest in statistical mechanics. From the spread of bullets leaving one's gun, to particle effects of sparks and explosions, to procedural generation of backgrounds or whole levels, slightly varying patterns help stretch the limited graphical assets a bit further. To design these features, the developer needs to tweak the probability distributions of all the parameters - again, until it looks cool. Usually one spawns several particles at once, which thus behave as iid (independent and identically distributed) variables. Such a sample of random particles readily illuminates the properties of the underlying distribution itself.

At one point I needed to make a round cloud of particles that are created at a random radial distance from the center. The rand() function in my code gives uniformly distributed random numbers, so to make the distribution broader, I passed its output through an inverse cumulative distribution function (as I discovered much later, this is actually a standard method for sampling simple distributions). I just knew that the function needs to map from [0,1] to [0,inf), and many options were available: log, tan, or something hyperbolic. When I implemented this sampling, I noticed that most particles are concentrated around the center, but some outliers popped up quite far away. In other words, the variance of the distribution was much larger than its mean. By being naive about the functions, I accidentally discovered fat-tailed probability distributions years before I heard about them in science.
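The tan option makes the accident concrete. Feeding a uniform number through tan is, in hindsight, exactly inverse-CDF sampling of a half-Cauchy distribution, whose mean and variance are infinite - the sample below is my reconstruction of the effect, not the original code:

```python
import math
import random

rng = random.Random(0)

def radius():
    # uniform [0,1) mapped through tan onto [0, inf):
    # this is the inverse CDF of the half-Cauchy distribution
    return math.tan(rng.random() * math.pi / 2)

radii = sorted(radius() for _ in range(10000))
median = radii[len(radii) // 2]  # the median stays put near tan(pi/4) = 1...
big = radii[-1]                  # ...while a few outliers fly absurdly far out
```

Most particles hug the center, yet the largest radius in a modest sample dwarfs the typical one - the visual signature of a fat tail.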

Probably the most inventive part of video game coding is Artificial Intelligence (AI). Contrary to what the news says about neural networks and machine learning, video game characters typically use very simple, hand-curated algorithms to navigate space. The characters need to appear smart rather than be smart. Countless fan experiments on existing games show that outsmarting the AI or finding its shortcomings is very easy. In my first games, the navigation was dead simple - you run straight towards the player and bite. Got stuck in a wall? Oh, that's just too bad. Then the next version of Game Maker arrived and introduced simple local rules for obstacle avoidance, which instantly made enemies much more wicked in their pursuit, at the expense of a lot of computation.

But what if the enemies are dumb but many? A prototypical example is a zombie shooter. Running a copy of the pathfinding algorithm for every zombie would quickly overwhelm the computer, so I needed to somehow exploit the redundancy of information. I ended up coding an invisible "pheromone" field emitted by the player. Each zombie referred to the field at their location, computed a crude pheromone gradient, and ran up the slope. This doesn't just parody stochastic gradient descent in optimization problems - it models the very real process of chemotaxis! Bacteria are able to sense the difference in concentration of nutrients and reorient their motion in the direction of more food. Ants and other insects lay their trails in a similar way so that their colleagues can find the way. Further ants traveling the same path reinforce it, thus creating a feedback between the behavior and the navigational map. I decided that my zombies should just follow the pheromone field without affecting it - that is, no feedback. But to make the action spicier, I used doors as relays that reinforce the field - when a door senses a nonzero pheromone level, it emits a secondary signal around it. Once the zombies feel the player approaching from the other side of the door, they start very aggressively hitting the door until they break through and spill outside. In the heat of battle, these little programming shortcuts are hard to notice - but OH MY GOD ZOMBIES BROKE THE DOOR!
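A toy version of the pheromone trick, on a grid and with every number invented for illustration: the player's scent falls off with distance, and each zombie evaluates the field only at its four neighboring cells and steps toward the smelliest one - a purely local rule, just like a bacterium comparing concentrations.

```python
def pheromone(cell, player):
    # scent decays with Manhattan distance from the player;
    # no zombie ever sees the whole map, only this local value
    return -(abs(cell[0] - player[0]) + abs(cell[1] - player[1]))

def zombie_step(zombie, player):
    x, y = zombie
    neighbors = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    # crude gradient ascent: move to the adjacent cell with the strongest scent
    return max(neighbors, key=lambda c: pheromone(c, player))

zombie, player = (0, 0), (5, 3)
path = [zombie]
for _ in range(10):
    zombie = zombie_step(zombie, player)
    path.append(zombie)
```

The field is computed once per cell, not once per zombie, which is the whole point: a hundred zombies share one map instead of running a hundred pathfinders.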

My Game Maker period ended when I started making legit physics simulations in it. Not only was I much busier with physics competitions, but I had also realized the limited precision of my rudimentary numerical methods. While making those simulations, I somehow spent much more time on the user interface (buttons and sliders) than on the actual simulation details. The lessons from game design were telling me to make my simulator just look cool, but my budding scientific instincts demanded more rigor. I am not sure which one won circa 2010. My mind refuses to recall whether the simulation code had any deliberate tenfold fudge factors.

It gets much easier to create video game systems once you understand some physical organizing principles behind the dynamics. Thinking about games allows one to cut through the boring part of science that has to do with experimental verification, and skip to the neat stuff with emergent dynamics. Of course, physics is but one lens that can help. Other people brought their knowledge of psychology, graphic design, sociology, music, literary theory, or any number of other disciplines. I have a humble dream to return to games one day and fuse what I learned about complex systems with what people discovered in the video game community or imported from other fields. I am not sure when an opening would present itself.
