A Better Way to Prevent Cheating for Online Games

Forrest Smith has written an article about the problem of rampant cheating (http://altdevblogaday.com/2012/04/02/extravagant-cheating-via-direct-x/) in PC games. I’ve wanted to address this topic because online cheating has affected my gaming experience in the past (although not any longer, since programming has taken up most, if not all, of my gaming time). Cheating in PC games can be seen as a classic Black Hat vs. White Hat battle that never ends. Forrest mentions data analysis as a tool to measure the likelihood of a player using one or more cheats to gain an unfair advantage (which should be noticeable in the raw game data). I’d like to add a few things.

I agree that data analysis is probably the best way to detect cheaters, because it can catch players who “toggle” subtle cheats that are difficult to notice (e.g., a realistic aimbot or a wallhack): they turn a cheat on for two minutes, then off, then on for another two minutes, then off again, and so on. While it would be difficult for humans to detect such players, computers, given enough data points, could easily see the consistent, significant fluctuations in game data and raise a red flag for further review by a human.
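To make the toggle-detection idea concrete, here is a minimal sketch in Python. Everything in it (the window length, the jump threshold, the function name) is an illustrative assumption, not an actual anti-cheat implementation: it compares a player’s accuracy across short time windows and flags abrupt, repeated swings.

```python
def flag_toggler(window_accuracies, jump=0.35, min_jumps=3):
    """window_accuracies: hit accuracy per two-minute window, e.g. [0.2, 0.9, 0.25, ...].

    Returns True if accuracy repeatedly jumps by more than `jump`
    between adjacent windows, which is the signature of a cheat
    being switched on and off. Thresholds are illustrative.
    """
    jumps = 0
    for prev, cur in zip(window_accuracies, window_accuracies[1:]):
        if abs(cur - prev) >= jump:
            jumps += 1
    return jumps >= min_jumps

# A legitimate player's accuracy drifts; a toggler's oscillates.
print(flag_toggler([0.31, 0.28, 0.35, 0.30, 0.33]))  # False
print(flag_toggler([0.30, 0.92, 0.28, 0.90, 0.27]))  # True
```

The point of the sketch is that the pattern a human would struggle to notice in a live match is trivial arithmetic once the raw data is logged.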

Though data analysis is excellent, I think the amount of computation (and of raw data) needed to automatically and intelligently analyze every player in real time is beyond the current state of technology, or at least beyond economic feasibility. Even if trusted supercomputers could crunch the player data, no game company is going to throw thousands, if not millions, of dollars at checking for cheaters, especially once the game is two or three years old.

The best way to prevent cheating is to use a network of trust. This is the magic phrase that comes up over and over again among those engaged in real security work; public-key cryptography is an excellent example. You could easily have a server that uses current state-of-the-art public-key cryptography to ensure that, for example, only players who share at least two other players in their trusted keyring can log in to play. The effect is that you are simulating real human behavior: people tend to avoid taking risks (in this case, the possibility of a gaming experience ruined by cheating) with complete strangers. You could even enforce draconian measures, such as, “if a trusted person is confirmed as a cheater, all other associated players will have their trust level reduced.” Such a rule would severely dampen the spread of new cheats, as the banned player would be prevented from playing on trusted servers, and all of the real-life friends who added him to their keyrings would have their “trustworthiness” level reduced. In short, real-life peer pressure would dramatically affect the game.
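The two rules above (the “share at least two trusted players” login gate and the trust penalty for vouching for a cheater) can be sketched in a few lines. This is a toy model under stated assumptions: the player names, the 0.5 penalty factor, and the flat keyring structure are all made up for illustration.

```python
trust = {}      # player -> trust score (1.0 = fully trusted)
keyrings = {}   # player -> set of players they vouch for

def can_join(player, server_members):
    """Login gate: the player must share at least two members
    of their keyring with the server's trusted community."""
    shared = keyrings.get(player, set()) & server_members
    return len(shared) >= 2

def ban_cheater(cheater):
    """The draconian rule: a confirmed cheater loses all trust,
    and everyone who vouched for them loses half of theirs."""
    trust[cheater] = 0.0
    for player, ring in keyrings.items():
        if cheater in ring:
            trust[player] *= 0.5

trust.update({"alice": 1.0, "bob": 1.0, "carol": 1.0, "mallory": 1.0})
keyrings["alice"] = {"bob", "carol"}
keyrings["bob"] = {"mallory"}

print(can_join("alice", {"bob", "carol", "dave"}))  # True
ban_cheater("mallory")
print(trust["bob"])  # 0.5 -- bob vouched for the cheater
```

A real system would sign the vouches cryptographically rather than keep them in a dictionary, but the social mechanics (you stake your own reputation when you vouch) are already visible in the toy version.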

Sadly, the one problem with the network of trust is that it prevents millions of players from playing against each other immediately, because, let’s face it, the vast majority of online players one encounters are complete strangers. Still, I think this method is a viable option; I am 99.999999% sure that there are thousands, if not tens of thousands, of players out there who would be willing to sacrifice maxed-out (full) servers in exchange for hours of trusted gaming on a system that no one in the world, not even world governments, is able to hack.

UPDATE April 21, 2012: Fix grammar/typos.


4 thoughts on “A Better Way to Prevent Cheating for Online Games”

  1. A network of trust is only a disincentive to cheating when cheating can be detected.

    Cheating involving player actions within the game (“hacks”) can be detected: if a player walks through a wall, suddenly teleports to the other side of the map, or regains half of their health for no apparent reason, the server can recognise the incoming input as being against the rules and hence even automatically punish the offender.

    Cheating involving automation cannot be detected: Let’s say I want to be able to easily sneak behind enemies in an FPS game. Try to stop me modifying my X window server so that colours only present in player character textures show up in fluorescent green instead. Try to stop me scanning RAM for player location data. Try to stop me, hell, building a robot with a webcam pointed at the screen, wired to an image-processing algo, wired to actuators poised over my laptop keys and to a repurposed CNC machine that moves the mouse with millimeter accuracy. Anything you give me, I can repurpose for evil.

    You can of course make things difficult to automate, at the expense of dev time and innocent-user experience, but never impossible. The balance is a tough call sometimes.

    This problem has another edge case — the opposite edge: What about a game that encouraged cheating? Players could go VIMsane with keyboard macros, make use of built-in image processing utilities, mess with everything except the server code and share their mods online. The server can stop actual game code hacks (like walking through walls or teleporting around) through simply restricting incoming data to allowable values before storing it for transmission to other players. Machine-aided cyborg players with guns! Impossible reflexes countering each other at lightning speed! Team-wide P2P distribution of AI perceptual data and dynamic collaborative decision-making! Visualisations! Shapow! Kaboom! Yeah! (3D graphics, why you so difficult to learn. I want to make this game already…)
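The server-side validation the comment describes (rejecting “hacks” like walking through walls or teleporting by checking incoming input against the rules) can be sketched as a movement check. The speed limit and tick length here are illustrative assumptions, not values from any real game.

```python
import math

MAX_SPEED = 10.0   # maximum legal speed, units per second (illustrative)
TICK = 0.05        # one 50 ms server tick (illustrative)

def valid_move(old_pos, new_pos):
    """Reject teleports: the distance covered in a single tick
    must not exceed what the movement rules allow."""
    dist = math.dist(old_pos, new_pos)
    return dist <= MAX_SPEED * TICK

print(valid_move((0.0, 0.0), (0.3, 0.3)))   # True  (a normal step)
print(valid_move((0.0, 0.0), (50.0, 0.0)))  # False (a teleport)
```

This is exactly the class of cheat that is detectable, because the illegal state transition arrives at the server; the automation cheats described above never do.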

  2. Hi Antaku!

    > Cheating involving automation cannot be detected: Let’s say I want to be able to easily sneak behind enemies in an FPS game. Try to stop me modifying my X window server so that colours only present in player character textures show up in fluorescent green instead. Try to stop me scanning RAM for player location data.

    Of course, the trusted server can only “see” the incoming data from the client and can only validate that data; it has no control over what the cheater does to the “closed data”, so to speak, on his own machine. The network of trust still trumps this, though, I believe, because the majority of gamers in any game, I would say at least 70%, or even 80%-90%, do not cheat. In other words, the “good” network of trust among non-cheaters will always outnumber the network of trust among cheaters. Moreover, there is no such thing as a network of trust among cheaters: I have never seen two wallhackers or two aimbotters friend up and play against each other in good spirit.

    Now, if such hackers are always on the same team and always play together, then we have a serious problem. But even if they always play together and establish a network of trust to try to fool others, I seriously doubt that their network will grow very large, if at all. Recall that if a cheater is revealed, *all* players in that network of trust will have their “trusted” score reduced. This is not like a simple “online friend” whom you add after meeting them for 5 minutes: you are acknowledging that you are risking your own trust level by accepting them into your network. If you see a player who is extremely good, almost elite, then you will be that much more hesitant to add them. I think people will even take interpersonal skills into account and will try to get to know them more as a person, not just as a player in the game, before deciding to add them to their network.

    Also, keep in mind that your network of trust would have several different levels of trust, e.g., “somewhat trustworthy”, “very trustworthy”, “fully trusted”, etc., and all of these values would be taken into account when determining the impact/power/ramifications of your network of trust.

    > You can of course make things difficult to automate, at the expense of dev time and innocent-user experience, but never impossible.

    Yes, of course — this is the classic black hat vs. white hat scenario — the battle never ends.

    > This problem has another edge case — the opposite edge: What about a game that encouraged cheating? Players could go VIMsane with keyboard macros, make use of built-in image processing utilities, mess with everything except the server code and share their mods online. The server can stop actual game code hacks (like walking through walls or teleporting around) through simply restricting incoming data to allowable values before storing it for transmission to other players.

    Hmm, well, I am not sure whether human players would play such a game. If they are putting in almost zero input and letting the computer do all the work, then what they are really doing is employing their own AI engine to play in their stead. There have been tons of online AI tournaments (every year there is one where programmers design AIs to battle each other, but the name escapes me), and they are extremely popular. However, even in that game (or any game!) cheaters will always be frowned upon. By definition, a game is a sequence of states where the transition from one state to the next must obey a set of rules; if you have a game where the rules may be changed at any time (a “game” that encouraged cheating, as you put it), then there is no longer a game. Of course, you can have a game that changes its own rules (e.g., in chess, where a Pawn becomes a Queen upon reaching the 8th rank), but such possible rule changes must be known in advance; otherwise the player loses the ability to assess the state of the game, as well as the ability to exercise informed, meaningful decision-making to try to improve his position (meaningful decision-making, which gives some sort of reward, is probably the quintessential element of any game worth playing).
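The tiered trust levels mentioned in the reply above (“somewhat trustworthy”, “very trustworthy”, “fully trusted”) could be combined into a single effective score. A minimal sketch follows; the level names reuse the ones from the reply, while the numeric weights and the averaging scheme are illustrative assumptions.

```python
# Weight assigned to each trust level (illustrative values).
LEVEL_WEIGHT = {"somewhat": 0.3, "very": 0.7, "full": 1.0}

def effective_trust(vouches):
    """vouches: list of (voucher_trust, level) pairs.

    A vouch counts for more when the voucher is themselves highly
    trusted and when they assign a higher trust level. Returns the
    average weighted vouch, or 0.0 for a player nobody vouches for.
    """
    if not vouches:
        return 0.0
    scores = [voucher_trust * LEVEL_WEIGHT[level]
              for voucher_trust, level in vouches]
    return sum(scores) / len(scores)

# A full vouch from a trusted player outweighs a tepid one.
print(effective_trust([(1.0, "full"), (1.0, "somewhat")]))  # ~0.65
print(effective_trust([(0.5, "very")]))                     # ~0.35
```

Weighting vouches by the voucher’s own trust is what makes the cheater penalty self-reinforcing: once your score drops, every vouch you ever gave is worth less.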

  3. it’s carolyn here. attached is my twitter. my twitter has all my programming tweets. that’s the only thing i think about 24/7 so you can check out my OCD there.

    BTW it’s great to see you talking about arch, as i have been wanting to compile a gentoo and arch from scratch for months now.

    also i became obsessed with cryptography too, for a bit, before i had to shelve that obsession.

Comments are closed.