[MUD-Dev] Trusting the client, encrypting data

Crosbie Fitch crosbie at cyberspaceengineers.org
Thu Dec 18 18:46:41 New Zealand Daylight Time 2003


From: Amanda Walker

> The problem is not the cryptosystem.  You can use any cryptosystem
> you want, implemented on a FIPS 140 approved hardware
> cryptographic module, whatever.  It won't matter.

Indeed.

You have to escape the centric mindset of the game provider.

  Take a step back.

  Imagine there is no game provider.

  You have a hundred players.

  99 of them are interested in playing a fair game.

  None of them know each other's human identity. There is only a
  machine identity that each player maintains only so long as it
  suits them.

  They all exchange open information about the game they're playing,
  but willingly choose not to look at or modify any information not
  in accordance with the rules of the game. They play by the rules.

  The only hope for them is that they are interested in maintaining
  play by the rules.

  Of course, they can always change the rules, democratically
  perhaps, but there must be an interest or realisation that the
  game is only fun if they agree to follow the same set of rules.

At this point it should be clear that to hide any information makes
little sense. Everyone trusts everyone else already. To hide
something indicates a lack of trust. In any case, hiding something
just adds a minor one-time cost to the effort involved in undoing
the hiding. It's like sending someone a parcel with "Do not open
until Xmas" written on it. If you're into the rules of the game you
won't open it until the right time, but the wrapping paper is a
minor barrier, given you have to be able to open it up eventually
anyway.

What 99 of the players really want to do is know in fairly good time
when 1 player has decided not to play by the rules.

If there are 99 computers with 'cheat detection' analysis programs
running, then it doesn't matter if the one cheat disables
theirs. There will be enough CPUs whirring away in the background
analysing all the players' behaviour, and as soon as a player
consistently behaves in a way that reveals an unfair advantage (not
just that they're a good player), the cheat can be ditched. Only to
return as player 101 or 125 - who knows?

Perhaps in the same way that Open Source programs may be considered
to have fewer back-doors in them than proprietary programs, the
integrity of game rules may be better achieved by processes that can
operate in the open rather than those that rely on encryption.

A cheat faces a problem similar to that of an undercover police
agent working within a drugs cartel: if one wishes to keep playing,
even with inside information, one still has to take pains to
engineer situations where one can take remedial action without the
cartel suspecting it has been infiltrated.

With games I expect the trick is to make it take a large amount of
effort for the cheat to operate without triggering the 'automatic
cheat detectors'.

Conversely to cheat detectors, there are also trustworthiness
detectors: players who've built up a good reputation with other
players have a long history, no evidence of cheating, and can be
relied upon to be interested in fair play. One can then be more
suspicious of players with poor reputations, and more forgiving of
players with good reputations.
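One way to picture this is a score that grows slowly with a clean play history and collapses on evidence of cheating. Everything here - the function name, the half-trust constant, the per-report discount - is an illustrative assumption, not a standard formula.

```python
# Hypothetical reputation score in [0, 1): long clean histories
# approach 1, and each credible cheat report halves the score.
def reputation(games_played, cheat_reports, half_trust_at=50):
    """half_trust_at: number of clean games that earns a score of
    0.5. The halving per report is an arbitrary illustrative choice."""
    history_trust = games_played / (games_played + half_trust_at)
    return history_trust * (0.5 ** cheat_reports)
```

A peer could then weight its suspicion of anomalous behaviour inversely by this score, giving veterans the benefit of the doubt and scrutinising newcomers more closely.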

I like to believe that players are the friend (well, most of them
are). If you like to believe that players are the enemy, then
perhaps I can only suggest you look at organised crime or
terrorism. Each criminal is a priori suspect; nevertheless, the
organisation can only arise on the basis of some degree of trust,
reputation, and integrity checking. Reward according to reputation,
not according to how well you cheat.

Ok, I admit it. I waved my hands.

Or did I?
_______________________________________________
MUD-Dev mailing list
MUD-Dev at kanga.nu
https://www.kanga.nu/lists/listinfo/mud-dev
