08-16-2003, 02:31 AM | #1 |
Legend
Join Date: Sep 2002
Location: Mill Valley, California
Posts: 2,305
08-16-2003, 04:36 PM | #2 |
Senior Member
From a sociological viewpoint you can trust somebody to do anything: mow your lawn, watch your kids, handle your finances, etc. But there are many different categories of actions to which you can assign a value of trust, and these categories have different means of accumulating that trust. Trust in one category does not always imply trust in another; the babysitter I trust to watch my children is not the same person I trust to handle my investments.
Your example here is roleplaying, but what does the level of trust affect? What are the goals of the system? What other areas of trust are you considering? As it stands it could encompass everything, and you'd need a PhD in Sociology to unravel it =).
08-16-2003, 05:13 PM | #3 |
Member
Join Date: Jul 2003
Home MUD: Lusternia
Posts: 191
At the risk of being totally off base from what you're seeking, and assuming the purpose is purely to rate RPers, what about simply allowing every player to rank another player's RP with a static number (changeable at any time at that player's whim), i.e., rank on a scale of 1-10 how you feel John Doe RPs? Then weight the significance of that ranking by how close the players are to each other. In other words, a ranking from a member of John Doe's guild carries greater weight than a ranking from a member of his city, which carries greater weight than a ranking from a member of an opposing city, etc. Further, you could weight the ranking even more by the ranker's own current standing, i.e., John Doe has an average ranking of 1, so the weight he carries when ranking others is pretty insignificant compared to someone whose average ranking is 10.
Of course, this may take greater resources than it's worth, keeping track of who ranks whom, etc., but not everyone will rank everyone else, and you could always cull from the records players who are autodeleted, inactive for a set time, etc. You could also limit who is able to rank, and who can be ranked, by how many hours a player has logged (only those over 100 hours, or whatever). On the plus side, you'd get a ranking that is more indicative of the community's overall feeling about a person's RP, one that isn't limited to just GMs or city leaders.
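The weighting scheme sketched above could look roughly like this. All names, the closeness weights, and the neutral default are hypothetical illustrations, not an existing implementation:

```python
from collections import defaultdict

# Relationship closeness -> weight (assumed values for illustration).
CLOSENESS_WEIGHT = {"guild": 3.0, "city": 2.0, "rival_city": 1.0}

class RatingBook:
    def __init__(self):
        # ratings[target][rater] = (score 1-10, rater's relationship to target)
        self.ratings = defaultdict(dict)

    def rate(self, rater, target, score, relationship):
        # A rater's latest score overwrites the old one, matching
        # "changeable at any time at that player's whim".
        self.ratings[target][rater] = (score, relationship)

    def average(self, player):
        # Unweighted mean of the raw scores a player has received;
        # unrated players default to a neutral 5.
        scores = [s for s, _ in self.ratings[player].values()]
        return sum(scores) / len(scores) if scores else 5.0

    def weighted_rank(self, target):
        # Each score is weighted by closeness and by the rater's own standing.
        total = weight_sum = 0.0
        for rater, (score, rel) in self.ratings[target].items():
            w = CLOSENESS_WEIGHT.get(rel, 1.0) * (self.average(rater) / 10.0)
            total += score * w
            weight_sum += w
        return total / weight_sum if weight_sum else None
```

Note that this also reproduces the storage concern the post raises: the book keeps one entry per (rater, target) pair, so culling inactive players' rows would matter at scale.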
08-16-2003, 09:40 PM | #4 |
Legend
Join Date: Sep 2002
Location: Mill Valley, California
Posts: 2,305
That would almost achieve the opposite of what we want, as it's not only very gameable but plays directly into the way people would game it: i.e., members of an organization rating each other highly in order to pump up their rankings. Then, once those people are all highly rated, they can start griefing their enemies by ranking them lowly en masse.
Incidentally, we'd want to use it for more than just RP. For instance, vetting player content without admin approval is a possible and practical use in a mud. This is a good article that is somewhat related to what I'm talking about: it's a bit long, but well worth the read. Anyway, like I said, I'm not convinced it can be done in a reasonably non-gameable way in a mud, but it'd be an incredibly valuable tool if we can figure it out. --matt
08-16-2003, 09:43 PM | #5 |
Legend
Join Date: Sep 2002
Location: Mill Valley, California
Posts: 2,305
Call it a trust network, a reputation system, whatever. It's really more of a mathematical thing than a sociological thing. There exists, in a community, a general consensus about someone else. The trick is to figure out how to translate that consensus into a number in such a way that no single individual or likely group of individuals can consciously manipulate it.
--matt
08-16-2003, 11:24 PM | #6 |
Senior Member
The general consensus about a person is a social metric. Math only helps with the technical details of what you're modeling; you won't know what it is you're modeling (or whether it's accurate) without some knowledge of the social interactions behind the model.
I think "the general consensus" is still too vague, though, mainly because such a thing doesn't exist in any meaningful form. There has to be a general consensus on some *thing*, such as expertise at a certain task, moral values, security at different tasks, etc.

Now, if you want to model the general consensus about someone's roleplaying on a scale of 1 to 10, the most accurate model is the one where people directly rate that person's work. This isn't always feasible, because with a large base of things to rate you won't have the resources to properly evaluate everything. So instead you delegate this rating to another entity (e.g., Consumer Reports, movie critics), which is still another form of trust. It's all based on abstraction, and different people use this abstraction in different ways.

In fact, this abstraction is quite easy to represent if you use a very simple social model with a few key elements, such as the amount and type of trust you delegate to an entity, and that entity's rating of the object or action in question. Finding that value for a particular person is just a weighted average traversing the delegated links for each entity. The general consensus is a bit more complex, though, as you need to take into account how much delegated trust a particular entity has in order to determine how influential that entity is.

Unless we know what the ultimate goal of the network is, though, we could talk forever and end up with a general value that represents none of the things you are considering. You mentioned that it shouldn't be gameable, which leads me to believe that there is some in-game effect for this value. What is that effect?

Edit: Oh, also, not allowing people to consciously manipulate the number is a strange request. I know that certain actions will raise or lower my level of trust in other people, and this is most certainly a good thing within any society, as it helps keep people's actions in check.
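The weighted average over delegated links described in the post above might look something like this sketch. It handles one level of delegation only, and the data layout and names are assumptions for illustration:

```python
def delegated_rating(viewer, item, direct, delegation):
    """Weighted average over one level of delegated trust links.

    direct[entity][item]       -> that entity's own rating of the item (0-10)
    delegation[viewer][entity] -> how much trust viewer delegates (0.0-1.0)
    """
    if item in direct.get(viewer, {}):
        return direct[viewer][item]  # a first-hand opinion wins outright
    total = weight = 0.0
    for entity, trust in delegation.get(viewer, {}).items():
        if item in direct.get(entity, {}):
            total += trust * direct[entity][item]
            weight += trust
    # With no usable links there is simply no basis for an opinion.
    return total / weight if weight else None
```

For example, a viewer who delegates trust 0.9 to one critic (who rates a film 8) and 0.3 to another (who rates it 4) gets the weighted average (0.9*8 + 0.3*4) / 1.2 = 7.0.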
08-17-2003, 03:56 PM | #7 |
Senior Member
I'm not sure if this is just my lack of experience or something else, and I'm not 100% sure my answer is on topic, but here goes.
From what I have seen in any community, not just muds, any rating system where people rate other people they can meet will always be inaccurate, and quite possibly very inaccurate. If you could hire special "overseers" who did nothing but observe whatever fields of activity you need to rate, your rating system would become much more reliable, yet it still would be far from perfect, especially if the overseers were able to exchange their views with each other. I guess it's just part of every human's brain, that inability to fairly rate others, especially when it comes to someone they personally know.
08-17-2003, 09:00 PM | #8 |
Senior Member
Absolutely right. People tend to get swayed easily by logical fallacies and petty things. It's why marketing consists of more than just stating the facts about a particular product or service =).
The request puts us in a unique position, though, because the_logos never asked that the value be an accurate depiction of whatever is being rated. A "general consensus" is what people think, not necessarily the way things actually are. If you wanted the way things actually are, then a general consensus would not be sufficient; in that case you would probably require more expert-level rating based on some criteria. The general consensus is, however, good for finding who people *think* is the best roleplayer, because what people think is exactly what it gauges.
08-20-2003, 02:34 PM | #9 |
Legend
Join Date: Sep 2002
Location: Mill Valley, California
Posts: 2,305
The only skewing comes when players consciously coordinate to manipulate a ranking. We're not concerned with any objective scale of rankings. We're only interested in judging how the community at large -actually- feels about someone regarding some aspect(s) of their behavior.
There exists some general feeling in a community about a particular aspect of a person: for instance, how good that person is at roleplaying. The idea is to translate that general feeling into a number, but to do it in such a way that special-interest groups (like that person's friends or enemies) cannot significantly skew the results without co-opting a huge portion of the population. Like I said, I'm not at all convinced it can be done, but it would be most useful if it can. --matt
08-20-2003, 02:49 PM | #10 |
Senior Member
Have you been reading what I wrote?
The more I think about it, the more it sounds like a relatively simple problem. What is wrong with a direct or delegated ranking system?
08-20-2003, 11:52 PM | #11 |
New Member
Join Date: Jun 2002
Posts: 12
I am not sure exactly what you want, but I think -any- consensus is skewed by existing relationships within a group.
The only thing that comes to mind that might help is a poll. By poll I mean a series of carefully worded questions and carefully chosen possible answers, with a place where people must explain their answers. This way you get a wealth of knowledge about what one person thinks of another's RP. If you break RP down into chunks (emoting, ability to stay in character, background, etc.) and make voters explain why they gave a certain player a certain rating in each category, then you have more information on which to judge a particular person's vote and whether you think it's well justified.

Polls are an art all their own, and a well-designed poll takes work. Hopefully you could find something about polls in the political realm and how they are worded to get meaningful answers; a badly worded poll can tell you next to nothing. I am no expert on them, but I did take a course or two about polls many years ago, and I recall the wording being of top importance.

Hope any of this helps.
08-21-2003, 02:25 PM | #12 |
Member
Join Date: Sep 2002
Posts: 100
Given your needs, I would modify the design criteria: it's more important to have the system fail in ways that are understandable to the players than to try to make it foolproof. It should be a tool to help players simulate asking their friends for recommendations and references, not a black box that they expect to make decisions for them.
I think the key to a trust system in this context is making being trusted a highly valuable commodity to other players. Maybe other players have to pay (in-game money ;) ) for personal references or submitted-content evaluations.

Suggestion: anyone can rate anyone else in a predefined list of categories, and this list would include meta-categories. Example:

1a) How I rate X's quality of RP
1b) How much I trust X's assessment of the quality of RP of others
2a) How much I trust X to fulfil a contract
2b) How much I trust X's assessment of others' contract trustworthiness
...

To simplify, all of the meta-categories could be collapsed into a single "trust" rating of another player's ability to evaluate others. Newbies would default to having no ratings on the (#a) items for anyone, and to having high trust in the (#b) items for certain admin-selected characters (which may or may not actually be players; they could just be dummies that trust your principal "good people" on the mud).

When a player wants to know how much he should trust someone else, a weighted computation is done down his trust chain: highest weight to his own evaluation of the person if it exists, lesser but still substantial weight to the ratings of people he trusts to make the evaluation, and so on.

Failures would be of the following types:

1) A highly trusted character starts betraying people. This is a problem with any system, but it can be viewed as a valuable in-character event. It's not a bug, it's a feature! :)
2) You trusted the wrong peer group. Tough luck for you, isn't it?
3) Lack of "connectivity" between yourself and the person you want to know is "good" for whatever criterion you're looking at.
I think that, generally, a system like this reporting that there's insufficient information is better than one giving a manipulable or wrong answer ("I don't know him, and none of MY friends do either, but all of HIS friends say he's great"). Once the basic system is in place, you could consider expanding to inter-peer-group trust metrics if desirable.

The player can get more than just a flat number: a summary of key relations can be presented if desired.

---
>evaluate Bob trustworthy
Most of your friends seem to find Bob trustworthy. John strongly disagrees. Bill's friend Greg also disagrees. Your general social connections report that he's generally upstanding, but a few people have reported him to be dishonest.
---

Now, implementing this in Rapture sounds like a challenge. I'd rather export the data and do most of the processing off-line :)

Whether the user is capable of evaluating different qualities independently is a key early decision: would they simply give their friends the highest rating in all categories, and someone who was rude to them a low rating in all categories?

And for your second need, vetting content without admin approval: a player who creates, say, an object can submit it to other players for review. The trustworthiness of the collective evaluation can be computed using the trust viewpoint of admin characters. After a player has been creating for a while with admin approvals, he can start to get (admin-granted) trust from someone like Rurin. If you make the normal admin approvals available (but inconvenient and/or costly: a long waiting period, an in-game currency cost), then having a heavily trusted character would be very valuable and something unlikely to be squandered. Getting multiple such characters to sign off on inappropriate content would probably be very difficult if any player is free to submit a complaint about content.

Stilton
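A rough sketch of the trust-chain lookup Stilton describes, including the explicit "insufficient information" failure mode (failure type 3). The names, weights, and data layout are illustrative assumptions, not a real implementation:

```python
def evaluate(asker, subject, own_ratings, judge_trust):
    """Trust-chain lookup: own evaluation first, then trusted judges.

    own_ratings[player][subject] -> player's direct rating of subject (0-10)
    judge_trust[asker][judge]    -> asker's trust in judge's evaluations (0.0-1.0)
    """
    mine = own_ratings.get(asker, {}).get(subject)
    if mine is not None:
        return mine, "first-hand"  # highest weight: your own evaluation
    total = weight = 0.0
    for judge, trust in judge_trust.get(asker, {}).items():
        rating = own_ratings.get(judge, {}).get(subject)
        if rating is not None:
            total += trust * rating
            weight += trust
    if weight == 0.0:
        # Failure type 3: no connectivity. Report it instead of guessing.
        return None, "insufficient information"
    return total / weight, "second-hand"
```

A fuller version could recurse another level down the chain with decaying weight, and render the per-judge breakdown as the "John strongly disagrees" style summary shown in the post.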
08-27-2003, 09:20 PM | #13 |
Senior Member
Join Date: Apr 2002
Location: Seattle, Washington
Posts: 342
That is what The Pattern's End uses in its modes of trust. I'm not sure if it's the kind of trust system you are thinking about, but the trust levels run from 0-6, and it's all about whether you can enter a guild area, draw from the guild's donations chest, etc. The immortals also have a trust system of their own: depending on your trust, you can edit areas or code on the MUD. We have three implementors, and I think those three set the trust for the other immortals. The immortals also select who is going to be the head of a guild, and they have a good process for choosing the right person. They give the guild leaders a trust of 6 (the highest), and the leaders can then set all the guild members to whatever levels they want (trust level 5 allows a person to set the trusts of fellow guild members).
I find it a very good system. If you think people are going to take advantage of it... well, for one, the immortals and guild leaders would be fired if they abused it; for another, they wouldn't select someone who would do that and keep them in business.
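The 0-6 trust-level scheme described above could be modeled as a simple permission table. The specific thresholds here are guesses based on the post, not The Pattern's End's actual code:

```python
# Minimum trust level required for each action (levels guessed from the post).
ACTION_THRESHOLD = {
    "enter guild area": 1,
    "draw from donations chest": 2,
    "set member trust": 5,
}

def allowed(trust_level, action):
    """An action is permitted at its threshold level and above."""
    threshold = ACTION_THRESHOLD.get(action)
    return threshold is not None and trust_level >= threshold

def set_member_trust(setter_level, new_level):
    """Only level 5+ may set trust, and never above their own level."""
    return allowed(setter_level, "set member trust") and 0 <= new_level <= setter_level
```

The key design property is that trust flows strictly downhill: a level-6 leader can mint level-5 officers, who can manage the rank and file, but nobody can grant a level above their own.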
09-04-2003, 06:41 PM | #14 |
New Member
Join Date: May 2003
Posts: 15
If I understand correctly, the problem is how to create a system that ranks players and cannot be abused. Well, I have thought about it, and these are my thoughts.
Firstly, I think it would work best if it were invisible to players. If players don't know about it, you can be sure it won't be abused all that much. This obviously means you cannot have a ranking system where players award points as such. Instead I have come up with a vague idea that might work better.

Here's the idea. You have a command called "invite". You invite a player, and you can invite them to roleplay with you, or hunt with you, or whatever. The game measures how often someone is invited. This way you can see who people would most like to roleplay or hunt with. It's not perfect, but I think it would work to some extent at least.

For a way of telling how much players trust each other, you could use a slightly different method. Players assign other players a certain amount of trust, and this allows the trusted player to do certain things. So, for example, if you set your trust level with someone to a high value, then you cannot attack them (without first lowering your trust). If you trust a player, then any attacks they have, like backstab or such, are harder for you to defend against (you don't see them coming). This trust level could be controlled automatically to some extent: for example, if the player attacked you, your trust in them would go down automatically.

Your trust level for someone could also allow them to see more details about your character; for example, your guild might not show up to most players examining you, but shows up to people you trust. You could also set your default trust level toward strangers, which would let players roleplay paranoid characters. The incentive to trust people is that they can see how much you trust them, so by trusting people you in turn might earn their trust.
This would be abusable: players could trust their friends so that their ranking would go up. But at the same time, by trusting someone you are risking things, because, for example, you are less likely to notice if they try to steal from you, and they are able to see where you are (although on many MUDs that is possible anyway).

The problem with this system is that it is complicated. I think trust can work, but like any system, for it to work well you have to spend a long time on it. Anyway, I think my main point is that the only way to measure trust is to give people a chance to really trust someone, and see what they do.

Do you think that is any good? Or would it just not work in practice?

-Maraz-
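The invisible "invite" metric suggested above might be tracked with something as simple as this sketch. The names are hypothetical; nothing here comes from a real MUD codebase:

```python
from collections import Counter, defaultdict

class InviteTracker:
    def __init__(self):
        # activity -> Counter of how often each player was invited
        self.counts = defaultdict(Counter)

    def invite(self, inviter, invitee, activity):
        if inviter != invitee:  # inviting yourself shouldn't count
            self.counts[activity][invitee] += 1

    def most_wanted(self, activity, n=3):
        # A staff-only view: who do people most want to RP/hunt with?
        return self.counts[activity].most_common(n)
```

Because the counts are never shown to players, the main gaming vector left is collusion (friends inviting each other in a loop), which a real version might dampen by counting each inviter/invitee pair at most once per day.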
09-08-2003, 04:32 AM | #15 |
New Member
The context in which I heard this is lost in the mists of history, and I'm likely mangling the quote anyway, but consider this:
If we are sitting around in a pub, you and I, and the town drunk runs in exclaiming that a naked woman on a horse is roaming about in Trafalgar Square, we would laugh at him and go back to our pints. Should a respected businessman burst in and cry out that a unicorn is prancing through the Square, we would pick up a phone and seek help for him. But if it were the Pope to stumble in and breathlessly proclaim that a naked woman on a unicorn was parading through Trafalgar Square, we would run out to check for ourselves! We know it is foolish, impossible, absurd... but coming from a voice of such authority, we must verify with our own eyes what we know cannot be true before we'll cast aspersions.

Any "realistic" system must take this into account. That isn't to say that a system adopted by a mud must be realistic, and in the case of Achaea, where in-game rewards are auctioned for RL money, NO system is better than a system that can be manipulated by the players. Thus "realistic" gets bumped down the priority list even further.

Taking the situation above as a lesson, though, perhaps a viable system would involve giving players a number of points which they can distribute as they please. Clan/guild/faction/nation/whatever leaders would receive the most, their underlings a bit less... newbies the least of all.

From there, two possible systems present themselves to me. Either disallow assigning points within your own guild/faction, arguing that you already trust your guildmates as much as possible and the points are to be awarded to players outside your circle; or allow the points to be passed around without restriction, but weight their value according to where they came from. A point given to a guild leader by the guild steward might be worth only 1/4 or 1/8 of its full value. In either case, hide the actual value of the points...
Perhaps give the player some vague message to gauge it by ("You note the look of respect directed at you from your peers"), but make the steps between messages wide enough that it would take all the points of several people to change the message.

In the first case, it would be imperative that the rewards of membership in a faction be clearly superior to the rewards granted by being highly respected, and that the factions themselves be either highly competitive with each other or numerous enough to counterbalance the effect of a few alliances.

Man! It is way too late for me to finish this, but I think I got the skeleton of the idea out there.

Silrathi, will try to come back to this later
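The point-budget-and-discount scheme sketched above might look like this. The role budgets, the 1/4 same-faction discount, and the message bands are all assumed values for illustration:

```python
# How many points each role may hand out (assumed budgets).
POINT_BUDGET = {"leader": 20, "officer": 10, "member": 5, "newbie": 2}

SAME_FACTION_DISCOUNT = 0.25  # a same-faction point is worth only 1/4

def award(scores, giver_faction, receiver, receiver_faction, points):
    # Points from inside the receiver's own faction are discounted.
    value = points * (SAME_FACTION_DISCOUNT if giver_faction == receiver_faction else 1.0)
    scores[receiver] = scores.get(receiver, 0.0) + value

def respect_message(score):
    # The actual value is hidden behind wide, vague bands, so it takes
    # several people's points to move someone between messages.
    if score >= 20:
        return "You note the look of respect directed at you from your peers."
    if score >= 5:
        return "A few people seem to think well of you."
    return "You pass unnoticed."
```

The discount implements the second variant in the post (unrestricted giving, weighted by source); the first variant would instead reject same-faction awards outright.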
09-23-2003, 12:15 AM | #16 |
New Member
I think that regardless of who you are, we place people into certain trust boxes in games according to guild status, clan status, level, or any form of easily recognizable achievement in the world. This happens naturally for most people. Coding it into the game could be interesting, but it would have to be heavily moderated. Anything that relies on opinion and is controlled by players can be dangerous (but fun).
We have a form of this in our world, though it is web-based and does not affect the game in any coded manner. Players may leave one (moderated) good or bad comment a day on another's registry, and these comments can then be read by any registered player. This can give other players an idea of what the person is like, who they are friends with, whether there are complaints about them, and a general outline of their persona.
10-20-2003, 11:10 AM | #17 |
New Member
Join Date: Oct 2003
Location: Oxford, England
Posts: 10
I have trouble with the idea of a maths PhD working on something like this!
I would suggest incorporating someone such as a psychologist or philosopher into the mix. It seems an impossible task to come up with a logical system for measuring something so completely subjective. Any such system would, to my mind, require a great deal of human management, which essentially means it would be little more than a database.

Thain.
11-07-2003, 12:22 AM | #18 |
New Member
Join Date: Nov 2003
Posts: 4
Every philosopher who read that cringed, I promise you. Except for the most misguided early moderns and a few of the fluffier New Age thinkers, we tend to reject the notion of subjectivity.
That having been said, I think that the favor system already in place in Rapture is quite good at what it does. The players are aware of the effects of favoring, and very few of them care so little as to allow someone who is out to ruin an organization into power (and most of those people won't get the favors they need to give someone that power). If someone uses their power to favor in a way detrimental to the organization, the player's peers can strip that authority in a hurry with a concerted effort.

Can it be manipulated? You bet. Are power and trust in the real world both manipulated and manipulative? That's kinda the point...

Think about your average American election. Disregarding the fluke that just happened in California, generally speaking, only those with political experience are even given a passing thought. How does one come to acquire that experience? Either by taking on menial jobs for the betterment of the community or by maneuvering to the top through the favor of those already in power. Hard work builds trust. Association with those who are trusted builds trust (and if those in power aren't trusted, the association still grants respect and credibility).

Now think about the favor system. When you do good things consistently, you get favored. When you please the hierarchy, you get favored. Generally, if you accept an unelected position, you get favored. Do these things often enough, stay in the public eye, and you're a prime candidate for an election.

Of course, there are people who are trusted who aren't good candidates for, well... anything. Those who are charismatic but flighty are probably trusted, probably trusted very deeply by their inner circle of friends, but the rewards for this kind of trust (both IC and OOC) are usually simple favors shared between friends. There are those who'll go toe-to-toe with anyone in authority to protect a standing tradition, and if it is a good tradition, that person will be trusted.
Granted, this same person is going to get disfavored all along the way, but if someone were to do this in the real world, he'd probably have to lie low for a little while, too. No matter how necessary, driving a respected leader into retirement can do interesting things to one's reputation.

Most non-political flavors of trust are regulated just by the dynamics of human interaction. Why code things that don't need coding? Bad business practices are spread by word of mouth. Bad RP will cut down on a player's circle of friends if the expectation to stay IC is enforced by the imms and the players. Random acts of stupidity, even if IC, are going to block entry to just about everything.
11-09-2003, 09:50 AM | #19 |
New Member
Join Date: Oct 2003
Location: Oxford, England
Posts: 10
Heh! Careful what you claim there, my friend. I agree with most of the rest of your post, but it seems this topic has died.
- Thain.
11-09-2003, 01:14 PM | #20 |
Member
Join Date: Mar 2003
Posts: 103
|