Silk Road forums
Support => Feature requests => Topic started by: Obnubilate on April 14, 2013, 10:13 pm
-
Our current rating system (a 1-5 scale) is not working.
People who use this rating system properly are blacklisted and demonised for doing so.
All vendors expect a 5/5.
If you give anything less, chances are you are going to have a bad time.
For example, in IRC I witnessed a vendor complain that a customer had left a 4/5 instead of a 5/5 because the packaging wasn't perfect, or something of a similar nature. The vendor then claimed he would never sell to this client again.
After talking more with the vendor myself, he made a valid point: anything less than 5/5 may as well be a 1/5.
It ruins the user's reputation.
We can see the 1-5 star rating isn't working on other popular websites such as eBay either. Most eBay vendors post "Anything less than 5 stars is an unsatisfied customer."
I invite the Silk Road staff to reconsider our current rating system.
Instead of a 1-5 rating, I recommend a thumbs up/down system.
Sort of like YouTube. This way, vendors can still have a 1-100 rating on the site, but instead of users rating from 1-5, they either give a thumbs up, or down.
I would like to argue that this system would increase trust.
Taking an example from YouTube (which is maintained and programmed by some of the world's smartest minds), things improved greatly when they got rid of the old 5-star rating system.
Now, it is just a red/green bar.
People like to visualise information like this. It would also make vendors far more efficient at dealing with unhappy clients if a complaint were a simple thumbs down.
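The thumbs up/down idea maps cleanly onto the 1-100 vendor score the site already displays. A minimal sketch of how that conversion could work (the function name and the zero-vote convention are my own assumptions for illustration, not anything SR or YouTube actually implements):

```python
def vendor_score(ups: int, downs: int) -> float:
    """Convert thumbs up/down counts into a 1-100 style vendor score.

    Simple percentage of positive votes; returns 0.0 when the
    vendor has no votes yet (a site could display "unrated" instead).
    """
    total = ups + downs
    if total == 0:
        return 0.0
    return round(100 * ups / total, 1)
```

The same ratio is what drives a red/green bar display: the green portion is `ups / total`, the red portion the remainder.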
I'm tired of seeing "5/5 - Cocaine was bunk shit, drywall powder cut with 95% tide- gave me a nosebleed, didn't get me high, but got the stains out. Good vendor communication"
Obviously the above is a bit of an exaggeration, but we're not far off.
Currently, all I can tell from a vendor's rating is that he asks for FE, communicates well, or is a "bad vendor".
I think it's time we evolved the system.
There's a reason all the major successful websites have switched away from this rating style; isn't it time we did as well?
-
The SR rating system is broken, and so is yours, which is why I have voted against it.
What is needed is an arrival statistic. People are giving 5/5 for drugs they never actually received, and often only received an anemic refund for the non-arrival.
If there was an arrival statistic, people would simply say arrived yes/no and other buyers would be able to see it. The arrival statistic should directly relate to the country that the user is in - so customers in Sweden, for example, can see the arrival statistics for a particular vendor into that country.
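The per-country arrival statistic could be as simple as tallying yes/no reports per destination. A sketch of the aggregation (the data shape and function name are my assumptions for illustration):

```python
from collections import defaultdict

def arrival_stats(shipments):
    """Aggregate per-country arrival rates for one vendor.

    `shipments` is an iterable of (country, arrived) pairs, where
    `arrived` is the buyer's yes/no report. Returns a mapping of
    country code to arrival percentage.
    """
    counts = defaultdict(lambda: [0, 0])  # country -> [arrived, total]
    for country, arrived in shipments:
        counts[country][1] += 1
        if arrived:
            counts[country][0] += 1
    return {c: round(100 * a / t, 1) for c, (a, t) in counts.items()}
```

A Swedish buyer would then look only at the vendor's rate into Sweden rather than a global average dominated by domestic orders.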
The Atlantis market is in the process of implementing this. Don't hold your breath about any changes being made here. Other users and I have been vocal about this issue for months, to no avail.
-
Sliding Statistic System:
Arrival: (Y/N)
Stealth: (None, Fair, Excellent)
Quality: (Fake, Poor, Fair, Good, Excellent)
From these statistics, Arrival should be heavily weighted.
50% of the total should be for Arrival.
25% for Stealth.
25% for Quality.
An exception to the formula: if Quality = Fake, raise the weight of Quality to 50% and drop Arrival back down to 25% for that one transaction. This would help prevent people from sending sand and getting fair reviews for shipping SOMETHING when it was fake/misrepresented.
if ($quality === "Fake") {
    // Fake product: Quality carries the heavy weight for this transaction.
    $qualityWeight = 0.50;
    $arrivalWeight = 0.25;
    $stealthWeight = 0.25;
} else {
    // Normal case: Arrival is weighted heaviest.
    $qualityWeight = 0.25;
    $arrivalWeight = 0.50;
    $stealthWeight = 0.25;
}
Within this system, the rating should take the most recent 50 transactions (or 2 months' time) and weigh them more heavily than all previous sales. This would help shut down scammers very quickly with a few legitimate bad reviews.
I think only those who have purchased over $100 should be allowed to use the complete scale. Under $100 should be limited to the usual 1-5 rating system, to prevent vendor-feedback wars waged by ordering cheap e-books just to damage a vendor's rep. If you want to damage a vendor's rep that badly, you're going to have to pay $100 to do it, and after the next 49 people have good experiences it will essentially be negated and wasted money.
Weighing the last 50 transactions (or 2 months) at a 4:1 ratio against all previous transactions would also give vendors with some prior hiccups a chance to recover in 2 months. Fair to the vendor and the community, while keeping the vendor-feedback war to a minimum.
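Putting the sliding-scale weights and the 4:1 recency rule together, a sketch in Python (the point values assigned to each Stealth and Quality level, and the function names, are my own assumptions for illustration):

```python
def transaction_score(arrival, stealth, quality):
    """Score one transaction on the proposed sliding scale (0-100)."""
    stealth_pts = {"None": 0, "Fair": 50, "Excellent": 100}[stealth]
    quality_pts = {"Fake": 0, "Poor": 25, "Fair": 50,
                   "Good": 75, "Excellent": 100}[quality]
    arrival_pts = 100 if arrival else 0
    # Fake product: Quality takes the 50% weight, Arrival drops to 25%.
    if quality == "Fake":
        w_arrival, w_stealth, w_quality = 0.25, 0.25, 0.50
    else:
        w_arrival, w_stealth, w_quality = 0.50, 0.25, 0.25
    return (w_arrival * arrival_pts + w_stealth * stealth_pts
            + w_quality * quality_pts)

def vendor_rating(scores, recent_n=50, recency_weight=4):
    """Weigh the most recent `recent_n` transaction scores 4:1 over older ones."""
    recent, older = scores[-recent_n:], scores[:-recent_n]
    total = recency_weight * sum(recent) + sum(older)
    denom = recency_weight * len(recent) + len(older)
    return total / denom if denom else 0.0
```

Note how the fake-quality exception caps a "shipped sand" transaction at 50 even when it arrives with perfect stealth, and how a run of recent bad scores drags the rating down four times faster than old history can prop it up.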
Also, if buyers don't finalize, the order goes to auto-finalize, and they leave no feedback, they should be fined $10: $5 to the vendor and $5 to SR. These tools are here to keep the place in order.
-
If there was an arrival statistic, people would simply say arrived yes/no and other buyers would be able to see it.
This ^^. I don't personally believe the SR rating .SYS is broken. But I do believe the people are broken. Buyers and sellers alike. Some people have no clue how to leave feedback, and some (actually, most) vendors think it is a slight when you leave less than 5/5, even if there is a serious problem with quality or stealth. I should not have to state this, but 5/5 is 100%, 4/5 is 80%, 3/5 is 60%, 2/5 is 40%, and 1/5 is 20%. Some buyers have left 5/5 feedback for product that didn't even arrive. Frankly, that's embarrassing. (Actually, that was stated above as well, so I am not the only one to notice it.)
Sliding Statistic System:
Arrival: (Y/N)
Stealth: (None, Fair, Excellent)
Quality: (Fake, Poor, Fair, Good, Excellent)
From these statistics, Arrival should be heavily weighted.
50% of the total should be for Arrival.
25% for Stealth.
25% for Quality.
An exception to the formula would be if quality = fake, change the weight of Quality to 50% and the Arrival back down to 25% for this one transaction. This would help prevent people from sending sand and getting fair reviews for shipping SOMETHING but it was fake/misrepresented.
While I reserve the right to disagree with components of this proposed .SYS, I thumbs up the above. (+K for actually thinking all this through carefully.) I hope this doesn't come out condescending, but some users will struggle when leaving feedback if it's complicated. Sorry, it's the truth. I don't judge, I love. :)
But I'd like to add to this debate that any implemented .SYS is dependent on its users. You can have the best rating .SYS in the world and it would still manage to be rendered ineffective by some buyers. I wish buyers would be more honest with their feedback, b/c dishonest feedback hurts the .SYS and lets other buyers get sucked into receiving dangerous goods, or shitty gear, or good gear that never ends up getting shipped.
J U S T M Y 2 B I T C E N T S . . . .
Piece, Love, and Fuck Haters.
-
It would be useful if you could see a) when the person giving feedback joined the site and b) how many other items they have given feedback on.
We wouldn't need to see their username; just some stats would be useful in figuring out if a vendor is giving themselves fake feedback.
I noticed a vendor signed up the other day and within a few hours they already had good feedback... Express post is quick... but not same day quick!
-
Let's not reinvent the wheel here. I suggest a simple description of what each rating means. Something like:
5 - superb quality, flawless packaging
4 - perfect except for some minor details; will use again
3 - okay product, packaging okay but not great, slow seller response. Will use someone else next time.
2 - ghetto-quality product, packaging very suspicious, I regret doing business with this person
1 - scam. Product was inert, shipped from a country other than the one listed, etc.
Also, DPR could add something like:
N/A - asked to finalize early
-
Let's not reinvent the wheel here. I suggest a simple description of what each rating means. Something like:
5 - superb quality, flawless packaging
4 - perfect except for some minor details; will use again
3 - okay product, packaging okay but not great, slow seller response. Will use someone else next time.
2 - ghetto-quality product, packaging very suspicious, I regret doing business with this person
1 - scam. Product was inert, shipped from a country other than the one listed, etc.
Also, DPR could add something like:
N/A - asked to finalize early
This would be reasonable if we were starting out new. But we aren't. We are starting with this current system, which I agree is broken. But how do you start a new 1-5 based system "the right way" when it's already been used wrong for so long? As a vendor, I can tell you that even if everyone were to agree that "starting now" we'll use the 1-5 system the right way, the first time I get a 4 after that, I'm going to have a problem with it. It's taken me two months to recover from a single 4 rating a buyer gave me (then tried to extort free weed from me in return for changing it to a 5). Vendors react the way we do to 4 ratings because we've been conditioned to react that way. In this system, having a 99 rating is very harmful, and I don't see how it can be fixed without throwing out the existing system and starting over. You would be amazed how many concerned customers messaged me after that 4 rating brought me down to 99, asking sympathetically "dude, what happened?" as if I had acquired a painfully terminal illness or something. It's actually pretty close, I gotta tell you.
We are all accustomed at this point to believing that anything less than a 100 score means a bad vendor. Buyers and vendors are acclimated to that system, as bad as it might be, and I don't see how you can change only what the numbers mean in that system. Unfortunately, it seems that DPR has some attachment to the current system. I would too if I had come up with those geeky formulas used to calculate the vendor score (read the wiki if you haven't seen those formulas). I'm sure that a lot of effort and well-intentioned thought went into coming up with those calculations. I see two problems. One has been talked about previously: the 1-5 scores no longer mean what they are supposed to mean. The other problem I see is transparency. It's crucial that buyers and sellers have confidence that the system is fair. For them to think it's fair, it needs to be understandable at least at a basic level. The current formulas for vendor rating are the opposite of transparent. I'm not a geek by any means, but I got straight A's in math class and I still can't figure out when that 4 rating is going to stop affecting my vendor page rating, which now stands at 99.8% after 2 months and 50 new transactions.
So I think some of the other posters are on the right track in saying it needs to be reinvented entirely if it's going to be changed at all. That would mean getting rid of the 100 rating system and the 1-5 feedback entirely and starting with something completely new. It could be as simple as changing the numbers to letters, but it has to be different enough that people will know it's a new system, and not just the old broken system "but now we're doing it right". And I think the poster who said it has to be easy to use has a very valid point that ties into my point about transparency. It has to be easy to understand AND to use for it to work. It has to be understandable enough to most users that it's perceived to be fair.
Something that hasn't been mentioned that is a pet peeve of mine is feedback comments. It seems like some vendors are able to make bad feedback comments disappear from their vendor page. Vendor support as much as admitted to me that it was a loophole that some vendors have figured out. They mentioned something about technical reasons why it's hard to fix. I haven't figured it out, but apparently some vendors have, and it needs to be fixed for the feedback comments to be meaningful. I know of a particular recent scamming vendor (I'm not going to mention names here) who did that very recently. I'm sure some reading this were taken in by his scam and were surprised to find that his vendor page has no negative feedback comments at all despite his rating dropping down quite a bit. So it seems the score was calculated correctly, but on his vendor page the bad feedback ratings and negative comments were deleted. A bug like that in the system makes it seem unfair and makes vendors and buyers distrust the system. Fixing it should be a priority, because right now the comments are the only realistic and meaningful ratings there are in the system. Since, in the world we live in now, any feedback score less than 5 is nuclear winter to a vendor, the comments are really all we have.
-
I'm tired of seeing "5/5 - Cocaine was bunk shit, drywall powder cut with 95% tide- gave me a nosebleed, didn't get me high, but got the stains out. Good vendor communication"
cracked me the fuck up, +1 if I could mate
-
A thumbs up/down system is essentially what we have now, where anything less than a 5/5 is a bad review. I think the system we have now would be fine if it were tweaked a little. Anything under 3 should be considered bad; that way you can show vendors where they need to improve without fucking their 100 score, and you could be more honest with your number rating.
-
The rating system isn't the problem. Every vendor who puts someone on the blacklist for a bad rating is the problem!
-
The blacklist everyone talks about... is there a blacklist publicly available, or do you guys just mean the vendors' own personal blacklists?
-
The rating system isn't the problem. Every vendor who puts someone on the blacklist for a bad rating is the problem!
The blacklist everyone talks about... is there a blacklist publicly available, or do you guys just mean the vendors' own personal blacklists?
No, it is only available to vendors on the roundtable. By rights, anyone on it should not be sold to, but there are a lot of vendors that still don't check it before sending out samples, which I think is not the way to go. I am sure that some buyers end up on there that don't deserve to, also, so I guess it swings both ways.
Piece, Love, and Fuck Haters.