Two years ago some friends and I started a company called Tribeworthy out of our apartment, with the idea of creating a rating and review platform for news and the goal of improving trust and understanding between journalists and news consumers.

Six months after we started it, the 2016 election happened.

Then a few days ago, Elon Musk’s tweets happened.

It’s been a wild ride.

Our main focus right now is on our Chrome browser extension, but we also have a website and an iOS app. You can check out the extension here: https://www.tribeworthystart.com

My name is Austin Walter, ask me anything!

Proof: https://imgur.com/AeQsvzZ

EDIT:

Further proof:

https://twitter.com/Tribeworthy/status/1002337822575955968
https://itunes.apple.com/us/app/tribeworthy/id1326275137?mt=8

Comments: 419 • Responses: 35

Arrogus963 karma

Do you honestly expect people to rate media based on their accuracy rather than how good a job said media does of confirming their previously-held beliefs?

_oscilloscope313 karma

So one thing that we're doing to try to combat this within our review process is we make people choose a specific reason why they think there is a problem with an article. We only display the top three reasons with an article's rating, so in this way it's harder for a person to try to trash an article just because they don't like it. In addition, if a person is constantly reviewing certain types of articles as good or bad, that will eventually be reflected in their user rating.
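
To make that concrete, here is a rough sketch of how the "top three reasons" could be pulled out of an article's reviews (the data shape and field names are made up for illustration, not our actual implementation):

    from collections import Counter

    def top_reasons(reviews, n=3):
        """Return the n most commonly cited problem reasons across an article's reviews."""
        # Each review is assumed to look like {"verdict": "contest", "reason": "Misleading headline"}.
        reasons = Counter(r["reason"] for r in reviews if r["verdict"] == "contest")
        return [reason for reason, _count in reasons.most_common(n)]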

Nofanta10 karma

How about also choosing a specific reason why they think there is not a problem with an article - it's absolutely just as important and open to abuse and manipulation.

_oscilloscope8 karma

We are absolutely thinking of doing this. Our first goal was to address people unfairly dragging content down, but now we're exploring how to curb that same kind of abuse in the other direction.

Beckels84249 karma

How is your platform not subject to inherent bias like any other social media platform or website? It relies on consumer votes so is susceptible to voter bias and manipulation. For example, if only a certain demographic knows about your site or comes to vote, or if a group with an agenda bands together to promote and vote certain stories.

_oscilloscope34 karma

So first of all, we are aware that you can never completely eliminate bias on a platform like this. That said, we have a few techniques for trying to filter some of the bias that people bring with them. First, we're doing our best to re-think how we show content to users to avoid the echo chamber effect. Second, we assign a user rating to each user that depends on how much they've verified their account, how well rated their past reviews are, and whether we've detected them trying to manipulate the system.

22rann86 karma

How “well rated” past reviews are? They’re rated by biased people right?

_oscilloscope8 karma

Yes. Yes they are. But, as I said, that's only a part of what determines a user's rating. We've seen how Reddit and Stack Overflow run their sites, and we don't want to discourage early or first-time users, and we know that simple upvotes/downvotes can be easily abused. Our plan is to allow anyone to start leaving reviews after creating an account; it's just that the weight of their review will initially be very low. As users provide more verifiable information (e.g. two-factor authentication, profile picture, phone number, Twitter account) their rating will increase in weight. If a person is reviewing a certain way over and over again, this will affect the weight of their review, etc.
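
As a rough illustration, the weighting could look something like the sketch below. The signals and numbers here are invented for the example; our real formula is still evolving.

    def review_weight(user):
        """Start every account at a small base weight and grow it with verification."""
        weight = 0.1  # brand-new accounts still count, just not very much
        if user.get("two_factor_enabled"):
            weight += 0.3
        if user.get("phone_verified"):
            weight += 0.2
        if user.get("twitter_linked"):
            weight += 0.2
        if user.get("profile_picture"):
            weight += 0.1
        # Repetitive one-sided reviewing or detected manipulation drags the weight back down.
        weight *= user.get("behavior_multiplier", 1.0)
        return min(weight, 1.0)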

Trisa13325 karma

If a person is reviewing a certain way over and over again, this will affect the weight of their review

I think this is the problem everyone is trying to get you to address. What is your solution to this? Because it sounds like if they get a bunch of their hive mind to thumbs-up their review, it will increase their weight... even if it is wrong and heavily biased.

_oscilloscope14 karma

What I meant by that is if a person is constantly upvoting/downvoting the same user, the effect of that person's vote on that user's rating will lessen over time. As well, if there is an influx of the same type of review or votes on an article or user, we plan to make it so those people leaving the votes or reviews will be flagged as possibly brigading, and the weight of their reviews or votes will be decreased for the article or person they were trying to rate.

There are still issues with this that need to be worked out of course, but that's the gist.
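
For anyone who wants it spelled out, here is a toy version of both ideas: repeated votes on the same target count for less, and a sudden burst of identical reviews gets flagged. The decay curve and thresholds are placeholders, not our production values.

    def vote_effect(prior_votes_on_same_target):
        """Each additional vote from one person on the same user counts for less."""
        return 1.0 / (1 + prior_votes_on_same_target)

    def looks_like_brigade(reviews_last_hour, threshold=20):
        """Flag a possible brigade when a burst of same-verdict reviews hits one article."""
        contests = sum(1 for r in reviews_last_hour if r["verdict"] == "contest")
        return contests >= threshold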

Beckels8419 karma

Doing our best to re-think how we show content to users to avoid the echo chamber effect.

This sounds exactly like the Facebook algorithms that DECIDE FOR US what to show and what we will be interested in.

I'm sure you mean well, and I'm sure you'll do your best, but the problem with any vote/like/share media platform is that you immediately get a hive mentality. Misinformation gets spread like wildfire just because it's "popular", and you're only going to see the biggest trending stories.

If you ask me, what we need is some kind of platform that focuses on organization, ease of use, and access to every remote little nugget of news across the world. Something that lets me search through and find information for myself without telling me what's "popular". Instead of giving users credibility/karma/popularity ratings, how about give us that info about the news sources and journalists? How long have they been up and running? Did the journalists go to school, do they have a history of awards/recognition or controversy? Do they have known political affiliations? I don't care if HornDog57 actively comments on news stories, I care about who wrote the article I'm reading and what agenda they have.

_oscilloscope4 karma

I mean that's the problem isn't it? If one organization tries to decide for consumers what they see, then the org is being patronizing and controlling. If they try to let the information be as free flowing and granular as users want, then the org is encouraging echo chambers.

There isn't going to be a perfect solution, it's just going to be a compromise, or there will be several sites that do one or the other.

However, I never said we were planning on deciding for users like Facebook does. I think our current concept is much closer to yours. By using a browser extension, we can focus on displaying ratings to users as they go about their natural news gathering process, and one of the things we liked most about a "Yelp" or "Rotten Tomatoes" style model was their focus on search over displaying a feed. In addition to that, we want to allow a journalist who wrote an article to be able to claim a profile and then respond to reviews or criticism on their articles.

BuildEraseReplace107 karma

What safeguards are in place to prevent "score bombing" or otherwise destroying a journalist's reputation in a similar way to downvote brigades here on Reddit?

My concern is that, unlike vote brigades which are commonplace on Reddit but do nothing except lower useless internet points, your website could be used to effectively ruin journalistic reputations if your website takes off and otherwise earns credibility.

Will you have a moderation team who are able to identify this behaviour? Would you then blacklist voters or somehow protect the scores of targets? (which creates obvious problems of protecting potentially poor journalism... alliteration unintended)

How are you planning on combatting this behaviour? You can be virtually certain that you'll encounter it, and unsatisfactory management of it will hurt undeserving journalists and eventually the credibility of the rating system altogether.

_oscilloscope24 karma

We absolutely acknowledge the potential harm this could have on journalists. This is why we don't let people directly rate or review journalists themselves; their ratings will be determined by the aggregate reviews of their articles. We have also been slowly rolling out this functionality so that we make sure it works well.

We will have a moderation team, and if we see users brigading a journalist's article, those users' ratings will be temporarily reduced in weight so as to not negatively affect the journalist, and then those users' accounts will be flagged for review.
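
In code terms, the aggregation is roughly this (field names invented for the example, not our actual schema):

    def journalist_rating(articles):
        """A journalist's score is a weighted average of their articles' trust scores."""
        # Each article carries a trust score in [0, 1] and the number of reviews behind it.
        total_reviews = sum(a["review_count"] for a in articles)
        if total_reviews == 0:
            return None  # not enough data to show a rating yet
        return sum(a["trust_score"] * a["review_count"] for a in articles) / total_reviews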

pperca69 karma

What kind of protections would a site like that have against bias?

Uninformed people rating articles based on their ideology do not make your site any less biased than the original sources.

What kind of safeguards do you have to eliminate that? As we have seen in Facebook, Twitter and Reddit, popularity does not equate to truth.

_oscilloscope12 karma

As I've said in other comments, we are implementing a credibility rating for users on our platform to try to fight bias. To add to that though, I'd like to say that we're not trying to be the arbiters of truth. I think as time goes on there will be three answers to how to determine news credibility: fact-checkers, AI, and crowd-sourcing. So as we progress we'd like to partner with other organizations such as fact-checkers and media companies to try to give news consumers a more holistic view of what's going on.

JuIiusCaesar33 karma

Do you have a way to prevent corporate, Russian, or some other source of bots from messing with the ratings?

_oscilloscope12 karma

Through a combination of normal reCAPTCHA techniques, account verification, and detecting the bot's location. Honestly, I'm not overly concerned about bots. Bots have persisted on social media in the past because it helped social media sites look more popular, not because they couldn't be detected. In fact, what I'm more concerned with is coming up with a transparent way of demonstrating to our users that we are preventing bots.

That's not to say we think our system is perfect or bot-proof. We're constantly looking for more safeguards.
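
If it helps, here is the kind of simple signal-combining we have in mind; the specific checks and thresholds are illustrative only, not a description of our live system.

    def likely_bot(account):
        """Combine a few cheap signals into a yes/no guess about automation."""
        signals = 0
        if not account.get("passed_captcha"):
            signals += 1
        if not account.get("email_verified"):
            signals += 1
        if account.get("reviews_per_hour", 0) > 30:  # inhuman review rate
            signals += 1
        if account.get("geo_mismatch"):  # login location doesn't match the account's history
            signals += 1
        return signals >= 2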

soldieronspeed30 karma

Your site runs into the same issue as every other public information forum: it depends on the populace being well informed and unbiased, something that is pretty much impossible in our current social climate. Your idea to combat this is also flawed, because a person completing their entire profile and a bunch of other people regularly rating their opinions highly does not mean that person knows what they are talking about.

You are essentially creating and enforcing what Kant referred to as the guardians: people who, simply because of their position or popularity, could shape information and society even if their information was wrong or inaccurate.

While consensus and social agreement are important, empirical evidence and hard facts are what that consensus should be based on. With that in mind, how do you think you could improve your site to increase agency of the individual and help people make informed decisions for themselves?

_oscilloscope12 karma

What good is empirical evidence and hard facts if nobody pays attention to them? And how do you get people to focus on them without centralizing control of what people see?

It seems that to many people, even trying to fix the problem makes you part of the problem.

We're still a very small company, and our product is still progressing. As we grow we plan to put in more safeguards.

We plan to be transparent about how we determine ratings.

We are putting in algorithms to work in conjunction with the crowd-sourced reviews.

We want each news outlet and journalist to have their own pages, where you can read general information about them in addition to their ratings such as where they are based, what they generally write about, etc.

We aren't going to let people directly rate or review people or organizations. Those ratings will be determined through aggregation of their article ratings.

We want to eventually let journalists claim their pages and then respond to criticism on their articles so people can get both sides of the story. Much in the same way that a business owner can respond to a reviewer on Yelp.

We are partnering with news organizations, and eventually want to have journalists rate articles as well as users.

We hope to eventually even partner with fact-checking organizations to try to cover as much ground as we can.

Ultimately we're just trying to consolidate information about news outlets, journalists, and articles so users can reference all this information in one place, and then give them a general sense of how other journalists and consumers feel about them.

Redditisfreedom30 karma

The whole concept seems flawed, change my mind. "We have created a platform for users to accurately rate the truth of news sources, to combat... users... from... inaccurately... rating... hmmm.." So, your plan is to just ask people on the internet to not be people on the internet?

_oscilloscope7 karma

  1. We currently do account for high/low scores, but that's it. We will be employing additional algorithms as we progress to account for bias and the weight of users' ratings.

  2. I think the idea of having a unified credibility rating is a terrible idea. I think having broken-up or separate credibility ratings in different spaces can be a good thing. I don't want to decide whether you can buy a car, I just want people to be held accountable on my own platform.

NicCage4life29 karma

How do you define truth? Can truth be determined by public consensus?

_oscilloscope27 karma

I don't think it is for us to define truth, and it definitely can't be determined by public consensus. You have to remember that we had this idea before the 2016 US presidential election. We didn't come up with this idea in response to "fake news" or anything like that.

We wanted to do this because we wanted to increase communication and trust between news publishers and news consumers. We want journalists to be held accountable, and we want our users to be held accountable.

Grzegorxz17 karma

Isn't the idea of "rating the core truth" based off of how much people love or hate the truth?

_oscilloscope17 karma

This is actually one of the main reasons I wanted to have this AMA. We don't want the public leaving reviews to determine the "core truth". We don't want to be, or plan to be, a site for users to rate how "true" an article is. We want to allow users to state how much they trust an article, and then make them give a specific reason why they think that.

With this in mind, we also don't think our ratings should be the only way a consumer determines whether they trust something. We think that what we're doing is needed in conjunction with traditional fact checking.

pfeifits15 karma

Isn't the difficulty in media bias that there is a market for it? Meaning there are many people who agree with the narrative and believe it to be true, even if it is demonstrably false. I think it is naive to say that if a majority thinks it's true, it is. There are many instances where a majority believe something to be false. Who becomes the arbiter of what is true and what is false when everyone is sectioned off into their own echo chamber?

_oscilloscope12 karma

Yes that is the difficulty in media. That's why we started doing this in the first place, to try to reduce the market for media bias. I understand that many are not satisfied with my answers so far dealing with bias, or think I'm naive.

I've been on Reddit for a long time. I read the news. I'm perhaps more of a cynic than most people.

Here are my thoughts.

News media is messed up. It's messed up for lots of reasons.

How do we improve the situation of news media? Fixing it is going to require lots of different people trying to come up with many different solutions. I don't think all of them will work, but if we don't try something nothing will improve.

The problem I have chosen to focus on is trust. Trust between news publishers and news consumers. Not truth. Trust.

I know that the first, and most prevalent problem I am dealing with is bias from users.

But I think the difference between our tools and other currently existing platforms is that they wanted to pretend that bias didn't exist or wasn't a problem.

I'm telling you now that I unequivocally believe that bias from users is a problem and that it is pervasive.

I think that accepting that is half the battle.

Whenever a person leaves a review, we know that no matter how objective they're trying to be, they are going to be injecting their own bias into it. In the future, as we grow, we plan on compensating for this using an algorithm. But I think it's important to state that while I believe that algorithms can be useful they aren't going to be an answer all on their own.

We have other ideas in the works (systematic sourcing, text analysis), but talking about them while they're underdeveloped won't help.

We also want to partner with fact-checking organizations to help them know what areas people feel the most uncertain about. But that's a ways off.

But instead of saying "what's the use?" I'd encourage everyone to instead try out what we currently have, offer suggestions on how they think it can be improved, tell us how we can be more transparent so that we can show you that the system is working.

Don't let the search for a perfect system be the enemy of the good.

ssnewp_22025 karma

What's the end-goal for the website?

_oscilloscope7 karma

We'd love for news consumers who find an article to reference us before deciding to spend their time on it. Just as we check Rotten Tomatoes before spending our time watching a film. And in the same way we want to help people find the most trusted articles on any given topic. For example, if you came to Tribeworthy and searched "Iran Nuclear Deal", we'd give you a list of all the relevant articles and how trusted each of those articles is by other news consumers. Then you can decide for yourself which ones are worth your time.
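
In code terms, that search is basically "filter by topic, sort by trust" (a sketch with invented field names, not our actual ranking):

    def search_results(articles, query):
        """Return articles matching a topic, most trusted first."""
        matches = [a for a in articles if query.lower() in a["title"].lower()]
        return sorted(matches, key=lambda a: a["trust_score"], reverse=True)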

ForgedIronMadeIt4 karma

Why do we expect non-experts on the news (i.e., John and Jane Doe) to be able to accurately assess the veracity of the news? Sounds like a massive ad populum. I think I will prefer having news organizations verifying each other as necessary.

Aside from that, it sounds like this is going to be more or less a clone of Reddit. What is the difference?

_oscilloscope7 karma

We actually agree with you. That's why on each article page we want there to be a general user rating, and then also a journalist's rating. Much in the same way film critics will review a film, we want other verified journalists to review articles.

As for how we're different from Reddit, I think we sound like Reddit in passing, but we're really nothing like Reddit. Right now, the main way we are having users engage with articles is through our Chrome extension, and as we improve the website we want it to be more similar to the look and feel of Rotten Tomatoes or Yelp, where you can see some of that day's content, but the main focus will be on our search bar. As well, we'll be implementing Journalist and News Outlet pages soon, where you'll be able to get general information about them, but not directly review them. Ratings of News Outlets and Journalists will be determined by aggregating the ratings of their articles.

BurgerPleaseYT4 karma

What's your favorite burger joint?

_oscilloscope4 karma

In-n-out.

BossFightStats4 karma

What qualifications do you and your friends possess? Have you hired any ethicists or others with qualifications outside of programming?

_oscilloscope3 karma

We are partnered with media watchdog and media literacy organizations. They have helped advise us while we've been developing the review process.

JacePriester4 karma

The extension requires permission to: "read and change all your data on the websites you visit"

I write software, I get it, it needs to inspect pages and add stuff to them... but you know how bad that sounds? Pass.

_oscilloscope2 karma

Yeah I know that sounds terrible. Chrome forces us to use that language and there's not much we can do about it. Do you have any tips or suggestions?

radgepack3 karma

What about us Firefox users?

_oscilloscope3 karma

That's the next browser that we're working on supporting.

nom_of_your_business2 karma

When will Firefox support it?

_oscilloscope2 karma

We don't have an ETA yet, but that's the next browser we are working on supporting. If you don't use Chrome you can also use our iOS app or our website.

dankwaffle2 karma

Have you contacted Elon Musk regarding his thoughts on your platform?

_oscilloscope2 karma

We've tweeted at him but other than that we don't have any other way to reach out to him.

Snackasaurus2 karma

I see on your website you indicate that "speculation" is a negative - how would this affect editorials and comment pieces that analyse politics and economics, for example?

How would you guard against legitimate speculation - clearly presented as such - from being downvoted?

Similarly, where you say that it's a bad thing for news organisations to decide what their readers see, isn't this just a matter of editorial stances on certain matters? A news source isn't obliged to give all sides of a story if, for example, the organisation favours one political candidate over another.

_oscilloscope2 karma

We are looking for a better way of distinguishing purposeful and upfront speculation and editorial pieces. Possibly breaking them off into their own category. However, we're much more concerned about articles that try to pass themselves off as being authoritative or objective analysis that are actually just trying to manipulate people into thinking a certain way.

Dalmahr1 karma

Noticed on the site it talks about a Chrome extension. I've mostly stopped using Chrome. Are there any plans for an Edge or Firefox version?

_oscilloscope1 karma

Firefox is next on our list of browsers to support!

ooainaught1 karma

My guess is this works in a similar way to Wikipedia?

_oscilloscope2 karma

Tribeworthy is a play on the word "Trustworthy" as decided by our "Tribe" of users. But that's not a bad idea. We're actually open to name changes if we ever feel like the name Tribeworthy is holding us back.

nineran1 karma

Do you believe that objectivity is a value media/journalists should strive for?

As opposed to, say, transparency?

_oscilloscope5 karma

I don't think objectivity and transparency have to be at odds. I think that they are both values that news media should strive for.

Chaserly1 karma

Is this site not easybib?

_oscilloscope3 karma

With our extension we actually let you take our ratings and review process with you as you go about your natural news gathering. So we're more of a utility than a bibliography generator.

I_cant_stop_evening1 karma

Have you tried to reach out to Elon Musk about your site? Is there anything you can do to merge ideas or sites or are you kind of SoL that some guy with a massive following will create a site that will most likely surpass your user base greatly?

_oscilloscope2 karma

I don't think he actually plans to start a site like this. Maybe invest or buy one out, but he seemed like he was mostly joking. We've tweeted at him a few times but that's about it.

nooBarOne1 karma

Are there "shades of gray" in the ratings as opposed to just true/false? Seems hardly anything is considered 100% "true" by 100% of the people. Does the rating reflect the proportion of true vs. false ratings?

_oscilloscope2 karma

We're currently considering introducing a scale instead of just using trust/contest buttons. The ratings do reflect the proportion of true/false ratings. We definitely want to introduce more "shades of gray" into the review process though. Nothing is cut and dried.
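
For the current trust/contest buttons, the rating is essentially just the share of trust votes, something like this (a sketch, not our exact code):

    def trust_rating(trust_votes, contest_votes):
        """Today's rating is the proportion of 'trust' votes among all votes cast."""
        total = trust_votes + contest_votes
        return None if total == 0 else trust_votes / total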

marklar001 karma

Is your name Kevin Rose?

_oscilloscope1 karma

Do I look like Kevin Rose?

FlyingChihuahua0 karma

How long did you suck Musk's dick to get him to say that?

_oscilloscope8 karma

Road head can last a long time when you're in a self-driving car. Also much safer.

IwishIwasunique0 karma

Dude, you have crazy eyes! Why should I trust you? Seriously, and more important to me, what are your political leanings?

_oscilloscope4 karma

Haha thanks? The point was to make it look like Elon was holding me hostage.

gwoz88810 karma

So you’re Aaron Swartz?

F

_oscilloscope1 karma

F

NoTeeNoShade-2 karma

What do you have to say to high school-aged children about the necessity or the usefulness of your creation? What must they know, as new Internet scholars?

_oscilloscope-1 karma

High-schoolers and college students have been VERY receptive. In fact, we go to high schools and universities often to speak to students about this. To them it just makes sense right away. Our review process trains them to question how the content was put together and whether there are any biases or fallacies in the article's argument, and it provides them with a warning about problems in the article before they start reading. We've found that high school-aged children review less often because they have less confidence in their ability to critique articles, but they're getting a lot of value from observing the reviews/ratings when gathering news online.