I was just reading an article about Facebook taking a more active role in "combating misinformation"...
Facebook says it is taking more steps to encourage voting, minimize misinformation and reduce the likelihood of post-election “civil unrest.”
...“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post on Thursday.
Yeah, that all sounds great. But I have zero faith that the people running Facebook are actually "protecting democracy" -- they are participants in democracy, not moderators of it, and they have their own agenda.
To me this feels like another social control experiment I didn't give informed consent for. And Facebook does this stuff all the time. I feel that anybody who wields that kind of power should be subject to some kind of checks and balances. Right now, we have no way of knowing what their goals are.
If Facebook's algorithm actually influences civil unrest... can we see it? Can we discuss what it's doing, and whether it's doing it well? When government policy affects these things, we get to debate it and vote (indirectly) on it.
Right now in 2020,
Algorithmic Transparency is not a widely supported political issue.
But I think it should be.
What would that even look like?
Here's my first stab at it:
I think that websites which display a newsfeed and have over 1 million registered users should be required to publish a whitepaper explaining how their ranking algorithm works.
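Just to make that concrete, here's a rough sketch (in Python) of the kind of thing such a whitepaper might disclose: the signals a feed-ranking function uses and the weights it gives them. The signal names and weights below are invented purely for illustration -- I have no idea what Facebook's actual ranking function looks like.

```python
# Hypothetical example of what a *disclosed* feed-ranking function could look like.
# Signal names and weights are made up for illustration; this is NOT Facebook's
# actual algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    recency_hours: float      # how old the post is
    friend_closeness: float   # 0..1, how close the viewer is to the author
    engagement_rate: float    # reactions/comments per impression so far
    flagged_misinfo: bool     # marked by fact-checkers

# Published weights: anyone could inspect (and debate) these numbers.
WEIGHTS = {
    "recency": -0.05,         # older posts score lower
    "closeness": 2.0,
    "engagement": 1.5,
    "misinfo_penalty": -10.0,
}

def feed_score(post: Post) -> float:
    """Score a post for ranking in the newsfeed (higher = shown sooner)."""
    score = (
        WEIGHTS["recency"] * post.recency_hours
        + WEIGHTS["closeness"] * post.friend_closeness
        + WEIGHTS["engagement"] * post.engagement_rate
    )
    if post.flagged_misinfo:
        score += WEIGHTS["misinfo_penalty"]
    return score

# Example: rank two candidate posts for a user's feed.
posts = [
    Post(recency_hours=2, friend_closeness=0.9, engagement_rate=0.1, flagged_misinfo=False),
    Post(recency_hours=1, friend_closeness=0.2, engagement_rate=0.8, flagged_misinfo=True),
]
ranked = sorted(posts, key=feed_score, reverse=True)
```

Even a toy example like this shows what transparency would buy us: if the weights were public, we could actually argue about whether "engagement" should count for so much, or whether the misinformation penalty is big enough.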
What do you think?