
Are customized search results polarizing opinions?

Started by Mesozoic Mister Nigel, January 29, 2017, 09:16:29 PM


Mesozoic Mister Nigel

Yes. Yes, they are.

https://www.splcenter.org/20170118/google-and-miseducation-dylann-roof

This article highlights one of the many problems with Google's search ranking: the site preferentially ranks results it thinks you will "like," based on which sites you spend the most time on, so it reinforces whatever the user already believes and exacerbates confirmation bias. This is why, when debating people about certain topics, it often appears as if they literally exist in a different world with different facts. Google is helping to polarize public opinion on controversial topics.

I think that search engine customization is, potentially, going to continue to widen social gaps and badly needs to be regulated. It means that the primary gateway to information for all Americans is being selectively censored - soft censorship via result ranking, but censorship nonetheless - per individual. That censorship will naturally fall along socioeconomic lines, which means, literally, that the information Google presents to a poor black user will be qualitatively different from the information Google presents to a wealthy white user.

Search results are customized based on past searches and the time spent on web pages. Therefore, customized search results are tailored to an individual's previous exposure to websites. The filtering is not purposely based on race or class, but the natural net effect is the same as if it were. Essentially, someone who reads about alternative medicine will get more alternative medicine websites, and when they search a topic, those websites will be prioritized in their results. Over time, the cumulative effect is that they are less likely to be exposed to consensus science and medicine, and may even be functionally unable to find articles that offer science-based views because their results are so heavily weighted toward woo. A black person is more likely to gravitate toward articles, and therefore sites, that support their perspective; likewise a white person, as exemplified by Dylann Roof in the linked article. The net result is that individual customization of search results reinforces, and even creates, polarization around ideas and values. In my opinion, this is profoundly unethical, harmful, and a threat to social stability.
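
To make the mechanism concrete, here's a toy sketch of history-weighted re-ranking. Every domain, number, and weight below is invented for illustration; this is nothing like Google's actual (and non-public) ranker, just the shape of the feedback it creates.

    # Toy sketch of history-based personalization -- purely illustrative.
    from collections import defaultdict

    # Minutes the user has spent on each domain: the "history" signal.
    time_on_domain = defaultdict(float, {
        "altmed-wellness.example": 340.0,
        "sciencedaily.example": 5.0,
    })

    def affinity(domain):
        """Crude affinity: more time spent on a domain, bigger boost."""
        return time_on_domain[domain] / (time_on_domain[domain] + 60.0)

    def personalized_rank(results, history_weight=0.5):
        """Re-rank (base_relevance, domain) pairs, blending in affinity."""
        def score(result):
            relevance, domain = result
            return ((1 - history_weight) * relevance
                    + history_weight * affinity(domain))
        return sorted(results, key=score, reverse=True)

    results = [(0.9, "sciencedaily.example"), (0.6, "altmed-wellness.example")]
    print(personalized_rank(results))
    # The less relevant alt-med page now outranks the science page:
    # the bubble in twenty lines. Raise history_weight and it worsens.

Note that nothing in that loop ever pushes back: time spent only ever adds weight, so the ranking can only drift toward what you already read.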
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


minuspace

Quote from: Mesozoic Mister Nigel on January 29, 2017, 09:16:29 PM
snip

Ethically difficult question.  Once previous results start informing the things for which I search, then it is difficult for me to hold Entity culpable of providing me with what I want.  Would have to raise question to second order: is this really what I want to want?  Then polemic becomes similar to that of addiction.  Should I be free to engage in an activity that may prevent me from thinking/acting freely?  At first glance it may seem logically unethical/irrational to support such contradiction.  The problem is that most are not willing to accept that they may engage in an activity that robs them of their ability to do otherwise.  In particular when the mechanism is such that the subject is made to believe that they formulated and reinforced the point of view for themselves, they will take their bias very seriously.  A wrong idea that nonetheless confirms that bias will be held in higher esteem than any incongruent truth.  Unfortunately this kind of understanding usually requires experiencing something like Trump becoming President of the United States.

Mesozoic Mister Nigel

Quote from: LuciferX on January 29, 2017, 09:55:39 PM
snip

I can't tell if this is satire, or actual philosophical masturbation. If the latter, you left out consent.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


Faust

One of the worst ways it directly impacts me: when I've been searching for C# or other code-related stuff at work, I come home and my searches start pushing those results.

I've started using DuckDuckGo at home and Google at work.

tyrannosaurus vex

This is absolutely a problem. I don't think anyone on a design team has really considered what a horrifying problem it is. Understandably, they are solving a problem, and solving it extremely efficiently, but doing so establishes a dangerously myopic pattern. Policymakers aren't really at fault for it, because they're so woefully uneducated in these technologies that they themselves are very likely to be just as victimized by the phenomenon as anyone else. It takes a truly analytical mind, or constant exposure to the mechanics of this technology, to even notice that it's happening, much less divine its context and effects or formulate some kind of response to it.

And it isn't limited to search engines. Because nearly all social media interactions are recorded and fed into these algorithms, the existing tendency of people to surround themselves with like minds and submerge themselves in impenetrable echo chambers is magnified. The entire system of search engines, social media outlets, and targeted ads becomes a self-healing network that not only reinforces existing opinions but, by exposing those opinions only to compatible opinions, leaves extremization as the only viable direction for ideas to evolve. So not only are we being corralled into communities where we are unlikely to encounter information that contradicts any false beliefs we have, we are also being pushed into accepting more and more extreme iterations of those beliefs. And because of the way the algorithms work, the more inquisitive and curious you are to discover and counter this phenomenon, the harder you'll be pushed.
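
As a toy illustration of that ratchet (every number below is invented; only the dynamics matter):

    # Toy simulation of the feedback loop: the feed offers items slightly
    # past the user's current position, and the user drifts toward what
    # they're shown. All parameters are made up for illustration.
    import random

    random.seed(42)
    position = 0.1  # stance on some axis: 0 = center, 1 = extreme
    for step in range(1, 51):
        # Engagement peaks just past the current position, so the
        # "algorithm" serves items clustered slightly beyond it.
        shown = min(1.0, position + random.uniform(0.0, 0.1))
        # The user assimilates part of what they're shown.
        position += 0.3 * (shown - position)
        if step % 10 == 0:
            print(f"step {step:2d}: position = {position:.2f}")
    # Position only ratchets outward, because each round of "you might
    # also like" is computed from the previous round's drift.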

So, is this happening? Definitely. What can we do about it? I have absolutely no idea. "Trolling" works to at least break into echo chambers, but the immune systems of these communities tend to eventually reject the troll and plaster over the wound with even more extreme beliefs. Memes can spread, but are mostly limited to preaching to the choir. I believe (though, lacking any real education in sociology, I'm likely very wrong) that the only thing that can really have an impact is some organization dedicated to learning about the algorithms that reinforce this ecosystem and designing memes that are capable of spreading across ideological boundaries. What that would look like (or whether it's even a meaningful idea) I don't know.

Mesozoic Mister Nigel

If I were a social scientist, I could write a thesis on it and perhaps expand it into a book, which might reach enough people to exert pressure to regulate search customization.

But I'm not. Though I'm thinking that epidemiology might be a good direction to go in for the public-policy leverage.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


00.dusk

I've had a good Hard Think about this.

I think the issue is there's a huge Market incentive to keep things as they are, and the extremists appropriate or disavow anything that makes it through the informational/ideological firewall. Where search results split along class/race/gender lines, you might be able to make a dent with stuff like V3X's meme concept, but I don't know if that would be enough. In fact, I'm pretty sure it wouldn't be -- a meme passes minimalist, nearly content-free information, and otherwise the divide abides. That does nothing to mix the info up the way it should be mixed up.

The only real solution in such a case is to torch it. Everyone gets the same search results, period, no matter who they are. I don't see any way around it. I also know that won't happen because the Market has Demands and they shall be met. It'd take a big push to overwhelm that juggernaut.

All I can think of? There should be classes. "How to use Google-fu to bypass unintentionally self-imposed information firewalls on search results." That's something you could spread, if you found the right person and the right vehicle to disseminate it. And that might be useful, in a sense. It wouldn't be enough, but I can't think of anything that would be that's also in the ballpark of our personal little bad future.
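
For what the curriculum might cover: the basic operators already exist and work on Google today (behavior shifts over time, so treat these as examples, not gospel), and searching signed out in a private/incognito window strips most of the history signal:

    "measles vaccine"               exact-phrase matches only
    melatonin site:nih.gov          restrict results to one site
    cholesterol -site:mercola.com   exclude a site you keep being fed
    fluoride filetype:pdf           only PDFs: papers, reports
    vaccination OR immunization     match either term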

tyrannosaurus vex

As an aside, beyond the horrific implications of this, I think it's fascinating that we are now very much constructing The Machine™ in a very literal sense. Thousands of people are intimately aware of its exact source code and how it works, and yet probably very few people see what it is, or what it does in its real social context. I personally know a few people who work on exactly these types of systems, and the last thing they want to do is reinforce injustice or stop the flow of information. In their minds, they are doing what they do precisely because information does not flow freely enough, or because the right information doesn't make it to the right people. I've mentioned this sort of effect to them before, not exactly in these terms but along the same lines, and their response is the sort of "can't see the forest for the trees" thing that you'd expect from people who aren't negatively impacted by any of this, at least not visibly. They can find anything they want on Google, even things that Google would otherwise bury somewhere on page 60, but they're so fluent in that strange Google Search language you pick up after a while that it's second nature to them, and it doesn't really register with them that there are millions of people who aren't that intentional with their searches.

It's possible that it's more than just the way Google handles information as a clearinghouse. Like the particular, nuanced syntax for telling Google what you're looking for, which escapes most people, our information-overload society is forcing the evolution of information itself, so that metadata and context are now as integral to messaging as the content. The way that information flows from point to point is becoming a language of its own. Google could make every effort to design its algorithms so that conflicting or "verified" information overrides at least part of the user's history and preferences (and at least some people at Google now recognize the need for that and are actively working toward it), but unless the user is at least somewhat fluent in the methodology of searching intentionally, and actually chooses to use it, any aberrant information will be dismissed and forgotten anyway.
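
The kind of override I mean could be as crude as a weighted blend. This is a hypothetical sketch, not anything I know about Google's internals; every weight and score below is invented:

    # Sketch: blend a source-authority signal into the personalized
    # score so vetted pages can't be buried by affinity alone.

    def blended_score(relevance, affinity, authority,
                      w_affinity=0.3, w_authority=0.5):
        """Convex blend of relevance, user affinity, and source authority."""
        w_relevance = 1.0 - w_affinity - w_authority
        return (w_relevance * relevance
                + w_affinity * affinity
                + w_authority * authority)

    # A fringe page the user loves vs. a vetted page they never visit:
    fringe = blended_score(relevance=0.6, affinity=0.9, authority=0.1)
    vetted = blended_score(relevance=0.8, affinity=0.0, authority=0.9)
    print(f"fringe={fringe:.2f}  vetted={vetted:.2f}")
    # fringe=0.44, vetted=0.61: with authority weighted in, the vetted
    # page survives. Set w_authority=0 and affinity decides: the status quo.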

I'm probably getting a little woo-woo over it at the moment so I'm going to spend more time thinking.

Awesome topic to bring up though!

00.dusk

Quote from: V3X on January 30, 2017, 04:24:24 AM
snip

I'm probably getting a little woo-woo over it at the moment so I'm going to spend more time thinking.

This doesn't seem woo-woo at all -- it's a lot like what I was saying. Realistically, the sensible solution (without trying to say that the Murikin Free Market needs to not let those silly consumers box themselves in informationally) is education on a level that can be spread around -- e.g., teaching search engine syntax and how to parse what a given set of search results tells you about the quality of the search terms you used.

It is fascinating that these things are happening, and I can see how you'd get the woo vibe from that kinda masturbatory this-is-so-cool "look" to it -- but when you frame things like that for yourself you're suggesting a problem space, and that gives you an idea of some of the difficulties you can work around and how to go about it, which is just as important as actually finding those solutions.

You can find a needle in a haystack quite easily if you know roughly where in the haystack it should be.

tyrannosaurus vex

Quote from: 00.dusk on January 30, 2017, 04:38:47 AM
snip

I know it's not woo in that it's technically relevant information, but it's a little woo in that it isn't very "actionable" (to use a word I hate). It's easy to jump into the deep end and start exploring and describing the problem rather than get one's bearings and find the way to an objective. We could start training people to search better, but in order for that to do any good, they need to realize there's a problem with the information they're already getting. Maybe that's where the "memes" would come in (and by "meme" I mean any contagious thought or action, not necessarily the classic "picture with text overlay" media).

00.dusk

I know what's meant by meme, for the record. Not getting snippy, just establishing as a baseline that I'm about where everyone else on PD is in terms of understanding.

From where I'm at, there's no good solution that presents itself. The problem is enormous, ridiculously huge, and you'd have to move a lot of really dumb people. Like, start with Trump's current supporters, as in the ones who make up the positive portion of his incredibly low approval rating -- they're fanatics, by and large. They're not necessarily the type of person we want to reach, but take it as an example of the difficulty of it.

Now you tell them, somehow, that their methods of getting information are flawed. Yeah? And they'll either agree with you by way of assuming the message isn't meant for them and ignore it ("yup and that's why I get my news from Alex Jones!") or they'll disagree with you viciously and ignore it ("brainwashed by the MSM!"). Either way, you have failed. That's with the meme method.

The thing about the teaching method without the meme is that you can frame it in terms of finding out about music, books, movies, etc., or finding people they like on Facebook. Then it's "neutral" to take in, but it can (and usually will) be applied to everything, because it increases the quality of information across the board. The True Believers are still genuinely beyond reach, sure. But the curious, disillusioned, and highly misinformed have a chance at realizing they're still getting fed trash -- which beats the alternative, where they do what they've been told is the right thing and either dismiss it outright or agree with what they think it says without examining it, for a net change of zero.

If you think you can get behind an automated defense system like that with the right meme, then that's great. We need something like that desperately. But I've tried with a fair few of these people and their thought processes on these matters are impenetrable. Bulletproof. I have no reason to believe extremists of any other stripe would be any easier to properly break open.

Now when you leave the extremist camp, you hit the other side of the issue, socioeconomic/racial/gender/LGBT/etc divides. Which Nigel mentioned in the first post and which is, absolutely, equally important. And there, yes, a meme could do some good paired with the teaching, definitely. But if you can find the right vehicle to reach out (think of how viral stuff starts -- the exponential point happens after a Big Name tosses it out there), the meme either becomes unnecessary or it has become the vehicle.

I'm just not used to the idea of educational memes. Generally they're about entertainment or the Weird, and I can't think of a way to frame this as either of those. Can you think of any examples of educational memes? Genuinely curious, they could provide a model to work with.

tyrannosaurus vex

I don't actually think memes will be very useful except maybe as a way in. After that, it's done all it's going to do and pushing it will only -- at best -- replicate the foothold. It won't convey any content. At worst, it'll overstay the entire method's welcome and implode the whole attempt.

What you say about framing the training around people's existing interests is interesting, because that's exactly the basis for the system as it is now. I suppose if time, expertise, and other resources were no issue, it might be possible to use the algorithms against themselves somehow. This approach wouldn't actually work as stated here, but imagine some dedicated effort to convince Google or Facebook algorithms that right-wingers really loved reading Snopes, or were huge fans of the novels 1984 and Brave New World. I see a possible way of "infiltrating" the identity of a group and subtly tweaking the established "you might also like" defaults. Of course, to be any good at it you'd need a huge number of simultaneous users doing this on purpose and a deep understanding of how the algorithms collect and consume information.
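
To see how small that lever could be in principle, here's a bare-bones toy of a co-occurrence recommender. Real systems are vastly more complex and defended, and the sites and counts below are invented for illustration:

    # Toy "you might also like": recommend whatever co-occurs with what
    # a group already consumes, then watch coordinated histories shift it.
    from collections import Counter
    from itertools import combinations

    histories = [["breitbart", "infowars"]] * 100  # the group's organic habit
    histories += [["breitbart", "snopes"]] * 40    # the coordinated campaign

    cooccur = Counter()
    for h in histories:
        for a, b in combinations(sorted(set(h)), 2):
            cooccur[(a, b)] += 1

    def also_like(item, k=2):
        """Items most often consumed alongside `item`."""
        scores = Counter()
        for (a, b), n in cooccur.items():
            if a == item:
                scores[b] += n
            elif b == item:
                scores[a] += n
        return scores.most_common(k)

    print(also_like("breitbart"))
    # [('infowars', 100), ('snopes', 40)]: forty coordinated accounts
    # pushed snopes into this toy group's "also like" slot.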

00.dusk

I think I need to sleep on this. I'm not currently able to articulate my problem with memes as a way in, I'm just repeating myself in ever more lengthy words.

That's a me problem, not an argument/discussion problem -- this will be productive, I'm just not helping it along right now.

I'll come back to it though.

minuspace

Quote from: Mesozoic Mister Nigel on January 29, 2017, 10:09:32 PM
snip
I can't tell if this is satire, or actual philosophical masturbation. If the latter, you left out consent.
Consent would require discernment, which my penis insists would not necessarily result in satire.

P3nT4gR4m

"fragile minds can be shaped by the algorithm that powers Google Search."

Another attempt to blame something for the fact that human compute is piss poor on a good day. It's always Google or Facebook or television or rock'n'roll or magazines promoting waif-like models...

If you examine everything except the problem, don't be surprised when your solutions don't fix it. :roll:
