
How Filter Bubbles Manipulate What Information You See


Have you noticed that you receive different search results based on your search history? Location? Device? And what seems to be a combination of other factors?

Have you recently searched for one thing extensively in a short period of time, and now somehow everything you search for is tied to that one topic?

Have you noticed that you liked one user’s post, and now the same users/pages keep popping up in your news feed?

(Along this same vein, have you noticed that your social media feed reverts to some version of “Most Liked by You” no matter how many times you’ve set it to show “Most Recent?”)

Well, my friend, you’re observing filter bubbles at work.

Filter bubbles plague the big search engines and social media alike. They’re often subtle, and it can be difficult to even realize you’re trapped in an isolated e-world that’s tough to break out of.

But what exactly are filter bubbles? How do they work? What effects do they have on your privacy and the availability of information across the internet?

I’ll explain everything here, so let’s get started.

“Filter bubble” is a term coined by internet activist Eli Pariser, who popularized it in a TED Talk.

(He even wrote and published a book about it, The Filter Bubble: What the Internet Is Hiding from You.)

Filter bubbles refer to the personalized results that algorithms produce as we go online, such as on Google, Facebook, and other social media sites.

Algorithms are large, complex computer programs. They often take many input factors that influence the “answer” they put out.

These bubbles create online results and suggestions that correspond to our browsing history and other data points such as gender, age, interests, and other personal information.
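To make this concrete, here is a deliberately simplified, hypothetical sketch of how such an algorithm might score content against a tracked user profile. Real ranking systems are proprietary and vastly more complex; every name, topic, and weight below is made up for illustration:

```python
# Hypothetical sketch of a personalization algorithm. Real ranking
# systems are proprietary and far more complex; the topics and
# weights here are invented purely for illustration.

def personalization_score(item_topics, user_profile):
    """Score an item by how well its topics match the user's tracked interests."""
    return sum(user_profile.get(topic, 0.0) for topic in item_topics)

# A user profile built from tracked behavior: topic -> interest weight.
profile = {"privacy": 0.9, "sports": 0.1}

items = [
    {"title": "VPN review", "topics": ["privacy"]},
    {"title": "Match recap", "topics": ["sports"]},
    {"title": "Tracker study", "topics": ["privacy", "sports"]},
]

# Rank the feed: the more detailed the profile, the narrower the top results.
ranked = sorted(
    items,
    key=lambda item: personalization_score(item["topics"], profile),
    reverse=True,
)
print([item["title"] for item in ranked])
# → ['Tracker study', 'VPN review', 'Match recap']
```

Notice that content outside the tracked interests (here, anything unrelated to “privacy”) scores near zero and sinks to the bottom of the feed; that is the filtering effect in miniature.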

Filter bubbles are the result of a huge push towards “personalization” (and don’t forget, hyper-personalization is a thing too.)

In short, filter bubbles, or “echo chambers,” curate a personalized library of information designed to entice us to consume. Hence the name filter bubble: filtered and isolated.

These filter bubbles can affect our society by confining people within cages of information that distort opinions to the point where they lose some touch with reality.

1. They violate your privacy

Filter bubbles feed on the fruits of harvesting as much information about you, your habits, and your devices as possible.

You can think of filter bubbles as one of the “products” that come from fingerprinting your devices and tracking you across the internet.

Filter bubbles essentially aggregate what information they have on you. They also directly take part in tracking you themselves.



All this info gets combined via complex algorithms, which spit out the most “appropriate” filter bubble to “trap” you in.

This filter bubble you find yourself in will limit your internet experience to what the Big Data algorithms think suits you. It makes exploring new viewpoints, interests, and sources difficult.

What sucks even more is that the more you interact (read: engage) with the content that is spit out by the algorithm, the “stronger” it gets.

This is especially true if you don’t take measures to preserve your privacy.

You’ll find that even the most basic measures, such as using a privacy browser with privacy-friendly plugins, help curb the strength and influence of filter bubbles.

So, you can also say that, in addition to exploiting information already collected about you (from other sources) and directly collecting information on you, filter bubbles encourage more aggressive tracking techniques.

As you can probably guess, more aggressive tracking techniques spell bad news for privacy.

2. They manipulate what information you come across

Filter bubbles masquerade as “personalization just for you,” but they take the concept of personalization way too far. For one, personalization options tend not to greatly affect the information you see without direct input from you.



You can also opt out of personalization options – but you can’t necessarily opt out of filter bubbles.

Think about it; how many times have you set your Facebook or Twitter feed to “Most Recent” but it somehow and some way reverts to the equivalent of “Things You May Like?”

Why?

The end game is consumption and engagement.

See, the big money from filter bubbles isn’t so much in raw numbers or volume anymore.

It’s in consumption, which is normally measured in clicks (particularly for search engines). For social media, it can be measured in “likes” or comments.

How do they make their money?

Well, big shock – these consumption or engagement factors get sold to advertisers.

Additionally, your profile (including your interests and “likes” or clicks) gets sold along with your consumption/engagement metrics.

The more they can link who you are, how you use the internet, what you like, what you believe in, among many other factors, the more advertisers will pay to be able to “target” you.

The more you “engage,” the more of that same or similar content you “engaged” with gets presented to you. This includes paid (ads), promoted, and “regular” content.

Additionally, you’re more likely to engage with content that you “agree” with or have interacted with before. This contributes to an effect known as confirmation bias.

Confirmation bias: bias that results from the tendency to process and analyze information in such a way that it supports one’s preexisting ideas and convictions.

Eventually, you’re locked into a cage where only the types of information you have “liked” or “engaged” with get presented to you. This only worsens the confirmation bias. Hence why filter bubbles are the equivalent of “echo chambers.”
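This feedback loop can be illustrated with a toy simulation. To be clear, this is not any platform’s actual algorithm – it is an invented, minimal model where each “engagement” slightly boosts a topic’s weight, so the feed drifts away from a balanced mix:

```python
# Toy simulation (hypothetical, not any platform's real algorithm) of the
# engagement feedback loop: every interaction boosts a topic's weight,
# so what gets shown drifts toward what was already shown and engaged with.

import random

random.seed(42)  # fixed seed so the simulation is repeatable

topics = ["news", "sports", "cooking", "tech"]
weights = {t: 1.0 for t in topics}  # start with a perfectly balanced feed

def pick(weights):
    """Serve a topic with probability proportional to its current weight."""
    return random.choices(list(weights), weights=list(weights.values()))[0]

for _ in range(200):
    shown = pick(weights)
    # The user "engages," and the algorithm reinforces that topic by 5%.
    weights[shown] *= 1.05

total = sum(weights.values())
shares = {t: round(w / total, 2) for t, w in weights.items()}
print(shares)  # the feed ends up skewed toward whichever topics got early engagement
```

The rich-get-richer dynamic here is the point: topics that happen to get engaged with early are shown more, which earns them more engagement, which gets them shown even more – an echo chamber emerging from a perfectly balanced start.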

When this goes on for long enough, your views of the world get distorted. After all, how could it not be distorted when you’re looking through the equivalent of a peephole (read: filter bubble)?

3. They endanger a free and open internet

This is perhaps the most devastating and lasting effect of filter bubbles…

It is the result of the combination of the extent to which they violate users’ privacy and their ability to manipulate what information you see and consume.

As you know, filter bubbles isolate you. They put you into an echo chamber. They erase the big picture.



This stifles any connectedness we get from using the internet.

It erodes the internet as the greatest tool for sharing information because it silently traps you in this vacuum of what an algorithm thinks is relevant to you, based on what you’ve searched for and interacted with before.

Because of this, the repeated confirmation bias filter bubbles create makes it easier to launch disinformation campaigns.

Everything compounds and contributes to the polarization of opinions. Since filter bubbles wipe out the big picture, they also eliminate the gray areas that exist across many issues we face today.

It simply goes against “sharing information freely” on the internet.

There’s bad news: filter bubbles are hard to directly fight against.

The most important step to take in fighting against filter bubbles is being aware that they exist, and that they do affect you and your online experience.

The best way to fight against filter bubbles is to take steps to protect your overall privacy on the internet.



When you protect your privacy, there is less information being collected about you. This in turn weakens the algorithms that filter bubbles rely on.

(The less information the biggest filter bubble platforms have, the less “accurate” the filter bubble will be.)

There are many ways to protect your privacy on the internet. Many of the solutions out there are “relative” to your own personal threat model.

If you don’t know where to start, you should at minimum consider:

  • Using different search engines when searching for something (ideally, you should be using private search engines).

You should keep use of big-data-collecting search engines such as Google and Bing/Yahoo! to a minimum. Using private search engines helps you break free of the filter bubble you’ll no doubt find yourself in when using the big search engines.

You should also install privacy-friendly add-ons and minimize the data your browser may leak – such as your internal IP address being revealed via WebRTC leaks.

  • Not using social media accounts to log into different websites, web apps, or web services.

For example, if you use your Facebook login to log into your Spotify account – you best believe Facebook is tracking you and collecting whatever information it can about how you use this linked account.

  • Logging out of accounts – especially social media accounts – when finished. When logged in, social media trackers can tie your browsing activity directly to your social media profile.
  • Using private email services. Big data email providers such as Outlook (when synced) and Gmail scan your emails. This data in turn gets put into a shadow profile of your activities; it also gets sold and used in “hyper-personalization” and oddly specific ad re-targeting campaigns.

(Additionally, you may want to consider using multiple email accounts for different things – this is called compartmentalizing.)

  • Clearing your browsing history and cookies regularly.

Ultimately, fixing the filter bubble truly rests on Big Tech/Big Data companies.

A good start would be making the algorithms transparent to the rest of us. Currently, they treat their algorithms (especially those that contribute to the filter bubble problem) as “trade secrets.”

Listen, no one is expecting Big Tech/Data to publish their algorithms on a place like GitHub, plain as day, for all to see and copy.

However, it would be insanely helpful for all of us threatened (or trapped) in the filter bubble to know what information they are using, and to know what rules they do (or don’t) follow.

Another good start would be to plainly honor individuals’ preferences.

Mainly, if I choose to “not participate” in a “personalized experience,” Big Tech should stop going covert in order to still serve me “personalized content.”

It’s just like telling a salesperson you’re not interested, only for them to follow you around as you go to different stores. Oh, and they’re constantly giving you their sales pitch and trying to shove their product materials into your hands and face.

These companies should stop trying to use underhanded tactics when it comes to “personalization.”

For example, if you set your feed to show all most recent posts, it shouldn’t silently revert to “What You May Like” hours/days/weeks later.

I get that this may put a dent in the huge profits they rake in, but when your company is credited with helping shape the internet as we know it, I think it’s fair to hold you to a different standard.

(And before they blame “updates,” I’m hard pressed to believe that every single minor update to a platform magically resets everything.)

Filter bubbles are invisible and it’s important to recognize them for what they are.

They have the capability of polarizing opinions, facilitating the spread of misinformation, and artificially limiting the openness of the internet.

They also contribute negatively to the issue of online privacy – they encourage the continued tracking of users by Big Data and Big Tech.

The threat of filter bubbles is real, so it’s also important to take measures to simultaneously protect your privacy and eliminate the filter bubbles’ effectiveness.

So with that said, as always, stay safe out there!


Source: avoidthehack.com
