Social Cooling - and the long-term effects it might have.

Chalice Yao

The Purple
Joined
Sep 20, 2018
Messages
139
Location
Somewhere Purple, Germany
SL Rez
2007
Joined SLU
Dec 2007
SLU Posts
9108

A website that explains, in simple terms, the long-term effects of Big Data: the constant monitoring of social media, the funnelling of our daily lives into big databases for profit, and the resulting real-life consequences for... almost everything. Sources for some of the points are linked on the site.

People change their personal and social behavior when they are being watched by strangers. We put on masks, some bigger, some smaller.
What happens in a society where this extends, more and more, to moments when you are the only actual person in the room?
And ultimately, are we losing the right to make human mistakes, because of cold Big Data algorithms affecting our scores?


EDIT:
To add, I found the site through the comments on this particularly interesting story: Bank of America CEO: 'We Want a Cashless Society' - Slashdot
I think it's not hard to imagine the... advantages that completely trackable transaction details for even the smallest of purchases might offer certain businesses.
 
Last edited:


Ashiri

√(-1)
Joined
Sep 20, 2018
Messages
373
Location
RL: NZ
SL Rez
2007
SLU Posts
-1
While most of my purchases are cashless, I still need coins for a number of reasons: laundry, parking, and vending machines, of which only the latter two have any cashless option. I also like to carry cash for those EFTPOS failure scenarios.
And ultimately, are we losing the right to make human mistakes, because of cold Big Data algorithms affecting our scores?
I would say that is likely, and it is probably going to be worse for younger people and 'early adopters'.
The other factor will likely be suspicion of those with sparse social media profiles.
 

Rose Karuna

Old and Cranky Xherdan
VVO Supporter 🍦🎈👾❤
Joined
Sep 24, 2018
Messages
332
Location
Someplace new soon
SL Rez
2005
Joined SLU
2007
I work in the data protection industry. The amount of information collected on people is staggering and I agree that it is changing our society and not for the better.

Algorithms are being used to determine how much we pay for insurance (more if your credit score is low, even if your driving record is clean), or what the terms of our loans will be, or what kind of political messaging we’ll receive. There are algorithms that find out the weather forecast and only then decide on the work schedule of thousands of people, laying waste to their ability to plan for childcare and schooling, never mind a second job.

Their popularity relies on the notion they are objective, but the algorithms that power the data economy are based on choices made by fallible human beings. And, while some of them were made with good intentions, the algorithms encode human prejudice, misunderstanding, and bias into automatic systems that increasingly manage our lives (my own comment: not to mention bad or inaccurate data that was collected).

Like gods, these mathematical models are opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, are beyond dispute or appeal. :qft: And they tend to punish the poor, [ill] and the oppressed in our society, while making the rich richer. That’s what Kyle Behm learned the hard way.
How algorithms rule our working lives | Cathy O’Neil

I have been telling my family and friends this for a long time. None of them could believe that anyone could collect such a staggering amount of data on them, or that, even if they did, they could use it. Well, guess what - they can and they do.
 

Noodles

Queen of Ramen
Joined
Sep 20, 2018
Messages
382
Location
Illinois
SL Rez
2006
Joined SLU
04-28-2010
SLU Posts
6947
I kind of wonder how well it handles multiple "masks" and profiles from the same people, and how well it even identifies them. Like I sort of have a whole "Second Life" thing. I have my "regular online" way of doing things. I have my "personal/family" way of doing things. I have my "personal/friends" way of doing things. I have my "work/business" way of doing things.

A lot of these have separate accounts, for my own sanity and to avoid the annoying filtering so many sites use.

These don't always or often overlap and frankly, can sometimes be contradictory.
 

Innula Zenovka

Nasty Brit
VVO Supporter 🍦🎈👾❤
Joined
Sep 20, 2018
Messages
2,294
SLU Posts
18459
There is another side to this, though, at least when it comes to credit scoring.

My last serious involvement with this area was about 20 years ago, when automated credit scoring and automated Customer Case Management were first becoming a thing, and the big selling point then was that the more data you collected the more personalised a product you could offer.

So if someone shows up as a good credit risk, they're likely to get a better interest rate/higher credit limit than otherwise they might, and if they're seen as a bad credit risk, then the interest rate goes up.

This sounds unfair, though I've never quite seen it that way, in that lending someone money is always, quite literally, a gamble, and I'm certainly more likely to lend money to a friend I know will pay me back than to a friend I know is a bit of a deadbeat, just as I'm more willing to bet on a horse with a good track record than on one that keeps losing races.

The difference between me and the credit card company, though, is that it's my money, so if my deadbeat friend needs the money badly enough and I can afford it, I'll do all I can to help out, knowing that I probably won't see the money again. But the credit card company are lending out other people's money to people with whom they have no bonds of friendship, so they're no more likely to lend than is a local shop likely to advance credit to a customer they don't know (or to someone they do know who has previously failed to clear their slate).

But the point is that, automated credit scoring or not, a bank or a credit card company has to have some means of assessing how risky it is to lend money -- other people's money -- to somebody, and their default position without having any data on someone is going to be to refuse.

They don't want to have to turn people down -- they make no money that way -- and people in need of credit wouldn't have applied in the first place if they didn't want it, so to my mind everyone wins if, because of the data the credit card people hold, someone in need of a loan gets one.

We need to remember that the alternative to borrowing from the credit card company, if you need the funds badly enough, is generally to borrow from a payday loan company or a loan shark, or to do without something that you or your family quite possibly desperately need.

So the more data that's available about you, the more likely you are to get the loan you so urgently want.

The credit scoring system won't charge you high interest rates simply because they want to gouge you, though it may feel like that, but because they know, from experience, that there's an n% chance someone with your profile will default on this unsecured loan, so they build that into the interest rates to cover the bad debts they know they'll have to write off.
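The arithmetic behind that n% figure is easy to sketch. The numbers and the `break_even_rate` helper below are invented for illustration, not any lender's actual pricing model; the sketch only shows how a predicted default rate translates into the interest premium needed to cover the debts that will be written off:

```python
def break_even_rate(default_prob, cost_of_funds=0.03, recovery=0.0):
    """Lowest interest rate at which expected repayments still cover
    the lender's cost of funds, given a predicted default probability.

    Per unit lent: the (1 - default_prob) who repay return (1 + r),
    defaulters return only `recovery`; break-even means the expected
    return equals 1 + cost_of_funds.
    """
    repaid = 1.0 - default_prob
    return ((1.0 + cost_of_funds) - recovery * default_prob) / repaid - 1.0

# A profile scored at 2% default risk needs only a small premium...
print(round(break_even_rate(0.02), 4))   # 0.051, i.e. about 5.1% interest
# ...while a 20% default risk forces the rate up sharply.
print(round(break_even_rate(0.20), 4))   # 0.2875, i.e. 28.75%
```

Nothing here says whether the predicted default probability is itself fair or accurate; it only shows why, once such a prediction exists, the quoted rate rises with it.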

I'm not defending the credit card and insurance companies (or anyone else), but I do point out there's another side to the story.

To my mind, the far larger concern is how companies use your data without your permission, but that's a whole different story. It's fortunately a matter that is far easier to deal with, if there's the political will, which is why the legal data protection regimes are so different on the two sides of the Atlantic.
 
Last edited:

Noodles

Queen of Ramen
Joined
Sep 20, 2018
Messages
382
Location
Illinois
SL Rez
2006
Joined SLU
04-28-2010
SLU Posts
6947
Not saying the site isn't mostly right, but it also almost feels like the idea is pushed by folks who tend to have "negative" social ideas. Like the types who complain that "people only have a rainbow Facebook icon to seem trendy, not because they care."
 
  • 1Thanks
Reactions: Innula Zenovka

Dakota Tebaldi

Well-known member
VVO Supporter 🍦🎈👾❤
Joined
Sep 19, 2018
Messages
1,627
Location
Gulf Coast, USA
Joined SLU
02-22-2008
SLU Posts
16791
Not saying the site isn't mostly right, but it also almost feels like the idea is pushed by folks who tend to have "negative" social ideas. Like the types who complain that "people only have a rainbow Facebook icon to seem trendy, not because they care."
Thanks.

That's my primary thinking on this. There's growing talk that people are being unfairly punished for "mistakes" that they should be entitled to forgiveness for. But in my opinion, there's a line between mistakes and deliberate bad acts and choices that people only act contrite about because they're forced to by exposure or backlash. And it's not a thin or fuzzy line at all; but there are increasing attempts by some with agendas to blur that line, and I am resistant to that.

I disagree with the way that website frames a lot of the issues. Online data collection and profiling, in my opinion, IS a problem; but the social reactions to the facts that are exposed in the data aren't what makes it a problem. In fact, the social implications aren't even new and specific to the online world.

Consider some of the problem scenarios the website brings up:

You may not get that dream job if your data suggests you're not a very positive person.

If you return goods to the store often this will be used against you.

If you have "bad friends" on social media you might pay more for your loan.

Tinder's algorithms might not show you attractive people if you are not desirable yourself.


It shouldn't be some grand revelation that every single one of these is really a corollary of a non-internet-based phenomenon. They seem scary because they're presented here as the results of cold and unforgiving computer algorithms, but in reality those algorithms reflect human decisions, attitudes, and tendencies that were true long before the internet came along. Before the internet, if you came off like Eeyore or Yosemite Sam during your job interview you might not get that dream job either. Shopkeepers' opinions of you dropped every time you returned an item and you might get a reputation that got passed along to other shopkeepers if it became a habitual thing. Your parents have always judged your friends - and anyway, how often have you heard someone tell you not to give any money to the beggar because "he'll just spend it on booze or drugs"? And unattractive people didn't get as many callbacks when they put their photos in magazine ads either.

Some other of the website's talking points:

Have you ever hesitated to click on a link because you thought your visit might be logged, and it could look bad?
Sure but when people entertain this line of reasoning they're thinking about looking bad to the police and the government, not their friends.

The pollution of our social environment is invisible to most people, just like air pollution was at first.
But it's not pollution.

Argue as much as you want that it's wrong; I'd mostly agree with you. But all the data collection isn't "polluting the social environment" with bad data the way air pollution is made up of bad things to breathe in. All it's actually doing is exposing more of who we really are as individuals, restricting our ability to lie to each other about ourselves. And that in turn means that others are free to react and form opinions of us based on who we really are, rather than the fictional person we'd prefer them to think we are. Even if you believe that we as humans have a right to project these fictions, or are healthier when we're able to, still - the truth is not pollution.

And finally we get to the premise:

As pressure to be perfect rises we will learn what privacy really is: Privacy is the right to be imperfect

When algorithms judge everything we do, we need to protect the right to make mistakes.

When everything is remembered as big data, we need the right to have our mistakes forgotten.
Which leads to my original argument - I think that what this idea really is at its core is that it's unfair to judge people for the things they choose to express, and particularly when they express things that people find REALLY abhorrent they should have the right to call those "mistakes" and have everyone ignore the fact that they said or did those things. I reject that. The pressure to not be a douchebag is not "pressure to be perfect".
 

Innula Zenovka

Nasty Brit
VVO Supporter 🍦🎈👾❤
Joined
Sep 20, 2018
Messages
2,294
SLU Posts
18459
My problem with this sort of profiling is that it risks institutionalising prejudice, and particularly when the scoring rules and algorithms are generated by AI.

That's because, if, for example, existing social prejudices tend to stigmatise and marginalise members of recognisable ethnic or religious minorities, then the AI will pick up on this, noticing that people with particular first or last names tend to get themselves arrested quite a lot, as do people who live in particular parts of town, and that they are also more likely than most to have irregular or insecure employment patterns, and act accordingly.
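A toy simulation makes that proxy effect concrete. Everything below is invented for illustration (the districts, the rates, the naive scorer): individuals in both districts behave identically, but heavier policing in district "B" produces more arrests there, and a scorer that only ever sees arrests per district reproduces the enforcement bias without ever being shown a protected attribute:

```python
import random

random.seed(0)

# Hypothetical enforcement bias: identical behaviour in both districts,
# but district "B" is policed more heavily, so arrests are 3x as likely.
ARREST_PROB = {"A": 0.05, "B": 0.15}

people = [
    {"district": d, "arrested": random.random() < ARREST_PROB[d]}
    for d in ("A", "B")
    for _ in range(10_000)
]

def risk_score(district):
    """Naive 'neutral' scorer: the observed arrest rate for a district."""
    group = [p for p in people if p["district"] == district]
    return sum(p["arrested"] for p in group) / len(group)

# The scorer never sees ethnicity, religion, or names, yet residents of
# the over-policed district come out roughly three times as 'risky'.
print(risk_score("A"), risk_score("B"))
```

The scorer is "fair" by its own lights (it only counts arrests), which is exactly the design problem: garbage in, institutionalised prejudice out.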

However, I would say that's a design problem rather than anything else, and best remedied by encouraging companies to guard against this, possibly by imposing on them a duty to nominate a director to take all reasonable measures to ensure their software, as well as their staff, understands all the fine words in the company's mission statement about equality and fairness and non-discrimination, and hitting both the company and the nominated director for large and well-publicised financial penalties and compensation if they fail to comply.
 

Katheryne Helendale

🐱 Kitty Queen🐱
Joined
Sep 20, 2018
Messages
2,148
Location
Right... Behind... You...
SL Rez
2008
Joined SLU
October 2009
I work in the data protection industry. The amount of information collected on people is staggering and I agree that it is changing our society and not for the better.



How algorithms rule our working lives | Cathy O’Neil

I have been telling my family and friends this for a long time. None of them could believe that anyone could collect such a staggering amount of data on them, or that, even if they did, they could use it. Well, guess what - they can and they do.
Yep. We've been monetized right down to our toenails. Yay capitalism.

I've become somewhat cashless. Being able to go to the store without stopping by an ATM first is pretty huge. However, I still need cash for certain transactions like bus fare and buying gas when there's less than $100 on my card.
 

Chalice Yao

The Purple
Joined
Sep 20, 2018
Messages
139
Location
Somewhere Purple, Germany
SL Rez
2007
Joined SLU
Dec 2007
SLU Posts
9108
However, I would say that's a design problem rather than anything else, and best remedied by encouraging companies to guard against this, possibly by imposing on them a duty to nominate a director to take all reasonable measures to ensure their software, as well as their staff, understands all the fine words in the company's mission statement about equality and fairness and non-discrimination, and hitting both the company and the nominated director for large and well-publicised financial penalties and compensation if they fail to comply.
The problem is that, while you are quite right, this also requires a big push from the political side to make that kind of thing law, or at least the basic safeguards and potential punishments for companies that violate it. The situation here in Europe is a bit better, not least because of our new round of privacy laws, but even here the politicians are a bit... shall we say, hesitant when it comes to really drawing a clear line.
But places like the US are officially hosed, especially if you look at the group of people who are in power right now.
 
Last edited:

Chalice Yao

The Purple
Joined
Sep 20, 2018
Messages
139
Location
Somewhere Purple, Germany
SL Rez
2007
Joined SLU
Dec 2007
SLU Posts
9108
Argue as much as you want that it's wrong; I'd mostly agree with you. But all the data collection isn't "polluting the social environment" with bad data the way air pollution is made up of bad things to breathe in. All it's actually doing is exposing more of who we really are as individuals, restricting our ability to lie to each other about ourselves. And that in turn means that others are free to react and form opinions of us based on who we really are, rather than the fictional person we'd prefer them to think we are. Even if you believe that we as humans have a right to project these fictions, or are healthier when we're able to, still - the truth is not pollution.

Which leads to my original argument - I think that what this idea really is at its core is that it's unfair to judge people for the things they choose to express, and particularly when they express things that people find REALLY abhorrent they should have the right to call those "mistakes" and have everyone ignore the fact that they said or did those things. I reject that. The pressure to not be a douchebag is not "pressure to be perfect".
The primary issue is not people being called out for being assholes. We were always fine with that (for the socially common definition of asshole at various times). The real issue is that the same monitoring and mechanisms can be used to deny you things simply because some people decided that your way of behaving (which is not necessarily bad at all) belongs, according to their algorithms, in their category of 'no-nos'. And the data points might increasingly become things that you do in private, without any social involvement. See: digital transactions.
Add to that the regular reports of databases of such things accidentally ending up leaked.

And it is always important to remember the most crucial thing:
If the political and social landscape changes, things that once were accepted...might not be anymore. And things that were once the sign of an asshole, might become your patriotic duty. It happened before.
Except when things change someday, the very mechanisms for monitoring and data collecting that are becoming normal now will still be in place.
And it will just be a change of algorithms for the new people in charge to misuse it all. And the data of your past actions will still be there. You might end up on the bad side of it all for things that you did 20 years ago, that were perfectly fine at the time.

The mechanisms that are being put into place right now more and more would once have given certain rather bad people wet dreams.
 
Last edited:

Rose Karuna

Old and Cranky Xherdan
VVO Supporter 🍦🎈👾❤
Joined
Sep 24, 2018
Messages
332
Location
Someplace new soon
SL Rez
2005
Joined SLU
2007
What happens when every single aspect of your life is monitored: your social interactions, your buying habits tracked down to the cookie, what you buy, what you read, who you talk with, your family, your friends, your political leanings, and your medical conditions?

Even your sewage: The Plan to Test Cities’ Sewage for Drugs Is a New Form of Mass Surveillance

Monitoring Employee Productivity: Proceed with Caution

Are Your Smarthome Devices Spying on You?

How Google and Amazon are ‘spying’ on you | Consumer Watchdog

How can this kind of constant monitoring of every aspect of your life not change who you are? One year from now I'm retiring. I'm very thankful for that; eventually corporations (if they haven't already) will lose interest in retirees and just let us be. Younger people though, my heart is with you, because I can only see society becoming more and more Orwellian.
 

OrinB

I'm here...
Joined
Sep 20, 2018
Messages
179
Location
UK
SL Rez
2007
Joined SLU
23 Sept 2009
SLU Posts
4771
While most of my purchases are cashless, I still need coins for a number of reasons: laundry, parking, and vending machines, of which only the latter two have any cashless option. I also like to carry cash for those EFTPOS failure scenarios.
You are more likely to need more than one card as some stores don't even accept cash anymore!
 

Kalel

hypnotized
VVO Supporter 🍦🎈👾❤
Joined
Sep 19, 2018
Messages
236
Location
Miami,FL
SL Rez
2006
Joined SLU
2010
SLU Posts
1965
You are more likely to need more than one card as some stores don't even accept cash anymore!
Went to the laundromat the other day... they all require reloadable cards... there's a machine on site to get a free card and reload it. Careful not to bend it... you're not getting your money back if the card is broken.

If you park downtown, there's a solar-powered machine you can pay at (card only)... you get a receipt to stick on your dashboard (or you can use the phone app and the number code posted on a convenient sign nearby). If you park in a public garage, you get a time-stamped ticket you can pay as you go out (the machine downstairs takes cash; the gate on the way out is card only).

Vending machines still take cash directly, but usually offer coupons and special deals via apps (download our app today! Get a free soda!). Most likely we already have the phone in hand, so it's easier to just slide it across the scanner than to pull out a crumpled dollar and fight the machine trying to get it in.

I'm surprised more companies aren't doing reloadable cards...
 

Innula Zenovka

Nasty Brit
VVO Supporter 🍦🎈👾❤
Joined
Sep 20, 2018
Messages
2,294
SLU Posts
18459
In other words we're returning to the norms of a small town. Everyone knows everything about you, including the time you ate paste in kindergarten, and the affair you had with the guy who delivers your gardening supplies, and that your uncle drinks too much.
I've been thinking about this, and I'm not so sure that's the position.

After all, "everyone" doesn't know this, at least not in the way that -- for example -- "everyone" (or at least everyone in the UK who follows the news) knows about Boris Johnson's somewhat eventful personal life and the fight he had on Friday night with his latest partner.

Rather, the computers that score people's credit ratings and so on have access to all kinds of information about us, but neither we nor, I think (and this is the important bit) anyone else has any sort of ready access to it, let alone knows how (if at all) it counts in the decision-making process.

Most of the time, all that anyone knows (or cares about) is that the credit/insurance/whatever application has succeeded (or hasn't). The reasons why are a closely guarded secret that can't readily be unravelled by anyone.

So while there may well exist sufficient evidence to infer that my uncle drinks too much (in the form of credit and debit card receipts, loyalty card records at stores, medical records, and possibly various fitness apps too), I think you'd need to know, first, who my uncle is and, second, to have the means and knowledge to collate all this data, as well as a reason to want to. It's not the same as everyone knowing about his drinking habits because there are only three bars and two liquor stores in town and his next-door neighbour frequently sees him come home drunk.
 

Bartholomew Gallacher

Well-known member
Joined
Sep 26, 2018
Messages
1,040
Most of the time, all that anyone knows (or cares about) is that the credit/insurance/whatever application has succeeded (or hasn't). The reasons why are a closely guarded secret that can't readily be unravelled by anyone.
Yes, the algorithms which decide that are indeed considered a business secret and are undisclosed. Some clever people, for example, have been trying to reverse engineer the algorithm of the famous SCHUFA, so far with mixed results.
 

Katheryne Helendale

🐱 Kitty Queen🐱
Joined
Sep 20, 2018
Messages
2,148
Location
Right... Behind... You...
SL Rez
2008
Joined SLU
October 2009
While most of my purchases are cashless, I still need coins for a number of reasons: laundry, parking, and vending machines, of which only the latter two have any cashless option. I also like to carry cash for those EFTPOS failure scenarios.
The laundry machines in my apartment complex are completely cashless. PayRange App
 
  • 1Agree
Reactions: Han Held