Our algorithm overlords

Damien Cosset - Nov 10 '19 - Dev Community

An Apple/Goldman Sachs tale

Last Thursday, David Heinemeier Hansson, aka @dhh on Twitter, ranted about an Apple product. Apple partnered with Goldman Sachs to launch a credit card. So far, nothing out of the ordinary; a lot of companies offer this sort of service. However, DHH was surprised to find out that his wife wasn't allowed anywhere near the credit limit he was authorized for. You can find the whole thread there if you want.

After numerous tweets calling out the behavior, we discovered it was all because of the algorithm: the program Apple/Goldman used to determine how much credit you were allowed. A lot of testimonies reinforced DHH's first impression: women weren't given the same credit limit as men, for no reason other than their gender.

Unfortunately, this is something we shouldn't be surprised about.

Algorithms and freedom

Nowadays, we live in a society where data rules everything. We log in to social media platforms and websites. We think those services are free, but they are not. Whenever we use these services, we agree to give away our data. The huge amount of data gathered by the big tech companies helps them refine their business model. Ads. Ads everywhere, all the time.

With more data, those big tech companies (Twitter, Facebook, Google...) can run better ads for you. They can analyse your behavior and make sure they create the right ad for you...

These datasets are then fed to programs created for a particular purpose. It can be ads, or it can be assigning a certain credit limit to people.

In DHH's quest to find out what was happening, one answer kept coming up: It's the algorithm... We have no control over it...

This is frightening... Apple and Goldman both claimed they have no control over the algorithm, and that they don't understand why results like these come up... Is this the world we want to live in? A world where humans give up their freedom of choice to machines and programs whose inner workings we no longer even understand?

Of course, some people know how they work... But my guess is there are so few of them that it's as if nobody does...

These programs are designed to answer a question or solve a problem. But phrasing the question or the problem is the most crucial part of the entire process. A biased question means discrimination in the end. And, unfortunately, we know that the teams creating those programs are not the most diverse... (cough rich white dudes cough). When the dust clears, it's always the same people who get hurt the most... The minorities...
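To make that point concrete, here is a deliberately simplified sketch. All the data and names are hypothetical (we don't know what Apple/Goldman's model actually looks like). The idea: a "model" trained on biased historical records will reproduce the bias, even if it never looks at gender directly, because a proxy feature correlated with gender carries the bias in.

```python
# Hypothetical historical records: (proxy_feature, credit_limit).
# "category_a" / "category_b" stand in for any feature that happens to
# correlate with gender in the biased historical data.
historical_records = [
    ("category_a", 20000), ("category_a", 18000), ("category_a", 22000),
    ("category_b", 2000), ("category_b", 2500), ("category_b", 1500),
]

def train(records):
    """A minimal 'algorithm': learn the average limit per proxy category."""
    sums, counts = {}, {}
    for category, limit in records:
        sums[category] = sums.get(category, 0) + limit
        counts[category] = counts.get(category, 0) + 1
    return {c: sums[c] / counts[c] for c in sums}

model = train(historical_records)
print(model["category_a"])  # 20000.0 -- the favored group keeps its high limit
print(model["category_b"])  # 2000.0  -- the bias is faithfully reproduced
```

Nothing in this toy model mentions gender, yet it still discriminates, because the question it was asked ("predict the limit from past limits") bakes the historical bias in. Real credit models are vastly more complex, but the failure mode is the same.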

Our world is changing, and dataism is coming at us hard. I'm not sure it's for the best anymore...

There is hope though

At the time of this writing, the state of New York has launched an investigation into this whole thing. And it all started because someone on Twitter felt something was wrong. Proof, if we needed it, that we are not helpless in this fight. We have the right to know how companies use our data to create their algorithms. These algorithms have an impact on so many aspects of our daily lives. If we let them run free, we might lose our very own freedom in the end...

What do you think?
