
About Computer Science and Responsibility March 31, 2018

Posted by Peter Varhol in Strategy, Technology and Culture.

Are we prepared to take responsibility for the consequences of our code?  That is clearly a loaded question.  Both individual programmers and their employers use all manner of code to gain a personal, financial, business, or wartime advantage.  I once had a programmer explain to me, “They tell me to build this menu, I build the menu.  They tell me to create these options, I create these options.  There is no thought involved.”

In one sense, yes.  By the time the project reaches the coder, there is usually little left in doubt.  But while we are not the masterminds, we are the enablers.

I am not sure that all software programmers view their work that abstractly, without acknowledging its potential consequences.  Back in the 1980s, I knew many programmers who declined to work for the burgeoning defense industry in Massachusetts of the day, convinced that their code might be responsible for war and violent death (despite the state’s cultural, well, ambivalence toward its defense industry to begin with).

Others are troubled by providing inaccurate information that is used to make decisions, or by manipulating people’s emotions to feel a particular way, to buy a particular product or service.  But that seems much less damaging or harmful than enabling the launch of a nuclear-tipped ballistic missile.

Or is it?  I am pretty sure that most who work for Facebook successfully do abstract their code from the results.  How else can you explain the company’s disregard for people’s reactions to its extreme intrusion into the lives of its users?  I think that might have relatively little to do with their value systems, and more to do with the culture in which they work.

To be fair, this is not about Facebook, although I could not resist the dig.  Rather, this is to point out that the implementers, yes, the enablers, tend to be divorced from the decisions and the consequences.  To be specific:  Us.

Is this a problem?  After all, those who are making the decisions are better qualified to do so, and are paid to do so, usually better than the programmers.  Shouldn’t they be the ones taking the responsibility?

Ah, but they can use the same argument in response.  They are not the ones actually creating these systems; they are not implementing the actual weapons of harm.

Here is the point.  With military systems, we are well aware that we are enabling war to be fought, the killing of people and the destruction of property.  We can rationalize by saying that we are creating defensive systems, but we have still made a conscious choice here.

With social systems, we seem to care much less that we are potentially causing harm than we do with war systems.  In fact, Mark Zuckerberg still continues to insist that his creation is used only for good.  That is, of course, less and less believable as time marches on.

And to be clear, I am not a pacifist.  I served in the military in my youth.  I believe that the course of human history has largely been defined by war.  And that war is the inevitable result of human needs, whether for security, for sustenance, or for something else.  It is likely that humanity in general will never grow out of the need to physically dominate others (case in point, Harvey Weinstein).

But as we continue to create software systems that manipulate people into doing what they would not otherwise do, is this really ethically different from creating a military system?  We may be able to rationalize it on some level, but we also have to acknowledge that we are doing harm to people.

So if you are a programmer, can you with this understanding and in good conscience say that you are a force for good in the world?

It Gives Me No Pleasure to Say “I Told You So” March 21, 2018

Posted by Peter Varhol in Technology and Culture.

Well, maybe it does.  It feels like this is the beginning of the end for Facebook.  More so because Facebook simply can’t keep the promises it made to its users, and it’s not at all clear that it even wants to.

So Facebook lets third parties mine its data.  That should surprise no one; that is the business they are in.  If you don’t know what the product is, then you are the product.

But when that data is passed on to others, there is a problem.  And when Facebook knows that has occurred, and doesn’t do anything about it, that is a bigger problem.  And not just a PR problem, but a legal problem too.  Not to mention the class action lawsuits it is already facing.

In the past, users have not been troubled by revelations like this.  We have implicitly accepted the fact that Facebook is mining our data and personalizing its responses, and we seem to believe that this applies to everyone but us.

This feels different.  Facebook always says “trust us”, and users have either taken that at face value or ignored the implications entirely.  Now we seem to realize that Facebook lies to us every chance that it gets.

Let there be no mistake here: Facebook is in the business of monetizing your data.  And the ways that it does that are pretty darned intrusive, if you stop to think about it.  Personalization in advertising is sometimes nearly indistinguishable from surveillance, and Facebook has mastered surveillance.

But it is sad, in that we have let Facebook get that far.  And you might certainly say that multi-billion-dollar companies simply don’t go away.  There will always be hardcore users worldwide who let their emotions swing like leaves in a breeze at what they see on Facebook.  Even honest users who use Facebook as a shortcut for keeping in touch with people have to be horrified at the way their data is being used.

It may seem like I am obsessed with Facebook, given the things I have written.  In fact, I’m not at all.  I have never used Facebook, and have no desire to do so.  But I am offended at how it influences people’s behavior, often negatively.  And how it uses that information against people.

Update:  Zuckerberg has finally spoken.  And not only did he imply it was an engineering problem, he came right out and said it was actually fixed years ago.  I wish I had that kind of chutzpah.