
Lena, Fabio, and the Mess of Computer Science April 11, 2018

Posted by Peter Varhol in Publishing, Software development, Technology and Culture.

The book Brotopia opens with a description of Lena, the November 1972 Playboy centerfold whose photo by chance was used in early research into image processing algorithms at USC.  Over time, that singular cropped image became a technical standard used to measure the output of graphics algorithms.  Even today it appears in academic research to demonstrate the relative merits of alternative algorithms.
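The “technical standard” part is mundane in practice: researchers agree on a reference image and score each algorithm’s output against it.  As a rough illustration (not from the book or from this post), here is a minimal sketch using scikit-image, with a Gaussian blur standing in as a hypothetical algorithm under test and a local file named lena.png assumed to hold whatever reference image is in use:

```python
# Minimal sketch: scoring an image-processing algorithm against a reference
# test image.  Assumes scikit-image is installed and that "lena.png" (or any
# other agreed-upon reference image) exists locally; the Gaussian blur is a
# hypothetical stand-in for the algorithm being evaluated.
from skimage import io, img_as_float
from skimage.filters import gaussian
from skimage.metrics import peak_signal_noise_ratio

reference = img_as_float(io.imread("lena.png"))  # the shared reference image
candidate = gaussian(reference, sigma=1.0)       # output of the algorithm under test

# Higher PSNR means the algorithm's output stays closer to the reference.
psnr = peak_signal_noise_ratio(reference, candidate, data_range=1.0)
print(f"PSNR: {psnr:.2f} dB")
```

Swap in any algorithm and any reference image and the comparison works the same way, which is exactly why one early, arbitrary choice of image could persist for decades.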

But today this image is also controversial.  Some complain that it serves to objectify women in computer science.  Others say it is simply a technical standard in the field.  A woman mathematics professor applied similar graphics algorithms to a photo of Fabio in an attempt to bring some balance to the discussion.

In the 8th grade (around the time of Lena), my middle school (Hopewell Junior High School) partitioned off boys to Shop class, and girls to Home Ec.  Perhaps one boy a year asked for Home Ec class, but it could only be taken by boys as a free elective, and was viewed as an oddity.  During my time there, to my knowledge no girl asked to be in Shop class.

Of course, I thought nothing of it at the time, but today such segregation is troubling.  And even in 2015, a high school computer science class used Lena to show off their work with graphics algorithms, to mixed reviews.

There are many serious problems with the cult of the young white male in tech today.  As we continue to engage this demographic with not-so-subtle inducements to their libidos, we also enable them to see themselves as the Masters of the (Tech) Universe.  That worked out so well for the financial trading firms in the market failures of the 1980s and 2000s, didn’t it?

Does the same dynamic also make it more difficult for women to be taken seriously in tech?  I think that it is part of the problem, but by no means the only part.  Women in tech are like people in any field – they want to do their jobs, and not have to contend with a culture and frat-boy behaviors that make it that much more difficult to do so.

I’ve been fortunate to know many smart and capable women throughout my life.  I had a girlfriend in college who was simply brilliant in mathematics and chemistry (in contrast, I was not brilliant at anything at that point in my life).  She may have been one of the inspirations that led me to continue plugging away at mathematics until I managed a limited amount of success at it.  Others try to do their best under circumstances that they shouldn’t have to put up with.

So let’s give everyone the same chance, without blatant and subtle behaviors that demean them and make them feel less than what they are.  We don’t today.  Case in point, Uber, which under Travis Kalanick was the best-known but by no means the only offender.  I hope we can improve, but despair that we can’t.


One Experience of a Lifetime April 5, 2018

Posted by Peter Varhol in Technology and Culture, Uncategorized.

Last month, I ran in a race called Gateway to Space.  It was held on the Space Shuttle runway at Cape Canaveral, Florida (known as the NASA Shuttle Landing Facility, because the Space Shuttles landed there but never took off horizontally).

The NASA Shuttle Landing Facility is 15,000 feet long, one of the longest runways in the world (Denver International has a longer one, and a few military runways, such as those at Vandenberg and Edwards, are of similar length).  Even so, 15,000 feet is about 1,400 feet short of 5K (five kilometers is roughly 16,400 feet), so we started on the aircraft parking area and ran a short taxiway out to the runway.

We were told to watch out for alligators and other wildlife on the runway.

There were close to 2,000 runners, although many walked it.  We began at the southern end.  There was a Space Shuttle mockup about halfway up the runway, and multiple plaques embedded in the runway marking the touchdown and stopping points of the last Space Shuttle landings.  I have photos of several; here is one, with my running shoe in the frame, as other people were doing to show that they had actually been there.

[Photo: 100_2393]

The weather was very nice, although it got warm fast; the distance was good, and there were water stops.  At the end, there was plenty of juice to drink.

There may be better life experiences out there, but I will always own this one.  I have always been fascinated by flying, and by space.  I am bitterly disappointed that the US cannot send people into space.  I think our government has dropped the ball, and I hope that private companies can pick it up.

In the meantime, I have run the landing facility.  Definitely cool.

About Computer Science and Responsibility March 31, 2018

Posted by Peter Varhol in Strategy, Technology and Culture.

Are we prepared to take responsibility for the consequences of our code?  That is clearly a loaded question.  Both individual programmers and their employers use all manner of code to gain a personal, financial, business, or wartime advantage.  I once had a programmer explain to me, “They tell me to build this menu, I build the menu.  They tell me to create these options, I create these options.  There is no thought involved.”

In one sense, yes.  By the time the project reaches the coder, there is usually little left in doubt.  But while we are not the masterminds, we are the enablers.

I am not sure that all software programmers have viewed their work so abstractly, without acknowledging its potential consequences.  Back in the 1980s, I knew many programmers who declined to work for the then-burgeoning defense industry in Massachusetts, convinced that their code might be responsible for war and violent death (despite the state’s cultural, well, ambivalence toward its defense industry to begin with).

Others are troubled by providing inaccurate information that is used to make decisions, or by manipulating people’s emotions so that they feel a particular way, or buy a particular product or service.  But that seems much less damaging or harmful than enabling the launch of a nuclear-tipped ballistic missile.

Or is it?  I am pretty sure that most who work for Facebook successfully abstract their code from its results.  How else can you explain the company’s disregard for how people react to its extreme intrusion into the lives of its users?  I think that has relatively little to do with their value systems, and more to do with the culture in which they work.

To be fair, this is not about Facebook, although I could not resist the dig.  Rather, this is to point out that the implementers, yes, the enablers, tend to be divorced from the decisions and the consequences.  To be specific:  Us.

Is this a problem?  After all, those who are making the decisions are better qualified to do so, and are paid to do so, usually better than the programmers.  Shouldn’t they be the ones taking the responsibility?

Ah, but they can use the same argument in response.  They are not the ones actually creating these systems; they are not implementing the actual weapons of harm.

Here is the point.  With military systems, we are well aware that we are enabling war to be fought, the killing of people and the destruction of property.  We can rationalize by saying that we are creating defensive systems, but we have still made a conscious choice here.

With social systems, we seem to care much less that we are potentially causing harm than we do with military systems.  In fact, the likes of Mark Zuckerberg still insist that his creation is used only for good.  That is, of course, less and less believable as time marches on.

And to be clear, I am not a pacifist.  I served in the military in my youth.  I believe that the course of human history has largely been defined by war.  And that war is the inevitable result of human needs, for security, for sustenance, or for some other need.  It is likely that humanity in general will never grow out of the need to physically dominate others (case in point, Harvey Weinstein).

But as we continue to create software systems that manipulate people, nudging them into doing what they would not otherwise do, is this really ethically different from creating a military system?  We may be able to rationalize it on some level, but in fact we also have to acknowledge that we are doing harm to people.

So if you are a programmer, can you with this understanding and in good conscience say that you are a force for good in the world?

It Gives Me No Pleasure to Say “I Told You So” March 21, 2018

Posted by Peter Varhol in Technology and Culture.

Well, maybe it does.  It feels like this is the beginning of the end for Facebook.  It is not just that Facebook can’t keep the promises it made to its users; it’s not at all clear that it even wants to.

So Facebook lets third parties mine its data.  That should surprise no one; that is the business they are in.  If you don’t know what the product is, then you are the product.

But when that data is passed on to others, there is a problem.  And when Facebook knows that has occurred, and doesn’t do anything about it, that is a bigger problem.  And not just a PR problem, but a legal problem too, to say nothing of the class action lawsuits it is already facing.

In the past, users have not been troubled by information like this.  We have implicitly accepted the fact that Facebook is mining our data, and personalizing its responses, and we seem to believe that this applies to everyone but us.

This feels different.  Facebook always says “trust us”, and users have either taken that at face value or ignored the implications entirely.  Now we seem to realize that Facebook lies to us every chance that it gets.

Let there be no mistake here: Facebook is in the business of monetizing your data.  And the ways that it does that are pretty darned intrusive, if you stop to think about it.  Personalization in advertising is sometimes nearly indistinguishable from surveillance, and Facebook has mastered surveillance.

But it is sad that we have let Facebook get this far.  And you might certainly say that multi-billion-dollar companies simply don’t go away.  There will always be hardcore users worldwide, who let their emotions swing like leaves in a breeze at what they see on Facebook.  But even honest users who use Facebook as a shortcut for keeping in touch with people have to be horrified at the way their data is being used.

It may seem like I am obsessed with Facebook, given the things I have written.  In fact, I’m not at all.  I have never used Facebook, and have no desire to do so.  But I am offended at how it influences people’s behavior, often negatively.  And how it uses that information against people.

Update:  Zuckerberg has finally spoken.  And not only did he imply it was an engineering problem, he came right out and said it was actually fixed years ago.  I wish I had that kind of chutzpah.

About Friction and Life Relationships February 23, 2018

Posted by Peter Varhol in Technology and Culture.

I’ve written about friction in these pages in the past.  In general, it refers to the degree of difficulty and thought required to accomplish a particular activity.  The more difficult something is to do, the more friction it entails.

Many so-called technology innovations in recent years have revolved around reducing friction in our lives.  And that’s not by itself bad.  But it does have some unintended consequences.

Take social media networks such as Facebook and to a lesser extent LinkedIn.  It has become enormously easy to connect, or to friend (since when did that become a verb?).  We think that is a great thing; we can always stay connected with the lovely young lady (or gentleman) that we had a fun conversation with at a party last week.

Do you want to know something?  It should be difficult to stay in touch with people from our past.  The friction of doing so causes us to consider carefully who is important in our lives.  If it is as easy to stay in touch with our BFF as it is to stay in touch with someone we met once at a seminar twenty years ago, then we should view that as a serious problem.  But we don’t.  Facebook gives us the curse of not having to prioritize.

Instead, we have five thousand friends, the vast majority of whom we have never met and don’t know.  I have over 900 connections on LinkedIn, and while I have a good memory, I can’t for the life of me remember over half of them.

They are not our friends.  You don’t have five thousand friends.  You may have five hundred friends, if you are especially gregarious and optimistic about how people view you.  You probably have more like twenty friends, and maybe another twenty acquaintances whom you deem valuable enough to stay in touch with over time (whether or not they feel the same is a different story).

So if someone is important enough to stay in touch with, they are important enough to keep a physical and virtual address for.  Not a Facebook friendship; that is nothing but fake.  If they are not, then while they have added to our life experience, they will not do so again in the future.  Deal with it.

It’s Time to Shut Down Facebook February 23, 2018

Posted by Peter Varhol in Technology and Culture.

You heard it here first, but I suspect that the cacophony will only grow once Facebook’s, well, gross incompetence, and its embrace of that incompetence, becomes more apparent to more people.  To some people, Facebook is a benign tool for staying in touch with people (as if we can’t write letters or emails anymore).  To many more, it is an instrument for spreading hate and discord.

And Facebook very much enables the latter.  I am simply disgusted over its role in promoting fake news and hate in response to the recent school shooting.  You should be too.  Its excuses are not only hollow and without meaning, but they also deny any responsibility for the havoc it has enabled.

I get the feeling that senior Facebook executives gather around Mark Zuckerberg’s desk almost daily, cackling merrily about the latest trick that got past their algorithms.  In fact, the latest is doctored photos (paywall), which can be produced by any 12-year-old with Photoshop, or even with Microsoft Paint.

They don’t want to solve the problem.  It’s too much trouble.  And they are the smartest people in the room, so if they can’t see a solution, there isn’t one.  And Zuckerberg continues to think it is a non-problem, and that there is an engineering solution to this non-problem.  In reality, if he wants to continue in the social media business, he needs to throw away every single line of code and start over again.

That won’t happen, of course.  So we need to shut down Facebook.  To be fair, it’s not clear how that would happen.  While it is possible to imagine criminal charges or regulatory violations, the legal system moves in slow and mysterious ways.  And Zuckerberg will likely just move the whole thing offshore anyway.

So the only feasible solution is for every single person to stop using Facebook, now.  How will you keep in touch with people, you ask in horror.  Well, I have some thoughts about that, too.  In the next post.

Trick A Journalist Yourself February 3, 2018

Posted by Peter Varhol in Technology and Culture.

I’m usually pretty good at distilling a story down to a single narrative, focusing on it, and writing to that narrative.

Yet here, as I try to focus, the entire narrative becomes larger, less focused, and blurred in my mind.

So, let me start with the facts.  Tech journalist (and smart tech guy) Steven J. Vaughan-Nichols recently noticed an ad on Facebook:

“Trick a journalist,” said the ad in bold blue type. “Scrape the web for journalists, automatically contact them and get them writing about you. Let’s get you more suckers. There’s a journalist born every minute.”

(Disclaimer:  I have known SJVN almost since he started, and I admire his ability to continue to make what I assume to be a reasonable living writing freelance tech stories over the course of three decades.)

I will be dipped in shit.

Why, pray tell?  Is this what our society has come to?  In an era where the controlling rebuttal is “fake news”, do we as individuals feel the need to sow fake news to begin with?

Seriously, this is not funny.  And if you think it is, you need to run, not walk, to the nearest psychiatrist.  Get help, please.

Ah, but you have a business purpose?  A product to promote?  Or not yet a product, but something that will be a product later?  Do it the right way.  Tell your story.  You are allowed to be enthusiastic about it, but don’t ever try to trick your journalist.

Can a journalist be tricked?  Well, yes, just as we all can.  But to what purpose?  There doesn’t seem to be any purpose here, except as sport.  And as sport, it isn’t even sporting.

We have taken journalism and attempted to turn it into a laughingstock, turning false suppositions into stories of uncertain truth.  I suppose this was inevitable, in an era where legitimate journalists can’t earn a living and illegitimate ones get thousands of followers.

If you are laughing now, let me ask you this: when will this happen to your profession?  Sooner than you think, I will wager.  Trick a doctor?  Trick a software engineer?  Why not?  We have opened the floodgates.  There are no facts, only the narratives fostered by those who shout the loudest.

I am offended.  While many stories are more complex than the reporting implies, that doesn’t make them illegitimate.  And while some journalists try to bend a storyline to fit a particular point of view, that doesn’t make the storyline false.

But to intentionally create and propagate false storylines is wrong, in a fundamental sense.  It is not calling out poor journalism.  It is not making fun of a system of communicating with others.  It is just wrong.

Solving a Management Problem with Automation is Just Plain Wrong January 18, 2018

Posted by Peter Varhol in Strategy, Technology and Culture.

This article is so fascinatingly wrong on so many levels that it is worth your time to read it.  On the surface, it may appear to offer some impartial logic: we should automate because humans don’t perform consistently.

“At some point, every human being becomes unreliable.”  Well, yes.  Humans aren’t machines.  They have good days and bad days.  They have exceptional performances and poor performances.

Machines, on the other hand, are stunningly consistent, at least under most circumstances.  Certainly software bugs, power outages, and hardware breakdowns happen, and machines will fail to perform under many of those circumstances, but such failures are relatively rare.

But there is a problem here.  Actually, several problems.  The first is that machines will do exactly the same thing, every time, until the cows come home.  That’s what they are programmed to do, and they do it reasonably well.

Humans, on the other hand, experiment.  And through experimentation and inspiration comes innovation, a better way of doing things.  Sometimes that better way is evolutionary, and sometimes it is revolutionary.  But that’s how society evolves and becomes better.  The machine will always do exactly the same thing, so there will never be better and innovative solutions.  We become static and, as a society, old and tired.

Second, humans connect with other humans in a way machines cannot (the movie Robot and Frank notwithstanding).  This article starts with the story of a restaurant whose workers showed up when they felt like it.  Rather than addressing that problem directly, the owner implemented a largely automated (and hands-off) assembly line of food.

What has happened here is that the restaurant owner has taken a management problem and attempted to solve it with the application of technology.  And by not acknowledging his own management failings, he will almost certainly fail in his technology solution too.

Except perhaps at fast food restaurants, people eat out in part for the experience.  We do not eat out only, and probably not even primarily, for sustenance, but rather to connect with our family and friends, and with the random people we encounter.

If we cannot do that, we might as well just have glucose and nutrients pumped directly into our veins.