
The Human In the Loop September 19, 2017

Posted by Peter Varhol in Software development, Strategy, Technology and Culture.

A couple of years ago, I did a presentation entitled “Famous Software Failures”.  It described six events in history where poor quality or untested software caused significant damage, monetary loss, or death.

It was really more about system failures in general, or the interaction between hardware and software.  And ultimately it was about learning from these failures to help prevent future ones.

I mention this because the protagonist in one of these failures passed away earlier this year.  Stanislav Petrov was the Soviet military officer who declined to report an apparent launch of five ICBMs from the United States, as indicated by Soviet early-warning systems.  Believing that a real American offensive would involve many more missiles, Lieutenant Colonel Petrov refused to acknowledge the threat as legitimate and contended to his superiors that it was a false alarm (he was reprimanded for his actions, incidentally, and permitted to retire at his then-current rank).  The false alarm had been created by a rare alignment of sunlight on high-altitude clouds above North Dakota.

There is also a novel by Daniel Suarez, entitled Kill Decision, that postulates the rise of autonomous military drones that are empowered to make a decision on an attack without human input and intervention.  Suarez, an outstanding thriller writer, writes graphically and in detail of weapons and battles that we are convinced must be right around the next technology bend, or even here today.

As we move into a world where critical decisions have to be made instantaneously, we should not underestimate the value of the human in the loop.  Whether the decision is made with a focus on logic (“They wouldn’t launch just five missiles”) or emotion (“I will not be remembered for starting a war”), the human places that decision in a larger and far more real context than a collection of anonymous algorithms can.

The human can certainly be wrong, of course.  And no one person should be responsible for a decision that can cause the death of millions of people.  And we may find ourselves outmaneuvered by an adversary who relies successfully on instantaneous, autonomous decisions (as almost happened in Kill Decision).

As algorithms and intelligent systems become faster and better, human decisions aren’t necessarily needed or even desirable in a growing number of split-second situations.  But while they may be pushed to the edges, human decisions should not be pushed entirely off the page.

 


Are We Wrong About the Future of Digital Life? September 14, 2017

Posted by Peter Varhol in Technology and Culture.

Digital life offers the promise of no friction; that is, no difficulty in doing anything ordinary, such as shopping, meeting people, or traveling.  There is no impediment to anything we want to do.  I have written about the idea of friction before, arguing that at least some friction is necessary for us to grow and develop as human beings.

Further, science fiction author Frank Herbert had some very definite ideas about friction more than 50 years ago.  He invented a protagonist named Jorj X. McKie, who worked for the Bureau of Sabotage as a saboteur.  At some indeterminate time in the future, galactic government had become so efficient that laws were conceived in the morning, passed in the afternoon, and effective in the evening.  McKie’s charter was to toss a monkey wrench into the workings of government, to slow it down so that people would be able to consider the impact of their rash decisions.

But let’s fast forward (or fast backward) to Bodega, the Silicon Valley startup that is trying to remove friction from convenience store stops.  Rather than visit a tiny hole-in-the-wall shop, patrons can pick up their goods at the gym, in their apartment building, or anywhere else willing to host one of the company’s cabinets.  Customers use an app to unlock the cabinet, and their purchases are automatically recorded and charged.

It turns out that people are objecting.  Loudly.  The bodega (the Spanish term for these tiny shops) is more than just a convenience.  It is where neighborhood residents go to find out what is happening with other people, and what is going on in general.  In an era when we are trying to remove interpersonal interaction, some of us also seem to be trying to restore it.

My point is that maybe we want to see our neighbors, or at least hear about them.  And the bodega turns out to be an ideal clearing house, so to speak.  I’ve seen something similar in northern Spain, where the tiny pintxos shops serve pintxos from the morning until the late afternoon, then transition into bars for the evening.  We visit one such place every morning when we are in Bilbao.  The staff don’t speak any English, and my Spanish is limited (and I have no Basque), but there is a certain community.

That is encouraging.  Certainly there is some friction in actually having a conversation, but there is also a great deal of value in obtaining information in this manner.  We establish a connection, but we also don’t know what we’re going to hear from visit to visit.

I wonder if there is any way that the company Bodega can replicate such an experience.  Perhaps not, and that is one strong reason why we will continue to rely on talking to other people.

More About Friction and Life September 5, 2017

Posted by Peter Varhol in Technology and Culture.

Apparently the next wave of getting friction out of our lives is to text the people we are visiting, rather than ringing the doorbell (paywall).  It seems that doorbells disturb people (okay, young people in particular), in some cases apparently seriously.

I’m ambivalent about this.  As one generation passes on to the next, customs change, and it is entirely likely that texting to let someone know you are outside of their door will become the new normal.  On the surface, it may not be a bad thing.

But there’s always a but.  It turns out that texting someone can become an excuse for never seeing them in person.  And there are plenty of places I go where I don’t know the phone number of the person inside.

But more about friction in general.  Friction is the difference between gliding through life unimpeded and hitting roadblocks that keep us from doing some of what we would like.  None of us like friction.  All of us need it.

Whatever else I may doubt, I am certain that friction is an essential part of a rich and fulfilling life.

If you are afraid of something, then there is good reason to face it.

First, friction teaches us patience and tolerance.  It teaches us how to wait for what we decide is important in life.

Second, it teaches us what is important in our lives.  We don’t know what is important unless we have to work for it.

Third, it teaches us that we may have to change our needs and goals, based on the feedback we get in life.

Many of the Silicon Valley startups today are primarily about getting rid of friction in our lives.  Uber (especially), Blue Apron, and just about any phone-app-based startup are about making our daily existence easier.

You heard it here first, folks.  Easier is good, but we also need to work, even for the daily chores that seem like they should be easier.  We may have to call for a cab, or look up a menu and pick up a meal.  Over the course of our lives, we learn many life lessons from experiences like that.

Do me a favor this week.  Try the doorbell.

Rage Against the Machine August 22, 2017

Posted by Peter Varhol in Technology and Culture.

No, not them.  Rather, it is the question of our getting frustrated with our devices.  I might have an appointment that my device fails to remind me of (likely a setting I forgot, or something that was inadvertently turned off), and I snap at the device, rather than chastising myself for forgetting it.  Or I get poor information from my electronic assistant because it misinterprets the question.

And because our devices are increasingly talking to us, we might feel an additional urge to talk back.  Or yell back, if we don’t like the answers.

There are two schools of thought on this.  One is that acting out frustration against an inanimate object is an acceptable release valve that lessens our aggression toward human recipients (a variation on the concept of displacement in psychology), making it easier for us to deal with others.

The second is that acting aggressively toward an electronic assistant that speaks can actually make us more aggressive with actual people, because we become used to yelling at the device.

MIT researcher Sherry Turkle tends toward the latter view.  She says that yelling at our machines could lead to a “coarsening of how people treat each other.”

I’m not sure what the right answer here is; perhaps a bit of both, depending upon other personal variables and circumstances.

But I do know that yelling at an inanimate object, even one that has a voice, will accomplish nothing productive.  Unlike the old saw that trying to teach a pig to sing won’t succeed and only annoys the pig, yelling at your electronic assistant won’t even annoy it.

Don’t do it, folks.

Google Blew It August 12, 2017

Posted by Peter Varhol in Technology and Culture, Uncategorized.

I don’t think that statement surprises anyone.  Google had the opportunity to make a definitive statement about the technology industry, competence, inclusion, ability, and teamwork, and instead blew it as only a big, bureaucratic company could.  Here is why I think so.

First, Google enabled and apparently supported a culture in which views colored by politics are freely promoted.  That was simply stupid.  No one wins at the politics game (and mostly everyone loses).  We believe what we believe.  If we are thoughtful human beings with a growth mindset, our beliefs are likely to change, but over a period of years, not overnight.

Second, Google let the debate be framed as a liberal versus conservative one.  It is most emphatically not.  I hate those labels.  I am sure I have significant elements of each in my psyche, along with perhaps a touch of libertarianism.  To throw about such labels is insulting and ludicrous, and Google as a company and a culture enabled it.

Okay, then what is it, you may ask.  It is about mutual respect, across jobs, roles, product lines, and levels of responsibility.  It is about working with the person, regardless of gender, race, age, orientation, or whatever.  You don’t know their circumstances; you may not even know what they have been assigned to do.  Your goal is to achieve a robust and fruitful working relationship.  If you can’t, at least some of that may well be on you.

The fact that you work together at Google gives you more in common with each other than almost anyone else in the world.  There are so many shared values there that have nothing to do with political beliefs, reflexive or well-considered.  Share those common goals; all else can be discussed and bridged.  It’s only where you work, after all.

You may think poorly of a colleague.  God knows I have in the past, whether it be for perceived competence, poor work habits, skimpy hours, or seeming uninspired output (to be fair, over the years a few of my colleagues may have thought something similar about me).  They are there for a reason.  Someone thought they had business value.  Let’s expend a little more effort trying to find it.  Please.

So what would I have done, if I were Sundar Pichai?  Um, first, how about removing politics from the situation?  Get politics out of office discussions in general, and out of this topic in particular.  All too often, doctrinaire people (on both sides of the aisle) simply assume that everyone thinks their ideas are inevitably right.  Try listening more and assuming less.  If you can’t, Sundar, it is time to move aside and let an adult take over.

Second, Google needs everyone to understand what it stands for.  And I hope it does not stand for liberal or conservative.  I hope it wants everyone to grow, professionally, emotionally, and in their mindsets.  We can have an honest exchange of ideas without everyone going ballistic.

Get a grip, folks!  There is not a war on, despite Google’s ham-handed attempts to make it one.  We have more in common than we are different, and let’s work on that for a while.

I can’t fix Google’s monumental screw-up.  But I really hope I can move the dial ever so slightly toward respect and rational discourse.

The Final Frontier July 6, 2017

Posted by Peter Varhol in Education, Technology and Culture.

Yes, these are the voyages of the Starship Enterprise.  Its five-year mission: to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no man has gone before.

To someone of my age, this defined the possibilities of space, perhaps even more so than the Apollo 11 landing on the moon.

We failed at this, in my lifetime, to my dying (hopefully not soon) regret.  We failed, not because of a lack of technology, but because of a lack of will.  Since the 1980s, America has been looking inward, rather than reaching for the next brass ring in the universe.

Today, we have no ability to launch astronauts into orbit.  No, we don’t.  Our astronauts go into orbit courtesy of the Russians and their Soyuz capsules.  I am sure many of you are pleased at this, but you miss the larger picture.

May I quote Robert Browning: “Ah, but a man’s reach should exceed his grasp, Or what’s a heaven for?”

Seriously.  Life is bigger, much bigger, than our individual petty concerns.  We may think our concerns are larger than life, but until we reach beyond them, we are petty, we are small.  Until we give ourselves to larger and more grandiose goals, we are achieving nothing as human beings.

Look at the people, throughout history, who have given their lives, willingly, in favor of a larger goal.  Not just the astronauts, but soldiers, sailors, explorers, yes, even a few politicians.

Today, my only hope is with the private companies, SpaceX, Blue Origin, Virgin Galactic, and their ilk.  They are our future.  Not NASA, or the government in any way, shape or form.  I hope with all of my heart and soul they can reach where the collective citizenry has declined to.

Set the controls for the heart of the sun.

About Uber, Friction, and Life June 28, 2017

Posted by Peter Varhol in Technology and Culture.

No matter where you are in most major or even minor cities around the world (yes, there are significant exceptions), you can pull out your smartphone, press a couple of buttons, and have an Uber taxi meet you at your location in a few minutes.  You compare the driver with the photo you received, and you have a measure of security.  The driver already knows your destination, and you know that you don’t have to pass him (or her) some cash at the end of the process.

And that’s the way it should be, in this day and age.  The technology has been there, and Uber, Lyft, and their ilk are bringing it together.

But let’s take an honest look at what we are trading off, because there are always tradeoffs.  In this case, we are trading off friction.  By friction, I mean the hassle of hailing a commercial taxi, finding the phone number and calling a taxi company, or getting to a location where taxis tend to congregate.

(And as I was told in Stockholm last month, all taxis are not created equal.  “Don’t take that one,” the bell captain at a hotel said.  “They will gouge you.”)

All of this sounds like a good thing.  But it turns out it is part of the life learning process as a person.  For the first twenty-three years of my life, I never saw a taxi, or a train, or a subway.  I grew up in rural America.  Today I am comfortable finding and navigating all of the above, in any city in the US or Europe.  Why?  Because I had to.

(And incidentally, no matter the payment method, I always tip in cash.  These folks work for a living, and deserve the discretion of how and where to report their tips.)

I have grown as a person.  That’s difficult to quantify, and certainly given a more frictionless path in the past I might well have chosen it.  But the learning process has built my confidence and yes, my worldliness.  I am more comfortable navigating cities I have never been to before.  I don’t stay in a bubble.

If you are using Uber (and Lyft) as an excuse for not interacting with others, especially others who are different from you, then you are not learning about the world, and how to interact with it.  And as your life winds down, you may come to regret that.

The Future is Now June 23, 2017

Posted by Peter Varhol in Algorithms, Technology and Culture.

And it is messy.  This article notes that it has been 15 years since the release of Minority Report, and today we are using predictive analytics to determine who might commit a crime, and where.

Perhaps it is the sign of the times.  Despite being safer than ever, we are also more afraid than ever.  We may not let our electronics onto commercial planes (though they are presumably okay in cargo).  We want to flag and restrict contact with people deemed high-risk.  We want to stay home.  We want the police to have more powers.

In a way it’s understandable.  This is a bias described aptly by Daniel Kahneman.  We can extrapolate from the general to the particular, but not from the particular to the general.  And there is also the primacy bias.  When we see a mass attack, we are likely to instinctively interpret it as an increase in attacks in general, rather than looking at the trends over time.

I’m reminded of the Buffalo Springfield song: “Paranoia strikes deep, into your life it will creep.”

But there is a problem with using predictive analytics in this fashion, as Tom Cruise discovered.  And this gets back to Nicholas Carr’s point – we can’t effectively automate what we can’t do ourselves.  If a human cannot draw the same or more accurate conclusions, we have no right to rely blindly on analytics.
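To make that concrete, here is a minimal back-of-the-envelope sketch in Python, using entirely made-up numbers, of the kind of sanity check a human can do before trusting such a system: when the behavior being predicted is rare, even a model that looks accurate on paper ends up flagging mostly people who pose no risk at all.

# A back-of-the-envelope base-rate check for a hypothetical "who might
# commit a crime" model.  All of the numbers below are illustrative
# assumptions, not real data.

population = 1_000_000       # people scored by the model
base_rate = 0.001            # assume 0.1% would actually offend
sensitivity = 0.90           # assume the model catches 90% of true cases
false_positive_rate = 0.05   # assume it wrongly flags 5% of everyone else

actual_offenders = population * base_rate
non_offenders = population - actual_offenders

true_positives = actual_offenders * sensitivity
false_positives = non_offenders * false_positive_rate

flagged = true_positives + false_positives
precision = true_positives / flagged

print(f"People flagged: {flagged:,.0f}")
print(f"Flagged people who were actual risks: {precision:.1%}")

# With these assumptions, roughly 50,850 people are flagged, and fewer than
# 2% of them are genuine risks.  That is exactly the sort of conclusion a
# human should be able to reach before anyone relies blindly on the output.

Change the assumptions and the numbers move, but the lesson holds: the rarer the event, the more the flags are dominated by false positives, which is why the human conclusion still matters.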

I suspect that we are going to see increased misuses of analytics in the future, and that is regrettable.  We have to have data scientists, economists, and computer professionals step up and say that a particular application is inappropriate.

I will do so when I can.  I hope others will, too.