
Empathetic Technology is an Idea Whose Time Should Never Come June 20, 2018

Posted by Peter Varhol in Technology and Culture.

I love TED talks.  They are generally well thought out and well-presented, and offer some significant insights on things that may not have occurred to me before.

I really, really wanted to give a thumbs-up to Poppy Crum’s talk on empathetic technology, but it contradicted some of my fundamental beliefs on human behavior and growth.  She talks about how measuring and understanding the physical attributes of emotion will help draw us together, so that we don’t have to feel so alone and misunderstood.

Well, I suppose that’s one way to look at it.  I rather look at it as wearing a permanent lie detector.  Now, that may not be a bad thing, unless you are playing poker or negotiating a deal.  But exposing our innermost emotions to others is rightly a gradual thing, and should be under our control, rather than immediately available through technology.

Also, the example she demonstrated required data from the entire audience, rather than from a single individual.  It was also highly contrived, and it’s not at all clear that it would work in practice.  It involved measuring changes in CO2 levels from the audience as it reacted to something unexpected.

But in general, her thesis violates my thoughts on emotional friction.  Other people don’t understand us.  Other people do things that make us feel uncomfortable.  Guess what?  Adapting to that is how we grow as human beings.  And growth is what makes us human.  Now, granted, in the few cases where attempts at emotional growth result in psychopathologies, there could be value here.  But . . .

I recall the Isaac Asimov novel The Naked Sun, where humans who interact physically with others are considered pathologic.  So we become content to view each other electronically, rather than interact physically.  I see a significant loss of humanity there.

And despite how Poppy Crum paints it, I see a significant loss of humanity with her plan, too.  She is correct in that empathetic technology can help identify those whose psyches may break under the strain of adapting to friction, but I think the loss of our humanity in general overwhelms this single good.


Here’s Looking At You June 18, 2018

Posted by Peter Varhol in Algorithms, Machine Learning, Software tools, Technology and Culture.

I studied a rudimentary form of image recognition when I was a grad student.  While I could (sometimes) identify simple images based on obvious distinguishing characteristics, given the limitations of rule-based systems and the computing power of Lisp Machines and early Macs, facial recognition was well beyond the capabilities of the day.

Today, facial recognition has benefited greatly from better algorithms and faster processing, and is available commercially from several different companies.  There is some question as to its reliability, but at this point it’s probably better than any manual approach to comparing photos.  And that seems to be a problem for some.

Recently the ACLU and nearly 70 other groups sent a letter to Amazon CEO Jeff Bezos, alongside one from 20 shareholder groups, arguing that Amazon should not provide surveillance systems such as facial recognition technology to the government.  Amazon has a facial recognition system called Rekognition (why would you use a spelling that is more reminiscent of evil times in our history?).

Once again, despite the Hitleresque product name, I don’t get the outrage.  We give the likes of Facebook our life history in detail, in pictures and video, and let them sell it on the open market, but the police can’t automate the search of photos?  That makes no sense.  Facebook continues to get our explicit approval for the crass but grossly profitable commercialization of our most intimate details, while our government cannot use commercial and legal software tools?

Make no mistake; I am troubled by our surveillance state, probably more than most people, but we cannot deny tools to our government that the Bad Guys can buy and use legally.  We may not like the result, but we seem happy to go along like sheep when it’s Facebook as the shepherd.

For the life of me, I tried to curse our government for its intrusion into our lives, but we don’t seem to mind it when it’s Facebook, so I just can’t get excited about the whole thing.  I cannot imagine Zuckerberg running for President.  Why should he give up the most powerful position in the world to face the checks and balances of our government?

I am far more concerned about individuals using commercial facial recognition technology to identify and harass total strangers.  Imagine an attractive young lady (I am a heterosexual male, but it’s also applicable to other combinations) walking down the street.  I take her photo with my phone, and within seconds have her name, address, and life history (quite possibly from her Facebook account).  Were I that type of person (I hope I’m not), I could use that information to make her life difficult.  While I don’t think I would, there are people who would think nothing of doing so.

So my take is that if you don’t want the government to use commercial facial recognition software, demonstrate your honesty and integrity by getting the heck off of Facebook first.

Update:  Apple will automatically share your location when you call 911.  I think I’m okay with this, too.  When you call 911 for an emergency, presumably you want to be found.

Too Many Cameras June 15, 2018

Posted by Peter Varhol in Software platforms, Strategy, Technology and Culture.

The title above is a play on the “Too Many Secrets” revelation in the 1992 movie Sneakers, in which Robert Redford’s character, who has a secret or two himself, finds himself in possession of the ultimate decryption device, and everyone wants it.

Today we have too many cameras around us.  This was brought home to me rather starkly when I received an email that said:

I’ve been recording you with your computer camera and caught you <censored>.  Shame on you.  If you don’t want me to send that video to your family and employer, pay me $1000.

I paused.  Did I really do <censored> in front of my computer camera?  I didn’t think so, but I do spend a lot of time in front of the screen.  In any case, <censored> didn’t quite rise to the level of blackmail concern, in my opinion, so I ignored it.

But is this scenario so completely far-fetched?  This article lists all of the cameras that Amazon can conceivably put in your home today, and in the near future, that list will certainly grow.  Other services, such as your PC vendor and security system provider, will add even more movie-ready devices.

In some ways, the explosion of cameras looking at our actions is good.  Cameras can nudge us to drive more safely, and to identify and find thieves and other bad guys.  They can help find lost or kidnapped children.

But even outside our home, they are a little creepy.  You don’t want to stop in the middle of the sidewalk and think, I’m being watched right now.  The vast majority of people simply don’t have any reason to be observed, and thinking about it can be disconcerting.

Inside our homes, I simply don’t think we want them, phone and PC included.  I do believe that people realize it is happening, but in the short term they think the coolness of the Amazon products and the lack of friction in ordering from Amazon supersede any thoughts about privacy.  They would rather have computers at their beck and call than think about the implications.

We need to do better than that if we want to live in an automated world.

Facebook and the Cult of Secrecy June 5, 2018

Posted by Peter Varhol in Publishing, Technology and Culture.

I recall the worldwide controversy in 2013 surrounding National Security Agency contractor Edward Snowden, who published secret (and above) information about the NSA listening programs to the world at large.  These revelations prompted some worldwide protests against the data collection by the NSA (and by extension GCHQ in the UK and others).

I gave the entire Snowden mess a shrug of my shoulders.  I am not a big fan of secrets, personal or institutional.  I do think that there are things in life that we justifiably attempt to keep secret, for a variety of reasons.  However, I also believe that any attempt to keep something a secret for any significant period of time is ultimately futile.  “Three people can keep a secret, if two are dead” represents my belief in the longevity of secrets.

However, I can’t help but marvel at people protesting against government data collection while those same people, and many more, willingly give far more personal data to Facebook.  I simply don’t get why Facebook, which is undeniably more effective at collection than the NSA, gets a pass on its deeper intrusions into our lives.

Facebook should have taught us that there are no secrets.  I don’t think that we’ve learned that lesson, and I certainly don’t think Facebook has.  This article notes the gap between what the company says and what it actually does.  In this case, it was Zuckerberg himself who told Congress that Facebook no longer shared user and friend information with third parties.

It turns out that Facebook deliberately decided not to classify 60 (yes, 60) phone manufacturers as third parties.  Zuckerberg’s excuse: Facebook needed to provide them with real user data in order to test the integration of the app on their devices.  Um, no.

Now, I am a tester by temperament, and know darn well that the normal practice is to munge the data used for testing.  Facebook providing 60 vendors with real data is not testing; it is yet another violation of its terms of service.  Oh, but Facebook is allowed to do that as long as someone (the janitor, perhaps) apologizes and says it won’t happen again.
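For readers unfamiliar with the practice: “munging” test data means replacing real personal details with safe stand-ins before the data leaves your hands.  Here is a minimal sketch of what that might look like; the field names and the hashing scheme are my own illustration, not anything Facebook actually does.

```python
import hashlib

def munge_record(record):
    """Replace personally identifiable fields with stable pseudonyms.

    The field names here are hypothetical; a real test fixture would
    map whatever PII the integration under test actually touches.
    """
    munged = dict(record)
    for field in ("name", "email", "phone"):
        if field in munged:
            # A one-way hash keeps records distinguishable for testing
            # without exposing the real value.
            digest = hashlib.sha256(munged[field].encode()).hexdigest()[:12]
            munged[field] = f"user-{digest}"
    return munged

real = {"name": "Jane Doe", "email": "jane@example.com", "city": "Boston"}
safe = munge_record(real)  # non-PII fields like "city" pass through unchanged
```

The point is that the munged records still exercise the integration, because they keep the shape and uniqueness of the originals, but a leak of the test set exposes nothing.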

So here you have it – Facebook lies, and will continue lying as long as they can get away with it.  And who lets them get away with it?  You do.

Update: Facebook bug set 14 million users’ sharing settings to public.  I really don’t at all understand why people put up with this.

Alexa, Phone Joe May 28, 2018

Posted by Peter Varhol in Algorithms, Software platforms, Technology and Culture.

By now, the story of how Amazon Alexa recorded a private conversation and sent the recording off to a colleague is well-known.  Amazon has said that the event was a highly unlikely series of circumstances that will only happen very rarely.  Further, it promised to try to adjust the algorithms so that it didn’t happen again, but no guarantees, of course.

Forgive me if that doesn’t make me feel better.  Now, I’m not blaming Amazon, or Alexa, or the couple involved in the conversation.  What this scenario should be doing is radically readjusting our expectations of what a private conversation is.  About three decades ago, there was a short-lived (I believe) reality TV show called “Kids Say the Darndest Things.”  It turned out that most of the funniest things were what the children repeated from their parents.

Well, it’s not only our children that are in the room.  It’s also Internet-connected “smart” devices that can reliably digitally record our conversations and share them around the world.  Are we surprised?  We shouldn’t be.  Did we really think that putting a device that we could talk to in the room wouldn’t drastically change what privacy meant?

Well, here we are.  Alexa is not only a frictionless method of ordering products.  It is an unimpeachable witness listening to “some” conversations in the room.  Which ones?  Well, that’s not quite clear.  There are keywords, but depending on location, volume, and accent, Alexa may hear keywords where none are intended.

And it will decide who to share those conversations with, perhaps based on pre-programmed keywords.  Or perhaps based on an AI-type natural language interpretation of a statement.  Or, most concerning, based on a hack of the system.

One has to ask whether, in the very near future, Alexa recordings may be subject to a warrant in a criminal case.  Guess what: it has already happened.  And unintended consequences will continue to occur, and many of those consequences will become more and more public.

We may well accept that tradeoff – more and different unintended consequences in return for greater convenience in ordering things.  I’m aware that Alexa can do more than that, and that its range of capability will only continue to expand.  But so will the range of unintended consequences.

How Do We Learn Languages? May 26, 2018

Posted by Peter Varhol in Technology and Culture.

Aaron Schlossberg’s anti-Spanish rant is darkly amusing in its naivety, shocking in its explicitness.  Would I have been subject to the same treatment if I were found speaking English in a restaurant in Spain?  I don’t think so.

My grandparents came from Bratislava, in what is today the Republic of Slovakia, but at the time was Austria-Hungary.  All four of them are listed on the Ellis Island rolls (to be fair, I have only found three of them, but both spellings and the past are vague at best).  And yes, apparently Andy Warhol was a second cousin or something (thanks, Karen); names were pronounced and spelled differently at different times.

My parents spoke some Slovak, but rather than pass on the language to their children, used it to hide what they were saying from the children.  Today I regret this.  In general, I wish I had had the opportunity to learn different languages growing up.

Many of my school classmates were children or grandchildren of immigrants, mostly from central, southern, and eastern Europe.  There was one classmate I remember who was a very good student, and spoke good English with a slight accent.  I learned that English was his second language, and that only Ukrainian was spoken in his household.

I travel quite a bit today.  I took Spanish in high school (now 40 years ago), have been in Spain once or twice a year for several years now, and my understanding of Spanish is coming along nicely.  I know a few words of German, and gave my twelve words of Russian a workout in Kiev two weeks ago (and even learned a word or two of Ukrainian).

As Hiro Protagonist noted in the wonderful grunge novel Snow Crash, America in the near future is good at only four things: music, movies, microcode, and high-speed pizza delivery.  But it is precisely those things (I will also add aviation) that make the English language known throughout the world.

So how do we learn other languages?  We learn through practice, pure and simple.  Years ago, my sister took a degree in French, never used it, and today cannot remember a single word.  I meet people in Europe who know three or four languages well, because they can travel two hundred miles and hear several different languages.  Switzerland has four national languages.

We don’t have an official language, English or otherwise.  Let’s keep it like that, and let’s hear and practice other languages in the United States.  It will make us better citizens.

Google AI and the Turing Test May 12, 2018

Posted by Peter Varhol in Algorithms, Machine Learning, Software development, Technology and Culture, Uncategorized.

Alan Turing was a renowned British mathematician who worked in cryptography at Bletchley Park during World War II.  He was an early computer pioneer, and today is probably best known for the Turing Test, a way of distinguishing between computers and humans (hypothetical at the time).

More specifically, the Turing Test was designed to see if a computer could pass for a human being, and was based on having a conversation with the computer.  If the human could not distinguish between talking to a human and talking to a computer, the computer was said to have passed the Turing Test.  No computer has ever done so, although Joseph Weizenbaum’s Eliza psychotherapist in the 1960s was pretty clever (think Carl Rogers, whose nondirective style it mimicked).
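Eliza’s trick was not understanding but pattern reflection: match a keyword in the user’s sentence, and echo the rest back as a question.  A minimal sketch of the technique, with rules of my own invention rather than Weizenbaum’s actual script:

```python
import re

# Illustrative reflection rules only -- not Weizenbaum's DOCTOR script.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def respond(utterance):
    """Return a canned reflection of the first matching pattern."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            # Echo the user's own words back, minus trailing punctuation.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."  # fallback when nothing matches
```

With a few hundred such rules, the illusion of a listener was surprisingly convincing, which is exactly why the Turing Test turns out to measure deception as much as intelligence.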

The Google AI passes the Turing Test.  https://www.youtube.com/watch?v=D5VN56jQMWM&feature=youtu.be.

I’m of two minds about this.  First, it is a great technical and scientific achievement.  This is a problem that for decades was thought to be intractable.  Syntax has definite structure and is relatively easy to parse.  While humans seem to understand language semantics instinctively, there are ambiguities that can only be learned through training.  That’s where deep learning through neural networks comes in.  And to respond in real time is a testament to today’s computing power.

Second, do we really need this because we don’t want to have phone conversations ourselves?  Of course, the potential applications go far beyond calling to make a hair appointment.  For a computer to understand human speech and respond intelligently to the semantics of human words requires significant training in human conversation.  That certainly implies deep learning, along with highly sophisticated algorithms.  It can apply to many different types of human interaction.

But no computing technology is without tradeoffs, and intelligent AI conversation is no exception.  I’m reminded of Sherry Turkle’s book Reclaiming Conversation.  It posits that people are increasingly afraid of having spontaneous conversations with one another, mostly because we cede control of the situation.  We prefer communications where we can script our responses ahead of time to conform to our expectations of ourselves.

Having our “AI assistant” conduct many of those conversations for us seems like simply one more step in our abdication as human beings, unwilling to face other human beings in unscripted communications.  Also, it is a way of reducing friction in our daily lives, something I have written about several times in the past.

Reducing friction is also a tradeoff.  It seems worthwhile to make day to day activities easier, but as we do, we also fail to grow as human beings.  I’m not sure where the balance lies here, but we should not strive single-mindedly to eliminate friction from our lives.

5/14 Update:  “Google Assistant making calls pretending to be human not only without disclosing that it’s a bot, but adding “ummm” and “aaah” to deceive the human on the other end with the room cheering it… horrifying. Silicon Valley is ethically lost, rudderless and has not learned a thing…As digital technologies become better at doing human things, the focus has to be on how to protect humans, how to delineate humans and machines, and how to create reliable signals of each—see 2016. This is straight up, deliberate deception. Not okay.” – Zeynep Tufekci, Professor & Writer 

Why Are We Here With Uber? April 30, 2018

Posted by Peter Varhol in Technology and Culture.

An interesting CNN investigation points out that more than 100 Uber drivers have been accused of rape or sexual assault.  Perhaps many more have offended, but CNN attempted to match reports with actual incidents.  It has to be clear even to the densest rider that this stems from the minimal vetting of drivers, and from the lack of any follow-up or tracking of driver encounters with the law.

Uber, of course, claims that the drivers are not employees, but individual contractors, so they are not responsible for aberrant behaviors.  And Uber forces riders into arbitration, even for clear felonies.  I am sorry, a felony is not something you can arbitrate.  Even under not-so-new CEO Dara Khosrowshahi, there doesn’t seem to be a lot of progress.  And of course, former CEO (and still board member) Travis Kalanick simply created, enabled, and contributed to this sick culture.

So I have to ask the question: Why are we here?

We are here for a couple of reasons.  First, and probably foremost, Uber customers have supported this behavior.  I recall a viral LinkedIn post from several months ago where a young lady declared, “I don’t care what the company does.  I just want cheap and convenient rides.”

And face it, isn’t that why you use Uber?  You really don’t care that they have a sick corporate culture, or that some small percentage of drivers attack passengers.  You want to get from Point A to Point B with as little friction as possible, and screw the complications.

Second, we are here because Uber knows what you want.  You want those cheap and convenient rides.  And you don’t want to go through the hassles of finding a taxi, or getting a phone number for a taxi company in a new city, or talking to someone, or waiting for a taxi.

Uber gives that to you, because it’s convenient.  That doesn’t make it right.

You can make a difference, if it means something to you.

It means something to me.