Varhol’s Corollary to Heisenberg’s Uncertainty Principle
August 29, 2016 · Posted by Peter Varhol in Technology and Culture.
Tags: Sherry Turkle, Werner Heisenberg
I’m back to reading Sherry Turkle’s wonderful Reclaiming Conversation, and quite a bit of it is thought-provoking. I’m trying to apply some of the lessons here to technology teams, but in the meantime, I’m drawing some independent lessons from her excellent tome.
Turkle notes that youth go to parties, then immediately start texting others to make sure they are at the right party, the “best” one. The idea, presumably, is to find that one best party and grace it with one’s presence.
Now on to Heisenberg. The Heisenberg Uncertainty Principle says that the very act of measuring an activity necessarily affects it in some way. By attempting to measure something, we make ourselves an actor in that activity.
Of course, Heisenberg was referring to the position and momentum of particles in quantum physics, but it’s reasonable to apply generalizations of this statement to other domains. My corollary applies to the realm of social behavior. It says that one’s presence at a party affects the quality of that party, whether or not you are seeking alternatives.
While I will attempt to quantify this relationship for an upcoming academic paper, it is clear that if you go to a party and do nothing other than seek another party, you are worsening the experience for everyone at your current party. My corollary says that it’s not just the party; it is how you interact with the party.
If you interact well, you will believe you are at the best party. If you interact poorly, any party you end up at will be below your, um, expectations. So by being there, and taking your own behavior into account, you directly influence the quality of the party by your own measurements.
In other words, your own presence and behavior has a direct and strong impact on your enjoyment.
Surprised? Neither am I.
Is a Car Just a Car?
August 12, 2016 · Posted by Peter Varhol in Technology and Culture, Uncategorized.
At this time of my life, yes. My transportation is an 18-year-old Subaru that just starts every time. But 25 years ago, I owned a classic Corvette: L-82, large-bore V-8. If I could think it, that car could deliver on it. As a teen, I had an old Chevy sedan that moved okay, and let me join the other teens in doing whatever we did with cars.
Uber’s entire business model is based on the assumption that a car is only transportation. I can hail whatever sedan Uber sends me to get from here to there. I am pretty much in sync with that, because I need to get from here to there reliably and more or less on time. I certainly don’t need to do it in any fancy way.
But I am not everyone. Most news/magazine websites still have an automotive section, and paper magazines like Car and Driver and Automotive News still sell well. Many people like cars, and have an emotional attachment to them. There is a certain beauty in the lines of many cars, and car ownership still remains a reachable dream for youth and adults alike.
If Uber fails, here is where it will happen. For some, perhaps many, travel is not a commodity. The journey is the reward, as Steve Jobs once said. To many, this is the literal truth.
Uber is selling a way to get from here to there. That’s not a bad thing. But in the case of cars, it is nowhere near everything. Chevy sells tens of thousands of Corvettes every year. Other attractive, fast, and functional cars sell in the millions. They do so not because people need them, in many cases, but because they want them.
Uber works when the alternative is hailing a cab, and its advantage there will be reduced once it starts charging full price, rather than providing a subsidy on its rates.
But some people (many people?) need more than that. I don’t happen to be one of them, at this point in my life (though given my location, I still don’t use Uber), but I can still appreciate the sentiment. I don’t know that Uber will fail, because there is still a significant population that requires only occasional transport from one point to another.
But it is a crack in the business model. I don’t think any cultural shift will happen fast enough, or completely enough, to make cars simply transportation for many people. How many people feel that way could decide whether Uber becomes a global force or remains merely a taxi company.
Another Old Line Conglomerate Gets It Wrong
August 4, 2016 · Posted by Peter Varhol in Software development, Technology and Culture.
Tags: coding, GE
I seem to be taking my curmudgeon role seriously. Today I read that Jeff Immelt, longtime CEO of industrial conglomerate GE, says that every new (young) person hired has to learn how to code.
So many things to say here. First, I have never been a proponent of the “everyone can code” school. No, let me amend that; everyone can probably learn to code, but is that the best and most productive use of their time? I would guess not.
Second, I’m sure that, in saying this, Immelt has put his money where his mouth is and gotten his own coding skills together. No? Well, he’s the boss, so he should be setting the example.
This is just stupid, and I am willing to bet a dollar that GE won’t follow through on this idle boast. Not even the most Millennial-driven, Silicon Valley-based, we’re-so-full-of-ourselves tech startup would demand that every employee know how to code.
And no company needs all of its employees spending time on a single shared skill that only a few will actually use. GE needs to focus on hiring the best people possible for hundreds of different types of professional jobs. It may be an advantage for all of them to have some aptitude for understanding how software works, but an inability to code shouldn’t be a deal-breaker.
I have worked at larger companies where grandiose strategies have been announced and promoted, but rarely if ever followed through. This pronouncement is almost certainly for PR purposes only, and will quietly get shelved sooner rather than later. And making such a statement does no credit whatsoever to Immelt, who should know better.
Being a Curmudgeon Has its Benefits
August 1, 2016 · Posted by Peter Varhol in Technology and Culture, Uncategorized.
I occasionally wax personal in my blog, as I did a year ago when I was facing a serious cancer diagnosis (the diagnosis was ultimately incorrect, and I am healthier than ever). Occasionally I just have to say something about a particular moment, whether or not it relates to my target blog topics.
This morning I got a regular email newsletter from Marc Cendella of The Ladders, a job search service for salaries over $100K. The title was “When the kid interviewing you says you’re too old…” In it, Cendella says that age discrimination in hiring is prevalent, and offers the older job seeker a checklist of items to attempt to overcome that bias.
Here is where I call a foul. Certainly there are things that a job seeker can do in order to make him- or herself appear to be a better fit for a given job. In general, those things range from the common-sensical (be engaged and current in your profession and energetic in your life pursuits) to the absurd (facelifts and hair coloring).
But it’s a two-way street. Why not also suggest to the hiring managers that they might have a bias that is not well serving their organization, and how they might recognize and correct that deficiency?
Oh, that’s right. Businesses like The Ladders make money from those companies doing the hiring, not from job seekers. The Ladders would rather tell the job seeker to change, rather than the hiring manager.
I would imagine that in a lengthy career spanning a dozen or more jobs and dozens of interviews, I have experienced some types of bias and discrimination. Probably everyone has; we tend to form initial impressions of someone we just met in under a second, and those first impressions can be both unconscious and difficult to overcome.
Bias in hiring is particularly difficult to demonstrate, as there could be any reason, or no reason, not to be selected for a job. The prospective employer certainly isn’t telling (usually), so most of this is left to speculation or inference, and is hardly actionable.
But I found this newsletter from The Ladders to be singularly offensive. I instinctively interpreted it as “It’s not my problem that I am biased, it’s yours in that you are too old.” I deeply resent that Cendella says that it’s a problem for job-seekers, rather than a problem for hiring managers (or for both). If hiring managers let such biases creep into their decision process, they are doing both themselves and their organization a serious disservice.
I have always been sanguine about bias in hiring. My attitude has been that if I am discounted because of a personal characteristic outside of my control, it’s a place I probably wouldn’t want to work at anyway.
The fact of the matter is that unless we die young, or hit the jackpot, we are all destined to become older workers. Everyone, deal with it.
I Bought Another Band; I am not Sure Why
July 31, 2016 · Posted by Peter Varhol in Software platforms, Technology and Culture.
Tags: Microsoft Band
My original Microsoft Band, a nice and relatively inexpensive fitness tracker and GPS, is disintegrating before my eyes. The wristband is peeling and falling apart, and I doubt it will last much longer. It is getting more difficult to charge, as the charging cable seems to have trouble engaging with the device.
My Band is just over a year old, and I would expect any electronic device to last longer, perhaps much longer (I still own a VCR, after all). I do use it almost daily, and wear it constantly except when charging (which it requires almost daily). I would like to tell Microsoft that, for all of its functionality at a reasonable price, it is an inferior product.
But I did so in a strange way; I just bought a new one. It is a Band 2, which by most accounts seems to be on the way out in favor of a possibly compelling Band 3, but I could not wait three months for that product to ship. I think my current model has about another 2-4 weeks of life ahead.
So for a product that is disintegrating after just over a year of use, why have I doubled down? Especially in a market where fitness trackers are mostly a dime a dozen, and I could choose another among many?
Yes, familiarity is one part of that answer. I know how to use it. Don’t discount that as a significant motivator. If I have to spend time learning a new feature set, I may take a while to get up to speed.
It is customizable. Within a fairly wide range, I can set up the type of information I want it to display. And I rather like the Microsoft Health app. While it may not be superior, it is easy to use within a fairly wide range of fitness activities. And it was my first experience with notifications from my phone (texts, incoming calls, voice mail), which I can’t really do without any more.
And while I complain at the rate at which my Band is falling apart, I also realize that fitness and activity technology is changing rapidly. I hope to have more compelling technology in the next purchase, and at a lower price.
Update: I bought my sister a Band 2 for Christmas 2015, and explained to her that I bought a new one because mine was falling apart. Her response: “Mine disintegrated last month, the clasp came completely apart. I did buy a new one, because I like it.” At least mine lasted a year.
Of Robots and Men
July 23, 2016 · Posted by Peter Varhol in Technology and Culture.
Tags: robots, scientific management
The title is, of course, not intended to be sexist, but instead modeled loosely after the John Steinbeck novella Of Mice and Men. There may in fact be some parallels between that story, which many of us read in public school, and the question of whether robots are becoming more human-like. Certainly the recent announcement by the European Union that it had drafted a “bill of rights” for robots as potential cyber-citizens lends some credence to this notion.
Or, in the case of this article, whether humans are becoming more robot-like. Yes, we sometimes do things robotically, and our devices are making us more consistent (read: predictable) in our response to stimuli. In particular, the article notes that we have opened the door to almost complete objective surveillance, rather than thought and reflection. We record rather than think.
I am unsympathetic; we have opted for surveillance, by checking into the Hotel Facebook (yes, “you can check out any time you like, but you can never leave”). I have declined that particular stay, but I am certainly potentially under surveillance through CCTV cameras in public places, license plate cameras on police cruisers, and my own mobile phone, of course.
But trying to make humans into robots has a long history in management and engineering. Frederick Taylor, the father of scientific management, created the mass production system that required thousands of workers, each performing a tiny fraction of the entire process required to assemble a complex machine like an automobile. His work made human workers into the very definition of modern robots.
And the field of industrial engineering still contains coursework and careers in what is euphemistically called “time and motion studies.” What is that, you ask? As an extension of Taylor’s scientific management, time and motion studies purport to analyze manual work in order to determine how to accomplish a specified task with the least amount of time and, well, motion (despite its deep grounding in industrial engineering practice, I’m afraid my skepticism is showing). After all, if we can engineer devices, certainly we can do the same with people (sarcasm intended).
Yet time and motion studies, and Taylor’s scientific management, are explicitly geared toward repeating the same limited movements over and over again, perhaps to save a fraction of a second on each assembly. Certainly those fractions added up over time, but they also explicitly prohibited the workers themselves from experimenting to determine improvements in the process as a whole. When we treat people like robots, they become robots. Surprise!
I am reminded of a story from my studies in psychology, in which pigeons were trained to sort pharmaceuticals based on the color of the pills. Fortunately, the ASPCA came to their rescue, claiming animal cruelty. The task was ultimately returned to humans, for whom apparently it did not represent cruelty.
Yes, actively turning humans into robots has a long history in engineering and management. To be fair, I think it is mostly well-meant, but incredibly demeaning and ultimately counterproductive. But I don’t think there is any surprise here.
Those of you who have read me know that I am a runner. That is a relatively recent development; it was only the acquisition of a Fitbit that put me on the path to consistency in workouts. I am quantitatively motivated, and the numbers become a sort of game with me. But I don’t allow them to take over my life, as apparently we are letting happen to many, many others.
I Think the Written Word Can Take Care of Itself
June 14, 2016 · Posted by Peter Varhol in Technology and Culture.
Facebook thinks that the written word is dead, to be replaced in its entirety by videos. It claims that videos convey more information.
Well, maybe. I am no longer willing to bet against Facebook, even as it gets more asinine every year. I am a mere human, while it is an unstoppable force. And it is correct from a simple number-of-bits standpoint.
But information is more than bits and bytes. Video is art, too, but blending words in the right ways lets people use their imagination to build alternative realities. It exercises our minds in ways that video cannot.
To be fair, I am the one who laughs at people who still cling desperately to the printed word on paper. I have heard “I just like the feel of paper in my hands,” which seems to me nonsensical. But the written word, apart from defining in world history what it means to be human, is something that stimulates us to imagine very different things, something that video is ill-equipped to do.
We may go all the way to a video society, although I hope not. I am reminded of an old speculative fiction story by the late, great Isaac Asimov (“The Feeling of Power”) that envisions a future society in which mathematics is done entirely by calculator, and pencil-and-paper (or mental) mathematics is a long-forgotten art. So when someone brings it back, that person is looked upon with both wonder and suspicion.
It frightens me, though, to think that Facebook actually has the wherewithal to make this happen, if it furthers the company’s business goals. We should not give up the written word just because Facebook says we should.
I don’t think this comes as any surprise to anyone, but Facebook is shallow. Its fundamental problem is that it also promotes shallowness as communication. If we succumb to Facebook, we fail to connect as humans.