
I’m Talking About You, Uber September 10, 2015

Posted by Peter Varhol in Software platforms, Technology and Culture.

The question is what I am talking about. You are so not about a “sharing economy”. Virtually none of your drivers are sharing their everyday cars, in the course of their normal day-to-day business, to accommodate the occasional rider. Instead, they are buying extra cars to turn themselves into the modern equivalent of the taxi driver, without the taxi provided by the company. Calling this the sharing economy is a dangerous misnomer. This is a service, just like a taxi is a service.

But that’s okay, even if you’re not honest about it. At the same time, that’s the drivers’ decision to make. I don’t think anyone is forcing them into what is essentially a part-time business. And most taxi drivers are so-called independent contractors anyway, charged by the taxi company for the use of the car. I am not clear on the economics, but it must work for many.

And certainly the occasional local ride concept was due for some significant disruption. Taxi service is fundamentally stuck in operations that are at least half a century old. It doesn’t work for the consumer any more. Uber works better for the rider (mostly), and can have some advantages for the driver, as well as some disadvantages, depending on decisions made by individual drivers.

The technology makes a difference. You no longer have to call a taxi company; instead, you signal from the app, tell them where you are and where you want to go, and they are generally pretty responsive.
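To make that concrete, here is a minimal sketch, in Python, of the dispatch idea behind such an app: the request carries where you are and where you want to go, and the service matches the nearest available driver. Every name and coordinate here is hypothetical, and a real service would match on road distance and ETAs rather than straight-line distance.

    import math

    # Hypothetical ride request: pickup point and destination, as the app
    # would send them when you signal for a ride.
    request = {"pickup": (42.3601, -71.0589), "dest": (42.3736, -71.1097)}

    # Hypothetical pool of available drivers and their last reported positions.
    drivers = {"d1": (42.3505, -71.0659), "d2": (42.3662, -71.0621)}

    def approx_km(a, b):
        """Rough distance between two (lat, lon) points; fine at city scale."""
        dlat = (a[0] - b[0]) * 111.0  # ~111 km per degree of latitude
        dlon = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    # Dispatch the closest available driver to the pickup point.
    best = min(drivers, key=lambda d: approx_km(drivers[d], request["pickup"]))
    print(best, round(approx_km(drivers[best], request["pickup"]), 2), "km away")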

But the technology only enables the work shift you are attempting. My short-term guess: you will continue to be pretty successful, because almost everyone who uses taxis also uses a smartphone. My long-term guess: this is a transitional technology that will be put out of business decisively by the driverless car. I’m sure you’ve thought of that, and are looking to eliminate the middleman; i.e., the driver. This ultimately isn’t a new model for employment, or the so-called sharing economy. You will be first in line for the mass-produced Google car.

I’m not criticizing that, but I am criticizing your fundamental dishonesty in long term goals. You are not about the worker or the so-called sharing economy. You are about the disruption. You continue to lie, but that’s what you’ve done since your inception.

Somehow I Became an Athlete August 17, 2015

Posted by Peter Varhol in Technology and Culture, Uncategorized.

Just over a year ago I got a Fitbit. The quantitative feedback afforded by that simple device started me along a path to walking, then running. I ran (mostly walked) my first formal race just about a year ago, a 5K. I’ve run in a handful of others since, and the last two have been highly satisfying from a personal standpoint; the most recent was satisfying from a time standpoint as well.

I was in the hospital just over three months ago, with a dire diagnosis. In a discussion early one morning with one of my doctors, going over my options (I had few, if any at the time), he remarked, “Well, you’re an athlete.”

An athlete? I must have looked puzzled.

He was as nonplussed as I. “You run. You said you ran over three miles yesterday.”

Well, I did, but I didn’t think that made me an athlete. Apparently it did, at least in relation to just about anyone else in my situation (and even my doctors). He explained that exercising gave me a leg up on any surgery I might need, because I was in better shape for recovery.

I declined the major surgery several doctors had recommended, and today, it looks like I don’t need it.

Today, I am increasing my distance, to four or five miles. I have my second 10K run on the horizon, and am now thinking that under the right circumstances, I may actually be able to run a half marathon.

It’s like being a recovering alcoholic, really. I can fall off that wagon too easily. But just maybe I’m getting there. I seem to have redemption possibilities.

The Microsoft Band Delivers – Mostly July 22, 2015

Posted by Peter Varhol in Software platforms, Technology and Culture.

I got a Microsoft Band. I was looking for my next step up in activity wearables, and liked what I read about it on the website. At $200, it is much more than an activity tracker. It includes a GPS, like more expensive sports watches, and integration with your phone lets it notify you of calls, text messages, and other phone activity.

When I got it, my first (and pretty much only) disappointment was that it didn’t sync with my phone’s version of Android (4.1.3); the Band supports only 4.3 and above. My phone wouldn’t allow an upgrade to a supported version. Coincidentally (really), I bought a new phone later in the week, an LG G4, running Android 5.0 (Lollipop).

But the fact of the matter is that the system requirements weren’t clear or obvious, which is a drastic change from older PC-based software. I suppose it is difficult to test on every phone and OS version, but this isn’t what I expect from software, even in the era of the smartphone.

But within a couple of days, I came to really like the Band. On my first night, an Amber alert went out in my area. My phone buzzed, but the Band let me know about it, even including the text message. You can configure it to show incoming calls, texts, and even emails. Its ease of configurability is really good, much better than most watches or other wearables. I now depend on it as my first notification of calls when my phone isn’t physically on my person.

And the GPS-based activity tracker is remarkably easy to use and obtain data from. I didn’t read any documentation, yet was able to use it with my running routine within seconds. The results are displayed on the Microsoft Health app, and are exceptionally easy to understand and interpret.
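For the curious, the arithmetic behind a GPS distance readout is simple: sum the great-circle distances between consecutive position fixes. Here is a minimal Python sketch using the standard haversine formula, with made-up coordinates; this illustrates the general technique, not anything specific to the Band or Microsoft Health.

    import math

    def haversine_km(p, q):
        """Great-circle distance in km between two (lat, lon) points."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(a))  # 6371 km = mean Earth radius

    # Hypothetical GPS fixes from a run, in recorded order.
    trace = [(42.3601, -71.0589), (42.3612, -71.0570), (42.3630, -71.0552)]

    # Total distance is the sum of the legs between consecutive fixes.
    total = sum(haversine_km(a, b) for a, b in zip(trace, trace[1:]))
    print(f"{total:.2f} km")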

One other minor annoyance – the touch panel simply doesn’t work with a sweaty finger. After a particularly humid run, I attempted to stop my run session, and it simply wouldn’t do so until I dried off my fingers. This limitation may be driven by purely physical reasons, but it makes me think that Microsoft’s user experience (UX) testing wasn’t as good as it could have been.

I find it disappointing that Microsoft can deliver a reasonably compelling product, yet not effectively market or promote it. Apple is rumored to have sold around five million Apple Watches in the product’s first quarter, with very mixed reviews, yet the downloads for Microsoft Health (required to use the Band) in about a full year are under a hundred thousand, at least on Android. I’m not a Microsoft fanboy by any means, but I do acknowledge when it produces good products.

The Microsoft Band is a good product for people who are seeking the next level up from the Fitbit and other low-end devices, and would be useful to many more people than currently use it. I don’t know just when Microsoft ceased being a marketing monster, but it clearly fails with the Band. Make no mistake – the technology and products remain very good, even outside of the PC space, but Microsoft lost its marketing mojo at some point, and doesn’t seem interested in getting it back.

Is Emoji a Universal Language? May 22, 2015

Posted by Peter Varhol in Technology and Culture.

I was prompted to consider this question by a recent article in The Wall Street Journal, which claims that the use of these pictographic characters is growing, and that they are increasingly being used for entire sentences and even messages.

Emoji grew as a way of adding, well, emotion to otherwise dry, text-based email communications. As other channels emerged, emoji migrated to Twitter, Snapchat, and a variety of other communications platforms.

And the number of emoji characters has expanded. It’s now possible to express complete thoughts, and even form sentences, using emoji characters. Of course, calling it a “language” is a bit of a misnomer; emoji is loosely defined, and has variations between devices and fonts. It also lacks certain parts of what we traditionally consider a grammar – articles, adjectives, and adverbs, for example.
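That loose definition is visible at the code level: a single displayed emoji is just a sequence of Unicode code points, and what you see depends on the font that renders them. A small Python sketch (the particular emoji are arbitrary examples):

    import unicodedata

    # A thumbs-up with a skin tone is two code points; a "family" emoji is
    # three people glued together with zero-width joiners. A font that lacks
    # the combined glyph falls back to showing the pieces, which is why the
    # same message can look different on different devices.
    for text in ("\U0001F44D\U0001F3FD",                         # thumbs up + skin tone modifier
                 "\U0001F468\u200D\U0001F469\u200D\U0001F467"):  # man ZWJ woman ZWJ girl
        for ch in text:
            print(f"U+{ord(ch):04X}", unicodedata.name(ch, "(unnamed)"))
        print("---")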

By and large, emoji is a good thing. Most interpersonal communication is delivered non-verbally, through volume, tone, or body language. For strictly written communication, emoji can add a layer of emotion that we consciously want to convey.

Of course, that opens the door to a couple of disadvantages. First, we have to consciously add those emotions to our written texts, whether or not we are actually feeling them. We may, in fact, be feeling something completely different, but hide that through the use of emoji. The message recipient doesn’t observe us directly, so it’s impossible to tell.

Second, there are clearly cultural differences in emoji. The practice started in Japan, and there are a number of Japanese emoji characters that have no meaning in other cultures. In some cases, the emotion isn’t clear from the character unless you were born and raised in Japan. Certainly the same must be true of other cultures.

So emoji isn’t a universal language. In fact, it can be a language for further hiding and deception.

But it does show that even our driest communications can have a human side. And in interactions that are more and more electronic, that can’t be a bad thing.

Has the College Season Changed? April 5, 2015

Posted by Peter Varhol in Education, Technology and Culture.

I first went to college over a generation ago, as one of the first in my extended family to attend (my sister was first, followed by a cousin, but college was an option only for my generation; my parents never finished high school). It was a haphazard process; there wasn’t anyone to turn to for advice, and of course it was pre-Internet.

So, thirty-plus years ago, you would apply to maybe three to five schools, because each school had an application fee of anywhere between $25 and $100 (I shouldn’t be embarrassed to say that $100 meant whether or not our family was going to eat that week). Plus, each application was several pages long, and easily required several hours to write out by hand.

You found out about colleges the old-fashioned way, by thumbing through catalogs and brochures in the school library or guidance office. Your research was hardly comprehensive or unbiased, so you may well have ended up with a handful of schools that were nowhere near the best selections.

You may or may not have gotten accepted to all of them. Of those you were accepted to, you might want to visit one or two campuses, and not in this way.

Some things have changed in the intervening 30 years. The application process is typically online, and the fees tend to be more reasonable (at least in adjusted dollars). College visits tend to be more organized, with specific days set aside for group activities.

Many things do not appear to have changed. The timeframe for application and admissions seems to be about the same as it was then, although I would imagine that decisions can be made a lot more quickly by analyzing data on applicants. And there is just as much emphasis on campus visits and feeling “comfortable” with the campus and (to a much lesser extent) the people.

There is some justification for that. Teens are likely leaving home for the first time, and there is good reason for them to be emotionally and physically comfortable with that decision.

But in an era where college tuition has increased at double the rate of inflation through most of my adult life, and universities drag their feet in moving forward, I’m not sure that’s the deciding factor any more. Cost, utility, and flexibility may have overtaken the physical plant as the primary means of deciding on a college. There are many ways to begin and complete a degree, and the traditional four years (or more) residing on a campus seem, at least to me, to be less important than they were 30+ years ago.
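A quick worked example of what “double the rate of inflation” compounds to; the 3% inflation rate, 6% tuition growth, and $10,000 starting tuition are assumed round numbers for illustration, not sourced figures:

    # If inflation runs 3% a year and tuition rises 6%, then over 30 years a
    # $10,000 tuition bill grows to about $57,000 nominal -- still roughly
    # $23,700 after discounting back to today's dollars, i.e. it has more
    # than doubled in real terms.
    years, inflation, growth = 30, 0.03, 0.06
    nominal = 10_000 * (1 + growth) ** years
    real = nominal / (1 + inflation) ** years
    print(f"nominal: ${nominal:,.0f}, real: ${real:,.0f}")
    # -> nominal: $57,435, real: $23,662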

I’m not a parent, and I haven’t had to go through this process with teens. But I suspect that parents, my age to perhaps a decade younger, are projecting their own experiences on their college age children, and are encouraging them to make exactly the wrong decisions for this day and age.

I also raise this because I’m concerned about higher education. When I was a tenure-track professor, back in the day, I was discouraged by the total lack of imagination and innovation on college campuses. My department chair was convinced that we had perfected higher education twenty years earlier, and that no changes were necessary. If these are our best and brightest, I wonder just what direction they are leading the youth of today.

Of Fossil Fuels and Climate Change March 15, 2015

Posted by Peter Varhol in Technology and Culture.

I am not an energy or climate expert by any means. I guess I would call myself a reasonably intelligent layman with good education and experience in the scientific method and interpreting experimental results.

I’ll start with a couple of blanket statements. The Earth is probably undergoing climate change, and if so, it is likely at least partially the result of greenhouse gases. Notice that I express likelihoods, or possibilities, not definitive statements. No serious scientist, under just about any circumstances, would make flat and absolute statements about an ongoing research area, especially one as complex as this. When we do hear such statements, even from people who have scientific credentials, we should run away from them.

Second, it’s not at all clear that climate change is a bad thing. The world around us is not static, and if we expect things to remain as they are, we are deluding ourselves. We have had two documented “mini” Ice Ages over the past millennium, and those could clearly not be ascribed to human industrial intervention. After all, the Vikings were able to grow crops in southern Greenland until a cooling of the climate in the fourteenth century led them to abandon not only Greenland, but likely also Labrador and certainly Newfoundland.

In a longer sense, we may still be in an Ice Age that began over two million years ago.

If we are in the process of warming, it may be a part of the natural, well, circle of life. It is probably helped along by the heat-trapping effect of greenhouse gases, but it may or may not be unusual on the time scale of the planet.

But the idea that we can draw a conclusion based on a hundred years of data, while intriguing, may result in poorly thought out conclusions and remedies, despite the sensationalist (and entertaining) movies to the contrary.
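A toy illustration of that windowing problem, not a climate model: simulate a slow thousand-year oscillation plus noise, then fit a straight line to individual hundred-year windows. The “trend” each window reports depends entirely on where in the longer cycle it happens to fall.

    import math, random

    random.seed(1)

    # A slow 1,000-year cycle with year-to-year noise on top.
    series = [math.sin(2 * math.pi * t / 1000) + random.gauss(0, 0.3)
              for t in range(1000)]

    def slope(ys):
        """Least-squares slope of ys against 0..n-1."""
        n = len(ys)
        xbar, ybar = (n - 1) / 2, sum(ys) / n
        num = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys))
        return num / sum((x - xbar) ** 2 for x in range(n))

    # The same series yields "warming" or "cooling" depending on the window.
    for start in (0, 200, 400, 600):
        print(f"years {start}-{start + 99}: {slope(series[start:start + 100]):+.4f} per year")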

And I know there are winners and losers in this process. On the positive side, we may be able to grow tropical crops farther north in the Temperate Zone, ultimately being able to feed more of the planet. On the negative side, countries such as Tuvalu may ultimately be mostly flooded by a rise in sea level. While I feel for the 11,000 people in Tuvalu, I may feel more for the greater number of people we are able to feed.

All that said, I liked this article on the necessity of fossil fuels in the weekend Wall Street Journal. While it represents a biased point of view, it is likely no more biased than an opposing point of view.

It’s a good thing that we are looking toward energy sources that don’t burn fossil fuels. But let’s not delude ourselves into believing that climate change won’t happen anyway; it’s simply the way massive complex systems work. We called it catastrophe theory in the 1970s; now it goes by other names.

I recognize that others may disagree with my statements, and perhaps stridently. And I certainly agree that we should continue to explore renewable sources of energy (even if they are not renewable over a sufficiently long time scale). But this is a more difficult question than the popular media has been asking over the course of the last decade or so.

A Youth Guide to Digital Privacy March 14, 2015

Posted by Peter Varhol in Technology and Culture.

I’m an old(er) guy. I was well of age when Tim Berners-Lee published his seminal work on hypertext, and was probably on the leading edge of non-academic users when I loaded a third-party TCP/IP package (NetManage) onto my Windows 3.1 installation, connected to an Internet provider, and got onto the World Wide Web (hint: it wasn’t easy, and I know you just take this for granted today).

So I would like to offer the youth of today (to be fair, thirty years or more my junior, and I’m not sure why anyone would want to listen to me) some advice on navigating a digital life.

  1. Someone is always watching you. Seriously. If you are out of your residence and not already on CCTV, you will be within a few minutes.
  2. If your cell phone is on, anyone who cares knows where you are. If they don’t care at the moment, they will still know if it becomes important. If your cell phone is not on, the NSA can still find you.
  3. I’m not just talking about advertisers, with whom you may already have a relationship, or at least reached a détente. If it becomes important, you will be found, by authorities, friends, enemies, and spammers.
  4. Most important: if you do something bad, or stupid, you will almost certainly be found. Maybe prosecuted, more likely ridiculed, for the whole world to see if they desire. You may even be on a news website, not because what you did was in any way newsworthy, but because it offers offbeat or comic page views.
  5. You may or may not recover your life from that point.

I mention this because young people continue to do stupid things, just as they did when I was young. Those things may not have seemed stupid in my youth, when I did my share of them, but I wasn’t caught, because they couldn’t catch me. Trust me: anything you do in public today is either on camera or identifiable through a GPS trace.
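One concrete example: many photos are tagged with GPS coordinates in their EXIF metadata, and anyone can read them back out in a few lines. A minimal sketch using the Pillow library; the file name is hypothetical, and this assumes a reasonably recent Pillow, where EXIF rationals convert cleanly with float().

    from PIL import Image  # pip install Pillow

    def gps_coords(path):
        """Return the (lat, lon) embedded in a photo's EXIF data, or None."""
        exif = Image.open(path)._getexif() or {}
        gps = exif.get(34853)  # 34853 is the standard EXIF GPSInfo tag
        if not gps:
            return None
        def to_degrees(dms):  # (degrees, minutes, seconds) -> decimal degrees
            d, m, s = (float(x) for x in dms)
            return d + m / 60 + s / 3600
        # Keys 1-4 are latitude ref/value and longitude ref/value.
        lat = to_degrees(gps[2]) * (1 if gps[1] == "N" else -1)
        lon = to_degrees(gps[4]) * (1 if gps[3] == "E" else -1)
        return lat, lon

    print(gps_coords("photo.jpg"))  # hypothetical file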

You might not think you will do anything stupid in public. Chances are, if you are under 30, you have already done so (over 30, the odds reach certainty). Circa 1980, I could drive the wrong way down a freeway on-ramp, incredibly drunk, and not get caught unless there was a patrol car in the immediate vicinity (theoretical example; maybe). Don’t even think about it today.

Many people who have grown up with the Internet are seemingly comfortable with the positive aspects of universal connectivity, but don’t give any thought as to the other side of the coin.

Moral: We are being watched. And we don’t really understand the implications.

Do We Really Hate Science? February 25, 2015

Posted by Peter Varhol in Technology and Culture.

Despite the provocative title, the March cover story in National Geographic magazine, entitled The War On Science, is a well-conceived and thoughtful feature (in fairness, the website uses a much less controversial title – Why Do Many Reasonable People Doubt Science?). It points out that the making of accepted science isn’t something that happens overnight, but can take years, even decades, of painstaking work by researchers in different fields around the world before it solidifies into mostly accepted theory. Even with that, there are contrary voices, even within the scientific community.

I think the explanation is slightly off-base. I learned the scientific method fairly rigorously, but in a very imprecise science – psychology. The field has entire courses on statistics and experimental design at the undergraduate level, and labs where students have to put the scientific method into practice.

Still, because psychology is an imprecise science, I was frustrated that we were usually able to interpret outcomes, especially those in real life, in ways that matched our theories and hypotheses. But our explanations had no predictive power; we could not with any degree of confidence predict an outcome to a given scenario. That failure led me away from psychology, to mathematics and ultimately computer science.

It’s true that science is messy. Researchers compete for grants. They stake out research areas that are likely to be awarded grants, and often design experiments with additional grants in mind. Results are inconclusive, and attempts at replication contradictory. Should we drink milk, for example? Yes, but no. In general, the lay public tries to do the right thing, and the science establishment makes it impossible to know what that is.

And the vast majority of scientists who purport to explain concepts to the lay public are, I’m sorry, arrogant pricks. We have lost the grand explainers, the Jacques Cousteaus and Carl Sagans of past generations. Those scientists communicated first a sense of wonder and beauty, and rarely made grand statements about knowledge that brooked no discussion.

Who do we have today? Well, except for perhaps Bill Nye the Science Guy, no one, and I wouldn’t claim for a moment that Bill Nye is in anywhere near the same league as past giants.

The scientists who serve as talking heads on supposed news features and news opinions have their own agendas, which are almost invariably presented in a dour and negative manner. They are not even explaining anything, let alone predicting, and they certainly have no feel for the beauty and wonder of their work. Doom and catastrophe will be the end result, unless we do what they say we should do.

To be fair, this approach represents a grand alliance between the news agencies, which garner more attention when their message is negative, and the scientists, who promote their work as a way to gain recognition and obtain new grants.

In short, I would like to think that there is not a war upon science. Rather, there is a growing frustration that science is increasingly aloof, rather than participatory in larger society. Everything will be fine if you just listen to me, one might say. The next day, another says the opposite.

That’s not how science should be communicating to the world at large. And until science fixes that problem, it will continue to believe that there is a war on.

