Why is the American Ugly? January 23, 2017. Posted by Peter Varhol in Uncategorized.
In 1958, Eugene Burdick and William Lederer wrote a novel titled “The Ugly American”. I read it as a teen in the 1970s. (No, it was not required reading for school; I simply read a lot of different things at that point in my life.) It took place largely in Southeast Asia, and involved sincere but misguided American attempts to improve the lives of average people elsewhere in the world. Since then, and into today, the phrase has generally referred to insensitive and obnoxious Americans (mostly tourists) telling those in other cultures what they are doing wrong, based on their own perspective.
There is much to say here. I am, at this moment, returning from approximately a week in Europe, where I spoke at a conference. I have traveled to Europe three to five times a year for the last seven years. I realize that Europe isn’t the rest of the world, so interpret this as you will.
However you might feel about American culture and influence, it has become the gold standard of technology, entertainment and, well, art. As Neal Stephenson put it in Snow Crash, Americans are good at four things: music, movies, microcode (software), and high-speed pizza delivery. In Europe at least, you see a good measure of American influence in at least the first three of these. In at least some cases, it has overwhelmed the local culture.
The English language is the lingua franca. It is the language of aviation worldwide. Tour guides, hotel staff, and restaurants are almost required to understand and speak English. Some are upset with that state of affairs.
I am old enough to remember a time when Esperanto was supposed to be the universal language. But a language that does not well represent a practical reality has no chance of becoming universal.
You may argue that Americans refuse to speak another language. I will respectfully disagree. I took Spanish in high school, and Russian in college. I would like to communicate in those languages, and in others (most recently, this past week, German; well, and Slovak, the language of my past). Most Americans are required to take a language in secondary school and college. Unlike Europeans, however, we occupy so large a geographic area that we have little opportunity to use our learned languages, and they fall into disuse.
In short, I do not believe in the colloquial definition of the ugly American. Sure, a few of my compatriots are less than comprehending of the norms of a foreign culture. But there are certainly those from other countries (again, mostly tourists) who behave boorishly. Yet the world seems to hold Americans to another standard.
American movies, music, and microcode are overwhelming because they are, well, good. Or at least compelling to those who consume them.
I met many people who speak multiple languages and attempt to communicate with others (not just me). One on this trip was Mario, an Italian transplanted to Austria to be with his girlfriend. They spoke different native languages, he Italian and she German, but they found common ground in English. Would they have even met without English? For most, English is the least common denominator of communication.
We Americans are not ugly. We are just trying to do the best we can, like everyone else.
Maybe I Should Just Give In to the Facebook Juggernaut January 13, 2017. Posted by Peter Varhol in Technology and Culture.
Tags: Facebook, Zuckerberg
Despite the fact that Facebook keeps active a live stream of a 12-year-old committing suicide, yet pulls down a Pulitzer Prize-winning historical photograph, the vast majority of the US, and the world in general, seems copasetic with the decisions that Facebook makes about our lives.
I have serious reservations about Facebook, but even more about founder and CEO Mark Zuckerberg, who is reported to be looking into a run for the US Presidency in the next election. Someone who publicly says that we don’t want privacy in our lives, yet spends millions of dollars in property and legal fees to attempt to build a wall around his house, clearly talks out of both sides of his mouth. It is a classic case of “do what I say, not what I do.” I’ll actually take that one step further. Zuckerberg is telling the world that “I want my privacy. And I can afford it. You don’t, and you can’t.”
I have yet to ever sign up for Facebook, even though an increasing number of web properties are requiring Facebook user IDs to access their content. And of course, an increasing amount of interesting content is being posted exclusively on Facebook, available only to members. I still decline, but who am I against two billion other people?
I confess that my flabber is ghasted. Is it just me? Does no one else see what a heinous effect Facebook is having on our interactions with other people? What is it, really? I am starting to doubt my own judgment that Facebook is something I can rail against and achieve some modicum of, well, at least acknowledgement.
I’m asking, no begging. Can someone please explain the almost universal fascination with Facebook? And if we are concerned about Donald Trump as the US President, we should be horrified at the prospect that Mark Zuckerberg may succeed him. Imagine a world where we are all required to have Facebook accounts, and to post required information about ourselves.
I would like to think that I have many more years of my life in front of me. Yet I cannot see value in them in the world of Facebook.
Is Cursive Making a Comeback? January 4, 2017. Posted by Peter Varhol in Technology and Culture.
Tags: cursive, typing
Anne Quito reported on Quartz about the possibility that more states will adopt writing requirements that include learning cursive. I read, now almost five years ago, in The Wall Street Journal about how cursive instruction was being cut back or eliminated in several states. Anne notes that at least a few states may be reversing that trend.
I am at the other end of the spectrum of life, and have been removed from innovations in public school instruction for quite some time. Today I almost never write anything longhand, though I can sign my name. And I can read the Declaration of Independence, which is one of the stated reasons for studying cursive.
Beyond that, I don’t have a lot of skin in the game, but I find it fascinating that such a cultural icon as cursive may be struggling for survival. More so, I wonder what we may be losing if we lose cursive. That may be a biased question, in that I am assuming we are losing something. Others may be a bit more sanguine about the whole thing.
Are we losing the ability to read, in the original, significant writings of the past? It’s not clear to me that not writing in cursive is the same thing as not being able to read it, but if it is, yes, we are losing something tangible.
Less tangible, but every bit as real, is that we generally consider handwritten notes more personal and heartfelt than an electronic equivalent.
What’s even less clear is what we are gaining. The Wall Street Journal article from several years ago reported that the state of Indiana was going to stop teaching cursive, in favor of teaching typing. Really? I don’t mean to sound incredulous; well, yes, I do. I realize there are certain skills involved (mainly motor skills that don’t carry over from cursive), but I learned typing in about a semester of 50-minute classes, well enough to still do about 40 words a minute.
In 1958, the late, great Isaac Asimov wrote a short story about a world that had relied on computers for arithmetic for hundreds of years, and a man who rediscovered how to perform arithmetic longhand. The man was looked upon as a savant, and it gave him, as the title has it, “The Feeling of Power.”
Someday, somewhere, someone who can read historical documents may well have the same feeling.
Where Were You? December 28, 2016. Posted by Peter Varhol in Technology and Culture.
Tags: Carrie Fisher, Star Wars
I was nineteen years old, in Air Force officer basic training at Dover Air Force Base in Delaware, when the original Star Wars movie made it to the theaters. I had a 24-hour (actually less) pass, and on Saturday afternoon a group of us visited the local movie theater in Dover. The line stretched halfway around the block, but oddly, later that night the theater was only half full, and I saw my first Star Wars.
What to say about Star Wars? It was about the adventure. For those of us who grew up with science fiction, Isaac Asimov, Neil Armstrong, Apollo 13, and more, this was our vision of our future. A future that we would never see, but a future that we could dream about.
And, of course, this is really about Carrie Fisher, now passed before her time. I didn’t realize it, but she was only a year older than me. Nineteen years old in that movie. She had issues, certainly, and may not have been all that she could have been, but that describes just about all of us. She did more than most of us.
And she was a strong woman, as Princess Leia and as Carrie Fisher. As Leia, she showed us, in the 1970s, that women could be heroes. As Carrie, she showed us that you could be comfortable in your skin, no matter how famous. I’ve known many strong women, and I wish I had known her.
Bully for Carrie Fisher.
Alexa, Delete My Data December 25, 2016. Posted by Peter Varhol in Software platforms, Technology and Culture.
Tags: Alexa, data, privacy
As we become inundated this holiday season by Amazon ads for its Echo Dot voice system and Alexa artificially intelligent assistant, I confess I remain conflicted about the potential and reality of AI technology in our lives.
To be sure, the Alexa commercials are wonderful. For those of us who grew up under the influence of George Jetson (were they really only on TV for one season?), Alexa represents the realization of something that we could only dream about for the last 50+ years. Few of us can afford a human assistant, but the intelligent virtual assistant is a reality. The future is now!
It’s only when you think it through that it becomes more problematic. A necessary corollary to an intelligent virtual assistant is that the assistant has enough data about you to interpret what are at times ambiguous instructions. And because it holds that data, and current information about us, we can imagine issues with instructions like these:
“Alexa, I’m just going out for a few minutes; don’t bother setting the burglar alarm.”
“Alexa, turn the temperature down to 55 until January 15; I won’t be home.”
I’m sure that Google already has a lot of information on me. I rarely log into my Google account, but it identifies me anyway, so it knows what I search for. And Google knows my travel photos, through Picasa. Amazon also identifies me without logging in, but I don’t buy a lot through Amazon, so its data is less complete. Your own mileage with these and other data aggregators may vary.
To be fair, the US government currently and in the past has been in possession of an incredible amount of information on most adults. I have held jobs and am a taxpayer; I have a driver’s license (and pilot’s license, for that matter); I am a military veteran; and I’ve held government security clearances.
I’d always believed that my best privacy protection was the fact that government databases didn’t talk to one another. The IRS didn’t know, and didn’t care, whether or not my military discharge was honorable (it was). Yeah. That may have been true at one time, but it is changing. Data exchange between government agencies won’t be seamless in my lifetime, but it is heading, slowly but inexorably, in that direction.
And the commercial firms are far more efficient. Google and Facebook today know more about us than anyone might imagine. Third party data brokers can make our data show up in the strangest places.
And lest you mistake me, I’m not saying that this is necessarily a bad thing. There are tradeoffs in every action we take. Rather, it’s something that we let happen without thinking about it. We can come up with all sorts of rationalizations on why we love the convenience and efficiency, but rarely ponder the other side of the coin.
I personally try to think about the implications every time I release data to a computer, and sometimes decline to do so (take that, Facebook). And in some cases, such as my writings and conference talks, I’ve made career decisions that I am well aware make more data available on me. I haven’t yet decided on Alexa, but I am certainly not going to be an early adopter.
Of the Things that Surprise You in Life December 23, 2016. Posted by Peter Varhol in Technology and Culture.
Tags: Fitbit, running
I am a distance runner. I have been reluctant to admit that, even to myself. But after almost two and a half years of this, with a few weeks out for a hospital stay, I think it is a reasonable conclusion. There have been hundreds of 5AM excursions out into the dawn or (at this time of year in New England) pitch darkness that I think make me eligible to claim that mantle. I’ve done about 20 formal races over that period, mostly 5Ks, but also a couple of half marathons (I doubt that will ever happen again).
To be fair, even now, compared to some of those I see, I don’t feel particularly dedicated or determined. And I never thought I would reach this point. I could, and would, quit at any time. I’m not sure I feel particularly healthier as a runner (my doctors beg to differ). I am no more than a casual runner (maybe a little more), and will almost certainly not be anything more.
I am somewhat older; on my proximate birthday, I will be a sexagenarian (and no, it has nothing to do with sex, regrettably). How long can I keep this up? Oddly, in the winter it is more difficult to get out. However, I also run better when it’s colder.
And it is a surprise. I have no athletic history, and except for a brief burst of activity in my late 20s and early 30s, I have been pretty sedentary. I’m an office worker, after all.
I have to say, what motivated me was the numbers. Yes, I got a low-end Fitbit. After a day of wearing it, I saw my steps. I said, “Tomorrow I can do better.” And I did, and continued to do so.
I am a distance runner. I have a bit of obsessive-compulsive in me. But I don’t burn myself out in the process. The step counts don’t work for everyone. In a few cases, they drive users to excess. But in my case, it was just about perfect.
A Brave New World December 21, 2016. Posted by Peter Varhol in Technology and Culture.
Tags: technology, tragedy of the commons
As more and more sober people call attention to the dichotomy between the winners and losers of the information/technology economy, it’s still not at all clear that this issue concerns, or is even recognized by, those best positioned to do something about it. Many of us operate under the impression that advances are universally good, and that attempts to slow or stop such advances are universally bad.
I am not at all sure that the future of society will take care of itself. I am reminded of the old Sydney Harris cartoon.
We advance technology because it is fun, it is intellectually invigorating, and it will make us money. We acknowledge that in many cases we are disrupting the established order, and that in significant cases we may be the proximate cause of eliminating jobs and even entire industries. We justify that by saying that other jobs will arise to replace them.
Probably true; almost certainly true. But that process could take years, even decades. In the meantime, many lives will be disrupted as jobs and lifestyles disappear without a clear way forward.
We justify that by saying that every adult needs to be a lifetime learner, and become accustomed to multiple career shifts over the course of a lifetime. Again, true. But some are more capable at this than others, for a wide variety of reasons.
Well, those who are left behind deserve to be, right? Here is where the logic starts to break down. In a strict economic sense, that may be correct. But economics only models society at large, and only loosely (yes, I know the difference between macro and micro). Forces other than economics are at work, and economists don’t seem to want to model those forces at all. And the end result seems to be coming as a surprise to many.
In 1968, ecologist Garrett Hardin defined “The Tragedy of the Commons.” He noted that when there was a shared interest in a limited resource, it was in every person’s self-interest to use as much of that resource as they could, thus destroying the resource for all.
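Hardin’s dynamic is simple enough to sketch as a toy simulation. This is my own illustration, with invented numbers, not anything from his essay: a pasture that regrows a fixed fraction of whatever is left each season, and herders who each graze in their own self-interest.

```python
# Toy model of the tragedy of the commons. The pasture, herder count,
# grazing amounts, and regrowth rate are all invented for illustration.

def simulate(seasons, pasture, herders, take_per_herder, regrowth_rate):
    """Return the pasture size at the end of each season."""
    history = [pasture]
    for _ in range(seasons):
        # Each herder grazes what they can; nobody pays the shared cost.
        consumed = min(pasture, herders * take_per_herder)
        pasture -= consumed
        # Whatever remains regrows by a fixed fraction.
        pasture = round(pasture * (1 + regrowth_rate), 1)
        history.append(pasture)
    return history

# Modest grazing is sustained season after season; greedy grazing
# collapses the commons within two seasons, and everyone loses.
print(simulate(10, 100.0, 5, take_per_herder=4, regrowth_rate=0.25))
print(simulate(10, 100.0, 5, take_per_herder=12, regrowth_rate=0.25))
```

The point of the toy model is that no single herder’s restraint changes the outcome; only a shared agreement does, which is exactly why the commons is a tragedy rather than a miscalculation.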
What we have in society today is possibly approaching a tragedy of the commons. That’s not to say that economic value is a fixed resource, but however we grow it, it is finite. If all use the value to the best of their abilities, some will achieve great wealth. Others will lose out.
I strongly believe in advances in technology, and in capitalism. In general they benefit society, and make it wealthier and more secure in the aggregate. I also strongly believe in democratic processes. Others are welcome to disagree with my beliefs in technology and capitalism.
Beyond the intellectual stimulation and possibilities for profit, we in technology largely believe that we are building a better society. There are those who disagree with us, with some justification. The disagreements between these two forces may be getting closer to a head. If there is confrontation, I have no doubt that many of us will be among the first up against the wall when the revolution comes.
To be clear, I’m copasetic with being one of the haves (relatively speaking) in a have/have-not society (I also realize that society can turn on me in a heartbeat). I’m not nearly as copasetic with helping to create (in a very minor sense, but still) the have-nots. We can do better, and it is in our personal, economic, and societal interest to do better.
I wonder what Ayn Rand would say.
If You Want to Know Something About a Culture December 16, 2016. Posted by Peter Varhol in Technology and Culture, Uncategorized.
Know the words they are using. This simply fantastic infographic helps enormously in understanding how different words are used to reflect regional cultures in the US. Among the examples are different words for meals, food, cities, and, well, swear words. This is seriously good information for anyone who wants to understand how language makes us who we are. We don’t pay enough attention to that.