
Is Emoji a Universal Language? May 22, 2015

Posted by Peter Varhol in Technology and Culture.

I was prompted to consider this question by a recent article in The Wall Street Journal, which claims that the use of these pictograph characters is growing, and that they are increasingly being used for entire sentences and even whole messages.

Emoji grew as a way of adding, well, emotions to otherwise dry, text-based email communications. As other forms of digital communication emerged, emoji migrated to Twitter, Snapchat, and a variety of other platforms.

And the number of emoji characters expanded. It’s now possible to express complete thoughts, and even form sentences, using emoji characters. Of course, that is a bit of a misnomer; the emoji “language” is loosely defined, and has variations between devices and fonts. It also lacks certain parts of what we traditionally consider a grammar – articles, adjectives, and adverbs, for example.

By and large, emoji is a good thing. Most interpersonal communication is delivered non-verbally, through volume, tone, or body language. For strictly written communication, emoji can add a level of emotion that we consciously want to convey.

Of course, that opens the door to a couple of disadvantages. First, we have to consciously add those emotions to our written texts, whether or not we are actually feeling them. We may, in fact, be feeling something completely different, but hide that through the use of emoji. The message recipient doesn’t observe us directly, so it’s impossible to tell.

Second, there are clearly cultural differences in emoji. The practice started in Japan, and there are a number of Japanese emoji characters that have no meaning in other cultures. In some cases, the emotion isn’t clear from the character unless you were born and raised in Japan. Certainly the same must be true of other cultures.

So emoji isn’t a universal language. In fact, it can be a language for further hiding and deception.

But it does show that even our driest communications can have a human side. And in interactions that are more and more electronic, that can’t be a bad thing.

On Certifications, Knowledge, and Competence April 29, 2015

Posted by Peter Varhol in Software development.

There is an ongoing, and recently growing, controversy surrounding the ways that testers and other software professionals demonstrate their competence. In particular, the controversy centers on the certifications offered by the world’s largest provider of certification services, the ISTQB, and its US arm, the ASTQB.

The primary criticism of the ISTQB seems to be that the use of a multiple-choice test for certification trivializes testing knowledge and judgment. In addition, some see the tutorial offerings of its supporters as an unnecessary and unwanted commercial advantage in training for this specific certification. There are currently online petitions demanding that the ISTQB release its own data assessing the validity of the certification.

While both of those criticisms have some truth, Cem Kaner notes in a comprehensive post that the ISTQB seems intent on measuring and improving its certifications, and he supports its right to examine its data and institute those improvements in private. As long as it is not making demonstrably false statements about the knowledge gained as a result of certification, the organization has the legal and ethical right to improve its exams. Kaner does not claim to have reviewed all of the organization’s marketing materials, but he does assert that he sees no apparently false claims.

Kaner, one of the true deans of software testing, is right. I have referred to him as our “adult supervision” on Twitter. But the argument goes beyond his level-headed analysis of what the ISTQB offers and does not offer to testers.

We are fortunate to work in a field where there are a lot of ways of adding value. Some write code to build software, and write unit tests to verify how they think discrete parts of the code should work. Others work closely with developers to understand the underlying structure of the application and develop tests that reflect its requirements. Still others are domain experts, fighting for the user community and possessing a keen understanding of what is needed beyond the stated requirements to deliver a quality product.

Measuring all of those skill sets in a single exam, whether multiple choice, single correct answer, or even in essay format, seems incongruous and indeed impossible. Yet a broad range of people and skills can and do help in determining whether an application meets the needs of the business and has the necessary quality to deploy.

Part of the larger problem is that some employers expect ISTQB certification as a prerequisite for a testing job, which distorts its value in the marketplace. Another part is that ISTQB marketing, while not demonstrably false, might be interpreted as misleading.

Kaner says that while the ISTQB certification lacks many essentials, we have not yet been able to devise anything better. He’s not happy with ISTQB certification, but for technical rather than business reasons. And he’s smart enough to know that there isn’t anything better right now, although he is hoping for alternatives to develop over time. He would prefer to expend energy in improving possible certifications, rather than fighting over the relative value of this particular one. That’s a position that’s hard to argue with.

How Mature Are Your Testing Practices? April 22, 2015

Posted by Peter Varhol in Software development, Strategy.

I once worked for a commercial software development company whose executive management decided to pursue the Software Engineering Institute’s (SEI) CMMI (Capability Maturity Model Integration) certification for its software development practices. It hired and trained a number of project managers across multiple locations, documented everything it could, and slowed down its schedules so that teams could learn and practice the new documented processes and collect the data needed to improve them.

There were good reasons for this software provider to try to improve its practices at the time. It had a quality problem: thousands of known defects were going unaddressed and shipping into production, and its customers were not happy with the situation.

However, this new initiative didn’t turn out so well, as you might imagine. After spending millions of dollars over several years, the organization eventually achieved CMMI Level 2 (the activities were repeatable). It wasn’t clear that quality improved, although it likely would have become incrementally better over a longer period of time. But time moved on, and CMMI certification ceased to have the cachet that it once did. Today, in a stunning reversal of strategy, this provider now claims to be fully committed to Agile development practices.

This is a cautionary tale for any software organization that looks to a specific process as a solution to its quality or delivery issues. A particular process or discipline won’t automatically improve your software. In the case of my previous employer, CMMI added burdensome overhead to a software supplier that also needed to respond more quickly to changing technologies.

There are a number of different maturity models that claim to enable organizations to develop and extend processes that can make a difference in software quality and delivery. The SEI’s CMMI is probably the best known and most widely used. There is also a testing maturity model, which applies principles similar to CMMI’s to the testing realm. And software tools vendor Coverity has recently released its Development Testing Maturity Model, which outlines a phased approach to adopting development testing, and claims to better support a DevOps strategy.

All of these maturity models, in moderation, can be useful for software projects and organizations seeking to standardize and advance the maturity of their project processes. But they don’t automatically improve quality or on-schedule delivery of a product or application.

Instead, teams and organizations should build a process that best reflects their business needs and culture, and then continue to refine that process as needs change to ensure that it continues to improve their ability to deliver quality software. It’s not as important to develop a maturity model as it is to identify your process, customize your ALM tools for that process, and make sure your team is appropriately trained in using it.

Has the College Season Changed? April 5, 2015

Posted by Peter Varhol in Education, Technology and Culture.

I first went to college over a generation ago, as one of the first of my extended family to attend college (my sister was first, followed by a cousin, but college was an option only for my generation; my parents never finished high school). It was a haphazard process; there wasn’t anyone to turn to for advice, and of course it was pre-Internet.

So, thirty-plus years ago, you would apply to maybe three to five schools, because each school had an application fee of anywhere between $25 and $100 (I shouldn’t be embarrassed to say that $100 determined whether or not our family was going to eat that week). Plus, each application was several pages long, and easily required several hours to write out by hand.

You found out about colleges the old-fashioned way, by thumbing through catalogs and brochures in the school library or guidance office. Your research was hardly comprehensive or unbiased, so you may well have ended up with a handful of schools that were nowhere near the best selections.

You may or may not have gotten accepted to all of them. Of those you were accepted to, you might want to visit one or two campuses, and not in this way.

Some things have changed over the intervening 30 years. The application process is typically online, and the fees tend to be more reasonable (at least in adjusted dollars). College visits tend to be more organized, with specific days set aside for group activities.

Many things do not appear to have changed. The timeframe for application and admissions seems about the same, although I would imagine that decisions can be made a lot more quickly by analyzing data on applicants. And there is just as much emphasis on campus visits and feeling “comfortable” with the campus and (to a much lesser extent) the people.

There is some justification for that. Teens are likely leaving home for the first time, and there is good reason for them to be emotionally and physically comfortable with that decision.

But in an era where college tuition has increased at double the rate of inflation through most of my adult life, and universities drag their feet in moving forward, I’m not sure that’s the deciding factor any more. Cost, utility, and flexibility may have overtaken the physical plant as the primary means of deciding on a college. There are many ways to begin and complete a degree, and the traditional four years (or more) residing on a campus seem, at least to me, to be less important than they were 30+ years ago.

I’m not a parent, and I haven’t had to go through this process with teens. But I suspect that parents, my age to perhaps a decade younger, are projecting their own experiences on their college age children, and are encouraging them to make exactly the wrong decisions for this day and age.

I also raise this because I’m concerned about higher education. When I was a tenure-track professor, back in the day, I was discouraged by the total lack of imagination and innovation on college campuses. My department chair was convinced that we had perfected higher education twenty years earlier, and that no changes were necessary. If these are our best and brightest, I wonder just what direction they are leading the youth of today.

Of Fossil Fuels and Climate Change March 15, 2015

Posted by Peter Varhol in Technology and Culture.

I am not an energy or climate expert by any means. I guess I would call myself a reasonably intelligent layman with good education and experience in the scientific method and interpreting experimental results.

I’ll start with a couple of blanket statements. The Earth is probably undergoing climate change, and if so, it is likely at least partially the result of greenhouse gases. Notice that I express likelihoods, or possibilities, not definitive statements. No serious scientist, under just about any circumstances, would make flat and absolute statements about an ongoing research area, especially one as complex as this. When we do hear such statements, even from people with scientific credentials, we should run away from them.

Second, it’s not at all clear that climate change is a bad thing. The world around us is not static, and if we expect things to remain as they are, we are deluding ourselves. We have had two documented “mini”-Ice Ages over the past millennium, and those clearly could not be ascribed to human industrial intervention. After all, the Vikings were able to grow crops in southern Greenland until a cooling of the climate later in the Middle Ages led them to abandon not only Greenland, but likely also Labrador and certainly Newfoundland.

In a longer sense, we may still be in an Ice Age that began over two million years ago.

If we are in the process of warming, it may be a part of the natural, well, circle of life. It is probably helped along by heat trapped by greenhouse gases, but it may or may not be unusual on the time scale of the planet.

But drawing a conclusion from only a hundred years of data, however intriguing, can lead to poorly thought out conclusions and remedies, despite the sensationalist (and entertaining) movies to the contrary.

And I know there are winners and losers in this process. On the positive side, we may be able to grow tropical crops farther north in the Temperate Zone, ultimately being able to feed more of the planet. On the negative side, countries such as Tuvalu may ultimately be mostly flooded by a rise in sea level. While I feel for the 11,000 people in Tuvalu, I may feel more for the greater number of people we are able to feed.

All that said, I liked this article on the necessity of fossil fuels in the weekend Wall Street Journal. While it represents a biased point of view, it is likely no more biased than an opposing point of view.

It’s a good thing that we are looking toward energy sources that don’t burn fossil fuels. But let’s not delude ourselves into believing that climate change won’t happen anyway; it’s simply the way massive complex systems work. We called it catastrophe theory in the 1970s; now it goes by other names.

I recognize that others may disagree with my statements, and perhaps stridently. And I certainly agree that we should continue to explore renewable sources of energy (even if they are not renewable over a sufficiently long time scale). But this is a more difficult question than the popular media has been asking over the course of the last decade or so.

A Youth Guide to Digital Privacy March 14, 2015

Posted by Peter Varhol in Technology and Culture.

I’m an old(er) guy. I was well of age when Tim Berners-Lee published his seminal work on hypertext, and I was probably on the leading edge of non-academic users when I loaded a third-party TCP/IP package (NetManage) onto my Windows 3.1 installation and connected through an Internet provider to the World Wide Web (hint: it wasn’t easy, and I know you just take this for granted today).

So I would like to offer the youth of today (to be fair, thirty years or more my junior, and I’m not sure why anyone would want to listen to me) some advice on navigating a digital life.

  1. Someone is always watching you. Seriously. If you are out of your residence and not already on CCTV, you will be within a few minutes.
  2. If your cell phone is on, anyone who cares knows where you are. If they don’t care at the moment, they will still know if it becomes important. If your cell phone is not on, the NSA can still find you.
  3. I’m not just talking about advertisers, with whom you may already have a relationship, or at least reached a détente. If it’s important, you will be found, by authorities, friends, enemies, and spammers.
  4. Most important: if you do something bad, or stupid, you will almost certainly be found. Maybe prosecuted, more likely ridiculed, for the whole world to see if they desire. You may even be on a news website, not because what you did was in any way newsworthy, but because it offers offbeat or comic page views.
  5. You may or may not recover your life from that point.

I mention this because young people continue to do stupid things, just as they did when I was young. They may not have seemed stupid in my youth, when I did my share of them, but I wasn’t caught, because they couldn’t catch me. Trust me; anything you do in public today is either on camera or identifiable through a GPS trace.

You might not think you will do anything stupid in public. Chances are, if you are under 30, you have already done so (over 30, the odds reach certainty). Circa 1980, I could drive the wrong way down a freeway on-ramp, incredibly drunk, and not get caught unless there was a patrol car in the immediate vicinity (theoretical example; maybe). Don’t even think about it today.

Many people who have grown up with the Internet are seemingly comfortable with the positive aspects of universal connectivity, but don’t give any thought as to the other side of the coin.

Moral: We are being watched. And we don’t really understand the implications.

The Challenges of Concurrency in Software March 12, 2015

Posted by Peter Varhol in Software development, Software tools.

I learned how to program on an Apple Macintosh, circa the late 1980s, using Pascal and C. As I pursued graduate work in computer science, I worked with Lisp and Smalltalk, running on the Motorola 680X0 and eventually the Intel architecture.

These were all single-threaded programs, meaning that they executed sequentially, one step at a time. As a CS grad student, and later as a university professor, I learned and taught about multi-threading and concurrent code execution.

But it was almost entirely theoretical. Until the turn of the century, almost no code was executed in parallel. Part of the reason was that only the most sophisticated computers could execute instructions in parallel. Even as Intel and other processor makers moved decisively into multi-core architectures, operating systems and programmers weren’t ready to take advantage of this hardware innovation.

But only by taking advantage of multi-core and hyper-threaded processors could developers improve the performance of increasingly complex applications. So, aided by modern programming languages such as Java and C#, developers have been cautiously working on applications that can take better advantage of these processors.

To do so, they’re dusting off their old textbooks and looking at concepts like “critical section”, “fork”, and “join”. They are deeply examining their code to determine which operations can occur simultaneously without producing errors.
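For readers like me who haven’t opened those textbooks in a while, here is a minimal Java sketch of both ideas (the class and method names are mine, purely for illustration): a synchronized block guarding a critical section around shared state, and a fork/join task that splits a computation across cores.

    import java.util.concurrent.ForkJoinPool;
    import java.util.concurrent.RecursiveTask;

    public class ForkJoinSketch {

        // Critical section: only one thread at a time may update the shared counter.
        private static final Object lock = new Object();
        private static long counter = 0;

        static void increment() {
            synchronized (lock) {   // enter the critical section
                counter++;
            }                       // leave the critical section
        }

        // Fork/join: recursively split an array sum into halves that run in parallel.
        static class SumTask extends RecursiveTask<Long> {
            private final long[] data;
            private final int lo, hi;

            SumTask(long[] data, int lo, int hi) {
                this.data = data;
                this.lo = lo;
                this.hi = hi;
            }

            @Override
            protected Long compute() {
                if (hi - lo <= 1_000) {              // small enough: sum sequentially
                    long sum = 0;
                    for (int i = lo; i < hi; i++) sum += data[i];
                    return sum;
                }
                int mid = (lo + hi) / 2;
                SumTask left = new SumTask(data, lo, mid);
                SumTask right = new SumTask(data, mid, hi);
                left.fork();                          // run the left half on another thread
                long rightSum = right.compute();      // compute the right half here
                return left.join() + rightSum;        // wait for the left half, then combine
            }
        }

        public static void main(String[] args) {
            long[] data = new long[1_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i;
            long total = ForkJoinPool.commonPool().invoke(new SumTask(data, 0, data.length));
            System.out.println("Sum = " + total);
        }
    }

The point is not the arithmetic; it is that the programmer has to decide explicitly where work can be split and where access to shared data must be serialized.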

To be fair, several tools came out in the mid-2000s claiming the ability to automatically parallelize existing code, mostly by analyzing the code and trying to parcel out threads based on an expectation that certain code segments can be parallelized. In practice, there was not a lot of code that could safely be parallelized in this way.

But most new applications are multithreaded, and the operating system can dispatch threads to different cores and CPUs for concurrent execution. Using today’s processors, this is the only way to get the best performance out of modern software.

The problem is that developers are still fumbling their way through the process of writing code that can execute in parallel. Two classes of errors stand out. One is deadlock, where code can’t continue because another thread won’t give up a resource, such as a piece of data. This stops execution altogether.
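Here is a contrived Java sketch of a deadlock (again, the names are hypothetical): each thread grabs one lock and then waits forever for the lock the other thread holds.

    public class DeadlockSketch {

        private static final Object resourceA = new Object();
        private static final Object resourceB = new Object();

        public static void main(String[] args) {
            // Thread 1 takes A, then wants B.
            Thread t1 = new Thread(() -> {
                synchronized (resourceA) {
                    pause(100);                     // give the other thread time to grab B
                    synchronized (resourceB) {
                        System.out.println("Thread 1 acquired both resources");
                    }
                }
            });

            // Thread 2 takes B, then wants A (the opposite order).
            Thread t2 = new Thread(() -> {
                synchronized (resourceB) {
                    pause(100);
                    synchronized (resourceA) {
                        System.out.println("Thread 2 acquired both resources");
                    }
                }
            });

            t1.start();
            t2.start();
            // Each thread now holds one lock and waits for the other: execution stops for good.
        }

        private static void pause(long millis) {
            try { Thread.sleep(millis); } catch (InterruptedException ignored) { }
        }
    }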

Another is the race condition, where the result depends upon which thread completes first. This is insidious because the incorrect result can appear at random, and it is often difficult to detect because it may not produce an obvious error.
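And here is a sketch of a race condition: two threads increment an unsynchronized counter, so updates are silently lost and the total differs from run to run, without ever raising an error.

    public class RaceConditionSketch {

        private static int counter = 0;   // shared, unsynchronized state

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) {
                    counter++;            // read-modify-write is not atomic; updates can be lost
                }
            };

            Thread t1 = new Thread(work);
            Thread t2 = new Thread(work);
            t1.start();
            t2.start();
            t1.join();
            t2.join();

            // Expected 200000, but the printed value varies from run to run,
            // and no exception is ever thrown; that is what makes races so hard to find.
            System.out.println("Counter = " + counter);
        }
    }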

Fortunately, tools are emerging that help in the identification and analysis of concurrent software issues. One is Race Catcher, from Thinking Software. It can be used in two ways during the application lifecycle. During development and test, it can dynamically analyze Java code to look ahead for deadlocks and race conditions. You can’t predict the occurrence of a race condition, of course, but you can tell where the same data is being processed in different ways, at the same time.

In a headless version, it can run as an agent on production servers, doing the same thing. That’s a version of DevOps. We catch things in production before they become problems, and refer them back to development to be fixed.

In an era where software development is changing more quickly and dramatically than any time since the PC era, we need more tools like this.

And This is Why Government Has No Business Dictating Computer Security March 4, 2015

Posted by Peter Varhol in Uncategorized.

Governments can do an incredible amount of good. They provide services for the public good, such as law enforcement, transportation management, a legal framework, and so much more.

But government as an institution, or set of institutions, can also be incredibly stupid, especially where foresight is required. That is especially true in the realm of technology, which changes far more quickly than any government has the will to adapt to.

So now we have a security hole in our web browsers, courtesy of the U.S. Government, which once mandated that exported software products (such as web browsers) couldn’t use strong encryption.

This is the same battle that Phil Zimmermann, author of PGP (Pretty Good Privacy) encryption, fought years ago after he released his source code and, in doing so, made it available to the world. It turned out that Zimmermann was right, and the government was wrong. In this case, wrong enough to cause potential harm to millions of computer users.

At this point, the government doesn’t seem interested in enforcing this any more, but some web browsers are still delivered with the weak encryption. It is a vestige of vendors’ intent to comply with the law, never removed as the law became, well, more flexible. But now it is doing some significant damage.

I am reminded, in a reverse way, of Frank Herbert’s science fiction character Jorj X. McKie, a card-carrying member of the Bureau of Sabotage, a multi-planet government agency whose role was to, well, sabotage government. In that hypothetical future, it needed to do so because government had become too fast, too efficient, and less deliberative in passing and enforcing laws. The saboteurs threw a monkey wrench into government, slowing down the process.

But today we need to speed government up. Defining the boundaries of government is a debate that will continue indefinitely. I generally agree that government should be a participant in this process. But it needs to be an informed and active participant, not a domineering old grandparent.
