Let’s Have a Frank Discussion About Complexity December 7, 2017

Posted by Peter Varhol in Algorithms, Machine Learning, Strategy, Uncategorized.

And let’s start with the human memory.  “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information” is one of the most highly cited papers in psychology.  The title is rhetorical, of course; there is nothing magical about the number seven.  But the paper and associated psychological studies explicitly define the ability of the human mind to process increasingly complex information.

The short answer is that the human mind is a wonderful mechanism for some types of processing.  We can very rapidly process a large number of sensory inputs and draw quick but not terribly accurate conclusions (Kahneman’s System 1 thinking), but we can’t take in an overwhelming amount of quantitative data and expect to make any sense of it.

In discussing machine learning systems, I often say that we as humans have too much data to reliably process ourselves.  So we set (mostly artificial) boundaries that let us ignore a large amount of data, so that we can pay attention when the data clearly signify a change in the status quo.
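One way to picture those boundaries is a simple monitoring rule: ignore the stream of readings until one clearly departs from the recent norm. The window of readings, the numbers themselves, and the three-sigma threshold below are all illustrative choices, not anything from the post:

```python
# A minimal sketch of "ignore the data until it clearly signals a
# change in the status quo": flag a reading only when it falls well
# outside the recent history. The 3-sigma cutoff is an arbitrary,
# illustrative boundary.
import statistics

def signals_change(history, reading, n_sigmas=3.0):
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return abs(reading - mean) > n_sigmas * sd

recent = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2]
print(signals_change(recent, 50.4))  # False -- within the status quo
print(signals_change(recent, 57.0))  # True  -- worth a human's attention
```

The boundary is artificial in exactly the sense described above: everything inside it is deliberately thrown away so a person only has to look when something changes.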

The point is that I don’t think there is a way for humans to deal directly with a lot of complexity.  And if we employ systems to evaluate that complexity and present it in human-understandable concepts, we are necessarily losing information in the process.

This, I think, is a corollary of Joel Spolsky’s Law of Leaky Abstractions, which says that any time you abstract away from what is really happening in hardware and software, you lose information.  In many cases that information is fairly trivial, but in some cases it is critically valuable, and missing it can cause a serious problem.
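A tiny, classic instance of a leaky abstraction (my example, not one from Joel's essay) is the way a language presents decimal numbers while actually storing binary floating point underneath. Most of the time the difference is trivial; occasionally it is exactly the information you needed:

```python
# Floating point presents itself as "decimal numbers", but the binary
# representation underneath leaks through in the last digits.
total = 0.1 + 0.2
print(total)         # 0.30000000000000004 -- not quite 0.3
print(total == 0.3)  # False -- a naive equality check quietly fails
```

The abstraction is still worth having; the point is only that it hides something, and once in a while what it hides matters.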

While Joel was describing abstraction in a technical sense, I think that his law applies beyond that.  Any time that you add layers in order to better understand a scenario, you out of necessity lose information.  We look at the Dow Jones Industrial Average as a measure of the stock market, for example, rather than minutely examine every stock traded on the New York Stock Exchange.
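The Dow example can be made concrete with made-up numbers: an average can sit perfectly still while its components move sharply, which is exactly the information the abstraction discards. The prices below are hypothetical:

```python
# Hypothetical prices for a three-stock "index": the average is
# identical on both days, yet the individual stocks moved sharply.
yesterday = {"A": 100.0, "B": 100.0, "C": 100.0}
today     = {"A": 130.0, "B":  90.0, "C":  80.0}

def index_avg(prices):
    return sum(prices.values()) / len(prices)

print(index_avg(yesterday))  # 100.0
print(index_avg(today))      # 100.0 -- the index is "unchanged"
# ...even though A rose 30% and C fell 20%.
```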

That’s not a bad thing.  Abstraction makes it possible for us to better comprehend the world around us.

But it also means that we are losing information.  Most of the time, that’s not a disaster.  Sometimes, though, it can lead us to disastrously bad decisions.

So what is the answer?  Well, abstract, but doubt.  And verify.


Decisions, Decisions – There’s an Algorithm for That March 20, 2017

Posted by Peter Varhol in Software development, Strategy, Technology and Culture.

I remember shoveling sh** against the tide. Yes, I taught statistics and decision analysis to university business majors for about 15 years.  It wasn’t so much that they didn’t care as they didn’t want to know.

I had more than one student tell me that it was the job of a manager to make decisions, and numbers didn’t make any difference. Others said, “I make decisions the way they are supposed to be made, by my experience and intuition.  That’s what I’m paid for.”

Well, maybe not too much longer. After a couple of decades of robots performing “pick-and-place” and other manufacturing processes, now machine learning is in the early stages of transforming management.  It will help select job candidates, determine which employees are performing at a high level, and allocate resources between projects, among many other things.

So what’s a manager to do? Well, first, embrace the technology.  Simply, you are not going to win if you fight it.  It is inevitable.

Second, make a real effort to understand it. While computers and calculators were available, I always made my students “do it by hand” the first time around, so they could follow what the calculations were telling them.  You need to know what you are turning your decisions over to.
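In that spirit, here is what "doing it by hand" looks like for the simplest case, a sample mean and variance, worked step by step and then checked against the library. The data are made up for illustration:

```python
# The sample mean and variance computed "by hand", step by step,
# then verified against Python's statistics module.
import statistics

data = [4, 8, 6, 5, 7]

mean = sum(data) / len(data)                    # (4+8+6+5+7)/5 = 6.0
squared_devs = [(x - mean) ** 2 for x in data]  # 4, 4, 0, 1, 1
variance = sum(squared_devs) / (len(data) - 1)  # 10 / 4 = 2.5 (sample)

print(mean, variance)                                    # 6.0 2.5
print(statistics.mean(data), statistics.variance(data))  # 6.0 2.5
```

Once you have followed the arithmetic yourself, the library call stops being a black box, which is the whole point of the exercise.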

Third, integrate it into your work processes, using machine learning to complement your own abilities.  Don’t ignore it, but don’t treat it as gospel either.

There are many philosophical questions at work here. Which is better, your experience or the numbers?  Kahneman says they are about the same, which does not bode well for human decision-making.  And the analysis of the numbers will only get better; can we say the same thing about human decision-making?

Of course, this has implications for the future of management. I’ll explore my thoughts there in a future post.

Testing and Tester Bias March 30, 2013

Posted by Peter Varhol in Software development.

Software testers are increasingly looking at how to approach the problems inherent in testing, and at how the ways we think, and the biases we bring to our work, affect the conclusions we draw.  Because we can’t test every aspect of an application exhaustively, much of the testing process is based on our past practices, judgment, and decisions that we make from incomplete and often inconclusive evidence.

Much of the foundation behind examining how we approach and solve problems comes from Daniel Kahneman’s landmark book Thinking, Fast and Slow.  In the book, Kahneman, who is a psychologist and Nobel Prize-winning economist, defines two types of thinking.  The first is System 1 thinking.  System 1 thinking is a fast, involuntary, and largely instinctive method of thought that enables us to function on a daily basis.  If we sense things in our surroundings such as movement or sounds, our System 1 thinking interprets it instantly and responds if necessary.

System 2 thinking, in contrast, is slow, deliberate, and more considered.  If we’re faced with a situation we haven’t encountered before, or a difficult problem, we engage System 2 thinking and make a more focused decision.  We take the time to think through a problem and come up with what we believe is the best answer or response.

Each has its respective advantages and disadvantages.  System 1 thinking is good enough for most low-risk or immediate decisions, but is too simplistic for more difficult situations.  If we try to make complex decisions without engaging System 2 thinking, we risk making less than optimal decisions due to our own biases or a lack of information at the time a decision is made.

While System 2 thinking is more accurate in complex situations, it takes time to engage and think through a problem.  It’s a conscious process to decide to think more deeply about a situation, and to begin determining how to approach it.  For most simple decisions in our lives, it’s overkill and not timely enough to be useful.  System 2 thinking is also hard work, and can cause decision fatigue if done too often.

As a result, each type of thinking introduces biases into our daily work, which affects how we test and what conclusions we draw from our data.  We may depend too much on System 1 to draw fast conclusions in cases where further thought is needed, or on System 2 so much that we become fatigued and begin to make mistakes because we are mentally tired.

In practice, it’s better to alternate the two types of thinking so as to not overuse either.  If we have complex data to collect and evaluate, it helps if we break up that process with occasional rote activities such as automated regression testing. For example, exploratory testing is a gradual learning process that requires extensive System 2 thinking. In contrast, executing prepared test scripts is largely a rote exercise. Being able to alternate every day or two between the two approaches can keep testers sharp.

In my upcoming posts, I’ll take a closer look at biases that come about as a result of how we think as we approach a problem.  Then I’ll look at how these biased decisions can affect how we test and what we find when we do.