Moore's Law and Murphy's Law
Because of the enormous complexity of today's world, there is a pressing need for beta-testers.
We have always needed people in every system to test the system on a continuing basis because we can be sure that Murphy's Law is always in play. If something can go wrong, it will go wrong. And then of course there are all the variations, such as this one: it will go wrong at the worst possible time.
As computerization makes the world's complexity grow exponentially, the beta-tester becomes basic to every medium-sized company, and especially to large organizations. Moore's Law will continue to operate, and so will Murphy's Law. As Moore's Law accelerates, the number of failures will also accelerate, and become exponential, too.
What we hope is that there will be internal digital systems that keep Murphy's Law in check. This is sometimes called the Internet of Things. The hope is that predictable and reliable beta-testing code will protect digital systems from the inevitable triumph of Murphy's Law. The problem is obvious: who polices the policeman? Who designs the code that beta-tests the code that is supposed to beta-test the system? And so on.
You would think that every online organization would hire at least one full-time beta-tester, and probably two. These would not be highly paid people. They would go to the various pages, especially the order pages, on the website, and they would make sure, every day, that the order pages work exactly as they are supposed to work.
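The daily routine described above can be partly automated. The sketch below fetches a list of pages and reports any that fail to respond; the URLs are hypothetical placeholders, and a real tester would also click through the checkout flow, not merely confirm that the pages load. It supplements a human tester; it does not replace one.

```python
# Minimal daily smoke test for a website's order pages.
# The URLs below are illustrative, not any real company's.
import urllib.request
import urllib.error

ORDER_PAGES = [
    "https://example.com/order",     # hypothetical order page
    "https://example.com/checkout",  # hypothetical checkout page
]

def check_page(url, timeout=10):
    """Fetch one page; return (url, ok, detail)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return (url, resp.status == 200, f"HTTP {resp.status}")
    except urllib.error.URLError as exc:
        return (url, False, str(exc.reason))

def daily_report(results):
    """Summarize fetch results; any failure is Murphy's Law at work."""
    failures = [(url, detail) for url, ok, detail in results if not ok]
    if not failures:
        return "all pages OK"
    return f"{len(failures)} page(s) broken: {failures}"

# Run daily (e.g. from cron):
#   print(daily_report([check_page(u) for u in ORDER_PAGES]))
```

A script like this catches the link that silently died overnight; only a human notices that the page "works" but no longer makes sense to a newcomer.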
Furthermore, there is a problem with professional beta testers. They get too familiar with the programs they are supposed to be testing. They intuitively know how the program works, and when they test it, they do so not as newcomers who've arrived at the page for the first time, but as full-time beta-testers who instinctively know how the page works, even if it no longer works that way for the outsider.
You see the problem. The systems are geared toward continual devolution. In other words, the second law of thermodynamics works in the field of coding, just as it works everywhere else.
How many times have you gone to a website to spend money, and you can't figure out how to do it? More than you would like to admit, and certainly more than the designer of the webpage intended. But the presidents of corporations are busy, and they don't want to spend their time beta-testing their own websites. They assume that somebody, deep down in the bowels of the organization, is taking care of business. But, of course, unless somebody is paid well to take care of business, he is doing whatever will enable him to get his paycheck with the least amount of work. The munchkins are interested in advancing their own careers, and they're going to make the decisions in terms of the operational system of sanctions that enables them to maximize their income and minimize their output.
Any company that is generating income of more than $1 million a year online had better have at least one beta-tester who goes through the pages of the website daily, especially the order pages, to make sure that the links are working, and that the commands given to the users actually perform as intended. Entropy continues to undermine everything. One code change made by one programmer is likely to have unintended consequences somewhere in the system.
If some diligent user bothers to inform customer support, or even sales, that there is a problem, there will be almost immediate resistance on the part of customer support. What? Somebody is calling into question the design of the program? They write off this person. But if there is a system of negative sanctions in place, then somebody may be fearful and decide to take a look at the problem.
Let me give you a recent example. I am a major user of the program Camtasia Studio. It is a very good program. Everybody except for one instructor for the Ron Paul Curriculum uses Camtasia Studio. As a team, we have posted something in the range of 6,000 screencasts that have been produced with Camtasia Studio. So, we are heavy-duty users of the program.
The program has an extremely annoying bug. It works in conjunction with Murphy's Law. The user is faithfully recording his presentation. Usually towards the end of the presentation, because Murphy's Law is operating, the system locks up. A pop-up appears on the screen. The pop-up tells you that the program is no longer working because of insufficient disk space.
You cannot save the file. You can try to save the file, but you can't. All of your work is destroyed. As soon as you click the button to escape, you lose everything. There is no trace of your work. It is not held in some file for safekeeping.
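The "file for safekeeping" that the text says is missing is not hard to provide. The sketch below, with illustrative names that are not Camtasia's actual internals, appends each captured chunk to a temporary file as it arrives and flushes it to disk immediately, so a crash loses at most the last chunk rather than the whole recording.

```python
# Sketch of crash-safe incremental capture: every chunk is written
# and synced to disk the moment it arrives. All names are illustrative.
import os
import tempfile

class CrashSafeRecorder:
    def __init__(self, workdir=None):
        # Create the safekeeping file up front, before recording starts.
        fd, self.path = tempfile.mkstemp(suffix=".partial", dir=workdir)
        os.close(fd)

    def append_chunk(self, chunk: bytes):
        """Append one chunk and force it to disk so a crash cannot lose it."""
        with open(self.path, "ab") as f:
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())

    def recover(self) -> bytes:
        """After a crash, the partial file still holds everything written."""
        with open(self.path, "rb") as f:
            return f.read()
```

The constant fsync costs some performance, which is presumably why vendors skip it; but for work that takes an hour to re-record, the trade is obvious.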
This bug has torpedoed me on probably a dozen occasions. I always notify the company, and I always get the same answer. I am required to go through a series of steps on my own to rearrange certain aspects of my hard disk. It's my fault. They never fix this thing. They have known about this for years. They won't fix it, and I think it is because they cannot fix it. I think it is built into the system.
Here is what is annoying. It is a stupid bug. It tells me that the problem is insufficient disk space. We are living in an era in which you can buy a 1 TB hard disk for a computer for about $80. We have not suffered from insufficient disk space problems since about 1995. For someone deep in the bowels of the company to write a pop-up warning that tells the user that the problem is insufficient disk space borders on imbecility.
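The check itself is trivial, which is what makes the failure mode so galling. A recorder could verify free space before the session starts, and warn then, instead of dying at the end. A minimal sketch, assuming an illustrative 5 GB threshold that is not Camtasia's actual requirement:

```python
# Pre-flight disk-space check: warn before recording, not after
# an hour's work is lost. The 5 GB threshold is an assumption.
import shutil

def free_gigabytes(path="."):
    """Free space on the disk holding `path`, in gigabytes."""
    return shutil.disk_usage(path).free / 1024**3

def safe_to_record(path=".", minimum_gb=5.0):
    """Refuse to start a recording the disk cannot plausibly hold."""
    return free_gigabytes(path) >= minimum_gb

if not safe_to_record():
    print("Not enough free disk space to record safely. "
          "Free up space before you start.")
```

Five lines of pre-flight checking would replace the pop-up that arrives only after the work is destroyed.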
I wrote back to the company the last time, when they asked for comment, and I pointed out that it is the stupidest pop-up warning I have ever seen. It is completely inappropriate. I told them they ought to have the pop-up say something like this: "Gotcha!" Or maybe this: "We can't fix this bug, so you're going to have to go back and re-record whatever it is you are recording. Sorry about that. Signed, the staff." They ought to admit they can't fix this bug, and that it is a major bug. But, no, they want me to go through the procedure of telling them what went wrong this time, and they are unwilling to revise the stupid pop-up. The pop-up makes them look ridiculous. But the programmers don't care. Nobody pays the programmers to care. Only if there are negative sanctions threatening them are they going to go in and rewrite the pop-up. They can't repair the bug. They have proven this for years. But at least they could fix the pop-up. But they won't.
This is the world we live in. Programmers don't want to believe that they have made a mistake. They will do anything to protect their egos by demanding that it is the users' fault, not their fault. It is clearly their fault in this case, but they aren't going to fix it. Why should they fix it? Nobody is threatened with firing or a pay cut because they don't fix it. Since there are no negative sanctions threatening them, they have no major incentive to fix the bug. Yet this is a good company. They have a good product. But they are programmers, and programmers have a code of ethics: not admitting to mistakes. This is basic to their self-definition. Once programmed, always programmed.
This is why wise senior managers must have options for unhappy users to contact them directly. They should not allow the system to filter the complaints from the users. If something goes wrong, there should be at least a vice president of stuff that has gone wrong. He should be able to find out what is going wrong. If he relies on the munchkins below him to tell him what is going wrong, it is going to continue to go wrong, and then finally it is going to go public. There will be some major embarrassment, all because the munchkins have a fundamental desire to keep anybody higher in the chain of command from finding out that somebody within the tribe of munchkins has screwed the pooch. They don't want senior people coming in and snooping around. Somebody coming in from outside to snoop around may find a whole lot of other mistakes. The munchkins act as a shield to keep anybody higher in the chain of command from finding out about the foul ups.
Any management team that does not understand that every bureaucracy works this way is falling into the clutches of the munchkins. The munchkins operate by keeping important information about foul ups away from senior management. Senior management has to set up control systems that enable unhappy users to contact somebody with the authority to implement and impose negative sanctions on the people who made the mistakes. But this is not how management systems work. This is especially not how digital management systems work in the realm of Internet marketing.
In every field, any company that implements a system of beta-testing at the bottom by low-paid workers, and then implements a warning system that enables unhappy users to contact senior managers directly, is going to have a greater success ratio than all the other companies that refused to do either of these things. This is money sitting on the table, waiting to be scooped up. It is inexpensive to do this. Beta-testers are cheap, and an e-mail address that goes directly to a vice president in charge of implementing programs and negative sanctions is also cheap. Senior management must tell the vice president in charge of fixing mistakes before they go public that if he screws the pooch, he is going to be demoted to entry-level beta-tester. In other words, he has to believe that his job is on the line. He had better fix it before it goes public.
Someday, I will see a company that does this. I've been online for about two decades, and I have yet to see it. It is one more sign that the modern world may be using Moore's Law to its advantage, but it has not yet come to grips with the operation of Murphy's Law in this new digital world order.
