Originally posted by Brendon
Not an argument, an observation. Yes, both are looked at, but the bulk of the time is spent on figuring out the bad and why it went bad. If you work in a large organization, this is what is done. The bad did not start out like that; it went wrong. Why? Was it an internal organization problem? Did external factors have any bearing on the failure, and if so, how much? Did we see this coming? If so, could we not react quickly enough, or was it lost in translation?
You can see that in a large organization, you can fix the problem with the software or hardware, but equal or more attention must be paid to the health of the organization; fixing the problem is one of scope. The problem is that the widget is no longer viable. Why? If the organization is running properly, all things should be noticed and fixed in time. The last place you want to go is down the pat-on-the-back lane: wow, we really did this well. Those things should be obvious. It is the bad things that you need to identify, and most are very well hidden. This is situation normal in a large, well-run organization; most of them spend their problem-solving time looking internally as well as externally. Bad things could happen to the person who does not identify the real problem. For example, fixing the motor of a boat while not properly identifying that the pilot had no idea where the boat or land was. Now, if you are in a lake, this is a small problem; if you are in an ocean...
That can be true.
But remember that when a system has been around for a long time, it isn't always that there are "bad" parts. They may seem bad because technology has eliminated the need for the way the system operates.
To get back to UNIX: back when it was developed, RAM was very expensive, and thus available, even in mainframes, only in very small amounts by today's standards. So the system used virtual memory for almost everything.
Even today, UNIXes tend to do that more than other, more modern systems. So a redesign of the memory systems is in order. That's fine. But there's nothing "bad" about it. It just isn't required to do things that way anymore, and it doesn't sink the system either. In fact, UNIX-based systems tend to have the best virtual memory systems around. And it can't go away altogether.
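To make concrete what "using virtual memory for almost everything" means in practice, here is a minimal C sketch, assuming a POSIX-like system that supports MAP_ANONYMOUS (Linux and the BSDs do). mmap() hands back a large range of address space immediately, but the kernel only commits physical pages as they are first touched (demand paging):

#include <stdio.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void) {
    size_t len = 1UL << 30;  /* ask for 1 GiB of address space */

    /* Anonymous private mapping: no file behind it, zero-filled.
     * This typically succeeds even on a machine with far less free
     * RAM, because no physical memory is allocated yet. */
    char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    /* Touch one byte per page; only now does the kernel fault in
     * and zero a physical page for each page actually used. */
    long pg = sysconf(_SC_PAGESIZE);
    for (size_t i = 0; i < len; i += (size_t)pg)
        p[i] = 1;

    munmap(p, len);
    return 0;
}

This split between reserved address space and committed RAM is the same one you see in top(1), where a process's virtual size is often far larger than its resident size.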
Besides, most of this has nothing to do with whether we are talking about a large system or organization or a small one.
Small bits take less work to correct, but they are also less valuable to begin with. Large bits take much more work, but they are also much more valuable.
Small organizations are the same way. They are easier to move, but have less at stake overall. If they fail, there is less loss (not to them, of course).
Large organizations move more slowly. But they have much more to lose. It isn't always good for a large organization to move too quickly on a major project. They have much at stake in present operations, and much disruption occurs. Sometimes it's actually cheaper, and better in the long run, to make incremental changes in systems than it is to do a complete overhaul, which might contain major bugs.
There are numerous cases of companies and governmental agencies trying to completely overhaul their systems, only to find that they were mired in the muck.
UNIX is very much like that. It started out 40 years ago as a modern system. Today it has much legacy code and functionality. But it has also improved over the years. Code has been dropped, and new code added. Many major OSes are still based on it. Even Linux is a copy, and not a very good one either.