Originally posted by Splinemodel
Well, the electronic hardware industry rags don't concern themselves too much with software. It's the sentiment of mostly embedded software developers, although these days embedded design is more common than ever, and it results in more revenue and more code than high-level software.
OO in the embedded environment is often a poor fit. To say that OO has failed because it is a poor tool for a specific purpose is as blind as saying Java sucks for everything.
As for "more revenue and more code," I would ask for a reference, as embedded is still a niche arena as near as I can tell. One with high growth, perhaps, but the software world is vast and highly profitable.
Oh, and a source that ISN'T an embedded industry trade rag. Every industry inflates its own market, and the obvious examples of this self-deception were the telecom industry (whose bubble burst) and the internet industry (dot bomb).
Increasing globalization of the labor force, coupled with increased focus on embedded devices, is only going to fuel the development of a more efficient paradigm.
A more efficient paradigm for small embedded programs that solve smaller problems on embedded devices, maybe. Not more efficient for large projects. You aren't going to run your core services from an embedded device.
Globalization is an issue only from the perspective of industrial/technical dominance rather than offshoring. Having worked with Wipro and others, I can say offshoring can have good ROI but must be managed carefully, as many companies have realized. And the benefits are not as large as one might hope (i.e., it's no silver bullet for the developer shortage in the US).
It is when, say, the Indian or Chinese software industry dominates the software world that the US software industry will be in danger of collapse (like, say, the US auto industry). Only when there is an Apple or Microsoft equivalent from a foreign nation dominating software sales will we be in big trouble.
This isn't to say it can't happen but it hasn't yet. Japan was going to be our doom in the late 80s with their six sigma software quality and software factories that churned out millions of lines of code. Didn't happen.
Whether the US has a cultural advantage I don't know. So far we've done well.
One of the analogies I remember reading was the comparison of the contemporary software paradigm to a "universal bolt." A lot of developers champion the idea of code re-use to a non-advantageous extent. Whereas a machine will include many types of bolts, each type selected for its suitability in a specific case, software development seems to encourage the idea of using fewer total parts, but parts that are much more complex, much less reliable, and much less well suited for each individual task. Put simply, this approach has failed, and continues to be a risk for failure whenever it is used.
Strawman. You can't read marketing literature (whether it's from a vendor, a methodology guru, or a process maven) that declares a certain technology or technique a silver bullet and then state that all software engineering is a failure because said technology/technique turns out not to be a silver bullet.
Any technique that claims 25%+ productivity improvement (cough...CMM...cough) is probably statistical BS. OO for example measures out to be around a 6% improvement over SASD and only for certain problem sets.
Does that make OO a failure? No. 6% beats 0%.
The contemporary software paradigm does NOT describe a universal bolt and hasn't since...mmm...the mid 90s, or whenever "software reuse" stopped being the buzzword of the day. Component-based software development does advocate using pre-existing components to build a software system, but largely no one uses (or advocates) large "universal" components anymore, favoring instead smaller building blocks like UI widgets, network stacks, etc., as you might find in .NET, DirectX, etc.
Little COTS beats big COTS integration, as the industry figured out by the mid 90s that big COTS integration typically works poorly because of interface complexity. I would say that large-component CBSD died in the late 90s/early 00s, if not earlier. Hard to say, but it sure isn't a hot topic anymore.
The biggest takeaway from all this is that fundamental concepts of software development still apply. Examples include coupling and cohesion as indicators of quality and efficiencies in development. Ignore these fundamentals at your own risk. They exist. Software isn't totally a voodoo science. 90% perhaps.
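To put coupling and cohesion in concrete terms, here's a minimal sketch (the `Thermostat` and `TemperatureSensor` names are invented for illustration): the controller depends only on a narrow, cohesive interface, so the concrete sensor can be swapped, or faked in a test, without touching the control logic.

```java
// Cohesive interface: exactly one responsibility.
interface TemperatureSensor {
    double readCelsius();
}

// Test double: no hardware needed, because coupling is to the interface only.
class FakeSensor implements TemperatureSensor {
    private final double value;
    FakeSensor(double value) { this.value = value; }
    public double readCelsius() { return value; }
}

// The tightly coupled alternative would hard-wire a concrete driver here,
// making Thermostat untestable and unreusable without that hardware.
class Thermostat {
    private final TemperatureSensor sensor;
    private final double setPoint;
    Thermostat(TemperatureSensor sensor, double setPoint) {
        this.sensor = sensor;
        this.setPoint = setPoint;
    }
    boolean heaterOn() { return sensor.readCelsius() < setPoint; }
}
```

Nothing OO-specific about the principle; the same separation can be done with function pointers in C. The point is that low coupling and high cohesion pay off regardless of the paradigm label.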
What's the next step? You don't seem to think there is a next step, which is probably a bad thing to assume since hardware is changing dramatically.
Of course there's a next step. There's always a next step. However, as with OO, a parallel paradigm is not a silver bullet, nor can it ignore the fundamentals that seem to have survived (although renamed often) since the inception of software engineering.
For certain problem domains I would expect a parallel processing based paradigm to net...oh around a 6% improvement over OO or procedural methods. A double digit improvement would be a fantastic rarity. Wonderful if we can get it but I won't be drinking the kool-aid till I see the studies that show such "silver bullet" improvements.
If you are a software developer that has stopped learning you are management.
At a certain point, there's only so much that can be done in a compiler: if software developers don't want to learn new paradigms, that's fine -- the business will just move to India, China, and Eastern Europe, where the developers there are more persuadable.
However, I call BS on the idea that the entire industry is going to move to a paradigm that is best suited for embedded, highly parallel applications that need to be aware of the underlying hardware architecture.
Parallel architectures come and go. It's not like the Cell is the first such incarnation. They have their place, solving certain problems more efficiently than other techniques. Likewise, monolithic architectures have their place, solving other problems more efficiently.
Anyone who pooh-poohs one or the other is limiting their own toolbox.
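To make the "both have their place" point concrete, here's a trivial sketch of the same reduction done two ways (the `SumDemo` name is invented for illustration): a plain sequential loop and a data-parallel stream. Both are correct; which is faster depends entirely on the problem size and the hardware underneath, which is exactly why neither belongs in the trash.

```java
import java.util.stream.IntStream;

public class SumDemo {
    // Monolithic/sequential: simple, cache-friendly, no coordination overhead.
    static long sequentialSum(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) total += i;
        return total;
    }

    // Data-parallel: fans the range out across cores; only wins when the
    // work per element outweighs the fork/join overhead.
    static long parallelSum(int n) {
        return IntStream.rangeClosed(1, n).parallel().asLongStream().sum();
    }
}
```

For a reduction this cheap the sequential loop will usually win outright; the parallel version earns its keep only as the per-element cost grows.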
Most problem sets will be solvable using "low-performance software" even on embedded devices. Frankly what you call an embedded device and what I might call an embedded device are probably an order of magnitude different in compute power and storage capability.
Java works fine in the embedded realm today because embedded environments are nowhere near as constrained as when I was an RTOS developer. More and more solutions can be built on embedded Java, which is a far easier development environment than traditional RT environments or your proposed low-level Verilog-like environment. That said, the simulation and test capabilities in some of the VHDL environments would be nice to have, in a more consistent form, in traditional environments.
Of course that is addressed in Agile methods that emphasize test frameworks.