Third Wave Software - Part 2: Change in Software IP
There is a crisis looming in the availability of software for developing devices based on complex embedded hardware. Unbridled advances in hardware variety and complexity are stretching software to its limits in getting products to market quickly and robustly. The early history of embedded systems development adjusted to application-specific hardware by customizing software for each application. This was fairly easy to accomplish in the early days because targeted price points and product sizes imposed practical limits on functionality. These early efforts evolved into roll-your-own operating systems and development tool sets, typically based on a single microprocessor/data-bus standard.
As new hardware designs with attractive price points hit the street daily, the old approach is almost laughably antiquated. Integrated hardware blurs the distinctions between types of hardware (e.g., programmable DSPs with traditional microprocessor features, microprocessors with on-board memory, or integrated systems-on-a-chip) while providing grist for huge advances in product functional requirements. Overnight, an in-house development shop can be rendered obsolete by the speed of these changes.
Ronald Paul describes one way development teams react to these changes: they gather up whatever compatible software components they can muster to get the job done, using whichever vendor suffices for the moment. Viewed from the project, this approach has a lot to recommend it. Viewed from the organization, it is flawed and should be condoned only in truly exceptional situations. Such an approach depends on highly skilled developers, is nearly impossible to institutionalize, adds enormously to the problems of product support and life-cycle maintenance, and makes staff training and hiring that much more difficult. Organizations gravitate toward internal standards for a very good reason: to contain complexity. Only when complexity is contained within reasonable bounds can an organization hope to institutionalize the important elements of development, production, support, and maintenance.
Just to emphasize this point: a major factor in software cost estimation models is whether or not the development team is well grounded and experienced in the chosen development tools and operating environment. Change the tools and the operating environment for any development project and you must adjust the schedule accordingly. Change the tools and operating environment for a real-time system, and you can toss your schedule out the window. Software cost estimation models may not apply accurately to highly skilled project teams, but they certainly reflect average development experience, and this huge industry will be driven by averages.
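To put a rough number on that adjustment, here is a minimal sketch using the intermediate COCOMO-81 model in its "embedded" project mode. The 50-KLOC project size is arbitrary, and the multipliers are, if my COCOMO tables are right, the published values for dropping from "nominal" to merely "low" experience with the tool chain (TOOL), the target environment (VEXP), and the language (LEXP). Treat it as illustrative, not as a claim about any particular project.

    #include <math.h>
    #include <stdio.h>

    /* Illustrative only: intermediate COCOMO-81, "embedded" mode.
       Effort (person-months) = 2.8 * KLOC^1.20 * EAF, where EAF is
       the product of the cost-driver multipliers. */
    int main(void)
    {
        double kloc     = 50.0;                    /* assumed project size */
        double nominal  = 2.8 * pow(kloc, 1.20);   /* team knows its tools */
        double eaf      = 1.10 * 1.10 * 1.07;      /* TOOL * VEXP * LEXP,
                                                      each dropped to "low" */
        double switched = nominal * eaf;           /* team on new tools    */

        printf("nominal effort:  %.0f person-months\n", nominal);
        printf("after switching: %.0f person-months (+%.0f%%)\n",
               switched, 100.0 * (eaf - 1.0));
        return 0;
    }

A roughly 30% effort penalty just from unfamiliar tools and environment, before any real-time complications enter the picture, is exactly the kind of average these models are built to capture.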
The amount of skill development needed to become adept with even a single IDE, a few languages and preferred compilers, several levels of debuggers, plus maintaining familiarity with critical hardware intricacies, is enough to keep most software engineers busy for their entire careers. Minimizing complexity dictates that the development environment and associated tools should not have to change abruptly to keep current with underlying changes in hardware and utility software. There will be exceptions, but as a rule this expectation has to be met before software stands any chance of enabling the third wave of embedded systems to experience rapidly accelerating growth.
So one thing we know is that the full expression of the third wave of computing depends on the availability of an IDE and tool set that extends across multitudinous hardware platforms, giving a generic look-and-feel to the process of embedded software development. Sounds simple, but achieving it is a first-order challenge, one I doubt Microsoft can satisfy even tentatively. I don't think even their next-generation Windows CE vaporware can satisfy this requirement. But this line of reasoning, while comforting to WIND loyalists, misses the point, because Microsoft presumably has the resources and market presence to do whatever it must to dominate any software space, and a great many others to boot. It is not enough to argue against what is (we have been proven right about that). We have to argue against what might be.
Ordinarily this daunting challenge would be impossible to meet, but we know that the model changes accompanying each paradigm shift discussed in the background post are categorical imperatives. They cannot be eliminated by the mere financial resources available to a monopolist. IBM, with limitless resources, couldn't take back the PC paradigm once it got underway; nor, probably, could IBM have gotten the PC paradigm underway without involving Microsoft and Intel, or their equivalents at the time.
We understand that Intel can't dominate the processor portion of embedded systems, but what are the imperatives that prevent Microsoft from dominating the software side? The answer is that software is splintering in ways that parallel what is happening in hardware, and that splintering similarly prevents Microsoft from dominating the embedded systems space. Ironically, this same splintering plays into WIND's hand, and will lead to a dominance beyond past experience, in which one company is granted the monopolistic power necessary for the space to fulfill its destiny.
The trick to seeing the future course of events is not to be misled into assuming that all the software difficulty derives from hardware splintering into multitudinous platforms. That is a difficulty for software, but one already overcome by WIND, and therefore at least theoretically within Microsoft's reach. Rather, the primary source of future software complication in the embedded space is something far more difficult to subdue than hardware: the tens, hundreds, or even thousands of market niches that will emerge needing specialized operating system software.
The traditional first- and second-wave solution, in which applications sit on top of a generic operating system, has already proven inadequate in a number of niches and is likely to prove inadequate in most others. Embedded digital imagery requires an operating system that provides ordinary low-level services, such as those provided by a modular RTOS kernel, along with a number of niche-specific services, such as those designed and formulated by Flashpoint. I2O cannot be implemented efficiently by grafting the I2O messaging model onto a traditional operating system. Automobile injector systems cannot be adequately handled by adding rotary timing and its algorithms to a traditional operating system. Interactive, enhanced TV is best attacked with a specialized operating system, not just a layer of application programs residing on top of a traditional generic one.
Actually, the world has encountered many proprietary operating systems across all three primary waves of computing. There is nothing unusual or special about operating systems that target specific niches. However, such operating systems, developed uniquely from the ground up, intrinsically suffer the usual fate of proprietary, non-standard systems -- ultimately they wither and fade from the scene. They do so because they can't stay current with industry-wide advances in operating systems, related tools, and hardware.
The modern approach is to build niche-specific operating systems on the back of a de facto standard RTOS kernel and IDE, with open access to third-party tools and libraries. This solves the conundrum of the third wave, namely the need to contain complexity while remaining compliant with de facto standards. The talent and domain skills necessary to construct and support the operating systems needed for the many niches that will evolve are daunting, requiring the combined resources of thousands of domain specialists, hardware and software engineers, and the sponsorship and resources of representative organizations from each domain. No single company, government, or other organization can control development and support on such a vast scale. In particular, Microsoft lacks even a fraction of the resources necessary to hold sway over a space this vast.
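To make the layering concrete, here is a minimal sketch of a niche-specific service riding on generic kernel services. Everything in it is hypothetical: the kq_* message-queue layer stands in for standard RTOS kernel primitives (WIND's actual APIs differ), the img_* task stands in for a Flashpoint-style imaging service, and POSIX threads fake the kernel underneath so the example is self-contained.

    #include <pthread.h>
    #include <stdio.h>

    /* ---- generic kernel services (the de facto standard half) ---- */
    /* A one-slot message queue, the kind of primitive any modular
       RTOS kernel supplies; faked here with POSIX threads. */
    typedef struct { pthread_mutex_t m; pthread_cond_t c; int full; int msg; } kq_t;

    static void kq_init(kq_t *q)
    {
        pthread_mutex_init(&q->m, NULL);
        pthread_cond_init(&q->c, NULL);
        q->full = 0;
    }

    static void kq_send(kq_t *q, int msg)
    {
        pthread_mutex_lock(&q->m);
        q->msg = msg;
        q->full = 1;
        pthread_cond_signal(&q->c);
        pthread_mutex_unlock(&q->m);
    }

    static int kq_recv(kq_t *q)
    {
        int msg;
        pthread_mutex_lock(&q->m);
        while (!q->full)
            pthread_cond_wait(&q->c, &q->m);
        q->full = 0;
        msg = q->msg;
        pthread_mutex_unlock(&q->m);
        return msg;
    }

    /* ---- niche-specific services (the domain half, e.g. imagery) ---- */
    static kq_t shutter_q;

    static void *img_capture_task(void *arg)
    {
        int frame;
        (void)arg;
        frame = kq_recv(&shutter_q);   /* block on a generic kernel service */
        printf("frame %d: run the demosaic/compress pipeline\n", frame);
        return NULL;
    }

    int main(void)
    {
        pthread_t t;
        kq_init(&shutter_q);
        pthread_create(&t, NULL, img_capture_task, NULL);
        kq_send(&shutter_q, 1);        /* simulate a shutter press */
        pthread_join(&t, NULL);
        return 0;
    }

The point of the division is that the top half is where the domain specialists live, while the bottom half tracks de facto standards and ports across hardware -- exactly the split the niche players need someone like WIND to provide.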
Now comes the irony. Such a challenge is a walk in the park for WIND, simply because WIND's business model does not mandate dominance of each niche-specific operating system. WIND is pleased to play a supporting role to the likes of NCI, Adobe, Flashpoint, or the I2O SIG. Indeed, playing that supporting role is the only feasible way WIND would be invited to participate in most niches. On the other hand, the one immutable characteristic of Microsoft, I believe we all agree, is that the company has no interest in supporting roles. The reason for this is Microsoft's business model, not Bill Gates' personality. (It's that categorical-imperative thing mentioned above.)
Like the evolution of hardware IP, operating system IP will evolve to a domain level, but within the familiar environment of de facto standards for low-level kernel services, development utilities like compilers and debuggers, and integrated development and rapid application development environments. In the new era, WIND's RTOS and tool set are likely to continue underpinning more and more niche operating systems, enabling them to evolve with the inevitable changes in hardware and development utilities.
Microsoft will continue to chase consumer electronics with evolutions of Windows CE, but with fewer and fewer successes over time. Windows CE with a traditional application layer will prove a poor match for niche operating systems in set-top boxes, in digital imagery devices (cameras, printers, etc.), and in most of the other niche markets for embedded systems waiting to emerge. As a consolation, "King of the PC" and soon-to-be "Master of the Server Universe" should suffice for any earth-bound company.
Allen
PS - This discussion raises the question of the hyper-growth characteristics of the tornado stage of a landmark enabling technology like the third wave of computing. The answer flows naturally from this analysis, but not without interesting twists and turns.