Technology Stocks : Wind River going up, up, up!

To: Allen Benn who wrote (1523), 7/17/1997 1:54:00 AM
From: Allen Benn, of 10309
 
When this thread first started, all we talked about was how ubiquitous computing would become the third great wave of computing, and how, just as the second wave (the PC paradigm) swamped the first wave (mainframes), so too would ubiquitous computing swamp the PC paradigm. Have you ever wondered when the third wave will arrive? Is it here already? Is it swamping the second wave? If ubiquitous computers are hidden, how are we ever to know the answers to these questions? What are the milestones that will tell us our vision is correct? In short, is the third wave still cresting, or is it now starting to unleash its awesome power on the beach of human endeavors?

Before trying to answer these questions, it is worthwhile to think back to the previous major waves of computing and reflect on the signals that, at least in hindsight, told us each wave was peaking.

The mainframe wave peaked toward the end of the seventies, having started after the end of the war. Throughout the fifties, few people had any contact with, or awareness of, computers. They were beginning to be used in military and space applications, but even there only in extremely limited ways. For example, rocket telemetry data routinely was converted manually to digital form, mostly processed by rooms full of people pounding out programmed series of calculations on $2000 electro-mechanical calculators, and graphed by hand with pen and ink. Not fun.

During the sixties IBM took off, aided by the all-purpose 360 OS and the infamous Job Control Language (JCL). Writing FORTRAN models in the mid-sixties in San Francisco consisted of keypunching cards containing the source code, traveling to a 24-hour service bureau in the city, submitting the program to be read to tape on a 360-30, waiting while the tape cart filled up and was finally transported physically across the street for compilation and execution on an IBM 7090 (about $500/hr), waiting again until the cart returned, and finally waiting, with bated breath and the stench of stale coffee, for the 360-30 to print the job - only to discover a JCL error! Total turnaround: one to two hours.

Nobody liked computers then. Programmers were considered clerical help, one rung above computer operators. All the smart computer-related people were, or claimed to be, systems analysts. Managers distrusted computers, and refused to know anything about them. The more ignorant the better. Suggest a computer model or simulation to solve a problem, and everyone would respond negatively because the computer couldn't be trusted, or couldn't be made to work usefully in production. PhD students who needed computer results to complete a dissertation risked delay and possible failure. (My dissertation was strictly mathematical and abstract in order to avoid the computer completely.)

During the early seventies the District of Columbia automated its voting machines. So the next time an election was held, the customary release of results late that night was delayed. When the media reported the delay on TV, they said, "The results are delayed because the voting machines were automated." At the time, it seemed perfectly natural to everyone that automation slowed things down. They didn't even feel an explanation was necessary. Was the code untested? Did a bug bite? Nothing.

Needless to say, computer-generated billing drove everyone crazy. Bills were rarely correct, and getting them fixed was next to impossible, since customer service clerks could not call up a customer's account on a screen and post corrections interactively. Everything had to cycle through once-a-month, step-by-step batch processing leading to updated master files on magnetic tape.

By the end of the seventies, everything started to change. Billing systems worked correctly most of the time. The concept of integrating computers directly into the decision-making process for businesses was accepted by managers. People knowingly relied on and had faith in the computer - although programmers were still clerks and systems analysts didn't program. Give a lot of credit for this to hard disk drives, and to the fact that programmers were finally beginning to learn JCL.

Whatever the reason, the mainframe computer started performing well and slowly won the respect of middle America and corporate America alike.

The mainframe wave peaked at this point.

When PCs first arrived they were considered toys, not only by organizations of every kind, but by the PC's creator, IBM. The innovative Apple computer was even given a fruity name to make it appealing to less-than-serious users, mainly schoolchildren. Schools had their Apple orchards.

In the mid-eighties, the only people using PCs were students, some secretaries (do you remember what secretaries were?), and a few programmers thunderstruck by the notion of one-on-one computing at zero marginal cost. Managers steered clear of PCs, retaining computer illiteracy as a badge of honor. Programmers were still clerks, and all the smart people were still systems analysts.

The PC paradigm unleashed an incredible force of decentralized computing that by the mid-nineties changed everything. Secretaries are now gone. Managers and professionals alike key in their own material, using the PC to aid the process. Programmers reign supreme as the profession in highest demand - well paid and respected, mostly working on PCs and engineering workstations (which are part of the PC paradigm). Systems analysts were demoted or demolished, not at all helped by Microsoft's penchant for rapid prototyping - read "skip systems analysis." Everyone likes and uses, or wants to use, computers. Trust in automated systems is so great that many people have stopped balancing their checkbooks to guard against that inevitable "bank error not in your favor."

The PC paradigm is peaking.

Mark Weiser started writing about ubiquitous computing at the beginning of this decade - after early pioneers had developed RTOS products, but still early enough to mark the real beginning of 32-bit embedded designs. Most of the early work emphasized hardware, to the point that software was an afterthought. Hardware intricacies and resource constraints dictated design and required software to be extremely efficient, which all but necessitated assembly language. None of this appealed very much to software programmers, so it was engineers, not programmers, who evolved the state of the art of embedded systems development. This is the root cause of the roll-your-own operating systems and the not-invented-here syndrome so often encountered in embedded systems development shops - the biggest nemesis of RTOS vendors.

What if someone delivered a blank PC to your home or office, assuming you wanted to load your own proprietary OS? You would think the delivery person was crazy. What idiot would write their own OS, especially when nowadays you expect at least multi-tasking and other advanced OS features to be available? Incredible? Well, modern embedded microprocessors are as complex as your PC, and providing and guaranteeing real-time behavior, whether you need it or not, is much more subtle than anything that runs on your desktop. It makes no more sense for Cisco Systems to maintain a proprietary OS than for NASA to spec its own OS to help guide a spaceship to Mars, or for me to write my own OS for my PC.
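
To make the "real-time is subtle" point concrete, here is a minimal sketch in C of the kind of thing a commercial RTOS handles for you. It assumes a VxWorks-5.x-style API (taskSpawn, semMCreate, priority-inheritance mutexes); the task names, priorities, stack sizes, and the scenario itself are illustrative, not taken from any real product or design.

/* Sketch: why "guaranteeing real-time" is subtle.
 * A high-priority sensor task and a low-priority logging task share a
 * buffer.  Without a priority-inheritance mutex, a medium-priority task
 * can preempt the logger while it holds the lock and starve the sensor
 * task indefinitely (priority inversion).  A commercial RTOS gives you
 * the inversion-safe primitive out of the box; a roll-your-own OS
 * usually does not.  VxWorks-5.x-style calls assumed; all names and
 * numbers are illustrative.
 */
#include <vxWorks.h>
#include <taskLib.h>
#include <semLib.h>

static SEM_ID bufMutex;          /* protects the shared buffer            */
static char   sharedBuf[256];    /* data handed from sensor to logger     */

static void sensorTask(void)     /* hard deadline: samples every tick     */
{
    for (;;)
    {
        semTake(bufMutex, WAIT_FOREVER);  /* inheritance boosts any holder */
        /* ... read hardware, fill sharedBuf ... */
        semGive(bufMutex);
        taskDelay(1);                     /* wait one tick before resampling */
    }
}

static void loggerTask(void)     /* soft deadline: flush buffer when idle */
{
    for (;;)
    {
        semTake(bufMutex, WAIT_FOREVER);
        /* ... copy sharedBuf somewhere slow, e.g. flash or a serial port ... */
        semGive(bufMutex);
        taskDelay(60);
    }
}

void startApp(void)
{
    /* Inversion-safe mutex: a low-priority holder is temporarily raised
     * to the priority of the highest-priority task waiting on it.        */
    bufMutex = semMCreate(SEM_Q_PRIORITY | SEM_INVERSION_SAFE);

    /* taskSpawn(name, priority, options, stackSize, entryPt, args...);
     * lower number = higher priority in this API.                         */
    taskSpawn("tSensor",  50, 0, 4096, (FUNCPTR) sensorTask, 0,0,0,0,0,0,0,0,0,0);
    taskSpawn("tLogger", 200, 0, 4096, (FUNCPTR) loggerTask, 0,0,0,0,0,0,0,0,0,0);
}

Getting this wrong doesn't crash anything in the lab; it just misses deadlines occasionally in the field, which is exactly why roll-your-own shops keep relearning it the hard way.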

At the start of this decade, no one but a few pioneers had ever heard of embedded systems. Even at the 1996 H&Q technology conference, embedded systems were never mentioned by any speaker. By 1997 embedded systems seem to be emerging; they even got a mention at the H&Q technology conference. My lawyer friend tells me about smart refrigerators. Smart products are appearing everywhere you look, especially if you look deep - like inside printers or digital cameras.

An important milestone of ubiquitous computing probably occurred in 1996, when, I think, the number of embedded 32-bit microprocessors shipped was about the same as the number of PCs shipped. By the end of the century, shipments of embedded 32-bit microprocessors will probably exceed PC shipments by an order of magnitude - another milestone to look for.

I think Middle America is starting to sense that something awesome is happening, but they can't quite put their finger on it - hard to do, because it is hidden. I think it will shortly start to poke through nevertheless, raising fresh concerns about invasion of privacy and the relationship of man to technology. This wave will not begin to crest until this awareness has become widespread and people develop a feeling of comfort using lots of smart, hidden computers. How many people were uneasy about invasion of privacy when I described the advanced features of the visionary hearing aid the other day? It will take at least a decade after general awareness is reached for these deep-felt concerns to subside.

But just as important, this wave will not begin to crest until it becomes dominated by software programmers. Ask a corporate IT programmer about embedded systems, and he/she will stare blankly in return, having never heard the expression. Before this wave starts to crest, embedded systems programming will have to become as common in corporations as database programming is today. That hasn't even started to happen.

Things have to change before software programmers (I am purposely not using the term "software engineers") will take on embedded systems. Mainly, embedded development must become much less dependent on hardware. The programming tools must offer high-level constructs, like the component building blocks used more and more today in customizing corporate applications - something along the lines of the sketch below.
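
As an illustration of what "less dependent on hardware" could look like, here is a minimal sketch in C. The interface and all of the board-specific names (uart_ops, uart16550_ops, the register address) are hypothetical, chosen only to show how a high-level construct can hide register-level detail from the application programmer.

/* Sketch: hiding hardware behind a high-level construct.
 * The application writes to "a serial port"; only the board support
 * layer knows about actual registers.  All names and addresses here
 * are hypothetical, for illustration only.
 */

struct uart_ops {                       /* the abstract interface          */
    void (*init)(void);
    void (*put_char)(char c);
};

/* ---- board-specific layer (one file per board) ----------------------- */

#define UART_BASE  ((volatile unsigned char *) 0x3F8)  /* example address  */

static void uart16550_init(void)        { /* ... program the baud rate ... */ }
static void uart16550_put_char(char c)  { UART_BASE[0] = (unsigned char) c; }

static const struct uart_ops uart16550_ops = {
    uart16550_init,
    uart16550_put_char,
};

/* ---- hardware-independent application code --------------------------- */

static void console_print(const struct uart_ops *uart, const char *s)
{
    while (*s != '\0')
        uart->put_char(*s++);           /* no register knowledge needed    */
}

int main(void)
{
    const struct uart_ops *console = &uart16550_ops;  /* chosen at build time */

    console->init();
    console_print(console, "hello, embedded world\r\n");
    return 0;
}

The application code never touches a register; port the board-specific layer and everything above it comes along for free. Packaging that kind of separation in off-the-shelf tools and components is exactly what will let ordinary corporate programmers take on embedded work.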

Only when these things have all occurred will the third wave have begun to crest, after which it will peak for decades until yet another wave begins to form. Personally, I can't begin to imagine the form of the follow-on wave. I'm satisfied just to watch the third wave take shape.

Allen