Technology Stocks : VideoPropulsion, Inc. (VDOP)


To: riposte who wrote (14) -- 7/9/2001 12:32:36 PM
From: Savant
 
By: Phil LoPiccolo
06/01/01
Computer Graphics World
-----------------------------------------------------------

Full Text
There's an old saying that there are two types of people who make forecasts: those who know they don't know what they're talking about and those who don't know they don't know. While that may be true, it didn't stop forecasters at the recent National Association of Broadcasters (NAB) conference from making predictions about the future of communications technology and its potential impact.

Perhaps the way to make the most accurate predictions possible about the next big advances is to extrapolate from those that have taken place in the recent past. Looking back over the past four decades, John Mailhot of Lucent's Digital Video Division pointed out in a panel discussion called ''The Next Big Thing'' that the major technological developments were considered significant because they solved big problems and, as a result, dramatically changed the computer industry and how we used digital technology:

Indeed, going back to the 1970s, the big problem for computers was the lack of computer processing power, and our style of working was built around waiting for a central computer to process our batch programs. What changed all that--the big thing of the '80s--was that processing power suddenly became inexpensive, which paved the way for personal computers and allowed individual users to work on their own systems in a distributed environment.

The problem in the 1980s, however, was that PC memory was limited, which crippled users from working on anything beyond fairly simple applications. That problem was solved when both RAM and disk storage expanded swiftly--the big thing of the late '80s and early '90s--and standalone PCs were suddenly capable of handling significant applications, such as desktop publishing.

Then, in the 1990s, the issue was how to enable users of these more powerful PCs to work together. So in the late '90s, the big things were the advent of efficient desk-to-desk networking--which allowed users to collaborate on projects within an enterprise--and the emergence of the Internet and the World Wide Web.

Of course, in this decade the challenge lies in providing effective long-range networking of computers. And the next big thing will be ubiquitous broadband access to the Internet, which will enable high-speed connectivity from any computer to any other.

Unfortunately, broadband is taking longer to become universal than anyone expected. In fact, only 5 percent of households currently have broadband access. But broadband penetration will increase steadily over the next several years, noted Lou Dobbs, anchor of CNN's Moneyline News program. By 2003, more than 15 million households will be broadband enabled. And any company that isn't planning for it right now, he warned, is making a huge strategic mistake.

But how will broadband access change our way of doing business, working with computers, and interacting with each other? The boldest forecast was presented by John Sidgmore, vice-chairman of WorldCom. Here are some of the most insightful and intriguing predictions:

Dot.coms: We tend to make fun of all the Internet startups that came and went over the past few years. But the same thing happened in the early 1900s with the arrival of motorized transportation. In fact, between 1915 and 1930, a huge percentage of the new companies that formed had either ''engine'' or ''motor'' or the equivalent in their names. And most of those companies failed. But the ones that succeeded, such as General Motors Corp. and Ford Motor Company, made up a huge percentage of the Gross National Product in the US over the last century. The same will be true of the Internet when broadband becomes more universal. Most Internet companies will continue to go out of business. But the few that succeed will be hugely successful and will be the drivers of the world economy over the next 50 to 100 years.

E-commerce: Today less than 5 percent of all sales transactions are conducted on the Web. But within three years, more than half of all orders will be placed over the Internet and will account for between $1.5 trillion and $2.5 trillion in revenue. The reason is that the logic for using the Internet for e-commerce is truly compelling. Companies are desperate to put more efficiency into the way they handle their supply chains and go about distribution. And a fully functional Internet will provide a dramatically less expensive alternative to traditional business models for order processing and customer sales, service, and support.

Video conferencing: The notion of real-time videophones was the sensation of the New York World's Fair in 1964. But it has remained a dream ever since. With high-definition TV and video projection over broadband, real-time video conferencing will finally be practical--and useful. Indeed, research shows that this kind of real-time video interaction can replace a lot of on-site meetings. And as it catches on, we'll see a big move to telecommuting over the next decade as companies search for new ways to attract employees from wider geographic areas.

Active messaging: There's a popular notion that everyone is connected to the Internet. But this is a myth. Hardly anyone is connected. In fact, a recent study by Goldman Sachs found that only 26 percent of Americans are ''regular'' users, which means they sign on more than four times a month. And the numbers are much lower elsewhere: In England, the percentage of regular users is 15 percent, and in Germany, it's 8 percent. On one hand, this means there's a lot of headroom for growth. But on the other, it still isn't practical for people to be checking into various sites all day long. Therefore, the best way to provide people with information that they need when they need it is through ''active messaging,'' whereby your system, which is always connected to the Internet, sends you timely and relevant messages. Imagine that there's a site called ''IsYourHouseOnFire.com.'' Although you'd want to know if your house were on fire, you wouldn't want to keep checking the site to find out. With a system that is ''always on,'' such a site could be programmed to send you active messages on a need-to-know basis.
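
To make the ''active messaging'' idea concrete, here is a minimal Python sketch of the push model it describes: a monitor notifies its subscribers only when the watched condition changes, so nobody has to keep polling. Every name in it is invented for illustration.

```python
# Minimal push-style "active messaging" sketch; all names here are
# invented for illustration, not any real service's API. Instead of
# users polling a site, the monitor pushes a message to subscribers
# only when the watched condition actually changes.

class ActiveMessenger:
    def __init__(self):
        self.subscribers = []   # callbacks to notify
        self.last_state = None  # last reported condition

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def report(self, house_on_fire):
        # Push a message only on a state change -- the "need-to-know" basis.
        if house_on_fire != self.last_state:
            self.last_state = house_on_fire
            text = "ALERT: your house is on fire!" if house_on_fire else "All clear."
            for notify in self.subscribers:
                notify(text)

messenger = ActiveMessenger()
messenger.subscribe(lambda msg: print("[pager] " + msg))
messenger.report(False)  # first report establishes state: "All clear."
messenger.report(False)  # no message -- nothing changed
messenger.report(True)   # "ALERT: your house is on fire!"
```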

Voice browsers: Today, we interact with the Internet by signing onto a computer, going through a log-in process, and typing in commands. But this is too difficult or too much trouble for many people. Suppose instead that you have a wearable device with an intelligent ''voice browser'' that operates like a traditional browser, except it understands spoken commands and returns relevant verbal and graphic information. Now suppose you tell it you are driving from your office to the airport to fly to Chicago for your 1:00 meeting. The browser checks your travel route for you and discovers that a truck has overturned on the road ahead. So you ask it to find the best alternate route, call the airport to get you on a later flight, and reschedule your appointment.
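
As a rough sketch of how such a voice browser's command routing might look (the intents and handlers below are hypothetical, not any real product's API), the core is a dispatcher that maps a recognized spoken command to an action:

```python
# Toy intent dispatcher for a hypothetical voice browser. Speech
# recognition is assumed to have already turned audio into text; the
# browser's remaining job is routing the command to the right action.
# All handler names are invented for this sketch.

def check_route(destination):
    return "Checking traffic on the route to %s..." % destination

def rebook_flight(destination):
    return "Requesting a later flight to %s..." % destination

INTENTS = {
    "route": check_route,
    "flight": rebook_flight,
}

def handle_command(command, destination="Chicago"):
    # Crude keyword matching stands in for real natural-language parsing.
    for keyword, action in INTENTS.items():
        if keyword in command.lower():
            return action(destination)
    return "Sorry, I didn't understand that."

print(handle_command("Find the best alternate route"))
print(handle_command("Get me on a later flight"))
```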

Internet glasses: In the scenario above, after you ask your voice browser to plot a new course for you, you don a pair of Internet glasses and are able to see--in perfectly registered augmented reality--a map highlighting the new route, with directions superimposed over the road in front of you. Technology such as this, which will greatly simplify the way we interact with computers, is right on the horizon. And we will dream up all kinds of useful applications to take advantage of it.

It may be hard to tell how accurate any of these projections will be, but one thing is certain. As the great cultural observer Yogi Berra once said, ''The future ain't what it used to be.''

Phil LoPiccolo: Editor-in-Chief

--------------------------------------------------------------------------------
FB/Dave S.



To: riposte who wrote (14) -- 8/2/2001 11:10:05 AM
From: Savant
 
RT, be nice to be involved..

IBM Selected To Provide Key Technologies For Massive U.K. Computing
and Data Grid; New Computing Model Will Enable Unprecedented
Scientific Collaboration

Business Editors

ARMONK, N.Y.--(BUSINESS WIRE)--Aug. 2, 2001--IBM today announced
that it was selected to partner with several centers in the U.K.
National Grid to provide key technologies and infrastructure for the
project.
IBM is collaborating closely with these Grid centers to link a
massive network of computers throughout the United Kingdom, leveraging
IBM's expertise in scalable servers and storage, open standards,
self-managing technologies, services and e-business software.
Just as electricity is delivered to homes over an electrical grid,
Computing Grids allow geographically distributed organizations to
share applications, data and computing resources. A new model of
computing, Grids are clusters of servers joined together over the
Internet, using protocols provided by the Globus open source community
(Globus.org) and other open technologies, including Linux.
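
For readers new to the model, a toy illustration may help. The Python sketch below (plain threads standing in for the Globus protocols, with invented node names) shows the core idea: a scheduler farms independent work units out to whichever machines in the pool are free.

```python
# Toy illustration of the grid model (plain Python, not the Globus
# protocols): a scheduler farms independent work units out to a pool
# of "nodes". Node names are invented stand-ins for remote machines.

from concurrent.futures import ThreadPoolExecutor

NODES = ["edinburgh", "oxford", "belfast"]

def run_on_node(node, work_unit):
    # A real grid would ship the job over the network and run it
    # remotely; here we just compute locally and label the result.
    result = sum(range(work_unit * 1000))
    return "%s finished unit %d: %d" % (node, work_unit, result)

with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
    futures = [
        pool.submit(run_on_node, NODES[i % len(NODES)], unit)
        for i, unit in enumerate(range(6))
    ]
    for f in futures:
        print(f.result())
```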
The British government, through the Office of Science and
Technology, is building the National Grid for collaborative scientific
research in a wide spectrum of disciplines. It will also serve as a
testbed for deploying "e-utility computing," also known as
"e-sourcing" -- the delivery of computing resources, including
bandwidth, applications, and storage, as a utility-like service over
the Internet.
The U.K. National Grid Center is located in Edinburgh/Glasgow, and
there will be eight regional centers located at the universities of
Oxford, Newcastle, Belfast, Manchester, Cardiff, Cambridge,
Southampton and Imperial College, London.
IBM has already won a tender to build a sophisticated data storage
facility at Oxford University, which will be the primary U.K. source
of high energy physics data generated by a leading experiment at
Fermilab in Batavia, Illinois. This is one of several major high
energy physics projects that are planning to make use of the Grid,
such as the new Large Hadron Collider experiments at CERN, the
European particle physics Laboratory in Geneva, Switzerland. Also,
using the National Grid, scientists at Cambridge will be able to run
sophisticated high-energy physics applications on computers in
Belfast.
"I am delighted that IBM is collaborating with the U.K. to build
the next-generation Globus-based Grid middleware, which will have
implications far beyond the original scientific applications," said
Tony Hey, architect of the U.K. National Grid. "IBM brings a wide
range of key technologies to the Grid agenda and is collaborating
closely with several of our Grid centres."
"The United Kingdom is clearly taking a leadership role in the
development of Grid computing, which represents a significant market
opportunity," said David Turek, IBM vice president of emerging
technologies. "IBM is proud to be an integral part of the National
Grid project -- a bold next step in the evolution of the Internet."

IBM Grid Expertise

IBM is the leading supplier of systems and services expertise to
the scientific and technical community. In addition to working with
many of the world's leading labs and research organizations in the
development of Grid projects, IBM Research used Globus technologies to
build its own Grid -- a geographically distributed supercomputer
linking IBM research and development labs in the United States,
Israel, Switzerland, and Japan. IBM's Global Services organization
offers
the complete range of IT skills needed to build, run and maintain
Grids.
To help customers manage complex Grids, IBM offers scalable
supercomputing systems and middleware with IBM eLiza self-management
technologies. Project eLiza, announced by IBM earlier this year, is a
company-wide program to develop systems that respond to the
requirements of their environment in order to optimize performance
across a network, improve security and survive failures.
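
The release gives no technical detail on eLiza, but the "survive failures" behavior it describes is in the spirit of a self-healing supervision loop. Here is a minimal, purely illustrative Python sketch; nothing in it reflects eLiza's actual design:

```python
# Purely illustrative self-healing loop in the spirit of the
# "self-managing" behavior described above; the service names and the
# probe/restart functions are invented, not eLiza's interfaces.

import random

SERVICES = ["storage", "scheduler", "web-frontend"]

def probe(service):
    # Stand-in health check; a real system would ping the service.
    return random.random() > 0.3

def restart(service):
    print("restarting %s..." % service)

def supervise_once():
    for service in SERVICES:
        if probe(service):
            print("%s: healthy" % service)
        else:
            print("%s: FAILED" % service)
            restart(service)

supervise_once()
```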
IBM also plans to Grid-enable key IBM systems and technologies,
allowing them to be plugged into these growing worldwide networks
quickly and easily.
In the same way it played a leadership role in the commercial
adoption of Linux, IBM is working with the Globus open source
development community and the influential industry standards body,
Global Grid Forum.
Globus technologies have been developed over the last five years
in the research community, in a project led by Argonne National
Laboratory and the University of Southern California's Information
Sciences Institute. The Globus Toolkit and its protocols are now in
use in over 20 multi-million dollar eScience projects around the
world. Its large user community and its open architecture,
open-source structure, and philosophy make it a natural partner for
IBM.

Grids for e-sourcing

Grids -- like Linux and the Internet itself -- are poised to grow
beyond the academic world and become an important business platform.
Grid protocols could provide a key platform for e-sourcing -- a major
initiative within IBM targeting the sale and delivery of computing
resources as a utility-like service over the Internet. IBM e-Utility
Labs in the United States are now using Grids to develop and test
e-sourcing services -- and IBM is already working with a number of
forward-thinking customers to enable e-sourcing in commercial grid
environments.
Grid protocols could allow companies to work more closely and more
efficiently with colleagues, partners and suppliers through:

-- Resource aggregation -- allowing corporate users to treat a
company's entire IT infrastructure as one computer through
more efficient management.

-- Database-sharing -- allowing companies to access remote
databases. This is particularly useful in the life sciences
community, where researchers need to work with large volumes
of biological data from a variety of sources. Engineering and
financial firms also could benefit significantly. (A toy sketch of
such a federated lookup follows this list.)

-- Collaboration -- allowing widely dispersed organizations to
work together on a project -- sharing everything from
engineering blueprints to software applications.
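
As promised above, here is an entirely hypothetical Python illustration of the database-sharing point: a federated lookup that fans one query out to several remote databases and merges the hits.

```python
# Entirely hypothetical federated-query sketch for the database-sharing
# point: one lookup fans out to several remote databases and the hits
# are merged. Site names and data are invented stand-ins.

REMOTE_DATABASES = {
    "cambridge": {"BRCA1": "chr17", "TP53": "chr17"},
    "belfast":   {"CFTR": "chr7"},
    "oxford":    {"TP53": "chr17", "HBB": "chr11"},
}

def federated_lookup(gene):
    # A real grid would authenticate and query each site over the
    # network; here each "site" is just a local dictionary.
    hits = {}
    for site, db in REMOTE_DATABASES.items():
        if gene in db:
            hits[site] = db[gene]
    return hits

print(federated_lookup("TP53"))  # {'cambridge': 'chr17', 'oxford': 'chr17'}
```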

About the National Grid

The National Grid will be created as part of the e-Science Core
Programme, which is overseen by the British Government's Office of
Science and Technology. The e-Science Core Programme was announced
last year as part of a British government three-year funding package
to develop e-Science -- global scientific collaboration and the next
generation of infrastructure that will enable it.
IBM is a registered trademark of the International Business
Machines Corporation. Linux is a registered trademark of Linus
Torvalds. Other company, product and service names may be trademarks
or service marks of others.

--30--mw/ny*

CONTACT: IBM Corp.
John Buscemi, 914/766-4495
jbuscemi@us.ibm.com