Technology Stocks : EMC How high can it go?

To: StanX Long who wrote (13303), 10/12/2001 4:51:23 AM
From: Gus
 
More on GIGO (garbage-in-garbage-out). As the Gartner projection clearly indicates, this is a very expensive problem that will actually worsen as IBM, Sun and Microsoft compete in the area of web services over the next few years.

Database companies, middleware vendors, application developers, and data warehousing and decision support software vendors have a common problem: they can only support their own products. That creates an opening for independent players. Remember that the percentage of content created in digital form went from less than 5% in 1990 to more than 90% in 2000, so this is an emerging market with most of the pioneering adopters at the high end.

One of the key requirements for any data integration player is the ability to support the ETL (extract, transform, load) process from any application and provide tools for any decision support application.
It should come as no surprise then that EMC is already shipping products and providing APIs to all the key players in this emerging area as one can glean from the list of ISVs participating in EMC's EIDP program (see below).
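
To make the ETL requirement concrete, here is a minimal sketch of the extract-transform-load pattern in Python. The file, field, and table names are hypothetical, purely for illustration; this is not EMC's or any vendor's actual tooling.

import csv
import sqlite3

def extract(path):
    # Extract: pull raw rows out of a source system export.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: normalize values and drop rows that fail basic checks,
    # since bad rows loaded now become the GIGO problem later.
    clean = []
    for row in rows:
        name = (row.get("customer_name") or "").strip().title()
        if name:
            clean.append((name, (row.get("region") or "unknown").upper()))
    return clean

def load(rows, db_path="warehouse.db"):
    # Load: write the cleaned rows into the target warehouse table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS customers (name TEXT, region TEXT)")
    con.executemany("INSERT INTO customers VALUES (?, ?)", rows)
    con.commit()
    con.close()

load(transform(extract("source_extract.csv")))

The independent players get paid for the middle step: any vendor can move bytes, but transforming data from any application into something any decision support tool can trust is where the value is.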

emc.com

EMC is uniquely positioned to be the dominant force in this under-invested area because of its ability to support any mainframe or server, any operating system, any database, any application, any network, and most high-end storage systems, and because it runs the industry's largest interoperability lab.

We test more components from more suppliers than anyone else in the industry—so you don't have to. Which means you get the most choices and options for your networked storage environment. In our EMC E-Labs, we replicate customer environments—using 2,600 terabytes of storage across nearly five acres of floor space. To date we've tested:

-- nearly 400 server types from 21 different vendors
-- 40 operating systems (plus upgrade releases, patches, earlier versions, etc.)
-- 81 third-party storage software products (including Veritas, BMC, and CA)
-- 145 network connectivity elements
-- 1,200 other devices, ranging from HBAs and drivers to switches and tape subsystems


EMC's Centers of Excellence showcase best practices in infrastructure for application and web hosting. Since 1997, EMC has provided custom application management and hosting services in Hopkinton, Massachusetts and in Cork, Ireland. Today, over 50 innovative customers outsource their mission-critical business and enterprise solutions to EMC in these facilities. By deploying Enterprise Storage Networks, EMC efficiently hosts 130+ TB of live customer data, used by over 500 heterogeneous servers, running 7 different operating systems and 5 different databases. These state-of-the-art facilities are available for customers and EMC outsourcing partners to tour and to learn about infrastructure best practices.

emc.com

Will Poor Data Ruin Data Warehousing?
October 10, 2001


Chief information officers and database administrators need to pay closer attention to poor data quality. If they don’t, the result might be a failed data warehousing or customer relationship management project.

"By 2005, more than 50 percent of projects will fail," Ted Friedman, a senior research analyst, stated October 9 at the Gartner Symposium/ITxpo 2001 in Orlando. "Fortune 1000 companies will spend or lose more on operational efficiencies in the back office than on data warehousing or CRM. These seem like shocking statistics, but the points of business failure include denial about data quality issues."

Bad data can affect companies in varying degrees, ranging from simple embarrassment to multimillion-dollar mistakes. Bad data can come from a variety of sources: errors in data entry, erroneous data received across the Internet, faulty data purchased or acquired from an outside source, or simply combining good data with outdated data without the ability to distinguish between the two.

The single most challenging aspect, once companies are ready to face data quality head-on, is determining exactly how bad the data problem is.
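
Sizing the problem usually starts with simple profiling. Here is a rough sketch of that first step in Python; the file and column names are hypothetical, and real profiling tools check far more than missing and duplicate keys.

import csv
from collections import Counter

def profile(path, key="customer_id"):
    # Count missing and duplicate key values in a source extract to get
    # a first-order estimate of how bad the data problem is.
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        print("no rows to profile")
        return
    total = len(rows)
    missing = sum(1 for r in rows if not (r.get(key) or "").strip())
    dupes = sum(n - 1 for n in Counter(r.get(key) for r in rows).values() if n > 1)
    print(f"{total} rows: {missing} missing {key} ({missing/total:.1%}), "
          f"{dupes} duplicate keys ({dupes/total:.1%})")

profile("claims_extract.csv")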

"Poor data quality needs to be treated as a business issue," Friedman says.

dmreview.com

The High Costs of Low-Quality Data
By Larry P. English
January 1998


Data warehousing projects fail for many reasons. All of the reasons can be traced to a single element: non-quality. Poor data architecture, inconsistently defined departmental data, inability to relate data from different data sources, missing and inaccurate data values, inconsistent uses of data fields, unacceptable query performance (timeliness of data), lack of a business sponsor (no data warehouse customer), etc., are all components of non-quality data....

....An insurance company downloaded claims data to its data warehouse to analyze its risks based upon the medical diagnosis code for which claims were paid. The data revealed that 80 percent of the claims paid out of one claims processing center were paid for a diagnosis of "broken leg." Their concern was, "What is happening here? Are we in a really rough neighborhood?" What they discovered was that the claims processors were paid for how fast they could pay a claim. So they let the system default to "broken leg." The data quality was good enough to pay a claim. All the claims payment system needed was a valid diagnosis code. But that same data was totally useless for risk analysis.
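
A simple frequency check would have exposed that default. As a sketch of the idea (the field and file names here are invented, not from the article): if one code dominates a claims file, the field is probably being defaulted rather than recorded.

import csv
from collections import Counter

def top_code_share(path, field="diagnosis_code"):
    # Flag a field where a single value accounts for most of the rows,
    # a classic sign of operators keying in a default to move faster.
    with open(path, newline="") as f:
        codes = Counter(r[field] for r in csv.DictReader(f))
    if not codes:
        return
    code, count = codes.most_common(1)[0]
    share = count / sum(codes.values())
    print(f"Most common {field}: {code} at {share:.0%} of claims")
    if share > 0.5:  # the anecdote's 80% "broken leg" would trip this
        print("Warning: looks like a keyed-in default, not real data")

top_code_share("claims_extract.csv")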

But worse than this is the fact that over the years, the archaic legacy data structures have failed to keep up with the information requirements of even the operational knowledge workers. As a result, because they require more and more information to do their jobs, knowledge workers have been forced to create their own data workarounds and store the data they need in creative ways that differ from the original file structure. This represents only the beginning of the data quality challenges facing the data warehouse team.....

.....The bottom line is that data quality problems hurt the bottom line.

Quality experts agree that the costs of non-quality are significant. Quality consultant Philip Crosby, author of Quality is Free, puts the cost of non-quality in manufacturing at 15-20 percent of revenue. Juran pegs the costs of poor quality, including "customer complaints, product liability lawsuits, redoing defective work, products scrapped . . . in most companies they run at about 20 to 40 percent of sales." A. T. Kearney CEO Fred Steingraber confirms that "We have learned the hard way that the cost of poor quality is extremely high. We have learned that in manufacturing it is 25-30 percent of sales dollars and as much as 40 percent in the worst companies. Moreover, the service industry is not immune, as poor quality can amount to an increase of 40 percent of operating costs."

But what about the costs of non-quality data? If early data assessments are an indicator, the business costs of non-quality data, including non-recoverable costs, rework of products and services, workarounds, lost and missed revenue, may be as high as 10-20 percent of revenue or total budget of an organization. Furthermore, as much as 40-50 percent of the typical IT budget may actually be spent in the equivalent of manufacturing's "scrap and rework."
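
To put those percentages in dollar terms, a back-of-the-envelope calculation using the ranges cited above, with invented figures for illustration only:

revenue = 500_000_000    # hypothetical annual revenue, in dollars
it_budget = 25_000_000   # hypothetical annual IT budget, in dollars

for pct in (0.10, 0.20):
    print(f"Non-quality data cost at {pct:.0%} of revenue: ${revenue * pct:,.0f}")
for pct in (0.40, 0.50):
    print(f"IT 'scrap and rework' at {pct:.0%} of budget: ${it_budget * pct:,.0f}")

Even at the low end of English's ranges, a mid-size Fortune 1000 company would be losing tens of millions a year to bad data, which is the opening EMC and the other data integration players are chasing.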

dmreview.com