Technology Stocks : Oracle Corporation (ORCL)


To: MeDroogies who wrote (10273), 3/30/1999 2:11:00 PM
From: mcmiller
 
Microsoft Fails to Meet Million Dollar Challenge

On March 16, 1999, Microsoft issued a press release claiming to have met
Oracle's Million Dollar Challenge to demonstrate the data warehouse
performance capabilities of SQL Server. However, Microsoft did not, in fact,
run an audited and fully disclosed industry-standard TPC-D data warehouse
benchmark as required by the Challenge. In a manner reminiscent of their
widely denounced "Scalability Day" OLTP demonstration, Microsoft attempted to
mislead, misrepresent and confuse the market. Since their press release,
Microsoft has issued a hurried retraction stating that they did not publish a
TPC-D result. Confused?

Read on for more information about:
- What was announced?
- Why is this important?
- What was the $1 Million Database Performance Challenge?
- What was Microsoft's response to the $1 Million Challenge?
- Did Microsoft meet Oracle's Challenge?
- Where to go for more information?

What was announced?
In November 1998, Oracle offered to pay $1,000,000 to the first person who
demonstrated that SQL Server 7.0, with a 1-terabyte TPC-D database, could come
within 100 times of Oracle's best published performance for query number five
of the current TPC-D specification.
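
For reference, query five (the "local supplier volume" query, at least as it
survives in the spec's later TPC-H descendant) is a six-table join with an
aggregation, run at the 1-terabyte scale factor. A minimal sketch of its
shape follows, held in a Python string purely for display; the region and
dates are illustrative stand-ins for the specification's random substitution
parameters, and the rendering follows the TPC-H form rather than the
verbatim 1.3.1 text:

# TPC-D query five, "local supplier volume" (illustrative rendering).
# The region and date parameters below are stand-ins for the spec's
# randomly generated substitution parameters.
TPCD_QUERY_5 = """
SELECT n_name,
       SUM(l_extendedprice * (1 - l_discount)) AS revenue
FROM   customer, orders, lineitem, supplier, nation, region
WHERE  c_custkey   = o_custkey
  AND  l_orderkey  = o_orderkey
  AND  l_suppkey   = s_suppkey
  AND  c_nationkey = s_nationkey
  AND  s_nationkey = n_nationkey
  AND  n_regionkey = r_regionkey
  AND  r_name = 'ASIA'
  AND  o_orderdate >= DATE '1994-01-01'
  AND  o_orderdate <  DATE '1995-01-01'
GROUP BY n_name
ORDER BY revenue DESC
"""

print(TPCD_QUERY_5)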

For months, Microsoft did not respond to this challenge. Then on March 16,
1999, Microsoft issued a press release entitled "Microsoft and Hewlett Packard
Meet Oracle's Database Challenge, Providing Equivalent Speed, Lower Cost,
Higher Value for Customers," in which random, non-audited results of a
different benchmark are compared with published and audited Oracle8i TPC-D
results. This press release caused enough confusion that Microsoft
immediately had to issue a retraction: a hurried clarifying statement was
sent to the press on March 17 stating that they had not published a TPC-D
result.

Why is this important?
At first, Microsoft's press release seems to contain impressive, comparable
TPC-D results at a fraction of the cost of Oracle's best record. But if you
start looking in detail at what Microsoft claims to have achieved, you quickly
see through the smoke and mirrors. This is yet another example of Microsoft
going to any length to get the result they need. The reality is that their
benchmark required many hours of setup, which an ordinary customer would never
do. Microsoft misrepresents the results as comparable to Oracle's audited
TPC-D results, and snubs the industry's only accepted database performance
standard. Not only has Microsoft failed to respond to the challenge, they
have attempted to shift the focus away from the fact that they have never
published TPC-D results for SQL Server 7.0.

In contrast, Oracle's database performance is openly published. Oracle holds
database performance records on open systems hardware, including data
warehousing (TPC-D), transaction processing (TPC-C), and ERP benchmarks from
SAP, PeopleSoft and Baan.

What was the $1 Million Database Performance Challenge?
Oracle Corporation offered to pay ONE MILLION DOLLARS to the first person who
demonstrated that SQL Server 7.0, with a 1-terabyte TPC-D database, could come
within 100 times of Oracle's best published performance for query number five
of the current TPC-D specification (Version 1.3.1). To comply, the
challenger would have had to run a complete 1-terabyte TPC-D benchmark,
including all requirements for loading, updating and querying data and
publishing a full disclosure report of all performance metrics. The benchmark
had to be audited by a TPC-certified auditor to ensure compliance with TPC
benchmark rules.

What was Microsoft's response to the $1 Million Challenge?
In their press release and related Webcast, Microsoft claims they met the
Oracle Million Dollar Challenge by providing equivalent speeds on TPC-D query
five at a lower cost to the customer. The benchmark result was produced by
Microsoft Research on a four-way Hewlett-Packard Xeon server with 4 GB of
memory. Microsoft's average query five result was 1.1 seconds, compared to
Oracle's 0.7 seconds on a 64-way Sun Microsystems UltraSPARC Starfire server.
Microsoft states they have solved the same business problem, obtaining
reasonable results at a fraction of the cost of the Oracle solution - a
$600,000 system vs. a $10 million system.
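
On raw arithmetic alone the numeric bar was generous: 100 times Oracle's
0.7-second result is a 70-second threshold, which a 1.1-second run clears
with ease. A quick sanity check of the criterion, using the figures quoted
above (the variable names are mine):

# The Challenge's numeric criterion: come within 100x of Oracle's best
# published query five time. Figures are the ones quoted above.
ORACLE_Q5_SECONDS = 0.7       # Oracle8i, 64-way Sun Starfire (audited TPC-D)
MICROSOFT_Q5_SECONDS = 1.1    # Microsoft Research, 4-way HP Xeon (not TPC-D)

threshold_seconds = 100 * ORACLE_Q5_SECONDS       # 70.0
print(MICROSOFT_Q5_SECONDS <= threshold_seconds)  # True

The dispute, then, is not over the stopwatch number: the Challenge also
required a complete, audited, fully disclosed 1-terabyte TPC-D run, and that
is the part Microsoft never produced.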

Did Microsoft meet Oracle's Challenge?
No way. First of all, the data they used was not even in a relational
database. Their benchmark required many hours of setup in which Microsoft
Research removed underlying detail data and put it into a cube. The cube is
'offline' from the database, and updates are not applied back to the database.
Thus, the ACID properties a real database must provide (Atomicity,
Consistency, Isolation, and Durability) were not implemented. Also, with the
detail data missing, the solution cannot possibly "better support ad hoc
queries," as their press release claims.
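
To see concretely why this matters, consider a toy sketch (the data and
names are invented for illustration): once detail rows are rolled up into a
cube along chosen dimensions, any question touching a column the cube
dropped is unanswerable, and later changes to the source data never reach
the cube.

from collections import defaultdict

# Toy detail rows: (region, year, discount, revenue). Invented data.
detail = [
    ("ASIA",   1994, 0.05, 100.0),
    ("ASIA",   1994, 0.00, 200.0),
    ("EUROPE", 1994, 0.10, 150.0),
]

# Build an offline "cube": revenue pre-aggregated by (region, year).
# The discount column, and every individual detail row, is discarded.
cube = defaultdict(float)
for region, year, discount, revenue in detail:
    cube[(region, year)] += revenue

print(cube[("ASIA", 1994)])   # 300.0, the one question the cube answers

# An ad hoc query over the dropped column ("revenue from rows with a
# nonzero discount") cannot be recovered from the cube: the detail is gone.

# And because the cube is offline, an update to the detail data is
# silently lost; no Atomicity, Consistency, Isolation, or Durability.
detail.append(("ASIA", 1994, 0.02, 50.0))
print(cube[("ASIA", 1994)])   # still 300.0, now stale

A compliant TPC-D run, by contrast, applies its update functions to the live
database under full ACID rules, which is exactly what this setup avoids.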

The Microsoft response shows that they will go to any lengths to get the
results they need. This includes cheating on and abusing industry standards
they themselves agreed to before we beat them with a real database. Microsoft
compares their modified "benchmark" on a small HP machine running NT to
Oracle's published TPC-D results on a Sun Starfire, one of the world's most
powerful systems, and concludes that the results are comparable.

Finally, Microsoft's results were not audited and certified by the Transaction
Processing Performance Council (TPC). TPC standards were established so that
customers could
know that vendor performance results were valid. Comparing a random,
non-audited result to Oracle's audited, published TPC-D results is a
fundamental violation of the spirit and language of the TPC's Fair Use Policy.
Microsoft had over three years to publish and audit a TPC-D benchmark result
and did not. Instead, Microsoft Research squeezed an extract from a database
(and hid the effort required to do that), then constructed some logic to
access the extract. Microsoft then calls this 'a solution that met the
original intention of TPC-D'. It is difficult to imagine anything further
from the truth.

The bottom line is that Microsoft has not met the TPC-D benchmark rules, and
has not announced when, or whether, they will ship a TPC-D-compliant version of what
their research department demonstrated.

Where to go for more information?
For more information about TPC benchmarks and results, please refer to the
Transaction Processing Performance Council Web site at tpc.org

For more information about competing with Microsoft, go to
worldwide-marketing.us.oracle.com

For more information about the Million Dollar Challenge, go to
oracle.com