Technology Stocks : Advanced Micro Devices - Moderated (AMD)

To: wbmw who wrote (243912), 12/2/2007 9:15:31 PM
From: pgerassi
Wbmw:

You guys like to argue that you should only compare non-optimized scores, since no one uses optimizations, but if that were the case, why would the industry spend billions of $$ to optimize code so that it runs faster, and why would Intel spend (at least) hundreds of millions of $$ to build a software and solutions group that facilitates ISVs in optimizing their software so that it runs well with Intel platforms, if this did not have a positive effect on sales...?

We like to argue that you should compare unoptimized code because almost all widely distributed code is not optimized. Only two basic groups optimize code: those running a custom-built application on a known hardware box of their own, mostly HPC users, and those who want to publicize that their stuff is better (the hardware and software tools industries). The first group cares nothing about Java application scores. So only the second group cares, and they are not the end consumers of the equipment. Many of those optimizations are not applied by the typical user, and some are even counterproductive, either adding bugs or reducing overall performance. Heck, just keeping the bugs out is a very good reason not to optimize. Why add to your headaches?
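To illustrate how untuned the typical deployment really is, here is a minimal Java sketch (the class name is mine, not anything from this thread) that prints whatever flags its own JVM was actually launched with. On a plain "java MyApp" launch the list is usually empty, whereas published benchmark runs pass long lists of switches.

import java.lang.management.ManagementFactory;
import java.util.List;

// Minimal sketch: print the tuning flags this JVM was actually started with.
// On a stock launch the list is typically empty, which is the point --
// most deployed code never sees the benchmark-style optimization switches.
public class ShowJvmFlags {
    public static void main(String[] args) {
        List<String> jvmArgs = ManagementFactory.getRuntimeMXBean().getInputArguments();
        if (jvmArgs.isEmpty()) {
            System.out.println("No JVM tuning flags supplied (stock configuration).");
        } else {
            System.out.println("JVM was started with:");
            for (String a : jvmArgs) {
                System.out.println("  " + a);
            }
        }
    }
}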

Besides, if the difference is between free tools and $7-8K in tool costs plus $1,400 in annual tool support, not counting OS and RDBMS costs, most will opt not to go the optimized route. That is why most users opt for the free Apache, JVM, and MySQL tool set.
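For context, the out-of-the-box route looks roughly like this: a plain JDBC connection through the free MySQL Connector/J driver, with no profiling and no vendor-specific tuning. This is a sketch under my own assumptions; the host, database name, and credentials are placeholders, not values from the thread.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch of the "free tool set" path: untuned JDBC access to MySQL.
// Host, database, and credentials below are placeholders.
public class MySqlExample {
    public static void main(String[] args) throws Exception {
        Class.forName("com.mysql.jdbc.Driver"); // MySQL Connector/J driver
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/testdb", "appuser", "apppass");
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT VERSION()");
            if (rs.next()) {
                System.out.println("Connected to MySQL " + rs.getString(1));
            }
            rs.close();
            stmt.close();
        } finally {
            conn.close();
        }
    }
}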

And then there are those who do optimize, but not for the latest hardware, only for the least of their installed base, the lowest common denominator. The faster boxes adjust to the code rather than the other way around. So the ISVs aren't doing cutting-edge optimization, just the basics, which leaves only one part of the group: the OEMs and hardware makers.

So these optimizations make for great copy, but people just don't use them. It is far more important that the code runs than that it runs somewhat faster. Reliability trumps speed.

Dell doesn't base their corporate decisions on results from Anandtech.

But Dell does base their decisions on money under the table.

But in this particular case, the data is conflicting, and I will choose to trust the SPEC results over the Anandtech results.

The data is not conflicting. And Anandtech has more experience setting up a real web application server than you do. They just go for reliability over speed. The esoteric options might make it run twice as fast in a small number of cases, including the SPECjbb2005 benchmark, but they are not what normal users of web servers use or want.

What is missing from SPEC benchmarks is the cost of the HW/SW under test. Performance per dollar is a desirable metric.
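A rough sketch of what such a metric would look like (class name, usage, and inputs are mine, and no actual SPEC scores or prices are assumed): divide the published score by the full hardware plus software price of the configuration under test.

// Minimal sketch of the metric SPEC reports leave out: benchmark score
// divided by total system cost. Score and price are read from the command
// line; no real SPEC results or prices are baked in.
public class PerfPerDollar {
    public static void main(String[] args) {
        if (args.length != 2) {
            System.err.println("usage: java PerfPerDollar <score> <systemCostDollars>");
            return;
        }
        double score = Double.parseDouble(args[0]); // e.g. a SPECjbb2005 bops figure
        double cost = Double.parseDouble(args[1]);  // full HW + SW price of the tested box
        System.out.printf("Performance per dollar: %.2f%n", score / cost);
    }
}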

Pete