To: jcholewa who wrote (60570) 10/26/2001 5:30:22 PM
From: wanna_bmw

JC, all you need to do is follow the link I gave. Here it is again: bapco.com

Re: "Does the system run ALL of these benchmarks at the same time?"

Today's desktop office productivity user utilizes more than one application simultaneously to complete day-to-day activities. Many applications remain open on the desktop throughout the day. In SYSmark 2001, the Office Productivity scenario has several components of Office 2000 (Word, Excel and PowerPoint) open at the same time, and the benchmark runs back and forth between them. In the Internet Content Creation scenario, the benchmark alternates between Photoshop, Dreamweaver, Flash and Premiere.

Re: "Is WME done in the background?"

BAPCo has attempted to model this in the Office Productivity scenario by having file compression, virus scan and speech-to-text translation occur in the background while Office 2000 documents are being created in the foreground. In the Internet Content Creation scenario, video encoding runs concurrently with other web page creation activities.

Re: "How exactly are they testing Photoshop? Are they using the most commonly used filters and actions, or are they stressing performance with uncommon filters and actions?"

The user opens a high-definition picture and runs a few sample filters, experiments with the size and orientation, changes the pixels-per-inch ratio, fades the image, adds a border, saves the resulting image in JPEG format and prints it.

Re: "It is BapCo's responsibility to clarify in deep detail what they're doing. These descriptions you've given me barely scratch the surface of that responsibility. They're getting paid a lot of money for these testing suites, so they had better be obligated to publicly prove their products' worth!"

It sounds like they ARE providing such details. Why are you rambling on and complaining before you even check out the link I provided???

Re: "And, well, it's obvious that the P4 did get a rather sizable boost from the prior version of SysMark to the current version of SysMark. Why is this? If it's because of 'future behavior', BapCo has yet to really justify it in detail."

The fundamental performance unit in SYSmark 2001 is the "response time". Response time, in the context of SYSmark 2001, is defined as the latency between the submission of a request by the user and the completion of the processing of that request by the application. For example, the response time for a Replace All command in Word 2000 is the time between hitting the Replace All button in the Edit/Replace window and the time that Word 2000 brings up an operation-completion window. Each scenario consists of a number of operations performed by a number of applications, and only the response times for these operations are recorded. The time to reposition the mouse or cursor is not timed, since the user would do this during normal use of the application. The overall response time for a scenario is the average of all the response times across all the applications that make up that scenario (see Table 1). The average response time for each of the two scenarios is then converted to a rating (explained in the next section, Rating Methodology). The overall SYSmark 2001 rating is derived from the geometric mean of the two scenario ratings.

Also, check out the "What's new in SYSmark 2001?" section; I've quoted it below.
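Before that, here's a quick sketch I put together to make sure I understood the rating math: average the response times per scenario, convert each average into a rating, then take the geometric mean of the two scenario ratings. This is my own illustration, not BAPCo's code; the baseline numbers and the 100-point scaling are made up for the example.

```python
from math import sqrt

def scenario_rating(response_times, baseline_mean, baseline_rating=100.0):
    """Average the per-operation response times for one scenario, then
    convert the average into a rating: a lower mean response time than
    the baseline yields a higher rating. The baseline values here are
    placeholders, not BAPCo's actual calibration numbers."""
    mean_rt = sum(response_times) / len(response_times)
    return baseline_rating * (baseline_mean / mean_rt)

def overall_rating(office_rating, icc_rating):
    """Overall rating = geometric mean of the two scenario ratings."""
    return sqrt(office_rating * icc_rating)

# Made-up response times (in seconds), just to show the roll-up.
office = scenario_rating([0.8, 1.2, 0.5, 2.0], baseline_mean=1.5)
icc = scenario_rating([3.0, 4.5, 2.2], baseline_mean=4.0)
print(round(overall_rating(office, icc), 1))
```

The geometric mean keeps one scenario from dominating the overall number the way a plain average would.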
What's new in SYSmark 2001?

SYSmark 2001 has added a number of evolutionary and breakthrough approaches to benchmarking. Here is a list of those additions:

Response Time Metric: Response time, in the context of SYSmark 2001, is defined as the latency between the submission of a request by the user and the completion of the processing of that request by the application. In SYSmark 2001, only the response time of individual operations is included in the performance metric. Unlike SYSmark 2000, which measured end-to-end run time, SYSmark 2001 ignores the time to send keystrokes and mouse clicks to the application. In the real world these operations are done by the human user, so they are not timed.

Realistic benchmark execution speed with think time: In the real world, users type at normal speed with pauses in between. Running a benchmark as fast as possible with an automation tool that sends super-fast keystrokes is not realistic, since normal users cannot drive applications at that speed. In SYSmark 2001, a think time of up to one second is added between operations; the think time is not included in the performance measurement. The think time emulates a desktop user's interaction with the operating system and applications. Operating system behavior is more realistic when application interaction includes think times (just like with a real user), because the OS can devote itself to other bookkeeping activities (such as memory management and scheduling). The think time also makes the automation script more robust across varied platforms.

Background Execution: SYSmark 2001 reflects the current usage models of office productivity users and content creators. The user generally has a number of applications running in the background while the focus is on the primary work being done. BAPCo has attempted to model this in the Office Productivity scenario by having disk compression, virus scan and speech-to-text translation occur in the background while Office 2000 documents are being created in the foreground. In the Internet Content Creation scenario, video encoding runs concurrently with other web page creation activities.

Streamlining Installation/De-installation: SYSmark 2001 installs all the included applications in an encrypted format when the benchmark itself is installed. Subsequent runs of the benchmark only decrypt the installations and invoke the applications. The applications are uninstalled only when the user uninstalls SYSmark 2001. This approach adds to the stability of the benchmark by dramatically reducing the number of installs and uninstalls.

New Applications: BAPCo has added Outlook 2000, WinZip 8.0, Access 2000, McAfee VirusScan 5.13, Macromedia Dreamweaver 4 and Macromedia Flash 5 to SYSmark 2001. These new applications reflect the current workflow in today's desktop scenarios.

Is this enough information for you? I don't know how much more will satisfy the curiosity of the masses, but this pretty much convinced me that this benchmark is worth using. (I've also put a quick timing sketch below my sig to show how the response-time-only measurement with think time and background work plays out.)

wanna_bmw
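Here's that sketch: a minimal Python illustration of timing only the foreground operation while think time falls outside the timed region and a background task keeps running. This is my own stand-in, not BAPCo's harness; the operation lambdas and the busy-loop background task are hypothetical placeholders.

```python
import threading
import time
import random

def run_background_task(stop_event):
    """Stand-in for a background workload (e.g. a virus scan or encoding job)
    that keeps running while foreground operations are measured."""
    while not stop_event.is_set():
        sum(i * i for i in range(10_000))  # busy work

def timed_operation(operation):
    """Measure only the operation's response time. Think time happens
    outside the timed region, mirroring the metric described above."""
    start = time.perf_counter()
    operation()                            # the request whose latency we measure
    elapsed = time.perf_counter() - start
    time.sleep(random.uniform(0.0, 1.0))   # think time of up to one second, NOT counted
    return elapsed

if __name__ == "__main__":
    stop = threading.Event()
    background = threading.Thread(target=run_background_task, args=(stop,), daemon=True)
    background.start()

    # Hypothetical foreground "operations" standing in for application requests.
    operations = [lambda: time.sleep(0.2), lambda: sum(range(2_000_000)) and None]
    response_times = [timed_operation(op) for op in operations]
    stop.set()

    print("mean response time: %.3f s" % (sum(response_times) / len(response_times)))
```

The point is just that the stopwatch brackets only the request itself, so slower typing or longer pauses never inflate the score, while the background thread still competes for CPU the way a real scan or encode would.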