Hardware Variations
Microsoft ran tests on a Dell PowerEdge 2400 server configured with two Intel 667MHz/133MHz Pentium III processors, 1GB of RAM, and four 9GB, 10,000 RPM SCSI-3 drives attached to either a PERC2/Si RAID controller or, in some configurations, a standard SCSI controller.
Variations of this hardware were applied to different load simulations.
The following variables were tested:
- RAM: 256MB, 512MB, 1GB
- Processor: 1 or 2
- Controller: SCSI and hardware RAID
Load Simulation Variations
The server task load was meant to represent two typical types of businesses in 10-user and 50-user environments. The first business type is professional services: businesses that consist primarily of "knowledge workers" who share many documents and send and receive a lot of e-mail.
The second business type is manufacturing operations: businesses with a few knowledge workers and a larger percentage of factory or floor workers who send and receive some e-mail and regularly update an inventory transaction system. The daily activity for each user type is listed below, with both profiles summarized as data after the two lists.
Typical Professional Services User
- Sends 25 e-mail messages (10KB to 1MB), receives 50 messages, and reviews or updates their schedule four times per day
- Visits 8 Web sites per day
- Prints five 3-page documents per day
- Sends 4 faxes per day
- Runs 5 SQL Server queries per day
Typical Manufacturing Operations User
- Runs 25 SQL Server queries per day
- Sends 10 e-mail messages (10KB to 1MB), receives 20 messages, and reviews or updates their schedule four times per day
- Visits two Web sites per day
- Prints five 3-page documents per day
- Sends 4 faxes per day
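The two profiles above can be represented as simple data that a load-generation script consumes. The structure and field names in the following sketch are assumptions for illustration only and are not taken from the original test scripts.

```python
# Hypothetical encoding of the two daily-activity profiles described above.
# Field names are illustrative; they do not come from the original test harness.
PROFILES = {
    "professional_services": {
        "emails_sent": 25,        # 10KB to 1MB each
        "emails_received": 50,
        "schedule_updates": 4,
        "web_sites_visited": 8,
        "print_jobs": 5,          # 3 pages each
        "faxes_sent": 4,
        "sql_queries": 5,
    },
    "manufacturing_operations": {
        "emails_sent": 10,        # 10KB to 1MB each
        "emails_received": 20,
        "schedule_updates": 4,
        "web_sites_visited": 2,
        "print_jobs": 5,          # 3 pages each
        "faxes_sent": 4,
        "sql_queries": 25,
    },
}
```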
The tests used client-side scripts to generate load against the server, based on the two business scenarios described above. The scripts were run from a harness machine, which connected to a SQL Server database that stored the list of available scripts and the probability that each would run. The harness machine was dual-homed and retrieved this information from a SQL Server database outside of the Small Business Server network.
The harness machine would then start a script on one of the three client machines on the Small Business Server network.
After a script ran, the client reported success or failure to the harness machine, which sent that information back to the SQL Server database on the external network. The harness also recorded the Small Business Server performance counters to the external SQL database.
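The following is a minimal Python sketch of the kind of dispatch loop such a harness might run. The connection string, table names, and the run_on_client helper are assumptions for illustration; the actual harness schema and remote-execution mechanism are not documented here.

```python
import random
import pyodbc  # assumes an ODBC driver for SQL Server is installed

# Hypothetical connection string and schema for the external harness database.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=external-sql;DATABASE=harness;Trusted_Connection=yes")

def load_script_table(cursor):
    """Read the available scripts and the probability that each one runs."""
    cursor.execute("SELECT script_name, probability FROM scripts")
    return cursor.fetchall()

def record_result(cursor, script_name, client, succeeded):
    """Write a success/failure row back to the external database."""
    cursor.execute(
        "INSERT INTO results (script_name, client, succeeded) VALUES (?, ?, ?)",
        (script_name, client, int(succeeded)),
    )

def run_on_client(client, script_name):
    """Hypothetical helper: start the named script on a client machine and
    return True on success. The real harness used its own remote-execution mechanism."""
    return True  # placeholder result for this sketch

clients = ["client1", "client2", "client3"]
cursor = conn.cursor()
for script_name, probability in load_script_table(cursor):
    if random.random() < probability:
        client = random.choice(clients)          # pick one of the three SBS clients
        ok = run_on_client(client, script_name)  # run the script remotely
        record_result(cursor, script_name, client, ok)
conn.commit()
```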
Each test ran for at least an hour after a 15- to 30-minute warm-up period, which produced more accurate data. The following is a breakdown of the workloads used in these scenarios.
- Microsoft SQL Server Tests
The SQL Server tests were generated by using client-side scripts to exercise a SQL Server workload. The clients ran scripts that created a table, ran a transaction against the table, or dropped the database. The size of the database and table varied across the scripts, but the database size never exceeded 500MB.
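A client script of this sort might look roughly like the following Python sketch using pyodbc. The connection string, table name, and column layout are illustrative only and are not taken from the actual test scripts.

```python
import random
import pyodbc

# Illustrative connection string; the actual test database name is not documented.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=sbs-server;DATABASE=loadtest;Trusted_Connection=yes")
cursor = conn.cursor()

action = random.choice(["create", "transact", "drop"])

if action == "create":
    # Create a small working table; sizes varied across scripts,
    # but the database was kept under 500MB overall.
    cursor.execute(
        "IF OBJECT_ID('orders', 'U') IS NULL "
        "CREATE TABLE orders (id INT IDENTITY PRIMARY KEY, qty INT, note VARCHAR(100))")
elif action == "transact":
    # Run a simple transaction against the working table.
    cursor.execute("INSERT INTO orders (qty, note) VALUES (?, ?)",
                   (random.randint(1, 50), "load test row"))
else:
    # The original scripts could also drop the entire database;
    # this sketch only drops the working table.
    cursor.execute("IF OBJECT_ID('orders', 'U') IS NOT NULL DROP TABLE orders")

conn.commit()
```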
It is important to note that the load simulations performed during these tests used a fairly light SQL Server load.
Companies running a line-of-business application based on SQL Server should be more conservative in their hardware planning.
Refer to Microsoft SQL Server 2000 documentation for more detailed hardware planning information.
- Microsoft Exchange Server Tests
Exchange Server load was simulated by using scripts running on three client machines sending randomly sized e-mail messages. Message size ranged from 10KB to 1MB, although the randomization was weighted so that smaller messages made up a larger relative percentage.
The scripts were run from all three workstations when generating loads for 50 users.
The scripts were configured to send internal e-mail to clients with attachment sizes varying from 10KB to 1MB, and also to delete Inbox items.
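As an illustration, the weighted size distribution and message sending described above could be sketched in Python as follows. The server name, addresses, size buckets, weights, and use of SMTP are assumptions for this sketch; the exact mechanism used by the original scripts is not documented here.

```python
import random
import smtplib
from email.message import EmailMessage

# Illustrative weighting: most messages are small, a few approach 1MB.
SIZE_BUCKETS_KB = [10, 50, 100, 500, 1024]
WEIGHTS = [50, 25, 15, 7, 3]   # assumed relative percentages

def build_message(sender, recipient):
    """Build a message with an attachment of a randomly chosen, weighted size."""
    size_kb = random.choices(SIZE_BUCKETS_KB, weights=WEIGHTS, k=1)[0]
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Load test message"
    msg.set_content("Load test body")
    msg.add_attachment(b"x" * size_kb * 1024,
                       maintype="application", subtype="octet-stream",
                       filename=f"payload_{size_kb}kb.bin")
    return msg

# Hypothetical server and mailboxes; not taken from the test environment.
with smtplib.SMTP("sbs-server") as smtp:
    for _ in range(10):
        smtp.send_message(build_message("user1@smallbusiness.local",
                                        "user2@smallbusiness.local"))
```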
- Microsoft ISA Server Tests
Internet Security and Acceleration (ISA) Server load was simulated by using scripts that downloaded files of various sizes using HTTP and File Transfer Protocol (FTP). The file sizes ranged from 10KB to 70MB.
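A download script of this kind could look roughly like the following Python sketch, which alternates between HTTP and FTP retrievals using the standard library. The host names and file names are placeholders; only the 10KB to 70MB size range comes from the test description.

```python
import random
import urllib.request
from ftplib import FTP

# Hypothetical URLs and FTP paths; the actual test files ranged from 10KB to 70MB.
HTTP_FILES = ["http://testhost/files/small_10kb.bin",
              "http://testhost/files/medium_5mb.bin",
              "http://testhost/files/large_70mb.bin"]
FTP_FILES = ["files/small_10kb.bin", "files/large_70mb.bin"]

def download_http(url):
    """Fetch a file over HTTP and return the number of bytes received."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())

def download_ftp(host, path):
    """Fetch a file over FTP and return the number of bytes received."""
    received = 0
    def count(chunk):
        nonlocal received
        received += len(chunk)
    with FTP(host) as ftp:
        ftp.login()                      # anonymous login for this sketch
        ftp.retrbinary(f"RETR {path}", count)
    return received

# Randomly pick a protocol and file, as the load scripts varied their downloads.
if random.random() < 0.5:
    download_http(random.choice(HTTP_FILES))
else:
    download_ftp("testhost", random.choice(FTP_FILES))
```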
- Fax Activity
Fax activity was also generated by a script that exercised the Microsoft Fax Service and, to some degree, the Print Spooler service. This fax activity was used to provide additional load on the spooler service rather than to provide any metric on printing performance.
Faxes sent to the Fax Service are rendered just like standard print jobs, so the fax load is representative of regular print activity. The script randomly chose a document size ranging from 1 page to 5 pages.
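The document-selection part of such a script is sketched below in Python. The document pool, fax number, and submit_fax helper are hypothetical placeholders; the actual submission mechanism used by the Fax Service script is not reproduced here.

```python
import random

# Hypothetical pool of documents, one per page count from 1 to 5.
DOCUMENTS = {pages: f"fax_{pages}page.doc" for pages in range(1, 6)}

def submit_fax(document, destination):
    """Hypothetical placeholder for handing a document to the Microsoft Fax Service,
    which renders the job through the Print Spooler like a standard print job."""
    print(f"Submitting {document} to fax destination {destination}")

pages = random.randint(1, 5)                 # choose a document size at random
submit_fax(DOCUMENTS[pages], "555-0100")     # illustrative fax number
```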
Services Not Included in the Tests
- Microsoft Internet Information Server
Although Microsoft Internet Information Server (IIS), the built-in Web server for Windows 2000 Server, was running during the test, we did not run either intranet or Internet Web tests during the load simulation. There were two primary reasons for not including IIS in the performance load tests:
- Most small businesses do not maintain and support a high-bandwidth, high-availability Web site locally. Public Web hosting is usually more cost-effective for small businesses than maintaining a high-speed Internet connection and full-time IS staff. This changes the role of IIS to that of a small business intranet server, where the number of clients typically would not exceed 50.
- IIS performance is primarily dependent on bandwidth. Most small businesses do not have a fast enough Internet connection to place a significant load on IIS. The most common small business Internet connection is still a modem; the second most common is a mix of medium-bandwidth solutions such as ISDN, ADSL, cable, and Frame Relay. At these connection speeds, the additional load placed on IIS, even with maximum bandwidth use, is negligible.
If you plan to host a Web site, hardware requirements may increase depending on your expected use.
- File Traffic
While the Fax Service and Exchange (messages with attachments) were generating some level of file and print traffic, we did not simulate additional heavy file traffic.
- Third-Party Applications
One notable area in which the test server differed from a production server is that no backup program or anti-virus software was being run.
Also, while SQL Server was thoroughly tested, many third parties have built their own Windows 2000 Server-based services, which can create additional hardware requirements.
However, these tests did run Performance Monitor and Task Manager, which may account for some of the resources that a third-party application would consume on the server. Performance Monitor was capturing a log with almost every available counter, so perfmon.exe was using almost 5MB of RAM rather than the typical 1.5MB.
In addition, Task Manager used an additional 1MB of RAM.