SGI Performance Comparisons

Software Installation of IRIX 6.2 (Default Configuration)

Last Change: 08/May/2008

This analysis is identical in method to that given for INTEGER TEST 1, except that the discussion which follows is less detailed, i.e. I concentrate on the results. For an explanation of how this analysis is performed and the rationale behind it, see the explanations given for INTEGER TEST 1.

Table A summarises the time to read the installation tools from the CDROM, the total time taken for the OS installation, and how the results compare to the slowest system (R4600PC/100 with 2X CDROM).

                                 Time         Time to
                     CDROM   to Read the    install the    FACTOR
                     SPEED   Installation   2 base CDs    COMPARED
                             Tools (m:ss)     (mm:ss)    TO SLOWEST

Indigo2 R10000SC/175: 32         0:32          17:30        0.43
Indigo2 R4400SC/250:  32         1:07          18:02        0.44
Indigo2 R4400SC/250:   2         1:54          20:40        0.51
Indy R4400SC/200:     32         0:46          21:48        0.53
Indy R4400SC/200:      2         1:51          25:11        0.62
Indy R4600PC/133:     32         0:50          35:58        0.88
Indy R4600PC/133:      2         1:51          39:07        0.96
Indy R4600PC/100:     32         0:50          38:00        0.93
Indy R4600PC/100:      2         1:51          40:48        1.00

  Table A: IRIX 6.2 Software Installation Timing Summary
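
The FACTOR column is each system's total installation time divided by that of the slowest system (Indy R4600PC/100 with a 2X CDROM, 40:48). A minimal sketch of that calculation, using the Table A data:

```python
# Reproduce the FACTOR column of Table A: each total installation
# time divided by that of the slowest system (40:48).

def seconds(mmss):
    """Convert an 'mm:ss' string to a number of seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

totals = {                      # system: time to install the 2 base CDs
    "Indigo2 R4400SC/250 32X": "18:02",
    "Indigo2 R4400SC/250  2X": "20:40",
    "Indy R4400SC/200    32X": "21:48",
    "Indy R4400SC/200     2X": "25:11",
    "Indy R4600PC/133    32X": "35:58",
    "Indy R4600PC/133     2X": "39:07",
    "Indy R4600PC/100    32X": "38:00",
    "Indy R4600PC/100     2X": "40:48",
}

slowest = seconds("40:48")      # Indy R4600PC/100 with 2X CDROM

for system, total in totals.items():
    print(f"{system}: {seconds(total) / slowest:.2f}")
```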

It's rather strange that my Indigo2 R4400SC/250 with a 32X CDROM takes so long to read the installation tools (17 seconds longer than the R4600PC/100 Indy). Installing IRIX 6.5 on the same Indigo2 took much less time to read the installation tools (47 seconds). I ran the tests again but saw the same results: something is slowing down the Indigo2's ability to access the CDROM on that first read (it doesn't seem to happen during the main installation); anyone have any ideas?

Even so, it's clear that reading data from a 2X CDROM leaves the main CPU starved for data, i.e. the bottleneck is the CDROM's ability to get data off the CD and into the system. The evidence for this is that the time taken to read the installation tools does not correlate with CPU power.

In contrast to the 2X results, a faster CPU helps when reading data from a 32X CDROM, to the extent that the time taken to carry out a brief action (such as reading the installation tools) probably depends more on how a system's I/O hardware handles the data once it has left the CDROM. Since the O2, Indigo2 and Indy have different I/O subsystems (very different in the case of O2), it's easy for differences unrelated to CPU power to creep into the timing results. Thus, to properly evaluate how different CPUs utilise a 32X CDROM, one needs more demanding tests. Relevant tests of this type are present on this page in the form of a study of how different CDROMs are exploited during the installation of IRIX 6.5, and of how different systems cope when copying different types of file from a CDROM straight to disk [one large file | lots of small files].

Remember that the OS installation process for 6.2 is not the same as that for 6.5. Two CDs form the basic 6.2 OS set; each CD is dealt with separately, using a 'delay_conflicts' flag to avoid installation conflicts. As a result, the columns in Table B below do not represent the same events as those shown in Table 1 for installing IRIX 6.5.

                        ************* Sub-task Completion Times ***************
                        ------ 1st CD ------    ------ 2nd CD ------   -Post-
                   CD   Pre-    Exit    1ST     Pre-    Exit    2ND    rqsall
                   ROM  inst    Coms     CD     inst    Coms     CD     ELF
                        Ends    Begin   Done    Ends    Begin   Done    libs

I2 R10000SC/175:   32   00:14   05:53   07:47   08:02   12:25   14:02   17:30
I2 R4400SC/250:    32   00:15   05:12   07:45   08:05   13:08   15:16   18:02
I2 R4400SC/250:     2   00:15   06:20   08:52   09:11   15:34   17:47   20:40
Indy R4400SC/200:  32   00:20   06:18   09:26   09:53   16:01   18:39   21:48
Indy R4400SC/200:   2   00:20   07:40   10:47   11:14   19:00   22:00   25:11
Indy R4600PC/133:  32   00:32   09:01   15:54   16:35   26:08   31:45   35:58
Indy R4600PC/133:   2   00:32   10:33   17:30   18:09   29:14   34:54   39:07
Indy R4600PC/100:  32   00:34   09:43   16:55   17:38   27:51   33:35   38:00
Indy R4600PC/100:   2   00:34   11:06   18:14   18:57   30:40   36:24   40:48

         Table B: Detailed Timings for a Default IRIX 6.2 OS Installation

Table C shows the time taken for each individual step in the installation process, i.e. the times obtained by subtracting each column in Table B from the next.

                   CD   PRE-           EXIT    PRE-            EXIT    RQSALL
                   ROM  INST    CD1    COMS    INST     CD2    COMS     ELF

I2 R4400SC/250:    32   00:15  04:57   02:33   00:20   05:03   02:08   02:46
I2 R4400SC/250:     2   00:15  06:05   02:32   00:19   06:23   02:13   02:53
Indy R4400SC/200:  32   00:20  05:58   03:08   00:27   06:08   02:38   03:09
Indy R4400SC/200:   2   00:20  07:20   03:07   00:27   07:46   03:00   03:11
Indy R4600PC/133:  32   00:32  08:29   06:53   00:41   09:33   05:37   04:13
Indy R4600PC/133:   2   00:32  10:01   06:57   00:39   11:05   05:40   04:13
Indy R4600PC/100:  32   00:34  09:09   07:12   00:43   10:13   05:44   04:25
Indy R4600PC/100:   2   00:34  10:32   07:08   00:43   11:43   05:44   04:24

   Table C: Individual times for each installation step (IRIX 6.2)
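
Since Table B records cumulative completion times, each Table C entry is simply the difference between consecutive Table B columns. A small sketch of the subtraction, using the I2 R4400SC/250 (32X) row as sample data:

```python
# Derive one row of Table C from the corresponding cumulative row in
# Table B by subtracting each column from the next.

def seconds(mmss):
    """Convert an 'mm:ss' string to a number of seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

def mmss(total):
    """Format a number of seconds back into 'mm:ss'."""
    return f"{total // 60:02d}:{total % 60:02d}"

# Cumulative sub-task completion times for the I2 R4400SC/250 at 32X
# (Table B).
cumulative = ["00:15", "05:12", "07:45", "08:05", "13:08", "15:16", "18:02"]

times = [seconds(t) for t in cumulative]
steps = [times[0]] + [b - a for a, b in zip(times, times[1:])]

print([mmss(s) for s in steps])
# Matches the Table C row: 00:15, 04:57, 02:33, 00:20, 05:03, 02:08, 02:46
```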

Table D gives the percentage differences for the paired results, i.e. 2X vs. 32X for each system. The times for all steps that don't involve the CDROM are added together, then for each system pair the higher total is divided by the lower. These variations should be small if the non-CDROM steps are to be treated as independent of CDROM speed:

                    TOTAL TIMES      PERCENTAGE
                    2X vs. 32X       VARIATION

I2 R4400SC/250:    08:12 / 08:02        2.1%
Indy R4400SC/200:  10:05 / 09:42        4.0%
Indy R4600PC/133:  18:01 / 17:56        0.5%
Indy R4600PC/100:  18:33 / 18:38        0.5%

   Table D: Paired result differences.

These variations are satisfactorily small. In fact, if you look at the individual stage times in Table C, many of the relevant paired results are identical.
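
The Table D figures can be reproduced directly from Table C: sum every column except CD1 and CD2 for each row, then compare each system's 2X and 32X totals. A sketch, using two of the system pairs as sample data:

```python
# Reproduce Table D: sum the steps that don't involve the CDROM
# (every Table C column except CD1 and CD2), then divide the higher
# total of each 32X/2X pair by the lower.

def seconds(mmss):
    """Convert an 'mm:ss' string to a number of seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

# Non-CDROM step times from Table C: (32X row, 2X row) for each system.
pairs = {
    "I2 R4400SC/250":   (["00:15", "02:33", "00:20", "02:08", "02:46"],
                         ["00:15", "02:32", "00:19", "02:13", "02:53"]),
    "Indy R4400SC/200": (["00:20", "03:08", "00:27", "02:38", "03:09"],
                         ["00:20", "03:07", "00:27", "03:00", "03:11"]),
}

for system, (row32x, row2x) in pairs.items():
    t32 = sum(seconds(t) for t in row32x)
    t2 = sum(seconds(t) for t in row2x)
    variation = (max(t32, t2) / min(t32, t2) - 1) * 100
    print(f"{system}: {variation:.1f}%")
```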

Now for the all-important comparison: the average performance of each system compared to that of the slowest system (Indy R4600PC/100). The numbers denote how much faster each system executed the stages compared to the slowest system (i.e. smaller = better). Table 4 showed these figures for each installation stage, but installation times under 6.2 are roughly half those under 6.5, so timing errors have a greater effect; thus, the stage times have been added together to reduce the effect of error. The results are in Table E.


I2 R4400SC/250:     0.44
Indy R4400SC/200:   0.53
Indy R4600PC/133:   0.97
Indy R4600PC/100:   1.00

Table E: Comparison to Indy R4600PC/100
   performance (factor differences)
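
On my reading of the method described above (this is an assumption, not a statement of the author's exact procedure), the Table E factors fall out of the Table D totals: average each system's 2X and 32X non-CDROM totals, then divide by the slowest system's average. A sketch:

```python
# Sketch of how the Table E factors appear to be derived: average each
# system's 2X and 32X non-CDROM totals from Table D, then divide by
# the slowest system's average. (My reconstruction, not necessarily
# the author's exact procedure.)

def seconds(mmss):
    """Convert an 'mm:ss' string to a number of seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

# (2X total, 32X total) of the non-CDROM steps, from Table D.
totals = {
    "I2 R4400SC/250":   ("08:12", "08:02"),
    "Indy R4400SC/200": ("10:05", "09:42"),
    "Indy R4600PC/133": ("18:01", "17:56"),
    "Indy R4600PC/100": ("18:33", "18:38"),
}

def average(pair):
    return sum(seconds(t) for t in pair) / 2

slowest = average(totals["Indy R4600PC/100"])

for system, pair in totals.items():
    print(f"{system}: {average(pair) / slowest:.2f}")
```

Reassuringly, this reconstruction reproduces the published factors (0.44, 0.53, 0.97, 1.00) exactly.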

As with installing 6.5, there is little overall benefit in having an R4600PC/133 as opposed to an R4600PC/100, but when one moves to a higher-clocked CPU with some L2 cache, the performance improvement is considerable.

Also, the figures in Table E have a similar profile to the final averages given in Table 4. This shows that the OS installation tasks are:

  1. consistent between 6.2 and 6.5,

  2. a usable comparative measure of integer performance (involving disk accesses) between systems.

Note: to calculate percentage improvements, subtract each number from 1 and multiply by 100, e.g. the Indigo2 R4400SC/250 is 56% faster overall than the Indy R4600PC/100 for processing all the tasks that do not involve the CDROM.

Remember: these tests involved a lot of disk accesses. An integer task which did not access the disk so much would show an even greater speed improvement on faster systems, e.g. the ImgLab tests. These results show that a real-world integer task will indeed be faster on better systems, but the important point is that one shouldn't rely on synthetic benchmarks alone when estimating system performance - the improvement one eventually sees may not be as great as expected if one's task involves disk operations.

NB: I leave the analysis of the tasks which involve accessing the CDROM (Table C, columns CD1 and CD2) as an exercise for the reader, since a complete analysis of CDROM performance during the installation of 6.5 is already available in CDROM TEST 1. The main reason I have done INTEGER TEST 7 is to show that the analysis given in INTEGER TEST 1 is valid.