Grosch's law is the following observation about computer performance, made by Herb Grosch in 1953:[1]

I believe that there is a fundamental rule, which I modestly call Grosch's law, giving added economy only as the square root of the increase in speed — that is, to do a calculation ten times as cheaply you must do it a hundred times as fast.

This adage is more commonly stated as:

Computer performance increases as the square of the cost. If computer A costs twice as much as computer B, you should expect computer A to be four times as fast as computer B.[2]

A decade after Grosch's statement, Seymour Cray was quoted in Business Week (August 1963) expressing the same idea:

Computers should obey a square law — when the price doubles, you should get at least four times as much speed.[3]

The law can also be interpreted to mean that computers exhibit economies of scale: the more expensive the computer, the better its price–performance ratio, with performance per unit cost improving linearly as cost rises. This implies that low-cost computers cannot compete in the market.
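In symbols (P for performance, C for cost, and k a constant of proportionality, introduced here only to restate the square law; they do not appear in the cited sources):

    \[
        P = k C^{2}
        \quad\Longrightarrow\quad
        \frac{P}{C} = k C
    \]

On this reading, a machine costing ten times as much delivers a hundred times the performance, and therefore ten times the performance per unit cost.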

An analysis of rental cost/performance data for computers between 1951 and 1963 by Kenneth E. Knight found that Grosch's law held for both commercial and scientific operations[4] (a modern analysis of the same data found that Grosch's law applied only to commercial operations[5]). In a separate study, Knight found that Grosch's law did not apply to computers between 1963 and 1967[6] (a finding also confirmed by the aforementioned modern analysis[5]).
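Analyses of this kind ask whether measured performance scales roughly as the square of cost. A minimal sketch of such a test (with made-up illustrative numbers and variable names, not Knight's or Jones's actual data or method) fits log performance against log cost and checks whether the slope is near 2:

    import math

    # Hypothetical (relative cost, relative performance) pairs, not Knight's data.
    machines = [
        (1.0, 1.0),
        (2.0, 3.9),
        (4.0, 16.5),
        (8.0, 62.0),
    ]

    # Ordinary least-squares fit of log(performance) = a + b * log(cost).
    xs = [math.log(cost) for cost, _ in machines]
    ys = [math.log(perf) for _, perf in machines]
    n = len(machines)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
            sum((x - x_bar) ** 2 for x in xs)

    print(f"fitted exponent = {slope:.2f} (Grosch's law predicts about 2)")

A fitted exponent well below 2 for later machines would correspond to the finding that the law no longer held after 1963.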

Debates

Paul Strassmann asserted in 1997 that "it was never clear whether Grosch's Law was a reflection of how IBM priced its computers or whether it related to actual costs. It provided the rationale that a bigger computer is always better. The IBM sales force used Grosch's rationale to persuade organizations to acquire more computing capacity than they needed. Grosch's Law also became the justification for offering time-sharing services from big data centers as a substitute for distributed computing."[7] Grosch himself has stated that the law was more useful in the 1960s and 1970s than it is today. He originally intended the law to be a "means for pricing computing services".[8]

References

  1. Grosch, H. R. J. (1953). "High Speed Arithmetic: The Digital Computer as a Research Tool". Journal of the Optical Society of America. 43 (4): 306–310. doi:10.1364/JOSA.43.000306.
  2. Lobur, Julia; Null, Linda (2006). The Essentials of Computer Organization and Architecture. Jones & Bartlett. p. 589. ISBN 0-7637-3769-0.
  3. "Computers get faster than ever". Business Week. 31 August 1963. p. 28.
  4. Knight, Kenneth E. (September 1966). "Changes in Computer Performance" (PDF). Datamation. Vol. 12, no. 9. pp. 40–54.
  5. Jones, Derek (April 30, 2016). "Cost/performance analysis of 1944-1967 computers: Knight's data".
  6. Knight, Kenneth E. (January 1968). "Evolving Computer Performance 1963-1967" (PDF). Datamation. pp. 31–35.
  7. Strassmann, Paul A. (February 1997). "Will big spending on computers guarantee profitability?". Datamation. Excerpts from The Squandered Computer.
  8. Gardner, W. David (April 12, 2005). "Author Of Grosch's Law Going Strong At 87". TechWeb News. Archived from the original on 2006-03-26. Article discussing Grosch's law and Herb Grosch's career.