To understand '''Moore's Law''' we need to understand '''exponential growth''' first. In our context, it makes sense to consider quantities that grow over time, but exponential growth applies to a broader spectrum of things. However, if you understand exponential growth in our context, its application to other areas will make sense.
<br />
====Exponential Growth====
<br />
Something exhibits exponential growth if its size doubles every fixed interval of time. A cute puzzle for which people often get the wrong answer will get us started.
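
To make the doubling rule concrete, here is a minimal Python sketch (an illustration, not part of the original notes) based on the chessboard-and-grains puzzle referred to below: one grain on the first square, and twice as many on each following square.

<source lang="python">
# A quantity grows exponentially when it doubles at every step.
# Print the number of grains on the first few squares of the chessboard,
# starting with 1 grain on square 1 and doubling on each following square.

grains = 1
for square in range(1, 11):        # squares 1 through 10
    print("square", square, ":", grains, "grains")
    grains = grains * 2            # doubling at every step = exponential growth

# On square n the count is 2**(n-1); on square 64 it is 2**63,
# roughly 9.2 quintillion grains.
</source>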
<center>[[Image:CSC103_ExponentialGrowthLogarithmicScale.png|450px]]</center>
<br />
+ | |||
+ | With a logarithmic scale for the y-axis, an exponential growth appears as a straight line. And now the relationship that exists between the amounts of grains on the squares of low order is plainly visible. The logarithmic scale is the key to fully expose the property of exponential growths, and exposes them as straight line in a plot where the x-axis is linear (what we normally use for simple graphs), and the y-axis logarithmic. | ||
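
A quick way to see why the line is straight: taking the logarithm of a doubling sequence turns repeated multiplication into repeated addition. The short Python sketch below (an illustration, not part of the original notes) prints the base-2 logarithm of the grain counts; it increases by exactly 1 from one square to the next, which plots as a straight line.

<source lang="python">
from math import log2

# Grain counts on the first 10 squares: 1, 2, 4, 8, ... (doubling each time).
grains = [2 ** (n - 1) for n in range(1, 11)]

# Their base-2 logarithms are 0, 1, 2, 3, ... -- they increase by a constant
# step, which is exactly what a straight line on a logarithmic y-axis means.
for n, g in enumerate(grains, start=1):
    print("square", n, "grains", g, "log2", log2(g))
</source>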
+ | |||
+ | <br /> | ||
+ | ====Moore's Law==== | ||
+ | <br /> | ||
Gordon E. Moore, the co-founder of Intel, is credited with the observation that the number of transistors in integrated circuits had doubled every 18 months from 1958 to 1965<ref name="moore1965">Gordon E. Moore, "Cramming More Components onto Integrated Circuits," ''Electronics'', pp. 114–117, April 19, 1965.</ref>. At the time, Moore predicted that the trend would continue for "at least 10 more years."

Almost 50 years later, the trend has held remarkably well, as illustrated by the diagram below, taken from [http://njtechreviews.com/2011/09/04/moores-law/ NJTechReviews.com].
+ | |||
+ | <br /> | ||
+ | <center>[[File:MooresLaw.jpg]]</center> | ||
+ | <br /> | ||
+ | |||

Notice that the growth of the number of transistors is quite straight in this logarithmic plot. It doesn't really matter that it is not perfectly straight; it is straight enough to see that the number of transistors has indeed doubled every 18 months or so over many decades. This is a very remarkable trend that has been observed in different areas associated with computer technology.
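
As a rough back-of-the-envelope check (an illustration with assumed round numbers, not a figure from the notes), the sketch below computes the growth factor implied by a doubling every 18 months over 10, 20, and 40 years: roughly a factor of 100 per decade, and about 100 million over four decades.

<source lang="python">
# If a quantity doubles every 18 months (1.5 years), how much does it
# grow over 10, 20, and 40 years?

doubling_period = 1.5              # years per doubling (18 months)

for years in (10, 20, 40):
    doublings = years / doubling_period
    factor = 2 ** doublings
    print(years, "years:", round(doublings, 1), "doublings,",
          "growth factor of about", format(factor, ",.0f"))

# About a 100-fold increase per decade, and roughly a
# 100-million-fold increase over 40 years.
</source>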
+ | |||
+ | We saw very similar straight lines when we were discussing the Von Neumann bottleneck. The curves didn't show the growth of the number of transistors but the performance of the CPU and of the RAM. The performance is usually measured by the amount of instructions processed per second, or the amount of data accessed per second, and it is interesting to see that these quantities have grown exponentially as well. | ||
+ | |||
+ | |||
[[Image:GordonMoore.jpg|right]]
--© D. Thiebaut 08:10, 30 January 2012 (EST)