 
|}

<br />
We are now coming back to John Von Neumann in an effort to understand the concept of the ''[http://en.wikipedia.org/wiki/Von_Neumann_architecture#Von_Neumann_bottleneck Von Neumann Bottleneck]'', a term that appears from time to time in newspaper articles written for the general public, such as the article by Steve Lohr in the ''New York Times Bits'' section, titled "Big Data, Speed, and the Future of Computing," published Oct. 31, 2011<ref name="Lohr">Steve Lohr, Big Data, Speed, and the Future of Computing, ''New York Times Technology'', Oct. 31, 2011.</ref>.
 
John Von Neumann, if you remember, was a brilliant mathematician and renaissance man who, in 1945, after having studied the design of the EDVAC computing machine at the Moore School of Electrical Engineering at the University of Pennsylvania, wrote a report with his recommendations for how a computing machine should be organized and built.  There are many remarkable things about this report:
* The first remarkable thing about this report is that it was a synthesis of the best ideas of the time, gathered from the few people involved in building such machines, and as such it presented a blueprint for the machines to come.
* Even more remarkable is that the draft was never officially published, yet it circulated widely within the small group of experts interested in the subject, and when new research groups started getting interested in building calculating machines, they would use the report as a guide.
* Remarkably, as more and more engineers and mathematicians got interested in building such machines, they kept following the earlier design guidelines, which, while they evolved with technology, stuck to the basic principles laid out by Von Neumann.
* The most remarkable fact of all is that chances are very high that the computer you are currently using to read this document (laptop, desktop, tablet, phone) contains a processor built on these original principles!
 
These principles were that good!  They offered such an attractive design that, for the past 70 years or so, engineers have kept building computers this way.
 
So what is this bottleneck and why is it bad?
 
The answer is that we have had an ever-increasing thirst for more and more complex programs that solve more and more complex problems.  This appetite for solving larger and more complex problems has forced computers to evolve in two simple, complementary ways: each new generation of computers has had to be faster, and the size of its memory has had to increase.  Nate Silver provides a very good example illustrating these complementary pressures on computer hardware in his book ''The Signal and the Noise''<ref name="silver">Nate Silver, ''The Signal and the Noise: Why So Many Predictions Fail-but Some Don't'', Penguin, 2012.</ref>.

Recent hurricanes have shown an interesting competition between the different models used to predict the weather, and in particular the paths of hurricanes around the world.  Some models are European, others American, and Superstorm Sandy in October 2012 showed that some models were better predictors than others.  In that particular case, the European models predicted the path of Sandy more accurately than their American counterparts.  As a result, there is pressure on the National Center for Atmospheric Research (NCAR) to upgrade its computing power.  Why?
 
The reason is that to predict the weather, one has to divide the earth into a grid of large square cells covering its entire surface.  Each cell delimits an area of the earth for which many parameters are recorded using various sensors and technologies (temperature, humidity, daylight, cloud coverage, wind speed, etc.).  A series of equations captures the influence that each parameter in a cell of the grid exerts on the parameters of neighboring cells, and a computer model simply looks at how the different parameters have evolved in a given cell of the grid over a period of time, and how they are likely to continue evolving in the future.
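To make this more concrete, here is a minimal sketch, in Python, of such a grid-based simulation.  It is not how real forecast models work in detail (the tiny grid, the single temperature parameter, and the simple neighbor-averaging rule below are assumptions made just for illustration), but it shows the general idea: each cell holds values, and at every time step the values of neighboring cells influence each other.

<source lang="python">
# A toy grid-based "weather" simulation: the earth's surface is represented
# by a small 2D grid of cells, each holding a single parameter (a temperature).
# At every time step, each cell moves part of the way toward the average of
# its neighbors, mimicking how neighboring cells influence each other in a
# real model.  (Real forecast models track many parameters per cell and use
# far more sophisticated equations; this is only an illustration.)

ROWS, COLS = 6, 8        # a tiny grid; real models use millions of cells
STEPS = 10               # number of simulated time steps

# Start with a uniform temperature everywhere, plus one warm spot.
grid = [[15.0] * COLS for _ in range(ROWS)]
grid[2][3] = 30.0

def step(g):
    """Return a new grid in which each cell has moved toward its neighbors' average."""
    new_g = [row[:] for row in g]
    for r in range(ROWS):
        for c in range(COLS):
            neighbors = []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < ROWS and 0 <= nc < COLS:
                    neighbors.append(g[nr][nc])
            avg = sum(neighbors) / len(neighbors)
            # Move 20% of the way toward the neighbors' average temperature.
            new_g[r][c] = g[r][c] + 0.2 * (avg - g[r][c])
    return new_g

for t in range(STEPS):
    grid = step(grid)

# After a few steps the warm spot has spread to (and warmed up) its neighbors.
print(round(grid[2][3], 2), round(grid[2][4], 2))
</source>

The important point is that refining the grid (more, smaller cells) or tracking more parameters per cell multiplies both the amount of memory needed to hold the grid and the number of computations needed at every time step, which is exactly why better forecasts keep demanding faster computers with more memory.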
 
 
<br />
  


--© D. Thiebaut 08:10, 30 January 2012 (EST)















