CSC103: DT's Notes 1

 
The recent hurricanes have shown an interesting competition between the different models that predict the weather, and in particular the paths of hurricanes around the world.  Some models are European, others American, and Superstorm Sandy in October 2012 showed that some models were better predictors than others.  In that particular case, the European models predicted the path of Sandy more accurately than their American counterparts.  As a result, there is a push for the National Center for Atmospheric Research (NCAR) to upgrade its computing power.  Why?
 
  
The reason is that to predict the weather one has to divide the earth into a grid of large squares covering its surface.  Each square delimits an area of the earth for which many parameters are recorded using various sensors and technologies (temperature, humidity, daylight, cloud coverage, wind speed, etc.).  A series of equations captures the influence that each parameter in a cell of the grid exerts on the parameters of neighboring cells, and a computer model simply looks at how the different parameters have evolved in a given cell of the grid over a period of time, and how they are likely to continue evolving in the future.  The larger the cells, though, the coarser the prediction.  The way to improve the prediction is therefore to make the cells smaller.  For example, one could divide the side of each square in the grid by two.  Doing so, however, increases the number of cells in the grid by a factor of four.  If you are not sure why, draw a square on a piece of paper and divide it in half vertically and in half horizontally: you get 4 smaller squares inside the original one.
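
Here is a small sketch of this counting argument in Python (illustrative only; the region size, cell sizes, and function name are made up for the example): if a region of side L is covered by square cells of side h, halving h quadruples the number of cells.

<source lang="python">
# Illustrative sketch: count the square cells of side h needed to cover
# a square region of side L (both numbers are made up for the example).
def cell_count_2d(L, h):
    cells_per_side = L / h          # cells along one edge of the region
    return cells_per_side ** 2      # total number of squares in the grid

L = 1000.0                          # side of the region, arbitrary units
print(cell_count_2d(L, 100.0))      # 100.0 cells with spacing 100
print(cell_count_2d(L, 50.0))       # 400.0 cells: halving the side gives 4x
</source>
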
What does that mean for the computation of the weather prediction, though?  Well, if we have 4 times more squares, then we need to gather 4 times more data, and there will be 4 times more computation to perform.  But wait!  The weather does not happen only at ground level; it also takes place in the atmosphere.  So our grid is not a grid of squares, but a grid of cubes.  And if we divide the side of each cube in half, we get 8 sub-cubes.  So we actually need 8 times more data, and we will have 8 times more computation to perform.  But wait!  There is yet another element that comes into play: time!  Winds travel at a given speed.  So the computation that expects wind to enter one side of our original cube at some point in time and exit the opposite side some interval of time later now needs to be performed twice as often, since that same wind crosses a sub-cube of the grid in half the time.
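
The same counting argument extends to three dimensions and to time; the short Python sketch below (numbers chosen only to show the scaling) recovers the factor of 16 discussed next.

<source lang="python">
# Illustrative sketch of the scaling argument: halving the side of every
# cube multiplies the number of cubes by 2**3 = 8, and because winds now
# cross a sub-cube in half the time, the computation must also be run
# twice as often.
spatial_factor = 2 ** 3         # 8 times more cubes in the 3D grid
time_factor    = 2              # time step halved, so twice as many steps
total_factor   = spatial_factor * time_factor
print(total_factor)             # 16 times more computation overall
</source>
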
In short, if NCAR decides to refine the grid it uses to compute its weather prediction and divides the side of each cell by two, it will have 8 x 2 = 16 times more computation to perform.  And since a weather prediction takes a long time to compute and must be finished in no more than 24 hours to have any chance of predicting tomorrow's weather, performing 16 times more computation in the same 24 hours (see the short calculation after the list below) will require a new computer with
* a processor 16 times faster than that of the previous computer, and
* a memory that can hold 16 times more data than previously.
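
Keeping 16 times more computation within the same 24-hour window is exactly what forces the hardware upgrade.  Here is the back-of-the-envelope check mentioned above, in Python with purely illustrative numbers:

<source lang="python">
# Illustrative sketch: a forecast that takes 24 hours today would take
# 16 times longer on the same machine once the grid is refined.
hours_per_forecast = 24
work_multiplier    = 16
old_machine_hours  = hours_per_forecast * work_multiplier
print(old_machine_hours)            # 384 hours, i.e. 16 days -- far too late
print(old_machine_hours / 16)       # 24.0 hours again on a 16x faster machine
</source>
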
 
Nate Silver makes the clever observation that since computer performance has been doubling roughly every two years<ref name="mooreslaw">Moore's Law, Intel Corporation, 2005. ftp://download.intel.com/museum/Moores_Law/Printed_Material/Moores_Law_2pg.pdf</ref>, a 16-fold increase in performance (16 = 2 x 2 x 2 x 2, or four doublings) requires buying a new computer after roughly 4 x 2 = 8 years, which is about the frequency with which NCAR upgrades its main computers!
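
Nate Silver's observation can be checked with one line of arithmetic; here is a short sketch (the two-year doubling period is the rough Moore's-law figure cited above):

<source lang="python">
import math

# Illustrative sketch: with performance doubling roughly every 2 years,
# a 16-fold speedup takes log2(16) = 4 doublings, i.e. about 8 years.
doubling_period_years = 2
speedup_needed        = 16
doublings = math.log2(speedup_needed)           # 4.0 doublings
print(doublings * doubling_period_years)        # 8.0 years between upgrades
</source>
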
<br />
 
<br />


--© D. Thiebaut 08:10, 30 January 2012 (EST)


