--&copy; [[User:Thiebaut|D. Thiebaut]] 2012, 2013, 2014 <br />
Last revised --[[User:Thiebaut|D. Thiebaut]] ([[User talk:Thiebaut|talk]]) 08:05, 9 October 2013 (EDT)
----
 
__TOC__
 
<br />
<center>[[CSC103 Notes, Newer Version| '''Newer Version, 2014''']]</center>
<br />
  
  
 
  
 
=CSC103 How Computers Work--Class Notes=
 
<onlysmith>
 
{| style="width:100%; background:#FFC340"
|-
|
===Current Computer Design is the Result of an Evolutionary Process===
|}
<br />
<center>[[File:SteamBoyDT.png|700px]] </center><br />In this course we are going to look at the computer as a tool, as the result of technological experiments that have currently crystallized into a particular design (the von Neumann architecture), a particular source of energy (electricity), a particular fabrication technology (silicon transistors), and a particular representation of information (the binary system).  Any of these could have been different, depending on many factors.  In fact, in the next ten or twenty years, one or more of these fundamental parts that make up today's computers could change.
 
 
 
  
 
[http://www.imdb.com/title/tt0348121/ '''Steamboy'''], a Japanese anim&eacute; by director Katsuhiro Ohtomo (who also directed ''Akira''), is interesting for more than the story of a little boy who is searching for his father, a scientist who has discovered a secret method for controlling high-pressure steam.  What is interesting is that the movie is science fiction taking place not in the future, but in the middle of the 19th century, in a world where steam power and steam machines are much more advanced than they actually were at that time.  One can imagine that certain events took place and certain discoveries were made in the world portrayed in the animated film, and that technology evolved in quite a different direction, bringing with it new machines: steam-controlled tank-like vehicles, ships, or flying machines.
 
 
One can argue that if von Neumann hadn't written this report, we might have followed somebody else's brilliant idea for putting together a machine working with electricity, where information is stored and operated on in binary form.  Our laptops today could be using a different architecture, and programming them might be a totally different type of problem solving.
 
  
[[File:AntikytheraMecanism.png|right|thumb|400px| Antikythera Mechanism, photo taken by Tilemahos Efthimiadis, National Archaeological Museum, Athens, Greece, taken from commons.wikimedia.org, July 28 2014. Released under the Creative Commons Attribution 2.0 Generic license.]] For computers were not always electrical machines.  Initially they were mechanical machines.  The abacus, which appeared several millennia B.C., was a counting machine made of wood.  The [http://en.wikipedia.org/wiki/Antikythera_mechanism Antikythera] mechanism is currently regarded as the first mechanical machine for computing astronomical calculations.  Also mechanical, an important machine in the history of computers is '''[http://en.wikipedia.org/wiki/Difference_engine Babbage's Difference Engine]'''.  This one was made of gears and shafts, with a crank at the top, and was a step toward Babbage's later ''general purpose'' machine, the Analytical Engine.  Interestingly, this machine has given us an expression we still use with modern electronic computers:  we still hear programmers refer to "cranking out" the results, even though the crank is long gone.
  
 
The same is true of '''silicon transistors''' powered by electricity.  Silicon is the material of choice for the microprocessor and semiconductor circuits we find in today's computers.  Its appeal lies in its ability to either conduct or not conduct electricity, depending on a signal it receives, which is itself electrical.  Silicon allows us to create electrical switches that are very fast, very small, and consume very little power.  But because we are very good at creating semiconductor
 
 
|-
|
===Binary System===
|}
 
</tanbox>
<br />
[[File:IceCreamCup3Balls.png|thumb|right|400px|D. Thiebaut, Ice Cream, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]
 
We can devise three boolean variables that can be true or false depending on three properties of a container of ice cream: ''choc'', ''fruit'', and ''HG''.  ''choc'' is true if the ice cream contains some chocolate, ''fruit'' is true if the ice cream contains fruits, and ''HG'' is true if the ice cream is from Haagen Dazs.  A boolean function, or expression, which we'll call ''isgood'', built from ''choc'', ''fruit'', and ''HG'', and which turns true whenever the ice cream is one our friend will like, would be this:
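As an aside, such an expression is easy to test in code (here using the Processing language, which is introduced later in these notes).  The particular form of ''isgood'' below, Haagen Dazs ice cream containing chocolate or fruit, is only an assumption made for the example:

<source lang="java">
void setup() {
  // One particular container of ice cream:
  boolean choc  = true;    // contains chocolate
  boolean fruit = false;   // contains no fruit
  boolean HG    = true;    // made by Haagen Dazs

  // Hypothetical form of the expression, for illustration only:
  boolean isgood = HG && ( choc || fruit );
  println( "isgood = " + isgood );   // prints: isgood = true
}
</source>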
 
  
 
|-
|
===Shannon's MIT Master's Thesis: the missing link===
|}
  
 
<br />
<center>[[File:ANDORGatesWithSwitches.png|500px|thumb|AND OR gates with switches. D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
<br />
 
|}
 
  
Now, if we observe the first table, we should recognize the table for the '''and''' operator!  So it is true: arithmetic on bits can actually be done as a logic operation.  But is it true of the '''S''' bit?  We do not recognize the truth table of a known operator.  But remember the ice cream example; we can probably come up with a logic expression that matches this table.  An easy way to come up with this expression is to express it in English first and then translate it into a logic expression:
  
 
::'''S''' is true in two cases: when '''a''' is true and '''b''' is false, or when '''a''' is false and '''b''' is true.
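
Translated directly into a logic expression, this sentence gives '''S''' = ( ''a'' and (not ''b'') ) or ( (not ''a'') and ''b'' ).  As a quick check, here is a minimal Processing sketch (Processing is introduced later in these notes) that verifies both the '''S''' and '''C''' expressions on all four combinations of inputs:

<source lang="java">
void setup() {
  boolean[] values = { false, true };
  for ( boolean a : values ) {
    for ( boolean b : values ) {
      boolean S = ( a && !b ) || ( !a && b );   // the sum bit, from the sentence above
      boolean C = a && b;                       // the carry bit: the and operator
      println( "a=" + a + "  b=" + b + "  -->  S=" + S + "  C=" + C );
    }
  }
}
</source>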
 
 
|-
|
===Discoveries===
|}
  
 
<br />
<center>[[Image:LogicGatesAndOrNot.png|thumb|300px|Inverter, And, and Or gates. D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license. ]]</center>
<br />
  
 
<br />
<center>[[File:ICAndGate.jpg|thumb|600px|Integrated Circuit, AND gate. D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
<br />
  
 
The image above shows an ''integrated circuit'' (IC) close up.  In reality the circuit is about as long as a quarter (and with newer technology even smaller).  Inside the IC are just 4 AND gates.  There are other ICs that contain different types of gates, such as OR gates or inverters.
 
<br />
<br />
  
 
==Building a Two-Bit Adder with Logic Gates==
 
 
To implement it with logic gates we make ''a'' and ''b'' inputs, and ''f'' the output of the circuit.  Then ''b'' is fed into an inverter gate (NOT), and the output of the inverter into the input of an AND gate.  The other input of the AND gate is connected to the ''a'' signal, and the output becomes ''f''.
 
 
<br />
<center>[[Image:aANDNOTb.png|frame|A and Not B, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
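
As a sanity check, the truth table of this little circuit can be computed with a few lines of Processing (introduced later in these notes); ''f'' is simply ''a'' AND ( NOT ''b'' ):

<source lang="java">
void setup() {
  boolean[] values = { false, true };
  for ( boolean a : values ) {
    for ( boolean b : values ) {
      boolean f = a && !b;    // b goes through the inverter, then into the AND gate
      println( "a=" + a + "  b=" + b + "  -->  f=" + f );
    }
  }
}
</source>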
  
 
<br />
  
 
<br />
<center>[[Image:2-bitAdderGates.png|frame|350px|2-Bit Adder, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
 
<br />
<br />
 
|-
|
=Computer Simulator=
|}
 
Let's assume that we want to play a very simple game based on ''coding''.  The game is quite easy to understand: we want two people to have a conversation where each word or sentence they can say is limited to a small set of preselected sentences, each one associated with a number.  When the two people talk to each other, they must pick the number corresponding to the sentence, question, or answer they want to say.
 
  
[[File:Conversation.jpg|thumb|500px|right|Daniel Coy, "Conversation", online image, https://flic.kr/p/7mWZpb, Captured July 2014.]]
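The two players are, in effect, sharing a ''code book''.  Here is a minimal sketch of the idea in Processing; the four sentences used here are made up for the example, and the actual list used in class appears below:

<source lang="java">
void setup() {
  // A tiny code book: each sentence is known by its number.
  String[] sentences = { "Hello!", "How are you?", "Fine, thank you.", "Good bye!" };

  // One possible conversation, spoken entirely in numbers:
  int[] conversation = { 0, 1, 2, 3 };
  for ( int code : conversation ) {
    println( code + " --> " + sentences[ code ] );
  }
}
</source>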
  
 
Here's a list of numbers and their associated sentences:
 
 
|-
|
===Computer Memory===
|}
  
 
<br />
<center>[[File:NandFlipFlop1.png|thumb|600px|Nand Flipflop 1, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
<br />
 
You may want to spend the time building it up with our [http://tinyurl.com/103applets circuit simulator].
 
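
The behavior of the flipflop can also be modeled in software.  The sketch below is a minimal Processing model of two cross-coupled NAND gates; the input names and the active-low convention are assumptions made for this example:

<source lang="java">
// A NAND gate: the output is false only when both inputs are true.
boolean nand( boolean x, boolean y ) {
  return !( x && y );
}

void setup() {
  boolean q = false, qbar = true;        // an arbitrary starting state
  boolean setN = false, resetN = true;   // pulse the "set" input (active low)
  for ( int i = 0; i < 3; i++ ) {        // let the cross-coupled gates settle
    q    = nand( setN, qbar );
    qbar = nand( resetN, q );
  }
  println( "Q=" + q + "  Qbar=" + qbar );  // Q=true: the flipflop remembers the bit
}
</source>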
  
 
<br />
<center>[[File:NandFlipFlop2.png|thumb|600px|Nand Flipflop 2, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
<br />
  
 
with similar circuits.
 
<br /><center>[[File:Scale.gif|frame|Animated Scale, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]</center>
<br />
 
|-
|
===The Processor===
|}
 
</bluebox>
 
<br />
[[Image:Calculator.png|right|thumb|300px|Ilnanny, "Calculator", online image, openclipart.org/image/800px/svg_to_png/170371/1338745223.png, captured Aug. 1st, 2014.]]
 
The processor has three important registers that allow it to work in this machine-like fashion: the '''PC''', the '''Accumulator''' (shortened to '''AC'''), and the '''Instruction Register''' ('''IR''' for short).  The PC is used to "point" to the address in memory of the next word to bring in.  When this number enters the processor, it must be stored somewhere so that the processor can figure out what kind of action to take.  This holding place is the '''IR''' register.  The way the '''AC''' register works is best illustrated by the way we use a regular hand calculator.  Whenever you enter a number into a calculator, it appears in the display of the calculator, indicating that the calculator actually holds this value somewhere internally.  When you type a new number that you want to add to the first one, the first number disappears from the display, but you know it is kept inside because as soon as you press the = key the sum of the first and of the second number appears in the display.  It means that while the calculator was displaying the second number you had typed, it still had the first number stored somewhere internally.  For the processor there is a similar register used to keep intermediate results.  That's the '''AC''' register.
 
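
To make the roles of the three registers concrete, here is a toy Processing simulation of the fetch-decode-execute cycle.  The 2-digit instruction encoding used here (1x = load x into AC, 2x = add x to AC, 0 = halt) is completely made up for this example and is not the encoding used by our simulator:

<source lang="java">
void setup() {
  int[] memory = { 15, 23, 0 };   // a tiny program: load 5, add 3, halt
  int PC = 0, AC = 0, IR = 0;
  while ( true ) {
    IR = memory[ PC ];            // fetch the word PC points to, into IR
    PC = PC + 1;                  // PC now points to the next word
    if ( IR == 0 ) break;         // decode: 0 means halt
    int data = IR % 10;           // low digit: the operand
    if ( IR / 10 == 1 )      AC = data;         // 1x: load data into AC
    else if ( IR / 10 == 2 ) AC = AC + data;    // 2x: add data to AC
  }
  println( "AC = " + AC );        // prints: AC = 8
}
</source>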
 
<br />
[[File:PrintedCircuitBoard.jpg|250px|thumb|Barney Livingston, "BBC B - PCB, CPU removed," online image, farm1.staticflickr.com/83/235291503_080d9656a8_o_d.jpg, captured Aug. 1st, 2014.]]
<br />
 
All the processor gets from the memory cells it reads are ''numbers''.  Remember, that's the only thing we can actually create in a computer: groups of bits.  So each memory cell's number is read by the processor.  How does the number move from memory to the processor?  The answer: on metal wires, each wire transferring one bit of the number.  If you have ever taken a computer apart and taken a look at its ''motherboard'', you will have seen such wires.  They are there for bits to travel back and forth between the different parts of the computer, and in particular between the processor and the memory.  The image to the right shows the wires carrying the bits.  Even though it seems that some wires do not go anywhere, they actually connect to tiny holes that go through the motherboard and allow them to continue on the other side, allowing wires to cross each other without touching.
 
 
|-
|
===The Cookie Monster Analogy===
|}
[[File:CookieMonsterPacMan.png|right|thumb|350px|Cookie Monster, D. Thiebaut, 2014, Released under the Creative Commons Attribution 2.0 Generic license.]]
 
The processor is just like the cookie monster.  But a cookie monster acting like Pac-Man, a Pac-Man that follows a straight path made of big slabs of cement, where there's a cookie on each slab.  Our Cookie-Monster-Pac-Man
 
|-
|
===Instructions and Assembly Language===
|}
 
So in short, if the NCAR decides to refine the size of the grid it uses to compute its weather prediction, and divides it by two, it will have 8 x 2 = 16 times more computation to perform.  And since weather prediction takes a lot of time, and should be done in no more than 24 hours to actually have a good chance to predict the weather tomorrow, performing 16 times more computation in the same 24 hours will require a new computer with:
 
 
* a processor 16 times faster than the last computer used,
* a memory that can hold 8 times more data than previously.
  
 
Nate Silver makes the clever observation that since computer performance has been doubling roughly every two years<ref name="mooreslaw">Moore's Law, Intel Corporation, 2005. ftp://download.intel.com/museum/Moores_Law/Printed_Material/Moores_Law_2pg.pdf</ref>, getting an increase of 16 in performance requires buying a new computer after 8 years, which is roughly the frequency with which NCAR upgrades its main computers!
 
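
The arithmetic behind this observation is a simple doubling computation, sketched here in Processing:

<source lang="java">
void setup() {
  int factor  = 8 * 2;   // 16 times more computation, as explained above
  int speedup = 1;
  int years   = 0;
  while ( speedup < factor ) {
    speedup = speedup * 2;   // performance doubles...
    years   = years + 2;     // ...every two years
  }
  println( years + " years" );   // prints: 8 years
}
</source>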
 
<br />
 
  
That's '''one of the limiting aspects''' of the Von Neumann bottleneck.  Using our previous metaphor of the cookie monster, it is akin to having our cookie monster walk on a treadmill where cookies are dropped in front of him at regular intervals; the cookie monster becomes faster and faster at walking the treadmill and eating cookies, but the treadmill, while speeding up as well, cannot keep up with the cookie monster.
  
The '''second limiting problem''' of the Von Neumann bottleneck is that the processor is the '''center of activities''' for everything in the computer.  Everything has to go through it.  Instructions, data: everything that is in memory is there '''for''' the processor, which will have to access it, read it, and modify it at least once during its time in memory, and sometimes multiple times.  This is a huge demand on the processor.  Remember the Accumulator register (AC) in our processor simulator?  Any data whatsoever that is in memory will at some point have to go into AC, to be either moved somewhere else or modified.  To get an idea of what this represents, imagine that the AC register is the size of a dime.  Since a register holds a memory word, the size of a memory word would be the same.  In today's computers, the Random Access Memory (RAM) contains from 4 billion to 8 billion memory words.  4 billion dimes would cover a football field.  Von Neumann gave us a design where the computation is done in a tiny area while the data spans a huge area, and there is no other way to process the data than to bring it into the processor.  That's the second aspect of the Von Neumann bottleneck.
  
 
<br />
 
  
{| style="width:100%; background:#FFD373"
|-
|
===Moore's Law===
|}
<br />
To understand '''Moore's Law''' we need to understand '''exponential growth''' first.  In our context, it makes sense to consider quantities that grow over time, but exponential growth applies to a broader spectrum of things.  However, if you understand exponential growth in our context, its application to other areas will make sense.

<br />
====Exponential Growth====
<br />
Something grows exponentially if its size doubles every fixed interval of time.  A cute puzzle, for which people often get the wrong answer, will get us started.

[[Image:LilyInPond.png|200px|right]]
::''Suppose a lily is in the middle of a pond, and every day it doubles in size.  In 30 days the lily has covered half of the pond.  How long will it take it to cover the whole pond?''
If you answered 31 days, then congratulations!  Indeed, if the lily doubles in size every day, then the day after Day 30 it will cover twice half of the pond: the whole pond!  This simple example shows the powerful nature of exponential growth.  It took the lily 30 days to cover the first 50% of the pond, but only 1 more day to cover the remaining 50%.
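
A quick computation drives the point home.  Since the pond is fully covered on Day 31 and the lily doubles daily, the fraction covered on day ''d'' is 2<sup>''d''-31</sup>; the short Processing sketch below prints the last week of growth (one week before the end, the lily covers less than 1% of the pond):

<source lang="java">
void setup() {
  for ( int day = 24; day <= 31; day++ ) {
    float fraction = pow( 2, day - 31 );    // doubles every day, reaching 1.0 on day 31
    println( "day " + day + ": " + fraction * 100 + "% of the pond" );
  }
}
</source>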
  
There are many such puzzles in our cultures that attempt to demonstrate the extraordinary power of exponential growth through seemingly impossible feats.  Another one is the story of grains of rice on a chessboard as told by David R. Henderson and Charles L. Hooper<ref name="rice">David R. Henderson and Charles L. Hooper, ''Making Great Decisions in Business and Life'', Chicago Park Press, 1st edition, March 12, 2007.</ref>:
<br />
:::''In a time of hunger, the Emperor of China wanted to repay a peasant who had saved the life of his child. The peasant could have any reward he chose, but the Emperor laughed when he heard the silly payment the foolish peasant selected: rice on a chessboard. The peasant wanted one grain of rice on the first square, doubling to two on the second, doubling to four on the third, and so on. After the Emperor agreed, his servants brought one bag of rice into his court and began tediously counting rice. Soon, he called for more and more bags of rice...''
<br />
It turns out that it is impossible to grant such a wish, as the number of grains that would have to be put on the 64th square of the 8-by-8 chessboard is 2 times 2 times 2... 63 times over.  That is a product of 63 factors of 2, or 2<sup>63</sup> = 9,223,372,036,854,775,808, about 9 quintillion grains!  Much more than the quantity of rice produced on earth in a single year.
Let's stay with this second example and write down the number of the square followed by the quantity of grains of rice:
  
<center>
{| class="wikitable" style="text-align: center; color: green;"
! square
! grains
|-
| 1
| 1
|-
| 2
| 2
|-
| 3
| 4
|-
| 4
| 8
|-
| 5
| 16
|-
| 6
| 32
|-
| ...
| ...
|}
</center>
The left column represents the number of the square being covered, and this number increases by 1 every time.  The second column represents the quantity of interest, the number of grains, which doubles every time.  So that is the setup for studying our exponential growth: something that doubles in size every fixed interval, in our case every new square.
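
The whole column of numbers is easy to generate by brute force.  The sketch below uses Java's BigInteger class, since the number of grains on the 64th square, 2<sup>63</sup>, is one more than the largest value a Java long can hold:

<source lang="java">
import java.math.BigInteger;

void setup() {
  BigInteger grains = BigInteger.ONE;   // one grain on the first square
  for ( int square = 1; square <= 64; square++ ) {
    println( "square " + square + ": " + grains + " grains" );
    grains = grains.multiply( BigInteger.valueOf( 2 ) );   // double for the next square
  }
}
</source>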
  
Let's plot these numbers to observe their growth (plots generated with [[CSC103 Plotting with R|R]]).
<br />
<center>[[Image:CSC103_ExponentialGrowth1.png|450px]]</center>
<br />
Note the quick growth of the points in the graph.  To get the full impact of the exponential growth we need to plot a few more points, first from Square 1 to 13, then from Square 1 to 25, and finally from Square 1 all the way to 64, as illustrated in the plots below:
<br />
<center>
[[Image:CSC103_ExponentialGrowth2.png|250px]] &nbsp;&nbsp;&nbsp;&nbsp;
[[Image:CSC103_ExponentialGrowth3.png|250px]] &nbsp;&nbsp;&nbsp;&nbsp;
[[Image:CSC103_ExponentialGrowth4.png|250px]]
</center>
<br />
What is important to see here is that as we show more and more squares, the actual growth that takes place for squares of low order, say, less than 40, is completely obfuscated by the large size of the quantities associated with the squares of higher order, even though the number of grains on the 40th square is an impressive 549,755,813,888!
The last plot in the series, where all the squares from 1 to 64 are shown, actually flattens everything except for Squares 55 and up.  It also shows why people have referred to exponential-growth curves as "hockey-stick" curves, for the long flat growth and sudden turn upward, as the picture of a hockey stick below perfectly illustrates.
<br />
<center>[[Image:HockeyStick.gif]]</center>
<br />
So, is there a better way to display the growth of the number of grains of rice over the complete range of squares of the chessboard, one that would show that there was growth at all levels?  The answer is yes.  We can use a ''logarithmic'' scale to display the quantities of grains.  In a logarithmic scale, numbers are arranged in such a way that any pair of numbers, one of which is double the other, will always be the same distance from each other on the scale.  The figure below shows a horizontal logarithmic scale.
<br />
<center>[[Image:LogarithmicScale1.jpg|650px]]</center>
<br />
This scale "squishes" very large numbers and emphasizes smaller ones.  The next figure illustrates the property that the distance between any number and its double is constant.  For example, the distance between 4 and 8, 20 and 40, or 100 and 200 is the same in all three cases.
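
This property is easy to verify numerically: the position of a number ''x'' on a logarithmic scale is log(''x''), and log(2''x'') - log(''x'') = log(2), no matter what ''x'' is.  A three-line check in Processing:

<source lang="java">
void setup() {
  println( log( 8 ) - log( 4 ) );       // 0.6931472
  println( log( 40 ) - log( 20 ) );     // 0.6931472
  println( log( 200 ) - log( 100 ) );   // 0.6931472: always log(2)
}
</source>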
<br />
<center>[[Image:LogarithmicScale2.jpg|650px]]</center>
<br />
What happens when we apply this way of scaling the y-axis of the plot of the number of grains for all 64 squares of the chessboard is pretty remarkable.
<br />
<center>[[Image:CSC103_ExponentialGrowthLogarithmicScale.png|450px]]</center>
<br />
With a logarithmic scale for the y-axis, an exponential growth appears as a straight line.  And now the relationship that exists between the amounts of grains on the squares of low order is plainly visible.  The logarithmic scale is the key to fully exposing the property of exponential growths: they appear as straight lines in a plot where the x-axis is linear (what we normally use for simple graphs) and the y-axis logarithmic.
<br />
====Moore's Law====
<br />
[[Image:GordonMooreCC2.png|right|300px]]
Gordon E. Moore, the co-founder of Intel, is credited with the observation that the number of transistors in integrated circuits had doubled every 18 months from 1958 to 1965<ref name="moore1965">Gordon E. Moore, Cramming More Components onto Integrated Circuits, ''Electronics'', pp. 114–117, April 19, 1965.</ref>.  At the time Moore predicted that the trend would continue for "at least 10 more years."
Almost 50 years later the trend has held remarkably well, as illustrated by the diagram below taken from [http://njtechreviews.com/2011/09/04/moores-law/ NJTechReviews.com].
<br />
<center>[[File:MooresLaw.jpg]]</center>
<br />
Notice that the growth of the number of transistors is quite straight in this logarithmic plot.  It doesn't really matter that it is not absolutely straight: it is straight enough to see that the number of transistors has doubled, indeed, every 18 months or so over many decades.  This is a very remarkable trend that has been observed in different areas associated with computer technology.
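
To see what an 18-month doubling period implies, the sketch below projects a transistor count forward in time.  The starting point, the Intel 4004's 2,300 transistors in 1971, is a well-known data point, but the projection itself is only a back-of-the-envelope sketch:

<source lang="java">
void setup() {
  double transistors = 2300;   // the Intel 4004, released in 1971
  for ( double year = 1971; year <= 2014; year += 1.5 ) {
    println( (int) year + ": about " + (long) transistors + " transistors" );
    transistors = transistors * 2;   // one doubling every 18 months
  }
}
</source>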
We saw very similar straight lines when we were discussing the Von Neumann bottleneck.  The curves didn't show the growth of the number of transistors, but the performance of the CPU and of the RAM.  Performance is usually measured by the number of instructions processed per second, or the amount of data accessed per second, and it is interesting to see that these quantities have grown exponentially as well.
<br />
====The End of Moore's Law====
<br />
The article "The End of Moore's Law"<ref name="endMooresLaw">Tim Worstall, The End of Moore's Law, ''Forbes'', Aug. 29, 2013, http://www.forbes.com/sites/timworstall/2013/08/29/darpa-chief-and-intel-fellow-moores-law-is-ending-soon/, retrieved 9/29/13.</ref> that appeared in August 2013 in ''Forbes''  is typical of many articles that have been published for many years, and will continue being published for years to come.
The main point that these authors are making is that when you have an exponential law, the quantity measured eventually reaches values that are physically impossible.  For example, if you look at transistors, the number of transistors packed in a surface that is roughly 1 square inch is now surpassing the billion mark.  That means that the size of the individual transistors is becoming smaller and smaller, and at some point they will reach the size of a few atoms.  And because we can't build anything (yet) smaller than atoms, we will be stuck, and the law will abruptly stop its trend.
There is a very nice video from Intel showing the fabrication process of integrated circuits.  You will notice at some point that the process of making an integrated circuit relies on depositing layers of material, sometimes metal, sometimes semiconductor, irradiating them with some form of electromagnetic wave, and then removing different patterns of layers with acid.  By repeating the process a very intricate network is created, linking different transistors to each other into blocks, and linking blocks with other blocks.  The movie gives a good sense of the overwhelming complexity of the resulting structure.  When you have more than a billion transistors in a square inch connected together by a giant network of wires, some limits become very real.
So the point made by articles predicting the end of Moore's law is simply that we are designing ever smaller transistors, and that the size of an atom is on the horizon.  The Forbes article puts the end of Moore's law at around 2020.
<br />
<center><videoflash>d9SWNLZvA8g</videoflash></center>
<br />
{| style="width:100%; background:#FFC340"
|-
|
== Introduction to the Processing Language ==
|}
<br />
[[File:ProcessingLogo.jpg | 150px|right]]
Fry and Reas present a nice and concise introduction to Processing in <ref name="FryReas_AISoc">C. Reas, B. Fry, Processing: Programming for the Media Arts, ''AI & Society'' (2006) 20: 526–538, DOI 10.1007/s00146-006-0050-9, (http://hlt.media.mit.edu/dfe_readings/processing.pdf)</ref>.  Quoting from their paper:
<blockquote>
Processing is a programming language and environment built for the media arts communities. It is created to teach fundamentals of computer programming within the media arts context and to serve as a software sketchbook. It is used by students, artists, designers, architects, and researchers for learning, prototyping, and production. This essay discusses the ideas underlying the software and presents its relationship to open source software and the idea of software literacy. Additionally, Processing is discussed in relation to education and online communities.
</blockquote>
<br />
{| style="width:100%; background:#FFD373"
|-
|
===How does Processing work?===
|}
<br />
====General Block Diagram====
<br /><center>[[Image:ProcessingDotOrgGeneralArchitecture.png|600px]]</center><br />
When a Processing program is started, two actions take place:
# the '''setup()''' function is executed by the processor.  This happens only once.  This is always the first action taken, and for this reason this function is a good place to put code that will define the shape of the applet, and any settings that will remain constant throughout the execution of the program.
# the '''draw()''' function is executed.  And as soon as it is finished, it is restarted again.  In general the repeat rate is 30 times per second, corresponding to a ''frame rate'' fast enough for our human eyes not to see the ''flickering''.
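
Below is a minimal skeleton showing the two functions; nothing interesting is drawn yet, but it illustrates the division of labor:

<source lang="java">
void setup() {
  size( 300, 300 );   // runs once: create a 300x300 graphics window
}

void draw() {
  background( 255 );                      // repainted about 30 times per second
  ellipse( width/2, height/2, 50, 50 );   // a circle in the middle of the window
}
</source>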
Processing is a language that is built on top of another language: '''Java'''.  Java is compatible with all three major operating systems (Windows, OS X, or Linux), and is free.  Java is also strong in its graphics abilities.  All these properties made it a good support language for Fry and Reas to pick when they created Processing.
====Where does a Processing Program actually run?====
The short answer: on your computer!  If you are looking at a Processing applet that is running on a Web site, then the applet is downloaded first by the browser, and run inside the browser.  If you are running a Processing program that you just created with the Processing editor, then it's also running on your computer.  Because it is running on your computer, it can easily perform many useful tasks for programmers:
* Get characters typed at the keyboard
* Listen to what the mouse is doing, including changing its position, or whether buttons are pushed, clicked, or released.
* Display graphic shapes on the display window
* Play sound files (e.g. MP3 files)
* Play video files
* etc... (see the [http://processing.org/exhibition/ exhibition page] on Processing for additional examples of interactions).
====Writing a Processing Sketch====
In Processing, programs are called ''sketches''.

First we need to download and install Processing on our computer.  See the [http://processing.org/download/ Processing Download] page for more information.

Next we start the ''Integrated Development Environment'' (IDE), which is a fancy word for ''editor'', and we enter our programs.
<br />
<center>
[[Image:ProcessingIDE1.png| 300px]]
</center>
<br />
====Running and Stopping a Processing Program====
'''Starting''' a program is simple: just click on the round button with a triangle in it, at the top left of the IDE, and this should start the program.
<br />
<center>
[[Image:ProcessingIDE2.png|500px]]
</center>
<br />
To '''stop''' the program you can either close the graphics window, or click on the round button with a black square in the IDE.
<br />
{| style="width:100%; background:#FFD373"
|-
|
===Translating Assembly Examples into Processing===
|}
<br />
We saw some very simple examples in assembly language for doing very simple operations.  Nevertheless, these simple problems required complex assembly language programs to solve them.  We now look at these same problems, but this time using a "high-level" language: Processing.
====Example 1: Sum of 2 variables====
* The following program adds the contents of two variables together and stores the resulting sum in a third variable.
<br />
<source lang="java">
void setup() {
  int a = 10;
  int b = 20;
  int c = 0;

  c = a + b;
  println( "c = " + c );
}
</source>
<br />
* Some explanations:
** '''int a''' indicates that we are creating a variable in memory called '''a'''.  We are also specifying that it will contain an integer number, which in Processing is called an '''int'''.  Also, we start the program by storing 10 in variable '''a'''.
** Similarly, '''int b''' means that we have a variable called '''b''' that contains an int.  The program starts by storing 20 into '''b'''.
** '''int c''', you will have guessed, means that c is also a variable and when the program starts '''c''' contains 0.
** The program then adds a to b and stores the result into '''c'''.
** Finally the program prints two pieces of information: a sentence "c =", followed by the actual value of '''c'''.
====Example 2====
* This example program generates and prints all the numbers from 10 down to 1.  It is similar to a program we saw in assembly.
<br />
<source lang="java">
void setup() {
  int count = 10;
  while ( true ) {
    println( count );
    count = count - 1;
    if ( count == 0 ) break;
  }
  println( "done!" );
}
</source>
<br />
* Some explanations:
** The program uses a variable called '''count''' that is an ''integer'' and contains 10 at the start of the program.
** The next statement is '''while (true) {  ...  }''', which means ''repeat forever'' whatever is inside the curly brackets.
** Inside the curly brackets are 3 statements.
*** The first one prints the contents of '''count''' on the screen.
*** The second one subtracts 1 from '''count'''.
*** The third one is a test.  It tests if the contents of '''count''' are 0, and if so, it forces the program to break out of the while ''loop''.
** If '''count''' does not contain 0, then the loop is repeated once more.
** If the program breaks out of the loop, it falls automatically on the next statement following the closing curly bracket of the loop, which prints the sentence "''done!''" on the screen.
<br />
* Here is its output:
 10
 9
 8
 7
 6
 5
 4
 3
 2
 1
 done!
<br />
====Example 3====
* This example computes the sum of all the numbers from 1 to 10 (included) and stores the result in a variable called '''sum'''.
<br />
<source lang="java">
void setup() {
  int count = 10;
  int sum = 0;
  while ( true ) {
    sum = sum + count;
    count = count - 1;
    if ( count == 0 ) break;
  }
  println( "sum = " + sum );
}
</source>
<br />
* Hopefully by now you should have enough ''Processing'' language under your belt to figure out how the program computes the sum of all the numbers.
* By the way, here's the output of the program:
<br />
<code><pre>
sum = 55
</pre></code>
<br />
{| style="width:100%; background:#FFD373"
|-
|
===A First Example Using Graphics===
|}
<br />
Assuming that you have Processing installed on your computer, type in the following code in the IDE:
<br />
<source lang="java">
void setup() {
  size( 480, 480 );
  smooth();
}

void draw() {
  ellipse( mouseX, mouseY, 80, 80 );
}
</source>
<br />
* Run your program.
* Notice that the program draws circles (ellipses with equal horizontal and vertical diameters) that overlap each other, following the mouse around.
* When the mouse is outside the window, the graphics window stops moving the circles.  This is because the graphics window is sensitive to the mouse only when the mouse is '''over''' the window.
<br />
<center>
[[Image:ProcessingEllipses1.png|400px]]
</center>
<br />
{| style="width:100%; background:#FFD373"
|-
|
===Some explanations===
|}
* Once again, a Processing program is called a ''sketch'', and we'll use both terms here interchangeably.
* The sketch contains our two functions: '''setup()''' and '''draw()'''.
* '''Setup()''' is executed first and sets the '''size''' of the graphics window to 480 pixels wide by 480 pixels high.  You can change these around to see how the new window is affected.
* '''smooth()''' is called by setup() to make the display of graphics shapes ''smoother'' to the eye.  This will slow down the program a tiny bit, but in most cases, this is something we'll want to do.  You can try removing the ''smooth()'' call and see how it affects your program.
* The '''draw()''' function is listed next.  This function is called 30 times a second.  All it does is draw an '''ellipse''' at the '''x''' and '''y''' location of the mouse on the graphics window.  Top-left is 0, 0.  Bottom right is 479, 479 (since our window size was set to 480, 480).
* And that's it!
<br />
<tanbox>
This is our first example of '''animation''' with Processing.  It's easy to do.  Just make the '''draw()''' function draw something on the graphics window where the mouse is located, and do it 30 times a second.  As the mouse moves, the new shapes are drawn following the mouse.
</tanbox>
<br />
{| style="width:100%; background:#FFD373"
|-
|
===Some Variations to Play with===
|}
<br />
{| style="width:100%; background:silver"
|-
|
====Variation #1====
|}
[[Image:DTQuestionMark2.jpg|right|200px]]
* Locate the '''rect()''' graphics function in Processing's [http://processing.org/reference/rect_.html reference] section.
* Notice how it should be used:
<br /><center>[[Image:ProcessingRectSyntax.png|500px]]</center><br />
* Change the ellipse to a rectangle in the sketch as follows:
<br />
<source lang="java">
void setup() {
  size( 480, 480 );
  smooth();
}

void draw() {
  // ellipse( mouseX, mouseY, 80, 80 );
  rect( mouseX, mouseY, 80, 80 );
}
</source>
<br />
* The two slashes in front of the line with the ''ellipse()'' function transform this line into a ''comment''.  A comment is a line that contains no executable code, only textual information, and it is skipped by the processor when it runs the program.
* The '''rect()''' function is asked to use ''mouseX'' and ''mouseY'' as the coordinates of where to put the rectangle.  The width and height of the rectangle are set to 80 pixels each.
<br />
{| style="width:100%; background:silver"
|-
|
====Variation #2====
|}
[[Image:DTQuestionMark1.jpg|right|200px]]
<br />
* Locate the '''background()''' graphics function in Processing's [http://processing.org/reference/background_.html reference] section.
* Read the description of the '''background()''' function.
* Add a ''call'' to the function ''background()'' in the ''draw()'' function.  Pass it the value 200, which is light grey (you would use 0 for full black, and 255 for full white).
<br />
<source lang="java">
void setup() {
  size( 480, 480 );
  smooth();
}

void draw() {
  background( 200 );
  // ellipse( mouseX, mouseY, 80, 80 );
  rect( mouseX, mouseY, 80, 80 );
}
</source>
<br />
* Try the sketch.
* What is happening?  Why is there only one rectangle on the screen?
* Comment out the rectangle function by putting two slashes in front of it, and uncomment the ellipse function by removing the two slashes that prefix it.  Try the sketch again!
<br />
{| style="width:100%; background:silver"
|-
|
====Variation #3====
|}
[[Image:DTQuestionMark3.jpg|right|200px]]
<br />
* Let's fill the ellipse with a color.  This is accomplished with the '''fill()''' function.
* Find the description for '''fill()''' in the reference section of Processing.org (you know where to go by now!)
* Add a call to fill() in the draw() function:
<br />
<source lang="java">
void setup() {
  size( 480, 480 );
  smooth();
}

void draw() {
  background( 200 );
  fill( 150 );
  ellipse( mouseX, mouseY, 80, 80 );
  // rect( mouseX, mouseY, 80, 80 );
}
</source>
<br />
* Try the sketch!
* If you want to change the color for something more appealing, try this color table: http://web.njit.edu/~kevin/rgb.txt.html
* Pick a color that you like, and copy the three numbers in the third column from the left.  Put these numbers in the '''fill()''' function, separated by commas.  For example, assume you want to use the '''GreenYellow''' color:
<br />
<center>
[[Image:PickColorGreenYellowRGB.png|500px]]</center><br />
:Change the '''fill()''' function call to read:
<br />
::::<tt> fill( 173, 255, 47 );</tt>
<br />
* Try it!
{| style="width:100%; background:#FFD373"
|-
|
===Resources===
|}
<br />
====Best Online Resources====
* [http://processing.org/learning/ Processing.org/learning]: good place to start learning Processing, or reviewing concepts seen in class.
* [http://processing.org/learning/overview/ Processing.org/learning/overview]: Just what the name says; it's an '''overview''' of the language with simple examples.
* [http://processing.org/reference/ Processing.org/reference]: the main '''reference''' to Processing objects and constructs.  The best place to search for new features and find examples of how to use them.
* [http://www.openprocessing.org/collections/ Another great collection] of sketches on [http://www.openprocessing.org/ openProcessing.org]
====Searching by Yourself====
There is a wealth of Web resources on '''Processing'''.  Unfortunately, when you search for ''processing'' on Google (or your favorite search engine), you may get results unrelated to the Processing language, and related instead to the word "processing."  A good way to force Google to return results that are relevant to the language is to search for '''processing.org''', which is the Web site for Processing.
====Good Examples====
* The '''Examples''' option in the '''Processing''' ''File'' menu is a good source of many examples illustrating different concepts:
<br />
<center>
[[Image:ProcessingFileExamples.png|300px]] [[Image:blueArrowRight.png|50px]] [[Image:ProcessingFileExamples2.png|300px]]
</center>
<br />
====Misc. Videos====
{| cellpadding="10"
|- valign="top"
|
<videoflash>z-g-cWDnUdU</videoflash>
|
This is less about Processing than about data visualization, and how '''Ben Fry''', one of the co-authors of Processing, uses the language to represent data.  Several of his projects are presented.
<br />
The language '''Processing''' is presented around Time Marker 17min.
|}
{| cellpadding="10"
|- valign="top"
|
<videoflash type="vimeo">28117873</videoflash>
|
Fry and Reas give a good overview of Processing in the Vimeo movie to the left.  This video was filmed before Processing 2 was released, and presents some interesting projects and libraries written for Processing by some of its users.  Fry (the first speaker) spends more time on the technology, while Reas presents different projects.
|}
<br />
<br />
* [http://wiki.processing.org/w/Video_Tutorials_by_Jose_Sanchez Jose Sanchez]'s list of video tutorials
<br />
{| style="width:100%; background:#FFC340"
|-
|
==References==
|}
<references />
<br />
</onlysmith>
 
<br />
<br />
