CSC103: DT's Notes 1
 
in silicon, it doesn't mean that it is the substrate of choice.  Researchers have shown<ref name="Adleman">Adleman, L. M., "Molecular computation of solutions to combinatorial problems". Science 266 (5187): 1021–1024, 1994.</ref> that complex computation can also be done using DNA, in vials.  Think about it: no electricity there; just many vials holding solutions of DNA molecules, a huge number of them, that are induced to code all possible combinations of a particular sequence, such that one of the combinations is the solution to the problem at hand.  DNA computing is a form of ''parallel computation'' where many different solutions are computed at the same time, in ''parallel'', using DNA molecules<ref name="Lewin">Lewin, D. I., "DNA computing". Computing in Science & Engineering 4 (3): 5–8, 2002.</ref>.  One last example of computation that is not performed in silicon by traveling electrons can be found in '''optical computing'''.  The idea behind this concept (we do not really have optical computers yet, just isolated experiments showing their potential) is that electrons are replaced with photons, which are faster than electrons but much harder to control.
  
So, in summary, we begin to see that the medium in which computation takes place can vary, and does not have to be silicon.  Indeed, there exist many examples of computational devices that do not use electronics in silicon and can perform quite complex computation.  In consequence, we should also be ready to imagine that new computers in ten, twenty or thirty years will not use semiconductors made of silicon, and may not use electrons to carry information controlled by transistors.  In fact, it is highly probable that they won't [http://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzweil].
  
===Boolean Algebra===
 
  
 
While the technology used in creating today's computers is the result of an evolution and of choices driven by economic factors and scientific discoveries, among others, one thing we can be sure of is that whatever computing machine we devise and use to perform calculations, that machine will have to obey the rules of mathematics.  It does not matter what technology we use to compute 2 + 2.  The computer must follow strict rules and implement basic mathematical rules in the way it treats information.
 
In our present case, the major influence on the way our computers are built is the fact that we are using electricity as the source of power, and that we're using fast-moving electrons to represent, or ''code'', information.  Electrons are cheap.  They are also very fast, moving at approximately 3/4 the speed of light in wires<ref name="speedElectrons">Main, P., "When electrons go with the flow: Remove the obstacles that create electrical resistance, and you get ballistic electrons and a quantum surprise". New Scientist 1887: 30, 1993.</ref>.  We know how to generate them cheaply (power source), how to control them easily (with switches), and how to transfer them (over electrical wires).  These properties were the reason for the development of the first vacuum tube computer by Atanasoff in 1939<ref name="atanasoff1939">Ralston, Anthony; Meek, Christopher, eds., Encyclopedia of Computer Science (second ed.), pp. 488–489, 1976.</ref>.
  
The choice of using electricity has greatly influenced a fundamental way in which modern computers work: they all use the binary system at the lowest level.  Because electricity can be turned ON or OFF with a switch, it was only logical that these two states would be used to represent information.  ON and OFF.  0 and 1.  True and False.  But if we can represent only two different states, two different levels of information, can we represent numbers other than 0 or 1?  Say 257?  Can we also organize electrical circuitry that can perform the addition of two numbers?  The answer is yes, using the binary numbering system.
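As a quick preview of where we are headed, here is a minimal sketch (in Python, used purely for illustration) showing that a number such as 257 really can be written using only the two symbols 0 and 1, and that addition works just as well in that notation.  Python's built-in <code>bin</code> function does the conversion for us:

```python
# Writing 257 using only 0s and 1s: Python's bin() shows the binary form.
n = 257
print(bin(n))             # 0b100000001 : nine binary digits code 257

# Addition is the same operation regardless of the notation we print it in.
a, b = 5, 3
print(bin(a), bin(b))     # 0b101 0b11
print(a + b, bin(a + b))  # 8 0b1000
```

The <code>0b</code> prefix is simply Python's way of marking a number as written in binary; the digits after it are the 0s and 1s that a circuit could store as OFF and ON.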
  
====Binary System====

This section is an overview of the binary system.  Better sources of information can be found on this subject, including [http://courses.cs.vt.edu/csonline/NumberSystems/Lessons/index.html this one] from Virginia Tech.
To better understand the binary system, we have to refresh our memory about the way our decimal system works.
First, let's count in decimal:

 001
 002
 003
 004
 005
 006
 007
 008
 009
 ----------
 010
 011
 ...
 019
 ----------
 020
 021
 022
 ...
 029
 ----------
 030
 ...
 098
 099
 ----------
 100
 101
 ...

The first thing we have done is to pad all the numbers with leading zeros so that they all have 3 digits.  Then we start at 0 and keep counting.  What we do implicitly is go through a table of 10 digits we know by heart, 0 through 9, and at every new line we change the rightmost digit to the next digit in the table.  When that last digit reaches 9, we have to ''roll over'' and start back at 0 in our table of 10 digits.  We illustrated this in the list above by a line of dashes.  But what else do we do when we ''roll over'' the rightmost digit?  We also change the digit to its left, making it the next digit in that table of 10 digits we know by heart.  So from 009 we pass to 010.

Then after a while we reach 019.  The rightmost digit has to ''roll over'', which makes it become 0, and the digit to its left must increment by 1.  It becomes 2, and we get 020.

At some point, applying this rule, we reach 099.  Let's just apply our simple rule: the rightmost digit must roll over, so it becomes 0, and the digit to its left increments by 1.  Because this second digit is also 9, it, too, must roll over and become 0.  And because the 2nd digit rolls over, the third digit, the one to the left of the 2nd digit, increments by 1.  We get 100.  Does that make sense?  We are so good at doing this by heart, without thinking of the process we use, that sometimes it is confusing to deconstruct such knowledge.
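The roll-over rule we just deconstructed can be sketched as a short program.  The Python function below (a minimal sketch; the name <code>increment</code> is our own) holds a number as a list of digits and adds 1 to it.  Nothing in the rule is specific to decimal: the base is a parameter, so the very same procedure that turns 099 into 100 in base 10 also counts in base 2.

```python
# A sketch of the roll-over rule: add 1 to a number stored as a list of
# digits (most significant digit first), in any base.
def increment(digits, base=10):
    i = len(digits) - 1            # start at the rightmost digit
    while i >= 0:
        if digits[i] < base - 1:   # no roll-over needed: just bump this digit
            digits[i] += 1
            return digits
        digits[i] = 0              # the digit rolls over to 0...
        i -= 1                     # ...and the digit to its left increments
    return [1] + digits            # every digit rolled over: grow the number

print(increment([0, 0, 9]))            # [0, 1, 0] : 009 -> 010
print(increment([0, 9, 9]))            # [1, 0, 0] : 099 -> 100
print(increment([0, 1, 1], base=2))    # [1, 0, 0] : 011 -> 100 in binary
```

Notice that the binary call uses exactly the same rule: in base 2 the "table of digits we know by heart" contains only 0 and 1, so digits roll over much more often.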
 
The answers to these questions, and the ensuing solutions, are provided by two giants of computer science, '''George Boole''' and '''Claude Shannon''', who worked at very different times.
  
====Boole and the Boolean Algebra====
 
Boole's contribution came first in the history of computers.  In the 1840s, he conceived of a mathematical world where there would be only two symbols, two values, and three operations that could be performed on them, and he proved that such a system could obey the same rules as an algebra.  The two values of Boole's algebra are ''True'' and ''False'', or ''T'' and ''F'', and the operators are ''and'', ''or'' and ''not''.  Boole was in fact interested in logic: in expressing combinations of statements that can each be either true or false, and in figuring out whether mathematics could help formulate a logical system in which any expression built from such simple statements could be written down and evaluated.
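Boole's two values and three operators survive, essentially unchanged, in every modern programming language.  As a small illustrative sketch, Python spells them <code>True</code>, <code>False</code>, <code>and</code>, <code>or</code> and <code>not</code>, and we can print the complete truth table for two statements in a few lines:

```python
# The complete truth table for Boole's three operators, for two
# statements a and b that can each be True or False.
print("a      b      a and b  a or b  not a")
for a in (False, True):
    for b in (False, True):
        print(a, b, a and b, a or b, not a)
```

Because each statement has only two possible values, two statements give only four combinations, which is why the whole table fits in four rows.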
  


--© D. Thiebaut 08:10, 30 January 2012 (EST)

