Hadoop/MapReduce Tutorials


--D. Thiebaut 16:01, 18 April 2010 (UTC)


[[Image:AmazonAWS.jpg]]

[[Image:HadoopCartoon.png]]



These tutorials target the Hadoop/MapReduce Cluster in the CS Dept. at Smith College, as well as Amazon's EC2 and S3.

{|
! Tutorial
! Comments
|- style="background:#eeeeff" valign="top"
| Tutorial #1
| Running WordCount written in Java on the Smith College Hadoop/MapReduce Cluster
|- style="background:#ffffff" valign="top"
| Tutorial #1.1
| Creating timelines of the tasks executed during a MapReduce run
|- style="background:#eeeeff" valign="top"
| Tutorial #2
| Running WordCount in Python on the Smith College Hadoop/MapReduce Cluster (a minimal streaming sketch appears below the table)
|- style="background:#ffffff" valign="top"
| Tutorial #2.1
| Running a streaming Python MapReduce program on XML files
|- style="background:#eeeeff" valign="top"
| Tutorial #2.2
| Running C++ programs under Hadoop Pipes
|- style="background:#ffffff" valign="top"
| Tutorial #3
| Running Hadoop jobs on Amazon AWS
|- style="background:#eeeeff" valign="top"
| Tutorial #3.1
| Uploading text to S3 and running Amazon's WordCount Java program on our own data
|- style="background:#ffffff" valign="top"
| Tutorial #3.2
| Compiling our own version of the Java WordCount program and uploading it to AWS
|- style="background:#eeeeff" valign="top"
| [[Hadoop Tutorial 3.3 -- How Much? | Tutorial #3.3]]
| Computing the cost of maintaining a cluster of 6 MapReduce instances on Amazon's AWS (a back-of-the-envelope sketch appears below the table)
|- style="background:#ffffff" valign="top"
| [[Hadoop Tutorial 4: Start an EC2 Instance | Tutorial #4]]
| Starting a server on Amazon's EC2 infrastructure
|}
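
To give a flavor of what the Python streaming tutorials (#2 and #2.1) build, here is a minimal WordCount mapper/reducer pair. This is an illustrative sketch, not the tutorials' actual code; the file names mapper.py and reducer.py are assumptions.

<source lang="python">
#!/usr/bin/env python
# mapper.py -- illustrative streaming mapper: emit "word <tab> 1" for every word on stdin.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print("%s\t%d" % (word, 1))
</source>

<source lang="python">
#!/usr/bin/env python
# reducer.py -- illustrative streaming reducer: sum the counts for each word.
# Hadoop sorts the mapper output by key, so identical words arrive consecutively.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print("%s\t%d" % (current_word, current_count))
        current_word, current_count = word, int(count)
if current_word is not None:
    print("%s\t%d" % (current_word, current_count))
</source>

Both scripts are launched through Hadoop's streaming jar (hadoop jar .../hadoop-streaming.jar -file mapper.py -mapper mapper.py -file reducer.py -reducer reducer.py -input <dir> -output <dir>); the exact location of the jar depends on the Hadoop version installed on the cluster.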
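
Tutorial #3.3 works out the dollar cost of keeping such a cluster running. The back-of-the-envelope sketch below shows the shape of that arithmetic; the $0.085 hourly rate (a small EC2 instance at circa-2010 pricing) is an assumption for illustration, and a real bill also includes the Elastic MapReduce surcharge, S3 storage, and data transfer.

<source lang="python">
# cost_estimate.py -- illustrative cost of a 6-instance cluster (assumed rate, see above).

NUM_INSTANCES = 6
HOURLY_RATE = 0.085     # dollars per instance-hour -- an assumption; check current AWS pricing

def cluster_cost(hours):
    """Dollar cost of running all NUM_INSTANCES for the given number of hours."""
    return NUM_INSTANCES * HOURLY_RATE * hours

for label, hours in [("one hour", 1), ("one day", 24), ("30 days", 24 * 30)]:
    print("%-10s $%8.2f" % (label, cluster_cost(hours)))
</source>

Under these assumptions the cluster costs $0.51 per hour, $12.24 per day, and about $367 for a 30-day month.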