How to install Hadoop 1.x Version?

To install Hadoop 1.x version:
Step1: Update the package lists first
sudo apt-get update
Step2: Install Java (JDK)
sudo apt-get install openjdk-8-jdk
Note: Hadoop 1.x also runs on Java 6, but the JAVA_HOME paths configured later in this guide point to OpenJDK 8, so install the matching JDK.
Step3: Install SSH (Secure Shell)
sudo apt-get install ssh
Step4: Install eclipse
sudo apt-get install eclipse
Step5: Install MySQL
sudo apt-get install mysql-server mysql-client
Step6: Check the hosts and hostname before installing hadoop
sudo nano /etc/hosts
127.0.0.1       localhost
127.0.1.1       saghir-Inspiron-3420   <----- This is our hostname (machine name)
sudo nano /etc/hostname
saghir-Inspiron-3420
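Before going further, you can confirm that the hostname resolves locally (a quick sanity check; getent reads the /etc/hosts entry shown above):

```shell
hostname                      # should print saghir-Inspiron-3420 (your machine name)
getent hosts "$(hostname)"    # should show the 127.0.1.1 line from /etc/hosts
```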
Step7: Download Hadoop (here, the hadoop-1.1.2 version) from the Apache website and save it in any folder (here, a folder named work in the home directory)

Downloaded: hadoop-1.1.2.tar.gz
http://archive.apache.org/dist/hadoop/core/hadoop-1.1.2/

Step8: Unzip or extract hadoop-1.1.2.tar.gz  -----to----------> hadoop-1.1.2
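Assuming the tarball was saved in ~/work as in Step 7, the extraction can be done from the shell:

```shell
cd ~/work
tar -xzf hadoop-1.1.2.tar.gz    # creates the hadoop-1.1.2 directory
ls hadoop-1.1.2/conf            # the config files edited in Step 9 live here
```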

Step9: Open hadoop-1.1.2
-You will see many important files and folders
-Open the conf folder ----You will find many important files (.xml, .sh)
-Open & check the following files:
1- core-site.xml
2- hdfs-site.xml
3- mapred-site.xml
4- hadoop-env.sh
5- masters   - It holds the information about the Secondary NameNode (SNN)
6- slaves    - It holds the hosts that run the DataNode and TaskTracker daemons
- Configure the above files
- For Hadoop in distributed mode (multi-node cluster), the masters and slaves files list the hostnames of the cluster machines, one per line.
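For a single-node setup both files usually contain just localhost. In a multi-node cluster they might look like this (the hostnames here are illustrative):

```
# conf/masters -- host that runs the SecondaryNameNode
master-node

# conf/slaves  -- one host per line, each running a DataNode and TaskTracker
slave-node-1
slave-node-2
```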
********************** 1. core-site.xml********************

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop1/work/hadoopdata/tmp</value>
</property>
</configuration>

Note: => 9000 --> It is the RPC (Remote Procedure Call) port for the NameNode.
  --> May change in some cases; this is the conventional value.
=> hadoop.tmp.dir is the base local directory for Hadoop's working data; by default, whatever the NameNode and DataNode store ends up under this tmp folder.


****************************2. hdfs-site.xml **********************

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
 
<property>
<name>dfs.name.dir</name>
<value>/home/hadoop1/work/hadoopdata/dfs/name</value>
</property>
 
    <property>
      <name>dfs.data.dir</name>
      <value>/home/hadoop1/work/hadoopdata/dfs/data</value>
    </property>

</configuration>
*****************************3. mapred-site.xml******************************

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
 
<property>
<name>mapred.local.dir</name>
<value>/home/hadoop1/work/hadoopdata/mapred/local</value>
</property>
 
    <property>
      <name>mapred.system.dir</name>
      <value>/mapred/system</value>
    </property>
</configuration>
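The local paths referenced in the three config files above must exist and be writable before the NameNode is formatted. A minimal sketch, assuming you are logged in as the hadoop1 user (so $HOME is /home/hadoop1):

```shell
# Create the local directories used by core-site.xml, hdfs-site.xml
# and mapred-site.xml (adjust BASE if your layout differs).
BASE="$HOME/work/hadoopdata"
mkdir -p "$BASE/tmp" "$BASE/dfs/name" "$BASE/dfs/data" "$BASE/mapred/local"
ls "$BASE"
```

Note that mapred.system.dir (/mapred/system) is a path inside HDFS, not on the local disk, so it is not created here.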

---------------
Note: => 9001 --> It is the RPC (Remote Procedure Call) port for the JobTracker.


*************************** 4. hadoop-env.sh **********************

- Contains lots of lines
- Check the following lines:

# The java implementation to use.  Required.
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=

----------->
 Replace the above by:

# The java implementation to use.  Required.
 export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=
------------>

Note: =>  HADOOP_CLASSPATH is optional. JAVA_HOME is required.
=> So, uncomment and modify the JAVA_HOME path as above.
=> HADOOP_CLASSPATH and JAVA_HOME are environment variables.
******************************************************************
Step10: Open the bash profile:
gedit ~/.bashrc        (no sudo needed; the file is in your own home directory)
-Update the file with a few environment variables as shown below
-Go to the end of the file and, after the final fi, add the following lines:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME=/home/hadoop1/work/hadoop-1.1.2
export PATH=$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
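After saving, reload the profile and confirm the variables are visible (a quick check; the loop just prints each variable or reports it missing):

```shell
source ~/.bashrc 2>/dev/null || true
for v in JAVA_HOME HADOOP_HOME; do
  eval "val=\$$v"
  if [ -n "$val" ]; then echo "$v=$val"; else echo "$v is not set"; fi
done
```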
Step11: Set up passwordless SSH (the Hadoop start/stop scripts log in to each node via ssh)
ssh localhost


ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
OR,
ssh-keygen -t rsa -P ""
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
After this, "ssh localhost" should log in without prompting for a password.

Step12: Format the NameNode (do this only once, before the first start; reformatting later erases all HDFS metadata)
hadoop namenode -format
Step13: To start hadoop for the first time
start-all.sh
To check:
jps
5763 JobTracker
5527 DataNode
5372 NameNode
6012 Jps
5679 SecondaryNameNode
5935 TaskTracker
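(The process IDs on the left will differ on your machine.) Besides jps, the Hadoop 1.x daemons expose web UIs: the NameNode on port 50070 and the JobTracker on port 50030. A rough reachability check once the daemons are up:

```shell
curl -s -o /dev/null http://localhost:50070 && echo "NameNode UI reachable"   || echo "NameNode UI not reachable"
curl -s -o /dev/null http://localhost:50030 && echo "JobTracker UI reachable" || echo "JobTracker UI not reachable"
```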

To stop hadoop:
stop-all.sh