Sometimes you may need to access a Hadoop runtime from a machine where no Hadoop services are running. In this article you will set up password-less SSH access from your local machine to a Hadoop machine; once that is ready, you can use the Hadoop API to access the Hadoop cluster, or run Hadoop commands directly from the local machine by passing the proper Hadoop configuration.
Starting Hortonworks HDP 1.3 and/or 2.1 VM
You can use these instructions on any VM running Hadoop, or you can download the HDP 1.3 or 2.1 sandbox images from the Hortonworks website.
Now start your VM and make sure your Hadoop cluster is up and running. Once the VM is up, its IP address and hostname are shown on the VM console; the address is typically in the 192.168.21.x range, as shown below:
Accessing Hortonworks HDP 1.3 and/or 2.1 from a browser:
Using the IP address provided, you can check the Hadoop server status on port 8000 as below:
HDP 1.3 – http://192.168.21.187:8000/about/
HDP 2.1 – http://192.168.21.186:8000/about/
The UI for both HDP 1.3 and HDP 2.1 looks as below:
Now, from your host machine, you can also try to SSH into either VM using the username root and the password hadoop, as below:
The authenticity of host '192.168.21.187 (192.168.21.187)' can't be established.
RSA key fingerprint is b2:c0:9a:4b:10:b4:0f:c0:a0:da:7c:47:60:84:f5:dc.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '192.168.21.187' (RSA) to the list of known hosts.
root@192.168.21.187's password: hadoop
Last login: Thu Jun 5 03:55:17 2014
Now we will add password-less SSH access to these VMs; there are two options:
Option 1: You already have an SSH key created earlier and want to reuse it here:
In this option, we will first make sure we have an RSA key for SSH sessions on the local machine, and then use it for password-less SSH access:
- In your home folder (/Users/<yourname>), go to the folder named .ssh
- Identify the file id_rsa.pub (e.g. /Users/avkashchauhan/.ssh/id_rsa.pub); it contains a long key string
- Also identify the file authorized_keys there (e.g. /Users/avkashchauhan/.ssh/authorized_keys); it contains one or more long key strings
- Check the content of id_rsa.pub and make sure the same key is also present in authorized_keys, along with any other keys
- Copy the key string from the id_rsa.pub file
- SSH to your HDP machine using the username and password, as in the previous step
- Go to the /root/.ssh folder
- You will find an authorized_keys file there; open it in an editor and append the key you copied above
- Save the authorized_keys file
- In the same VM you will also find an id_rsa.pub file; copy its contents
- Exit the HDP VM
- On your host machine, append the key from the HDP VM to the authorized_keys file you checked earlier and save it
- Now try logging into the HDP VM as below:
Last login: Thu Jun 5 06:35:31 2014 from 192.168.21.1
Note: You will see that a password is not needed this time, since password-less SSH is working.
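The two-way exchange in Option 1 can be sketched without touching a VM; in the snippet below, two throwaway directories stand in for the local machine's ~/.ssh and the VM's /root/.ssh, and the key comments (local-key, vm-key) are made up purely for this demonstration:

```shell
# Simulate the Option 1 exchange: each side's id_rsa.pub is appended
# to the other side's authorized_keys file.
local_dir=$(mktemp -d); vm_dir=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -C 'local-key' -f "$local_dir/id_rsa"
ssh-keygen -q -t rsa -N '' -C 'vm-key' -f "$vm_dir/id_rsa"
# Step done on the VM: append the host's public key
cat "$local_dir/id_rsa.pub" >> "$vm_dir/authorized_keys"
# Step done on the host: append the VM's public key
cat "$vm_dir/id_rsa.pub" >> "$local_dir/authorized_keys"
grep -q 'local-key' "$vm_dir/authorized_keys" &&
grep -q 'vm-key' "$local_dir/authorized_keys" &&
echo "keys exchanged"
rm -rf "$local_dir" "$vm_dir"
```

Against the real VM, the same two appends are what make SSH in each direction skip the password prompt.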
Option 2: You haven't created an SSH key on your local machine and will do everything from scratch:
In this option we will first create an SSH key and then use it exactly as in Option #1.
- Log into your host machine and open a terminal
- Your home folder will be, for example, /Users/<username>
- Create a folder named .ssh inside your home folder, if it does not already exist
- Go inside the .ssh folder and run the following command:
$ ssh-keygen -C 'SSH Access Key' -t rsa
Enter file in which to save the key (/home/avkashchauhan/.ssh/id_rsa): ENTER
Enter passphrase (empty for no passphrase): ENTER
Enter same passphrase again: ENTER
- You will see that the id_rsa and id_rsa.pub files have been created. Now we will append the contents of id_rsa.pub to the authorized_keys file; if that file is not there, the same command will create it. In both cases the command is as below:
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
- After the above step, the contents of id_rsa.pub are included in authorized_keys.
- Now we will set proper permissions on the keys and folders as below:
$ chmod 700 $HOME && chmod 700 ~/.ssh && chmod 600 ~/.ssh/*
- Finally, follow Option #1 to add both id_rsa.pub keys to both machines' authorized_keys files and get password-less SSH working.
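The keygen, append, and chmod steps of Option 2 fit in a short one-shot script. The sketch below runs against a scratch directory rather than your real ~/.ssh, so it is safe to try; substitute ~/.ssh in practice:

```shell
# One-shot version of Option 2 against a scratch directory:
# generate the key pair, seed authorized_keys, and lock down permissions.
sshdir=$(mktemp -d)
ssh-keygen -q -C 'SSH Access Key' -t rsa -N '' -f "$sshdir/id_rsa"
cat "$sshdir/id_rsa.pub" >> "$sshdir/authorized_keys"
chmod 700 "$sshdir" && chmod 600 "$sshdir"/*
# sshd refuses keys whose files are group/world readable, so verify 600
# (first stat form is GNU/Linux, the fallback is BSD/OS X):
stat -c '%a' "$sshdir/authorized_keys" 2>/dev/null || stat -f '%Lp' "$sshdir/authorized_keys"
rm -rf "$sshdir"
```

The permission check matters because a too-permissive authorized_keys file on the VM will silently make sshd fall back to password prompts.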
Migrating Hadoop configuration from Remote Machine to local Machine:
To get this working, you just need to copy the Hadoop configuration files from the HDP server to your local machine, as below.
For HDP 1.3, create a folder named hdp13 in your working folder, then use the scp command over password-less SSH to copy the configuration files:
$ scp -r root@192.168.21.187:/etc/hadoop/conf.empty/ ~/hdp13
For HDP 2.1, create a folder named hdp21 in your working folder, then use the scp command over password-less SSH to copy the configuration files:
$ scp -r root@192.168.21.186:/etc/hadoop/conf/ ~/hdp21
Adding correct JAVA_HOME to imported Hadoop configuration hadoop-env.sh
Now go to your hdp13 or hdp21 folder and edit the hadoop-env.sh file with the correct JAVA_HOME, as below:
# The java implementation to use. Required.
# export JAVA_HOME=/usr/jdk/jdk1.6.0_31
export JAVA_HOME=`/usr/libexec/java_home -v 1.7`
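If you prefer to script this edit rather than open an editor, sed can swap the JAVA_HOME line in place. The snippet below demonstrates it on a stand-in copy of hadoop-env.sh under /tmp (that path is only for illustration; point sed at ~/hdp13/conf.empty/hadoop-env.sh or ~/hdp21/conf/hadoop-env.sh in practice):

```shell
# Create a stand-in hadoop-env.sh with the old JAVA_HOME line:
cat > /tmp/hadoop-env.sh <<'EOF'
# The java implementation to use. Required.
export JAVA_HOME=/usr/jdk/jdk1.6.0_31
EOF
# Replace the JAVA_HOME line; on OS X, /usr/libexec/java_home -v 1.7
# resolves the JDK 7 home when the script is sourced.
# (-i.bak works with both GNU and BSD sed.)
sed -i.bak 's|^export JAVA_HOME=.*|export JAVA_HOME=`/usr/libexec/java_home -v 1.7`|' /tmp/hadoop-env.sh
grep '^export JAVA_HOME' /tmp/hadoop-env.sh
rm -f /tmp/hadoop-env.sh /tmp/hadoop-env.sh.bak
```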
Adding the correct HDP hostnames to your local machine's hosts entries:
Now you need to add the Hortonworks HDP hostnames to your local machine's hosts file. On Mac OS X, edit the /private/etc/hosts file and add the following:
192.168.21.187 sandbox
192.168.21.186 sandbox.hortonworks.com
Once added, make sure you can ping the hosts by name as below:
$ ping sandbox
PING sandbox (192.168.21.187): 56 data bytes
64 bytes from 192.168.21.187: icmp_seq=0 ttl=64 time=0.461 ms
And for HDP 2.1
$ ping sandbox.hortonworks.com
PING sandbox.hortonworks.com (192.168.21.186): 56 data bytes
64 bytes from 192.168.21.186: icmp_seq=0 ttl=64 time=0.420 ms
Accessing the Hadoop Runtime on the Remote Machine with Hadoop commands (or the API) from the Local Machine:
Now, using the local machine's Hadoop runtime, you can connect to Hadoop on the HDP VM as below:
$ ./hadoop --config /Users/avkashchauhan/hdp13/conf.empty fs -ls /
Found 4 items
drwxr-xr-x   - hdfs   hdfs  0 2013-05-30 10:34 /apps
drwx------   - mapred hdfs  0 2014-06-05 03:54 /mapred
drwxrwxrwx   - hdfs   hdfs  0 2014-06-05 06:19 /tmp
drwxr-xr-x   - hdfs   hdfs  0 2013-06-10 14:39 /user
$ ./hadoop --config /Users/avkashchauhan/hdp21/conf fs -ls /
Found 6 items
drwxrwxrwx   - yarn   hadoop 0 2014-04-21 07:21 /app-logs
drwxr-xr-x   - hdfs   hdfs   0 2014-04-21 07:23 /apps
drwxr-xr-x   - mapred hdfs   0 2014-04-21 07:16 /mapred
drwxr-xr-x   - hdfs   hdfs   0 2014-04-21 07:16 /mr-history
drwxrwxrwx   - hdfs   hdfs   0 2014-05-23 11:35 /tmp
drwxr-xr-x   - hdfs   hdfs   0 2014-05-23 11:35 /user
If you are using the Hadoop API, you can pass the configuration directory path to the API and get access to the Hadoop runtime.
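An alternative to passing --config on every invocation is the HADOOP_CONF_DIR environment variable, which the hadoop launcher script reads. The sketch below uses the HDP 1.3 path from this article; the hadoop invocation itself is shown as a comment since it needs the cluster to be reachable:

```shell
# Point HADOOP_CONF_DIR at the imported configuration once, and plain
# hadoop commands pick it up without --config:
export HADOOP_CONF_DIR=/Users/avkashchauhan/hdp13/conf.empty
echo "$HADOOP_CONF_DIR"
# hadoop fs -ls /    # equivalent to: hadoop --config "$HADOOP_CONF_DIR" fs -ls /
```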