09 February 2017

[Solution] Run Django in background - Mac

If you want to keep a Django app running in the background after closing the terminal, run it inside a screen session:

#1. $ screen

#2. Enter the Django startup script:
$ python manage.py runserver 0.0.0.0:8000

#3. Press Ctrl + A, then D to detach


If you want to reattach to the screen session

#1. $ screen -r
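If screen is not available, nohup is an alternative that also detaches the process from the terminal; the log and PID file names below are arbitrary choices, not part of Django:

```shell
# Run the dev server immune to terminal hangups; all output goes to a log file
nohup python manage.py runserver 0.0.0.0:8000 > runserver.log 2>&1 &
echo $! > runserver.pid      # save the PID so you can stop the server later
# Stop it later with: kill "$(cat runserver.pid)"
```

Unlike screen, there is no session to reattach to; you follow the output with `tail -f runserver.log` instead.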

13 October 2016

Install Apache Mahout for Hadoop in macOS without Brew


Here are the steps to install Apache Mahout:
1. Download the latest package from http://mirror.nexcess.net/apache/mahout/
2. Extract the package
3. Create a directory in HDFS and put the input data into it
<hadoop_directory>/bin/hdfs dfs -mkdir /mahout_data
<hadoop_directory>/bin/hdfs dfs -put /home/Hadoop/data/mydata.txt /mahout_data/
4. Convert the input into sequence files with Mahout
<mahout_directory>/bin/mahout seqdirectory \
  -i hdfs://localhost:9000/mahout_data/ \
  -o hdfs://localhost:9000/clustered_data/
5. The output files will be in the clustered_data directory
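For convenience, the extracted package can be put on the PATH so the `bin/mahout` prefix is not needed; the install location and version below are assumptions, so adjust them to the package you actually downloaded:

```shell
# Hypothetical location and version; point MAHOUT_HOME at your extracted package
export MAHOUT_HOME=$HOME/apache-mahout-distribution-0.12.2
export PATH=$PATH:$MAHOUT_HOME/bin
# Step 4 can then be run as simply: mahout seqdirectory -i ... -o ...
```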

05 October 2016

[Solved] Hive installation error: java.net.URISyntaxException: Relative path in absolute URI

Error:
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}

at org.apache.hadoop.fs.Path.initialize(Path.java:205)

Solution:
Edit hive-site.xml and replace the values that use the `${system:...}` placeholders with absolute paths:

<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-${user.name}</value>
</property>

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/${user.name}</value>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/${user.name}_resources</value>
</property>
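After editing hive-site.xml, it can also help to pre-create the scratch directories with restrictive permissions; this is a sketch where `$(whoami)` stands in for the `${user.name}` that Hive resolves at runtime:

```shell
# Pre-create the scratch directories referenced in hive-site.xml above
U=$(whoami)
mkdir -p "/tmp/hive-$U" "/tmp/$U" "/tmp/${U}_resources"
chmod 700 "/tmp/hive-$U"     # scratch dir should be private to the Hive user
```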


03 October 2016

Install Hadoop in Mac


#1. Download the latest Hadoop distribution from
http://mirrors.ibiblio.org/apache/hadoop/common/


#2. Extract the compressed file anywhere

#3. Then change the following files

hadoop_distro/etc/hadoop/hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
hadoop_distro/etc/hadoop/core-site.xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
hadoop_distro/etc/hadoop/yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
hadoop_distro/etc/hadoop/mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
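Note that a fresh Hadoop 2.x distribution ships only mapred-site.xml.template, not mapred-site.xml, so copy the template before editing it. The demo below uses a temporary directory in place of hadoop_distro/etc/hadoop so it is runnable anywhere:

```shell
# Stand-in for hadoop_distro/etc/hadoop; use the real path in practice
HADOOP_CONF=$(mktemp -d)
printf '<configuration>\n</configuration>\n' > "$HADOOP_CONF/mapred-site.xml.template"

# The actual step: copy the template, then add the mapreduce.framework.name property
cp "$HADOOP_CONF/mapred-site.xml.template" "$HADOOP_CONF/mapred-site.xml"
```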

#4. Set JAVA_HOME and HADOOP_HOME
In my case I've installed Hadoop in /var/www/html/hadoop, so add these lines to your shell profile (e.g. ~/.bash_profile):

export HADOOP_HOME=/var/www/html/hadoop/hadoop-2.7.3
export JAVA_HOME=$(/usr/libexec/java_home)
export PATH=$PATH:$HADOOP_HOME/bin


#5. Format HDFS (run the following from $HADOOP_HOME)
$ bin/hdfs namenode -format

#6. Start HDFS
$ sbin/start-dfs.sh

#7. Start YARN
$ sbin/start-yarn.sh

#8. Go to the browser and open the NameNode web UI
http://localhost:50070/
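A quick way to check from the command line that the web UIs are answering: 50070 is the Hadoop 2.x NameNode UI port and 8088 the ResourceManager UI port. This sketch assumes curl is installed:

```shell
# Prints the HTTP status for each UI: 200 means the daemon is up, 000 means no listener
for port in 50070 8088; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost:$port/" || true)
  echo "port $port -> $code"
done
```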