Step 1: Download the Prebuilt IPFS Package

Visit the IPFS installation page and download the prebuilt ipfs binaries for your operating system.

Why does the installation page talk about “Go IPFS”? There are multiple implementations of the IPFS protocol. The core IPFS team maintains implementations in Go and JavaScript, commonly referred to as go-ipfs and js-ipfs. The official binaries are built from the Go implementation.

Step 2: Unzip the Prebuilt Package

The binaries for macOS and Linux are distributed as gzipped tar archives (.tar.gz). The binaries for Windows are in a zip file. Use the appropriate tool to extract the archive. There are some hints on the installation page under the heading “Installing from a Prebuilt Package”.
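For example, on macOS or Linux you might extract the archive like this (the filename shown is illustrative; yours will depend on the version and platform you downloaded):

```shell
# Extract the downloaded archive; the exact filename varies by release.
tar -xvzf go-ipfs_v0.4.18_linux-amd64.tar.gz

# The archive unpacks into a directory called go-ipfs.
ls go-ipfs
```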

This will create a directory called go-ipfs.

The file named ipfs is your executable ipfs binary.

Step 3: Install the IPFS Binary on Your Executable Path

To install the binary, all you need to do is put the ipfs binary file somewhere on your executable PATH.

Note about permissions: Whichever approach you use to install the binary, make sure you have the necessary permissions. On macOS or Linux, you will probably need to use sudo, which is already installed on most systems.

If you’re on macOS or Linux, you can use the provided install script.
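The go-ipfs distribution ships with an install.sh script that copies the binary onto your PATH. A typical invocation looks like this (the install destination, commonly /usr/local/bin, may differ on your system):

```shell
cd go-ipfs

# Run the bundled install script with elevated permissions so it can
# copy the ipfs binary into a directory on your PATH.
sudo ./install.sh
```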

Read the output from running this. If it complains about being unable to write the file, you need to deal with permissions (see the note about permissions above).

Step 4: Display the IPFS version

When you’re troubleshooting, it’s important to know which version of ipfs you’re using. To find out the current version, run:
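The version subcommand is the simplest check that the binary is installed and on your PATH (the version string printed will depend on the release you installed):

```shell
# Print the version of the ipfs binary found on your PATH.
ipfs version
```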

Step 5: Initialize the Repository

Use the ipfs init command to initialize the repository. This will generate a local ipfs repository for the current user account on your machine. It also generates a cryptographic keypair that allows your ipfs node to cryptographically sign the content and messages that you create.
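As a sketch of this step (by default the repository is created under ~/.ipfs in your home directory):

```shell
# Create the local IPFS repository and keypair for the current user.
# The repository lives in ~/.ipfs by default.
ipfs init
```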

Note: If you have already initialized ipfs on your machine, you will get an error message like:

This is ok. It means you’ve already done this step. You can safely proceed to Step 6.

Step 6: Use IPFS to explore the post-install documentation

If you installed a different version of ipfs, you may have gotten a slightly different path to use here. Either path will work for this tutorial. The path you got from the ipfs init command will give you documentation that’s accurate for the version of ipfs you’re using.

When you ran ipfs init, it printed a hint for how to get started (refer to Step 5).


This ipfs cat command tells ipfs to read the content matching the path you provided. If the content isn’t available locally, ipfs will attempt to find it on the peer-to-peer network.

Run the ipfs cat command with the path you got from the init message:
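The path is the one printed by your own ipfs init run, and it ends in /readme. As a sketch, with &lt;hash-from-init&gt; standing in as a hypothetical placeholder for the hash your init output showed:

```shell
# Fetch and print the readme document that ipfs init pointed you at.
# Replace <hash-from-init> with the hash from your own init output.
ipfs cat /ipfs/<hash-from-init>/readme
```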

You should see the contents of the IPFS readme document.

How To Install and Configure Redis on Ubuntu 16.04


Redis is an in-memory key-value store known for its flexibility, performance, and wide language support. In this guide, we will demonstrate how to install and configure Redis on an Ubuntu 16.04 server.


To complete this guide, you will need access to an Ubuntu 16.04 server. You will need a non-root user with sudo privileges to perform the administrative functions required for this process. You can learn how to set up an account with these privileges by following our Ubuntu 16.04 initial server setup guide.

When you are ready to begin, log in to your Ubuntu 16.04 server with your sudo user and continue below.
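The full guide covers installation and configuration in detail; as a quick sketch, one common way to get Redis onto Ubuntu 16.04 is from the distribution's own package (this uses apt rather than building from source, which the full guide may prefer):

```shell
# Refresh the package index, then install the Redis server package.
sudo apt-get update
sudo apt-get install redis-server

# Sanity check: ping the server. A running server replies PONG.
redis-cli ping
```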


Read more about how to install and configure Redis on Ubuntu.

Install MongoDB on Ubuntu

Step-by-step instructions to install MongoDB on Ubuntu, based on:

Install MongoDB

Configure Package Management System (APT)

The Ubuntu package management tools (dpkg and apt) ensure package consistency and authenticity by requiring that distributors sign packages with GPG keys. Issue the following command to import the MongoDB public GPG key:

sudo apt-key adv --keyserver hkp:// --recv 7F0CEB10


Create a /etc/apt/sources.list.d/mongodb.list file using the following command.

echo 'deb dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list


Now issue the following command to reload your repository:

sudo apt-get update

Install Packages

Issue the following command to install the latest stable version of MongoDB:

sudo apt-get install mongodb-10gen


When this command completes, you have successfully installed MongoDB! Continue for configuration and start-up suggestions.
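As a quick check that the installation worked, you can start the service and print the server version. (The service name here follows the older mongodb-10gen packaging used above; newer mongodb-org packages register the service as mongod instead.)

```shell
# Start the MongoDB service (older 10gen packages register it as "mongodb").
sudo service mongodb start

# Print the installed server version as a sanity check.
mongod --version
```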



Running Hadoop on Ubuntu Linux (Multi-Node Cluster)

Tutorial approach and structure

From two single-node clusters to a multi-node cluster – We will build a multi-node cluster using two Ubuntu boxes in this tutorial. In my humble opinion, the best way to do this for starters is to install, configure and test a “local” Hadoop setup for each of the two Ubuntu boxes, and in a second step to “merge” these two single-node clusters into one multi-node cluster in which one Ubuntu box will become the designated master (but also act as a slave with regard to data storage and processing), and the other box will become only a slave. It’s much easier to track down any problems you might encounter due to the reduced complexity of doing a single-node cluster setup first on each machine.

Figure 2: Tutorial approach and structure

Let’s get started!


Configuring single-node clusters first

The tutorial approach outlined above means that you should now read my previous tutorial on how to set up a Hadoop single-node cluster and follow the steps described there to build a single-node Hadoop cluster on each of the two Ubuntu boxes. It is recommended that you use the same settings (e.g., installation locations and paths) on both machines; otherwise you might run into problems later when we migrate the two machines to the final multi-node cluster setup.

Just keep in mind when setting up the single-node clusters that we will later connect and “merge” the two machines, so pick reasonable network settings etc. now for a smooth transition later.
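For example, giving both boxes stable hostnames now makes the later merge smoother. A hypothetical /etc/hosts entry on each machine might look like this (the IP addresses and hostnames are placeholders for your own network):

```
# /etc/hosts on both Ubuntu boxes (example addresses only)
192.168.0.1    master
192.168.0.2    slave
```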

Read More: Running Hadoop on Ubuntu Linux (Multi-Node Cluster)

Ubuntu Enterprise Cloud