In this tutorial, we will discuss how to import a CSV file into an HBase table using the importtsv command.
Import CSV File into HBase using importtsv
You can also bulk-load data into an HBase table using this method. In data analytics, we often receive a requirement to load a CSV file into an HBase table, and in such scenarios this tutorial on importing CSV data into HBase will be very helpful.
Load CSV data to HBase Table
Here are the simple steps to create a table and load a CSV file into that table.
• Log in to the server where HBase is running
• Start the HBase shell
• Create an HBase table, or follow the HBase create table command
• Verify the table using either the list or scan command
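The steps above can be sketched as follows. The table name `employee` and column family `cf` are assumptions chosen for illustration; use your own names:

```shell
# Start the HBase shell from the command line
hbase shell

# Inside the shell: create a table with one column family
# (table and column family names are placeholders)
create 'employee', 'cf'

# Verify the table exists
list
scan 'employee'
```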
Now you will have to load the data, and for that you need to exit the HBase shell.
Load CSV Data
Let’s say you have a CSV file named sample.csv on your local system and you want to copy it to a cluster node before loading it into HDFS. For this, use the scp command-
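A typical scp invocation might look like this; the username, hostname, and paths are placeholders:

```shell
# Copy sample.csv from your local machine to a node in the cluster
# (user, host, and destination path are illustrative)
scp sample.csv user@cluster-node:/home/user/sample.csv
```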
Now put it in HDFS using the below command-
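One way to do this is with `hdfs dfs -put`; the HDFS directory `/user/hbase` is an assumed destination for this example:

```shell
# Create a target directory in HDFS (if it does not already exist)
hdfs dfs -mkdir -p /user/hbase

# Copy the file from the local filesystem into HDFS
hdfs dfs -put /home/user/sample.csv /user/hbase/
```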
Now again move to HBase shell and follow the below command to load CSV data into HBase table-
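A typical ImportTsv invocation might look like the following (note: ImportTsv is run from the command line, not from inside the HBase shell). The table name `employee`, column family `cf`, column qualifiers, and HDFS path are assumptions for illustration; `-Dimporttsv.separator=','` tells the tool to treat the file as comma-separated rather than tab-separated:

```shell
# Map each CSV column to the row key or to a cf:qualifier,
# in the order the columns appear in the file
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.separator=',' \
  -Dimporttsv.columns='HBASE_ROW_KEY,cf:name,cf:city' \
  employee /user/hbase/sample.csv
```

After the job finishes, you can run `scan 'employee'` in the HBase shell to check the loaded rows.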
Once the MapReduce job completes, you can scan the table and verify the details. You will see the data loaded into the HBase table.
Also note that the ImportTsv command leaves a large number of log files in this location- /var/logs -so you should have enough space on the cluster.
This is one of the easiest methods to import a CSV file into an HBase table using the importtsv command.
Do try loading a CSV file into an HBase table using importtsv, and let us know if you face any difficulty.
There are many more methods to load a CSV file into an HBase table, and we will discuss those in our upcoming blog posts.