HBase: Bulk upload using ImportTsv in HBase Table
ImportTsv helps you upload data into HBase from a TSV file placed on HDFS. Let's take an example HBase table, "employee", with column family "basic_info".

Following is an example file on HDFS:

```
$ hadoop fs -cat /user/hadoop/importtsv
1	emp1	24
2	emp2	26
3	emp3	24
```

Uploading data into the employee table using ImportTsv:

```
$HBASE_HOME/bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,basic_info:empname,basic_info:age \
  employee hdfs://<namenode:address>/user/hadoop/importtsv
```

The first column of the file will be stored as the row key, the second column as "empname", and the third column as "age", as follows:

```
hbase(main):013:0> scan 'employee'
ROW    COLUMN+CELL
 1     column=basic_info:age, timestamp=1360238054613, value=24
 1     column=basic_info:empname, timestamp=1360238054613, value=emp1
 2     column=basic_info:age, timestamp=1360238054613, value=26
 2     column=basic_info:empname, timestamp=1360238054613, value=emp2
 3     column=basic_info:age, timestamp=1360238054613, value=24
 3     column=basic_info:empname, timestamp=1360238054613, value=emp3
```
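Note that ImportTsv splits fields on the tab character by default (you can change this with `-Dimporttsv.separator`), so the input file must be genuinely tab-separated. As a minimal sketch, here is one way to generate the example input file locally before copying it to HDFS; the rows match the example above, and the local filename `importtsv` is just an illustration:

```python
# Build the tab-separated input file used by the ImportTsv example above.
# First field -> HBASE_ROW_KEY, remaining fields -> the columns listed
# in -Dimporttsv.columns, in order.
rows = [
    ("1", "emp1", "24"),
    ("2", "emp2", "26"),
    ("3", "emp3", "24"),
]

with open("importtsv", "w") as f:
    for row in rows:
        f.write("\t".join(row) + "\n")

# Then copy it to HDFS, e.g.:
#   hadoop fs -put importtsv /user/hadoop/importtsv
```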