TECH SOLUTIONS

Forum Home > Big Data Learnings (Hadoop, HBase, Hive and other big data technologies) > Sqoop: Import Data from MySQL to HDFS

Sourav Gulati
Site Owner
Posts: 83

Following are the steps to import data from MySQL to HDFS using Sqoop:


1) Log in to MySQL as follows:

 mysql -h hostname -u root -p


2) Create a database in MySQL:

mysql> CREATE DATABASE testhadoop;

Alternatively, you can use an existing database.


3) Grant all privileges on the database to the users Sqoop will connect as:

mysql> GRANT ALL PRIVILEGES ON testhadoop.* TO '%'@'host-ip';

mysql> GRANT ALL PRIVILEGES ON testhadoop.* TO ''@'host-ip';


4) Create a table in the database and insert some data into it.


mysql> USE testhadoop;

mysql> CREATE TABLE employee (empid INT, empname VARCHAR(20), salary INT);

mysql> INSERT INTO employee VALUES (1,'emp1',10000);

mysql> INSERT INTO employee VALUES (2,'emp2',15000);

mysql> INSERT INTO employee VALUES (3,'emp3',20000);


5) The table now contains the following data:

mysql> select * from employee;

+-------+---------+--------+

| empid | empname | salary |

+-------+---------+--------+

|     1 | emp1    |  10000 |

|     2 | emp2    |  15000 |

|     3 | emp3    |  20000 |

+-------+---------+--------+


6) Copy the MySQL JDBC connector jar into $SQOOP_HOME/lib.
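Whether the connector jar is actually in place can be checked with a small shell snippet before running the import. This is just a convenience sketch, not part of Sqoop itself; the jar's exact file name varies by connector version, and the default path below is an assumption:

```shell
# Pre-flight check: look for a MySQL connector jar in Sqoop's lib directory.
# SQOOP_HOME defaults to a common install location if not already set.
SQOOP_HOME="${SQOOP_HOME:-/usr/lib/sqoop}"
if ls "$SQOOP_HOME"/lib/mysql-connector-*.jar >/dev/null 2>&1; then
    status="MySQL connector found in $SQOOP_HOME/lib"
else
    status="No MySQL connector jar in $SQOOP_HOME/lib -- copy it there first"
fi
echo "$status"
```

Running the import without the connector jar fails with a driver class not found error, so checking up front saves a round trip.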


7) Now the employee table can be imported from MySQL to HDFS using Sqoop as follows:

$ $SQOOP_HOME/bin/sqoop import --connect jdbc:mysql://host-ip/testhadoop --table employee
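If the MySQL server requires credentials, or you want to control the output directory and the number of map tasks, the same import takes a few more options. The sketch below assembles and prints the full command for review first; dbuser, dbpass, and the target directory name are placeholders, not values from the original post:

```shell
# Sketch: the same import with credentials, an explicit HDFS target directory,
# and a single mapper (-m 1). dbuser/dbpass/host-ip are placeholders.
SQOOP_HOME="${SQOOP_HOME:-/usr/lib/sqoop}"
HOST_IP="host-ip"
CMD="$SQOOP_HOME/bin/sqoop import \
  --connect jdbc:mysql://$HOST_IP/testhadoop \
  --username dbuser --password dbpass \
  --table employee \
  --target-dir employee \
  -m 1"
echo "$CMD"
```

Printing the assembled command makes it easy to verify the connect string before actually launching the job.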


8) It will create a directory "employee" on HDFS. The table contents can be read from HDFS as follows:

$ hadoop fs -cat employee/part-m-00000

1,emp1,10000

2,emp2,15000

3,emp3,20000
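Because the imported file is plain comma-separated text, ordinary shell tools can process it directly. For example, piping `hadoop fs -cat employee/part-m-00000` into the following awk one-liner sums the salary column; the here-document below just stands in for that pipe using the sample rows above:

```shell
# Sum the third (salary) column of the exported CSV. In practice the input
# would come from: hadoop fs -cat employee/part-m-00000
total=$(awk -F',' '{ s += $3 } END { print s }' <<'EOF'
1,emp1,10000
2,emp2,15000
3,emp3,20000
EOF
)
echo "$total"   # prints 45000
```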




Tags: Load data from mysql to HDFS






February 19, 2013 at 1:47 AM
