Sunday, April 29, 2018

SQOOP Import and Export Examples


Below are some sample commands for the Sqoop import and export operations, used to move data between a relational database (MySQL in these examples) and an HDFS location: import copies rows from a database table into HDFS, and export pushes files from HDFS back into a database table.

Export:
sqoop export --connect jdbc:mysql://mysqldb.****.****/database --table <table_name> --username ******* --password ****** --fields-terminated-by ',' -m 1 --export-dir <HDFS Path>
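
Note that sqoop export does not create the target table; it must already exist in the database before the job runs, with columns matching the delimited files under --export-dir. A minimal sketch, assuming a hypothetical two-column layout, would be to create it in MySQL first:

mysql> CREATE TABLE <table_name> (id INT, name VARCHAR(100));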

Import:
sqoop import --connect jdbc:mysql://mysqldb.******.****/MyDB --table customers --username ****** --password ****** --target-dir batch/sqoop/job1 -m 1
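
Once the import finishes, the output can be verified with standard HDFS commands (part-m-00000 is the file name a single-mapper, map-only import typically produces):

hdfs dfs -ls batch/sqoop/job1
hdfs dfs -cat batch/sqoop/job1/part-m-00000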

=> -m 1 loads all the data into a single part file.
=> -m 5 splits the data across 5 separate part files (one per mapper).
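
With more than one mapper, Sqoop needs a column to divide the work on and defaults to the table's primary key. If the table has none, supply one with --split-by, as in this sketch (customer_id and the job2 directory are assumptions, not from the examples above):

sqoop import --connect jdbc:mysql://mysqldb.******.****/MyDB --table customers --username ****** --password ****** --target-dir batch/sqoop/job2 -m 5 --split-by customer_id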

Import specific columns:
sqoop import --connect jdbc:mysql://mysqldb.edu.cloudlab.com/retail_db --table customers --username labuser --password edureka --target-dir batch/sqoop/job1 --columns "column1,column2" -m 1
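
For filtering rows as well as columns, Sqoop also accepts a free-form query. The sketch below reuses the same retail_db connection; with --query, the literal $CONDITIONS token must appear in the WHERE clause, --target-dir is mandatory, and the job3 directory is just an example name:

sqoop import --connect jdbc:mysql://mysqldb.edu.cloudlab.com/retail_db --username labuser --password edureka --query 'SELECT column1, column2 FROM customers WHERE $CONDITIONS' --target-dir batch/sqoop/job3 -m 1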