Store all the Java files in a directory called java_output so they can be evaluated later.
Answer:
Solution:
Step 1: Before implementing the solution, drop all the tables we created in previous problems.
Log in to Hive and execute the following commands.
show tables;
drop table categories;
drop table customers;
drop table departments;
drop table employee;
drop table order_items;
drop table orders;
drop table products;
show tables;
Check the warehouse directory: hdfs dfs -ls /user/hive/warehouse
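(Optional, a minimal sketch assuming the same table names as above: the cleanup can also be run non-interactively from the shell with hive -e, and IF EXISTS avoids errors for tables that were never created.)
hive -e "
DROP TABLE IF EXISTS categories;
DROP TABLE IF EXISTS customers;
DROP TABLE IF EXISTS departments;
DROP TABLE IF EXISTS employee;
DROP TABLE IF EXISTS order_items;
DROP TABLE IF EXISTS orders;
DROP TABLE IF EXISTS products;"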
Step 2: Now that we have a clean database, import the entire retail_db with all the parameters the problem statement requires.
sqoop import-all-tables \
-m 3 \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username=retail_dba \
--password=cloudera \
--hive-import \
--hive-overwrite \
--create-hive-table \
--compress \
--compression-codec org.apache.hadoop.io.compress.SnappyCodec \
--outdir java_output
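(An optional cross-check, not required by the problem statement: sqoop eval can run a query directly against MySQL with the same connection parameters, so you can compare the source row count with the count you get from Hive in Step 3. The table name customers here is only an example.)
sqoop eval \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username retail_dba \
--password cloudera \
--query "select count(*) from customers"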
Step 3: Verify that the work was accomplished.
a. Go to Hive and check all the tables:
hive
show tables;
select count(1) from customers;
b. Check the warehouse directory and the number of partitions:
hdfs dfs -ls /user/hive/warehouse
hdfs dfs -ls /user/hive/warehouse/categories
c. Check the output Java directory.
ls -ltr java_output/
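d. (Optional sketch) To confirm that Snappy compression was actually applied, list one table directory and look for .snappy data files, or inspect the table's storage information in Hive with describe formatted; the table customers is used here only as an example:
hdfs dfs -ls /user/hive/warehouse/customers
hive -e "describe formatted customers;"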