Now import only the newly inserted records and append them to the existing directory, which was created in the first step.

Answer: Solution: Step 1: Clean the already imported data. (In the real exam, please make sure you don't delete data generated by a previous exercise.) hadoop fs -rm -R departments Step 2: Import data...
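A minimal sketch of the append-only import, assuming the standard quickstart connect string and using a --where predicate as an illustrative placeholder for whichever condition identifies the newly inserted rows (neither detail is taken from the original answer):

sqoop import \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --target-dir /user/cloudera/departments \
  --append \
  --where "department_id > 7"

The --append flag writes the new files into the existing target directory instead of failing because it already exists.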

Now import the data from the following directory into the departments_export table: /user/cloudera/departments_new

Answer: Solution: Step 1: Log in to the mysql db. mysql --user=retail_dba --password=cloudera show databases; use retail_db; show tables; Step 2: Create a table as given in the problem statement. CREATE table departments_export (department_id int(11), department_name varchar(45), created_date TIMESTAMP DEFAULT NOW());...
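A minimal sketch of the corresponding export from that directory into the departments_export table; the quickstart connect string is an assumption for illustration:

sqoop export \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments_export \
  --export-dir /user/cloudera/departments_new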

Now do the incremental import based on the created_date column.

Answer: Solution: Step 1: Log in to the mysql db. mysql --user=retail_dba --password=cloudera show databases; use retail_db; show tables; Step 2: Create a table as given in the problem statement. CREATE table departments_new (department_id int(11), department_name varchar(45), created_date TIMESTAMP DEFAULT NOW()); show tables;...
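A minimal sketch of the date-based incremental import; the connect string, target directory, and --last-value timestamp (normally taken from the previous run's output) are assumptions for illustration:

sqoop import \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments_new \
  --target-dir /user/cloudera/departments_new \
  --append \
  --incremental lastmodified \
  --check-column created_date \
  --last-value "2020-12-01 00:00:00"

With --incremental lastmodified, sqoop fetches only the rows whose created_date is newer than the supplied --last-value.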

Now export this data from HDFS to the mysql retail_db.departments table. During the upload, make sure existing departments are only updated and that no new departments are inserted.

Answer: Solution: Step 1: Create a csv file named updateddepartments.csv with the given content. Step 2: Now upload this file to HDFS. Create a...
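A minimal sketch of the update-only export; the HDFS directory holding the uploaded csv is a hypothetical placeholder, and the connect string is an assumption for illustration:

sqoop export \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --export-dir /user/cloudera/updated_departments \
  --input-fields-terminated-by ',' \
  --update-key department_id \
  --update-mode updateonly

--update-key with --update-mode updateonly turns every exported row into an UPDATE statement, so existing departments are modified and no new rows are inserted.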

Now export data from the hive table departments_hive01 into departments_hive02. While exporting, please note the following: wherever there is an empty string, it should be loaded as a null value in mysql; wherever there is a -999 value for an int field, it should be created as a null value.

Answer: Solution: Step 1: Create...
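A minimal sketch of the export with the required null handling; the connect string, the default hive warehouse path, and the Ctrl-A delimiter are assumptions for illustration:

sqoop export \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments_hive02 \
  --export-dir /user/hive/warehouse/departments_hive01 \
  --input-fields-terminated-by '\001' \
  --input-null-string "" \
  --input-null-non-string "-999"

Here --input-null-string maps empty strings to NULL for string columns, and --input-null-non-string maps -999 to NULL for the int columns.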

Data should be written as text to HDFS.

Answer: Solution: Step 1: Create the directory. mkdir /tmp/spooldir2 Step 2: Create a flume configuration file with the below configuration for source, sink and channel, and save it as flume8.conf. agent1.sources = source1 agent1.sinks = sink1a sink1b agent1.channels = channel1a channel1b agent1.sources.source1.channels = channel1a...
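A minimal single-sink sketch of such an agent (the original answer fans out to two sinks and two channels; this is simplified to one of each). The HDFS path is a hypothetical placeholder; setting hdfs.fileType to DataStream with writeFormat Text is what makes the output plain text:

agent1.sources = source1
agent1.channels = channel1a
agent1.sinks = sink1a

agent1.sources.source1.type = spooldir
agent1.sources.source1.spoolDir = /tmp/spooldir2
agent1.sources.source1.channels = channel1a

agent1.channels.channel1a.type = file

agent1.sinks.sink1a.type = hdfs
agent1.sinks.sink1a.channel = channel1a
agent1.sinks.sink1a.hdfs.path = /tmp/flume/primary
agent1.sinks.sink1a.hdfs.fileType = DataStream
agent1.sinks.sink1a.hdfs.writeFormat = Text

The agent can then be started with: flume-ng agent --conf /etc/flume-ng/conf --conf-file flume8.conf --name agent1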

Please import data into a non-existing table; that is, while importing, create a hive table named hadoopexam.departments_new.

Answer: Solution: Step 1: Go to the hive interface and create the database. hive create database hadoopexam; Step 2: Use the database created in the above step and then create a table in it. use hadoopexam; show tables; Step...
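A minimal sketch of an import that creates the hive table as part of the job, assuming the retail_db departments table as the source; the connect string is an assumption for illustration:

sqoop import \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --create-hive-table \
  --hive-table hadoopexam.departments_new

--create-hive-table makes the job fail if the table already exists, which matches the "non-existing table" requirement.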

Now import data from the mysql table departments to this hive table. Please make sure that the data is visible using the below hive command: select * from departments_hive

Answer: Solution: Step 1: Create the hive table as stated. hive show tables; create table departments_hive(department_id int, department_name string); Step 2: The important thing here is,...
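A minimal sketch of the import into the pre-created table; because the hive table was created with hive's default Ctrl-A field delimiter, the import keeps that delimiter so that select * from departments_hive returns readable rows. The connect string is an assumption for illustration:

sqoop import \
  --connect "jdbc:mysql://quickstart:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --hive-table departments_hive \
  --fields-terminated-by '\001'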
