Now import data from the MySQL table departments_hive01 into this Hive table. Make sure the data is visible using the Hive command below. While importing, if a null value is found in the department_name column, replace it with "" (an empty string); for the id column, replace it with -999.
select * from departments_hive;
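The null substitution described above maps onto Sqoop's null-handling flags: --null-string covers string columns (department_name) and --null-non-string covers numeric ones (id). A minimal sketch, assuming the JDBC URL and credentials given in Problem Scenario 1:

```shell
# Sketch: import departments_hive01 into the Hive table departments_hive,
# writing "" for null strings and -999 for null non-string columns.
sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments_hive01 \
  --hive-import \
  --hive-table departments_hive \
  --null-string '' \
  --null-non-string -999 \
  -m 1
```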
Now import only the newly inserted records and append them to the existing directory created in the first step.
Answer: Solution:
Step 1: Clean up the already imported data. (In the real exam, make sure you don't delete data generated in previous exercises.)
hadoop fs -rm -R departments
Step 2: Import data...
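The append-only import above corresponds to Sqoop's incremental append mode. A sketch follows; the check column and --last-value are assumptions (you would pass the highest department_id already present in the target directory):

```shell
# Sketch: append only rows whose department_id is greater than the last imported value.
sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --target-dir departments \
  --incremental append \
  --check-column department_id \
  --last-value 7 \
  -m 1
```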
Now export this data from HDFS to the MySQL retail_db.departments table. During the export, make sure existing departments are only updated and no new departments are inserted.
Answer: Solution:
Step 1: Create a CSV file named updateddepartments.csv with the given content.
Step 2: Now upload this file to HDFS. Create a...
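The update-without-insert requirement maps onto Sqoop's --update-mode updateonly: rows matching --update-key are updated, and rows without a match are silently skipped rather than inserted. A sketch, where the export directory is an assumption about where the CSV was uploaded:

```shell
# Sketch: update existing departments only; no new rows are inserted.
# --export-dir is an assumed HDFS path for the uploaded updateddepartments.csv.
sqoop export \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --export-dir departments_export \
  --update-key department_id \
  --update-mode updateonly \
  --fields-terminated-by ','
```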
Now do the incremental import based on the created_date column.
Answer: Solution:
Step 1: Log in to the MySQL DB:
mysql --user=retail_dba --password=cloudera
show databases;
use retail_db;
show tables;
Step 2: Create the table given in the problem statement:
CREATE TABLE departments_new (department_id int(11), department_name varchar(45), created_date TIMESTAMP DEFAULT NOW());
show tables;...
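For a timestamp column such as created_date, Sqoop's incremental lastmodified mode applies. The --last-value below is a placeholder for the timestamp of the previous run, and --append is required so new files are added instead of overwriting the target directory:

```shell
# Sketch: incrementally import rows whose created_date is newer than the last run.
sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments_new \
  --target-dir departments_new \
  --incremental lastmodified \
  --check-column created_date \
  --last-value "2014-01-01 00:00:00" \
  --append \
  -m 1
```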
Data should be written as text to HDFS.
Answer: Solution:
Step 1: Create a directory:
mkdir /tmp/spooldir2
Step 2: Create a Flume configuration file with the configuration below for the source, sinks, and channels, and save it as flume8.conf.
agent1.sources = source1
agent1.sinks = sink1a sink1b
agent1.channels = channel1a channel1b
agent1.sources.source1.channels = channel1a channel1b...
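A fuller flume8.conf might look like the sketch below: one spooling-directory source fanned out to two HDFS sinks. The HDFS sink paths and the channel types are assumptions; hdfs.fileType = DataStream is what makes the sinks write plain text as the question requires.

```properties
# flume8.conf (sketch) -- spooldir source replicated to two HDFS sinks
agent1.sources = source1
agent1.sinks = sink1a sink1b
agent1.channels = channel1a channel1b

agent1.sources.source1.channels = channel1a channel1b
agent1.sources.source1.type = spooldir
agent1.sources.source1.spoolDir = /tmp/spooldir2

agent1.sinks.sink1a.channel = channel1a
agent1.sinks.sink1a.type = hdfs
# assumed HDFS destination path
agent1.sinks.sink1a.hdfs.path = /tmp/flume/primary
# DataStream writes events as plain text
agent1.sinks.sink1a.hdfs.fileType = DataStream

agent1.sinks.sink1b.channel = channel1b
agent1.sinks.sink1b.type = hdfs
# assumed HDFS destination path
agent1.sinks.sink1b.hdfs.path = /tmp/flume/secondary
agent1.sinks.sink1b.hdfs.fileType = DataStream

# assumed channel types
agent1.channels.channel1a.type = file
agent1.channels.channel1b.type = memory
```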
Problem Scenario 1: You have been given a MySQL DB with the following details:
user=retail_dba
password=cloudera
database=retail_db
table=retail_db.categories
jdbc URL = jdbc:mysql://quickstart:3306/retail_db
Please accomplish the following activities.
Answer: Solution:
Step 1: Connect to the existing MySQL database:
mysql --user=retail_dba --password=cloudera retail_db
Step 2: Show all the available tables:
show tables;
Step 3: View/Count...
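The steps above can be sketched as a short MySQL session; the count and preview queries against categories are assumptions about the "View/Count" step the answer truncates:

```sql
-- Connect first from the shell: mysql --user=retail_dba --password=cloudera retail_db
show tables;
-- assumed "Count" step
select count(1) from categories;
-- assumed "View" step
select * from categories limit 10;
```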
Data should be written as text to HDFS.
Answer: Solution:
Step 1: Create a directory:
mkdir /tmp/nrtcontent
Step 2: Create a Flume configuration file with the configuration below for the source, sink, and channel, and save it as flume6.conf.
agent1.sources = source1
agent1.sinks = sink1
agent1.channels = channel1
agent1.sources.source1.channels = channel1...
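Once flume6.conf is written, the agent is typically started as below; the --conf directory is an assumption for a standard quickstart-style install:

```shell
# Start the Flume agent named "agent1" defined in flume6.conf
flume-ng agent \
  --conf /etc/flume-ng/conf \
  --conf-file flume6.conf \
  --name agent1 \
  -Dflume.root.logger=INFO,console
```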
Create a file in the local filesystem named file1.txt and put it into HDFS.
Answer: Solution:
Step 1: Create a directory in HDFS:
hdfs dfs -mkdir hdfs_commands
Step 2: Create a file named data.txt in hdfs_commands in HDFS:
hdfs dfs -touchz hdfs_commands/data.txt
Step 3: Now copy this data.txt file to the local filesystem; however, while...
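The local-file-to-HDFS flow asked for in the question can be sketched with the basic HDFS shell commands; the sample contents of file1.txt are an assumption:

```shell
# Create a file locally, put it into HDFS, and verify it arrived.
echo "sample data" > file1.txt
hdfs dfs -mkdir -p hdfs_commands
hdfs dfs -put file1.txt hdfs_commands/
hdfs dfs -cat hdfs_commands/file1.txt
```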
Please import data into a non-existing table, i.e., while importing, create a Hive table named hadoopexam.departments_new.
Answer: Solution:
Step 1: Go to the Hive interface and create the database:
hive
create database hadoopexam;
Step 2: Use the database created in the step above and then create a table in it:
use hadoopexam;
show tables;
Step...
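The import that creates the Hive table on the fly can be sketched as below. The MySQL source table (departments) is an assumption; note that --create-hive-table makes the job fail if the target table already exists, which enforces the "non-existing table" condition:

```shell
# Sketch: import from MySQL and create the Hive table during the import.
sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --create-hive-table \
  --hive-table hadoopexam.departments_new \
  -m 1
```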