Problem Scenario 25: You have been given the below comma-separated employee information, which needs to be appended to the /home/cloudera/flumetest/in.txt file (to act as a tail source):
sex, name, city
1, alok, mumbai
1, jatin, chennai
1, yogesh, kolkata
2, ragini, delhi
2, jyotsana, pune
1, valmiki, banglore
Create a flume conf file...
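A minimal sketch of the kind of flume configuration this scenario calls for, assuming an agent named agent1, an exec source tailing in.txt, a memory channel, and an HDFS sink path of /user/cloudera/flumetest (the agent name, sink path, and channel sizing are assumptions, not the graded answer):

# flume1.conf - tail /home/cloudera/flumetest/in.txt into HDFS (sketch)
agent1.sources = source1
agent1.sinks = sink1
agent1.channels = channel1

# exec source running tail -F on the input file
agent1.sources.source1.type = exec
agent1.sources.source1.command = tail -F /home/cloudera/flumetest/in.txt
agent1.sources.source1.channels = channel1

# in-memory channel
agent1.channels.channel1.type = memory
agent1.channels.channel1.capacity = 1000

# HDFS sink writing plain text files
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /user/cloudera/flumetest
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text
agent1.sinks.sink1.channel = channel1

# start the agent (conf paths are assumptions):
# flume-ng agent --conf /etc/flume-ng/conf --conf-file flume1.conf --name agent1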
Now export this data from HDFS to the mysql retail_db.departments table. During the export, make sure that existing departments are only updated and that no new departments are inserted.
Answer: Solution: Step 1: Create a csv file named updateddepartments.csv with the given content. Step 2: Now upload this file to HDFS. Create a...
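A hedged sketch of the sqoop export this answer is building toward; the connection string, credentials, and export directory are assumptions. The key flags are --update-key and --update-mode updateonly, which restrict the export to UPDATE statements so no new rows are inserted:

sqoop export \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --export-dir /user/cloudera/updated_departments \
  --update-key department_id \
  --update-mode updateonly \
  --fields-terminated-by ','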
Data should be written as text to HDFS.
Answer: Solution: Step 1: Create the directories: mkdir /tmp/spooldir/bb and mkdir /tmp/spooldir/dr. Step 2: Create a flume configuration file with the below configuration for the agent:
agent1.sources = source1 source2
agent1.sinks = sink1
agent1.channels = channel1
agent1.sources.source1.channels = channel1
agent1.sources.source2.channels = channel1
agent1.sinks.sink1.channel = ...
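A fuller sketch of what the truncated configuration likely continues into, assuming two spooling-directory sources feeding one memory channel and an HDFS sink that writes text (the HDFS path is an assumption):

agent1.sources = source1 source2
agent1.sinks = sink1
agent1.channels = channel1

# spooling-directory sources watching the two input directories
agent1.sources.source1.type = spooldir
agent1.sources.source1.spoolDir = /tmp/spooldir/bb
agent1.sources.source1.channels = channel1

agent1.sources.source2.type = spooldir
agent1.sources.source2.spoolDir = /tmp/spooldir/dr
agent1.sources.source2.channels = channel1

agent1.channels.channel1.type = memory

# HDFS sink configured to write plain text
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /user/cloudera/flume/spooldata
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text
agent1.sinks.sink1.channel = channel1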
Store all the Java files in a directory called java_output, to be evaluated further.
Answer: Solution: Step 1: Drop all the tables which we created in previous problems before implementing the solution. Login to hive and execute the following commands: show tables; drop table categories; drop table customers; drop table...
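Sqoop generates one Java class per imported table, and the --outdir flag controls where those .java files land. A hedged example of how that requirement is typically satisfied; the connection details, and the guess that the full question asks for import-all-tables with a hive import, are assumptions:

sqoop import-all-tables \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --hive-import \
  --outdir java_output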
Also make sure you have imported only two columns from the table: department_id and department_name.
Answer: Solutions: Step 1: Clean the hdfs file system; if these directories exist, clean them out: hadoop fs -rm -R departments hadoop fs -rm -R categories hadoop fs -rm -R products hadoop fs -rm -R orders hadoop...
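A sketch of the column-restricted import this scenario describes; apart from the --columns list and the departments table, everything (connection string, credentials, target directory) is an assumption:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --columns "department_id,department_name" \
  --target-dir /user/cloudera/departments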
Now import only the newly inserted records and append them to the existing directory which was created in the first step.
Answer: Solution: Step 1: Clean the already imported data. (In the real exam, please make sure you don't delete data generated from the previous exercise.) hadoop fs -rm -R departments Step 2: Import data...
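A hedged sketch of the incremental import that Step 2 of the truncated answer points to; the check column, last value, and target directory are assumptions based on the departments table used throughout these scenarios:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --target-dir /user/cloudera/departments \
  --incremental append \
  --check-column department_id \
  --last-value 7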
Now import data from the mysql table departments into this hive table. Please make sure that the data is visible using the below hive command: select * from departments_hive
Answer: Solution: Step 1: Create the hive table as stated. hive show tables; create table departments_hive(department_id int, department_name string); Step 2: The important thing here is,...
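A sketch of the import into the hive table created in Step 1. The detail the truncated answer hints at is matching hive's default field delimiter (\001) so the rows are readable from the existing table definition; connection details are assumptions:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --hive-overwrite \
  --hive-table departments_hive \
  --fields-terminated-by '\001'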
Now export data from hive table departments_hive01 into departments_hive02. While exporting, please note the following: wherever there is an empty string, it should be loaded as a null value in mysql; wherever there is a -999 value for an int field, it should be created as a null value.
Answer: Solution: Step 1: Create...
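A hedged sketch of the export with null handling; --input-null-string and --input-null-non-string tell sqoop which values in the HDFS data should become SQL NULLs. The table names come from the scenario, while the connection details and the hive warehouse path are assumptions:

sqoop export \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments_hive02 \
  --export-dir /user/hive/warehouse/departments_hive01 \
  --input-fields-terminated-by '\001' \
  --input-null-string "" \
  --input-null-non-string "-999"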