Write a Sqoop job which will import the "retaildb.categories" table to hdfs, in a directory named "categories_targetJob".

Answer: Solution:
Step 1: Connect to the existing MySQL database: mysql --user=retail_dba --password=cloudera retail_db
Step 2: Show all the available tables: show tables;
Step 3: Below is the command to create the Sqoop job (Please note...
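A minimal sketch of the full job definition, assuming the quickstart JDBC URL used elsewhere in these answers; the job name categories_import_job is hypothetical:

sqoop job --create categories_import_job \
  -- import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table categories \
  --target-dir categories_targetJob

# Verify and run the saved job; Sqoop may re-prompt for the password at
# exec time unless the metastore is configured to record it.
sqoop job --list
sqoop job --exec categories_import_job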


Data should be written as text to hdfs

Answer: Solution:
Step 1: Create the directory: mkdir /tmp/spooldir2
Step 2: Create a flume configuration file with the below configuration for source, sink and channel, and save it in flume8.conf:
agent1.sources = source1
agent1.sinks = sink1a sink1b
agent1.channels = channel1a channel1b
agent1.sources.source1.channels = channel1a...
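A sketch of what the complete flume8.conf might look like, assuming a spooling-directory source fanned out to two HDFS sinks; the hdfs.path values are hypothetical. hdfs.fileType = DataStream is what makes the sinks write plain text:

agent1.sources = source1
agent1.sinks = sink1a sink1b
agent1.channels = channel1a channel1b
agent1.sources.source1.channels = channel1a channel1b

# Spooling-directory source watching the directory created in Step 1
agent1.sources.source1.type = spooldir
agent1.sources.source1.spoolDir = /tmp/spooldir2

# Two HDFS sinks, each on its own channel, writing text (DataStream)
agent1.sinks.sink1a.channel = channel1a
agent1.sinks.sink1a.type = hdfs
agent1.sinks.sink1a.hdfs.path = /tmp/flume/primary
agent1.sinks.sink1a.hdfs.fileType = DataStream

agent1.sinks.sink1b.channel = channel1b
agent1.sinks.sink1b.type = hdfs
agent1.sinks.sink1b.hdfs.path = /tmp/flume/secondary
agent1.sinks.sink1b.hdfs.fileType = DataStream

agent1.channels.channel1a.type = memory
agent1.channels.channel1b.type = memory

Start the agent with: flume-ng agent --conf-file flume8.conf --name agent1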


Now import the data from the following directory into the departments_export table: /user/cloudera/departments new

Answer: Solution:
Step 1: Login to the mysql db:
mysql --user=retail_dba --password=cloudera
show databases;
use retail_db;
show tables;
Step 2: Create a table as given in the problem statement:
CREATE table departments_export (department_id int(11), department_name varchar(45), created_date TIMESTAMP DEFAULT NOW());...
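The export command itself is truncated above; a sketch, assuming the quickstart JDBC URL and that the source directory is /user/cloudera/departments_new (the underscore is an assumption; the listing shows "departments new"):

sqoop export \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments_export \
  --export-dir /user/cloudera/departments_new \
  --batch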


Please import the departments table in a directory called departments_enclosedby, and the file should be processable by a downstream system.

Answer: Solution:
Step 1: Connect to the mysql database:
mysql --user=retail_dba --password=cloudera
show databases;
use retail_db;
show tables;
Insert a record:
Insert into departments values(9999, '"Data Science"');
select * from departments;
Step 2:...
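Step 2 is truncated above; a sketch of the likely import, assuming the standard enclosed-by/escaped-by options this kind of problem tests (the exact delimiters are assumptions):

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --target-dir departments_enclosedby \
  --fields-terminated-by ',' \
  --lines-terminated-by '\n' \
  --enclosed-by '\"' \
  --escaped-by '\\'

# Enclosing fields in quotes and escaping embedded quotes keeps the
# 9999,'"Data Science"' row parseable by a downstream system.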


Please import data into a non-existing table; that is, while importing, create a hive table named hadoopexam.departments_new.

Answer: Solution:
Step 1: Go to the hive interface and create the database:
hive
create database hadoopexam;
Step 2: Use the database created in the above step and then create a table in it:
use hadoopexam;
show tables;
Step...
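The remaining step is truncated; a sketch of the import, assuming --create-hive-table is the intended mechanism:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --hive-table hadoopexam.departments_new \
  --create-hive-table \
  -m 1

# --create-hive-table makes the job fail if the target table already
# exists, which matches "import into a non-existing table".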


Problem Scenario 25: You have been given below comma separated employee information, which needs to be added to the /home/cloudera/flumetest/in.txt file (to use a tail source).

sex, name, city
1, alok, mumbai
1, jatin, chennai
1, yogesh, kolkata
2, ragini, delhi
2, jyotsana, pune
1, valmiki, banglore
Create a flume conf file...
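A sketch of such a conf file, assuming an exec source tailing in.txt into an HDFS sink; the agent/source/sink names and the hdfs.path are hypothetical:

agent1.sources = src1
agent1.sinks = snk1
agent1.channels = ch1

# Tail source: emits lines as they are appended to in.txt
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /home/cloudera/flumetest/in.txt
agent1.sources.src1.channels = ch1

# HDFS sink writing plain text
agent1.sinks.snk1.type = hdfs
agent1.sinks.snk1.hdfs.path = /user/cloudera/flumetest
agent1.sinks.snk1.hdfs.fileType = DataStream
agent1.sinks.snk1.channel = ch1

agent1.channels.ch1.type = memory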


Also make sure your results fields are terminated by '|' and lines terminated by '\n'.

Answer: Solutions:
Step 1: Clean the hdfs file system; if these directories exist, remove them:
hadoop fs -rm -R departments
hadoop fs -rm -R categories
hadoop fs -rm -R products
hadoop fs -rm -R orders
hadoop...
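Since the cleanup covers all four retail_db tables, the truncated import step is presumably an import-all-tables run; a sketch with the delimiters the question asks for (the all-tables assumption is mine):

sqoop import-all-tables \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --fields-terminated-by '|' \
  --lines-terminated-by '\n' \
  -m 1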


Import the departments table from mysql to hdfs as a parquet file in the departments_parquet directory.

Answer: Solution:
Step 1: Import the departments table from mysql to hdfs as textfile:
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --table departments --as-textfile --target-dir=departments_text
Verify the imported data:
hdfs dfs -cat departments_text/part*
Step...
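The truncated parquet step presumably mirrors the textfile command; a sketch:

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --as-parquetfile \
  --target-dir departments_parquet

# Parquet is binary, so verify with parquet-tools (if available) rather
# than hdfs dfs -cat.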


Problem Scenario 4: You have been given a MySQL DB with the following details.

user=retail_dba
password=cloudera
database=retail_db
table=retail_db.categories
jdbc URL = jdbc:mysql://quickstart:3306/retail_db
Please accomplish the following activity: Import the single table categories (subset data) to a hive managed table, where category_id is between 1 and 22.
Answer: Solution:
Step 1: Import the single table (subset data):
sqoop...
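A sketch of the subset import, assuming --where supplies the filter and that the Hive table keeps the source table's name (that target name is an assumption):

sqoop import \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table categories \
  --where "category_id between 1 and 22" \
  --hive-import \
  --hive-table categories \
  -m 1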
