Huawei H13-723-ENU HCIP-Big Data Developer Online Training
The questions for H13-723-ENU were last updated at Nov 23,2024.
- Exam Code: H13-723-ENU
- Exam Name: HCIP-Big Data Developer
- Certification Provider: Huawei
Which of the following can cause the HDFS NameNode to enter safe mode (read-only mode)? (multiple choice)
- A . The disk space where the metadata of the active and standby NameNodes resides is insufficient.
- B . The number of lost blocks exceeds the threshold.
- C . The number of missing replicas exceeds the threshold.
- D . The number of corrupt replicas exceeds the threshold.
For the HBase component of the FusionInsight HD platform, what attributes of the secondary index need to be defined when adding a secondary index? (multiple choice)
- A . Index name
- B . Index column
- C . Index column type
- D . The name of the column family to which the index column belongs
When a MapReduce application is executed, which of the following actions occurs before the map phase?
- A . split
- B . combine
- C . partition
- D . sort
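The ordering the question probes can be sketched in plain Java (no Hadoop dependency; the class and method names here are invented for illustration): input is split into records before any map call, while partition, sort, and the optional combine run on map output. The partition formula below is Hadoop's default HashPartitioner logic.

```java
import java.util.*;

public class PipelineSketch {
    // Split: divide the input into records BEFORE any map call (assumed record = one line).
    static List<String> split(String input) {
        return Arrays.asList(input.split("\n"));
    }

    // Map: emit (word, 1) pairs per record.
    static List<Map.Entry<String, Integer>> map(String record) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String w : record.split(" ")) out.add(Map.entry(w, 1));
        return out;
    }

    // Partition (AFTER map): Hadoop's default HashPartitioner formula.
    static int partition(String key, int numReducers) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReducers;
    }

    public static void main(String[] args) {
        for (String record : split("a b\nb c")) {
            for (Map.Entry<String, Integer> kv : map(record)) {
                System.out.println(kv.getKey() + " -> reducer " + partition(kv.getKey(), 2));
            }
        }
    }
}
```

Sort and combine (both also post-map) are omitted for brevity; the point is that only the input split precedes the map phase.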
In FusionInsight HD, regarding the secondary development of Hive UDF, which of the following descriptions is correct? (multiple choice)
- A . Before the user-defined UDF is used, it needs to be created in the Hive system.
- B . A user-defined UDF is not allowed to add information such as summary and status.
- C . A user-defined UDF can add the deterministic and stateful annotations as appropriate.
- D . In a secure cluster, it is recommended to create user-defined UDFs before each use.
In the FusionInsight HD product, what is the role name of the Kafka service?
- A . Producer
- B . Broker
- C . Consumer
- D . ZooKeeper
Which of the following are ways to set parameters for a Spark application? (multiple choice)
- A . Configure in the application configuration file spark-defaults.conf
- B . When submitting the application, set them with --conf
- C . In the application code, set by the setProperty method of SparkContext
- D . In the application code, set through the SparkConf object
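The file-based method above can be illustrated with a spark-defaults.conf fragment (the property values here are assumptions for illustration, not recommendations):

```
# spark-defaults.conf -- read by spark-submit at startup
spark.executor.memory   4g
spark.executor.cores    2
```

The same properties can be passed at submit time, e.g. `--conf spark.executor.memory=4g`, or set in code via a SparkConf object, e.g. `new SparkConf().set("spark.executor.memory", "4g")`.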
In a FusionInsight HD cluster, Flume does not support writing collected data to which service in the cluster?
- A . HDFS
- B . HBase
- C . Kafka
- D . Redis
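For reference, a minimal Flume agent definition writing to HDFS, one of the sinks the question alludes to (the agent, component names, and paths below are assumptions):

```
# Hypothetical agent: spooling-directory source -> memory channel -> HDFS sink
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /var/log/input
agent1.sources.src1.channels = ch1

agent1.channels.ch1.type = memory

agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs:///flume/events
agent1.sinks.sink1.channel = ch1
```

Flume ships sinks for HDFS, HBase, and Kafka; there is no built-in Redis sink.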
In Spark Streaming, assuming that lines is a DStream object, which of the following statements periodically counts the number of occurrences of each word on this stream?
- A . lines.flatMap(_.split(" ")).map(word => (word, 1)).reduce(_ + _).print()
- B . lines.flatMap(_.split(" ")).map(word => (word, word.length())).reduceByKey(_ + _).print()
- C . lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _).print()
- D . lines.flatMap(_.split(" ")).flatMap(word => (word, 1)).groupByKey(_ + _).print()
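As a cross-check, the per-batch computation that splitting into words, mapping to (word, 1), and reducing by key performs can be sketched in plain Java (no Spark dependency; the class and method names are invented for illustration):

```java
import java.util.*;
import java.util.stream.*;

public class BatchWordCount {
    // For one batch of lines: flatMap to words, map each word to 1,
    // then merge counts per key -- the same shape as reduceByKey(_ + _).
    static Map<String, Integer> countWords(List<String> lines) {
        return lines.stream()
                .flatMap(l -> Arrays.stream(l.split(" ")))
                .collect(Collectors.toMap(w -> w, w -> 1, Integer::sum));
    }

    public static void main(String[] args) {
        System.out.println(countWords(List.of("to be or", "not to be")));
    }
}
```

In the streaming case the same computation simply repeats on every micro-batch of the DStream.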
HBase filters can use column names or column values as filter conditions, and multiple filters can be combined.
- A . True
- B . False
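The "multiple filters combined" idea can be sketched in plain Java (no HBase dependency; Cell, allOf, and the column/value names are invented for illustration). HBase's FilterList with MUST_PASS_ALL composes filters with AND semantics much like chaining predicates:

```java
import java.util.*;
import java.util.function.*;

public class FilterSketch {
    // A simplified cell as (column, value); real HBase filters inspect
    // qualifiers and values of Cells in much the same way.
    record Cell(String column, String value) {}

    // Combine filters with AND semantics, like FilterList(MUST_PASS_ALL).
    static Predicate<Cell> allOf(List<Predicate<Cell>> filters) {
        return filters.stream().reduce(c -> true, Predicate::and);
    }

    public static void main(String[] args) {
        Predicate<Cell> byColumn = c -> c.column().equals("info:name");
        Predicate<Cell> byValue  = c -> c.value().startsWith("A");
        Predicate<Cell> both = allOf(List.of(byColumn, byValue));
        System.out.println(both.test(new Cell("info:name", "Alice"))); // prints true
    }
}
```

A cell passes only when every combined condition holds, mirroring the MUST_PASS_ALL operator (MUST_PASS_ONE gives OR semantics instead).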
Regarding the disaster tolerance of Streaming, which of the following statements is correct? (multiple choice)
- A . After the Supervisor process exits, Nimbus automatically detects this and restarts it, without affecting the running business.
- B . After a Worker exits abnormally, it is automatically restarted by the Supervisor without manual intervention.
- C . When a node fails, tasks on that node will be reassigned to other normal nodes without manual intervention.
- D . After Nimbus fails, the standby Nimbus will automatically take over, without affecting the running business.