How can the high rate of records be effectively managed in this application?

A Mule application must periodically process a large dataset, varying from 6 GB to 8 GB, from a back-end database and write the transformed data to an FTPS server using a properly configured batch job scope.

The application is approved to run on a CloudHub 0.2 vCore worker with 8 GB of storage capacity, and its performance and concurrency requirements are met.

A. Use streaming with a file store repeatable strategy for reading records from the database and a batch aggregator with streaming to write to FTPS
B. Use streaming with an in-memory repeatable store strategy for reading records from the database and a batch aggregator with streaming to write to FTPS
C. Use streaming with a file store repeatable strategy for reading records from the database and a batch aggregator with an optimal size
D. Use streaming with a file store repeatable strategy for reading records from the database and a batch aggregator without any required configuration

Answer: A
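Option A works because a 0.2 vCore CloudHub worker has far less heap than the 6-8 GB dataset: a file store repeatable stream keeps only a small buffer in memory and spills the rest to the worker's disk, while a streaming batch aggregator forwards records to FTPS without accumulating the whole payload. A minimal Mule 4 sketch of this configuration is shown below; the connector config names (`Database_Config`, `FTPS_Config`), the query, and the output path are hypothetical placeholders, not values from the question.

```xml
<flow name="process-large-dataset">
    <db:select config-ref="Database_Config">
        <!-- File store repeatable stream: holds a small number of records in
             memory and buffers the rest on disk instead of the heap -->
        <repeatable-file-store-iterable inMemoryObjects="500"/>
        <db:sql>SELECT * FROM records</db:sql>
    </db:select>

    <batch:job jobName="largeDatasetBatchJob">
        <batch:process-records>
            <batch:step name="transformAndWriteStep">
                <!-- streaming="true" lets the aggregator process the full
                     record set without an upper size limit or loading it
                     entirely into memory -->
                <batch:aggregator streaming="true">
                    <ftps:write config-ref="FTPS_Config" path="output.csv"/>
                </batch:aggregator>
            </batch:step>
        </batch:process-records>
    </batch:job>
</flow>
```

Note that a streaming aggregator trades random access for constant memory use: records can only be iterated once, in order, which is acceptable here since they are simply written out to FTPS.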
