River IQ


Dynamic Allocation in Spark

  Ashish Kumar · Spark · August 26, 2018

Why is Spark faster than MapReduce? Whenever this question comes up, the usual answer is that Spark does in-memory processing of data, or that it makes better, more effective use of YARN resources than MapReduce. Today I will give you a deep dive into Spark resource allocation, both static and dynamic allocation of resources: how and when dynamic allocation gives faster execution and more effective utilization of cluster (YARN) memory.

What is an executor? Before we start talking about stati...
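To make the static-vs-dynamic distinction concrete, here is a minimal sketch of the two styles of `spark-submit` invocation. The application name, jar path, and the specific executor counts and memory sizes are illustrative assumptions, not values from this post; the configuration keys themselves (`spark.dynamicAllocation.*`, `spark.shuffle.service.enabled`) are standard Spark properties.

```shell
# Static allocation: a fixed number of executors is requested up front
# and held for the lifetime of the application, whether busy or idle.
spark-submit \
  --master yarn \
  --num-executors 10 \
  --executor-memory 4g \
  --executor-cores 2 \
  my-app.jar        # illustrative jar name

# Dynamic allocation: Spark scales the executor count up and down with
# the workload, between the configured min and max. On YARN this also
# requires the external shuffle service, so shuffle files outlive
# executors that are removed.
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --executor-memory 4g \
  --executor-cores 2 \
  my-app.jar        # illustrative jar name
```

With the second form, idle executors are released back to YARN after a timeout, which is what makes cluster memory utilization more effective when many applications share the cluster.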
