Definition of parallel processing:
Parallel processing is a method of computation in which two or more processors (CPUs) handle separate parts of an overall task. Breaking a task into parts and distributing them among multiple processors reduces the time it takes to run a program.
Things to take care of while parallel processing:
In parallel processing, it is important to divide the problem into sub-units that are independent of each other, or at least depend on other sub-units as little as possible.
Embarrassingly parallel:
A problem whose sub-units are completely independent of one another is called embarrassingly parallel. In other cases the sub-units need to share some data as part of their operation, and this sharing causes performance issues because of communication latency.
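As a minimal sketch of an embarrassingly parallel workload, the example below uses Python's standard multiprocessing module. The square function and its inputs are illustrative assumptions; the point is that each call is fully independent, so the work can be split across processes with no communication between them.

```python
from multiprocessing import Pool

def square(n):
    # Each sub-unit works only on its own input; no shared state.
    return n * n

if __name__ == "__main__":
    # Distribute independent sub-units across 4 worker processes.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Because no sub-unit needs data from any other, the only overhead here is starting the worker processes and sending the inputs and results back and forth.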
Different ways to handle parallel programs:
Shared memory:
In this method, the sub-tasks use the same memory space, so we don't have to bother about communication-related factors: every sub-task reads from and writes to the same memory locations.
But this gets difficult when multiple processes try to access or change the same memory location at the same time. In such a situation we have to use synchronization techniques to avoid the conflict.
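Here is a minimal sketch of the shared-memory style, assuming a shared integer counter and four workers (both are illustrative choices). It uses multiprocessing.Value for the shared memory and a Lock as the synchronization technique mentioned above.

```python
from multiprocessing import Process, Value, Lock

def add_to_counter(counter, lock, repeats):
    for _ in range(repeats):
        # Without the lock, concurrent read-modify-write on the same
        # memory location could lose updates.
        with lock:
            counter.value += 1

if __name__ == "__main__":
    counter = Value("i", 0)  # an integer living in shared memory
    lock = Lock()
    workers = [Process(target=add_to_counter, args=(counter, lock, 1000))
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(counter.value)  # 4000 with the lock; often less without it
```

The lock serializes access to the shared location, which is exactly the trade-off of shared memory: communication is free, but contended writes must be synchronized.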
Distributed memory:
In this method, each process is completely separate and has its own dedicated memory space. As expected, in this method the communication between processes is the villain! Since the communication happens through a network interface, it is costlier compared to shared memory.
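The sketch below illustrates the distributed-memory style using message passing. As an assumption for the example, multiprocessing.Queue stands in for the network channel: each worker keeps its own private memory and exchanges data only through explicit messages, which is where the communication cost lives.

```python
from multiprocessing import Process, Queue

def worker(task_queue, result_queue):
    n = task_queue.get()       # receive a message (the costly step)
    result_queue.put(n * n)    # send the result back as a message

if __name__ == "__main__":
    tasks, results = Queue(), Queue()
    procs = [Process(target=worker, args=(tasks, results)) for _ in range(4)]
    for p in procs:
        p.start()
    for n in range(4):
        tasks.put(n)           # every piece of data must be shipped explicitly
    print(sorted(results.get() for _ in range(4)))  # [0, 1, 4, 9]
    for p in procs:
        p.join()
```

In a real distributed setup the queues would be replaced by sockets or a library such as MPI, and the latency of each send and receive becomes the main factor to design around.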
***