Many people confuse the terms Concurrency and Parallelism, and this article aims to clear up those confusions and myths.
Some of the common questions people usually ask are:
What are Concurrency and Parallelism?
Concurrency == Parallelism?
Does Concurrency really Increase the Performance?
Can Parallelism exist without Concurrency?
Can Concurrency exist without Parallelism, or can they both exist together?
Are Concurrency and Parallelism possible in a Single Core (CPU) Machine?
Concurrency and Parallelism:
As Rob Pike said:
Concurrency is about dealing with lots of things at once. Parallelism is about doing lots of things at once.
"Dealing" and "Doing" is actually creating confusion 😟. Let's talk in layman terms
Concurrency means executing multiple tasks in overlapping time periods, but not necessarily simultaneously. To understand it better, take the example of a single-core CPU machine.
Let's take the web browser as an example. The OS scheduler schedules its processes, such as the UI process, rendering process, and network process, giving each a time slice of the single CPU. A context switch happens when a process is waiting or blocked, for example on an I/O operation, and the CPU moves on to another process. Context switches are costly; a machine with more cores lets us reduce the number of context switches, which makes the system run faster.
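The time slicing described above can be sketched in Go. This is a minimal illustration, not browser code: the task names and step counts are invented, and `GOMAXPROCS(1)` pins the runtime to one OS thread so the goroutines interleave (concurrency) but never run in parallel.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
	"sync/atomic"
)

// runTasks launches one goroutine per task name on a single OS thread
// (GOMAXPROCS(1)), so the goroutines interleave on a time-slice basis
// instead of running in parallel. It returns the total number of steps
// executed across all tasks.
func runTasks(names []string, steps int) int64 {
	runtime.GOMAXPROCS(1) // mimic a single-core CPU

	var done int64
	var wg sync.WaitGroup
	for _, name := range names {
		wg.Add(1)
		go func(task string) {
			defer wg.Done()
			for i := 0; i < steps; i++ {
				fmt.Printf("%s: step %d\n", task, i)
				atomic.AddInt64(&done, 1)
				runtime.Gosched() // yield the CPU, like an OS context switch
			}
		}(name)
	}
	wg.Wait()
	return done
}

func main() {
	runTasks([]string{"UI", "Rendering", "Network"}, 3)
}
```

Run it and you will see the three "processes" interleave their steps on the single thread, which is concurrency without parallelism.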
Still not clear? Let's take a real-world example. Suppose you are jogging on a nice morning and your shoelace comes untied; to tie the lace you need to stop jogging, right? You cannot do these two tasks at the same instant. You finish one and then the other, but the order of execution is not important.
Parallelism:
Parallelism is about doing a lot of things at once. Parallelism requires hardware with multiple processing units, essentially. In a single-core CPU, you may get concurrency but NOT parallelism.
Let's take the same web browser example: now all four processes run on four separate cores, which makes things run faster.
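A sketch of what parallelism buys us: the task below is split into independent chunks that share no state, so on a multi-core machine the Go runtime can run the chunks on different cores at the same time. The workload (summing a slice) and the chunking scheme are invented for illustration.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelSum splits the slice into one chunk per worker and sums each
// chunk in its own goroutine. Each worker writes only to its own slot
// of partial, so the workers are fully independent computations.
func parallelSum(nums []int, workers int) int {
	if workers < 1 {
		workers = 1
	}
	chunk := (len(nums) + workers - 1) / workers

	var wg sync.WaitGroup
	partial := make([]int, workers) // one slot per worker: no shared state
	for w := 0; w < workers; w++ {
		lo, hi := w*chunk, (w+1)*chunk
		if lo > len(nums) {
			lo = len(nums)
		}
		if hi > len(nums) {
			hi = len(nums)
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			for _, n := range nums[lo:hi] {
				partial[w] += n
			}
		}(w, lo, hi)
	}
	wg.Wait()

	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	nums := make([]int, 1000)
	for i := range nums {
		nums[i] = i + 1
	}
	// One worker per CPU core; on a single-core machine this degrades
	// gracefully to concurrent (but not parallel) execution.
	fmt.Println(parallelSum(nums, runtime.NumCPU())) // 500500
}
```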
So if parallelism is all we need, why think about concurrency at all?
Because Parallelism comes with a cost 💰💰💰💰
You need hardware with multiple processing units, essentially
You need to split the tasks into independently executable computations (functions) with no interdependencies
Why is achieving concurrency hard?
You have to make sure the program is split into independently executable functions
Threads (goroutines) share the same memory, so there is a high chance of race conditions and deadlocks. Thread safety (locks, mutexes) comes with a performance penalty
Testing concurrent code is hard
Concurrency and Parallelism existence:
There are a lot of questions about whether Concurrency and Parallelism can exist on their own and in which combinations. The table below explains the possibilities.
| Concurrency | Parallelism | Meaning |
| --- | --- | --- |
| ✅ | ✅ | An application that can run multiple tasks concurrently on a multi-core CPU machine. |
| ❌ | ✅ | An application works on only one task at a time, and this task is broken down into subtasks that can be processed in parallel. However, each task/subtask is completed before the next task is split up and executed in parallel. |
| ✅ | ❌ | An application runs more than one task at the same time, but no two tasks are executed at the same instant. A single-core CPU machine is a classic example of this. |
| ❌ | ❌ | An application processes all tasks one at a time, sequentially. |
I am going to talk more about concurrency in my next articles — stay tuned to hangoutdude.