How do conventional locks protect from parallel access in Java?

One of the first things we all learn about concurrency in Java is that we use locks (the synchronized keyword, or the Lock/ReadWriteLock interfaces) to protect against concurrent access. For example:

synchronized void eat() {
    // some code
}

In theory, this method eat() can be executed by only one thread at a time. Even if there are many threads waiting to execute it, only one will take the lock. But then comes parallelism, which made me think twice about what I just said.
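For instance, a small experiment like the following (a minimal sketch; the class and counter names are made up for illustration) shows the lock doing its job: an AtomicInteger counts how many threads are inside eat() at the same moment, and that count never exceeds 1.

import java.util.concurrent.atomic.AtomicInteger;

public class SynchronizedDemo {
    // Tracks how many threads are inside eat() at the same moment.
    private static final AtomicInteger insideCount = new AtomicInteger();

    static synchronized void eat() {
        int concurrent = insideCount.incrementAndGet();
        if (concurrent > 1) {
            // Never prints: the monitor lock admits one thread at a time.
            System.out.println("More than one thread inside eat()!");
        }
        try {
            Thread.sleep(10); // simulate some work while holding the lock
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        insideCount.decrementAndGet();
    }

    public static void main(String[] args) throws InterruptedException {
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> {
                for (int j = 0; j < 100; j++) {
                    eat();
                }
            });
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println("Done - eat() was never executed by two threads at once.");
    }
}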

I have a 4-core CPU, which means I can run 4 tasks in parallel. Yes, a lock can be held by only a single thread. But could it happen that 4 threads call the method eat() and take the lock at LITERALLY the same time, even though there is a lock that must be acquired before doing anything?

Can something like that even happen in Java? I guess it can't, but I had to ask. And how does Java deal with the case I just described?


Answer

…4 threads…take a lock at the LITERALLY same time…

Can’t happen. Any “synchronized” operation (e.g., “take a lock”) must operate on the system’s main memory, and in any conventional computer system, there is only one memory bus. It is physically impossible for more than one CPU to access the main memory at the same time.

If two CPUs decide to access the memory at literally the same time, the hardware guarantees that one of them will “win” the race and go first, while the other one is forced to wait its turn.
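To make that a bit more concrete: acquiring a lock ultimately comes down to an atomic read-modify-write instruction that the hardware arbitrates, so only one CPU can win it. As a rough sketch (this is not the JVM's actual monitor implementation, just an illustration built on AtomicBoolean), a simple spin lock shows the "exactly one winner" behaviour:

import java.util.concurrent.atomic.AtomicBoolean;

public class SpinLock {
    // false = unlocked, true = locked
    private final AtomicBoolean locked = new AtomicBoolean(false);

    public void lock() {
        // compareAndSet is backed by an atomic instruction (e.g. LOCK CMPXCHG on x86).
        // If several CPUs attempt it at the same instant, the hardware serializes them:
        // exactly one CAS sees 'false' and wins; the others see 'true' and spin.
        while (!locked.compareAndSet(false, true)) {
            Thread.onSpinWait(); // hint to the CPU that we are busy-waiting
        }
    }

    public void unlock() {
        locked.set(false);
    }

    // Tiny demo: 4 threads increment a plain int under the spin lock; the total is always 4000.
    private static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        SpinLock lock = new SpinLock();
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> {
                for (int j = 0; j < 1000; j++) {
                    lock.lock();
                    try {
                        counter++; // protected by the spin lock
                    } finally {
                        lock.unlock();
                    }
                }
            });
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println("counter = " + counter); // always 4000
    }
}

Even if all 4 cores execute compareAndSet at the same instant, the hardware guarantees that exactly one of them flips the flag from false to true; the rest keep spinning until unlock() resets it.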
