Since thread execution happens in a pool and is not guaranteed to queue in any particular order, why would you ever create threads without the protection of synchronization and locks? In order to protect data attached to an object's state (what I understand to be the primary purpose of using threads), locking appears to be the only choice. Eventually you'll end up with race conditions and "corrupted" data if you don't synchronize. So if you're not interested in protecting that data, then why use threads at all?
Delegation, just as one example. Consider a webserver that gets connection requests. It can delegate a particular request to a worker thread. The main thread can pass all the data it wants to the worker thread, as long as that data is immutable, and not have to worry at all about concurrent data access.
(For that matter, the main thread and worker thread can send each other all the immutable data they want; it just requires a messaging queue of some sort, so the queue may need synchronization but not the data itself. And you don't even need a message queue to get data to a worker thread: just construct the data before the thread starts, and as long as the data is immutable at that point, you don't need any synchronization, locks, or concurrency management of any sort, other than the ability to run a thread.)
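A minimal sketch of that idea in Java (the `Request` record and its fields are invented for illustration, and records need Java 16+): the main thread builds an immutable request before starting the worker, so the worker can read it freely without any locking.

```java
import java.util.List;

public class DelegationExample {
    // Immutable request data: record components are final, and the list is defensively copied.
    record Request(String path, List<String> headers) {
        Request {
            headers = List.copyOf(headers); // make the list itself unmodifiable
        }
    }

    public static void main(String[] args) {
        // Constructed before the worker starts; never modified afterwards.
        Request request = new Request("/index.html", List.of("Accept: text/html"));

        // The worker only reads the immutable Request, so no locks or synchronization are needed.
        Thread worker = new Thread(() -> {
            System.out.println("Handling " + request.path());
            request.headers().forEach(System.out::println);
        });
        worker.start();
    }
}
```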
java.util.concurrent.atomic provides for some minimal operations that can be performed in a lock-free and yet thread-safe way. If you can arrange your concurrency entirely around such classes and operations, your performance can be vastly enhanced (as you avoid all the overhead connected with locking). Granted, it's unusual to be working on such a simplifiable problem (more often some locking will be needed), but, if and when you do find yourself in such a situation, well, then, that's exactly the use case you're asking about!-)
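For instance (a toy counter of my own, not part of the original answer), `AtomicLong.incrementAndGet()` performs a thread-safe update without ever acquiring a lock:

```java
import java.util.concurrent.atomic.AtomicLong;

public class AtomicCounterExample {
    public static void main(String[] args) throws InterruptedException {
        AtomicLong hits = new AtomicLong();

        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                hits.incrementAndGet(); // atomic and thread-safe, no synchronized block
            }
        };

        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println(hits.get()); // always 200000, despite the absence of locks
    }
}
```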
Threads allow multiple parallel units of work to progress concurrently. Synchronisation is simply there to protect shared resources from unsafe access; if it's not needed, you don't use it.
Processing on a thread is delayed when it accesses certain resources, such as IO, and it can be desirable to keep the CPU working on other units of work while some are blocked.
As in the example in the other answer, listening for service requests may well be a unit of work that is kept independent of responding to a request, as the latter may block due to resource contention - say, disk access or other IO.
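A rough sketch of that overlap (my own example, with `Thread.sleep` standing in for a blocking disk or network call):

```java
public class OverlapIoExample {
    public static void main(String[] args) throws InterruptedException {
        // One unit of work blocks on a slow resource (sleep simulates the IO wait).
        Thread ioBound = new Thread(() -> {
            try {
                Thread.sleep(1000); // pretend this is a blocking read
                System.out.println("IO-bound work finished");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        ioBound.start();

        // Meanwhile the main thread keeps the CPU busy with an independent unit of work.
        long sum = 0;
        for (int i = 0; i < 50_000_000; i++) {
            sum += i;
        }
        System.out.println("CPU-bound work finished: " + sum);

        ioBound.join();
    }
}
```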
If there's no shared mutable data, there's no need for synchronization or locks.
Synchronization and locks protect shared state from conflicting concurrent updates. If there is no shared state to protect, you can run multiple threads without locking or synchronization. This might be the case in a web server where each worker thread serves its own incoming requests, independently of the others (see the sketch below). Another way to avoid synchronization and locking is to have your threads operate only on immutable shared state: if a thread can't alter any data that another thread is operating on, concurrent unsynchronized access is fine.
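For example (a hypothetical sketch, not tied to any particular server), each task below touches only its own local variables, so nothing is shared and nothing needs locking even though many tasks run at once:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class IndependentWorkersExample {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int id = 1; id <= 8; id++) {
            final int requestId = id;
            pool.submit(() -> {
                // All state here is local to the task: no shared mutable data, no locks.
                StringBuilder response = new StringBuilder();
                response.append("response for request ").append(requestId);
                System.out.println(Thread.currentThread().getName() + ": " + response);
            });
        }
        pool.shutdown();
    }
}
```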
Or you might be using an Actor-based system to handle concurrency. Actors communicate by message passing only, so there is no shared state for them to worry about. Here you can have many threads running many Actors without locks. Erlang uses this approach, and there is a Scala Actors library that allows you to program this way on the JVM. In addition, there are Actor-based libraries for Java.
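Those libraries have their own APIs; as a library-agnostic sketch of the same message-passing style, a `BlockingQueue` can serve as a mailbox, and the receiving thread's state is never shared with anyone else:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class MessagePassingExample {
    public static void main(String[] args) throws InterruptedException {
        // The queue is the only shared structure, and it handles its own synchronization.
        BlockingQueue<String> mailbox = new LinkedBlockingQueue<>();

        // The "actor": owns its counter privately and reacts only to messages from its mailbox.
        Thread actor = new Thread(() -> {
            int processed = 0; // private state, never touched by another thread
            try {
                while (true) {
                    String message = mailbox.take();
                    if (message.equals("STOP")) break;
                    processed++;
                    System.out.println("processed " + message + " (total " + processed + ")");
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        actor.start();

        mailbox.put("job-1");
        mailbox.put("job-2");
        mailbox.put("STOP");
        actor.join();
    }
}
```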
> In order to protect data attached to an object's state (*what I understand to be the primary purpose of using threads*), locking appears to be the only choice. ... So if you're not interested in protecting that data, then why use threads at all?
The highlighted bit of your question is incorrect, and since it is the root cause of your "doubts" about threads, it needs to be addressed explicitly.
In fact, the primary purpose for using threads is to allow tasks to proceed in parallel where possible. On a multiprocessor, the parallelism will (all other things being equal) speed up your computations. But there are other benefits that apply on a uniprocessor as well. The most obvious one is that threads allow an application to do work while waiting for an IO operation to complete.
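As a toy illustration of the parallelism point (invented for this answer), two threads can each sum half of an array; on a multiprocessor the two halves genuinely run at the same time, and no locking is needed because each thread writes only its own result slot:

```java
import java.util.stream.LongStream;

public class ParallelSumExample {
    public static void main(String[] args) throws InterruptedException {
        long[] data = LongStream.rangeClosed(1, 10_000_000).toArray();
        long[] partial = new long[2]; // each thread writes only its own slot

        Thread first = new Thread(() -> {
            for (int i = 0; i < data.length / 2; i++) partial[0] += data[i];
        });
        Thread second = new Thread(() -> {
            for (int i = data.length / 2; i < data.length; i++) partial[1] += data[i];
        });

        first.start();
        second.start();
        first.join();  // join() also gives the happens-before guarantee needed to read the results safely
        second.join();

        System.out.println(partial[0] + partial[1]); // 50000005000000
    }
}
```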
Threads don't actually protect object state in any meaningful way. The protection you are attributing to threads comes from:
- declaring members with the right access,
- hiding state behind getters / setters,
- correct use of synchronization,
- use of the Java security framework, and/or
- sending requests to other servers / services.
You can do all of these independently of threading.
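For instance (a made-up `Counter` class, purely illustrative), the protection below comes from the private field, the accessor methods, and the correct use of `synchronized`; creating threads by itself provides none of this:

```java
public class Counter {
    private int value; // private access: callers can only go through the methods below

    // Correct use of synchronization guards the shared state, not the thread.
    public synchronized void increment() {
        value++;
    }

    public synchronized int get() {
        return value;
    }
}
```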
There are other kinds of protection for shared data. Maybe you have atomic sections, monitors, software transactional memory, or lock-free data structures. All these ideas support parallel execution without explicit locking. You can Google any of these terms and learn something interesting. If your primary interest is Java, look up Tim Harris's work.
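As one concrete JDK example of a lock-free data structure (my example, not part of the original answer), `java.util.concurrent.ConcurrentLinkedQueue` is a non-blocking queue that multiple threads can use safely without any explicit locks:

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LockFreeQueueExample {
    public static void main(String[] args) throws InterruptedException {
        Queue<Integer> queue = new ConcurrentLinkedQueue<>(); // non-blocking, CAS-based queue

        Runnable producer = () -> {
            for (int i = 0; i < 1_000; i++) {
                queue.offer(i); // safe from many threads at once, no synchronized blocks
            }
        };

        Thread p1 = new Thread(producer);
        Thread p2 = new Thread(producer);
        p1.start();
        p2.start();
        p1.join();
        p2.join();

        System.out.println(queue.size()); // 2000
    }
}
```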