ArrayBlockingQueue is the blocking queue backed by an array. Internally it maintains a fixed-length array to buffer the elements of the queue, plus two integer indices that mark the positions of the queue's head and tail within the array.
ArrayBlockingQueue uses a single lock object both when producers insert data and when consumers take data, so the two sides can never truly run in parallel. When creating an ArrayBlockingQueue, we can also choose whether its internal lock is fair; a non-fair lock is used by default.
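A minimal sketch of the bounded behavior and the fairness flag described above; the capacity (3) and element values here are illustrative, not from the original text:

```java
import java.util.concurrent.ArrayBlockingQueue;

public class ArrayQueueDemo {
    public static int drain() throws InterruptedException {
        // The second constructor argument selects a fair internal lock;
        // the single-argument constructor uses a non-fair lock by default.
        ArrayBlockingQueue<Integer> queue = new ArrayBlockingQueue<>(3, true);
        queue.put(1);            // put() blocks only when the queue is full
        queue.put(2);
        int sum = 0;
        while (!queue.isEmpty()) {
            sum += queue.take(); // take() blocks only when the queue is empty
        }
        return sum;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(drain()); // prints 3
    }
}
```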
Judging by the implementation principle alone, ArrayBlockingQueue could use two separate locks and thereby allow producer and consumer operations to run fully in parallel, but its actual implementation does not do so.
LinkedBlockingQueue, the blocking queue based on a linked list, likewise maintains an internal data buffer (built from a linked list). When a producer puts data into the queue, the queue takes the data from the producer, caches it internally, and the producer returns immediately. Only when the buffer reaches its maximum capacity (LinkedBlockingQueue lets you specify this value through the constructor) does the producer block, and the producer thread is not woken until a consumer removes an element from the queue. Processing on the consumer side works symmetrically.
Producers and consumers are synchronized with two independent locks, which means that under high concurrency they can operate on the queue's data in parallel, improving the concurrent throughput of the whole queue.
Another notable difference between ArrayBlockingQueue and LinkedBlockingQueue is that the former creates and destroys no extra object instances when inserting or removing elements, while the latter allocates an additional node object for every insertion. In a system that must process large volumes of data concurrently over a long period, this makes a real difference to GC pressure. Also note that if no capacity is specified, LinkedBlockingQueue defaults to an effectively unbounded capacity (Integer.MAX_VALUE). In that case, if producers outpace consumers, the system may run out of memory before the queue ever fills.
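To avoid the unbounded-capacity pitfall just described, the capacity can be passed explicitly. A small sketch (the capacity of 2 and the item count are illustrative) showing a producer being throttled by a bounded LinkedBlockingQueue:

```java
import java.util.concurrent.LinkedBlockingQueue;

public class LinkedQueueDemo {
    public static int produceAndConsume() throws InterruptedException {
        // Explicit capacity: a fast producer blocks instead of growing the heap.
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>(2);
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    queue.put("item-" + i); // blocks once 2 items are buffered
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        int consumed = 0;
        for (int i = 0; i < 5; i++) {
            queue.take(); // each take frees a slot, waking the producer
            consumed++;
        }
        producer.join();
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(produceAndConsume()); // prints 5
    }
}
```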
ArrayBlockingQueue and LinkedBlockingQueue are the two most common and widely used blocking queues. Generally speaking, these two classes are sufficient for handling producer-consumer problems between multiple threads.
Elements in a DelayQueue can be taken from the queue only when their specified delay has expired. DelayQueue is unbounded, so inserting data (the producer side) never blocks; only retrieving data (the consumer side) can block.
DelayQueue holds objects that implement the Delayed interface, and these objects can be removed from the queue only once they expire. The queue is ordered: the element at the head is the one whose delay expires soonest. Note that null elements cannot be placed in this queue.
Delayed is a mix-in style interface for marking objects that should be acted upon after a given delay. Delayed extends the Comparable interface, with the remaining delay as the basis for comparison. An implementing class's getDelay method should return the remaining delay in the requested time unit; once it returns zero or a negative value, the element has expired. Internally, DelayQueue is implemented on top of PriorityQueue.
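A sketch of a Delayed implementation following the contract above: getDelay reports the remaining delay and compareTo orders elements by expiry. The class and field names (Task, expireAtMillis) and the delay values are illustrative:

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayDemo {
    static class Task implements Delayed {
        final String name;
        final long expireAtMillis; // absolute time at which the delay expires

        Task(String name, long delayMillis) {
            this.name = name;
            this.expireAtMillis = System.currentTimeMillis() + delayMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            // Remaining delay; zero or negative means "expired".
            return unit.convert(expireAtMillis - System.currentTimeMillis(),
                                TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            // Order by remaining delay so the soonest-expiring element is the head.
            return Long.compare(getDelay(TimeUnit.MILLISECONDS),
                                other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static String firstExpired() throws InterruptedException {
        DelayQueue<Task> queue = new DelayQueue<>();
        queue.put(new Task("late", 200));
        queue.put(new Task("early", 50));
        // take() blocks until the head element's delay has elapsed.
        return queue.take().name;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(firstExpired()); // prints "early"
    }
}
```

Even though "late" was inserted first, "early" comes out first, because the queue is ordered by expiry time rather than insertion order.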
Consider the following scenario:
A naive approach is to use a background thread to traverse all the objects and check them one by one. This clumsy method is simple and easy to use, but when the number of objects is large it can cause performance problems, and the check interval is hard to tune: too large an interval hurts accuracy and timeliness, while too small an interval wastes CPU on repeated scans. It also cannot process objects in the chronological order of their timeouts.
In this scenario, DelayQueue is the best fit. For details, see the study notes on delay queues: "Exquisite and easy-to-use DelayQueue".
PriorityBlockingQueue is a blocking queue based on priority (the priority ordering is determined by the Comparator passed to the constructor). Note that PriorityBlockingQueue never blocks data producers; it blocks only data consumers, and only when there is no data to consume.
In use, if producers generate data faster than consumers consume it, long-running operation may eventually exhaust all available heap memory. In the implementation of PriorityBlockingQueue, the lock that internally controls thread synchronization is a fair lock.
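A sketch of the Comparator-driven ordering described above; the reverse-order comparator and the values are illustrative:

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityQueueDemo {
    public static int highestFirst() throws InterruptedException {
        // The Comparator passed to the constructor defines the priority order;
        // here, larger numbers come out first.
        PriorityBlockingQueue<Integer> queue =
            new PriorityBlockingQueue<>(10, Comparator.reverseOrder());
        queue.put(3); // put() never blocks: the queue grows as needed
        queue.put(7);
        queue.put(5);
        return queue.take(); // take() blocks only when the queue is empty
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(highestFirst()); // prints 7
    }
}
```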
SynchronousQueue is a queue that holds no elements internally. A thread inserting an element into the queue blocks until another thread takes that element from the queue. Likewise, if a thread tries to take an element while no thread is currently inserting one, it blocks until some thread inserts an element into the queue.
A SynchronousQueue can be declared in either fair mode or non-fair mode, with the following differences:
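Regardless of mode, the basic hand-off behavior can be sketched as follows; the constructor's boolean argument selects fair (FIFO) mode, and the string value is illustrative:

```java
import java.util.concurrent.SynchronousQueue;

public class SyncQueueDemo {
    public static String handOff() throws InterruptedException {
        // true selects fair mode: waiting threads are served in FIFO order.
        SynchronousQueue<String> queue = new SynchronousQueue<>(true);
        Thread producer = new Thread(() -> {
            try {
                queue.put("hello"); // blocks until a consumer takes the element
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        String value = queue.take(); // blocks until the producer's put arrives
        producer.join();
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(handOff()); // prints "hello"
    }
}
```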
Reference: Java Multithreading-Tools-Blocking Queues
12. Synchronization queue