Beta 2 Brings Refinements to .NET’s Coordination Data Structures Library

Coordination Data Structures (CDS) is designed both to be used directly and to act as the building blocks for more complex concurrency frameworks. It includes advanced synchronization tools like the Barrier, several thread-safe collections, and a couple of different ways to create futures.

The Barrier class is used to create synchronization points for multi-phase operations. Before it is used, the Barrier needs to know how many threads will be participating. As each thread reaches a checkpoint, it calls SignalAndWait. This blocks the thread until every other participant has called the method, at which point all of them are released at once. The cycle can be repeated multiple times, with each iteration incrementing the CurrentPhaseNumber property. Monitoring code can check both the number of participating threads and how many have not yet reached the current checkpoint at any time. Since CurrentPhaseNumber is stored as an Int64, each Barrier can support up to 9,223,372,036,854,775,807 phases. (Previous betas stored the phase number as an Int32 and were thus limited to about 4 billion phases.)
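
The following is a minimal sketch of the multi-phase pattern described above, based on the Barrier API in System.Threading as it ships in .NET 4; the worker count, phase count, and console messages are illustrative only.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class BarrierDemo
{
    static void Main()
    {
        const int workerCount = 3;

        // The Barrier is told up front how many participants it has. The
        // optional post-phase action runs once per phase, after every
        // participant has signaled and before any of them is released.
        using (var barrier = new Barrier(workerCount,
            b => Console.WriteLine("Phase {0} complete", b.CurrentPhaseNumber)))
        {
            var workers = new Task[workerCount];
            for (int i = 0; i < workerCount; i++)
            {
                int id = i;
                workers[i] = Task.Factory.StartNew(() =>
                {
                    for (int phase = 0; phase < 3; phase++)
                    {
                        Console.WriteLine("Worker {0} finished phase {1}", id, phase);
                        // Blocks until all other participants reach this point.
                        barrier.SignalAndWait();
                    }
                });
            }

            // (barrier.ParticipantCount and barrier.ParticipantsRemaining
            // expose the counts mentioned above for monitoring code.)
            Task.WaitAll(workers);
        }
    }
}
```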

The BlockingCollection class is used for producer-consumer scenarios. In its simplest mode it acts as a thread-safe queue in which consumers block while the queue is empty. To keep the queue from growing too large, a maximum capacity can be set; when it is reached, producers block until consumers have a chance to pull some items out. Once they are done filling the BlockingCollection, producers can mark it as complete. This prevents further items from being added and causes any blocked consumers to be released.
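
As a rough illustration of the bounded producer-consumer mode, here is a sketch using the BlockingCollection members that shipped in System.Collections.Concurrent (Add, CompleteAdding, and GetConsumingEnumerable); the capacity of 10 and the item count are arbitrary.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        // A bounded capacity of 10 blocks producers once the queue is full.
        var queue = new BlockingCollection<int>(10);

        var producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 100; i++)
                queue.Add(i);        // blocks while the collection is at capacity
            queue.CompleteAdding();  // no more items; wakes any blocked consumers
        });

        var consumer = Task.Factory.StartNew(() =>
        {
            // Blocks while the collection is empty and completes once the
            // collection has been marked complete and drained.
            foreach (int item in queue.GetConsumingEnumerable())
                Console.WriteLine(item);
        });

        Task.WaitAll(producer, consumer);
    }
}
```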

The BlockingCollection doesn’t have to be used by itself; several BlockingCollections can be used together as a group. In this mode, producers and consumers indicate that they want to insert items into, or remove items from, any BlockingCollection in the group without caring which particular one is chosen. Combined with the maximum-capacity option, this provides a simple form of load balancing among the collections.
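
A sketch of that grouped mode, assuming the static AddToAny and TakeFromAny helpers on BlockingCollection; the two bounded queues and the work-item string are invented for the example.

```csharp
using System;
using System.Collections.Concurrent;

class LoadBalancingDemo
{
    static void Main()
    {
        // Two bounded queues; callers don't care which one handles an item.
        var queues = new[]
        {
            new BlockingCollection<string>(5),
            new BlockingCollection<string>(5)
        };

        // Adds to one of the collections that has room, blocking only if all
        // are full, and returns the index of the collection that accepted it.
        int target = BlockingCollection<string>.AddToAny(queues, "work item");
        Console.WriteLine("Item went to queue {0}", target);

        // Takes from whichever collection has an item available.
        string item;
        int source = BlockingCollection<string>.TakeFromAny(queues, out item);
        Console.WriteLine("Took '{0}' from queue {1}", item, source);
    }
}
```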

The ConcurrentDictionary class supports atomic adds and updates. To support this, delegates may be passed to the GetOrAdd and AddOrUpdate methods. If the key doesn’t exist in the collection, the add delegate is called to produce the value. If the key does exist, GetOrAdd returns the stored value, while AddOrUpdate passes it to the update delegate.
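
A short example of the two methods, here used to build a hypothetical thread-safe word count; note that under contention the delegates may be invoked more than once, so they should be side-effect free.

```csharp
using System;
using System.Collections.Concurrent;

class WordCountDemo
{
    static void Main()
    {
        var counts = new ConcurrentDictionary<string, int>();

        // GetOrAdd: returns the existing value, or runs the add delegate to
        // create one when the key is missing.
        int initial = counts.GetOrAdd("apple", key => 0);
        Console.WriteLine(initial); // 0

        // AddOrUpdate: runs the add delegate for a missing key, or passes the
        // current value to the update delegate for an existing key.
        counts.AddOrUpdate("apple", key => 1, (key, existing) => existing + 1);
        counts.AddOrUpdate("apple", key => 1, (key, existing) => existing + 1);

        Console.WriteLine(counts["apple"]); // 2
    }
}
```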

There were plans for a concurrent linked list, but it was cut in Beta 2. Stated plainly, the team was not able to find the right mix of performance and usability to justify the class. Joshua Phillips writes,

In each and every software professional’s career there comes a point where he or she might have to swallow their pride and let a creation that they love go. For some reason or another, their fancy invention just ultimately, doesn’t provide enough value to justify its existence. Now I know we went and got you all excited about ConcurrentLinkedList<T> in Beta 1 but we had to let it go (we did warn you though!). Unfortunately, with the time we had available we just couldn’t get it to be usable and perform well. There seem to be many thread-safe linked list implementations that are very scalable but usually that scalability is based on some assumption or odd caveat in the design that severely degrades the practicality of the type. It hurts us deeply to take CLL<T> out but its performance just wasn’t good enough to ship it. No need to send any flowers.

For lazily evaluated functions, there are now two options. If you want a future that can be passed around but not evaluated unless needed, you can use the Lazy class. This class wraps a function that is guaranteed to be executed once and only once, when the Value property is first accessed. Subsequent calls to Value return the cached result without re-executing the wrapped function.
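
A brief sketch of Lazy's once-only behavior; the string payload and console output are purely illustrative.

```csharp
using System;

class LazyDemo
{
    static void Main()
    {
        // Nothing is executed here; only the wrapper is created.
        var setting = new Lazy<string>(() =>
        {
            Console.WriteLine("Loading...");
            return "expensive value";
        });

        Console.WriteLine(setting.IsValueCreated); // False

        // The first access to Value runs the factory exactly once...
        Console.WriteLine(setting.Value);          // prints "Loading..." then the value

        // ...and later accesses return the cached result without re-running it.
        Console.WriteLine(setting.Value);
    }
}
```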

The second option is found in the LazyInitializer static class. Its EnsureInitialized method is a lightweight way to initialize values by calling a delegate if and only if the target variable is currently null. While the target variable is guaranteed to be assigned only once, the delegate may be called by multiple concurrent threads unless a synchronization object is passed in.
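
A sketch showing both flavors of EnsureInitialized, based on the overloads that shipped in .NET 4; the field names and placeholder values are made up for the example.

```csharp
using System;
using System.Threading;

class LazyInitializerDemo
{
    static string _simple;            // target for the lock-free overload
    static string _locked;            // target for the locking overload
    static bool _lockedInitialized;
    static object _syncLock;          // allocated on demand if left null

    static void Main()
    {
        // Lock-free overload: the delegate may run on several concurrent
        // threads, but only one result is ever published to the field.
        string a = LazyInitializer.EnsureInitialized(ref _simple,
            () => "expensive resource");

        // Locking overload: passing a synchronization object (and a flag)
        // guarantees the delegate runs at most once.
        string b = LazyInitializer.EnsureInitialized(ref _locked,
            ref _lockedInitialized, ref _syncLock,
            () => "expensive resource");

        Console.WriteLine(a);
        Console.WriteLine(b);
    }
}
```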
