(Post 16/11/2007) If your application has lots
of threads that spend most of their time blocked on a Wait Handle, you
can reduce the resource burden via thread pooling. A thread pool economizes
by coalescing many Wait Handles onto a few threads.
To use the thread pool, you register a Wait Handle along
with a delegate to be executed when the Wait Handle is signaled. This
is done by calling ThreadPool.RegisterWaitForSingleObject, as in this
example:
using System;
using System.Threading;

class Test {
  static ManualResetEvent starter = new ManualResetEvent (false);

  public static void Main() {
    ThreadPool.RegisterWaitForSingleObject (starter, Go, "hello", -1, true);
    Thread.Sleep (5000);
    Console.WriteLine ("Signaling worker...");
    starter.Set();
    Console.ReadLine();
  }

  public static void Go (object data, bool timedOut) {
    Console.WriteLine ("Started " + data);
    // Perform task...
  }
}
(5 second delay)
Signaling worker...
Started hello
In addition to the Wait Handle and delegate, RegisterWaitForSingleObject
accepts a "black box" object which it passes to your delegate
method (rather like with a ParameterizedThreadStart), as well as a timeout
in milliseconds (-1 meaning no timeout) and a boolean flag indicating
whether the request is one-off rather than recurring.
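To illustrate the last two parameters, here's a small sketch (class and method names hypothetical) where the registration is recurring with a one-second timeout, so the callback fires both when the handle is signaled and when the timeout elapses. The RegisteredWaitHandle returned by the call is what you later use to cancel the registration:

```csharp
using System;
using System.Threading;

class RecurringDemo {
  static AutoResetEvent signal = new AutoResetEvent (false);

  public static void Main() {
    // Final argument false = recurring: the callback keeps firing each
    // time the handle is signaled, or each time the 1000 ms timeout lapses.
    RegisteredWaitHandle reg = ThreadPool.RegisterWaitForSingleObject (
      signal, Tick, null, 1000, false);

    signal.Set();                 // fires Tick with timedOut == false
    Thread.Sleep (2500);          // long enough for the timeout to fire Tick too
    reg.Unregister (null);        // cancels the recurring registration
  }

  static void Tick (object data, bool timedOut) {
    Console.WriteLine (timedOut ? "Timed out" : "Signaled");
  }
}
```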
All pooled threads are background threads, meaning they
terminate automatically when the application's foreground thread(s) end.
However, if one wanted to wait until any important jobs running on pooled
threads completed before exiting an application, calling Join on the threads
would not be an option, since pooled threads never finish! The idea is
that they are instead recycled, and end only when the parent process terminates.
So in order to know when a job running on a pooled thread has finished,
one must signal – for instance, with another Wait Handle.
Calling Abort on a pooled
thread is a bad idea: the threads need to be recycled for the life
of the application domain.
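A minimal sketch of such signaling (names hypothetical): the pooled job sets a ManualResetEvent when it finishes, and the main thread blocks on WaitOne rather than Join:

```csharp
using System;
using System.Threading;

class DoneDemo {
  static ManualResetEvent done = new ManualResetEvent (false);

  public static void Main() {
    ThreadPool.QueueUserWorkItem (Work);
    done.WaitOne();               // blocks until the pooled job signals
    Console.WriteLine ("Job finished");
  }

  static void Work (object state) {
    // Perform task...
    done.Set();                   // tells the main thread we're done
  }
}
```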
You can also use the thread pool without a Wait Handle
by calling the QueueUserWorkItem method – specifying a delegate for immediate
execution. You don't then get the saving of sharing threads amongst multiple
jobs, but do get another benefit: the thread pool keeps a lid on the total
number of threads (25, by default), automatically enqueuing tasks when
the job count goes above this. It's rather like an application-wide producer-consumer
queue with 25 consumers! In the following example, 100 jobs are enqueued
to the thread pool, of which 25 execute at a time. The main thread then
waits until they're all complete using Wait and Pulse:
using System;
using System.Threading;

class Test {
  static object workerLocker = new object ();
  static int runningWorkers = 100;

  public static void Main() {
    for (int i = 0; i < runningWorkers; i++) {
      ThreadPool.QueueUserWorkItem (Go, i);
    }
    Console.WriteLine ("Waiting for threads to complete...");
    lock (workerLocker) {
      while (runningWorkers > 0) Monitor.Wait (workerLocker);
    }
    Console.WriteLine ("Complete!");
    Console.ReadLine();
  }

  public static void Go (object instance) {
    Console.WriteLine ("Started: " + instance);
    Thread.Sleep (1000);
    Console.WriteLine ("Ended: " + instance);
    lock (workerLocker) {
      runningWorkers--;
      Monitor.Pulse (workerLocker);
    }
  }
}
In order to pass more than a single object to the target
method, one can either define a custom object with all the required properties,
or call via an anonymous method. For instance, if the Go method accepted
two integer parameters, it could be started as follows:
ThreadPool.QueueUserWorkItem
(delegate (object notUsed) { Go (23, 34); });
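The custom-object alternative might look like this sketch (the WorkItem type and field names are hypothetical): the two values are bundled into one object, passed as the state argument, and cast back inside the target method:

```csharp
using System;
using System.Threading;

class WorkItem {
  public int X, Y;                // the two values Go needs
}

class Test {
  static ManualResetEvent done = new ManualResetEvent (false);

  public static void Main() {
    ThreadPool.QueueUserWorkItem (Go, new WorkItem { X = 23, Y = 34 });
    done.WaitOne();               // wait for the pooled job to finish
  }

  static void Go (object data) {
    WorkItem item = (WorkItem) data;   // unbundle the parameters
    Console.WriteLine (item.X + item.Y);
    done.Set();
  }
}
```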
Another way into the thread pool is via asynchronous
delegates.
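To sketch that briefly (delegate name hypothetical; this is the classic .NET Framework pattern): BeginInvoke runs the delegate on a pooled thread, and EndInvoke blocks until it completes and retrieves the result:

```csharp
using System;

class AsyncDelegateDemo {
  delegate int Square (int x);

  public static void Main() {
    Square s = delegate (int x) { return x * x; };
    // BeginInvoke queues the delegate to the thread pool;
    // EndInvoke waits for completion and returns its result.
    IAsyncResult ar = s.BeginInvoke (7, null, null);
    int result = s.EndInvoke (ar);
    Console.WriteLine (result);
  }
}
```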
(Collected)