How to configure concurrency in .NET Core Web API?
In the old WCF days, you had control over service concurrency via the MaxConcurrentCalls
setting. MaxConcurrentCalls
defaulted to 16 concurrent calls, but you could raise or lower that value based on your needs.
How do you control server-side concurrency in .NET Core Web API? We probably need to limit it in our case, as too many concurrent requests can degrade overall server performance.
Concurrency in an ASP.NET Core application is handled by its web server. For example:
Kestrel
var host = new WebHostBuilder()
    .UseKestrel(options => options.ThreadCount = 8)
    .UseStartup<Startup>()
    .Build();
It is not recommended to set the Kestrel thread count to a large value such as 1,000: Kestrel's request handling is asynchronous, so a small pool of threads can serve many concurrent requests.
More info: Is Kestrel using a single thread for processing requests like Node.js?
A new Limits
property was introduced in ASP.NET Core 2.0 Preview 2.
You can now add limits for the following:
- Maximum Client Connections
- Maximum Request Body Size
- Maximum Request Body Data Rate
For example:
.UseKestrel(options =>
{
    options.Limits.MaxConcurrentConnections = 100;
})
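The other limits in the list can be set in the same callback. A sketch with illustrative values (MaxRequestBodySize and MinRequestBodyDataRate are the corresponding KestrelServerLimits properties; MinDataRate lives in Microsoft.AspNetCore.Server.Kestrel.Core):

```csharp
var host = new WebHostBuilder()
    .UseKestrel(options =>
    {
        options.Limits.MaxConcurrentConnections = 100;        // max concurrent client connections
        options.Limits.MaxRequestBodySize = 10 * 1024 * 1024; // 10 MB request body cap
        options.Limits.MinRequestBodyDataRate = new MinDataRate(
            bytesPerSecond: 100, gracePeriod: TimeSpan.FromSeconds(10));
    })
    .UseStartup<Startup>()
    .Build();
```

Note that the data-rate limit is expressed as a *minimum* rate: clients sending the request body slower than this are disconnected after the grace period.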
IIS
When Kestrel runs behind a reverse proxy, you can tune the proxy itself. For example, you can configure the IIS application pool in web.config or in aspnet.config:
<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>
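Independently of the web server, you can also cap concurrency in the application itself. A sketch of a middleware that gates requests with a SemaphoreSlim (the class name and the limit of 100 are illustrative, not a built-in API):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class ConcurrencyLimitMiddleware
{
    // Illustrative cap: at most 100 requests in flight at once.
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(100);
    private readonly RequestDelegate _next;

    public ConcurrencyLimitMiddleware(RequestDelegate next) => _next = next;

    public async Task Invoke(HttpContext context)
    {
        // Fail fast with 503 instead of queueing when the server is saturated.
        if (!await Gate.WaitAsync(TimeSpan.Zero))
        {
            context.Response.StatusCode = StatusCodes.Status503ServiceUnavailable;
            return;
        }
        try
        {
            await _next(context);
        }
        finally
        {
            Gate.Release();
        }
    }
}

// Registered in Startup.Configure:
// app.UseMiddleware<ConcurrencyLimitMiddleware>();
```

Whether to reject (as above) or queue waiting requests (WaitAsync with a timeout) depends on your latency requirements.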
Of course, Nginx and Apache have their own concurrency settings.
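For instance, in Nginx the relevant knobs are worker_processes/worker_connections for overall capacity and the limit_conn directive for per-key concurrency caps. A sketch proxying to Kestrel (the zone name, limits, and backend port are illustrative):

```nginx
worker_processes auto;

events {
    worker_connections 1024;  # max concurrent connections per worker process
}

http {
    # Track concurrent connections per client IP ("perip" is an illustrative zone name)
    limit_conn_zone $binary_remote_addr zone=perip:10m;

    server {
        listen 80;
        location / {
            limit_conn perip 20;               # at most 20 concurrent connections per IP
            proxy_pass http://localhost:5000;  # Kestrel backend (assumed port)
        }
    }
}
```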