System.Net.WebClient unreasonably slow

When using the System.Net.WebClient.DownloadData() method, I'm getting unreasonably slow response times.

When fetching a URL with the WebClient class in .NET, it takes around 10 seconds before I get a response, while the same page is fetched by my browser in under a second. And this is with data that's 0.5 kB or smaller in size.

The request involves POST/GET parameters and a user-agent header, in case that could be causing problems.

I haven't (yet) tested whether other ways of downloading data in .NET give me the same problem, but I suspect I might get similar results. (I've always had a feeling web requests in .NET are unusually slow...)

What could be the cause of this?

Edit:
I tried doing the exact same thing using System.Net.HttpWebRequest instead, via the following method, and all requests finish in under a second.

public static string DownloadText(string url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}


While this (old) method using System.Net.WebClient takes 15-30 seconds for each request to finish:

public static string DownloadText(string url)
{
    var client = new WebClient();
    byte[] data = client.DownloadData(url);
    return client.Encoding.GetString(data);
}

Solution 1:

I had that problem with WebRequest. Try setting Proxy = null;

    WebClient wc = new WebClient();
    wc.Proxy = null;

By default, WebClient and WebRequest try to determine which proxy to use from the IE settings, and this auto-detection can add a delay of around 5 seconds before the actual request is sent.

This applies to all classes that use WebRequest, including WCF services with HTTP binding. In general, you can set this globally at application startup:

WebRequest.DefaultWebProxy = null;
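
For example, a minimal sketch of where this would go in a console app (the Program class is hypothetical; the line must run before the first request is created):

    using System.Net;

    class Program
    {
        static void Main()
        {
            // Disable proxy auto-detection for every WebRequest-based client,
            // including WebClient and HTTP-bound WCF channels.
            WebRequest.DefaultWebProxy = null;

            // ... the rest of the application issues its requests as usual ...
        }
    }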

Solution 2:

Download Wireshark from http://www.wireshark.org/

Capture the network traffic while the slow request runs and filter for "http" packets. The timestamps on the captured packets should give you the answer right away.

Solution 3:

Setting WebRequest.DefaultWebProxy = null; or client.Proxy = null; didn't do anything for me when using Xamarin on iOS.

I did two things to fix this:

First, I wrote a downloadString function based on HttpClient, which does not go through WebRequest:

        // Requires: using System.Net.Http; using System.Net.Http.Headers;
        //           using System.Threading.Tasks;
        public static async Task<string> FnDownloadStringWithoutWebRequest(string url)
        {
            using (var client = new HttpClient())
            {
                // Only accept JSON responses.
                client.DefaultRequestHeaders.Accept.Clear();
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

                var response = await client.GetAsync(url);

                if (response.IsSuccessStatusCode)
                {
                    string responseContent = await response.Content.ReadAsStringAsync();
                    //dynamic json = Newtonsoft.Json.JsonConvert.DeserializeObject(responseContent);
                    return responseContent;
                }

                Logger.DefaultLogger.LogError(LogLevel.NORMAL, "GoogleLoginManager.FnDownloadString", "error fetching string, code: " + response.StatusCode);
                return "";
            }
        }
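
A hypothetical call site, from other async code (the URL is a placeholder):

    string json = await FnDownloadStringWithoutWebRequest("https://example.com/api/user"); // placeholder URL

One design note: creating a new HttpClient per call can exhaust sockets under load, so reusing a single shared instance is the usual recommendation; that is unrelated to the slowness described here, though.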

However, this is still slow with the Managed HttpClient implementation.

Second, in Visual Studio Community for Mac, right-click your project in the solution, open Options, and under iOS Build set the HttpClient implementation to NSUrlSession instead of Managed.

Screenshot: Set HttpClient implementation to NSUrlSession instead of Managed

The Managed implementation is not fully integrated into iOS and does not support TLS 1.2, so it cannot meet the ATS requirements that are enabled by default in iOS 9+; see:

https://docs.microsoft.com/en-us/xamarin/ios/app-fundamentals/ats
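
As an aside, there is also a per-client alternative to the project-wide setting. A sketch, assuming Xamarin.iOS, where NSUrlSessionHandler ships in the platform's System.Net.Http assembly:

    using System.Net.Http;

    // Use the native NSURLSession networking stack for this client only,
    // instead of changing the project-wide handler setting.
    var client = new HttpClient(new NSUrlSessionHandler());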

With both of these changes, string downloads are consistently fast (well under 1 s). Without them, downloadString took over a minute on every second or third try.


Just FYI, there's one more thing you could try, though it shouldn't be necessary anymore:

            //var authgoogle = new OAuth2Authenticator(...);
            //authgoogle.Completed...

            if (authgoogle.IsUsingNativeUI)
            {
                // Step 2.1: create the login UI.
                // The cast is necessary in order to access the SFSafariViewController API.
                var c = (SafariServices.SFSafariViewController)ui_object;
                PresentViewController(c, true, null);
            }
            else
            {
                PresentViewController(ui_object, true, null);
            }

Though in my experience, you probably don't need the SafariController.

Solution 4:

There is nothing inherently slow about .NET web requests; that code should be fine. I regularly use WebClient and it works very quickly.

How big is the payload in each direction? It may be a silly question, but could it simply be a bandwidth limitation?

IMO the most likely explanation is that your website has spun down, and it is slow to respond when you hit the URL; that is not the client's fault. It is also possible that DNS is slow for some reason (in which case you could hard-code the IP into your "hosts" file), or that some proxy server in the middle is slow.
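
To check the DNS theory in isolation, you could time just the resolution step (a minimal sketch; example.com stands in for the real host):

    using System;
    using System.Diagnostics;
    using System.Net;

    class DnsCheck
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();
            // Resolve the host name without making any HTTP request.
            IPAddress[] addresses = Dns.GetHostAddresses("example.com");
            Console.WriteLine("DNS took " + sw.ElapsedMilliseconds + " ms and returned "
                              + addresses.Length + " address(es)");
        }
    }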

If the website isn't yours, it is also possible that they are detecting atypical usage and deliberately injecting a delay to annoy scrapers.

I would grab Fiddler (a free, simple web debugging proxy) and look at the timings.