[Mono-bugs] [Bug 686371] New: HTTP HEAD request times out if previous HEAD request resulted in 403

bugzilla_noreply at novell.com bugzilla_noreply at novell.com
Fri Apr 8 16:15:43 EDT 2011


https://bugzilla.novell.com/show_bug.cgi?id=686371

https://bugzilla.novell.com/show_bug.cgi?id=686371#c0


           Summary: HTTP HEAD request times out if previous HEAD request
                    resulted in 403
    Classification: Mono
           Product: Mono: Class Libraries
           Version: 2.10.x
          Platform: x86
        OS/Version: Windows 7
            Status: NEW
          Severity: Major
          Priority: P5 - None
         Component: System
        AssignedTo: mono-bugs at lists.ximian.com
        ReportedBy: bgrainger at logos.com
         QAContact: mono-bugs at lists.ximian.com
          Found By: Development
           Blocker: No


Created an attachment (id=424089)
 --> (http://bugzilla.novell.com/attachment.cgi?id=424089)
Sample code to reproduce the problem.

Description of Problem:
If an HttpWebRequest with Method="HEAD" is made to a server and the server
returns HTTP error 403 (Forbidden), the next HTTP request to that server will
time out (and throw a WebException) instead of returning a response.


Steps to reproduce the problem:
1. Create an HttpWebRequest with Method="HEAD" for a URI that will return HTTP
error 403 (e.g., http://downloads.logos.com/robots2.txt). Call GetResponse().
2. Create a second HttpWebRequest (either "GET" or "HEAD") for a URI on the
same host. (It doesn't matter what HTTP status code this URI returns.) Call
GetResponse().
3. The second call to GetResponse() eventually throws a WebException with
Status == WebExceptionStatus.Timeout instead of returning the response.

Sample code demonstrating the problem is attached. I have reproduced the bug on
both Mac OS X 10.6.7 and Windows 7 SP1.
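
The attachment itself is not reproduced inline; the following is a minimal
sketch of such a repro (hypothetical, not the actual attached file) that issues
the same four-request sequence shown in the output below. The short Timeout
value is only there so the failure shows up quickly.
----
using System;
using System.Net;

class HeadAfter403Repro
{
    static void Main()
    {
        // Same four-request sequence as the output shown below.
        DoRequest("HEAD", "http://downloads.logos.com/robots.txt");  // baseline; succeeds
        DoRequest("HEAD", "http://downloads.logos.com/robots2.txt"); // server answers 403 Forbidden
        DoRequest("HEAD", "http://downloads.logos.com/robots.txt");  // times out under Mono 2.10.1
        DoRequest("HEAD", "http://downloads.logos.com/robots.txt");  // succeeds again
    }

    static void DoRequest(string method, string uri)
    {
        var request = (HttpWebRequest) WebRequest.Create(uri);
        request.Method = method;
        request.Timeout = 15000; // fail fast instead of waiting out the default 100 seconds

        try
        {
            using (var response = (HttpWebResponse) request.GetResponse())
                Console.WriteLine("{0} {1} -- ContentLength = {2}",
                    method, uri, response.ContentLength);
        }
        catch (WebException ex)
        {
            if (ex.Status == WebExceptionStatus.Timeout)
                Console.WriteLine("{0} {1} -- *** TIMEOUT ***", method, uri);
            else if (ex.Response != null)
                Console.WriteLine("{0} {1} -- Error = {2}",
                    method, uri, ((HttpWebResponse) ex.Response).StatusCode);
            else
                Console.WriteLine("{0} {1} -- Error = {2}", method, uri, ex.Status);
        }
    }
}
----
Compiled and run against Mono 2.10.1 versus .NET 3.5/4.0, this should produce
output along the lines of the Actual and Expected Results below.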


Actual Results:
Running the program under Mono 2.10.1 produces the output:
----
HEAD http://downloads.logos.com/robots.txt -- ContentLength = 28
HEAD http://downloads.logos.com/robots2.txt -- Error = Forbidden
HEAD http://downloads.logos.com/robots.txt -- *** TIMEOUT ***
HEAD http://downloads.logos.com/robots.txt -- ContentLength = 28
----
Note that on the third line of output, the request immediately following the
HTTP 403 error times out.

Expected Results:
Running the program under Microsoft .NET 3.5 or .NET 4.0 produces the output:
----
HEAD http://downloads.logos.com/robots.txt -- ContentLength = 28
HEAD http://downloads.logos.com/robots2.txt -- Error = Forbidden
HEAD http://downloads.logos.com/robots.txt -- ContentLength = 28
HEAD http://downloads.logos.com/robots.txt -- ContentLength = 28
----
Note that the third GetResponse() call (third line of output) succeeds.


How often does this happen? 
Every time.


Additional Information:
The timeout does not occur if the server returns 404 for the first request, nor
if the first request uses the GET method. The second HTTP request can be either
GET or HEAD; it does not appear to matter.
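
For reference, the non-failing sequences described above could be exercised with
the same (hypothetical) DoRequest helper from the sketch earlier in this report;
neither of these produces the timeout. The 404 URI is made up; any URI on the
host that returns 404 will do.
----
// Variant 1: first request returns 404 instead of 403.
DoRequest("HEAD", "http://downloads.logos.com/robots-missing.txt"); // assumed 404
DoRequest("HEAD", "http://downloads.logos.com/robots.txt");         // succeeds

// Variant 2: first request uses GET instead of HEAD against the 403 URI.
DoRequest("GET", "http://downloads.logos.com/robots2.txt");
DoRequest("HEAD", "http://downloads.logos.com/robots.txt");         // succeeds
----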

-- 
Configure bugmail: https://bugzilla.novell.com/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the QA contact for the bug.
You are the assignee for the bug.

