How to deliver big files in ASP.NET Response?

I am not looking for an alternative to streaming file contents from
the database; I am looking for the root of the problem. This was
running fine up to IIS 6, where we ran our app in Classic mode. We
then upgraded to IIS 7 and now run the app pool in Integrated Pipeline
mode, and this problem started.

I have a handler in which I have to deliver big files in response to client requests, and I face the following problems.

Files are on average 4 to 100 MB in size, so let's consider the case of an 80 MB file download.

Buffering On, Slow Start

Response.BufferOutput = true;

This results in a very slow start to the download: the progress bar does not even appear for a few seconds, typically 3 to 20. The reason is that IIS reads the entire file first, determines the Content-Length, and only then begins the transfer. When the file is played in a video player, it runs very, very slowly; the iPad, however, downloads only a fraction of the file first, so there it works fast.

Buffering Off, No Content-Length, Fast Start, No Progress

Response.BufferOutput = false;

This results in an immediate start; however, the end client (a typical browser like Chrome) does not know the Content-Length, because IIS does not know it either, so it does not display progress and instead just says "X KB downloaded".

Buffering Off, Manual Content-Length, Fast Start, Progress and Protocol Violation

Response.BufferOutput = false;
Response.AddHeader("Content-Length", file.Length.ToString());

This results in a correct, immediate file download in Chrome etc.; however, in some cases the IIS handler results in a "Remote Client Closed Connection" error (this is very frequent), and other WebClient consumers report a protocol violation. This happens on 5 to 10% of all requests, not on every request.

My guess is that IIS does not send anything like "100 Continue" when we don't buffer, and the client might disconnect because it is not expecting any output. Reading files from the source may take longer, and although I have increased the timeout on the client side, it seems IIS times out and I have no control over that.

Is there any way I can force the Response to send "100 Continue" and not let anyone close the connection?


I found the following headers in Firefox/Chrome; nothing here seems unusual enough to cause a protocol violation or bad header.

Access-Control-Allow-Methods:POST, GET, OPTIONS
Content-Disposition:attachment; filename="24.jpg"
Date:Wed, 07 Mar 2012 13:40:26 GMT


Turning off Recycling still did not help much, but I have increased MaxWorkerProcesses to 8 and I now get fewer errors than before.

But on average, out of 200 requests per second, 2 to 10 requests fail, and this happens almost every other second.


Continuing: 5% of requests fail with "The server committed a protocol violation. Section=ResponseStatusLine". I have another program that downloads content from the web server using WebClient, and it gives this error 4-5 times a second; on average, 5% of requests fail. Is there any way to trace the WebClient failure?

Problems Redefined

Zero Byte File Received

IIS closes the connection for some reason; on the client side, WebClient receives 0 bytes for a file that is not zero bytes. We do an SHA1 hash check, which is how we found this; on the IIS web server, no error is recorded.

This was my mistake, and it is resolved: we are using Entity Framework, and it was reading dirty (uncommitted) rows because the read was not inside a transaction scope. Putting it in a transaction scope has resolved this issue.

Protocol Violation Exception Raised

WebClient throws a WebException saying "The server committed a protocol violation. Section=ResponseStatusLine".

I know I can enable unsafe header parsing, but that is not the point. It is my HTTP handler that is sending proper headers, and I don't know why IIS would send anything extra (checked in Firefox and Chrome, nothing unusual). This happens only 2% of the time.


I found the sc-win32 64 error, and I read somewhere that MinBytesPerSecond in WebLimits must be changed from 240 to 0, but everything is still the same. However, I have noticed that whenever IIS logs the sc-win32 64 error, it records the HTTP status as 200 even though there was an error. I can't turn on Failed Request Tracing for status 200 because it would produce massive log files.

Both of the above problems were solved by adjusting MinBytesPerSecond as well as disabling Sessions; I have added a detailed answer summarizing every point.



Method 1

The correct way to deliver big files in IIS is the following:

  1. Set MinBytesPerSecond to zero in WebLimits (this will certainly help performance, as IIS otherwise chooses to close clients holding Keep-Alive connections with smaller-size transfers)
  2. Allocate more worker processes to the application pool; I have set it to 8. This should be done only if your server mainly distributes large files. It will certainly make other sites perform slower, but it ensures better deliveries. We set it to 8 because this server hosts only one website, and it just delivers huge files.
  3. Turn off App Pool Recycling
  4. Turn off Sessions
  5. Leave Buffering on
  6. Before each of the following steps, check whether Response.IsClientConnected is true; otherwise give up and don't send anything
  7. Set Content-Length before sending the file
  8. Flush the Response
  9. Write to the output stream, and Flush at regular intervals
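A minimal sketch of steps 6-9 inside an IHttpHandler (OpenSourceStream is a placeholder for however you obtain the file data, e.g. from disk or a database):

```csharp
using System;
using System.IO;
using System.Web;

public class BigFileHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        HttpResponse response = context.Response;
        using (Stream source = OpenSourceStream()) // placeholder: your file/db stream
        {
            if (!response.IsClientConnected)
                return; // client already gone, send nothing

            response.ContentType = "application/octet-stream";
            response.AddHeader("Content-Length", source.Length.ToString());
            response.Flush(); // push headers out immediately

            byte[] buffer = new byte[64 * 1024];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                if (!response.IsClientConnected)
                    return; // give up mid-transfer
                response.OutputStream.Write(buffer, 0, read);
                response.Flush(); // flush at regular intervals
            }
        }
    }

    public bool IsReusable { get { return true; } }

    // Hypothetical helper; replace with your own source.
    private Stream OpenSourceStream()
    {
        throw new NotImplementedException();
    }
}
```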

Method 2

When you set the Content-Length with BufferOutput set to false, a possible reason for the failures is that IIS tries to gzip the file you send; because you have already set the Content-Length, IIS cannot adjust it to match the compressed size, and the errors start (*).

So keep BufferOutput set to false, and additionally disable gzip in IIS for the files you send; or disable IIS gzip for all files, handle the gzip part programmatically, and keep the files you send out of gzip.
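One way to switch IIS compression off for a site in IIS 7 integrated mode is via web.config (a sketch; you can also scope it with a `<location>` element to just the download handler's path):

```xml
<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="false" doDynamicCompression="false" />
  </system.webServer>
</configuration>
```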

Some similar questions with the same cause:
ASP.NET site sometimes freezing up and/or showing odd text at top of the page while loading, on load balanced servers

HTTP Compression: Some external scripts/CSS not decompressing properly some of the time

(*) Why not change it again? Because once you set a header you cannot take it back, unless you have enabled that option in IIS and the headers have not already been sent to the browser.

Follow up

If it is not gzip, the next thing that comes to mind is that the file is being sent, the connection gets delayed for some reason, times out, and is closed. That is when you get the "Remote Host Closed The Connection" error.

This can be solved depending on the cause:

  1. The client really closed the connection.
  2. The timeout came from the page itself (if you use a handler, the message would probably be "Page Timed Out").
  3. The timeout comes from idle waiting: the page takes longer than the execution timeout, times out, and the connection is closed. Again, in this case the message would probably be "Page Timed Out".
  4. The pool recycles at the moment you send the file. Disable all pool recycles! Of the causes I can think of right now, this is the most likely.

If it is coming from IIS, go to the website properties and make sure you set the largest possible "Connection Timeout" and "Enable HTTP Keep-Alives".

You can change the page timeout in web.config (or programmatically, but only for one specific page):

<httpRuntime executionTimeout="43200" />

Also have a look at :

Session lock

One more thing to examine: do not use Session in the handler you use to send the file. The session locks the request until it finishes, so if one user takes a long time to download a file, a second request may time out.
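In practice this means the download handler should simply not implement IRequiresSessionState; if it only needs to read session data, it can implement the IReadOnlySessionState marker interface instead, which avoids the exclusive session lock. A sketch:

```csharp
using System.Web;
using System.Web.SessionState;

// Takes no session lock at all: IRequiresSessionState is not implemented,
// so concurrent downloads by the same user do not serialize.
public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // context.Session is null here; send the file without touching session state
    }

    public bool IsReusable { get { return true; } }
}

// Needs to *read* session data (e.g. an auth flag) but never writes it:
// the read-only marker interface grants access without the exclusive lock.
public class ReadOnlySessionDownloadHandler : IHttpHandler, IReadOnlySessionState
{
    public void ProcessRequest(HttpContext context)
    {
        object userId = context.Session["UserId"]; // hypothetical session key
        // validate, then send the file
    }

    public bool IsReusable { get { return true; } }
}
```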

Some related questions:

call aspx page to return an image randomly slow

Replacing ASP.Net’s session entirely

Response.WriteFile function fails and gives 504 gateway time-out

Method 3

What I would do is use the not-so-well-known ASP.NET Response.TransmitFile method, as it is very fast (and possibly uses the IIS kernel cache) and takes care of all the header stuff. It is based on the unmanaged Windows TransmitFile API.

But to be able to use this API, you need a physical file to transfer. So here is pseudo-C# code that explains how to do this with a fictional myCacheFilePath physical file path. It also supports client-side caching. Of course, if you already have a file at hand, you don't need to create that cache:

    if (!File.Exists(myCacheFilePath))
    {
        LoadMyCache(...); // saves the file to disk. Don't do this if your source is already a physical file (not stored in a db, for example).
    }

    // we suppose the user-agent (browser) cache is enabled:
    // check the If-Modified-Since header sent by the client
    DateTime ifModifiedSince = DateTime.MaxValue;
    string ifm = context.Request.Headers["If-Modified-Since"];
    if (!string.IsNullOrEmpty(ifm))
    {
        ifModifiedSince = DateTime.Parse(ifm, DateTimeFormatInfo.InvariantInfo);
    }

    // if the file has not changed, just send that information (compare with milliseconds truncated)
    if (ifModifiedSince == TruncateMilliseconds(File.GetLastWriteTime(myCacheFilePath)))
    {
        ResponseWriteNotModified(...); // HTTP 304
        return;
    }

    Response.ContentType = contentType; // set your file content type here
    Response.AddHeader("Last-Modified", File.GetLastWriteTimeUtc(myCacheFilePath).ToString("r", DateTimeFormatInfo.InvariantInfo)); // tell the client to cache that file

    // TransmitFile uses lower Windows levels directly and is not memory/cpu intensive
    // on the Windows platform for sending one file. It also caches files in the kernel.
    Response.TransmitFile(myCacheFilePath);

Method 4

This piece of code works for me.
It starts the data stream to client immediately.
It shows progress during download.
It doesn’t violate HTTP: the Content-Length header is specified and chunked transfer encoding is not used.

protected void PrepareResponseStream(string clientFileName, HttpContext context, long sourceStreamLength)
{
    context.Response.ContentType = "application/pdf";
    context.Response.AddHeader("Content-Disposition", string.Format("filename=\"{0}\"", clientFileName));

    // set cacheability to private to allow IE to download it via HTTPS; otherwise it might refuse it
    // see reason for HttpCacheability.Private at
    context.Response.Cache.SetCacheability(HttpCacheability.Private);
    context.Response.Buffer = false;
    context.Response.BufferOutput = false;
    context.Response.AddHeader("Content-Length", sourceStreamLength.ToString(System.Globalization.CultureInfo.InvariantCulture));
}

protected void WriteDataToOutputStream(Stream sourceStream, long sourceStreamLength, string clientFileName, HttpContext context)
{
    PrepareResponseStream(clientFileName, context, sourceStreamLength);
    const int BlockSize = 4 * 1024 * 1024;
    byte[] buffer = new byte[BlockSize];
    int bytesRead;
    Stream outStream = context.Response.OutputStream;
    while ((bytesRead = sourceStream.Read(buffer, 0, BlockSize)) > 0)
    {
        outStream.Write(buffer, 0, bytesRead);
    }
}
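A hypothetical call site for the method above, assuming the source is a file on disk opened as a FileStream (the path is illustrative):

```csharp
public void ProcessRequest(HttpContext context)
{
    string path = context.Server.MapPath("~/files/report.pdf"); // hypothetical file
    using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        // stream length is known up front, so Content-Length can be set correctly
        WriteDataToOutputStream(fs, fs.Length, "report.pdf", context);
    }
}
```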

All methods were sourced from external contributors and are licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
