azzlack / Microsoft.AspNet.WebApi.MessageHandlers.Compression
Drop-in module for ASP.NET Web API that enables GZip and Deflate support
License: Apache License 2.0
If the client refuses all encodings via "identity;q=0" or "*;q=0", I still get a non-compressed response instead of a 406 Not Acceptable.
I also noticed that Content-Encoding: identity is missing when a valid non-compressed response is returned. The spec (https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3) is a little confusing about whether Content-Encoding: identity should always be returned when compression is not applied, or whether Accept-Encoding: identity is valid content negotiation on its own. Just thought I would bring it up. Thanks for this library.
I see there are a few nice toggles like contentSizeThreshold and the ability to turn compression off for a controller or action method via CompressionAttribute.
However, being able to decide compression based on the response Content-Type would be a really nice feature. Some content types are inherently compressible and some are not, and I don't want to mark every action method manually or rely on devs to do it.
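A possible interim workaround (a sketch only; the handler name and media-type list below are illustrative, not part of the library): a plain DelegatingHandler registered after the compression handler could clear Accept-Encoding for media types that are already compressed, assuming the compression handler consults Accept-Encoding after the response is produced.

```csharp
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: clears Accept-Encoding for responses whose media type
// is already compressed, so an outer compression handler skips them.
public class MediaTypeCompressionFilterHandler : DelegatingHandler
{
    private static readonly string[] IncompressibleTypes =
        { "image/jpeg", "image/png", "application/zip", "video/mp4" };

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);

        var mediaType = response.Content?.Headers.ContentType?.MediaType;
        if (mediaType != null && IncompressibleTypes.Contains(mediaType))
        {
            // Dropping the accepted encodings makes a compression handler
            // that checks request.Headers.AcceptEncoding leave this response alone.
            request.Headers.AcceptEncoding.Clear();
        }

        return response;
    }
}
```

Because message handlers run outermost-first, this filter must be registered at a later index than the compression handler so it runs closer to the controller.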
Minor issue: when sending identical 5.2 KB payloads, I was not getting compression after inserting the following code into my Startup.cs:
GlobalConfiguration.Configuration.MessageHandlers.Insert(0, new OwinServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
Once I switched to using the alternate initialization code things worked as expected.
config.MessageHandlers.Insert(0, new OwinServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
Compression doesn't work when AppleWebKit is found in the User-Agent header.
Chrome is sending by default: User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.129 Safari/537.36
Using Fiddler I noticed that the response is uncompressed, although the response header has X-Content-Encoding-Over-Network: gzip. Removing AppleWebKit from the User-Agent fixes the issue.
I'm not sure exactly why, but in my case the OnActionExecutingAsync of CompressionAttribute is being called multiple times. This causes the call to actionContext.Request.Properties.Add to fail.
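A hedged workaround, assuming the failure is the usual duplicate-key ArgumentException from IDictionary.Add: the indexer assigns idempotently, so a guard like the sketch below (the key name is illustrative) tolerates repeated invocations.

```csharp
using System.Collections.Generic;

// Dictionary.Add throws ArgumentException on a duplicate key, while the
// indexer assigns idempotently. Demonstrated on a plain dictionary, which
// is the same IDictionary<string, object> contract Request.Properties exposes.
public static class PropertySetter
{
    public static void SetCompressionFlag(IDictionary<string, object> properties)
    {
        // Safe even if OnActionExecutingAsync runs more than once:
        properties["compression:Enabled"] = false; // key name is illustrative
    }
}
```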
My project currently targets .NET 4.5.1, which, as I understand it, already includes the GZipStream / DeflateStream classes that are brought in by the Microsoft.Bcl.Compression package.
Adding any of the .Bcl.* packages causes no end of grief, as I then get a warning in every assembly that references my project complaining that it, too, must install Microsoft.Bcl.Build.
Am I missing something, or should your package not require Microsoft.Bcl.Compression when the project it's being installed into already targets .NET 4.5 or later?
When creating a Universal Windows Platform app and installing the Microsoft.AspNet.WebApi.MessageHandlers.Compression package, building the project fails with:
Type universe cannot resolve assembly: System.Web.Http, Version=5.2.2.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35.
Installing one version earlier works fine.
More info here: http://stackoverflow.com/questions/34187169/odata-v-4-compression-support-in-uwp-windows-10-mobile?noredirect=1#comment56192795_34187169
Hello,
Looking at the Compress method in BaseCompressor: it creates a byte array, reads from the memory stream into that byte array, creates another memory stream from this byte array, and finally copies that new stream into the 'destination' stream.
This all seems unnecessary.
Copying directly from the compressed stream (mem) into the destination works, and all the tests pass:
// 1. NOT REALLY REQUIRED:
// var compressed = new byte[mem.Length];
// await mem.ReadAsync(compressed, 0, compressed.Length);
// var outStream = new MemoryStream(compressed);
// await outStream.CopyToAsync(destination);

// 2. OR JUST:
await mem.CopyToAsync(destination);
It would be nice if there weren't these dependencies when targeting .NET 4.5 (or 4.5.2 in my use case): Microsoft.Bcl, Microsoft.Bcl.Build, Microsoft.Bcl.Compression.
AFAIK, these packages are used only for backward compatibility with .NET 4, for example to use async/await. But in .NET 4.5, I don't need them.
In my project these 3 packages were added just to use Microsoft.AspNet.WebApi.MessageHandlers.Compression.
What do you think ?
Hi,
I'm using an OWIN-based WebAPI project.
After adding your middleware and making a few successful requests, I get the following exception and the application crashes:
System.AccessViolationException was unhandled
Message: An unhandled exception of type 'System.AccessViolationException' occurred in System.Web.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
I tried both the OwinServerCompressionHandler and ServerCompressionHandler handlers, with .NET 4.5 and after upgrading to 4.6.1, with the same result.
What am I doing wrong?
Thanks in advance,
Mark
I am using this library in conjunction with ASP.NET Boilerplate.
This is a very good framework for building websites based on .NET.
One thing it does is create the jQuery REST service proxies for Web API services.
The problem with this feature is that the response.Content's underlying stream has already been disposed when BaseServerCompressionHandler tries to compress it.
The simplest solution would be to catch the ObjectDisposedException and return the uncompressed response. That way the site at least remains functional.
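That fallback might look roughly like this (a sketch, not the library's actual code; SafeBuffering is an illustrative name):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: attempt to buffer the content; if the underlying stream has already
// been disposed, report failure so the caller can return the response
// uncompressed instead of failing the request.
public static class SafeBuffering
{
    public static async Task<bool> TryBufferContentAsync(HttpResponseMessage response)
    {
        try
        {
            await response.Content.LoadIntoBufferAsync();
            return true; // safe to compress
        }
        catch (ObjectDisposedException)
        {
            return false; // stream already disposed: skip compression
        }
    }
}
```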
Hi,
I used the NuGet package Microsoft.AspNet.WebApi.MessageHandlers.Compression to implement compression for the Web API, following the steps in the link below:
https://github.com/azzlack/Microsoft.AspNet.WebApi.MessageHandlers.Compression
If I do not set up the threshold and go with the default code in App_Start\WebApiConfig.cs:
config.MessageHandlers.Insert(0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
I see in Fiddler that the response is getting compressed. But once I set the threshold value as per the advanced usage details:
config.MessageHandlers.Insert(0, new ServerCompressionHandler(4096, new GZipCompressor(), new DeflateCompressor()));
I see in Fiddler that the response is not getting compressed, even though the Content-Length is more than 4096 bytes.
I am testing with VS2013 Premium and IIS Express.
Can anyone help me figure out whether I am missing anything else?
Hello,
I've tried using the library to compress the responses coming from a Web API OData controller and the compression works great, except that the response no longer contains the DataServiceVersion header, which it did before using compression.
This missing header causes Breeze to fail parsing the response on the client side with a vague error message of "; ", which I've traced to a "no handler for data" error in datajs caused by the missing header.
Thanks very much,
Martyn.
During tests, the compression works as expected in both Debug and Release (using IIS Express).
In production (IIS version 7.5.7600.16385) there is no compression on any request.
Even if I enable dynamic content compression via the Compression setting in IIS, the server doesn't compress the data.
I detect a compressed response by checking the response header named "Content-Encoding", which is always "gzip" in my case when compression succeeds.
Am I missing something? Can you help me?
Could you please give some example code on how to use DecompressionHandler and CompressionHandler together? Thank you!
I use Web API like RPC: I POST JSON data to the server, and it returns JSON data to the client.
I need both the request and response JSON data to be compressed.
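For what it's worth, a minimal end-to-end sketch based on the registration snippets in these issues: ServerCompressionHandler handles both request decompression and response compression on the server, and ClientCompressionHandler mirrors it on the client (this assumes a compressors-only ClientCompressionHandler constructor analogous to the server's):

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public static class CompressionConfig
{
    // Server side (WebApiConfig.cs): one handler covers both directions --
    // it decompresses compressed request bodies and compresses responses.
    public static void Register(HttpConfiguration config)
    {
        config.MessageHandlers.Insert(
            0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
    }

    // Client side: wrap HttpClient in a ClientCompressionHandler and advertise
    // the encodings, so JSON is compressed both on the way up and down.
    public static HttpClient CreateClient()
    {
        var client = new HttpClient(
            new ClientCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
        client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip"));
        client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate"));
        return client;
    }
}
```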
Hello,
The package path is too long; Jenkins can't build the project with this NuGet package:
The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
packages\System.Net.Http.Extensions.Compression.Client.2.0.5\lib\portable-net45+wp8+wpa81+win8+monoandroid+monotouch+Xamarin.iOS+Xamarin.Mac\System.Net.Http.Extensions.Compression.Client.dll
Thank you for the handler anyway
My Web API is written in .NET Core 2.1, and Android and iOS clients consume it.
Can I use this library in my scenario?
Thanks in anticipation.
This assembly is not signed with a key, which makes it impossible to use with assemblies that must be GAC'ed.
Is there a reason you are requiring .NET 4.5.2 for your nuget reference (as opposed to just 4.5)?
Any chance of an upgrade?
I have a WebAPI application (.NET 4.7.2) that works locally on Windows 10 but does not work when deployed to Windows Server 2012R2/IIS8.5. The only clue I can get from it is...
{"Message":"An error has occurred.",
"ExceptionMessage":"Unable to decompress request using compressor 'System.Net.Http.Extensions.Compression.Core.Compressors.GZipCompressor'",
"ExceptionType":"System.Exception","StackTrace":"
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__19.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__16.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__15.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at System.Web.Http.HttpServer.d__24.MoveNext()",
"InnerException":{"Message":"An error has occurred.","ExceptionMessage":"Exception of type 'System.OutOfMemoryException' was thrown.","ExceptionType":"System.OutOfMemoryException","StackTrace":"
at System.Web.HttpRawUploadedContent.AddBytes(Byte[] data, Int32 offset, Int32 length)\r\n
at System.Web.HttpBufferlessInputStream.EndRead(IAsyncResult asyncResult)\r\n
at System.Web.Http.WebHost.SeekableBufferedRequestStream.EndRead(IAsyncResult asyncResult)\r\n
at System.Net.Http.StreamToStreamCopy.StartRead()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at System.Net.Http.Extensions.Compression.Core.HttpContentOperations.d__0.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__19.MoveNext()"}}
Any thoughts as currently I have no idea where to start?
After some tests with GZip compression turned on, I found that CompressionHandler generates this exception. The exception occurs most often when the server gets unauthorized requests, and sometimes on authorized ones. I use ordinary OWIN OAuth. The exception also occurs when I send a request to the /token auth endpoint. I attached a screenshot of the stack trace (the beginning of it). Please help with this problem.
After we insert ServerCompressionHandler, Owin UseCookieAuthentication no longer returns cookies in the response :( Is there anything we can do about it?
Thanks for helping.
I'm using a specialized Response.Content of type PushStreamContent to stream content in chunks to the client for long-running operations, reporting progress to the client with pairs of stream.WriteAsync() and stream.Flush() (Transfer-Encoding: chunked).
To keep the controller working, I have to mark it with the [Compression(Enabled = false)] attribute: I suppose the compression filter reads the whole buffer to compress it, invalidating my chunked transmission.
Do you think it is possible to test the type of Content and automatically disable compression if it is of type PushStreamContent, or (better, if possible) test the .buffered property of the response stream?
Obviously this is not a big issue (I can mark my controllers with [Compression(Enabled = false)]), but it would be very nice to have it all automatically detected and managed.
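The automatic check being asked for might be sketched like this (an assumption about where the library could hook in; PushStreamContent is the standard type from System.Net.Http.Formatting):

```csharp
using System.Net.Http;

// Sketch: decide whether a response is safe to buffer and compress.
// Streaming content such as PushStreamContent should pass through untouched
// to preserve Transfer-Encoding: chunked.
public static class CompressionEligibility
{
    public static bool ShouldCompress(HttpResponseMessage response)
    {
        if (response.Content == null)
        {
            return false;
        }

        // PushStreamContent writes directly to the output stream in chunks;
        // buffering it would defeat the chunked transfer entirely.
        return !(response.Content is PushStreamContent);
    }
}
```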
Thanks
Hi,
I'm using an API proxy to route API requests coming from a JavaScript client to their final API endpoint. Once the response comes in from the API endpoint, I send it back to the client that initiated the request.
The API endpoint mostly sends back uncompressed JSON and XML responses to the API proxy. My idea is to use your library to at least compress the response from the API proxy to the final client.
However, I can't be sure whether the developers who manage the API endpoint will eventually implement compressed responses in the future. Therefore I wanted to know whether your library detects already-compressed response bodies and skips compressing them again in those cases.
Best regards and thanks, Matthias
I'm using your server compression package on a WebAPI 2.0 project and have run into an odd problem where HttpContext.Current becomes null in controller actions that handle decompressed request content.
After some digging on StackOverflow, it seems that the .ConfigureAwait(false) calls you're using on every await are the source of the problem. I removed all of them from your code, rebuilt, and everything is working fine now.
From what I understand, the ConfigureAwait(false) call tells await not to capture the synchronization context and restore it when it resumes after the await. This has the effect that, after an await, HttpContext.Current becomes null.
If there's a specific reason you're doing .ConfigureAwait(false), I guess you won't want to make any change. But otherwise, it would be greatly helpful if you could remove them. (I'm not a GitHub pro, so submitting a suggested code change isn't something I'll be doing.)
For reference, you might take a look at Stephen Cleary's answer on this SO question: http://stackoverflow.com/questions/24956178/using-httpcontext-current-in-webapi-is-dangerous-because-of-async
Hi,
Firstly, good work on getting this done. I have been looking for a nice clean solution to this for a while.
While testing this, I have noticed some strange behavior with some of my Web API endpoints. In some cases, I am getting an HTTP exception (flagged through Fiddler) which indicates that there is a mismatch in the content length. Looking at the response, Fiddler is quite right: the declared length differs from the actual body.
Having a fish through the code, I switched the AddHeaders() method call from the constructor to immediately after the "await _compressor.Compress(contentStream, stream);" call on line 76 of the CompressedContent.cs class. This actually resolved my problem, but I was wondering if you could shed some light on whether that was the correct thing to do :)
cheers
Kev
A few actions on my controller stream MPEG audio, throttled to limit bandwidth.
I disabled compression on these actions using the [Compression(Enabled = false)] attribute.
But this doesn't work: the pipeline message handler only checks this setting after it has waited for the response to be generated. This makes the library useless for my project.
protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
    // Decompress compressed requests to the server
    if (request.Content != null && request.Content.Headers.ContentEncoding.Any())
    {
        await this.DecompressRequest(request);
    }

    var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

    var process = this.enableCompression(request);

    try
    {
        if (response.Content != null)
        {
            // Buffer content for further processing
            await response.Content.LoadIntoBufferAsync();
        }
        else
        {
            process = false;
        }
    }
    catch (Exception ex)
    {
        process = false;
        Debug.WriteLine(ex.Message);
    }

    // Compress uncompressed responses from the server
    if (process && response.Content != null && request.Headers.AcceptEncoding.Any())
    {
        await this.CompressResponse(request, response);
    }

    return response;
}
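A sketch of the reordering being requested, assuming the same private helpers as the SendAsync quoted above: consult the attribute first, and only buffer when compression is actually enabled.

```csharp
protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
    // Decompress compressed requests to the server (unchanged)
    if (request.Content != null && request.Content.Headers.ContentEncoding.Any())
    {
        await this.DecompressRequest(request);
    }

    var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

    // Reordered: check the [Compression] setting *before* buffering, so
    // disabled (streaming) actions are never loaded into memory.
    var process = this.enableCompression(request) && response.Content != null;

    if (process)
    {
        try
        {
            // Buffer content for further processing -- only when compressing
            await response.Content.LoadIntoBufferAsync();
        }
        catch (Exception ex)
        {
            process = false;
            Debug.WriteLine(ex.Message);
        }
    }

    // Compress uncompressed responses from the server (unchanged)
    if (process && request.Headers.AcceptEncoding.Any())
    {
        await this.CompressResponse(request, response);
    }

    return response;
}
```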
Hi,
Is there an easy way to disable compression for one action? I could register custom routes with handlers per route, but that is not a nice solution.
I have installed the Microsoft.AspNet.WebApi.MessageHandlers.Compression.Server package and added the handler:
private static void RegisterMessageHandlers(this HttpConfiguration config)
{
    config.MessageHandlers.Insert(0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
}
Unfortunately, as I see in Fiddler, compression is not enabled:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json
Expires: -1
Server: Microsoft-IIS/10.0
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Thu, 17 May 2018 10:51:26 GMT
Transfer-Encoding: chunked
Is there something I have missed?
Nice library; it's very easy to integrate and use.
I observed that HttpContentOperations.DecompressContent() copies all the headers as-is. Since it has performed the decompression, some of the headers, such as Content-Encoding and Content-Length, should change.
In my case I'm sending a POST with compressed data, and this request needs to be forwarded to a different host after decompression, where it fails.
What are your thoughts on this? I would be happy to contribute a fix if this looks like a generic improvement.
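The adjustment being proposed might look like this sketch (HeaderFixup and DecompressedCopy are illustrative names, not library code): copy everything except Content-Encoding and Content-Length, and let the new content recompute its length.

```csharp
using System.Net.Http;

// Sketch: rebuild content headers after decompression. Copy everything except
// Content-Encoding (no longer true) and Content-Length (now wrong), which
// ByteArrayContent recomputes from the decompressed bytes.
public static class HeaderFixup
{
    public static HttpContent DecompressedCopy(HttpContent compressed, byte[] decompressedBytes)
    {
        var fresh = new ByteArrayContent(decompressedBytes);

        foreach (var header in compressed.Headers)
        {
            // Skip headers invalidated by the decompression step.
            if (header.Key == "Content-Encoding" || header.Key == "Content-Length")
            {
                continue;
            }

            fresh.Headers.TryAddWithoutValidation(header.Key, header.Value);
        }

        return fresh;
    }
}
```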
Thanks
Chirag
"NOTE!!!
This is an obsolete package, it only installs the Microsoft.AspNet.WebApi.Extensions.Compression.Server and System.Net.Http.Extensions.Compression.Client packages."
...So what should we be using instead?
I have a WebApi service containing a controller that returns an image response, which basically looks like:
public HttpResponseMessage Get([FromUri]int id)
{
    var c = this.Db.Photos.Find(id);

    if (c == null || c.Photo == null || c.PhotoType == null)
        return new HttpResponseMessage(HttpStatusCode.NotFound);

    var result = new HttpResponseMessage(HttpStatusCode.OK);
    result.Content = new ByteArrayContent(c.Photo);
    result.Content.Headers.ContentType = new MediaTypeHeaderValue(c.PhotoType);

    return result;
}
This works fine in all cases except when the compression handler is enabled. I get an error 504 response when it is, yet all the JSON-based responses continue to work fine. I can also make it work in Fiddler by copying the same request that fails, removing the Accept-Encoding header (effectively disabling the compressor for the request) and retrying. This was working up until the last release, so it seems to be a recent bug?
Let me know if you need any more information from me. Thanks.
Hi, nice lib.
When downloading a large gzip response, will this buffer the complete response before decompressing? If so, is that wise?
Does the lib support chunked responses?
Cheers
It would be nice if we could have a handler that would only compress responses larger than, say, 4 KB, since compressing small amounts of data can result in the data actually being larger than the original, as well as requiring extra work on both the client and the server. I have looked into ways to do this, but I am not sure if it's possible. Thoughts?
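For what it's worth, other issues here mention a contentSizeThreshold constructor overload that appears to do exactly this; a minimal registration with a 4 KB threshold would be:

```csharp
using System.Web.Http;

// Only responses larger than 4096 bytes are compressed; smaller payloads
// pass through untouched, avoiding the size inflation described above.
public static class ThresholdCompressionConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.MessageHandlers.Insert(
            0, new ServerCompressionHandler(4096, new GZipCompressor(), new DeflateCompressor()));
    }
}
```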
It looks like the additional handler affects HttpContext.Current.
In my case, HttpContext.Current in an async Web API action is null.
I am making a console app and, with the help of EasyHttp, making a call to the server. The problem is that the server is still compressing the data even when Accept-Encoding is null. Please help me pin down the issue.
I am using the following test code:
var client = new HttpClient(new ClientCompressionHandler(this.server, 20, new GZipCompressor(), new DeflateCompressor()));

client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip"));
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate"));

var response = await client.PostAsJsonAsync("http://localhost:55399/api/test", new { text = "lalelu die katze fliegt in schuh" });
In ClientCompressionHandler.CompressRequest(HttpRequestMessage request) no compression is done, since request.Content.Headers.ContentLength is null.
Not sure if this makes sense, but the following changes help in the case of ObjectContent where ContentLength == null.
private async Task CompressRequest(HttpRequestMessage request)
{
    // As per RFC2616.14.3:
    // Ignores encodings with quality == 0
    // If multiple content-codings are acceptable, then the acceptable content-coding with the highest non-zero qvalue is preferred.
    var compressor = (from encoding in request.Headers.AcceptEncoding
                      let quality = encoding.Quality ?? 1.0
                      where quality > 0
                      join c in this.Compressors on encoding.Value.ToLowerInvariant() equals c.EncodingType.ToLowerInvariant()
                      orderby quality descending
                      select c).FirstOrDefault();

    if (compressor != null)
    {
        try
        {
            // BEGIN
            if (request.Content.Headers.ContentLength == null)
                await request.Content.LoadIntoBufferAsync();
            // END

            // Only compress request if size is larger than threshold (if set)
            if (this.contentSizeThreshold == 0)
            {
                request.Content = new CompressedContent(request.Content, compressor);
            }
            else if (this.contentSizeThreshold > 0 && request.Content.Headers.ContentLength >= this.contentSizeThreshold)
            {
                request.Content = new CompressedContent(request.Content, compressor);
            }
        }
        catch (Exception ex)
        {
            throw new Exception(string.Format("Unable to compress request using compressor '{0}'", compressor.GetType()), ex);
        }
    }
}
Hi,
I see that the content headers are missing from the request during compression when the request size is more than 860 bytes. After debugging, I found that the headers are ignored in the CopyTo() extension method (here and here). I had a Content-Length header in the request, which led CopyTo() to return without copying the remaining headers from the request to the target. Hence the headers went missing from the request. I feel that instead of a return statement it should have been continue, so that the other headers can still be copied to the target.
Please let me know if I am doing something wrong.
Thanks in advance.
Devs who need strong-named assemblies will benefit from those being published to NuGet. Building these from source is a hassle, albeit a small one.
Currently, compression is enabled by default and can be disabled for a specific API (or controller) using [Compression(Enabled = false)].
I'd like to be able to have compression disabled by default (maybe with an extra parameter passed to the ServerCompressionHandler constructor) and then selectively enable it for specific APIs (or controllers) with [Compression(Enabled = true)].
Hello,
I'm trying to track the amount of data I have sent, so I placed a handler after the ClientCompressionHandler:
var clientCompressionHandler = new ClientCompressionHandler(40960, new GZipCompressor(), new DeflateCompressor()) { InnerHandler = dataTracker };
If the size of the content is less than 40960 and no compression is made, the ContentLength header is set; but if compression is done, the ContentLength header is always null.
EDIT1:
The ContentLength is always null BEFORE the compression handler does the SendAsync, but it only gets set to a value when the handler doesn't compress.