
microsoft.aspnet.webapi.messagehandlers.compression's People

Contributors

abrenneke, altumano, azzlack, brettbialer, charlesnrice, cheesebaron, coni2k, danielcrenna, gentledepp, johnny-bee, sebastianstehle, turbodrubin, wiltodelta


microsoft.aspnet.webapi.messagehandlers.compression's Issues

Explicitly Reject Identity

If the client refuses all schemes via "identity;q=0" or "*;q=0", I still get a non-compressed response instead of a 406 Not Acceptable.

I also noticed that Content-Encoding: identity is missing when a valid non-compressed response is returned. The spec ("https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3") is a little confusing about whether the response should always carry Content-Encoding: identity when compression is not applied, and whether Accept-Encoding: identity is valid content negotiation. Just thought I would bring it up. Thanks for this library.
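Nothing in the library handles this today; one way to get the requested 406 behavior would be a small delegating handler placed in front of the compression handler. This is only a sketch (the class name and the exact negotiation rules are my own reading of the spec, not part of the library):

```csharp
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical handler (not part of the library): answer 406 when the client
// has refused every encoding, e.g. "identity;q=0" or "*;q=0" with no
// positively-weighted alternative.
public class RejectIdentityHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var accept = request.Headers.AcceptEncoding;

        bool identityRefused = accept.Any(e =>
            (e.Value == "identity" || e.Value == "*") && e.Quality == 0.0);
        bool anythingAcceptable = accept.Any(e => (e.Quality ?? 1.0) > 0);

        if (identityRefused && !anythingAcceptable)
        {
            return new HttpResponseMessage(HttpStatusCode.NotAcceptable)
            {
                RequestMessage = request
            };
        }

        return await base.SendAsync(request, cancellationToken);
    }
}
```

Such a handler would be registered before the ServerCompressionHandler in config.MessageHandlers so it can short-circuit first.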

Ability to Define Compression based on Response Content-Type

I see there are a few nice toggles like contentSizeThreshold and the ability to turn compression off for a controller or action method via CompressionAttribute.

However, being able to define compression based on the response content-type would be a really nice feature. Some content-types are inherently compressible, some are not. I don't want to mark every action method manually nor rely on devs to do it.
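The library does not expose such a switch; as a sketch of what the requested feature might look like, a content-type whitelist could gate the compression decision (all names and media types below are illustrative, not part of the library):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;

// Sketch of the requested feature: only compress responses whose media type
// is known to be compressible. Not part of the library; names illustrative.
public static class CompressionPolicy
{
    private static readonly HashSet<string> Compressible =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            "application/json", "application/xml", "text/plain", "text/html"
        };

    public static bool ShouldCompress(HttpResponseMessage response)
    {
        var mediaType = response?.Content?.Headers.ContentType?.MediaType;
        return mediaType != null && Compressible.Contains(mediaType);
    }
}
```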

Correct config line for OWIN Webapi?

Minor issue: when sending identical 5.2 KB payloads, I was not getting compression after inserting the following code into my Startup.cs:

GlobalConfiguration.Configuration.MessageHandlers.Insert(0, new OwinServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));

Once I switched to the alternate initialization code, things worked as expected:
config.MessageHandlers.Insert(0, new OwinServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));

Compression doesn't work with Chrome browser

Compression doesn't work when AppleWebKit is found in the User-Agent header.
Chrome is sending by default: User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.129 Safari/537.36

Using Fiddler I noticed that the response is uncompressed, although the response header has X-Content-Encoding-Over-Network: gzip. Removing AppleWebKit from the User-Agent fixes the issue.

Why the dependency on Microsoft.Bcl.Compression in a .NET 4.5.1 targeted project?

My project currently targets .NET 4.5.1, which, as I understand it, already includes the GZipStream / DeflateStream classes that are brought in by the Microsoft.Bcl.Compression package.

Adding any of the ".Bcl.*" packages causes no end of grief, as I then get a warning in every assembly that references my project, complaining that it, too, must install Microsoft.Bcl.Build.

Am I missing something, or should your package not require Microsoft.Bcl.Compression when the project it's being installed into already targets .NET 4.5 or later?

UWP projects referencing Microsoft.AspNet.WebApi.MessageHandlers.Compression do not compile.

When creating a Universal Windows Platform app and installing the Microsoft.AspNet.WebApi.MessageHandlers.Compression package, building the project fails with:

Type universe cannot resolve assembly: System.Web.Http, Version=5.2.2.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35.

Installing one version earlier works fine.

More info here: http://stackoverflow.com/questions/34187169/odata-v-4-compression-support-in-uwp-windows-10-mobile?noredirect=1#comment56192795_34187169

Unnecessary Byte Array Buffer in BaseCompressor.Compress

Hello,
Looking at the Compress method in BaseCompressor: it creates a byte array, reads from the memory stream into that byte array, creates another memory stream from the byte array, and finally copies that new stream into the 'destination' stream.

This all seems unnecessary.

Copying directly from the compressed stream (mem) into the destination works, and all the tests pass:

//1. NOT REALLY REQUIRED:
//var compressed = new byte[mem.Length];
//await mem.ReadAsync(compressed, 0, compressed.Length);
//var outStream = new MemoryStream(compressed);
//await outStream.CopyToAsync(destination);

//2. OR JUST:
await mem.CopyToAsync(destination);

Remove Microsoft.Bcl dependency in .net4.5

It would be nice if there weren't these dependencies when targeting .NET 4.5 (or 4.5.2 in my use case):
Microsoft.Bcl, Microsoft.Bcl.Build, Microsoft.Bcl.Compression.

AFAIK, these packages are used only for backward compatibility with .NET 4, for example to use async/await. But on .NET 4.5, I don't need them.

In my project these 3 packages were added just to use Microsoft.AspNet.WebApi.MessageHandlers.Compression.

What do you think?

AccessViolationException - Attempted to read or write protected memory

Hi,

I'm using an OWIN-based WebAPI project.
After adding your middleware and making a few successful requests, I get the following exception and the application crashes:
System.AccessViolationException was unhandled
Message: An unhandled exception of type 'System.AccessViolationException' occurred in System.Web.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

I tried both the OwinServerCompressionHandler and ServerCompressionHandler handlers, with .NET 4.5 and after upgrading to 4.6.1, with the same result.

What am I doing wrong?

Thanks in advance,
Mark

ObjectDisposedException - BaseServerCompressionHandler throws because the stream of response.Content has already been disposed

I am using this library in conjunction with ASP.NET Boilerplate,
a very good framework for building websites based on .NET.
One thing it does is create jQuery REST service proxies for Web API services.
The problem with this feature is that response.Content's underlying stream has already been disposed by the time BaseServerCompressionHandler tries to compress it.

The simplest solution would be to catch the ObjectDisposedException and return the uncompressed response. That way the site at least remains functional.
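The proposal could be sketched as a guard around the handler's compression step (illustrative only; CompressResponse is the library's internal method, so this is a pseudo-patch rather than drop-in code):

```csharp
// Illustrative guard for BaseServerCompressionHandler's compression step:
// if the content stream was already disposed upstream (e.g. by a proxy
// generator), fall back to returning the response uncompressed.
try
{
    await this.CompressResponse(request, response);
}
catch (ObjectDisposedException)
{
    // Serve the original response so the site stays functional.
}
```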

Web API response compression not working once the threshold value is set

Hi,

I used the NuGet package Microsoft.AspNet.WebApi.MessageHandlers.Compression to implement compression for a Web API, following the steps in the link below:

https://github.com/azzlack/Microsoft.AspNet.WebApi.MessageHandlers.Compression

If I do not set up the threshold and go with the default code in App_Start\WebApiConfig.cs:

config.MessageHandlers.Insert(0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
I see in Fiddler that the response is getting compressed. But once I set the threshold value as per the advanced usage details:

config.MessageHandlers.Insert(0, new ServerCompressionHandler(4096, new GZipCompressor(), new DeflateCompressor()));
I see in Fiddler that the response is not getting compressed, even when the Content-Length is more than 4096 bytes.

I am testing with VS2013 Premium and IIS Express.
Can anyone help me figure out whether I am missing anything else?

DataServiceVersion Response Header Missing After Compression

Hello,

I've tried using the library to compress the responses coming from a Web API OData controller, and the compression works great, except that the response no longer contains the DataServiceVersion header, which it did before enabling compression.

The missing header causes Breeze to fail to parse the response on the client side with a vague error message of "; ", which I've traced to a "no handler for data" error in datajs caused by the absent header.

Thanks very much,

Martyn.

Compression not working in a published webapi

During tests, the compression works as expected in both Debug and Release (using IIS Express).

In production (IIS version 7.5.7600.16385) there is no compression for any requests.
Even if I enable dynamic content compression via the Compression setting in IIS, the server doesn't compress the data.

I detect a compressed response by checking the response header named "Content-Encoding", which in my case is always "gzip" when compression succeeds.

Am I missing something? Can you help me?

More documentation

Could you please give some example code on how to use DecompressionHandler and CompressionHandler together? Thank you!
I use Web API like RPC: I post JSON data to the server, and it returns JSON data to the client.
I need both the request and response JSON data to be compressed.
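For what it's worth, combining the pieces that appear in other issues in this tracker, a round trip with both request and response compression looks roughly like this (the URL and the parameterless-threshold ClientCompressionHandler constructor are assumptions on my part; check the README for the exact signatures):

```csharp
// Server side (App_Start\WebApiConfig.cs): decompresses incoming requests
// and compresses outgoing responses.
config.MessageHandlers.Insert(0,
    new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));

// Client side: compresses outgoing requests and decompresses responses.
var client = new HttpClient(
    new ClientCompressionHandler(new GZipCompressor(), new DeflateCompressor()));

// Advertise which encodings the client accepts for the response.
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip"));
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate"));

var response = await client.PostAsJsonAsync("http://localhost/api/test",
    new { text = "hello" });
```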

Package path is too long for Jenkins

Hello,

The package path is too long.

Jenkins can't build the project with this NuGet package:
The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

packages\System.Net.Http.Extensions.Compression.Client.2.0.5\lib\portable-net45+wp8+wpa81+win8+monoandroid+monotouch+Xamarin.iOS+Xamarin.Mac\System.Net.Http.Extensions.Compression.Client.dll

Thanks for the handler anyway.

Can I use this in .NET Core 2.1?

I have my Web API written in .NET Core 2.1, and Android and iOS clients use these Web APIs.

So, can I use this in my scenario?

Thanks in anticipation.

Unable to decompress request using compressor 'System.Net.Http.Extensions.Compression.Core.Compressors.GZipCompressor', Exception of type 'System.OutOfMemoryException' was thrown.

I have a WebAPI application (.NET 4.7.2) that works locally on Windows 10 but does not work when deployed to Windows Server 2012R2/IIS8.5. The only clue I can get from it is...

{"Message":"An error has occurred.",
"ExceptionMessage":"Unable to decompress request using compressor 'System.Net.Http.Extensions.Compression.Core.Compressors.GZipCompressor'",
"ExceptionType":"System.Exception","StackTrace":"
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__19.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__16.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__15.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at System.Web.Http.HttpServer.d__24.MoveNext()",
"InnerException":{"Message":"An error has occurred.","ExceptionMessage":"Exception of type 'System.OutOfMemoryException' was thrown.","ExceptionType":"System.OutOfMemoryException","StackTrace":"
at System.Web.HttpRawUploadedContent.AddBytes(Byte[] data, Int32 offset, Int32 length)\r\n
at System.Web.HttpBufferlessInputStream.EndRead(IAsyncResult asyncResult)\r\n
at System.Web.Http.WebHost.SeekableBufferedRequestStream.EndRead(IAsyncResult asyncResult)\r\n
at System.Net.Http.StreamToStreamCopy.StartRead()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at System.Net.Http.Extensions.Compression.Core.HttpContentOperations.d__0.MoveNext()\r\n
--- End of stack trace from previous location where exception was thrown ---\r\n
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n
at Microsoft.AspNet.WebApi.Extensions.Compression.Server.BaseServerCompressionHandler.d__19.MoveNext()"}}

Any thoughts? Currently I have no idea where to start.

Error: Server cannot append header after HTTP headers have been sent

After some tests with GZip compression turned on, I found that CompressionHandler generates this exception. The exception occurs most often when the server gets unauthorized requests, and sometimes on authorized ones. I use ordinary OWIN OAuth. The exception also occurs when I send a request to the auth /token endpoint. I have attached a screenshot of the beginning of the stack trace. Please help with this problem.

Response.Content of type PushStreamContent problem

I'm using a specialized Response.Content of type PushStreamContent to stream content in chunks to the client for long-running operations, with client progress updates via pairs of stream.WriteAsync() and stream.Flush() (Transfer-Encoding: chunked).
To keep the controller working, I have to mark it with the [Compression(Enabled = false)] attribute: I suppose the compression filter reads the whole buffer to compress it, invalidating my chunked transmission.
Do you think it is possible to test the type of Content and automatically disable compression if it is of type PushStreamContent, or (better, if possible) test the .buffered property of the response stream?
Obviously this is not a big issue (I can mark my controllers with [Compression(Enabled = false)]), but it would be very nice to have it all automatically detected and managed.
Thanks
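The check being requested could be as small as a type test on the response content before the handler buffers it (illustrative helper, not part of the library):

```csharp
using System.Net.Http;

// Illustrative helper (not part of the library): detect push-streamed content
// so a handler could skip buffering/compression and keep chunked transfer intact.
public static class StreamingDetector
{
    public static bool IsPushStream(HttpResponseMessage response)
    {
        return response?.Content is PushStreamContent;
    }
}
```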

Usage of this library in an API Proxy environment - autodetection of already compressed response bodies

Hi,

I'm using an API Proxy to route API requests coming from a JavaScript client to their final API endpoint. Once the response comes in from the API endpoint, I send it back to the client that initiated the request.

The API endpoint mostly sends back uncompressed JSON and XML responses to the API Proxy. My idea is to use your library to at least compress the response from the API Proxy to the final client.

However, I can't rule out that the developers who manage the API endpoint will implement compressed responses at some point. Therefore I wanted to know whether your library detects already-compressed response bodies and skips re-compression in those cases.

Best regards and thanks, Matthias
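Whether the library does this today would need checking in its source; the detection itself is cheap, since an already-compressed body announces itself via Content-Encoding (sketch, names illustrative):

```csharp
using System;
using System.Linq;
using System.Net.Http;

// Illustrative check: a response that already carries a Content-Encoding
// other than "identity" (gzip, deflate, ...) should be passed through,
// not compressed a second time.
public static class CompressionGuard
{
    public static bool IsAlreadyCompressed(HttpResponseMessage response)
    {
        var encodings = response?.Content?.Headers.ContentEncoding;
        return encodings != null && encodings.Any(e =>
            !string.Equals(e, "identity", StringComparison.OrdinalIgnoreCase));
    }
}
```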

HttpContext gets lost because of .ConfigureAwait(false) in SendAsync() and DecompressContent()

I'm using your server compression package on a WebAPI 2.0 project and have run into an odd problem where HttpContext.Current becomes null in controller actions that handle decompressed request content.

After some digging on StackOverflow, it seems that the .ConfigureAwait(false) calls you're using on every await are the source of the problem. I removed all of them from your code and rebuilt and everything is working fine now.

From what I understand, the ConfigureAwait(false) call tells await not to capture the synchronization context and restore it when it resumes after the await. This has the effect that, after an await, HttpContext.Current becomes null.

If there's a specific reason you're doing .ConfigureAwait(false), I guess you won't want to make any change. But otherwise, it would be greatly helpful if you could remove them. (I'm not a GitHub pro, so submitting a suggested code change isn't something I'll be doing.)

For reference, you might take a look at Stephen Cleary's answer on this SO question: http://stackoverflow.com/questions/24956178/using-httpcontext-current-in-webapi-is-dangerous-because-of-async
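A minimal illustration of the mechanism being described (this assumes it runs inside an ASP.NET request, where a synchronization context exists to be lost):

```csharp
using System.Threading.Tasks;
using System.Web;

public class ContextDemo
{
    // Inside an ASP.NET request, ConfigureAwait(false) tells the awaiter not
    // to resume on the captured AspNetSynchronizationContext, so after the
    // await the code may continue on a thread-pool thread where
    // HttpContext.Current is null.
    public async Task DemoAsync()
    {
        var before = HttpContext.Current;               // the current request's context
        await Task.Delay(10).ConfigureAwait(false);
        var after = HttpContext.Current;                // may now be null
    }
}
```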

Content-Length mismatch on some endpoints

Hi,

Firstly, good work on getting this done. I have been looking for a nice clean solution to this for a while.

While testing this, I have noticed some strange behavior with some of my Web API endpoints. In some cases, I am getting an HTTP exception (flagged through Fiddler) which indicates a mismatch in the content length. Looking at the response, it's plain to see: the declared Content-Length differs from the length of the body.

Having a fish through the code, I moved the AddHeaders() method call from the constructor to immediately after the "await _compressor.Compress(contentStream, stream);" call on line 76 of the CompressedContent.cs class. This actually resolved my problem, but I was wondering if you could shed some light on whether that was the correct thing to do :)

cheers

Kev

Async throttled actions with disabled compression...

A few actions of my controller stream MPEG audio, throttled to limit bandwidth.

I disabled compression on these actions, using the [Compression(Enabled = false)] attribute.

But this doesn't work: the pipeline message handler only checks this setting after it has waited for the response to be generated. This makes the library useless for my project.

 protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
        {
            // Decompress compressed requests to the server
            if (request.Content != null && request.Content.Headers.ContentEncoding.Any())
            {
                await this.DecompressRequest(request);
            }

            var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

            var process = this.enableCompression(request);

            try
            {
                if (response.Content != null)
                {
                    // Buffer content for further processing
                    await response.Content.LoadIntoBufferAsync();
                }
                else
                {
                    process = false;
                }
            }
            catch (Exception ex)
            {
                process = false;

                Debug.WriteLine(ex.Message);
            }

            // Compress uncompressed responses from the server
            if (process && response.Content != null && request.Headers.AcceptEncoding.Any())
            {
                await this.CompressResponse(request, response);
            }

            return response;
        }

Not compressed in new ASP.NET MVC 5 app

I have installed the Microsoft.AspNet.WebApi.MessageHandlers.Compression.Server package and added the handler

private static void RegisterMessageHandlers(this HttpConfiguration config)
        {
            config.MessageHandlers.Insert(0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
        }

Unfortunately, as I see in Fiddler, compression is not enabled:

HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json
Expires: -1
Server: Microsoft-IIS/10.0
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Thu, 17 May 2018 10:51:26 GMT
Transfer-Encoding: chunked

Is there something I have missed?

HttpContext null exception

I just hooked up this library and I am getting a "System.ArgumentNullException: Value cannot be null. Parameter name: httpContext" exception. I am using StructureMap for dependency injection, which uses HttpContext.Current.

Here is the stack trace:

(screenshot of the stack trace attached)

Any ideas?

Possible issue while redirecting after de-compression

Nice library; it's very easy to integrate and use.

I observed that HttpContentOperations.DecompressContent() copies all the headers as-is; since it has performed the decompression, some of the headers, such as Content-Encoding and Content-Length, should change.

In my case I'm sending a POST with compressed data, and this request needs to be forwarded to a different host after decompression; that forwarding fails.

What are your thoughts on this? I would be happy to contribute a fix if this looks like a generic improvement.

Thanks
Chirag

Error 504 when service tries to compress image

I have a WebApi service with a controller that returns an image response, which basically looks like:

    public HttpResponseMessage Get([FromUri]int id)
    {
        var c = this.Db.Photos.Find(id);
        if (c == null || c.Photo == null || c.PhotoType == null)
            return new HttpResponseMessage(HttpStatusCode.NotFound);

        var result = new HttpResponseMessage(HttpStatusCode.OK);
        result.Content = new ByteArrayContent(c.Photo);
        result.Content.Headers.ContentType = new MediaTypeHeaderValue(c.PhotoType);
        return result;
    }

This works fine in all cases except when the compression handler is enabled: I get a 504 error response when it is, yet all the JSON-based responses continue to work fine. I can also make it work in Fiddler by copying the same request that fails, removing the Accept-Encoding header (effectively disabling the compressor for that request) and retrying. This was working up until the last release, so it seems to be a recent bug?

Let me know if you need any more information from me. Thanks.

Minimum compression size?

It would be nice to have a handler that only compresses responses larger than, say, 4 KB, since compressing small amounts of data can make the result larger than the original, while also requiring extra work on both the client and the server. I have looked into ways to do this, but I am not sure it's possible. Thoughts?
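For what it's worth, other issues in this tracker show a constructor overload that does exactly this, taking a contentSizeThreshold as its first argument:

```csharp
// Only compress responses whose content is at least 4096 bytes
// (registration in App_Start\WebApiConfig.cs).
config.MessageHandlers.Insert(0,
    new ServerCompressionHandler(4096, new GZipCompressor(), new DeflateCompressor()));
```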

Compressing even when Accept-Encoding is null

I am making a console app and, with the help of EasyHttp, calling the server. The problem is that the server is still compressing the data even when Accept-Encoding is null. Please help me pin down the issue.

Does not compress in PostAsJsonAsync with threshold

I am using following test code

var client = new HttpClient(new ClientCompressionHandler(
  this.server, 20, new GZipCompressor(), new DeflateCompressor()));

client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip"));
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate"));

var response = await client.PostAsJsonAsync("http://localhost:55399/api/test", 
  new { text = "lalelu die katze fliegt in schuh"});

In ClientCompressionHandler.CompressRequest(HttpRequestMessage request) no compression is done since request.Content.Headers.ContentLength is null.

Not sure if this makes sense, but the following change helps in the case of ObjectContent where ContentLength == null.

    private async Task CompressRequest(HttpRequestMessage request)
    {
        // As per RFC2616.14.3:
        // Ignores encodings with quality == 0
        // If multiple content-codings are acceptable, then the acceptable content-coding with the highest non-zero qvalue is preferred.
        var compressor = (from encoding in request.Headers.AcceptEncoding
                          let quality = encoding.Quality ?? 1.0
                          where quality > 0
                          join c in this.Compressors on encoding.Value.ToLowerInvariant() equals
                              c.EncodingType.ToLowerInvariant()
                          orderby quality descending
                          select c).FirstOrDefault();

        if (compressor != null)
        {
            try
            {
                // BEGIN
                if (request.Content.Headers.ContentLength == null)
                    await request.Content.LoadIntoBufferAsync();
                // END

                // Only compress request if size is larger than threshold (if set)
                if (this.contentSizeThreshold == 0)
                {
                    request.Content = new CompressedContent(request.Content, compressor);
                }
                else if (this.contentSizeThreshold > 0 && request.Content.Headers.ContentLength >= this.contentSizeThreshold)
                {
                    request.Content = new CompressedContent(request.Content, compressor);
                }
            }
            catch (Exception ex)
            {
                throw new Exception(string.Format("Unable to compress request using compressor '{0}'", compressor.GetType()), ex);
            }
        }
    }

Content headers missing from the request during compression

Hi,

I see that the content headers are missing from the request during compression when the request size is more than 860 bytes. After debugging, I found that the headers are ignored in the CopyTo() extension method (here and here). I had a Content-Length header in the request, which led CopyTo() to return without copying the remaining headers from the request to the target. Hence the headers went missing from the request. I feel that instead of a return statement it should have been continue, so that the other headers can still be copied to the target.

Please let me know if I am doing something wrong.

Thanks in advance.

Make compression opt-in

Currently, compression is enabled by default and can be disabled for a specific API (or controller) using [Compression(Enabled = false)].

I'd like to be able to have compression disabled by default (maybe via an extra parameter passed to the ServerCompressionHandler constructor) and then selectively enable it for specific APIs (or controllers) with [Compression(Enabled = true)].
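One shape this could take, sketched as a hypothetical constructor flag (none of this exists in the library today; the enabledByDefault parameter is invented for illustration):

```csharp
// Hypothetical API sketch for the requested opt-in mode: compression would be
// off unless an action or controller opts in explicitly.
config.MessageHandlers.Insert(0,
    new ServerCompressionHandler(enabledByDefault: false,
        new GZipCompressor(), new DeflateCompressor()));

// [Compression(Enabled = true)]  // applied per action/controller to opt in
```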

Header ContentLength is null after compression

Hello,

I'm trying to track the amount of data I have sent, so I placed a handler after the ClientCompressionHandler:
var clientCompressionHandler = new ClientCompressionHandler(40960, new GZipCompressor(), new DeflateCompressor()) { InnerHandler = dataTracker };

If the size of the content is less than 40960 and no compression is made, the ContentLength header is set; but if compression is done, the ContentLength header is always null.

EDIT1:
The ContentLength is always null BEFORE the compression handler does SendAsync, but the handler only sets it to a value when it doesn't compress.
