haskell / http
Haskell HTTP package
Home Page: http://hackage.haskell.org/package/HTTP
License: Other
I'm glad to see the new auto-proxy support, much more convenient!
A closely related feature: more complicated proxy configurations have exception lists, e.g. no_proxy=localhost, listing servers (usually on the local LAN) that should not go via the proxy. Windows allows this, and a quick search suggests that on unix the no_proxy environment variable serves the same purpose; at least wget and w3m honour it.
I know of a user who actually needs a setup like this: everything goes through a proxy, but they have some servers on localhost which the proxy is of course not aware of.
So I think what we'd need is to extend the Proxy constructor with such a list, and then, when we're about to make a connection, check whether the target host is on the list; if so, make a direct connection rather than going via the proxy.
Moved from haskell/cabal#2035.
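A minimal base-only sketch of the proposed check; bypassesProxy and the list representation are hypothetical, not part of the current Proxy API:

```haskell
import Data.Char (toLower)
import Data.List (isSuffixOf)

-- Hypothetical sketch (base only): decide whether a target host should
-- bypass the proxy, given the entries of a no_proxy-style exception
-- list. Neither the function nor the list field exists in the package.
bypassesProxy :: [String] -> String -> Bool
bypassesProxy exceptions host = any matches exceptions
  where
    h = map toLower host
    matches entry = e == h || ('.' : e) `isSuffixOf` h
      where e = dropWhile (== '.') (map toLower entry)
```

An entry matches either exactly or as a domain suffix, so ".example.com" and "example.com" both cover "www.example.com" without accidentally matching "notexample.com".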
Most pertinent: compiling and running this program with GHC 7.8.3 produces the following on my Windows 8.1 x64 machine. I have tried to reproduce it on VMs but can't.
http-test> ghc --make Main.hs -o test.exe -prof -auto-all -caf-all
[1 of 1] Compiling Main ( Main.hs, Main.o )
Linking test.exe ...
http-test> .\test.exe +RTS -xc
Trying to download http://hackage.haskell.org/packages/archive/00-index.tar.gz
Sending:
GET /packages/archive/00-index.tar.gz HTTP/1.1
Host: hackage.haskell.org
User-Agent: cabal-install/1.18.0.5
Creating new connection to hackage.haskell.org
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
Received:
HTTP/1.1 301 Moved Permanently
Server: nginx/1.6.0
Date: Mon, 18 Aug 2014 22:29:21 GMT
Content-Type: text/plain; charset=UTF-8
Connection: keep-alive
Location: /packages/index.tar.gz
Content-Length: 0
301 - redirect
Redirecting to http://hackage.haskell.org/packages/index.tar.gz ...
Sending:
GET /packages/index.tar.gz HTTP/1.1
Host: hackage.haskell.org
User-Agent: cabal-install/1.18.0.5
Recovering connection to hackage.haskell.org
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
*** Exception (reporting due to +RTS -xc): (THUNK_1_0), stack trace:
Main.main,
called from :Main.CAF:main
test.exe: <socket: 336>: hGetBufSome: failed (Unknown error)
Include support for base-4.7.0.0
I get the following error message when using the newest GHC HEAD:
Network/BufferType.hs:57:10:
Illegal instance declaration for ‘BufferType String’
(All instance types must be of the form (T a1 ... an)
where a1 ... an are *distinct type variables*,
and each type variable appears at most once in the instance head.
Use -XFlexibleInstances if you want to disable this.)
In the instance declaration for ‘BufferType String’
Failed to install HTTP-4000.0.7
The library builds fine with the latest version of conduit
after the restrictions are removed from the cabal file.
Because of the strict parsing of URIs by Network.URI.parseURIReference, parseRequestHead fails on this "almost valid" URI:
ghci> parseRequestHead ["GET http://fonts.googleapis.com/css?family=Roboto:300|Open+Sans:700|Open+Sans:300&lang=en HTTP/1.1"]
Left (ErrorParse "parseRequestHead Request command line parse failure: GET http://fonts.googleapis.com/css?family=Roboto:300|Open+Sans:700|Open+Sans:300&lang=en HTTP/1.1")
Replacing the pipe characters with %7C allows the URI to parse.
receiveHTTP, or maybe parseRequestHead, should probably escape the characters that are considered invalid in URIs before sending the string through parseURIReference. In that regard, escapeURIString isAllowedInURI from Network.URI may be handy.
(Something about "be conservative in what you send, be liberal in what you accept" prompted me to raise this issue.)
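As a rough base-only illustration (the real fix would presumably call Network.URI.escapeURIString; isAllowedInURI' here is only an approximation of the library's predicate):

```haskell
import Data.Char (intToDigit, isAlphaNum, ord, toUpper)

-- Base-only sketch: percent-encode any character that may not appear
-- literally in a URI before handing the string to parseURIReference.
-- isAllowedInURI' is a rough approximation of Network.URI's predicate,
-- and the encoding is ASCII-only for brevity.
isAllowedInURI' :: Char -> Bool
isAllowedInURI' c = isAlphaNum c || c `elem` "-_.~:/?#[]@!$&'()*+,;=%"

escapeUnsafe :: String -> String
escapeUnsafe = concatMap esc
  where
    esc c
      | isAllowedInURI' c = [c]
      | otherwise = '%' : map toUpper [intToDigit (ord c `div` 16), intToDigit (ord c `mod` 16)]
```

Applied to the request line above, the pipe characters become %7C and the rest of the URI is left alone.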
Is it possible to stop relying on old-time in HTTP? It would be nice to drop it and start flushing the old-* libraries out of the core ecosystem.
I am not able to understand what the auRealm and auSite fields in Authority's AuthBasic constructor are for. In the code for withAuthority they are not used, and even the Wikipedia article on basic auth doesn't mention them in its client-side description.
I'm using HEAD to check the size of a very large URL before downloading it, so it's surprising that when there's a 302 redirect, it's followed using GET.
Hello,
I am trying to build the HTTP-4000.2.6 package from source. I run:
cabal configure
Resolving dependencies...
Configuring HTTP-4000.2.6...
And then:
cabal build
Preprocessing library HTTP-4000.2.6...
Preprocessing test suites for HTTP-4000.2.6...
Building HTTP-4000.2.6...
[ 3 of 18] Compiling Paths_HTTP ( dist/build/autogen/Paths_HTTP.hs, dist/build/Paths_HTTP.o )
dist/build/autogen/Paths_HTTP.hs:21:13: Not in scope: `catch'
dist/build/autogen/Paths_HTTP.hs:22:13: Not in scope: `catch'
dist/build/autogen/Paths_HTTP.hs:23:14: Not in scope: `catch'
dist/build/autogen/Paths_HTTP.hs:24:17: Not in scope: `catch'
I use ghc-7.6.1, and catch was removed from the Prelude in ghc-7.6.
Thank you.
This was originally reported as a Cabal bug, but I believe it belongs here:
The value of HTTP_PROXY has the format http://user:pass@host:80/. The password can contain special characters, like @ and !. According to Wikipedia (http://en.wikipedia.org/wiki/Percent-escape) these characters must be percent-encoded, as %40 and %21. Cabal doesn't accept percent-encoded characters, and it errs on the raw characters too: if the password is a@b, Cabal doesn't look at the right site, but probably at "b@host:80" (the error message is not clear enough to be certain).
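A base-only sketch of the decoding step the report asks for; percentDecode is a hypothetical helper, intended to be applied to the userinfo part of the proxy URI:

```haskell
import Data.Char (chr, digitToInt, isHexDigit)

-- Sketch: decode percent-escapes in the userinfo part of an http_proxy
-- value, so that a password written as "a%40b" is used as "a@b".
-- Malformed or trailing '%' sequences are passed through unchanged.
percentDecode :: String -> String
percentDecode ('%':h:l:rest)
  | isHexDigit h && isHexDigit l =
      chr (digitToInt h * 16 + digitToInt l) : percentDecode rest
percentDecode (c:rest) = c : percentDecode rest
percentDecode []       = []
```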
I discovered an issue with the http lib when attempting to use cabal, see this issue for reference:
haskell/cabal#2455
Here's the output I receive when running this test program:
Trying to download http://hackage.haskell.org/packages/archive/00-index.tar.gz
Sending:
GET /packages/archive/00-index.tar.gz HTTP/1.1
Host: hackage.haskell.org
User-Agent: cabal-install/1.18.0.5
Creating new connection to hackage.haskell.org
http error: Network.Browser.request: Error raised ErrorParse "Invalid response: Response code test.hs: Char.intToDigit: not a digit -1
Thanks for taking a look! Please let me know if there's anything I can do to help resolve this; I'm kind of stuck in what I'm doing right now.
When urlEncodeVars encounters keys with multiple values, it separates the values with commas. For example:
urlEncodeVars [("k", "1"), ("k", "2")]
-- "k=1,2"
In my experience, the correct way to encode arrays like this is to append [] to the key and pass each element as a key-value pair. For instance:
urlEncode "[]"
-- "%5B%5D"
urlEncodeVars' [("k", "1"), ("k", "2")]
-- "k%5B%5D=1k%5B%5D=2"
Is it possible to change the behavior of urlEncodeVars to match my expectation?
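A sketch of what such a variant could look like (urlEncodeVars' is hypothetical, and the encoding is ASCII-only for brevity):

```haskell
import Data.Char (intToDigit, isAlphaNum, ord, toUpper)
import Data.List (intercalate)

-- Sketch of the requested behaviour: repeated keys become "k[]=v"
-- pairs joined with '&', rather than "k=v1,v2". urlEncodeVars' is a
-- hypothetical name, not part of the HTTP package.
encodeChar :: Char -> String
encodeChar c
  | isAlphaNum c || c `elem` "-_.~" = [c]
  | otherwise = '%' : map toUpper [intToDigit (ord c `div` 16), intToDigit (ord c `mod` 16)]

urlEncodeVars' :: [(String, String)] -> String
urlEncodeVars' vars =
  intercalate "&"
    [ concatMap encodeChar (k ++ "[]") ++ "=" ++ concatMap encodeChar v
    | (k, v) <- vars ]
```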
There are some API changes so this doesn't work trivially. Not sure if we need to simultaneously support warp 2 - need to check what works with GHC 7.4 and any old library versions we still want to support.
RFC2617 (https://tools.ietf.org/html/rfc2617) says about qop and related fields (key bits highlighted):
qop
Indicates what "quality of protection" the client has applied to
the message. If present, its value MUST be one of the alternatives
the server indicated it supports in the WWW-Authenticate header.
These values affect the computation of the request-digest. Note
that this is a single token, not a quoted list of alternatives as
in WWW-Authenticate. This directive is optional in order to
preserve backward compatibility with a minimal implementation of
RFC 2069 [6], but SHOULD be used if the server indicated that qop
is supported by providing a qop directive in the WWW-Authenticate
header field.
cnonce
This MUST be specified if a qop directive is sent (see above), and
MUST NOT be specified if the server did not send a qop directive in
the WWW-Authenticate header field. The cnonce-value is an opaque
quoted string value provided by the client and used by both client
and server to avoid chosen plaintext attacks, to provide mutual
authentication, and to provide some message integrity protection.
See the descriptions below of the calculation of the response-
digest and request-digest values.
nonce-count
This MUST be specified if a qop directive is sent (see above), and
MUST NOT be specified if the server did not send a qop directive in
the WWW-Authenticate header field. The nc-value is the hexadecimal
count of the number of requests (including the current request)
that the client has sent with the nonce value in this request. For
example, in the first request sent in response to a given nonce
value, the client sends "nc=00000001". The purpose of this
directive is to allow the server to detect request replays by
maintaining its own copy of this count - if the same nc-value is
seen twice, then the request is a replay. See the description
below of the construction of the request-digest value.
That is, if the client sends qop=auth then it MUST also send nc=xxx and cnonce=xxx. Servers are well within their rights to reject requests without these as malformed.
Currently the HTTP package does not support the nc or cnonce fields, and yet it will automatically send qop=auth if the server offered it. Either this support should be disabled or the nc/cnonce support should be added. That is, it should either follow RFC2069 (which has no notion of qop) or RFC2617 (which introduces qop and its related directives), but not a partial mixture of the two.
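For reference, a minimal sketch of the nonce-count formatting that adding the support would involve (the function name is hypothetical):

```haskell
import Numeric (showHex)

-- Sketch: RFC 2617's nonce-count is an 8-digit hexadecimal counter of
-- the requests sent with the current nonce, so the first request for a
-- given nonce carries nc=00000001.
formatNonceCount :: Int -> String
formatNonceCount n = "nc=" ++ replicate (8 - length hex) '0' ++ hex
  where hex = showHex n ""
```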
We currently have a problem with the hackage-server (haskell/hackage-server#199) where we want to support RFC2617-style digest auth (because a certain browser only supports the newer RFC), but if we send qop=auth then the HTTP package (and thus cabal etc.) returns broken responses. The server is currently correctly rejecting requests that use qop=auth without the other required fields. We will probably have to relax that and allow these incorrect requests. But other servers may not be so forgiving, and we should fix it. The easiest thing would be to simply stop sending qop=auth, e.g.
diff --git a/Network/HTTP/Auth.hs b/Network/HTTP/Auth.hs
index 4af0d67..5dad7cc 100644
--- a/Network/HTTP/Auth.hs
+++ b/Network/HTTP/Auth.hs
@@ -95,7 +95,8 @@ withAuthority a rq = case a of
-- plus optional stuff:
, fromMaybe "" (fmap (\ alg -> ",algorithm=" ++ quo (show alg)) (auAlgorithm a))
, fromMaybe "" (fmap (\ o -> ",opaque=" ++ quo o) (auOpaque a))
- , if null (auQop a) then "" else ",qop=auth"
+ --TODO: we currently do not support qop=auth or auth-int
+ -- if we send qop=auth then we MUST also send 'nc' and 'cnonce'
]
If a webserver doesn't implement keep-alive (e.g. httpd-shed as currently used in the test harness), HTTP doesn't notice and still tries to send a second request on the same connection.
Prelude Network.HTTP> urlEncode "ололо"
"%04%3E%04%3B%04%3E%04%3B%04%3E"
while it should be "%D0%BE%D0%BB%D0%BE%D0%BB%D0%BE"
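A base-only sketch of the fix: percent-encode the UTF-8 bytes of each character rather than its raw code point (this sketch covers code points up to U+FFFF only and passes ASCII unreserved characters through):

```haskell
import Data.Bits (shiftR, (.&.))
import Data.Char (intToDigit, isAlphaNum, ord, toUpper)

-- Sketch: encode each character's UTF-8 bytes, so Cyrillic text comes
-- out as %D0%BE... rather than mangled code-unit bytes.
utf8Bytes :: Char -> [Int]
utf8Bytes c
  | n < 0x80  = [n]
  | n < 0x800 = [0xC0 + shiftR n 6, 0x80 + n .&. 0x3F]
  | otherwise = [0xE0 + shiftR n 12, 0x80 + shiftR n 6 .&. 0x3F, 0x80 + n .&. 0x3F]
  where n = ord c

urlEncodeUtf8 :: String -> String
urlEncodeUtf8 = concatMap enc
  where
    enc c
      | ord c < 128 && isAlphaNum c || c `elem` "-_.~" = [c]
      | otherwise = concatMap percent (utf8Bytes c)
    percent b = '%' : map toUpper [intToDigit (b `div` 16), intToDigit (b `mod` 16)]
```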
Here's an interesting data point in the PVP discussion. Pinging @tibbe.
network version 2.5 was just released. HTTP has had a preemptive upper bound on network 2.5 since version 4000.2.5. I wanted to test warp against network 2.5, so I bumped the upper bounds on network in conduit-extra and ran cabal install --enable-tests . warp yesod-core. I then got the following error message:
Network/HTTP/Auth.hs:192:49:
Couldn't match expected type `Maybe URI' with actual type `URI'
In the second argument of `fromMaybe', namely
`(u `relativeTo` baseURI)'
In the expression: fromMaybe u (u `relativeTo` baseURI)
In the first argument of `map', namely
`(\ u -> fromMaybe u (u `relativeTo` baseURI))'
Failed to install HTTP-4000.2.4
Because versions following 4000.2.4 have preemptive upper bounds on network, cabal-install is trying to install an older version, which causes the build failure. If I open up the newest version of HTTP and fix the upper bounds, installation proceeds correctly.
So immediate bug report is: please bump the upper bound on the network package. Medium term bug report is: can the upper bounds listed on packdeps all be relaxed?
But a more serious question is: since HTTP didn't always follow the PVP, does that means it can't logically get any of the PVP benefits any more? Or do all of the old versions need to be deprecated?
It would be nice to have an instance of HStream that allows streaming without using lazy I/O; in particular, an instance that would allow us to close the connection when we no longer want the rest of the response body.
However, in order to define a custom type with a corresponding HStream instance one needs to provide, amongst other things, implementations of
openStream :: String -> Int -> IO (HandleStream bufType)
close :: HandleStream bufType -> IO ()
For the implementation of openStream we can use
openTCPConnection :: BufferType ty => String -> Int -> IO (HandleStream ty)
However, for the implementation of close we are stuck; there is no function analogous to openTCPConnection that allows us to close a HandleStream.
Although we have
hstreamToConnection :: HandleStream String -> Connection
and there is a Stream instance for Connection, hstreamToConnection works only for HandleStream String, so that doesn't help either.
Finally, the HandleStream type is completely opaque (it does not even have a Functor instance), so we cannot define an HStream instance for one bufType in terms of the HStream instance for another. If we had some way of defining new instances in terms of old ones (for instance, reusing the instance for lazy bytestrings but somehow retaining a handle to the corresponding HandleStream so that we can close it explicitly), that would be great.
It would be very nice, especially for lambdabot (where it's more important to get a fast answer than to get one at all), to be able to configure the connection timeout for HTTP requests, preferably with sub-second resolution.
Currently, GHC 8.0.1-rc2 cannot compile HTTP: http://hydra.cryp.to/build/1606798/nixlog/1/raw. It would be great to get an update on Hackage that remedies this issue.
In ghci:
ghci > urlDecode "http://twitter.com/fred/statuses/200"
"http://twitter.com/fred/statuses/200"
ghci > urlDecode "http://twitter.com/%user_screen_name%/statuses/%id%"
"http://twitter.com/*** Exception: Char.digitToInt: not a digit 'u'
It would be vastly nicer if urlDecode's type were something like String -> Either String String.
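A sketch of such a total decoder (the Either-returning signature is the reporter's suggestion; the function name is hypothetical):

```haskell
import Data.Char (chr, digitToInt, isHexDigit)

-- Sketch of a total decoder: a malformed escape yields Left instead of
-- calling Char.digitToInt on a non-digit and crashing.
urlDecodeEither :: String -> Either String String
urlDecodeEither ('%':h:l:rest)
  | isHexDigit h && isHexDigit l =
      (chr (digitToInt h * 16 + digitToInt l) :) <$> urlDecodeEither rest
urlDecodeEither ('%':rest) = Left ("invalid escape near: %" ++ take 2 rest)
urlDecodeEither (c:rest)   = (c :) <$> urlDecodeEither rest
urlDecodeEither []         = Right ""
```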
A cabal-install user ran into this: http://www.haskell.org/pipermail/ghc-devs/2014-May/004928.html
In general I think HTTP ought to follow any standard or agreed practice for the use of this variable. I see the plausible options as:
I can't find any official specification of what http_proxy should mean and some googling suggests that many other people seem to favour 3) as well - see e.g. pypa/pip#478 - so I am inclined towards doing that.
The fairly simple code change would go here: https://github.com/haskell/HTTP/blob/master/Network/HTTP/Proxy.hs#L64
Hello,
In the documentation for simpleHTTP it is not mentioned that it raises an IO exception if the URL scheme is https (through failHTTPS). It should be mentioned in the docs, or even better, it should be caught and returned as a Left (maybe ErrorMisc).
Best regards,
vlatko
When I try to access IPv6-only sites, I get
user error (openTCPConnection: host lookup failure for "foo.bar")
Reproduced on GHC 7.4.2, 7.6.3, and 7.8.2, using cabal-install 1.16 for the first two and 1.18 with GHC 7.8.
Start with a clean package environment, then run:
cabal install cabal-install-1.16.0.2 --dry-run
Due to inconsistent upper bounds on transformers, the build plan selects HTTP-4000.0.7. At least on GHC 7.4.2, this package install ultimately fails with:
[ 7 of 15] Compiling Network.BufferType ( Network/BufferType.hs, dist/build/Network/BufferType.o )
Network/BufferType.hs:57:10:
Illegal instance declaration for `BufferType String'
(All instance types must be of the form (T a1 ... an)
where a1 ... an are *distinct type variables*,
and each type variable appears at most once in the instance head.
Use -XFlexibleInstances if you want to disable this.)
In the instance declaration for `BufferType String'
Failed to install HTTP-4000.0.7
cabal: Error: some packages failed to install:
HTTP-4000.0.7 failed during the building phase. The exception was:
ExitFailure 1
cabal-install-1.16.0.2 depends on HTTP-4000.0.7 which failed to install.
This is the exact same issue as #55. This problem is likely to cause wide-spread breakage until a new version of HTTP is released which is compatible with transformers 0.4.
Quoting haskell/cabal#1962:
In order to avoid #1602 I upgraded HTTP to version 4000.2.17. But I noticed a new (minor) problem: If the proxy mentioned in http_proxy is down, I get
(sid)root@kirk:~# unset http_proxy
(sid)root@kirk:~# cabal update
Downloading the latest package list from hackage.haskell.org
(sid)root@kirk:~# export http_proxy=http://localhost:3128/
(sid)root@kirk:~# cabal update
Downloading the latest package list from hackage.haskell.org
cabal: does not exist
# now starting the proxy
(sid)root@kirk:~# cabal update
Downloading the latest package list from hackage.haskell.org
Certainly the error message could be more helpful.
There are two new versions of HTTP released on Hackage (4000.2.7 and 4000.2.8), but the version in the .cabal file in this repo is 4000.2.6. Is this still the latest version of the code?
What's the reason for not having getRequest and postRequest return Request a, i.e.:
getRequest :: BufferType a => String -> Request a
postRequest :: BufferType a => String -> Request a
If they were like this, it would be much easier to tell the library to use (Lazy) ByteStrings instead of Strings. The simplest way I could do that with what's in the library at this point is this:
request :: URI -> IO ByteString
request url = simpleHTTP (defaultGETRequest_ url) >>= getResponseBody
While it's not necessarily ugly or bad, I think it would be much nicer to have polymorphic getRequest/postRequest functions.
http://hackage.haskell.org/package/HTTP says that the home page is http://projects.haskell.org/http/ , but that should probably be changed to https://github.com/haskell/HTTP .
Hi, when I try to compile the current release of HTTP
with GHC 6.12.3, the configure phase fails because of these errors:
Setup: At least the following dependencies are missing:
array >=0.3.0.2 && <0.6, base >=4.3.0.0 && <4.9
If I just remove these constraints, however, the package compiles just fine:
Configuring HTTP-4000.2.19...
Flags chosen: network-uri=True, warp-tests=True, conduit10=False,
network23=False, warn-as-error=False, mtl1=False
Dependency array -any: using array-0.3.0.1
Dependency base -any: using base-4.2.0.2
Dependency bytestring -any: using bytestring-0.9.1.7
Dependency mtl >=2.0 && <2.3: using mtl-2.2.1
Dependency network ==2.6.*: using network-2.6.0.2
Dependency network-uri ==2.6.*: using network-uri-2.6.0.1
Dependency old-time -any: using old-time-1.0.0.5
Dependency parsec -any: using parsec-3.1.8
Is there any hidden problem with the old base and array libraries that I should be aware of? Or is it possible that the specified version constraints are just too narrow?
This is related to #14, which may not have been fixed properly. When using cabal update built with HTTP-4000.2.17 and a squid2 proxy, I get cabal: does not exist. A friend of mine did some debugging and found out:
it seems that cabal does a GET http://hackage.haskell.org/packages/archive/00-index.tar.gz
squid responds with a cache hit 301 and closes the connection.
cabal tries to use pipelining and fails
it doesn't try a new connection
It does not happen with squid3, or perl's HTTP::Proxy.
We should support the Text datatype. I don't think there are any dependency issues with doing so given that text is in the Platform and in any case HTTP already depends indirectly on text via network and parsec 3.
However any instance would have the same problems with encoding as in #28, so it would make sense to fix that first.
Currently HTTP does not cabal install without special instructions to use network < 2.6.
Building HTTP-4000.0.7...
Preprocessing library HTTP-4000.0.7...
Network/HTTP.hs:75:8:
Could not find module ‘Network.URI’
Perhaps you meant Network.BSD (from network-2.6.0.1)
Use -v to see a list of the files searched for.
Please update to use the new network-uri package, using a network-uri flag. Instructions here.
I cannot build tests on ghc-7.6 due to:
https://github.com/haskell/HTTP/blob/master/HTTP.cabal#L114
This is just a request to permit ghc-7.6 as a build tool for the HTTP package.
Hi, it would be great to get a new version of HTTP that supports network 2.6.x. I tried patching the Cabal file to lift the restrictions, but that didn't quite do it. Apparently the Network.URI module no longer exists in the latest version of network.
Thanks for the great library! I just replaced curl with HTTP in my project. It went very well in general, but here are some findings:
getResponseCode (Left err) = fail $ show err
getResponseCode (Right (Response (a, b, c) _ _ _)) = return $ a*100 + b*10 + c
As you can see, I converted the three digits into one Int, but that's just a matter of opinion, right?
Anyway, thanks for your great work!
When an https:// URL is used, we just silently fall back to a non-encrypted connection on port 443. We should refuse to do this until SSL is supported.
The relevant error is:
Network/Browser.hs:968:3:
Non type-variable argument
in the constraint: MonadState (BrowserState connection) m
(Use FlexibleContexts to permit this)
In the context: (HStream ty,
MonadState (BrowserState connection) m,
MonadIO m)
While checking the inferred type for ‘dorequest2’
In an equation for ‘dorequest’:
dorequest hst rqst
= do { pool <- gets bsConnectionPool;
let uPort = ...;
conn <- liftIO
$ filterM
(\ c -> c `isTCPConnectedTo` EndPoint (uriRegName hst) uPort) pool;
.... }
where
dorequest2 c r
= do { dbg <- gets bsDebug;
.... }
A complete build log is at http://hydra.cryp.to/build/52695/nixlog/1/raw.
Relevant log:
Creating new connection to hackage.haskell.org
Received:
HTTP/1.1 301 Moved Permanently
Server: nginx/1.6.2
Content-Type: text/plain; charset=UTF-8
Location: /package/unix-time-0.2.2/unix-time-0.2.2.tar.gz
Transfer-Encoding: chunked
Accept-Ranges: bytes
Date: Tue, 19 May 2015 11:21:00 GMT
Via: 1.1 varnish
Age: 0
Connection: keep-alive
X-Served-By: cache-fra1224-FRA
X-Cache: MISS
X-Cache-Hits: 0
X-Timer: S1432034460.707740,VS0,VE133
Content-Length: 0
301 - redirect
Redirecting to
http://hackage.haskell.org/package/unix-time-0.2.2/unix-time-0.2.2.tar.gz ...
Sending:
GET /package/unix-time-0.2.2/unix-time-0.2.2.tar.gz HTTP/1.1
Host: hackage.haskell.org
User-Agent: cabal-install/1.22.0.0 (linux; x86_64)
Recovering connection to hackage.haskell.org
It was stuck for approximately 10-15 minutes on this and then the download finished. The download finishes within a few seconds when using wget.
http version: 4000.2.10
At the moment the user agent string is an independent string in Network/HTTP/Base.hs, which means I keep forgetting to update it when I bump the version. There should be only one place where the version is determined.
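One possible shape for the fix, as a self-contained sketch: take the version from the cabal-generated Paths_HTTP module. A stand-in version value and a hypothetical user-agent prefix are used here, since the generated module only exists at build time:

```haskell
import Data.Version (Version, makeVersion, showVersion)

-- In the real package this value would come from the cabal-generated
-- Paths_HTTP module (import Paths_HTTP (version)); a stand-in is used
-- here so the sketch compiles on its own.
version :: Version
version = makeVersion [4000, 2, 10]

-- Hypothetical user-agent prefix; the point is that the version part
-- is no longer written out by hand.
defaultUserAgent :: String
defaultUserAgent = "haskell-HTTP/" ++ showVersion version
```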
Although the package works with String, and indeed encourages its use by having that be the default type in some of the API helper functions, the handling of non-ASCII data is completely broken. We should be doing encoding properly to be consistent with the Content-Type header, both when sending and receiving.
This has been reported by a couple of users so far and is clearly pretty nasty, but I don't think it's trivial to fix because presumably the receiving and sending sides can use different encodings and when sending requests we ought to be optionally giving the user control of the encoding.
The alternative of removing the String instances completely is likely to be very disruptive so I don't think it's a reasonable option.
My proxy settings are set to the following in Windows:
Type Proxy address to use Port
HTTP 127.0.0.1 3213
HTTPS 127.0.0.1 3213
FTP <empty> <empty>
SOCKS <empty> <empty>
Which leads to the following error when using cabal:
Invalid http proxy uri: "http=127.0.0.1:3213;https=127.0.0.1:3213"
proxy uri must be http with a hostname
ignoring http proxy, trying a direct connection
The registry stores the proxy setting in the format seen above. From what I can see in the source code, no effort is made to parse this string, which is why it is always rejected as invalid.
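A sketch of the missing parsing step (hypothetical helper names; note that a plain "host:port" value without '=' applies to all protocols and is not handled by this sketch):

```haskell
-- Sketch: split a Windows registry ProxyServer value of the form
-- "http=127.0.0.1:3213;https=127.0.0.1:3213" into (scheme, host:port)
-- pairs, so the http entry can be picked out.
splitOn :: Char -> String -> [String]
splitOn sep s = case break (== sep) s of
  (chunk, [])     -> [chunk]
  (chunk, _:rest) -> chunk : splitOn sep rest

parseProxyServer :: String -> [(String, String)]
parseProxyServer val =
  [ (scheme, drop 1 hostPort)
  | entry <- splitOn ';' val
  , let (scheme, hostPort) = break (== '=') entry
  , not (null hostPort) ]

httpProxyOf :: String -> Maybe String
httpProxyOf val = lookup "http" (parseProxyServer val)
```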
Preprocessing test suite 'test' for HTTP-4000.2.13...
test/httpTests.hs:14:18:
Could not find module `Httpd'
Use -v to see a list of the files searched for.
When nothing is listening on localhost:81, with the following code:
(simpleHTTP (getRequest "http://localhost:81/")) >>= (\r -> case r of Left _ -> putStrLn "Error"; Right _ -> putStrLn "Alright!")
instead of printing Error, I get:
*** Exception: connect: does not exist (Connection refused)
To me, and as suggested by someone in #haskell, this seems like unexpected behaviour: I expect the error condition to be passed in the return value.
Thanks!
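A sketch of the wrapping the report expects, using Control.Exception.try to turn the IOException into a Left value (tryIOEither is a hypothetical helper, not part of the package):

```haskell
import Control.Exception (IOException, try)

-- Sketch: wrap an action that may throw IOException (as the connect
-- call does when the port is closed) and surface it as a Left value
-- instead of an uncaught exception.
tryIOEither :: IO a -> IO (Either String a)
tryIOEither act = do
  r <- try act
  pure $ case r of
    Left e  -> Left (show (e :: IOException))
    Right x -> Right x
```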
The front-page example code of how to use Network.Browser.browse doesn't compile:
Couldn't match expected type `Either
Network.Stream.ConnError (Network.HTTP.Base.Response [a0])'
with actual type `(Network.URI.URI,
Network.HTTP.Base.Response String)'
Expected type: Network.Stream.Result
(Network.HTTP.Base.Response [a0])
Actual type: (Network.URI.URI, Network.HTTP.Base.Response String)
In the first argument of `getResponseBody', namely `rsp'
In the second argument of `fmap', namely `(getResponseBody rsp)'
I notice that HTTP (used by cabal) does not support this redirect. I have set up $http_proxy correctly, and the university proxy does not require authentication. The HTTP redirect still fails. Is there a way to solve this problem?
> cabal update --verbose=3
Downloading the latest package list from hackage.haskell.org
Sending:
GET http://hackage.haskell.org/packages/archive/00-index.tar.gz HTTP/1.1
User-Agent: cabal-install/1.20.0.2 (linux; x86_64)
Host: hackage.haskell.org
proxy uri host: proxy.swmed.edu, port: :3128
Creating new connection to proxy.swmed.edu:3128
Received:
HTTP/1.1 302 authenticationrequired
Via: 1.1 129.112.115.41 (McAfee Web Gateway 7.4.2.1.0.17593)
Date: Tue, 29 Jul 2014 14:33:01 GMT
Location: https://m-proxy2.swmed.edu:10000/mwg-internal/de5fs23hu73ds/plugin?target=Auth&reason=Auth&ClientID=851966280&ttl=43200&url=aHR0cDovL2hhY2thZ2UuaGFza2VsbC5vcmcvcGFja2FnZXMvYXJjaGl2ZS8wMC1pbmRleC50YXIuZ3o=&rnd=1406644381
Content-Type: text/html
Cache-Control: no-cache
Content-Length: 3678
Proxy-Connection: Keep-Alive
302 - redirect
Warning: http error: Unable to handle redirect, unsupported scheme: https://m-proxy2.swmed.edu:10000/mwg-internal/de5fs23hu73ds/plugin?target=Auth&reason=Auth&ClientID=851966280&ttl=43200&url=aHR0cDovL2hhY2thZ2UuaGFza2VsbC5vcmcvcGFja2FnZXMvYXJjaGl2ZS8wMC1pbmRleC50YXIuZ3o=&rnd=1406644381
cabal: Failed to download http://hackage.haskell.org/packages/archive/00-index.tar.gz : ErrorMisc "Error HTTP code: 302"
Trying to build using ghc 7.8 on debian linux results in errors:
: nr@yorkie 10181 ; cabal configure
Resolving dependencies...
Configuring HTTP-4000.2.19...
: nr@yorkie 10182 ; cabal build
Preprocessing library HTTP-4000.2.19...
Preprocessing test suites for HTTP-4000.2.19...
Building HTTP-4000.2.19...
[ 1 of 18] Compiling Network.HTTP.MD5Aux ( Network/HTTP/MD5Aux.hs, dist/build/Network/HTTP/MD5Aux.o )
Network/HTTP/MD5Aux.hs:91:10: Warning:
No explicit implementation for
either ‘negate’ or ‘-’
In the instance declaration for ‘Num ABCD’
[ 2 of 18] Compiling Paths_HTTP ( dist/build/autogen/Paths_HTTP.hs, dist/build/Paths_HTTP.o )
dist/build/autogen/Paths_HTTP.hs:21:13: Not in scope: ‘catch’
dist/build/autogen/Paths_HTTP.hs:22:13: Not in scope: ‘catch’
dist/build/autogen/Paths_HTTP.hs:23:14: Not in scope: ‘catch’
dist/build/autogen/Paths_HTTP.hs:24:17: Not in scope: ‘catch’
I also found it a little strange that the version number cloned from GitHub is one less than the one pulled from Hackage. But the Hackage version seems to have similar issues (not building).
Please let me know if I can provide other info.
Could you export rqMethodMap, similar to Network.HTTP.Headers.headerMap? This would make it possible to reuse the mapping table when serializing/deserializing values of RequestMethod (and Request, in particular).