UNPKG is a fast, global content delivery network for everything on npm.
Please visit the UNPKG website to learn more about how to use it.
Our sponsors and backers are listed in SPONSORS.md.
The CDN for everything on npm
Home Page: https://unpkg.com
License: Other
Currently, if someone requests the "bare" URL (a URL w/out a filename) for a package with a `browser` field in its package.json, that file is the one that gets served. For example, when we see a request for /d3 (no filename), since d3 defines `{ "browser": "build/d3.js" }` in its package.json, we redirect to https://unpkg.com/d3/build/d3.js.
I originally introduced this behavior because I wanted to enable people publishing browser-ready files in `browser` to have super clean URLs (e.g. `<script src="//unpkg.com/d3">`), but there's a subtle problem: we have a different interpretation of `browser` than the bundlers do.
This was actually brought up in #25 but for some reason I didn't quite understand the problem at that time. Probably because I hadn't done much work with ES modules yet.
Scenario 1: Imagine a package with the following config:

```js
{
  "main": "index.common.js",  // node, Browserify, unpkg bare URL
  "module": "index.es.js"     // rollup, webpack2, <script type=module>
}
```
In this scenario, everything works great:
You could say it's not perfect because the unpkg bare URL points to a CommonJS module, but we'll set that aside for now.
Scenario 2: Now, let's say you have a browser-specific shim for some of your code, so you add the `browser` field to swap out the node-specific `index.common.js` module with a browser-ready one, `index.browser.js`:

```js
{
  "main": "index.common.js",      // node
  "browser": "index.browser.js",  // Browserify, unpkg bare URL
  "module": "index.es.js"         // rollup, webpack2, <script type=module>
}
```
Now:
The problem is that the unpkg "bare" URL moved as well. The package author just wanted a way to introduce some browser-specific shims, but we actually changed the way we serve the package too. And the `browser` field was never intended as a place to put your UMD build.
Scenario 3: To get around this problem we added support for the `unpkg` field, so you could do:

```js
{
  "main": "index.common.js",      // node
  "browser": "index.browser.js",  // Browserify
  "unpkg": "index.umd.js",        // unpkg bare URL
  "module": "index.es.js"         // rollup, webpack2, <script type=module>
}
```
This is good because now at least people who want to use the `browser` field for what it was originally intended (browser shims) also have a way to override which file is served at the bare URL on unpkg. But it also makes the fact that we fall back to the `browser` field (when there is no `unpkg` field) completely redundant!
Basically all we need to do is this: if you want to make unpkg serve something other than `main` at the bare URL, use the `unpkg` field. That's it. We don't even need a `?main` query parameter, and we never need to do anything with `browser`.
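The rule above can be sketched in a few lines. This is a hypothetical illustration of the proposed behavior, not unpkg's actual code; the fallback to `index.js` is an assumption:

```javascript
// Hypothetical sketch of the proposed bare-URL resolution: prefer the
// "unpkg" field, fall back to "main", and never consult "browser".
function resolveBareEntry(pkg) {
  return pkg.unpkg || pkg.main || "index.js";
}

// Scenario 3 package: the bare URL serves the UMD build, while
// Browserify still sees "browser" and node still sees "main".
resolveBareEntry({
  main: "index.common.js",
  browser: "index.browser.js",
  unpkg: "index.umd.js",
}); // → "index.umd.js"
```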
Anyway, removing support for `?main` and the `browser` field would definitely be a breaking change, so I'd be happy to redirect any old URLs that relied on this behavior to minimize the impact. But going forward, this seems like what the default should actually be. I hate to change this kind of thing because I know it's a pain to update stuff, but I think this should be one of the last breaking changes. The biggest win is that the whole service becomes a little less surprising and easier for newcomers to understand.
ping @mbostock @ngokevin @charlike @Daniel15 @ggoodman @EricSimons
For related discussions, see:
Would it be possible to add a combo service to unpkg.com? For example, concatenating files if we use syntax like:
https://unpkg.com/[email protected]/normalize.css,[email protected]/styles/default.css,[email protected]/dist/css/select2.min.css
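A combo handler would first split such a path into the individual package paths, then fetch and concatenate them. A minimal sketch of just the parsing step (hypothetical; the package names and versions in the example are made up):

```javascript
// Split a combo request path like "/a@1.0.0/a.css,b@2.0.0/b.css"
// into the individual package paths the server would fetch and join.
function splitComboPath(path) {
  return path
    .replace(/^\//, "")   // drop the leading slash
    .split(",")           // one entry per file
    .map((p) => "/" + p); // restore a leading slash per entry
}

splitComboPath("/normalize.css@8.0.0/normalize.css,select2@4.0.4/dist/css/select2.min.css");
// → ["/normalize.css@8.0.0/normalize.css", "/select2@4.0.4/dist/css/select2.min.css"]
```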
The `Expires` header is set to the current time, which prevents any caching.
Firstly, thank you for the service.
Normally everything works fine, but recently we've been seeing some reports of users encountering 403 errors from CloudFlare for hosted libraries. When loading manually, they are prompted with a captcha page before being granted access. Other CloudFlare hosted sites appear to load as expected, so perhaps this is a combination of referer, __cfuid and user-agent triggering extra security. Either way, it requires some manual steps for each user to bypass, and the overall user experience is less than ideal!
If I'm reading rollup/rollup#935 correctly, the security settings were changed to essentially off a few days ago. Could I confirm whether that change was made? If so, maybe there's nothing further that can be done here - but if you have any suggestions or ideas for avoiding this issue, I'd be very interested.
The specific URL we are having difficulty with is this one:
Users are currently reporting that linking to the Ionic CSS file (https://unpkg.com/[email protected]/css/ionic.min.css) is yielding 403s for the other font files etc. that it imports (unpkg.com/[email protected]/fonts/ionicons.woff2?v=3.0.0-alpha.3). Note how the font file has a querystring. This is the error unpkg spits back:

```
Invalid query: {"v":"3.0.0-alpha.3"}
```

I assume this is due to the new `?meta` querystring param you added. We have 10k Ionic projects that are now borked because of this. Do you have any idea how long this will take to fix?
We may be caching auto-generated index pages too aggressively. See facebook/react#9207
Hi
First off, let me say I have full appreciation for the work done in this project and the unpkg CDN in general.
To serious business: you may have just broken a number of websites relying on your service.
Example of issue in the wild:
Unfortunately, this means that with your recent changes completely unbeknownst to us, you have broken our website (and potentially many others). As Linus says: "You do not break user space!"
My suggestion is to NOT return a 403, and simply ignore unknown query parameters. Or you could treat `v` as another accepted query parameter.
Again, thank you very much for your work. If you believe that this is intended behaviour, I respect your judgement, but in the interest of time we will have to revert to self-hosting or hack out our own cache busting.
Not able to read `.js.gz` file.
From @mjackson on January 19, 2017 18:29
One of the immediately actionable ideas that came out of #65 is to create a standalone development server for unpkg, so people can run it on their own machines. I envision this being a small binary with the following API:
```
unpkg [registryURL] [-p port]
```
This would launch an HTTP server on the given port that serves packages out of npm. If we also include the index page, it could possibly be used to serve unpkg.com as well.
Copied from original issue: unpkg/express-unpkg#66
I run into this mostly when using unpkg.com to look at TypeScript type definitions. For example, in https://unpkg.com/@types/[email protected]/ while trying to open `index.d.ts`.
It seems like the `mime` npm package returns `video/mp2t` as the MIME type for `.ts` files.
Would you be open to a PR which added `.ts` to the list of `text/plain` files here?
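One way to do that (a sketch, not unpkg's actual code; the extension list and function names are assumptions) is to check a small text-extension allowlist before deferring to the general `mime` lookup:

```javascript
// Serve known text extensions as text/plain before consulting the general
// mime database, so ".ts" never comes back as video/mp2t.
const TEXT_PLAIN_EXTENSIONS = new Set([".ts", ".flow"]); // assumed list

function contentTypeFor(filename, mimeLookup) {
  for (const ext of TEXT_PLAIN_EXTENSIONS) {
    if (filename.endsWith(ext)) return "text/plain";
  }
  return mimeLookup(filename); // fall back to the mime package's answer
}

contentTypeFor("index.d.ts", () => "video/mp2t"); // → "text/plain"
```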
Given babel-core@5/package.json, specifically:

```js
"browser": {
  "./lib/api/register/node.js": "./lib/api/register/browser.js"
},
```

I would expect unpkg to respect this mapping when the `?main` flag is appended. For example:
| path requested | file served |
| --- | --- |
| `/lib/api/register/node.js` | `/lib/api/register/node.js` |
| `/lib/api/register/node.js?meta` | `/lib/api/register/browser.js` |
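Applying the object form of the `browser` field is a simple lookup. A hypothetical sketch of honoring the mapping, using the babel-core entry above:

```javascript
// Apply a package.json "browser" object mapping the way bundlers do:
// if the requested file has an entry, serve the substitute instead.
function applyBrowserField(browserField, file) {
  if (browserField && typeof browserField === "object" && browserField[file]) {
    return browserField[file];
  }
  return file;
}

const browser = { "./lib/api/register/node.js": "./lib/api/register/browser.js" };
applyBrowserField(browser, "./lib/api/register/node.js"); // → "./lib/api/register/browser.js"
applyBrowserField(browser, "./lib/api/node.js");          // → "./lib/api/node.js"
```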
https://unpkg.com/[email protected]/?json no longer works; it redirects to an HTML page. https://unpkg.com/[email protected]/?meta works, so I guess it was renamed. However, this is a breaking change, and code that uses this JSON data via the old URL is now broken. We use this on the Yarn website to show a list of files in packages.
Could you please update it to silently rewrite the URL, or at least redirect to `?meta` rather than the HTML page? We can update the Yarn site, but I wonder what else is broken now.
See yarnpkg/website#614.
Thank you! ❤️
When not specifying the exact package version like this:
https://unpkg.com/react/dist/react.min.js
It redirects to the `latest` version:
https://unpkg.com/[email protected]/dist/react.min.js
However, the redirects are not followed from Chrome using XHR requests. In Firefox they work fine.
Maybe add an additional API, for example, to list all known versions, like this:

```
GET https://unpkg.com/react?versions
...
15.4.0
15.4.1
```
That way, if the exact version is not known, one could lookup the available versions first.
Alternatively, when the `latest` version is specified, it could return the contents instead of redirecting:
https://unpkg.com/react@latest
But then it's more tricky with CDN caching.
Previously when I linked to an .html file it would run the code in the browser. Now, the browser just displays the source code. Looks like the `Content-Type` is `text/plain`.
Here's an example HTML file
https://unpkg.com/[email protected]/dist/doc/index.html
unpkg.com currently uses the following Page Rules on CloudFlare to control how it does caching:
None of these match a URL like https://unpkg.com/package@version
. In order to cache these, we'd need to "Cache Everything" on https://unpkg.com/*
. However, this would override cache settings for all URLs, including those for static content which are currently set to "Ignore Query String".
Safari TP includes support for ES modules using `<script type="module">`. However, the spec doesn't include support for "bare" `import` specifiers, like `import React from 'react'`. Their reasoning:
This restriction is in place so that in the future we can allow custom module loaders to give special meaning to "bare" import specifiers, like import "jquery" or import "web/crypto". For now any such imports will fail, instead of being treated as relative URLs.
That's fair. It just means that unpkg could act like a custom module loader in this case. When the `?module` query param is present, unpkg can convert bare import specifiers to load the package from unpkg. So

```js
import React from 'react'
```

would be translated on-the-fly to

```js
import React from '//unpkg.com/react?module'
```
Of course, this assumes that the package's main entry point is an ES module. Otherwise, just change the path.
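The rewrite itself is straightforward. A sketch of the specifier test (hypothetical logic; the real unpkg implementation may differ), which leaves relative and absolute specifiers alone and expands bare ones:

```javascript
// Rewrite a bare import specifier to an unpkg URL with ?module;
// relative ("./x", "../x"), absolute ("/x"), and full URLs pass through.
function rewriteSpecifier(spec) {
  const isBare =
    !spec.startsWith(".") && !spec.startsWith("/") && !/^https?:/.test(spec);
  return isBare ? `//unpkg.com/${spec}?module` : spec;
}

rewriteSpecifier("react");      // → "//unpkg.com/react?module"
rewriteSpecifier("./utils.js"); // → "./utils.js"
```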
See previous discussion in https://github.com/unpkg/npm-http-server/issues/65#issuecomment-275265535
/cc @matthewp who discussed the topic of loading JavaScript modules on the web here
Also, /cc @domenic who has been doing a ton of work on JavaScript modules on the web.
I see this script included in Google Chrome's new tab, and it's hosted here: https://unpkg.com/[email protected]/23.js
It shows ads in the new tab, redirecting to the "chrome-updates.win" domain or sites like betting or dating. I just want to know what's going on. Screenshots: https://twitter.com/krmgns/status/878771668751192064.
Thanks.
https://unpkg.com/[email protected]
Actual result: a file with MIME type `application/octet-stream` and contents starting with `<template>`, in the style of a Vue.js single-file component.
Expected result: a minified, webpack-runtime-style JavaScript module with content-type `application/javascript`, as is returned for example for https://unpkg.com/[email protected] or https://unpkg.com/[email protected].
This is happening right now for `@angular/material`, but who knows what else it's affecting too...
Repro: https://unpkg.com/@angular/[email protected]/
This is a valid package & version btw, per NPM: https://www.npmjs.com/package/@angular/material
I am not totally sure whether this should be reported here or against `react`. For the issue against our project, see WordPress/gutenberg#1628; against `react`, see facebook/react#10086.
https://unpkg.com/react-dom@next/umd/react-dom-server.development.js is currently a bad build.
It redirects to https://unpkg.com/[email protected]/umd/react-dom-server.development.js which ends like this:
```js
/**
 * For browsers that do not provide the `textInput` event, extract the
 * appropriate string to use for SyntheticInputEvent.
 *
 * @param {string} topLevelType Record from `BrowserEventConstants`.
 * @param {object} nativeEvent Native browser event.
 * @return {?string} The fallback string for this `beforeInput` event.
 */
function getFallbackBeforeInputChars(topLevelType, nativeEvent) {
  // If we are currently composing (IME) and using a fallback to do so,
  // try to extract the composed characters from the fallback object.
  // If composition event is available, we extract a string only at
  // compositionevent, otherwise extract it at fallback events.
  if (currentComposition) {
    if (topLevelType === 'topCompositionEnd' || !canUseCompositionEve
```

The file is truncated at that point. The `alpha.12` version looks fine.
Hello, all.
Using the node package `http`, as in `http.get('https://unpkg.com/{anything}')`, responds with a 301: Moved Permanently, while the same URL from the browser will respond with a 200: OK and the document.
Is this by design? Is there an expected header, perhaps?
E.g. this code will error if `url` is an unpkg URL. I have tested this against as many other CDNs as I know about, and the general behavior is to respond with a 200 and deliver the document.
```ts
import http = require('http');

const url = 'https://unpkg.com/[email protected]/dist/wade.min.js';
http.get(url, (res) => {
  const statusCode = res.statusCode;
  if (statusCode !== 200) console.error(`Error ${statusCode}: ${res.statusMessage} ${url}.`);
  else console.log('Success');
});
```
Thank you!
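For what it's worth, Node's `http.get` never follows redirects on its own; the caller has to inspect the `Location` header and re-request, which browsers do implicitly. A small helper sketching that decision (hypothetical, not part of any CDN's API):

```javascript
// Given a response's status and Location header, return the next URL to
// request, or null if the response is final. Browsers follow redirects
// automatically; Node's http/https clients do not.
function nextRequestUrl(statusCode, location, currentUrl) {
  if (statusCode >= 300 && statusCode < 400 && location) {
    return new URL(location, currentUrl).toString(); // resolve relative Locations
  }
  return null;
}

nextRequestUrl(301, "https://unpkg.com/x", "http://unpkg.com/x");
// → "https://unpkg.com/x"
nextRequestUrl(200, undefined, "https://unpkg.com/x"); // → null
```

A client would loop on this until it returns null (with a hop limit to avoid redirect cycles).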
Hi,
I am getting an Access is Denied error in IE9 when using unpkg for loading my modules through SystemJS.
It works fine in IE10 and IE11.
Please let me know what can be done.
AFAICT the only sites that break when we redirect all traffic from npmcdn.com to unpkg.com are the ones loading stuff via XMLHttpRequest using CORS. One way we could possibly help these people is by emitting a small warning to the console when loading JavaScript files via CORS on npmcdn.com.
/cc @JakeChampion
Now that the `browser` field will most likely be deprecated, I think adding support for `cdn` if `unpkg` is not present would be the right thing to do.
Having to ask package maintainers to add multiple fields seems very unnecessary if `cdn` can cover 99% of the cases.
Maybe I'm missing something, but what are the cons of adding support for `cdn`?
Hi!
First of all, I ❤️ npmcdn and how simple it is for package users and authors. Thanks!
I have one concern though: the CloudFlare CDN tracks users by setting an identifying cookie. I noticed this because I use Privacy Badger, and after some time it started blocking assets hosted on npmcdn.
I think this is bad for the (hopefully) obvious reason of letting CF track your users, and because people with content blockers may not get content loaded from npmcdn (which makes your service appear falsely unreliable). Telling everyone to unblock npmcdn is not an option.
I don't have a solution to this but I wanted to raise the issue in case you weren't aware of this. Thank you for considering this.
PS: I know that I could fork npmcdn and run through my own CDN, but not everyone can do that, and I'm concerned about the default case.
Somewhat related to #35.
I'm working on an in-browser NPM client for System that intelligently downloads all of the files the app needs from Unpkg (instead of downloading everything w/ the actual tarball). An important note is that we are not downloading UMD/dist builds, but instead downloading all files required from the main/module/typings field in the package JSON. For a large package like Angular it will typically make ~100-200 requests within a 5s period, and performance/reliability typically isn't a problem.
However, I've noticed that Unpkg will send back 503's and incorrect 404's from time to time, and I'm thinking that it might be due to a race condition that happens when Unpkg encounters a package it's seen for the first time. For example, about an hour ago I performed an install of [email protected] and got back a 503 for this file — but after refreshing that file's URL, the file contents were successfully sent back.
Other times, instead of a 503, I'll get back a 404 with the Not found: file "/folder/file.ext" in package [email protected]
(with actual file path & package name in error) — but upon refreshing the URL, the file contents come back totally fine.
@mjackson are you aware of any issues that might be causing this? (i.e. rate limits, race conditions, etc). I'm planning on running unpkg locally and seeing if I can replicate the errors there w/ logging, but any info you can provide would be much appreciated.
I performed a site speed test and one of the things it called out was
Serve static content from a cookieless domain:
https://unpkg.com/[email protected]/css/font-awesome.min.css
https://unpkg.com/[email protected]/src/fuse.min.js
https://unpkg.com/[email protected]/dist/immutable.min.js
https://unpkg.com/[email protected]/dist/react-dom.min.js
https://unpkg.com/[email protected]/dist/react.min.js
I noticed other cdns like jsdelivr don't set a cookie but unpkg does.
Many repos are es6 module based, but provide a Rollup bundle for legacy/universal usage. Three for example.
My repo is similar, es6 modules with an optional bundle for those needing a legacy format. (i.e. within my team, es6 modules are the preferred way to access my project).
Looking at Three's source on unpkg:
https://unpkg.com/[email protected]/src/geometries/BoxGeometry.js
...it is entirely ES6 based, with relative imports of its modules. Can I use these modules?
I don't know if these are answered somewhere, but if not, this is a documentation request/issue:
Can I do this?

```js
import BoxGeometry from 'https://unpkg.com/[email protected]/src/geometries/BoxGeometry.js'
```

And will its relative imports resolve?

```js
import { Geometry } from '../core/Geometry';
import { BufferGeometry } from '../core/BufferGeometry';
import { Float32BufferAttribute } from '../core/BufferAttribute';
import { Vector3 } from '../math/Vector3';
```

If even the last two are correct (still requiring UMD for unpkg), ES6 module users will flock to unpkg due to the many ES6 module deployment strategy issues!
I realize these are "fine points" but quite important to those of us deploying es6 modules.
Foreign Fetch is an upcoming upgrade to Service Workers that allows a Service Worker to run across origins. This gives package-delivery platforms such as unpkg, working over CDNs such as Cloudflare, much more granular control over the caching of their assets in the browser.
I ❤️ Service Workers and would like to help make unpkg the first to support cross-origin Service Workers. I would love to work on a fork and create a WIP PR, if there's enough interest from the unpkg team to support this in the future.
Even if it's merged, it won't harm anything, since it's a progressive enhancement: in the absence of browser support, nothing happens. We can further discuss what our caching mechanism for the assets should be. Thanks a lot!
This URL, https://unpkg.com/[email protected]/dist/leaflet.css, returns a timeout.
From this comment it appears there is no Subresource Integrity (SRI) support.
Does the Cloudflare API provide a hash or is this something that would need to be generated on the unpkg server?
First, I wanted to say that unpkg is great, and thanks for maintaining it.
I was going to open an issue about how the redirect from a URL like `unpkg.com/video.js@latest/package.json` to `unpkg.com/[email protected]/package.json` was broken for the XHR use case because of a lack of CORS headers on the original response. However, as I was running curl to come up with the example, I noticed that the issue was with the `http` to `https` redirect. If possible, it would be good to add CORS headers to these redirects. A user could be using protocol-relative URLs in the XHR, so it could break on a non-https page.
If it isn't possible to add CORS in that case, some documentation would be good. I can help with the documentation if necessary.
Below are some HEADER outputs from curl.
HTTP, @latest:

```
$ curl -Is "http://unpkg.com/video.js@latest/package.json"
HTTP/1.1 301 Moved Permanently
Date: Wed, 09 Nov 2016 22:15:10 GMT
Connection: keep-alive
Set-Cookie: __cfduid=dbab12710c478ebbdbdb54a3ab8f7cd8a1478729710; expires=Thu, 09-Nov-17 22:15:10 GMT; path=/; domain=.unpkg.com; HttpOnly
Location: https://unpkg.com/video.js@latest/package.json
X-Content-Type-Options: nosniff
Server: cloudflare-nginx
CF-RAY: 2ff49f3320b02180-EWR
```

HTTPS, @latest:

```
$ curl -Is "https://unpkg.com/video.js@latest/package.json"
HTTP/1.1 302 Found
Date: Wed, 09 Nov 2016 22:16:00 GMT
Content-Type: text/html
Content-Length: 104
Connection: keep-alive
Set-Cookie: __cfduid=dcb7bc35ef35832984e53a80a8aa538f31478729760; expires=Thu, 09-Nov-17 22:16:00 GMT; path=/; domain=.unpkg.com; HttpOnly
Access-Control-Allow-Origin: *
Cache-Control: public, max-age=500
Location: /[email protected]/package.json
Via: 1.1 vegur
Strict-Transport-Security: max-age=15552000; includeSubDomains; preload
X-Content-Type-Options: nosniff
Server: cloudflare-nginx
CF-RAY: 2ff4a06b8c33215c-EWR
```

HTTP, @5.12.6:

```
$ curl -Is "http://unpkg.com/[email protected]/package.json"
HTTP/1.1 301 Moved Permanently
Date: Wed, 09 Nov 2016 22:32:42 GMT
Connection: keep-alive
Set-Cookie: __cfduid=db866188522fef7ec8e048529ae7d68601478730762; expires=Thu, 09-Nov-17 22:32:42 GMT; path=/; domain=.unpkg.com; HttpOnly
Location: https://unpkg.com/[email protected]/package.json
X-Content-Type-Options: nosniff
Server: cloudflare-nginx
CF-RAY: 2ff4b8e2a3692138-EWR
```

HTTPS, @5.12.6:

```
$ curl -Is "https://unpkg.com/[email protected]/package.json"
HTTP/1.1 200 OK
Date: Wed, 09 Nov 2016 22:33:14 GMT
Content-Type: application/json
Content-Length: 3763
Connection: keep-alive
Set-Cookie: __cfduid=d2476e53b70b2b773bc60e899bcf306d01478730794; expires=Thu, 09-Nov-17 22:33:14 GMT; path=/; domain=.unpkg.com; HttpOnly
Access-Control-Allow-Origin: *
Cache-Control: public, max-age=31536000
Etag: W/"eb3-157fd2cfaf0"
Via: 1.1 vegur
Strict-Transport-Security: max-age=15552000; includeSubDomains; preload
X-Content-Type-Options: nosniff
Server: cloudflare-nginx
CF-RAY: 2ff4b9a7a88c185e-EWR
```
There are various packages, for example leaflet, which don't have minified versions of their files in their `dist/` directory. So maybe unpkg could listen for a GET parameter (e.g. `?m` for a minified version) and try to minify the CSS and JS (and later on more languages)?
Last night the react-dom.min.js file disappeared from [email protected]:

```
$ curl https://unpkg.com/[email protected]/dist/react-dom.min.js -v
*   Trying 104.16.124.175...
* Connected to unpkg.com (104.16.124.175) port 443 (#0)
* found 174 certificates in /etc/ssl/certs/ca-certificates.crt
* found 696 certificates in /etc/ssl/certs
* ALPN, offering http/1.1
* SSL connection using TLS1.2 / ECDHE_ECDSA_AES_128_GCM_SHA256
* server certificate verification OK
* server certificate status verification SKIPPED
* common name: ssl714328.cloudflaressl.com (matched)
* server certificate expiration date OK
* server certificate activation date OK
* certificate public key: EC
* certificate version: #3
* subject: OU=Domain Control Validated,OU=PositiveSSL Multi-Domain,CN=ssl714328.cloudflaressl.com
* start date: Wed, 25 Jan 2017 00:00:00 GMT
* expire date: Wed, 03 Jan 2018 23:59:59 GMT
* issuer: C=GB,ST=Greater Manchester,L=Salford,O=COMODO CA Limited,CN=COMODO ECC Domain Validation Secure Server CA 2
* compression: NULL
* ALPN, server accepted to use http/1.1
> GET /[email protected]/dist/react-dom.min.js HTTP/1.1
> Host: unpkg.com
> User-Agent: curl/7.47.0
> Accept: */*
>
< HTTP/1.1 404 Not Found
< Date: Wed, 16 Aug 2017 07:56:15 GMT
< Content-Type: text/plain; charset=utf-8
< Content-Length: 69
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Etag: W/"45-jukPuHbAtANRFZby2i2lJrqBn7U"
< Via: 1.1 vegur
< CF-Cache-Status: HIT
< Strict-Transport-Security: max-age=15552000; includeSubDomains; preload
< X-Content-Type-Options: nosniff
< Server: cloudflare-nginx
< CF-RAY: 38f2d6021dda4f3e-DME
<
* Connection #0 to host unpkg.com left intact
Cannot find file "/dist/react-dom.min.js" in package [email protected]
```

But it is still present in the file list:
https://unpkg.com/[email protected]/dist/
This might be a really dumb question/assumption, but I was working on embeddable content and was wondering if it would be possible to add a flag or file that unpkg can use to identify that an npm package is embeddable content, and then make the unpkg URL work as an embed URL.
Is this possible or worthwhile? I'm not sure 😅 but what do you think?
It conflicts when you want to expose a UMD bundle to unpkg and also want to allow users of your package to use Browserify to bundle it.
Scenario: you have a package with UMD, CommonJS, and ES "bundles". You set `pkg.module` to point to `dist/my-package.es.js`. You set `pkg.main` to point to `dist/my-package.common.js`, which is CJS. You want users of modern bundlers such as Rollup/webpack to resolve the ES variant of your module, Browserify users to resolve the CommonJS one, and browser users to just have a way to include the UMD bundle directly as a script tag.
All is great: when you set the `pkg.browser` field to point to the UMD bundle, unpkg will resolve it, but Browserify users will resolve it too, instead of the CJS set in `pkg.main`, because Browserify respects `pkg.browser` when it is set. Browserify should not resolve the UMD bundle, because its wrapper is much like the one Browserify adds itself, so unnecessary bytes and duplicate code get added.
One way out is to just not set `pkg.browser` and force users to point to the exact file they want, instead of using the shortcuts that unpkg provides for you, such as `unpkg.com/my-package` and `unpkg.com/my-package@version`; you'd force your users to use `unpkg.com/my-package/dist/my-package.min.js`, for example.
The conflict arises because unpkg and Browserify assume different things about that field. unpkg assumes the file can be used in the browser directly and does nothing to it, while Browserify says "hey, give me that file, I'll wrap it, resolve its deps recursively, and give you a final bundle that you can use in the browser".
Edit: a good example is https://github.com/tunnckoCore/randomorg-js, where you can see that I added a notice for users to not use the shortcut, so they are warned that they should not expect it to work. Note that it is still not published to npm. When it is published, the shortcut will point to `dist/randomorg.common.js`.
Hi, getting an `Internal Server Error` at https://unpkg.com/
From @alexisvincent on December 28, 2016 23:41
I smell an opportunity here to provide a beautiful SystemJS loading experience with unpkg.
If you haven't already played with it, SystemJS is a client-side module loader that already provides a nice experience for module loading from CDNs and unpkg (Angular does this in their official guides).
I think we could make the experience even nicer by automatically resolving the configs for raw projects using systemjs-config-builder.
The experience on the browser side would look something like this:
app.js:

```js
import { DOM } from 'react';
import { render } from 'react-dom';

render(
  DOM.div('Hello World'),
  document.getElementById('app')
)
```

index.html:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>SystemJS ♥️ unpkg</title>
    <script src="unpkg.com/[email protected]/dist/system.js"></script>
    <script src="unpkg.com/[email protected]/dist/unpkg-config.js"></script>
  </head>
  <body>
    <div id="app" />
    <script>
      System.import('./app.js')
    </script>
  </body>
</html>
```
This is obviously really attractive, since the only dependency you need is SystemJS, which then automatically handles the rest. It makes the barrier to entry for client-side development really minimal. I could also write a hosted service which parsed client code and made loading modules in production efficient.
@mjackson Let me know if this is something you would be interested to support.
Copied from original issue: unpkg/express-unpkg#65
It would be nice to use Unpkg as a universal source for packages when using dynamic loaders such as SystemJS. An ideal configuration would be:
```js
SystemJS.config({ paths: { "*": "//unpkg.com/*.js?as=umd" } })
```

Allowing the following forms:

```js
require("jquery")
require("gl-matrix")
require("tgd")
```
Right now, this would not work for modules such as `gl-matrix` and `tgd` for the following reasons:
- `gl-matrix` exports the ES6 code by default, but has a `dist` version with a UMD build at https://unpkg.com/[email protected]/dist/gl-matrix.js
- `tgd` exports the UMD build under the `umd` directory and uses `umd/index.js` as the main file; however, the relative imports fail because the unpkg URL is unpkg.com/tgd and not unpkg.com/tgd/umd

In order to better support SystemJS-like paths, unpkg could:
- Support an `as=umd` parameter with the following semantics: look for an `umd` or `dist` directory, and use it as the root.
- If an `index.js` is present in this directory, serve it by default when the request URL path does not end with a `/`. For instance, https://unpkg.com/[email protected]/umd/ and https://unpkg.com/[email protected]/umd currently return the same thing, while https://unpkg.com/[email protected]/umd could return the contents of https://unpkg.com/[email protected]/umd/index.js (or at least redirect to it).

These relatively simple changes would allow bypassing package management systems such as NPM or JSPM and using unpkg as an online, versioned package source.
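The `index.js` fallback amounts to trying a second candidate path for extension-less requests. A hypothetical sketch (the version in the example is made up):

```javascript
// For a request path with no file extension and no trailing slash,
// try the path itself first, then path + "/index.js".
function candidatePaths(requestPath) {
  const hasExtension = /\.[a-z0-9]+$/i.test(requestPath);
  if (requestPath.endsWith("/") || hasExtension) return [requestPath];
  return [requestPath, requestPath + "/index.js"];
}

candidatePaths("/tgd@1.0.0/umd");
// → ["/tgd@1.0.0/umd", "/tgd@1.0.0/umd/index.js"]
```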
In the home markdown there's a link to https://github.com/mjackson/npm-http-server#bower-support, which redirects to https://github.com/unpkg/express-unpkg#bower-support, which doesn't resolve to anything.
That was my go-to "don't use Bower" reference :)
Not sure where to put this or how to contact you, but currently packages are redirecting to a site warning that your computer is infected and telling you to install antivirus software.
e.g. https://unpkg.com/[email protected]/dist/react.min.js goes to https://compliance-jessica.xyz/a.php
In a Travis environment:
This is the second time I have encountered such an issue. The first time was for [email protected], and it was reported in the old `express-unpkg` repo. Here are some comments by @mjackson on the old issue, from archived emails.
Comment:
Yep, it looks like the cache has a 0-length response for that file. I can initiate a purge of the cache at that path, but I'm ultimately not sure the best method to prevent stuff like this from happening in the future. A very small percentage of requests to the origin servers fail for various reasons (timeouts, etc.), maybe the request from this edge node was one of them.
Comment:
Great detective work, @whipermr5! 🕵️
Unfortunately I don't have detailed logs from the origin for 200 responses, only 40x and 50x. I'll setup a filter to try and catch this kind of thing the next time it happens.
@mjackson Hoping the logs will be able to reveal the real issue!
Awesome work on `unpkg`! We're using it a lot for the A-Frame ecosystem.
Currently for `unpkg`, we can specify the `main` or `browser` fields such that `unpkg.com/my-package/` will resolve to a file we specify. But this overloads `main` and `browser`:
- People may want `main` pointing to their pre-built entry point (e.g., `index.js`), maybe to `npm link` or to differentiate browser builds from npm builds.
- People may not want to specify `browser`, because that is used by Browserify, and specifying it can mess up build steps.

Perhaps an unpkg-specific field? `{"cdn": "dist/mybuild.min.js"}`
This is happening for dozens of files (+ more according to our users), but here's a specific URL that 503'd multiple times for me: https://unpkg.com/[email protected]/?meta
^ feel free to check your logs for that URL and see why it failed
And for the love of god, @MartinKolarik can you please please please ship that directory JSON listing endpoint? Please?
I'm trying to make bower install a specific version of a package, but it doesn't work if I use unpkg.com as the source:

```
$ bower install popper.js=https://unpkg.com/[email protected]. --save
bower popper.js#1.9.1 ENORESTARGET URL sources can't resolve targets
```

I read that unpkg.com support(ed?) a `/bower.zip` endpoint to get a zip that works with bower, but I can't seem to find how to make it work.
Example URL: https://unpkg.com/[email protected]?main=browser.
Note the `browser` field from `package.json`:

```js
"browser": {
  "./lib/api/register/node.js": "./lib/api/register/browser.js"
},
```
Is there a link to the Terms of Use and License for unpkg? We plan to use it within my team, and we would need to provide the above information to get exceptions for unpkg.
Thanks,
Bhargav