oftn-oswg / core-estimator

Cross-browser polyfill for navigator.hardwareConcurrency

Home Page: https://oswg.oftn.org/projects/core-estimator/demo/
License: Other
PNaCl can use sysconf(_SC_NPROCESSORS_ONLN) to get the number of available cores.
Resources:
It has to do with the script location derivation code.
Hello, I'd love to find an npm release, but couldn't find any.
Hi,
is core-estimator usable on Node.js?
Is there anything in Node.js to detect how many workers to launch?
I recently had to reinstall Windows on my computer. I freshly installed all major browsers. You may recall that I use core-estimator on my web page at http://danielsadventure.info/html5fractal/.
This page seems to work fine with IE and Chrome (although IE tends to misestimate the number of cores), but for some reason, Firefox hangs when trying to estimate the number of cores.
The strange thing is that this happens only in Firefox and only when I run the page on my public web server. When I run it in the Visual Studio development server, it works fine. Furthermore, Firefox shows no error indicating that anything went wrong, and the browser itself does not hang; only the JavaScript seems to stop.
I was reading the posts Accuracy of JavaScript Time and JavaScript Benchmark Quality from John Resig and I had a realization.
First of all, Date.now() is terrible on Windows XP and will often result in errors of up to 15 ms. I'm not sure if this affects any other Windows versions or platforms, though.
Second, as browser performance gets faster, the error keeps getting worse. This explains why I got much less accurate results as I lowered the loop count on the workload.
So instead of recording the time it takes n workers to complete once, we should record how many times n simultaneous workers complete in a designated amount of time. This can be made more accurate still if we can figure out how far into its loop a payload is. Doing it this way gives us another benefit as well: we can design the estimator to take a predictable amount of time given a certain number of cores.
time_to_estimate <= time_per_test * (2 * floor(log2(number_of_cores)) + 1)
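The worst-case bound above can be sketched as a small helper (the function name is mine, not core-estimator's API): one initial test, plus two tests per doubling step of the binary search up to the core count.

```javascript
// Hypothetical helper illustrating the worst-case estimation time bound:
// time_per_test * (2 * floor(log2(number_of_cores)) + 1)
function timeToEstimate(timePerTestMs, numberOfCores) {
  return timePerTestMs * (2 * Math.floor(Math.log2(numberOfCores)) + 1);
}

console.log(timeToEstimate(100, 8)); // → 700 (7 tests at 100 ms each)
```

So with a 100 ms test window, an 8-core machine is estimated in at most 700 ms, and even a 64-core machine in at most 1300 ms.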
I just gave the live demo of the core estimator a spin on my Android phone (a Nexus 4 with stock OS), in what I believe to be the latest stable-channel version of Chrome available to the device. The demo appeared to hang, and upon inspection I noticed this error:
Uncaught TypeError: Object #<Performance> has no method 'now'
And upon further investigation, it does appear that the window.performance object has no property called 'now' on that version of Chrome (18.0.1025469).
Perhaps you guys could implement a polyfill, if relative timing is all that matters (and not precision).
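Such a polyfill could be as simple as the sketch below, assuming only relative timing matters (precision is still limited by Date.now(); an old browser would see this on `window` rather than `globalThis`):

```javascript
// Minimal sketch of a performance.now() polyfill for browsers that lack
// it (e.g. Chrome 18). Returns milliseconds since the script loaded,
// which suffices for measuring relative durations.
if (typeof globalThis.performance === "undefined") {
  globalThis.performance = {};
}
if (typeof globalThis.performance.now !== "function") {
  const loadTime = Date.now();
  globalThis.performance.now = function () {
    return Date.now() - loadTime;
  };
}
```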
Tried this in iexplore.exe as I was checking compatibility for my site. It always returns 1 core/thread.
For a long time, I have been using core estimator as a polyfill and everything has been working smoothly. Recently, on a site I've been developing, I started getting really odd situations where the browser would continuously consume 100% of a single CPU core.
It turns out that I had misconfigured the installation of core estimator and had the following code in my main .html file:
<html>
...
<head>
<script src='core-estimator/core-estimator.min.js'></script>
<script src='core-estimator/workload.js'></script>
...
</head>
...
</html>
This does not produce any errors or warnings, and everything appears to work, but the configuration mistake was that workload.js was never intended to be included in the main HTML file. When it is included, it installs a self.onmessage handler on the top window, which calls postMessage(null) on itself, causing workload.js to loop messages to itself infinitely the moment anything else is postMessaged to the page. This kind of error easily goes unseen, since it just silently burns CPU cycles in the background.
Perhaps workload.js could check, before installing onmessage, that the script context is actually a Web Worker, and if not, throw an exception like "workload.js is not supposed to be included in the main thread"? This would prevent such misconfigurations from silently turning into cycle-wasting CPU busy loops.
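The suggested guard could look something like this (a sketch, not core-estimator's actual code; `assertWorkerContext` is a hypothetical name):

```javascript
// Hypothetical guard: refuse to run workload.js outside a Web Worker.
// In a worker, the global `self` is an instance of WorkerGlobalScope;
// in a page's main thread, WorkerGlobalScope is not defined at all.
function assertWorkerContext() {
  const inWorker =
    typeof WorkerGlobalScope !== "undefined" &&
    typeof self !== "undefined" &&
    self instanceof WorkerGlobalScope;
  if (!inWorker) {
    throw new Error(
      "workload.js is not supposed to be included in the main thread"
    );
  }
}

// workload.js would then call this before installing its handler:
// assertWorkerContext();
// self.onmessage = function (e) { /* ... run workload ... */ };
```

Included from a `<script>` tag, the guard throws immediately instead of setting up the message loop, making the misconfiguration visible in the console.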
Running the core estimator on current Firefox with an Intel Core i7-5960X (8 physical cores, 16 logical cores), core estimator hangs: it finds 16 cores and then attempts to detect 32, most likely by trying to spawn 32 simultaneous workers, which never start up. See https://bugzilla.mozilla.org/show_bug.cgi?id=1052398. If I raise the pref dom.workers.maxPerDomain to 40, the hang is avoided.
(However, most of the time the core estimator does not get near the correct number on this CPU. Running it repeatedly on Firefox, I get 7, 1, 5, 1, 3, 2, 1, 2, 16, 1, 3, 8, 16.)
Hello,
I realized that core estimator doesn't work when developer tools (F12) are open in IE11.
Any ideas why?
https://blog.chromium.org/2017/05/goodbye-pnacl-hello-webassembly.html
Is this thing still necessary?
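With PNaCl gone and navigator.hardwareConcurrency supported by all modern browsers, the polyfill's remaining role is as a fallback for old engines. A sketch of the feature detection (the `nav` parameter stands in for the browser's navigator object):

```javascript
// Prefer the native API; fall back to a conservative default of 1
// (where core-estimator's benchmark would historically have run).
function coreCount(nav) {
  return (nav && nav.hardwareConcurrency) || 1;
}

console.log(coreCount({ hardwareConcurrency: 8 })); // → 8
console.log(coreCount(undefined)); // → 1 (conservative fallback)
```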