codespeed's Issues

Require Result fields

http://speed.twistedmatrix.com/timeline/ broke because we inserted some Results with empty date fields. Speedcenter doesn't care and feeds them to jqplot, resulting in: Uncaught TypeError: Object NaN has no method 'getTime' when rendering a graph (and consequently no graph).

Fields that break jqplot when empty should either be required on insert, or checked as non-null before being provided to jqplot. The latter might be preferable since it doesn't require a database alteration.
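The second option could look something like the following server-side filter — a minimal sketch, assuming the view has the results as a list of dicts before serializing them for jqplot; the function name and field list are illustrative, not actual codespeed code:

```python
# Fields that jqplot cannot cope with when null (assumed set, for illustration).
REQUIRED_FIELDS = ("date", "value")

def jqplot_safe(results):
    """Keep only results whose required fields are all non-null,
    so jqplot never receives a NaN date."""
    return [r for r in results
            if all(r.get(field) is not None for field in REQUIRED_FIELDS)]
```

Filtering at serialization time has the advantage the report mentions: it requires no database alteration and also protects against rows inserted before the check existed.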

GitHub backend, commits not ordered with most recent on top, and commits from same push omitted

Hi:

The current GitHub backend does not work as well as the Git backend.
The first of the two major problems, from my point of view, is that commits are shown least-recent first, unlike the Git backend, where the most recent commit is on top.

Furthermore, it looks like commits that were pushed to GitHub in the same push are not all shown.
Only the most recent of those commits appears, and the next commit shown seems to come from another push. (That might correlate with dates rather than pushes, though.)

Best regards
Stefan

overwrite seems not to be the Django way

Hi,

First of all thank you for this great piece of software. It's fun using it.

Compared to other Django projects I find your template overwrite mechanism a bit unusual. The Django documentation describes the TEMPLATE_DIRS variable in settings.py to configure where the template loader is looking for templates.
http://www.djangobook.com/en/beta/chapter04/

Wouldn't it be better to add something like

TEMPLATE_DIRS = (
    os.path.join(os.path.dirname(__file__), 'overwrite_templates').replace('\\', '/'),
    os.path.join(os.path.dirname(__file__), 'templates').replace('\\', '/'),
)

to the settings.py?

Strictly speaking in Django terms: speedcenter is the project and codespeed the app. Maybe it would be best to separate the two.

Thanks,

Frank

Add a regressions page

Essentially this would be a page that shows any benchmarks which are statistically significantly worse than the best-ever time for them on a given VM. Currently there is no good way to check for regressions besides looking at all the graphs. Thanks!
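As a rough illustration of "statistically significantly worse", a threshold check might look like the sketch below. The function name, the stddev-based rule, and the 2-sigma default are assumptions for illustration, not codespeed's actual statistics:

```python
def is_regression(current, best, stddev, threshold=2.0):
    """Flag a result as a regression when it is worse than the best-ever
    time by more than `threshold` standard deviations.
    Assumes lower is better; all names here are illustrative."""
    if stddev == 0:
        # No spread recorded: any value above the best counts as worse.
        return current > best
    return (current - best) / stddev > threshold
```

A page could then simply list every (benchmark, executable) pair for which this returns true against the stored best result.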

Static URL in codespeed.js

The URL embedded in the js file is not correct when static files are served from another base URL.
The offender is the img tag on line 27:
https://github.com/tobami/codespeed/blob/master/codespeed/static/js/codespeed.js#L27

Is there a way to adapt that URL automatically?

Adding a {{ STATIC_URL }} obviously does not help, since the file is served directly without being processed.
I did not see anything in the django manual which would allow preprocessing during ./manage.py collectstatic either.
That would have been nice.

Are there any other options than serving codespeed.js as a template which gets preprocessed?

Thanks
Stefan

timeline view fails if no benchmark is selected

After creating a new codespeed instance with an environment and a project and posting a result to it using tools/save_single_result.py (after fixing the KeyError problem), loading the timeline view hangs indefinitely because the ajax request for data fails with a 500. This isn't handled and the spinner just keeps spinning.

The request is made for something like:

/timeline/json/?_=1282581197452&exe=1&base=none&env=boson&revs=200

Note the missing "ben" key. This is because no benchmark is selected and the "ben" property of the configuration object is therefore "undefined". This corresponds to the radio input for benchmark selection which has just one choice, "float" (as per the default values in the save_single_result.py tool) which is not selected.

Selecting the "float" benchmark causes another request to be made which succeeds and the timeline renders properly.

Unlike the timeline on speed.pypy.org, I also see no "Display all in a grid" radio choice on a newly created codespeed instance.
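One way to turn the 500 into a handled error would be to validate the GET parameters server-side before querying. A hypothetical sketch — the parameter names come from the request above, but the function and its error format are illustrative, not codespeed's actual view code:

```python
def validate_timeline_params(params):
    """Return (ok, payload): an error payload instead of a 500 when a
    required GET parameter is missing or literally "undefined" (which is
    what the frontend sends when no benchmark is selected)."""
    required = ("exe", "env", "revs", "ben")
    for key in required:
        value = params.get(key)
        if value in (None, "", "undefined"):
            return False, {"error": "missing parameter: %s" % key}
    return True, params
```

The view could return this error payload with a 400 status, and the frontend could stop the spinner and show the message instead of hanging.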

New view with a history of regressions

Users should be able to see a list of past regressions, to have a history of changesets that adversely affected performance.

Further, regressions should be annotated somehow: either letting users write notes, visible to everyone, or automatically creating an issue in the project's bug tracker (Roundup for PyPy).

Page navigation support for timeline.

Hi:

It would be great if the timeline could support page navigation, i.e., the browser's back button.
At the moment, you always return to the overview instead of a specific benchmark.

Thanks
Stefan

Don't show revisions that don't have any data for a particular exe

In the Changes view, the revision list (combo box) updates based on which project is selected. So executables inside the same project share the same revision list. That can lead to having a selected revision, selecting another executable, and getting an empty changes table.

Users shouldn't have to guess which rev/exe combinations have data and which don't.

Improve baseline displaying

The current hack where a marker-less series with two points is used for the baseline can be reimplemented the proper way: using a horizontal line instead. That is supported by jqPlot version 1.0 (canvasOverlay)

Also, jqPlot has a new feature that allows showing error messages (no data, for example): catchError.html

Comparison y-axis label wrong

PyPy Speed Center's comparison chart shows the performance of different versions of PyPy. The y-axis says "Ratio (less is better)", but the vertical bars are taller for more recent versions of PyPy, and PyPy is getting faster. Either the bars are labeled wrong or more is better.

http://speed.pypy.org/comparison/?exe=1%2B41%2C1%2B172%2C1%2BL&ben=1%2C27%2C2%2C25%2C3%2C4%2C5%2C22%2C6%2C7%2C8%2C23%2C24%2C9%2C10%2C11%2C12%2C13%2C14%2C15%2C16%2C17%2C18%2C19%2C20%2C26&env=1&hor=false&bas=2%2B35&chart=normal+bars

Use History API instead of explicit permalink button

Disclaimer: I haven't checked yet whether jQuery Address also provides this feature.

Right now, the jQuery Address plugin is used to update the URL when the permalink is clicked. IMO this has some disadvantages:

  • changing the URL triggers an (unnecessary) page reload
  • hashtag urls (as seen in the timeline view)

The History API allows modifying the browsing history, e.g. pushing new pages or replacing the current one. Some advantages:

  • no page reload
  • no (ugly) hashtag URLs

Using the API for codespeed we can:

  • Remove the permalink button (the URL will be updated on every single change)
  • Users can copy the URL anytime and get the same view again

Problems

Of course, browser support: http://caniuse.com/#search=history

Apparently no current version of IE supports the API. But it might be possible to fall back to the old behaviour, that is, showing a permalink button for IE users.

https://github.com/balupton/history.js looks promising, History API for HTML5 Browsers with fallback to hashtag urls.

Demo

https://github.com/squiddy/codespeed/tree/history_demo
Look at the comparison view and you'll (hopefully) see what I'm aiming for.

http://html5demos.com/history/

ZeroDivisionError at /changes/table/

When there is a zero result added, loading the Changes tab fails:

/Users/codespeed/Scratch/Source/codespeed/speedcenter/codespeed/views.py in getchangestable
464. change = (result - c[0].value)*100/c[0].value
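A minimal guard for that line could look like the following sketch; the function name and the None fallback (letting the template render e.g. "-") are assumptions, not the actual fix in views.py:

```python
def percent_change(result, base):
    """Percent change of `result` over `base`; returns None instead of
    raising ZeroDivisionError when the baseline value is zero."""
    if base == 0:
        return None
    return (result - base) * 100.0 / base
```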

on Changes tab, git revision information overlaps stddev/change/trend columns

0.8 prerelease, GitHub master

I took a screenshot but github doesn't allow me to attach it to this issue. I can email it directly to you if you wish.

I believe this issue is due to my use of benchmark names that are longer than 30 chars (the limit set in models.py). SQLite doesn't care if you go over the defined limit, but apparently the Django view expects the names to be shorter.

Many of my benchmark names are 60-80 chars long (they are descriptive rather than just a simple name like "fixnum").

It would be nice if the page reflowed based upon the benchmark name. To keep things sane, it should probably wrap the name if it exceeds a certain number of chars (e.g. 80).

In the interim, I would be grateful if someone could tell me which template to modify to make it wider.

Instances with many benchmarks are slow to process POST to /result/add/

In my system I have 400+ unique benchmarks. Every time a new result is POSTed, the application does a select against the codespeed_result table for every benchmark_id (over 400 select calls). It appears it is doing this to collect all results to update the codespeed_reports table.

This doesn't scale. As more and more benchmarks are added this is just going to get slower and slower.

One potential fix is to replace the 400+ selects with a single call to the DB to pull all of these results into memory at once for processing. This has some implications for overall memory footprint, but like anything it is a trade-off (space versus time).
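In plain-Python terms, the suggested fix amounts to fetching once and grouping in memory. A Django-free stand-in (names are illustrative; in Django this would be a single filter() queryset iterated once instead of 400+ per-benchmark queries):

```python
from collections import defaultdict

def group_by_benchmark(rows):
    """Stand-in for one bulk query: group all result rows by benchmark_id
    in a single pass, replacing one SELECT per benchmark."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["benchmark_id"]].append(row)
    return grouped
```

The report-building code can then look up each benchmark's results from the dict instead of hitting the database again.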

tools/save_single_result.py fails with an unhandled KeyError

After setting up a new codespeed instance (following the directions in README.md), I tried running save_single_result.py to see what it would do to the running codespeed instance. However, it failed with this traceback (sorry, I can't figure out how to do any formatting in this text area):

exarkun@boson:/Scratch/Sources/tobami-codespeed-f2b86bc$ python tools/save_single_result.py
Traceback (most recent call last):
File "tools/save_single_result.py", line 38, in <module>
add(data)
File "tools/save_single_result.py", line 31, in add
print "Executable %s, revision %s, benchmark %s" % (data['executable_name'], data['commitid'], data['benchmark'])
KeyError: 'executable_name'
exarkun@boson:/Scratch/Sources/tobami-codespeed-f2b86bc$
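A defensive version of the failing line could use dict.get so a missing field degrades gracefully instead of crashing. The key names mirror the traceback above; the "<missing>" fallback is illustrative, not what the script should necessarily print:

```python
def describe(data):
    """Build the status line without raising KeyError when a field
    is absent from the result dict."""
    def get(key):
        return data.get(key, "<missing>")
    return "Executable %s, revision %s, benchmark %s" % (
        get("executable_name"), get("commitid"), get("benchmark"))
```

Whether the right fix is a fallback or correcting the key name in the sample script depends on what the /result/add/ endpoint actually expects.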

Possible regression in 0.8.x upon POST /result/add/

The sample script save_single_result.py throws the following exception in versions 0.8.x of codespeed:

WARNING:root:unable to save revision Jul 07, 02:17 - 14 info: updaterepo() takes no arguments (1 given)
Traceback (most recent call last):
File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 797, in save_result
saverevisioninfo(rev)
File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 709, in saverevisioninfo
log = getcommitlogs(rev, rev, update=True)
File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 698, in getcommitlogs
updaterepo(rev.branch.project)
TypeError: updaterepo() takes no arguments (1 given)
...
[07/Jul/2011 02:17:07] "POST /result/add/ HTTP/1.1" 500 122371

This seems to be a regression; this worked fine in versions 0.6.x and 0.7.0.

Result reports

There should be a report for each revision that summarizes significant changes in the results. These reports can then be shown on the home page, sent as email alerts, and added to an RSS feed (email alerts and RSS belong to another issue).

Timeline plots fail when more than 4 plot series are selected

(Reported on the mailing list by Steffen Schwigon)

The timeline view checks whether there are more than 4 data series, and in that case it moves the legend outside the plot area to unclutter it. Unfortunately, when branches were implemented, the plotoptions modification got moved before the actual variable declaration:

it just needs to be moved below the var plotoptions line

Permalinks for Environments should use id instead of name

As the new Codespeed site shows, the environment name is used for the permalink:
http://speed.rubyspec.org/changes/?tre=10&rev=268a0200d16b42f5201ba6c9faa2cd84447a6487&exe=2&env=EC2+(OpenJDK)

Clicking on a Report row (front page) will even take you to a permalink with no url encoding but plain text:
env=EC2 (OpenJDK6)

It would be better to use an id. That would also allow modifying the name of an environment without breaking permalinks.

The revision could also be changed to an id, to reduce the total URL length.

Default branch hardcoded, migration seems to have failed in that regard

Hi:

Not sure whether I did everything correctly, but I think I followed the migration instructions closely.
However, I ended up with a database which seems correct, but has the branch attribute (I think on revisions) set to the empty string.

Now, I changed that to 'master' since we are using git, and to avoid confusion.

But, that gives problems with at least the changes view.
See: http://soft.vub.ac.be/~ppp/codespeed/changes/

The reason is that the 'default' branch name is hardcoded.
I am working on at least a partial fix and will probably send a pull request, but wanted to mention it because I am not sure when I will get to it.

Best regards
Stefan

Allow to set benchmark properties when saving data

When trying to save data for a benchmark that doesn't exist, it automatically gets created with default values. There should be a way to set 'units', 'units_title' and 'lessisbetter' programmatically.
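A hypothetical POST payload carrying those properties alongside the measurement might look like this. The three property names come from the issue text; whether the /result/add/ endpoint would accept them this way is exactly the feature being requested, and the other field values are made up:

```python
# Illustrative payload for POST /result/add/ with benchmark metadata attached.
payload = {
    "commitid": "14",
    "branch": "default",
    "project": "MyProject",
    "executable": "myexe",
    "benchmark": "float",
    "environment": "boson",
    "result_value": 4000.0,
    # Requested benchmark properties, set on first creation:
    "units": "ms",
    "units_title": "Time",
    "lessisbetter": True,
}
```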

try to separate JS from Templates

Just want to check what your stance on this is. Right now, there is a lot of JavaScript code in the templates. I'd favor separating them more, moving (most of) the code into files. That would ease maintenance of the code, decrease page size, and let clients cache (possibly minified and bundled) JavaScript files. Of course simply moving it won't do; a lot of code requires setting up data and configuration, but it should be possible to clean this up. What do you think?

Missing migration for Revision.project null=True

I was going to work on #35, but when I wanted to create a new migration, south found another change:

$ python manage.py schemamigration codespeed --auto
 ? The field 'Revision.project' does not have a default specified, yet is NOT NULL.
 ? Since you are making this field nullable, you MUST specify a default
 ? value to use for existing rows. Would you like to:
 ?  1. Quit now, and add a default to the field in models.py
 ?  2. Specify a one-off value to use for existing columns now
 ?  3. Disable the backwards migration by raising an exception.

In 6cefd8e null=True was added to the definition of Revision.project, but apparently no migration was added.

If I'm not mistaken, each revision has a link to a project (a revision belongs to a branch, and that branch belongs to a project). Why can project be left out on a revision then?

Add security measures for POSTing

Currently anyone can POST data to a Codespeed instance. Some kind of security is needed. Either proper authentication or a simpler secret key.
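The simpler secret-key option could be sketched with Python's stdlib hmac module: the client signs the raw POST body with a shared secret, and the server verifies before saving. All names and the header/transport details are illustrative assumptions:

```python
import hashlib
import hmac

# Shared between the benchmark runner and the codespeed server (illustrative).
SECRET = b"change-me"

def sign(body):
    """Client side: sign the raw POST body with the shared secret."""
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body, signature):
    """Server side: constant-time comparison before accepting the result."""
    return hmac.compare_digest(sign(body), signature)
```

The signature could travel in a request header; proper per-user authentication would still be the more robust option.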

Multiple Project UI

Currently the front-end provides no way to select which project to view, which makes it hard to use as a multi-project service.

If this seems reasonable I'm willing to add the proposed changes below:

  1. Add a slug field to Project
  2. Move all URLs to be prefixed with the project slug
  3. Have / redirect if there's only one project (maybe controlled by a flag?) or display a list otherwise

After creating an environment and a default project, /comparison/ fails with a 500 error

To reproduce, create a new codespeed server. Create an environment and a project. Then visit the comparison page. Unlike the other two pages in this configuration (which report a message about requiring at least one executable), the comparison page raises an unhandled exception resulting in a 500.

Exception Type: DoesNotExist
Exception Value: Revision matching query does not exist.

The last codespeed line in the traceback is:

/home/exarkun/Scratch/Sources/tobami-codespeed-f2b86bc/speedcenter/codespeed/views.py in getcomparisonexes
82. rev = Revision.objects.filter(project=proj).latest('date')

Required Django version incorrect

According to the readme, at least version 1.1 of Django is required to use codespeed. However, at least two features from 1.2 are used inside the code: syndication views and model validation.
The example project makes use of django.contrib.staticfiles, which was introduced in Django 1.3. Codespeed's templates use STATIC_URL, so this is not going to work in older Django versions right away.

Is supporting Django versions < 1.3 a goal? If it is, maybe some instructions for using codespeed with these versions are needed; otherwise the readme should probably be updated.

Switch to HTML5

With the merge of a couple of JS improvements, commit ebf9873 was merged that makes use of HTML5's custom data attributes. My question in the pull request was probably not visible enough, so I'm asking here. Using custom data attributes is only valid in HTML5 (or you could provide a custom DTD, but I don't think anyone does that).

In base.html a comment suggests that a switch to HTML5 was considered: https://github.com/tobami/codespeed/blob/master/example/templates/base.html#L18

I don't have any particular feature in mind that needs HTML5 (except custom data attributes), but I don't see anything wrong with upgrading.

Allow option to display axis in log scale for time and ratio comparison plots

Sam Mason reported that "when you're showing something that's
taking several seconds and one that's taking less than a second the
smaller one just gets pushed down to zero and all the detail is lost."

It had been considered but discarded as the default due to most people's inability to correctly interpret log scale. It would be useful as an option, however.

enhancement - remove items from legend that have no data

In the Comparisons tab, sometimes reading the chart is difficult when there are a lot of entries in the legend. It's hard to tell the difference between all the different shades of green and mapping them back to the bars.

So, I have a suggestion for improving the display.

If an executable/environment pair has no data point available, remove it from the legend and chart. This will rescale the displayed data so that it is larger and easier to read. Right now those "no data" spots cause the chart to rescale, making it hard to read.

Problems with benchmark names on timeline

Not sure what the cause is, but I did not have problems with names like 'BinaryTrees (1 cores, 1 1 6)' before.
In the current version that causes a JavaScript error on the timeline: Uncaught Syntax error, unrecognized expression: [value=BinaryTrees (1 cores, 1 1 6)] jquery-1.6.2.min.js:17

See:
http://soft.vub.ac.be/~ppp/codespeed/timeline/

I haven't had the chance to investigate more closely yet.
I will hopefully come back to it later.
