
Mpl2 display #234

Merged
merged 5 commits into from
Feb 13, 2017

Conversation

bmcfee
Collaborator

@bmcfee bmcfee commented Feb 9, 2017

This PR updates the test suite for matplotlib 2.0 compatibility, and fixes #233.

Note: the display still works on 1.5, but for testing purposes, we now use 2.0 (and 2.0 default styles).

I had to change a couple of small things in the test setup to get things working -- notably, removing tests/__init__.py (tests does not need to be a module) and adding an extras_require for testing.
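For reference, an extras_require entry for testing typically looks like the sketch below. This is a hypothetical excerpt, not mir_eval's actual setup.py; the package names and version pins are illustrative assumptions.

```python
# Hypothetical setup.py excerpt: declaring optional dependency groups.
# The 'display' and 'testing' extras and their contents are illustrative,
# not mir_eval's actual configuration.
from setuptools import setup

setup(
    name='mir_eval',
    # ... other metadata elided ...
    extras_require={
        'display': ['matplotlib>=2.0'],
        'testing': ['nose', 'matplotlib>=2.0'],
    },
)
```

With an extra declared this way, CI can install the test dependencies via `pip install -e .[testing]` without forcing them on regular users.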

@bmcfee bmcfee added this to the 0.4 milestone Feb 9, 2017
@bmcfee bmcfee requested a review from craffel February 9, 2017 18:44
@bmcfee
Collaborator Author

bmcfee commented Feb 9, 2017

Note: this PR needs to get merged before any of the others, because display tests are currently breaking everything else.

@bmcfee
Collaborator Author

bmcfee commented Feb 9, 2017

It looks like the source separation metrics are getting some minor errors now. I can't tell if it's due to numerical precision or an actual change in the data. Note: I didn't change any data here, only file locations.

@dawenl got a minute to investigate?

@bmcfee
Collaborator Author

bmcfee commented Feb 9, 2017

Follow-up: I don't see any of these separation errors when I run the tests locally. The only notable difference between my environment and Travis is that I have OpenBLAS-linked numpy, while Travis uses vanilla numpy. Spoke too soon; it does indeed fail locally as well.

I suspect that issue is orthogonal to this PR though.

EDIT: confirmed that these separation failures are independent of the changes in this PR. I recommend we make a separate PR to fix the separation fixtures.

@craffel
Collaborator

craffel commented Feb 9, 2017

> EDIT: confirmed that these separation failures are independent of the changes in this PR. I recommend we make a separate PR to fix the separation fixtures.

@faroit or @ecmjohnson may be the best people to ask since they touched the codebase most recently.

@faroit

faroit commented Feb 9, 2017

@craffel I will have a look this weekend.

@bmcfee
Collaborator Author

bmcfee commented Feb 11, 2017

@faroit @rabitt pointed out that this might be due to randomness in the bss_eval functions.

If so, this can be resolved by seeding the RNG before each test function executes; librosa does this via a helper function in its test suite.
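The seeding approach described above can be sketched as a decorator that reseeds NumPy's global RNG before each wrapped test runs. This is a generic sketch, not librosa's or mir_eval's actual helper; the decorator name and default seed are assumptions.

```python
import functools
import numpy as np

def seeded(seed=0):
    """Return a decorator that reseeds NumPy's global RNG before each call,
    making tests that draw random numbers deterministic across runs."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            np.random.seed(seed)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@seeded(seed=1234)
def draw_samples():
    # With a fixed seed, repeated calls return identical values,
    # so metric comparisons against stored fixtures become stable.
    return np.random.rand(3)
```

Applying this to each bss_eval test would remove run-to-run variation as a source of fixture mismatches.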

@bmcfee bmcfee modified the milestones: 0.5, 0.4 Feb 11, 2017
@bmcfee
Collaborator Author

bmcfee commented Feb 13, 2017

As mentioned over in #236, I think this one's good to go. The only remaining failures are on separation metrics.

A quick CR would be nice, since this PR changes a bit of how the tests are executed (beyond just the display submodule). It does not change any functionality, though.

Review comment on the Travis configuration:

before_script:
- pep8 mir_eval evaluators tests

script:
- nosetests -v --with-coverage --cover-package=mir_eval            # before
- nosetests -v --with-coverage --cover-package=mir_eval -w tests   # after
Collaborator


So the -w tests flag tells nosetests to run within the tests folder, and this is why you removed the tests prefix from all the paths?

Collaborator Author


Yep. This way you can also execute from within the tests directory, and the tests don't have to know that they live within the tests/ directory of the repo.
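A common way to keep tests independent of the working directory, as described above, is to resolve fixture paths relative to the test file itself. This is a generic sketch; the `data/` subdirectory, the fixture format, and the helper name are illustrative assumptions, not mir_eval's actual layout.

```python
import json
import os

# Directory containing this test file, regardless of where nosetests
# was invoked from (repo root, tests/, or anywhere else).
TEST_DIR = os.path.dirname(os.path.abspath(__file__))

def load_fixture(name):
    """Load a JSON fixture from a data/ folder next to this test file.
    The 'data' subdirectory and JSON format are illustrative assumptions."""
    with open(os.path.join(TEST_DIR, 'data', name)) as fh:
        return json.load(fh)
```

With this pattern, neither the Travis `-w tests` flag nor a local `cd tests && nosetests` run changes which fixture files the tests find.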

Collaborator


Cool.

@craffel
Collaborator

craffel commented Feb 13, 2017

Looked it over; looks good. Just want to reiterate how awesome the display module is!

@bmcfee
Collaborator Author

bmcfee commented Feb 13, 2017

@craffel thanks! merge away?

@craffel craffel merged commit e008bd3 into mir-evaluation:master Feb 13, 2017
@bmcfee
Collaborator Author

bmcfee commented Feb 24, 2017

@faroit did you have a chance to look at the source sep fixtures?


Successfully merging this pull request may close these issues.

matplotlib 2.0 may break regression tests on display module
3 participants