
Bug in pandas assert_*_equal wrapper test causes other tests to pass when they should fail #1447


Closed
kandersolar opened this issue Apr 21, 2022 · 3 comments · Fixed by #1448

@kandersolar
Member

Describe the bug
Using the py36-min environment (ci/requirements-py36-min.yml), I get different test results in the iotools tests depending on whether I include test_conftest.py in the test run.

To Reproduce

$ conda env create --file ci/requirements-py36-min.yml -n pvlib36min
$ conda activate pvlib36min
$ pytest pvlib/tests/test_conftest.py pvlib/tests/iotools  # no failures
$ pytest pvlib/tests/iotools/  # some failures

Expected behavior
I expect to be able to run test modules individually and get the same results I'd get by running the entire test suite.

Versions:

  • pvlib.__version__: master (0.9.1+2.g8d0f863)
  • pandas.__version__: 0.22.0
  • python: 3.6.13

Additional context
The failures are mostly about whether dataframe columns and indexes are equal, so it might have something to do with the assert_X_equal definitions in conftest.py? I also mentioned the psm3 failure here: #1374 (review)
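
For context, the conftest wrappers presumably exist to smooth over the pandas API change from check_less_precise to rtol/atol around pandas 1.1.0. A rough sketch of the idea (not the exact pvlib code; the version cutoff and kwarg handling here are simplified):

import pandas as pd
from pkg_resources import parse_version

def assert_frame_equal(left, right, **kwargs):
    # older pandas only understands check_less_precise; newer pandas
    # deprecates it in favor of rtol/atol, so translate accordingly
    if parse_version(pd.__version__) < parse_version('1.1.0'):
        rtol = kwargs.pop('rtol', None)
        atol = kwargs.pop('atol', None)
        if rtol is not None or atol is not None:
            kwargs['check_less_precise'] = True
    pd.testing.assert_frame_equal(left, right, **kwargs)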

@kandersolar
Member Author

I think it's because the test__check_pandas_assert_kwargs test (https://github.com/pvlib/pvlib-python/blob/master/pvlib/tests/test_conftest.py#L55) is replacing the pandas assert methods with no-op dummies and not cleaning up after itself. I added this at the bottom of test_conftest.py:

import pandas

def test_bad():
    idx1 = pandas.Index([1, 2, 3])
    idx2 = pandas.Index([4, 5, 6])
    pandas.testing.assert_index_equal(idx1, idx2)

and get no errors with pytest pvlib\tests\test_conftest.py but a failure as expected with pytest pvlib\tests\test_conftest.py::test_bad. So I guess somehow the monkey patching going on in test__check_pandas_assert_kwargs is leaking out and affecting other tests. Yikes. Commenting out the test__check_pandas_assert_kwargs function reveals failures in some modelchain tests in addition to the iotools tests I saw originally.
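
As a minimal, self-contained illustration of the failure mode (not pvlib's actual test code), a patch applied by direct assignment, without monkeypatch/mocker cleanup, stays in place for every test that runs afterwards in the same session:

import pandas

def test_patches_pandas():
    # replaces the real function and never restores it
    pandas.testing.assert_index_equal = lambda *args, **kwargs: None

def test_bad():
    # passes erroneously whenever test_patches_pandas ran first
    idx1 = pandas.Index([1, 2, 3])
    idx2 = pandas.Index([4, 5, 6])
    pandas.testing.assert_index_equal(idx1, idx2)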

So I suppose we need to:

  1. Change test__check_pandas_assert_kwargs so whatever it does internally doesn't affect other tests
  2. Fix the failures that this issue was hiding. Here's the list I'm getting with pytest --remote-data:
    FAILED pvlib/tests/test_modelchain.py::test__assign_total_irrad - AssertionError: DataFrame.columns are different
    FAILED pvlib/tests/test_modelchain.py::test_prepare_inputs_from_poa - AssertionError: DataFrame.columns are different
    FAILED pvlib/tests/test_modelchain.py::test__prepare_temperature - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test__prepare_temperature_arrays_weather - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test_run_model_from_effective_irradiance[<lambda>] - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test_run_model_from_effective_irradiance[tuple] - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test_run_model_from_effective_irradiance[list] - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test_run_model_from_effective_irradiance_poa_global_differs - AssertionError: Series are different
    FAILED pvlib/tests/test_modelchain.py::test_run_model_from_effective_irradiance_weather_single_array - AssertionError: Series are different
    FAILED pvlib/tests/iotools/test_bsrn.py::test_read_bsrn[bsrn-pay0616.dat.gz] - AssertionError: Index are different
    FAILED pvlib/tests/iotools/test_bsrn.py::test_read_bsrn[bsrn-lr0100-pay0616.dat] - AssertionError: Index are different
    FAILED pvlib/tests/iotools/test_bsrn.py::test_read_bsrn_logical_records - AssertionError: Index are different
    FAILED pvlib/tests/iotools/test_psm3.py::test_read_psm3_map_variables - AssertionError: Index are different
    FAILED pvlib/tests/iotools/test_pvgis.py::test_read_pvgis_hourly[testfile2-expected_pv_json-metadata_exp2-inputs_exp2-False-None] - AssertionError: DataFrame.columns are different
    FAILED pvlib/tests/iotools/test_pvgis.py::test_read_pvgis_hourly[testfile3-expected_pv_json_mapped-metadata_exp3-inputs_exp3-True-json] - AssertionError: DataFrame.columns are different
    FAILED pvlib/tests/iotools/test_pvgis.py::test_get_pvgis_hourly[testfile2-expected_pv_json-args2-False-https://re.jrc.ec.europa.eu/api/seriescalc?lat=45&lon=8&outputformat=json&angle=30&aspect=0&pvtechchoice=CIS&mountingplace=free&trackingtype=2&components=1&usehorizon=1&raddatabase=PVGIS-CMSAF&startyear=2013&endyear=2014&pvcalculation=1&peakpower=10&loss=5&optimalangles=1]
    FAILED pvlib/tests/iotools/test_pvgis.py::test_get_pvgis_hourly[testfile3-expected_pv_json_mapped-args3-True-https://re.jrc.ec.europa.eu/api/seriescalc?lat=45&lon=8&outputformat=json&angle=30&aspect=0&pvtechchoice=CIS&mountingplace=free&trackingtype=2&components=1&usehorizon=1&raddatabase=PVGIS-CMSAF&startyear=2013&endyear=2014&pvcalculation=1&peakpower=10&loss=5&optimalangles=1]
    

@wholmgren
Member

wholmgren commented Apr 21, 2022

NEP 29 says

When a project releases a new major or minor version, we recommend that they support at least all minor versions of Python introduced and released in the prior 42 months from the anticipated release date with a minimum of 2 minor versions of Python, and all minor versions of NumPy released in the prior 24 months from the anticipated release date with a minimum of 3 minor versions of NumPy.

Relevant pandas releases:

  • 1.0.3: March 17, 2020
  • 1.0.4: May 28, 2020
  • 1.0.5: June 17, 2020
  • 1.1.0: July 28, 2020.

I suggest that we remove the test__check_pandas_assert_kwargs function now, then bump the pandas dependency to 1.1.0 and clean up the tests once we anticipate that the next pvlib release will land after June 17, 2022 (24 months after the pandas 1.0.5 release, per the window quoted above).

And good sleuthing!

@kandersolar
Member Author

It would be nice to get rid of the version complexity in our pandas asserts, so bumping to 1.1.0 is appealing.

An alternative that doesn't require bumping all the way to pandas 1.1.0 right away: using either monkeypatch or mocker (but not both) in test__check_pandas_assert_kwargs prevents the mocks/patches from leaking, and bumping to pandas==0.25.0 gets all the miscellaneous failures listed in my previous comment passing again (except for the PSM3 failure, which is a real oversight in the test function).
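
For reference, a minimal sketch (assuming pytest-mock is installed; this is not the actual pvlib fix) showing that a patch made through the mocker fixture alone is undone at teardown, so it cannot leak into later tests:

import pandas
import pytest

def test_patch_is_scoped_to_the_test(mocker):
    # mocker registers the patch and undoes it when this test finishes
    mock = mocker.patch('pandas.testing.assert_index_equal')
    pandas.testing.assert_index_equal(pandas.Index([1]), pandas.Index([2]))
    mock.assert_called_once()

def test_real_function_is_restored():
    # after teardown the real assert is back, so unequal indexes raise
    with pytest.raises(AssertionError):
        pandas.testing.assert_index_equal(pandas.Index([1]), pandas.Index([2]))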

@kandersolar kandersolar added this to the 0.9.2 milestone Apr 22, 2022
@kandersolar kandersolar changed the title Test suite has strange interdependency between files Bug in pandas assert_*_equal wrapper test causes other tests to pass when they should fail Apr 25, 2022