
Skipped flaky part of test_time #25894


Merged · 3 commits · Mar 27, 2019

Conversation

WillAyd (Member) commented Mar 27, 2019

Workaround for #25875 to get CI passing: I kept the working part of the test intact and split the failing piece off into a separate test, which may be more explicit anyway.

I haven't been able to reproduce this locally, so I plan to either keep the original issue open or create a new one for a more permanent fix, which may require a total refactor of the test.
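The approach described above (keep the stable part strict, split the flaky part into its own test marked with a non-strict xfail) can be sketched roughly like this. The class name and assertion bodies are illustrative placeholders, not the actual pandas test code:

```python
import pytest
from datetime import datetime


class TestTimePlotting:
    def test_time(self):
        # Stable portion of the original test stays strict:
        # a failure here fails the suite.
        t = datetime(1, 1, 1, 3, 30, 0)
        assert t.hour == 3

    @pytest.mark.xfail(strict=False, reason="Unreliable test")
    def test_time_change_xlim(self):
        # Flaky portion split off. With strict=False, a failure is
        # reported as "expected to fail" and an unexpected pass does
        # NOT fail the suite, so CI stays green either way.
        t = datetime(1, 1, 1, 3, 30, 0)
        assert t.minute == 30
```

With `strict=True` the marker would instead turn an unexpected pass into a failure, which is why non-strict xfail is the usual choice for known-flaky tests.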

@gfyoung

codecov bot commented Mar 27, 2019

Codecov Report

Merging #25894 into master will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master   #25894   +/-   ##
=======================================
  Coverage   91.47%   91.47%           
=======================================
  Files         175      175           
  Lines       52863    52863           
=======================================
  Hits        48357    48357           
  Misses       4506     4506
Flag Coverage Δ
#multiple 90.04% <ø> (ø) ⬆️
#single 41.8% <ø> (ø) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ac318d2...6a8aee5. Read the comment docs.

codecov bot commented Mar 27, 2019

Codecov Report

Merging #25894 into master will increase coverage by <.01%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master   #25894      +/-   ##
==========================================
+ Coverage   91.47%   91.47%   +<.01%     
==========================================
  Files         175      175              
  Lines       52863    52863              
==========================================
+ Hits        48357    48358       +1     
+ Misses       4506     4505       -1
Flag Coverage Δ
#multiple 90.04% <ø> (ø) ⬆️
#single 41.8% <ø> (-0.01%) ⬇️
Impacted Files Coverage Δ
pandas/util/testing.py 89.83% <0%> (+0.1%) ⬆️

Powered by Codecov. Last update ac318d2...2b9c032. Read the comment docs.

@gfyoung added the "CI: Continuous Integration" and "Unreliable Test: Unit tests that occasionally fail" labels on Mar 27, 2019
@pytest.mark.xfail(strict=False, reason="Unreliable test")
def test_time_change_xlim(self):
t = datetime(1, 1, 1, 3, 30, 0)
deltas = np.random.randint(1, 20, 3).cumsum()
A contributor commented:

I believe the random ints for deltas are what can cause the failures.

gfyoung (Member) replied Mar 27, 2019:

Potentially, though probably best to investigate after this PR.
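The suspicion above is that the randomized cumulative deltas make the test nondeterministic. A minimal sketch of one common de-flaking approach, seeding the RNG so the "random" offsets are reproducible across CI runs (the `randint(1, 20, 3).cumsum()` call mirrors the snippet in the diff; the seeded setup is a hypothetical fix, not what this PR does):

```python
import numpy as np

# A fixed seed makes the generated deltas identical on every run,
# removing the randomness suspected of causing intermittent failures.
rng = np.random.RandomState(42)
deltas = rng.randint(1, 20, 3).cumsum()

# Re-seeding with the same value reproduces the exact same offsets.
rng2 = np.random.RandomState(42)
assert (deltas == rng2.randint(1, 20, 3).cumsum()).all()
```

An alternative would be to replace the random deltas with fixed values outright, which makes the failing input explicit in the test body.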

@jreback jreback added this to the 0.25.0 milestone Mar 27, 2019
@jreback jreback merged commit 850fbb5 into pandas-dev:master Mar 27, 2019
jreback (Contributor) commented Mar 27, 2019

ok so let's see if this works

@WillAyd WillAyd deleted the test-time-failure branch January 16, 2020 00:34