gpclient tests are failing sporadically #59

Open
ralphlange opened this issue Jul 22, 2019 · 2 comments
@ralphlange (Contributor):

We are seeing intermittent build/test failures with the gpclient tests:

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.epics.gpclient.PVEventRecorderTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.603 sec
Running org.epics.gpclient.datasource.DataSourceImplementationTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.032 sec
Running org.epics.gpclient.PassiveRateDecouplerTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.795 sec
Running org.epics.gpclient.ActiveRateDecouplerTest
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 4.03 sec <<<
FAILURE!
activeScanningRate(org.epics.gpclient.ActiveRateDecouplerTest)  Time elapsed:
1.022 sec  <<< FAILURE!
java.lang.AssertionError: 
Expected: a value less than or equal to <5>
     but: <6> was greater than <5>

From a distance, it looks like the test makes assumptions about parallelism, timing, and resource usage on the host that do not always hold.
Is there a way to make this test more robust?
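For illustration, the fragile pattern looks roughly like the following sketch (a hypothetical reconstruction, not the actual `ActiveRateDecouplerTest` code; the class name and structure are assumptions):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SleepCountSketch {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        AtomicInteger events = new AtomicInteger();
        // Fire an "update" every 100 ms, starting immediately.
        scheduler.scheduleAtFixedRate(events::incrementAndGet, 0, 100, TimeUnit.MILLISECONDS);

        Thread.sleep(500); // only a *minimum* sleep time is guaranteed

        scheduler.shutdownNow();
        int count = events.get();
        // If the sleep overruns on a loaded host, a 6th event slips in, and an
        // assertion like assertThat(count, lessThanOrEqualTo(5)) fails sporadically.
        System.out.println("events observed: " + count);
    }
}
```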

@ralphlange ralphlange added the bug label Jul 22, 2019
@shroffk shroffk self-assigned this Jul 31, 2019
@shroffk (Contributor) commented Aug 1, 2019:

I am having trouble reproducing it predictably.

To an extent the test does make timing assumptions... I will have to think of a completely different set of checks.
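One possible "different set of checks" (a sketch under assumed semantics, not code from the repository) is to invert the dependency on wall-clock time: instead of sleeping a fixed interval and asserting on a count, wait until a fixed number of events has arrived, with a generous timeout that serves only as a safety net:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class LatchBasedCheck {
    public static void main(String[] args) throws InterruptedException {
        final int expectedEvents = 5;
        CountDownLatch latch = new CountDownLatch(expectedEvents);
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Simulate a 100 ms update source; each update counts the latch down.
        scheduler.scheduleAtFixedRate(latch::countDown, 0, 100, TimeUnit.MILLISECONDS);

        // Passes as soon as the 5th event arrives, however slow or fast the
        // host is; the timeout only bounds how long a broken run can hang.
        boolean arrived = latch.await(5, TimeUnit.SECONDS);
        scheduler.shutdownNow();
        System.out.println("received " + expectedEvents + " events: " + arrived);
    }
}
```

Because the check waits on event arrival rather than elapsed time, scheduler jitter or an overrunning sleep can no longer turn a passing run into a failure.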

@ralphlange (Contributor, Author):

Well, the test sets up sending monitors every 100 ms, creates a subscription, then sleeps for 500 ms, and expects to have received 4 or 5 updates.
If the sleep takes longer than 500 ms (and sleep functions usually guarantee only a minimum sleep time), the subscription may well receive 6 updates, which is what is happening in our case.

I would suggest either simply loosening the criteria to allow 4-6 updates after the 500 ms sleep,
or turning things around: measure the time between updates and apply criteria to the resulting statistics (min/max/avg after removing outliers, or similar).
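The second suggestion could be sketched as follows (a minimal illustration with assumed class names and acceptance bounds, not the project's code): record a timestamp per update, then assert on the statistics of the inter-update intervals rather than on a raw count.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class IntervalStatsSketch {
    public static void main(String[] args) throws InterruptedException {
        final int samples = 10;
        final long[] stamps = new long[samples];
        final AtomicInteger idx = new AtomicInteger();
        final CountDownLatch done = new CountDownLatch(samples);
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Simulated 100 ms update source: record the arrival time of each update.
        scheduler.scheduleAtFixedRate(() -> {
            int i = idx.getAndIncrement();
            if (i < samples) {
                stamps[i] = System.nanoTime();
                done.countDown();
            }
        }, 0, 100, TimeUnit.MILLISECONDS);
        done.await();
        scheduler.shutdownNow();

        // Compute min/max/avg of the inter-update intervals in milliseconds.
        long min = Long.MAX_VALUE, max = Long.MIN_VALUE, sum = 0;
        for (int i = 1; i < samples; i++) {
            long d = (stamps[i] - stamps[i - 1]) / 1_000_000;
            min = Math.min(min, d);
            max = Math.max(max, d);
            sum += d;
        }
        double avg = sum / (double) (samples - 1);
        System.out.println("min=" + min + " max=" + max + " avg=" + avg);
        // A loose band on the average tolerates individual slow updates,
        // unlike an exact count after a fixed sleep.
        System.out.println("avg within 50..300 ms: " + (avg >= 50 && avg <= 300));
    }
}
```

Since `scheduleAtFixedRate` corrects for drift, the average interval converges to the nominal period even when individual firings are delayed, so a wide band on the average is stable across loaded hosts.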

shroffk added a commit to shroffk/epicsCoreJava that referenced this issue Sep 10, 2019
shroffk added a commit that referenced this issue Sep 10, 2019
#59 temporarily increasing the expected event count