
Refactoring of DBS3 Unit-tests #100

Open · ericvaandering opened this issue Nov 7, 2013 · 3 comments

Comments

@ericvaandering (Member)

Original TRAC ticket 2876 reported by giffels
In the current design of the unit tests, client and server tests live in separate code. The web-layer server tests and the client tests do nearly the same thing but share no code, which causes duplication.

In addition, we have separate unit tests and validation tests on the client side; the unit tests do not validate the output of DBS.

Furthermore, Valentin asks for a set of data that is present in each DBS instance and can be used for integration tests with DAS. We have defined such a set of data in our deployment test. Valentin would like a package that provides some sort of API for accessing this pre-defined data, so it can be compared with the output of DAS. Such an API could also be used in other projects for integration tests. We may have to increase the amount of pre-defined data.
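
A minimal sketch of what such a package might expose, with made-up module and dataset names, assuming rows shaped like the output of the DBS datasets API:

# DBSUnitTestSuite/datasets.py -- hypothetical sketch, names are illustrative

# hard-coded meta-data, shaped like the output of the DBS "datasets" API
_DATASETS = [
    {"dataset": "/unittest_primary_ds/unittest_acq_era-v1/GEN-SIM"},
    {"dataset": "/unittest_primary_ds/unittest_acq_era-v2/GEN-SIM"},
]

def get():
    # return copies so callers cannot mutate the shared test data
    return [dict(row) for row in _DATASETS]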

@ghost assigned giffels Nov 7, 2013
@ericvaandering (Member, Author)

Author: valya
I would suggest the following. We create an independent package, e.g. DBSUnitTestSuite or something similar. The package will provide APIs similar to DBS3, e.g. datasets, whose purpose will be to give us (DBS, DAS, and other developers) a consistent set of meta-data that can be used for integration/unit-test purposes. The classes will hold the meta-data (keeping hard-coded names). The output of these APIs will be similar to the DBS APIs; the amount of detail we need to provide will depend on how deep we want our integration tests to go. But to start with, a simple entity with a leading key, e.g. dataset.name or file.name, will be sufficient.

Then it would be part of the DBS deployment procedure to release a new DBS instance with the set of meta-data provided by DBSUnitTestSuite. The DBS team will provide simple Python code which inserts the meta-data into a new DBS instance (the code will be used inside the DBS deploy script as part of the post-install step), e.g.:

from DBSUnitTestSuite import datasets

# get the pre-defined meta-data from the test suite
data = datasets.get()

# insert the data into the DBS instance
for row in data:
    # call the DBS insert API of your DBS instance here;
    # insert_row is a placeholder name for that client call
    insert_row(row)

# do the unit tests against the inserted data
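
As an illustration only, a helper like insert_row could wrap the DBS3 client. The dbs.apis.dbsClient import and the insertBulkBlock call reflect the real DBS3 client, but the URL and the assumption that each row is a complete block dump are made up for the sketch:

# hypothetical sketch of the insert_row placeholder above
from dbs.apis.dbsClient import DbsApi

api = DbsApi(url="https://cmsweb.cern.ch/dbs/int/global/DBSWriter")  # example URL

def insert_row(block_dump):
    # insertBulkBlock expects a complete block dump (dataset, block, files, ...);
    # rows that only carry a leading key would have to be expanded first
    api.insertBulkBlock(block_dump)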

DAS and the other systems, meanwhile, will only query this data in their integration/unit tests, e.g.:

# DAS unit test
import json
from urllib2 import urlopen  # Python 2

from DBSUnitTestSuite import datasets

# get the expected data from the test suite
data = datasets.get()
# get the actual data from the DBS instance ("http://dbs_url/datasets" is a placeholder)
dbs_data = json.load(urlopen("http://dbs_url/datasets"))
# compare the test data with the data retrieved from the DBS instance
for row in data:
    if row not in dbs_data:
        raise Exception("FAIL UNIT TEST")
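
For reference, the same comparison could be phrased as a standard unittest case (a sketch under the same assumptions, including the placeholder URL):

# sketch: the same check as a unittest case
import json
import unittest
from urllib2 import urlopen  # Python 2

from DBSUnitTestSuite import datasets

class TestPredefinedData(unittest.TestCase):
    def test_datasets_present_in_dbs(self):
        expected = datasets.get()
        found = json.load(urlopen("http://dbs_url/datasets"))  # placeholder URL
        for row in expected:
            self.assertIn(row, found)

if __name__ == "__main__":
    unittest.main()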

How does this sound?
Valentin.

@ericvaandering (Member, Author)

Author: giffels
Replying to [comment:1 valya]:

> I would suggest the following. We create an independent package, e.g. DBSUnitTestSuite or something similar. The package will provide APIs similar to DBS3, e.g. datasets, whose purpose will be to give us (DBS, DAS, and other developers) a consistent set of meta-data that can be used for integration/unit-test purposes. The classes will hold the meta-data (keeping hard-coded names). The output of these APIs will be similar to the DBS APIs; the amount of detail we need to provide will depend on how deep we want our integration tests to go. But to start with, a simple entity with a leading key, e.g. dataset.name or file.name, will be sufficient.

We already agreed on that during the O&C week at CERN. It would be good to have a separate package for this. We can distribute it as an additional RPM using the sub-package mechanism. Concerning the API, I think it should have the same interface as the DBS3 client.
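
A rough sketch of that idea, assuming the DBS3 client's listDatasets call as the model (class name and filtering logic are illustrative):

# hypothetical: the test-suite API mirrors the DBS3 client interface
class DBSUnitTestApi(object):
    def __init__(self, data):
        self._data = data

    def listDatasets(self, **kwargs):
        # same call style as dbs.apis.dbsClient.DbsApi.listDatasets,
        # but answered from the hard-coded test meta-data
        return [row for row in self._data
                if all(row.get(key) == value for key, value in kwargs.items())]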

> Then it would be part of the DBS deployment procedure to release a new DBS instance with the set of meta-data provided by DBSUnitTestSuite. The DBS team will provide simple Python code which inserts the meta-data into a new DBS instance (the code will be used inside the DBS deploy script as part of the post-install step).

I wouldn't include that step in the deployment procedure inside the DBS deploy script. We run deployment tests against each instance once a new cmsweb installation is available, which ensures that the data is present. We would like to avoid injecting duplicated data, even though it should not harm the installation.
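
One way to keep the injection step idempotent, sketched with the listDatasets call of the DBS3 client and the hypothetical insert_row helper from above:

def insert_if_missing(api, row):
    # skip rows whose dataset this DBS instance already knows,
    # so re-running the step does not inject duplicated data
    if not api.listDatasets(dataset=row["dataset"]):
        insert_row(row)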

> How does this sound?

Sounds good,
Manuel

@ericvaandering (Member, Author)

Author: valya
Manuel, the only reason I suggested including the test run in the deploy script is to ensure that other systems can start testing the DBS instance(s) right away. The problem is that we have one week for integration tests once a new version of the data-service goes to pre-production. I don't want to be in the position of waiting for you guys before I can do my integration tests. If you delay your testing, my time will be squeezed. Trust me, something will certainly pop up (sickness, travel, kids, etc.), and we will end up having to do the integration tests on the last day, and quite often on a weekend. So I don't care how you do it, but I do care that a procedure is in place, and I want a guarantee that the integration tests will have sufficient time to complete. For instance, the integration tests could start within 24h of the deployment on the pre-production node.
