This repository has been archived by the owner on Nov 23, 2023. It is now read-only.
Bug Description
Submitting data to a dataset that has the same STAC ID in the collection/catalog.json as another dataset results in the dataset being imported to S3, but the 'update root catalog' function then fails because a catalog cannot have two children with the same ID.
Need to check what happens when submitting a partial dataset version update, like adding one item.json to a dataset.
Tasks
Look at options for resolving this bug; possible options are:
drop the static catalogue
Restrict the supplied files so that only collection.json is allowed
If a collection.json or catalog.json is being updated, store the STAC IDs in DynamoDB mapped to the dataset for every dataset import
Check that STAC ID doesn't exist in a different dataset
If it isn't unique, send a useful message back to the user
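The DynamoDB option above can be sketched roughly as follows. This is a minimal illustration only: an in-memory dict stands in for the DynamoDB table, and the names (`register_stac_id`, `stac_id_to_dataset`) are hypothetical, not part of the actual Geostore code.

```python
# Hypothetical sketch: map each STAC ID to the dataset that owns it,
# and reject an import whose collection/catalog ID already belongs to
# a different dataset. A real implementation would back this with a
# DynamoDB table rather than a dict.

stac_id_to_dataset: dict[str, str] = {}  # stands in for a DynamoDB table


def register_stac_id(stac_id: str, dataset_id: str) -> tuple[bool, str]:
    """Return (ok, message); fail if the STAC ID belongs to another dataset."""
    owner = stac_id_to_dataset.get(stac_id)
    if owner is not None and owner != dataset_id:
        return False, (
            f"STAC ID '{stac_id}' is already used by dataset '{owner}'; "
            "STAC IDs must be unique across datasets."
        )
    stac_id_to_dataset[stac_id] = dataset_id
    return True, "OK"
```

With real DynamoDB, the check and the write should be a single atomic operation, e.g. a `put_item` guarded by a `ConditionExpression` such as `attribute_not_exists(stac_id)`, so two concurrent imports cannot both claim the same ID.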
How to Reproduce
Create a dataset and import a dataset version
Create a 2nd dataset and import a dataset version with the same staging data as the 1st version
What did you expect to happen?
The Step Function notices that the STAC ID is the same as another dataset's, stops the process, and returns a message to the user, before any other validation or file copying
The root catalog is also not updated
What actually happened?
Dataset version was processed successfully
The root catalog had a new version identical to the previous version, with no child link to the just-created dataset
Write the ID in the Geostore to match the random ID string created by the Geostore dataset function, so it is always unique
Check the IDs are unique across datasets and exit the Step Function if they are not, and return a message to the user (could use pystac or put all the STAC IDs in DynamoDB)
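The uniqueness check above could run before any file copying by pulling the `id` out of the submitted collection.json/catalog.json and comparing it against the IDs already recorded for other datasets. A rough sketch, assuming a hypothetical `validate_stac_id` helper and a Lambda-style result dict (so a Step Function Choice state could branch on `valid` and stop early):

```python
import json


def validate_stac_id(metadata_json: str, existing_ids: set[str]) -> dict:
    """Parse a submitted collection/catalog file and check that its STAC ID
    is not already used by another dataset. Returns a Lambda-style result
    dict so a Step Function Choice state can stop the process on failure."""
    stac_id = json.loads(metadata_json)["id"]
    if stac_id in existing_ids:
        return {
            "valid": False,
            "message": f"STAC ID '{stac_id}' already exists in another dataset",
        }
    return {"valid": True, "stac_id": stac_id}
```

Here `existing_ids` would come from whatever store holds the STAC-ID-to-dataset mapping (pystac walking the static catalogue, or a DynamoDB scan, per the options above).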
Software Context
Operating system: AWS Console Lambda 'test' tool
Environment: Nonprod
Relevant software versions:
Additional context
Definition of Done
CODING guidelines
non-functional requirements