
Several pytests failing with KeyError: 'S3BucketMigrationDatetime' #2203

Closed
amykglen opened this issue Nov 15, 2023 · 6 comments

@amykglen (Member)

I went to run the ARAX pytest suite today in master (on my laptop) and several pytests seem to be failing with KeyError: 'S3BucketMigrationDatetime'.

here's one example:

pytest -vsk test_qualified_regulates_query
2023-11-15T14:40:01.246769 DEBUG: (6504) [] Number of nodes in KG is 284
2023-11-15T14:40:01.247242 DEBUG: (6504) [] Number of nodes in KG by type is Counter({'biolink:Gene': 166, 'biolink:Protein': 118})
2023-11-15T14:40:01.247271 DEBUG: (6504) [] Number of edges in KG is 566
2023-11-15T14:40:01.247488 DEBUG: (6504) [] Number of edges in KG by type is Counter({'biolink:affects': 283, 'biolink:occurs_together_in_literature_with': 283})
2023-11-15T14:40:01.247643 DEBUG: (6504) [] Number of edges in KG with attributes is 566
2023-11-15T14:40:01.248656 DEBUG: (6504) [] Number of edges in KG by attribute Counter({None: 1415, 'normalized_google_distance': 283, 'virtual_relation_label': 283, 'defined_datetime': 283, 'publications': 202})
2023-11-15T14:40:01.248732 INFO: (6504) [] Transforming results to TRAPI 1.4 format (moving 'virtual' nodes/edges to support graphs)
2023-11-15T14:40:01.248755 DEBUG: (6504) [] Original input QG contained qnodes {'n1', 'n0'} and qedges {'e0'}
2023-11-15T14:40:01.248764 DEBUG: (6504) [] Non-orphan qnodes in original QG are: {'n1', 'n0'}
2023-11-15T14:40:01.253135 DEBUG: (6504) [] Replacing ARAX's internal edited QG with the original input QG..
2023-11-15T14:40:01.253185 DEBUG: (6504) [] Virtual qedge keys moved to support_graphs were: {'N1'}
2023-11-15T14:40:01.253196 DEBUG: (6504) [] There are a total of 283 AuxiliaryGraphs.
2023-11-15T14:40:01.253203 INFO: (6504) [] Done transforming results to TRAPI 1.4 format (i.e., using support_graphs)
2023-11-15T14:40:01.253411 DEBUG: (6504) [] Storing resulting Message
2023-11-15T14:40:01.253425 DEBUG: (6504) [] Writing response record to MySQL
DEBUG: Datetime now is: 2023-11-15 14:40:01.261545
FAILED

==================================================================== FAILURES =====================================================================
_________________________________________________________ test_qualified_regulates_query __________________________________________________________

    def test_qualified_regulates_query():
        query = {
            "nodes": {
                "n0": {
                     "ids": ["NCBIGene:7157"]
                },
                "n1": {
                    "categories": ["biolink:Gene"]
                }
            },
            "edges": {
                "e0": {
                    "subject": "n0",
                    "object": "n1",
                    "qualifier_constraints": [
                        {"qualifier_set": [
                            {"qualifier_type_id": "biolink:qualified_predicate",
                             "qualifier_value": "biolink:causes"},
                            # {"qualifier_type_id": "biolink:object_direction_qualifier",
                            #  "qualifier_value": "decreased"}, # for RTX issue 2068
                            #                                   # see also RTX-KG2 issue 339
                            #                                   # Uncomment to test in KG2.8.5
                            {"qualifier_type_id": "biolink:object_aspect_qualifier",
                             "qualifier_value": "activity"}
                        ]}
                    ],
                    "attribute_constraints": [
                        {
                            "id": "knowledge_source",
                            "name": "knowledge source",
                            "value": ["infores:rtx-kg2"],
                            "operator": "==",
                            "not": False
                        }
                    ]
                }
            }
        }
>       nodes_by_qg_id, edges_by_qg_id = _run_query_and_do_standard_testing(json_query=query)

test_ARAX_expand.py:831: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_ARAX_expand.py:34: in _run_query_and_do_standard_testing
    response = araxq.query(query_object)
../ARAXQuery/ARAX_query.py:396: in query
    result = self.execute_processing_plan(query, mode=mode)
../ARAXQuery/ARAX_query.py:923: in execute_processing_plan
    response_id = response_cache.add_new_response(response)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <response_cache.ResponseCache object at 0x1245ab130>, response = <ARAX_response.ARAXResponse object at 0x12458cb50>

    def add_new_response(self,response):
    
        DEBUG = True
        session = self.session
        envelope = response.envelope
        message = envelope.message
    
        response.debug(f"Writing response record to MySQL")
        try:
            stored_response = Response(response_datetime=datetime.now(),tool_version=self.rtxConfig.version,
                response_code=envelope.status,message=envelope.description,n_results=len(envelope.message.results))
            session.add(stored_response)
            session.flush()
            session.commit()
            response_id = stored_response.response_id
            response_filename = f"/responses/{response_id}.json"
        except:
            response.error(f"Unable to store response record in MySQL", error_code="InternalError")
            response_filename = f"/responses/error.json"
            response_id = 0
    
        servername = 'localhost'
        if self.rtxConfig.is_production_server:
            servername = 'arax.ncats.io'
        envelope.id = f"https://{servername}/api/arax/v1.4/response/{response_id}"
    
        #### New system to store the responses in an S3 bucket
        rtx_config = RTXConfiguration()
        KEY_ID = rtx_config.config_secrets['s3']['access']
        ACCESS_KEY = rtx_config.config_secrets['s3']['secret']
        succeeded_to_s3 = False
    
        #### Get information needed to decide which bucket to write to
        bucket_config = self.get_configs()
        datetime_now = str(datetime.now())
        if DEBUG:
            print(f"DEBUG: Datetime now is: {datetime_now}")
>           print(f"DEBUG: Cutover date is: {bucket_config['S3BucketMigrationDatetime']}")
E           KeyError: 'S3BucketMigrationDatetime'

../ResponseCache/response_cache.py:241: KeyError

not sure - maybe my environment is missing something?
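The failure mode above is plain dict indexing: `bucket_config['S3BucketMigrationDatetime']` raises `KeyError` whenever that key is absent from the environment's config. A minimal sketch of the difference between `[]` and `dict.get()` (the `bucket_config` contents here are hypothetical, not the real ARAX config):

```python
# Hypothetical config that, like the local environment in this issue,
# lacks the S3BucketMigrationDatetime key.
bucket_config = {"S3BucketName": "arax-responses"}

# Bracket indexing raises KeyError when the key is missing...
try:
    cutover = bucket_config["S3BucketMigrationDatetime"]
except KeyError:
    cutover = None  # key not present in this environment's config

# ...whereas .get() returns a fallback and lets the caller decide.
cutover_or_default = bucket_config.get("S3BucketMigrationDatetime", "not configured")
print(cutover_or_default)
```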

@amykglen (Member Author) commented Nov 15, 2023

I've gotten around this locally by essentially just not saving the response anywhere if bucket_config['S3BucketMigrationDatetime'] doesn't exist (I think ideally we don't want pytests saving responses anyway? #1868)

but not sure of what the ultimate fix should be here
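The described workaround could be sketched as a simple guard that skips the save path when the cutover key is absent. This is a hypothetical illustration of that idea, not the actual code from PR #2219 (`should_save_response` and its config shape are assumptions):

```python
from datetime import datetime

def should_save_response(bucket_config: dict) -> bool:
    """Hypothetical guard matching the described workaround: only attempt
    to store the response if the S3 cutover datetime is configured."""
    return "S3BucketMigrationDatetime" in bucket_config

# With the key absent (e.g. a local dev environment), saving is skipped.
print(should_save_response({}))
# With the key present, the normal S3 write path can proceed.
print(should_save_response({"S3BucketMigrationDatetime": str(datetime.now())}))
```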

@edeutsch (Collaborator)

ah thank you for fixing, this is probably my fault.

@saramsey (Member)

can we close this out?

@amykglen (Member Author)

no, I actually haven't pushed this fix yet - I got around the error locally, but I'm not sure whether my changes would mess things up for our deployed instances. maybe I'll push to a branch so others can see

@amykglen (Member Author)

ok, created a pull request with my changes (#2219)

edeutsch added a commit that referenced this issue Dec 20, 2023
Only save response if we know which S3 bucket to use #2203
@edeutsch (Collaborator)

merged, thank you
