Add LLM functionality from SpeziFHIR #53
Conversation
Hey @LeonNissen, thanks for the great work here. Before I jump into a full review, are the changes here a copy & paste from SpeziFHIR?
Hey @LeonNissen - Just pushed two commits. The first one resolves merge conflicts with main. At least this runs now 🎉
Force-pushed from febaddb to f3bfc5d.
@jdisho Oh, I am so sorry, I just read it now! But yes, these are just copy & paste changes 🚀
@jdisho Awesome, great work! I think that's a good starting point for further improvements 🥳
Force-pushed from 5c4104a to c6f0273.
Thank you @LeonNissen & @jdisho!
I fixed the CodeQL build issue and would be happy to see the PR merged.
We should also scan the code, ensure that we remove elements that are no longer needed, and try to clean up a few things. I also created #56 in response to this PR to ensure that we set up some automation for this.
Let's get this merged and address the open issues towards a usability study.
.task {
    #if !TEST
    // interpret()
    #endif
}
We might want to go through elements like this and ensure that we remove commented-out code & other unused elements in the long run.
Thank you for updating the setup here!
@LeonNissen & @jdisho It would be amazing if we can update the PR description to reflect the changes & if we create open issues for any of the elements that we should address in future PRs based on the code that we pull in here. Examples for this include issues such as #56. Feel free to ping me here once this is done so I can merge the PR using admin permissions despite the drop in code coverage. We could also see if we can pull some UI tests from SpeziFHIR in here if possible.
                        dismiss()
                    }
                }
            }
        }
    }

    private var downloadModelSettings: some View {
This appears to be an addition to what we have in main. Since this PR primarily focuses on moving the app code from SpeziFHIR to LLMonFHIR, I'm wondering if this change belongs here. Also, I noticed that even after downloading the model, the chat still interacts with OpenAI, which makes me question its current value. What do you think?
Yes, you're right. I think there is some code that is still in progress to switch from the OpenAI models to local ones. Maybe we should first merge everything "as is" into the different repositories and afterwards make changes to, e.g., use the local model.
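To illustrate the kind of switch that is still in progress, here is a rough sketch of one possible abstraction (all names below are hypothetical and not part of this PR):

import Foundation

// Hypothetical sketch: a single interpretation entry point that can be backed
// either by the OpenAI API or by a locally downloaded model.
enum InterpreterBackend {
    case openAI(apiKey: String)
    case local(modelURL: URL)
}

protocol FHIRInterpreter {
    /// Produces a natural-language interpretation of a FHIR resource.
    func interpret(resourceJSON: String) async throws -> String
}

struct OpenAIInterpreter: FHIRInterpreter {
    let apiKey: String

    func interpret(resourceJSON: String) async throws -> String {
        // The OpenAI API call would go here.
        return "OpenAI-based interpretation (stub)"
    }
}

struct LocalModelInterpreter: FHIRInterpreter {
    let modelURL: URL

    func interpret(resourceJSON: String) async throws -> String {
        // Running the downloaded local model would go here.
        return "Local-model interpretation (stub)"
    }
}

// The app could then pick the backend in one place instead of hard-coding OpenAI.
func makeInterpreter(for backend: InterpreterBackend) -> any FHIRInterpreter {
    switch backend {
    case let .openAI(apiKey):
        return OpenAIInterpreter(apiKey: apiKey)
    case let .local(modelURL):
        return LocalModelInterpreter(modelURL: modelURL)
    }
}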
Fine by me. But since we're merging this to main, we should at least make this feature inaccessible in Settings. It should be a quick change, which I can do right now.
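Roughly something like this, assuming a simple compile-time or launch-time flag (the flag and view names below are illustrative, not the actual code in this PR):

import SwiftUI

// Sketch: gate the local-model download entry in Settings behind a flag so it
// stays hidden until the local backend actually drives the chat.
struct SettingsView: View {
    // Could also come from a launch argument or build configuration.
    private let localLLMEnabled = false

    var body: some View {
        List {
            Section("OpenAI") {
                Text("API key and model selection")
            }
            if localLLMEnabled {
                Section("Local Model") {
                    NavigationLink("Download Model") {
                        DownloadModelView()
                    }
                }
            }
        }
    }
}

// Placeholder for the existing download UI.
struct DownloadModelView: View {
    var body: some View {
        Text("Model download")
    }
}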
Sounds great, thank you @jdisho!
Let's add an issue for this in this repo so we don't forget about it once the PR is merged.
@PSchmiedmayer - Great idea! I can take care of this both in
Amazing; thank you @jdisho! Should I merge the PR as is using admin permissions, despite the drop in code coverage?
Let's merge it 🚀
Add LLM functionality from SpeziFHIR
♻️ Current situation & Problem
SpeziFHIR contained LLM-based summary and interpretation functionality that uses OpenAI APIs and local models like Llama, all of which are tightly coupled with app-specific logic. This coupling makes it more difficult for other apps to use SpeziFHIR without inheriting unnecessary LLM-related dependencies and functionality they may not need. Since this functionality is closer to LLMonFHIR, we are migrating it here. Related issue: StanfordSpezi/SpeziFHIR#24
⚙️ Release Notes
Moved the LLM-based summary and interpretation functionality from SpeziFHIR to LLMonFHIR.
📚 Documentation
Code comments are added alongside the code.
✅ Testing
No additional testing code was added; this would be beyond the scope of this PR.
Code of Conduct & Contributing Guidelines
By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines.