Hi everyone.

I'll start by asking you to feel free to point me to any docs I've missed while investigating this topic.

Here is my situation:

We're building a C++ application that has a Python 3 interpreter embedded in it. I should clarify that we build under Ubuntu 18.04. The functionality that uses the embedded interpreter expects some third-party modules (e.g. numpy) to be available at run time.
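To make this concrete (the helper name and module list are just illustrative, not our real code), what the embedded interpreter needs at run time boils down to the required modules being importable, something like:

```python
import importlib.util

# Modules the embedded interpreter expects at run time
# (numpy is the real example from above; the list is illustrative).
REQUIRED_MODULES = ["numpy"]

def missing_modules(names):
    """Return the subset of `names` that this interpreter cannot import."""
    return [n for n in names if importlib.util.find_spec(n) is None]
```

If `missing_modules(REQUIRED_MODULES)` is non-empty, the features backed by the embedded interpreter can't work.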
So far we have managed this by listing those modules as dependencies of our tool's deb package. Continuing with the previous example, our app's deb package lists python3-numpy as a dependency, so when the app is installed numpy gets installed too. So far, so good.
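In debian/control terms, the relevant bit looks roughly like this (package name hypothetical, just a sketch of the approach described above):

```
Package: our-app
Architecture: amd64
Depends: ${shlibs:Depends}, ${misc:Depends}, python3-numpy
```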
This approach has a fatal flaw though: apt (Ubuntu's package manager) only provides module packages built for the default Python version of each Ubuntu release. That is, on Ubuntu 18.04 we only get Python 3.6 packages, on Ubuntu 20.04 we get Python 3.8 packages, and so on.

As I said, we currently build under Ubuntu 18.04, so our app can only embed a Python 3.6 interpreter; otherwise there are no packages for the Python modules we rely on.

This also means our app can't be installed on a different Ubuntu release than the one it was built on, since the packages for the Python modules we need would target a different Python 3 version.
Are there any best practices or recommendations on how to handle this?

In short: how are we supposed to handle run-time dependencies on Python modules when we embed the interpreter?

Asking the users of our app to do something like pip install -r requirements.txt is out of the question for a plethora of reasons.
Would it be possible to make pybind spawn some virtualenv at runtime?