
✅ TESTS ✅ chore: Setting the default spark_version value from pyspark.__version__ #237

Open
wants to merge 7 commits into master
Conversation

stevenayers

Duplicate of @aagumin's #175 but with fixed tests:

- Issue #170

Description of changes:
The default value for the SPARK_VERSION variable is now taken from pyspark.__version__. If that does not work, the user can still set the environment variable manually.
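A minimal sketch of the fallback described above; the helper name and error handling here are hypothetical and may differ from the actual change in this PR:

```python
import os


def get_spark_version() -> str:
    """Return the Spark version, preferring an explicit SPARK_VERSION env var."""
    env_version = os.environ.get("SPARK_VERSION")
    if env_version:
        return env_version
    try:
        import pyspark  # imported lazily so setting the env var alone still works
        return pyspark.__version__
    except ImportError as exc:
        raise RuntimeError(
            "SPARK_VERSION is not set and pyspark is not installed; "
            "set the SPARK_VERSION environment variable explicitly."
        ) from exc
```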

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

FYI @chenliu0831

@stevenayers (Author)

@aagumin @chenliu0831 I've just noticed this class in PySpark. Should we use it instead? https://spark.apache.org/docs/3.4.1/api/python/reference/api/pyspark.util.VersionUtils.html
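For reference, a quick sketch of how the linked class could be used; majorMinorVersion is the static method documented on pyspark.util.VersionUtils:

```python
import pyspark
from pyspark.util import VersionUtils

# Parse the (major, minor) pair out of the full version string,
# e.g. "3.4.1" -> (3, 4).
major, minor = VersionUtils.majorMinorVersion(pyspark.__version__)
print(major, minor)
```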
