[FEA] Set a default Alluxio Master IP #5872
Comments
I am not entirely sure what that IP should be. Is there a guarantee that the Alluxio master process is run on the same node as the Spark master (I assume that's what you are looking to do)? The config you mention here is verbose, especially since …
I am thinking that if we can read the Alluxio conf dir, then we will know the Alluxio master IP. Then, as @abellina mentioned, the …
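A minimal sketch of that idea, assuming the standard Alluxio layout where the master host is set via `alluxio.master.hostname` in `conf/alluxio-site.properties` (the helper name and the exact lookup are illustrative, not something from this thread):

```python
import os

def find_alluxio_master(conf_dir):
    """Read alluxio.master.hostname from alluxio-site.properties.

    Returns the configured master hostname/IP, or None if the
    property is not set in the given conf directory.
    """
    props_path = os.path.join(conf_dir, "alluxio-site.properties")
    with open(props_path) as f:
        for line in f:
            line = line.strip()
            # Skip comments and lines that are not key=value pairs.
            if line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            if key.strip() == "alluxio.master.hostname":
                return value.strip()
    return None
```

With something like this, the plugin could default the master IP from the conf dir and only fall back to an explicit setting when the property is absent.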
Since we use this config to support the auto-mount feature by calling "alluxio fs mount", I don't think we can support the case where the Alluxio master is not on the same node as the Spark driver (usually the Spark master node in the Databricks case).
@GaryShen2008 I am thinking of a scenario: …
In that case, even the current auto-mount feature won't work, since we invoke the command line to mount the folder.
Currently we need to find the Alluxio master IP and then use the Spark RAPIDS path-replacement config to convert the path.
I wish we could set a default Alluxio master IP (say, the master node's IP) so that users do not need to manually find it.
E.g.,
spark.rapids.alluxio.masterip=<Master Node's IP>
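For context, the existing path-replacement setting referenced above is, to my understanding, `spark.rapids.alluxio.pathsToReplace`; a typical value looks like the following (the bucket name, master IP placeholder, and port 19998 are illustrative, not taken from this issue):

```properties
spark.rapids.alluxio.pathsToReplace=s3://my-bucket->alluxio://<Alluxio Master IP>:19998/my-bucket
```

The proposed `masterip` default would let the plugin fill in the `<Alluxio Master IP>` portion automatically instead of requiring users to look it up and spell out the full mapping.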