Mark kubernetes based projects as experimental. (mlflow#1606)
* add experimental to docs

* address corey and fix docs about --cluster-spec

* address corey again:

* make it important

* fix tests

* fix lint

* fix linter
andrewmchen committed Jul 19, 2019
1 parent 3d39607 commit 9b5e69f
Showing 5 changed files with 31 additions and 30 deletions.
9 changes: 6 additions & 3 deletions docs/source/projects.rst
@@ -377,12 +377,15 @@ where ``<project_uri>`` is a Git repository URI or a folder.
 
 .. _kubernetes_execution:
 
-Run an MLflow Project on Kubernetes
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Run an MLflow Project on Kubernetes (experimental)
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. important:: As an experimental feature, the API is subject to change.
 
 You can run MLflow Projects with :ref:`Docker environments <project-docker-container-environments>`
 on Kubernetes. The following sections provide an overview of the feature, including a simple
-Project execution guide with examples.
+Project execution guide with examples.
+
 
 To see this feature in action, you can also refer to the
 `Docker example <https://github.com/mlflow/mlflow/tree/master/examples/docker>`_, which includes
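The Kubernetes backend documented above is configured through a JSON file passed via ``--backend-config``. As a hedged sketch of what producing such a file could look like: the keys (``kube-context``, ``repository-uri``, ``kube-job-template-path``) follow the MLflow Projects documentation for the Kubernetes backend, but they are not part of this commit, and every value below is an illustrative placeholder.

```python
import json
import tempfile

# Illustrative Kubernetes backend config; keys follow the MLflow Projects
# docs, all values are placeholders for your own cluster and image repo.
backend_config = {
    "kube-context": "docker-for-desktop",
    "repository-uri": "username/mlflow-kubernetes-example",
    "kube-job-template-path": "kubernetes_job_template.yaml",
}

# --backend-config expects either an inline JSON string or a path
# ending in '.json', so serialize the dict to a .json file.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(backend_config, f, indent=2)
    config_path = f.name
```

The resulting path would then be handed to the CLI along the lines of ``mlflow run <project_uri> --backend kubernetes --backend-config <config_path>``.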
7 changes: 4 additions & 3 deletions mlflow/__init__.py
@@ -67,6 +67,7 @@
 run = projects.run
 
 
-__all__ = ["ActiveRun", "log_param", "log_params", "log_metric", "log_metrics", "set_tag", "set_tags",
-           "log_artifacts", "log_artifact", "active_run", "start_run", "end_run", "search_runs",
-           "get_artifact_uri", "set_tracking_uri", "create_experiment", "set_experiment", "run"]
+__all__ = ["ActiveRun", "log_param", "log_params", "log_metric", "log_metrics", "set_tag",
+           "set_tags", "log_artifacts", "log_artifact", "active_run", "start_run", "end_run",
+           "search_runs", "get_artifact_uri", "set_tracking_uri", "create_experiment",
+           "set_experiment", "run"]
23 changes: 10 additions & 13 deletions mlflow/cli.py
@@ -58,23 +58,20 @@ def cli():
               help="ID of the experiment under which to launch the run.")
 # TODO: Add tracking server argument once we have it working.
 @click.option("--backend", "-b", metavar="BACKEND",
-              help="Execution backend to use for run. Supported values: 'local' (runs project "
-                   "locally) and 'databricks' (runs project on a Databricks cluster). "
-                   "Defaults to 'local'. If running against Databricks, will run against a "
-                   "Databricks workspace determined as follows: if a Databricks tracking URI "
-                   "of the form 'databricks://profile' has been set (e.g. by setting "
-                   "the MLFLOW_TRACKING_URI environment variable), will run against the "
-                   "workspace specified by <profile>. Otherwise, runs against the workspace "
-                   "specified by the default Databricks CLI profile. See "
+              help="Execution backend to use for run. Supported values: 'local', 'databricks', "
+                   "kubernetes (experimental). Defaults to 'local'. If running against "
+                   "Databricks, will run against a Databricks workspace determined as follows: "
+                   "if a Databricks tracking URI of the form 'databricks://profile' has been set "
+                   "(e.g. by setting the MLFLOW_TRACKING_URI environment variable), will run "
+                   "against the workspace specified by <profile>. Otherwise, runs against the "
+                   "workspace specified by the default Databricks CLI profile. See "
                    "https://github.com/databricks/databricks-cli for more info on configuring a "
                    "Databricks CLI profile.")
 @click.option("--backend-config", "-c", metavar="FILE",
               help="Path to JSON file (must end in '.json') or JSON string which will be passed "
-                   "as config to the backend. For the Databricks backend, this should be a "
-                   "cluster spec: see "
-                   "https://docs.databricks.com/api/latest/jobs.html#jobsclusterspecnewcluster "
-                   "for more information. Note that MLflow runs are currently launched against "
-                   "a new cluster.")
+                   "as config to the backend. The exact content which should be "
+                   "provided is different for each execution backend and is documented "
+                   "at https://www.mlflow.org/docs/latest/projects.html.")
 @cli_args.NO_CONDA
 @click.option("--storage-dir", envvar="MLFLOW_TMP_DIR",
               help="Only valid when ``backend`` is local."
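The updated ``--backend-config`` help text says the flag accepts either a path ending in ``.json`` or an inline JSON string. A minimal sketch of that dual-input resolution, assuming this simplified behavior (this is not MLflow's actual implementation, just the logic the help text describes):

```python
import json
import os

def resolve_backend_config(value):
    """Resolve a --backend-config value as the CLI help describes:
    a path ending in '.json' is read from disk; anything else is
    parsed as an inline JSON string. Simplified sketch only."""
    if value is None:
        return {}
    if value.endswith(".json") and os.path.exists(value):
        with open(value) as f:
            return json.load(f)
    return json.loads(value)
```

Either form yields the same dictionary handed to the chosen execution backend, which is why the help text can defer the schema details to each backend's documentation.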
20 changes: 10 additions & 10 deletions mlflow/projects/__init__.py
@@ -211,17 +211,17 @@ def run(uri, entry_point="main", version=None, parameters=None,
     :param version: For Git-based projects, either a commit hash or a branch name.
     :param experiment_name: Name of experiment under which to launch the run.
     :param experiment_id: ID of experiment under which to launch the run.
-    :param backend: Execution backend for the run: "local" or "databricks". If running against
-                    Databricks, will run against a Databricks workspace determined as follows: if
-                    a Databricks tracking URI of the form ``databricks://profile`` has been set
-                    (e.g. by setting the MLFLOW_TRACKING_URI environment variable), will run
-                    against the workspace specified by <profile>. Otherwise, runs against the
-                    workspace specified by the default Databricks CLI profile.
+    :param backend: Execution backend for the run: "local", "databricks", or "kubernetes"
+                    (experimental). If running against Databricks, will run against a Databricks
+                    workspace determined as follows: if a Databricks tracking URI of the form
+                    ``databricks://profile`` has been set (e.g. by setting the
+                    MLFLOW_TRACKING_URI environment variable), will run against the workspace
+                    specified by <profile>. Otherwise, runs against the workspace specified by
+                    the default Databricks CLI profile.
     :param backend_config: A dictionary, or a path to a JSON file (must end in '.json'), which will
-                           be passed as config to the backend. For the Databricks backend, this
-                           should be a cluster spec: see `Databricks Cluster Specs for Jobs
-                           <https://docs.databricks.com/api/latest/jobs.html#jobsclusterspecnewcluster>`_
-                           for more information.
+                           be passed as config to the backend. The exact content which should be
+                           provided is different for each execution backend and is documented
+                           at https://www.mlflow.org/docs/latest/projects.html.
     :param use_conda: If True (the default), create a new Conda environment for the run and
                       install project dependencies within that environment. Otherwise, run the
                       project in the current environment without installing any project
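The ``run()`` docstring above says the Databricks workspace is chosen from a tracking URI of the form ``databricks://profile``. As an illustration of that URI shape only (a hedged sketch using the standard library, not MLflow's actual resolution code, which handles more cases), the profile can be pulled out like this:

```python
from urllib.parse import urlparse

def profile_from_tracking_uri(tracking_uri):
    """Extract the profile from a 'databricks://profile' tracking URI,
    as described in the run() docstring. Illustrative sketch only."""
    parsed = urlparse(tracking_uri)
    if parsed.scheme != "databricks":
        return None
    # In databricks://my-profile, the profile occupies the netloc slot.
    return parsed.netloc or None
```

A plain ``databricks`` URI (no ``://profile`` part) yields no profile, which is the case where the docstring says the default Databricks CLI profile is used instead.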
2 changes: 1 addition & 1 deletion mlflow/tracking/fluent.py
@@ -245,7 +245,7 @@ def log_params(params):
def set_tags(tags):
    """
    Log a batch of tags for the current run, starting a run if no runs are active.
    :param tags: Dictionary of tag_name: String -> value: (String, but will be string-ified if
                 not)
    :returns: None
