Argo Pipeline Integration #855
Conversation
```diff
@@ -175,7 +175,7 @@ def get_rendered_task_definitions(
             loading_blocks=loading_blocks,
             pre_call_block="",
             call_block=task_def.call_block,
-            post_call_block="",
+            post_call_block=task_def.post_call_block,
```
Does the same thing make sense for pre_call_block?
Also, I'm not sure how Jinja handles None. Does it render as the string "None" or skip it? If it's the former, we should use `task_def.post_call_block or ""` here.
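For reference, a quick check with stock Jinja2 (assuming no custom `finalize` is configured on the environment) suggests it's the former: values are stringified with `str()`, so `None` renders as the literal text "None". A minimal sketch:

```python
from jinja2 import Template

# Default Jinja2 stringifies values with str(), so None renders as "None".
print(Template("[{{ block }}]").render(block=None))        # prints: [None]

# Guarding with `or ''` in the template (or passing
# task_def.post_call_block or "" on the Python side) yields
# an empty string instead.
print(Template("[{{ block or '' }}]").render(block=None))  # prints: []
```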
lineapy/plugins/jinja_templates/task/localpickle/task_local_pickle_ser.jinja
AWWW yea
Thanks @lazargugleta for the patience in seeing this one through. Really appreciate it!
@andycui97
Description
- `{pipeline_name}_module.py` is generated in the same manner as Airflow's, so it remained the same.
- `{pipeline_name}_requirements.txt` is generated the same way as well.
- The `{pipeline_name}_dag.py` file provides an execution file to submit workflows to Argo directly using hera-workflows. It is generated from the `argo_dag.jinja` template. Since a Python function is the unit of execution, it stays pretty Airflow-like. It defines a WorkflowService, either parameterized by the user or given default parameters, which is submitted along with the Workflow instance. Tasks and dependencies are generated in a different pattern but utilize the same methods that already provide the tasks and task definitions. The service account token is searched for and appended to the WorkflowService as well. Running `python3 {pipeline_name}_dag.py` submits the workflow directly to the current Argo environment (a rough sketch of such a file follows below).
- The `Dockerfile` is fine as is.
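For illustration, a rough sketch of what a generated `{pipeline_name}_dag.py` could look like, assuming the hera-workflows API of that era (`WorkflowService`, `Workflow`, `Task`); the host, task bodies, and token lookup below are illustrative placeholders, not code taken from this PR:

```python
from pathlib import Path

from hera import Task, Workflow, WorkflowService


def task_a() -> None:
    print("running task A")


def task_b() -> None:
    print("running task B")


# Illustrative token lookup: fall back to the in-cluster service account
# token (the standard Kubernetes mount path) when none is supplied.
token_path = Path("/var/run/secrets/kubernetes.io/serviceaccount/token")
token = token_path.read_text() if token_path.exists() else ""

# The WorkflowService holds the Argo server connection; the host here
# is a placeholder default, not a value from this PR.
ws = WorkflowService(host="https://localhost:2746", token=token)

w = Workflow("example-pipeline", service=ws)
a = Task("task-a", task_a)
b = Task("task-b", task_b)
a >> b  # task-b runs after task-a
w.add_tasks(a, b)

w.create()  # submit the Workflow to the Argo server
```

The `a >> b` operator expresses the dependency between tasks, mirroring how the rendered template wires up the task graph from the existing task definitions.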
Current issues
Type of change
How Has This Been Tested?