
Add jobLogger output for subworkflow running (with parent and subworkflow ID) #7387

Open · wants to merge 2 commits into base: develop
Conversation

jlester-msft (Contributor)

Summary

In the current Cromwell docker log, sub-workflow UUIDs only appear when those sub-workflows produce relevant output. This means you can go from looking at one root workflow to suddenly needing to know the UUID of a sub-workflow of a sub-workflow. It is hard to follow the logs without something telling you the structure of the workflow (i.e., the cromwell-executions filesystem or the metadata).

This change adds a log line when a sub-workflow is run, containing the ParentWorkflowID, SubWorkflowID, and RootWorkflowID. It looks like:

INFO - <ParentWorkflowID>-SubWorkflowExecutionActor-SubWorkflow-name:shard:attempt [UUID...]: Running subworkflow: <SubworkflowID>, root: <RootWorkflowID>

INFO - ecb081a4-0166-4f9f-a2a8-20a50f8e9b19-SubWorkflowExecutionActor-SubWorkflow-outer_subworkflow:-1:1 [UUID(ecb081a4)SubWorkflow-outer_subworkflow:-1:1]: Running subworkflow: 53f62151-1c36-4e2f-8bff-0a2a90d7d8c5, root: ecb081a4-0166-4f9f-a2a8-20a50f8e9b19

This output goes to the Cromwell docker log and the root workflow's cromwell-workflow-log. Still complicated, but at least now you have a chance to follow along.

Note: every call to pushWorkflowRunningMetadata corresponds to a new sub-workflow with a new UUID, and the subWorkflowId is already known at that point, which is why the log line was added there.
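Because each new log line carries all three UUIDs, the parent/child relationships can be recovered from the log text alone. A minimal sketch, assuming only the line format shown above (the pattern and helper name are mine, not Cromwell's):

```python
import re

# Matches the "Running subworkflow" line format shown above:
#   <parent UUID>-SubWorkflowExecutionActor-... Running subworkflow: <sub UUID>, root: <root UUID>
UUID = r"[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}"
LINE_RE = re.compile(
    rf"(?P<parent>{UUID})-SubWorkflowExecutionActor-.*"
    rf"Running subworkflow: (?P<sub>{UUID})\s*,\s*root: (?P<root>{UUID})"
)

def parse_subworkflow_lines(log_text):
    """Map each sub-workflow UUID to its (parent, root) UUIDs."""
    tree = {}
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if m:
            tree[m.group("sub")] = (m.group("parent"), m.group("root"))
    return tree
```

Running this over the example line above maps 53f62151-… to the parent/root pair ecb081a4-….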

Example from a simplified real-world WDL

In the example below (edited for readability), several sub-workflows call other sub-workflows.

Without the Running subworkflow output, you suddenly see log lines referencing 9 new UUIDs with no idea what they are. This kind of output comes up often when you have reference/library-style tasks in a WDL, where much of the work happens inside several levels of sub-workflows.


Details about the example WDL

  • A root workflow (main_workflow.wdl) creates a subworkflow (LEVEL_1), calling outer.outer_subworkflow:
import "outer_subworkflow.wdl" as outer

workflow main_workflow {
   call outer.outer_subworkflow
}
  • LEVEL_1 outer_subworkflow.wdl then creates a scatter of 2 across another subworkflow (call inner.inner_subworkflow, producing LEVEL_2A and LEVEL_2B)
import "inner_subworkflow.wdl" as inner

workflow outer_subworkflow {
    scatter (i in range(2)) {
        call inner.inner_subworkflow as inner_subworkflow
    }
}
  • inner_subworkflow.wdl (LEVEL_2A and LEVEL_2B) then runs a task in a scatter of 4, plus a scatter of 3 across a final subworkflow (call sub_subworkflow.sub_subworkflow, producing LEVEL_2_X__3_Y)
import "sub_subworkflow.wdl" as sub_subworkflow

task hello_world {
    command {
        echo 'Hello, world!'
        echo 'blah' > output.txt        
    }

    output {
        String message = read_string(stdout())
        File outputFile = "output.txt"
    }

    runtime {
        docker: "ubuntu:latest"
    }
}

workflow inner_subworkflow {
    scatter (i in range(4)) {
        call hello_world
    }
    scatter (i in range(3)) {
        call sub_subworkflow.sub_subworkflow
    }
}
  • This final sub_subworkflow.wdl then runs a scatter across a task:
task sub_hello_world {
    command {
        echo 'Hello from sub.sub_workflow, world!'
    }

    output {
        String message = read_string(stdout())
    }

    runtime {
        docker: "ubuntu:latest"
    }
}

workflow sub_subworkflow {
    scatter (i in range(2)) {
        call sub_hello_world
    }
}

In tree form you have something like this:

  • ROOT_WORKFLOW main_workflow.wdl
    • LEVEL_1 outer_subworkflow.wdl
      • LEVEL_2A inner_subworkflow.wdl
        • LEVEL_2_A__3_A sub_subworkflow.wdl
        • LEVEL_2_A__3_B sub_subworkflow.wdl
        • LEVEL_2_A__3_C sub_subworkflow.wdl
      • LEVEL_2B inner_subworkflow.wdl
        • LEVEL_2_B__3_A sub_subworkflow.wdl
        • LEVEL_2_B__3_B sub_subworkflow.wdl
        • LEVEL_2_B__3_C sub_subworkflow.wdl
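Since each Running subworkflow line carries the parent UUID, this tree can be reconstructed mechanically from the log. A sketch of the rendering step (the helper is hypothetical; UUIDs are shortened to the labels above for readability):

```python
# Sub-workflow UUID -> parent UUID, as recovered from the
# "Running subworkflow" log lines (labels stand in for real UUIDs).
parents = {
    "LEVEL_1": "ROOT",
    "LEVEL_2A": "LEVEL_1",
    "LEVEL_2B": "LEVEL_1",
    "LEVEL_2_A__3_A": "LEVEL_2A",
}

def render_tree(parents, root):
    """Return each workflow indented beneath its parent."""
    children = {}
    for sub, parent in parents.items():
        children.setdefault(parent, []).append(sub)
    lines = []
    def walk(node, depth):
        lines.append("  " * depth + node)
        for child in sorted(children.get(node, [])):
            walk(child, depth + 1)
    walk(root, 0)
    return "\n".join(lines)

print(render_tree(parents, "ROOT"))
```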

LEVEL_2 inner_subworkflow.wdl task outputs end up here:

  cromwell-executions/main_workflow/ecb081a4-0166-4f9f-a2a8-20a50f8e9b19/
      call-outer_subworkflow/outer.outer_subworkflow/53f62151-1c36-4e2f-8bff-0a2a90d7d8c5/
          call-inner_subworkflow/shard-0/inner.inner_subworkflow/cd65e57c-12ee-4213-a698-c98dd0a973fd/
              call-hello_world/shard-0/cacheCopy/execution/rc
or (in more readable form):
  cromwell-executions/main_workflow/'<ROOT_WORKFLOW>'/
      call-outer_subworkflow/outer.outer_subworkflow/'<LEVEL_1>'/
          call-inner_subworkflow/shard-0/inner.inner_subworkflow/'<LEVEL_2_A>'/
              call-hello_world/shard-0/cacheCopy/execution/rc

And LEVEL_2_X__3_X sub_subworkflow.wdl task outputs end up here:

  cromwell-executions/main_workflow/ecb081a4-0166-4f9f-a2a8-20a50f8e9b19/
      call-outer_subworkflow/outer.outer_subworkflow/53f62151-1c36-4e2f-8bff-0a2a90d7d8c5/
          call-inner_subworkflow/shard-0/inner.inner_subworkflow/cd65e57c-12ee-4213-a698-c98dd0a973fd/
              call-sub_subworkflow/shard-0/sub_subworkflow.sub_subworkflow/1cf66266-082c-4131-a191-3d799b71230d/
                  call-sub_hello_world/shard-0/cacheCopy/execution/rc
or (in more readable form):
  cromwell-executions/main_workflow/'<ROOT_WORKFLOW>'/
      call-outer_subworkflow/outer.outer_subworkflow/'<LEVEL_1>'/
          call-inner_subworkflow/shard-0/inner.inner_subworkflow/'<LEVEL_2_A>'/
              call-sub_subworkflow/shard-0/sub_subworkflow.sub_subworkflow/'<LEVEL_2_A__3_A>'/
                  call-sub_hello_world/shard-0/cacheCopy/execution/rc
