Fred Hutch Resources and Next Steps
Now that you understand the basics of computational workflows and WDL, there are several Fred Hutch resources that can help you find workflows, run analyses, and get support.
These resources are maintained by teams across the Office of the Chief Data Officer (OCDO) and the Research Computing ecosystem.
WDL Resources
WILDS WDL Library
The WILDS WDL Library is a curated collection of reliable WDL workflows maintained for the Fred Hutch community.
These workflows provide ready-to-use pipelines for common bioinformatics analyses.
You can use the library to:
- Find existing workflows
- Learn from example implementations
- Adapt workflows for your own analyses
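To give a sense of what a library workflow looks like, here is a minimal, generic WDL sketch (not taken from the WILDS library itself; the task and workflow names are illustrative):

```wdl
version 1.0

task say_hello {
  input {
    String name
  }
  command <<<
    echo "Hello, ~{name}!"
  >>>
  output {
    String greeting = read_string(stdout())
  }
  runtime {
    docker: "ubuntu:22.04"
  }
}

workflow hello {
  input {
    String name = "Fred"
  }
  call say_hello { input: name = name }
  output {
    String greeting = say_hello.greeting
  }
}
```

Workflows in the library follow this same task/workflow structure, with real bioinformatics tools in place of `echo`.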
Repository: WILDS WDL Library on GitHub
WILDS Docker Library
The WILDS Docker Library contains curated Docker images used by WILDS workflows.
These images package the software required for each analysis step, ensuring that workflows run in consistent and reproducible environments.
Benefits include:
- Standardized software environments
- Fewer dependency conflicts
- Improved reproducibility across systems
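A WDL task pins its software environment through the `runtime` block. The sketch below shows the pattern; the image path and tag are placeholders, not an actual WILDS image, so substitute the image listed for your tool in the WILDS Docker Library:

```wdl
version 1.0

task count_lines {
  input {
    File infile
  }
  command <<<
    wc -l < ~{infile}
  >>>
  output {
    Int n = read_int(stdout())
  }
  runtime {
    # Placeholder image reference; look up the real WILDS image
    # name and version tag before use
    docker: "ghcr.io/getwilds/example-tool:1.0.0"
  }
}
```

Pinning an exact image tag (rather than `latest`) is what makes the run reproducible across systems.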
Repository: WILDS Docker Library on GitHub
PROOF Workbench (PWB)
PROOF Workbench provides an interface for running workflows on Fred Hutch compute infrastructure.
It integrates with workflow engines and HPC resources to simplify the process of launching and managing analyses.
You can use PROOF Workbench to:
- Run WDL workflows
- Manage jobs submitted through PWB
- Monitor workflow progress
Community and Support
Several teams and Fred Hutch resources are available to help you learn, troubleshoot, and collaborate.
SciComp
The Scientific Computing (SciComp) team provides support for:
- HPC cluster usage
- Software installation on the cluster
- Cloud computing
- Data management services
If you need help with compute infrastructure or performance issues, SciComp is a good place to start.
Data House Calls (DHC)
Data House Calls provide personalized consultations for researchers who need help with data-related challenges.
This includes support for:
- Research computing
- Workflow development
- Data management
You can submit a request through the Research Computing and Data Management DHC program.
FH Data Slack
The Fred Hutch Data Slack workspace connects researchers, engineers, and data scientists across the institution.
For workflow-related questions, check out the #workflows channel.
This channel is a great place to:
- Ask questions
- Learn from others’ experiences
- Discover new tools and resources
Fred Hutch SciWiki
SciWiki is the community-maintained knowledge base for biomedical data science and research computing at Fred Hutch.
It contains documentation on:
- Computing resources
- Workflow tools
- Data management practices
- Tutorials and guides
Many Hutch-supported tools and workflows are documented here.
What to Do Next
If you're interested in starting with workflows, here are some recommended next steps:
- Explore the WILDS WDL Library to see example workflows.
- Try running a workflow using PROOF Workbench.
- Join the FH Data Slack workspace and visit the #workflow-managers channel.
- Request a Data House Call if you need help adapting workflows for your project.
- Review relevant documentation on SciWiki.
Workflows can significantly improve the scalability, reproducibility, and shareability of research analyses.
The Hutch ecosystem provides tools and support to help you get started.