Tower community showcase
The Tower community showcase is an example workspace provided by Seqera. The showcase is pre-configured with credentials, compute environments, and pipelines to get you running Nextflow pipelines immediately. The pre-built community AWS Batch environments include 100 free hours of compute. Upon your first login to Tower Cloud, you are directed to the community showcase Launchpad. To run pipelines on your own infrastructure, create your own organization and workspaces.
Launchpad
The community showcase Launchpad contains a list of pre-built community pipelines. A pipeline consists of a pre-configured workflow repository, compute environment, and launch parameters.
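For illustration only, the following Python sketch shows how a launch like this could be submitted programmatically through the Tower API instead of the Launchpad UI. The `POST /workflow/launch` endpoint and payload fields reflect the Tower API reference as understood here, and the workspace ID, compute environment ID, and input URL are placeholders rather than real showcase values.

```python
"""Sketch: submitting a pipeline launch through the Tower API.

Assumes the POST /workflow/launch endpoint from the Tower API reference
and a personal access token exported as TOWER_ACCESS_TOKEN. The workspace
ID, compute environment ID, and input URL below are placeholders.
"""
import os
import requests

API_URL = "https://api.tower.nf"
HEADERS = {"Authorization": f"Bearer {os.environ['TOWER_ACCESS_TOKEN']}"}

payload = {
    "launch": {
        # Workflow repository to run
        "pipeline": "https://github.com/nf-core/rnaseq",
        # ID of a pre-built compute environment (placeholder)
        "computeEnvId": "<compute-env-id>",
        # Launch parameters supplied as YAML text (placeholder input)
        "paramsText": "input: '<samplesheet-or-dataset-url>'\n",
    }
}

response = requests.post(
    f"{API_URL}/workflow/launch",
    params={"workspaceId": "<workspace-id>"},
    headers=HEADERS,
    json=payload,
    timeout=30,
)
response.raise_for_status()
print("Submitted workflow run:", response.json().get("workflowId"))
```

In the showcase itself none of this is required: select a pipeline from the Launchpad and launch it directly from the UI.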
Datasets
The community showcase contains a list of sample datasets under the Datasets tab. A dataset is a collection of versioned, structured data, usually a samplesheet in CSV or TSV format, that is used as the input for a pipeline run. Each sample dataset serves as the input for the showcase pipeline of the same name, e.g., the nf-core-rnaseq-test dataset is used as input when you run the nf-core-rnaseq pipeline.
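As an illustration, a samplesheet-style dataset for an RNA-seq pipeline typically looks like the CSV below. The exact column names are defined by each pipeline (these follow the nf-core/rnaseq convention), and the file paths are placeholders.

```csv
sample,fastq_1,fastq_2,strandedness
SAMPLE1,s3://example-bucket/sample1_R1.fastq.gz,s3://example-bucket/sample1_R2.fastq.gz,auto
SAMPLE2,s3://example-bucket/sample2_R1.fastq.gz,s3://example-bucket/sample2_R2.fastq.gz,auto
```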
Compute environments
As of Tower version 23.1.3, the community showcase comes pre-loaded with two AWS Batch compute environments that can be used to run the showcase pipelines. These environments include 100 free CPU hours. A compute environment is the platform where workflows are executed; it is composed of the access credentials, configuration settings, and storage options for that environment.
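As a sketch of how compute environments are surfaced outside the UI, the hypothetical Python snippet below lists the compute environments visible in a workspace. It assumes the `GET /compute-envs` endpoint and response fields from the Tower API reference; the workspace ID is a placeholder.

```python
"""Sketch: listing the compute environments available in a workspace.

Assumes the GET /compute-envs endpoint from the Tower API reference;
the workspace ID is a placeholder.
"""
import os
import requests

API_URL = "https://api.tower.nf"
HEADERS = {"Authorization": f"Bearer {os.environ['TOWER_ACCESS_TOKEN']}"}

response = requests.get(
    f"{API_URL}/compute-envs",
    params={"workspaceId": "<workspace-id>"},
    headers=HEADERS,
    timeout=30,
)
response.raise_for_status()

for env in response.json().get("computeEnvs", []):
    # Each entry describes one environment: its name, backing platform
    # (e.g. AWS Batch), and current status.
    print(env.get("name"), env.get("platform"), env.get("status"))
```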
Credentials
The community showcase includes all the credentials you need to run pipelines in the showcase compute environments. Credentials in Tower are the authentication keys needed to access compute environments, private code repositories, and external services. They are SHA-256 encrypted before secure storage.
Secrets
The community showcase includes pipeline secrets that are retrieved and used during pipeline execution. In your own private or organization workspace, you can store the access keys, licenses, or passwords that your pipelines need to interact with third-party services during execution.
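To illustrate how a secret is consumed at execution time, the sketch below shows a Nextflow process declaring a secret with the `secret` directive, which Nextflow exposes to the task as an environment variable. The secret name `MY_ACCESS_KEY` and the command are hypothetical placeholders, not values from the showcase.

```nextflow
// Hypothetical process that consumes a workspace pipeline secret.
process CALL_SERVICE {
    // Declare the secret by name; Nextflow injects it into the task
    // environment at runtime instead of writing it into the script.
    secret 'MY_ACCESS_KEY'

    script:
    """
    # Placeholder command standing in for a third-party tool
    some_tool --access-key \$MY_ACCESS_KEY
    """
}

workflow {
    CALL_SERVICE()
}
```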