What is SCINet?
SCINet is the USDA-ARS’s initiative for scientific computing. It consists of:
- High-performance computing clusters for running command-line and graphical programs. There are currently two clusters: the Ceres cluster in Ames, IA, and the Atlas cluster in Starkville, MS. SCINet also offers AWS cloud computing. See SCINet HPC Systems for more detail.
- Network improvements across ARS.
- Support for computing through the Virtual Research Support Core (VRSC). See VRSC Support for more detail.
- Training and workshop opportunities in multiple areas of scientific computing. See our event calendar for more information.
Users who are new to the HPC environment may benefit from the SCINet/Ceres onboarding video, which covers most of the material contained in this guide. Note that the /KEEP storage discussed in the video at 16:20 is no longer available. Instead, data that cannot be easily reproduced should be manually backed up to Juno. The instructional video at https://www.youtube.com/watch?v=I3lnsCAfx3Q demonstrates how to transfer files between a local computer, Ceres, Atlas, and Juno using Globus.
User Guides
Use the navigation options or select one of the guides below to get started with SCINet.
SCINet HPC Resources
HPC Clusters on SCINet
| Cluster Name | Location | Login Nodes | Transfer Nodes |
|---|---|---|---|
| Ceres | Ames, IA | ceres.scinet.usda.gov | ceres-dtn.scinet.usda.gov |
| Atlas | Starkville, MS | atlas-login.hpc.msstate.edu | atlas-dtn.hpc.msstate.edu |
Logging In
If you have received your login credentials in an email, this guide will help you get connected to SCINet. Otherwise, please email the Virtual Research Support Core at scinet_vrsc@usda.gov for assistance.
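For command-line access, you can open an SSH session to a cluster login node using the hostnames in the table above. A minimal sketch, assuming a standard SSH client on your local machine (replace first.last with your own SCINet user name and follow any authentication prompts described in the Logging In guide):

```bash
# Connect to the Ceres login node (hostname from the table above).
# first.last is a placeholder for your SCINet user name.
ssh first.last@ceres.scinet.usda.gov

# Atlas uses its own login node:
ssh first.last@atlas-login.hpc.msstate.edu
```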
Storage Locations
There are multiple places to store data on Ceres, Atlas, and Juno that all serve different purposes.
File Transfer Methods
To help identify the file transfer method and documentation you should use, determine the scenario below that best matches your use case:
- If you are transferring small amounts of data (less than 1 GB), there are multiple file transfer protocols and options that should work well (see the command-line sketch after this list). See Data Transfer to and from Local Computers.
- If you are moving data to and from cloud resources, see Rclone.
- For most other data transfer needs, we recommend that you use Globus.
- If you would like to transfer data to and from your local computer, see Globus Connect Personal.
- If you are on a SCINet café machine at a SCINet-X location, see Data Transfer via Café Machine.
- If you would like to transfer data between SCINet infrastructure and a non-SCINet Globus endpoint, please go directly to Globus Data Transfer.
- If you have to transfer very large amounts of data and network speed at your location is slow, please submit a request to the Virtual Research Support Core (VRSC) to ingress data from a hard drive as described in Large Data Transfer by Shipping External Drives.
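For the small-transfer case in the first bullet, standard command-line tools work against the cluster data transfer nodes. A minimal sketch, assuming scp is available on your local computer and using the Ceres transfer node from the table above (first.last, file names, and paths are placeholders):

```bash
# Copy a small file from your local computer to your Ceres home directory
# via the data transfer node (replace first.last and the paths as needed).
scp results.csv first.last@ceres-dtn.scinet.usda.gov:~/

# Copy a directory back from Ceres to the current local directory.
scp -r first.last@ceres-dtn.scinet.usda.gov:~/project_outputs ./
```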
Software
The login node provides a wide variety of scientific software tools that users can load through the module system. These software tools were compiled and optimized for use on SCINet by members of the Virtual Research Support Core (VRSC) team. Most users will find the software tools they need for their research among the provided packages and thus will not need to compile their own software.
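For example, the standard module commands let you discover and load these packages from a login session. A minimal sketch (the module name shown is a placeholder; run `module avail` to see what is actually installed on the cluster):

```bash
# List the software modules available on the cluster.
module avail

# Load a package and confirm what is currently loaded
# (samtools is a placeholder; substitute the module you need).
module load samtools
module list

# Unload all loaded modules when you are done.
module purge
```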
To learn more about graphical software such as Galaxy, CLC, Geneious, RStudio, and Jupyter, please select the Software Preinstalled on Ceres guide.
Open OnDemand Interface
Open OnDemand is an intuitive, innovative, and interactive interface to remote computing resources. The key benefit for SCINet users is that they can use any web browser, including browsers on a mobile phone, to access Ceres.
There are several interactive apps that can be run in Open OnDemand including Jupyter, RStudio Server, Geneious, CLC Genomics Workbench, and more. The desktop app allows a user to run any GUI software.
If you are using Atlas Open OnDemand, visit the Atlas Open OnDemand Guide for more information.
To access Open OnDemand on the Ceres cluster, go to Ceres Open OnDemand.