.. _insigneo_iceberg:

Iceberg: University of Sheffield's older HPC cluster
====================================================

Introduction
------------

Iceberg is the older of the University of Sheffield's two computer clusters.
The majority of the computational resources are available to all researchers
with ShARC/Iceberg accounts but INSIGNEO and certain sub-groups/projects have
purchased hardware for Iceberg that only they have access to.

The documentation on :ref:`getting started with High-Performance Computing at the University of Sheffield `
covers connecting, usage (including submitting jobs) and filestores.  There is
a separate section for the :ref:`software that is readily available on Iceberg `.
All users should consult that documentation when getting started.

INSIGNEO members will then want to consult the following for additional
information on making use of INSIGNEO-specific resources in Iceberg and for
guidance on INSIGNEO-specific workflows.

.. warning::

   Researchers should use :ref:`ShARC ` instead of Iceberg unless there is
   good reason to do so:

   * Many nodes in Iceberg are no longer under warranty and may not be
     repaired/replaced if they fail.
   * New research software packages or updates to research software will only
     be provided on ShARC.

INSIGNEO-related hardware
-------------------------

INSIGNEO has purchased nodes for Iceberg, as have three related
groups/projects: IMSB (for the `MultiSim `__ project), `Polaris `__ and
`NoTremor `__.  These privately-owned nodes offer:

* Restricted access to the sub-group/project that purchased the nodes (and
  potentially other related groups if sharing agreements are reached);
* Shorter job queue times for researchers as there is less contention for
  resources;
* Suitability for specific workloads, as the nodes may have e.g. more RAM
  than is typical for the cluster.

Details as of 2017-10-03:

.. csv-table::
   :file: insig_sge_host_info_iceberg.csv

.. warning::

   All nodes apart from those containing NVIDIA K80 GPUs are no longer under
   warranty and may not be repaired/replaced if they fail.  The K80 GPU nodes
   are under warranty until 2019-07-05 but will be moved from Iceberg to
   ShARC in spring 2020.

Gaining access to these nodes
-----------------------------

Users need to be explicitly added to particular user groups in order to be
able to run jobs on these nodes (in addition to the 'public' nodes).  If a
researcher would like access then a relevant PI needs to contact the
:ref:`INSIGNEO tech team `.

Using these nodes
-----------------

To run jobs on these 'private' nodes you need to follow the standard
instructions for starting interactive sessions and submitting batch jobs but
make sure you specify a **Project** and **Queue**, the values of which depend
on which research group you are in:

+-------------------------+-----------------------+-------------------------+
| Research group          | Project               | Queue                   |
+=========================+=======================+=========================+
| IMSB (MultiSim)         | ``insigneo-imsb``     | ``insigneo-imsb.q``     |
+-------------------------+-----------------------+-------------------------+
| NoTremor                | ``insigneo-notremor`` | ``insigneo-notremor.q`` |
+-------------------------+-----------------------+-------------------------+
| INSIGNEO (non-specific) | ``insigneo-default``  | ``insigneo.q``          |
+-------------------------+-----------------------+-------------------------+
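If you are unsure which Projects and Queues are visible to you, you can query
the scheduler directly.  This is a minimal sketch using standard Grid Engine
commands; the exact output depends on the cluster's configuration:

.. code-block:: bash

   # List all Projects defined on the cluster
   qconf -sprjl

   # Summarise queue state (used/total slots) for the INSIGNEO queues
   qstat -g c | grep insigneo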
Here's how you specify a Project and Queue when starting an interactive
session (in this case on an IMSB-owned node):

.. code-block:: bash

   qrshx -P insigneo-imsb -q insigneo-imsb.q -l rmem=256G

And here's how to specify a Project and Queue in a batch job submission
script (in this case for a job that you wish to run on a general
INSIGNEO-owned node):

.. code-block:: bash
   :emphasize-lines: 4,5

   #!/bin/bash
   #$ -l h_rt=24:00:00
   #$ -l rmem=6G
   #$ -P insigneo-default
   #$ -q insigneo.q
   #$ -pe openmp 16
   #$ -M youremail@sheffield.ac.uk
   #$ -m bea

   module load apps/java/1.8u71

   java -jar MyProgram.jar

To see all jobs that are running in a particular queue or are waiting for a
particular queue::

   qstat -q queuename.q -u \*

e.g.::

   qstat -q insigneo-default.q -u \*
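For completeness, here is a minimal sketch of submitting the batch script
above (saved here as ``myjob.sh``, a hypothetical filename) and then listing
only your own jobs in the relevant queue:

.. code-block:: bash

   # Submit the batch job script to the scheduler
   qsub myjob.sh

   # List only your own jobs in the general INSIGNEO queue
   qstat -q insigneo.q -u $USER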