
COSMOS

COSMOS represents a significant increase in computational capacity and offers access to modern hardware, including GPUs.

Through the LUNARC Desktop, new and existing users can draw on the benefits of high-performance computing (HPC) without being burdened by the intricacies of HPC usage. At the same time, users proficient in HPC can still make full use of the computational power of COSMOS's interconnected nodes.

COSMOS consists of 182 compute nodes funded by Lund University. Each node has two AMD 7413 (Milan) processors, offering 48 compute cores per node, and 256 GB of RAM. In addition to the CPU nodes there are six nodes with NVIDIA A100 GPUs and six nodes with NVIDIA A40 GPUs. Detailed specifications are given below.

COSMOS also features Intel partitions with Intel (Cascade Lake) processors, offering 32 compute cores per node. The Intel partitions comprise 22 CPU nodes, five nodes with NVIDIA A40 GPUs and four nodes with NVIDIA A100 GPUs.

System information

  • Hostname:
    • cosmos-dt.lunarc.lu.se - Desktop access (Thinlinc)
    • cosmos.lunarc.lu.se - Terminal access (SSH)
  • Queueing system: SLURM
  • Home space:
    • /home
    • NFS mounted, available on all nodes
  • Optional project storage:
    • /lunarc/nobackup/projects
    • 3 PB IBM SpectrumScale Filesystem
  • Linux distribution: Rocky Linux 9 x86_64 (RHEL9 compatible)
  • Software: organised in a hierarchical module system
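
As an illustration of the SLURM queueing and module systems listed above, a minimal batch script might look like the following. The account, toolchain and program names are hypothetical placeholders, not actual values from this system:

```shell
#!/bin/bash
# Minimal SLURM batch script for one COSMOS CPU node (sketch only;
# account, module and program names below are placeholders).
#SBATCH -A lu2024-x-xx          # compute project allocation (placeholder)
#SBATCH -N 1                    # one node
#SBATCH --tasks-per-node=48     # all 48 cores of an AMD node
#SBATCH -t 01:00:00             # one hour of wall time

# Load software from the hierarchical module tree
module load foss/2023b          # example toolchain name (assumption)

srun ./my_program               # my_program is a placeholder binary
```

The script would be submitted with `sbatch job.sh` and monitored with `squeue -u $USER`.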

Node Information

  • CPU: 2 x AMD 7413 (2.65 GHz, 2 x 24-core)
  • Memory: 256 GB (5.3 GB/core)
  • Local disk: 2 TB, temporary directory given by $SNIC_TMP or $TMPDIR
  • Interconnect:
    • HDR InfiniBand (100 Gbit per node / 200 Gbit switches)
    • 25 Gbit Ethernet / Node with 100 Gbit uplinks to core switches
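
The node-local disk noted above is the fast place for job I/O; inside a job, $SNIC_TMP (or $TMPDIR) points to a private directory on it that is wiped when the job ends. A sketch of the stage-in/stage-out pattern, with hypothetical file names, is:

```shell
# Sketch: stage data through the job's node-local scratch directory.
# SNIC_TMP is set by SLURM on the cluster; outside a job we fall back
# to TMPDIR or /tmp so the pattern can be tried anywhere.
SCRATCH="${SNIC_TMP:-${TMPDIR:-/tmp}}"
WORK="$SCRATCH/cosmos_demo.$$"               # unique per-process subdirectory
mkdir -p "$WORK"

cp input.dat "$WORK/" 2>/dev/null || true    # stage in (input.dat is hypothetical)
cd "$WORK"
# ... run the computation against the fast local disk here ...

# Stage results back to home or project storage before the job ends,
# since local scratch is removed when the job finishes.
echo "working in $WORK"
```

Staging data this way avoids hammering the shared NFS and SpectrumScale filesystems with many small I/O operations.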

Intel NVIDIA A100 Node (Compute)

  • CPU: 2 x Intel (2.67 GHz, 2 x 16-core)
  • Memory: 384 GB (12 GB/core)
  • Local disk: 1 TB, temporary directory given by $SNIC_TMP or $TMPDIR

Intel NVIDIA A40 Node (Compute/Graphics)

  • CPU: 2 x Intel (2.67 GHz, 2 x 16-core)
  • Memory: 512 GB (16 GB/core)
  • Local disk: 1 TB, temporary directory given by $SNIC_TMP or $TMPDIR

AMD NVIDIA A100 Node (Compute)

  • CPU: 2 x AMD 7413 (2.65 GHz, 2 x 24-core)
  • Memory: 512 GB (10.7 GB/core)
  • Local disk: 2 TB, temporary directory given by $SNIC_TMP or $TMPDIR
  • Interconnect:
    • HDR InfiniBand (100 Gbit per node / 200 Gbit switches)
    • 25 Gbit Ethernet / Node with 100 Gbit uplinks to core switches

AMD NVIDIA A40 Node (Compute/Graphics)

  • CPU: 2 x AMD 7413 (2.65 GHz, 2 x 24-core)
  • Memory: 512 GB (10.7 GB/core)
  • Local disk: 2 TB, temporary directory given by $SNIC_TMP or $TMPDIR
  • Interconnect:
    • HDR InfiniBand (100 Gbit per node / 200 Gbit switches)
    • 25 Gbit Ethernet / Node with 100 Gbit uplinks to core switches
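
To run on the GPU nodes described above, a job requests GPUs through SLURM's generic-resource mechanism. A sketch of the relevant directives follows; the partition and account names are assumptions, and the actual partition names should be checked with `sinfo` on the system:

```shell
#!/bin/bash
# Sketch of SLURM directives for one A100 GPU. Partition and account
# names are assumptions -- verify with `sinfo` and your allocation.
#SBATCH -A lu2024-x-xx          # compute project allocation (placeholder)
#SBATCH -p gpua100              # GPU partition name (assumption)
#SBATCH --gres=gpu:1            # request one GPU on the node
#SBATCH -t 00:30:00

module load CUDA                # example module name (assumption)
srun ./my_gpu_program           # placeholder binary
```

Requesting `--gres=gpu:1` rather than a whole node lets the scheduler share a multi-GPU node between jobs where the system configuration allows it.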