HPC – Getting Started

Requesting HPC account

To open an HPC account, please fill in and submit the HPC Account request form.

  1. For an account on the ATHENA/DGX GPU cluster, please specify the “GPU Cluster Users” Affiliation.
  2. For an account on the ZEUS CPU cluster, please choose the “CPU Cluster Users” Affiliation.
  3. The “Technion – Attached Server Owner” Affiliation is for accounts on the MAFAT cluster
    or any other private cluster (should be checked with your PI – Research Group leader).
  4. The “Academic Course Teaching” Affiliation is for students who will use HPC resources
    for an academic course (should be checked with your teacher/PI). Please indicate the
    course number as the Research Title.
  5. The PI – Research Group leader (academic staff member) – should be specified as the Budget Owner.
  6. Research group members should choose “Another person” as Budget Owner and enter the PI’s details.
  7. Please fill in all fields of the form in English. Upon account opening, a notification letter with detailed
    instructions will be sent to your Technion e-mail address.

Getting Started on ZEUS (CPU-based computing)

  1. The username and password are identical to your Technion mail credentials.   
  2. Your default disk quota for the home directory is 300 GB, and 7 daily snapshots are kept. You can check your personal quota with the command: quota-u
  3. You can access ZEUS via SSH from any Linux computer within the Technion network, using the command:
    ssh username@zeus.technion.ac.il
    and authenticating with your Technion password.
  4. Windows users are recommended to download and install:
    1. MobaXterm Home Edition, an X-Windows emulator; use its terminal to establish an SSH connection (license).
    2. WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
  5. For WiFi connection at the Technion, please use the TechSec network.
  6. Please note that there is no direct access to ZEUS from outside the Technion network (e.g. from home) unless you are connected via VPN. To establish a VPN connection, please follow the instructions.
  7. Running interactive jobs on the ZEUS login node is prohibited. To use graphical software and/or graphical post-processing, you can log in to the ZEUS-POST server using the command:
    ssh username@zeus-post.technion.ac.il
    and authenticating with your Technion password.
  8. The Portable Batch System (PBS) must be used to run jobs on ZEUS. For help creating PBS scripts you can use the PBS script generator. Please note that it is strictly prohibited to use any applications or scripts that launch PBS jobs from compute nodes.
  9. PBS CPU queues
    Users who specify the “Technion General Users” Affiliation can submit jobs to the following public queues on Zeus:
    | Queue Name | Unused CPUs | Allocated CPUs | Total CPUs | Unused Memory (GB) | Allocated Memory (GB) | Total Memory (GB) | Running Jobs | Queued Jobs | Avg Wait Time | Queue Policy |
    |---|---|---|---|---|---|---|---|---|---|---|
    | zeus_all_q | 1318 | 42 | 1360 | 6340 | 74 | 6414 | 6 | 0 | 0 | Walltime: 24:00:00 |
    | zeus_long_q | 290 | 830 | 1120 | 2171 | 3112 | 5282 | 14 | 0 | 0 | Walltime: 336:00:00 |
    | zeus_short_q | 1340 | 20 | 1360 | 6414 | 0 | 6414 | 1 | 0 | 0 | Walltime: 03:00:00 |
    | zeus_new_q | 1494 | 42 | 1536 | 12066 | 20 | 12087 | 2 | 0 | 0 | Walltime: 72:00:00 |
    | zeus_combined_q | 17714 | 134 | 17848 | 78247 | 1055 | 79302 | 26 | 0 | 0 | Walltime: 24:00:00 |
    | zeus_comb_short | 11000 | 0 | 11000 | 49089 | 0 | 49089 | 0 | 0 | 0 | Walltime: 03:00:00 |
    There are 2 combined heterogeneous queues on Zeus, which contain public nodes and various private nodes:
    1. zeus_combined_q with priority 80, wall time limit of 24 hours.
    2. zeus_comb_short queue with priority 80, wall time limit of 3 hours.
    The limit on all public queues is 600 cores per user.
  10. PBS GPU queues
    gpu_v100_q – 2 GPU nodes; each node has 40 cores, 4 Tesla V100 graphics cards, and 378 GB RAM.
  11. PBS billing
    CPU usage – 0.015 NIS per core-thread per hour
    GPU usage – 0.18 NIS per GPU card per hour
    At the beginning of each month, all HPC users receive detailed reports on the PBS jobs they ran during the previous month.
    Budget owners receive reports with the CPU/GPU usage of each user in their group and the total fees for the previous month,
    along with a message with payment instructions. Up-to-date Zeus usage can be checked with the command prj_usage.
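As a sketch, a minimal PBS job script for one of the public queues above might look like the following. The job name, queue choice, and resource requests are illustrative only — adjust them to your workload:

```shell
# Create a minimal PBS job script (all values are illustrative).
cat > myjob.pbs <<'EOF'
#!/bin/bash
#PBS -N my_first_job
#PBS -q zeus_short_q
#PBS -l select=1:ncpus=4:mem=8gb
#PBS -l walltime=01:00:00
cd "$PBS_O_WORKDIR"
echo "Running on $(hostname)"
EOF
echo "wrote myjob.pbs"
```

Submit it from the login node with `qsub myjob.pbs` and monitor it with `qstat -u $USER`; remember that launching PBS jobs from compute nodes is prohibited.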

Getting Started on MARS (special purpose cluster)

  1. The username and password are identical to your Technion mail credentials.   
  2. Your default disk quota for the home directory is 300 GB, and 7 daily snapshots are kept. You can check your personal quota with the command: quota-u
  3. You can access MARS via SSH from any Linux computer within the Technion network, using the command:
    ssh username@mars.technion.ac.il   or   ssh username@zeus.technion.ac.il
    and authenticating with your Technion password.
  4. Windows users are recommended to download and install:
    1. MobaXterm Home Edition, an X-Windows emulator; use its terminal to establish an SSH connection (license).
    2. WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
  5. For WiFi connection at the Technion, please use the TechSec network.
  6. Please note that there is no direct access to MARS from outside the Technion network (e.g. from home) unless you are connected via VPN. To establish a VPN connection, please follow the instructions.
  7. Running interactive jobs on the login node is prohibited. To use graphical software and/or graphical post-processing, you can log in to the ZEUS-POST server using the command:
    ssh username@zeus-post.technion.ac.il
    and authenticating with your Technion password.
  8. The Portable Batch System (PBS) must be used to run jobs. For help creating PBS scripts you can use the PBS script generator. Please note that it is strictly prohibited to use any applications or scripts that launch PBS jobs from compute nodes.
  9. PBS CPU queues

Please note that as a MAFAT (MARS) user you are entitled to submit batch jobs from the MARS server to the following PBS queues:

mafat_new_q – 18 compute nodes, each with 256 core-threads and 1 TB RAM

mafat10_q – 10 compute nodes, each with 384 core-threads and 1.5 TB RAM

mafat14_512_q – 14 compute nodes, each with 512 core-threads and 1.5 TB RAM

PBS jobs submitted to these queues are not charged for usage.
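Each of these queues serves nodes of a different size, so the queue should match your per-node thread request. A small helper like the following (hypothetical, for illustration only) maps a requested thread count to the smallest queue whose nodes can hold it:

```shell
# Map a per-node thread request to a MAFAT queue (node sizes from the list above).
pick_mafat_queue() {
  threads=$1
  if   [ "$threads" -le 256 ]; then echo mafat_new_q      # 256 threads, 1 TB RAM
  elif [ "$threads" -le 384 ]; then echo mafat10_q        # 384 threads, 1.5 TB RAM
  elif [ "$threads" -le 512 ]; then echo mafat14_512_q    # 512 threads, 1.5 TB RAM
  else
    echo "no single MAFAT node fits $threads threads" >&2
    return 1
  fi
}

pick_mafat_queue 300    # prints mafat10_q
```

It could then be used as, e.g., `qsub -q "$(pick_mafat_queue 300)" myjob.pbs` — again, purely illustrative.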

Getting Started on ATHENA (GPU-based computing)

    1. The Athena cluster consists of 9 GPU nodes, each with 8 Nvidia A100 graphics cards.
    2. The username and password are identical to your Technion mail credentials.
    3. Your HPC account is a member of the docker group, which grants you access to the Athena cluster.
    4. Your default disk quota for the home directory is 300 GB, and 7 daily snapshots are kept. You can check your personal quota with the command: quota-u
    5. You can access ATHENA via SSH from any Linux computer within the Technion network, using the command:
      ssh username@dgx-master.technion.ac.il
      and authenticating with your Technion password.
    6. Windows users are recommended to download and install:
      1. MobaXterm Home Edition, an X-Windows emulator; use its terminal to establish an SSH connection (license).
      2. WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
    7. For WiFi connection at the Technion, please use the TechSec network.
    8. Please note that there is no direct access to Athena from outside the Technion network (e.g. from home) unless you are connected via VPN. To establish a VPN connection, please follow the instructions.
    9. It is recommended to place your work files under the folder $HOME/work, part of your group’s shared storage space /rg/groupname_prj, which is limited by the group quota (2 TB by default). You can check your group quota with the command: quota-g

Additional storage space can be purchased from the Technion CIS shop in increments of 1 TB. The annual price is 450 NIS per TB.

Only Technion Academic Staff members holding a research project group registered on the Athena cluster are eligible to purchase additional storage space.

Group ownership for the $HOME/work directories has been set on the Athena cluster. This means that each member of a project group can grant read access to his/her /home/username/work directory to other group members with the following command:

    chmod -R g+rX $HOME/work
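To see the effect of this command without touching your real work directory, you can rehearse it on a throwaway directory (the temporary path below is a stand-in for $HOME/work):

```shell
# Rehearse the permission change on a temporary directory.
demo="$(mktemp -d)/work"
mkdir -p "$demo"
chmod -R g+rX "$demo"   # g+r: group may read; g+X: group may traverse directories
ls -ld "$demo"          # the group bits should now include r and x
```

Note that the capital X grants execute permission only where it makes sense — directories, or files that are already executable — so regular data files stay non-executable.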

10. Your Technion/campus e-mail address is subscribed to the mailing list cis-gpu-usersl.
Please follow messages to this mailing list for updates regarding the Athena cluster.
Additional information regarding the Athena cluster and detailed user instructions can be found in Athena GPU Cluster.

HPC Support

Please see the following Instructions for all types of support.