Requesting HPC account
To open an HPC account, please fill in and submit the HPC Account request form.
- For an account on the ATHENA/DGX GPU cluster, please specify the “GPU Cluster Users” Affiliation.
- For an account on the ZEUS CPU cluster, please choose the “CPU Cluster Users” Affiliation.
- The “Technion – Attached Server Owner” Affiliation is for accounts on the MAFAT cluster
or any other private cluster (check with your PI – Research Group leader).
- The “Academic Course Teaching” Affiliation is for students who will use HPC resources
for an academic course (check with your Teacher/PI). Please indicate the
course number as the Research Title.
- The PI – Research Group leader (academic staff member) – should be specified as the Budget Owner.
- Research group members should choose “Another person” as Budget Owner and enter the PI’s details.
- Please fill in all fields of the form in English. Upon account opening, a notification letter with detailed
instructions will be sent to your Technion e-mail address.
Getting Started on ZEUS (CPU-based computing)
- The username and password are identical to your Technion mail credentials.
- Your default disk quota for the home directory is 300 GB, with 7 daily snapshots defined. You can check your personal quota with the command: quota-u
- You can access ZEUS via ssh from any Linux computer within the Technion network, using the command:
ssh username@zeus.technion.ac.il, with your Technion password.
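To shorten the login command, you can add a host alias to your SSH client configuration; the alias name `zeus` and the placeholder `username` below are examples:

```shell
# Append a host alias to ~/.ssh/config so that "ssh zeus" expands to the
# full login command (replace "username" with your Technion username)
cat >> ~/.ssh/config <<'EOF'
Host zeus
    HostName zeus.technion.ac.il
    User username
EOF
```

After this, `ssh zeus` connects with the stored hostname and username.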
- For Windows users it is recommended to download and install:
- MobaXterm home edition of the X-windows emulator and use its terminal to establish an SSH connection. (license)
- WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
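The problem is the Windows CRLF line endings; `dos2unix` strips the carriage returns in place. A quick local illustration (using `tr` as a portable stand-in for `dos2unix`, in case the latter is not installed):

```shell
# Create a sample file with Windows (CRLF) line endings
printf 'first line\r\nsecond line\r\n' > results.txt
# dos2unix results.txt does this in place; tr -d '\r' is an equivalent
# that is available on any Unix system
tr -d '\r' < results.txt > results.tmp && mv results.tmp results.txt
```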
- For WIFI connection in the Technion please use the TechSec network.
- Please note that there is no direct access to ZEUS outside the Technion network (e.g. from home), unless you are connected via VPN. To establish VPN connection, please follow the instructions.
- Running interactive jobs on the ZEUS login node is prohibited. To use graphical software and/or graphical post-processing, you can log in to the ZEUS-POST server using the command:
ssh username@zeus-post.technion.ac.il, with your Technion password.
- The Portable Batch System (PBS) must be used to run jobs on ZEUS. For help creating PBS scripts you can use the PBS script generator. Please note that it is strictly prohibited to use any applications or scripts that launch PBS jobs from compute nodes.
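A minimal PBS batch script might look like the following sketch; the job name, queue, and resource counts are illustrative and should be adapted to your job (or produced with the PBS script generator):

```shell
#!/bin/bash
#PBS -N example_job               # job name (illustrative)
#PBS -q zeus_all_q                # a public queue with a 24 h walltime limit
#PBS -l select=1:ncpus=4:mem=8gb  # one chunk: 4 cores, 8 GB RAM (assumed syntax)
#PBS -l walltime=02:00:00         # must fit within the queue's walltime policy

# PBS starts the job in your home directory; move to the submission directory
cd "${PBS_O_WORKDIR:-$PWD}"
echo "Job running on $(hostname)"
```

Submit the script from the login node with `qsub script.sh` and monitor it with `qstat -u $USER`.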
- PBS CPU queues
Users who specify the “Technion General Users” Affiliation can submit jobs to the following public queues on ZEUS:
| Queue Name | Unused CPUs | Allocated CPUs | Total CPUs | Unused Memory (GB) | Allocated Memory (GB) | Total Memory (GB) | Running Jobs | Queued Jobs | Avg Wait Time | Queue Policy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| zeus_all_q | 1318 | 42 | 1360 | 6340 | 74 | 6414 | 6 | 0 | 0 | Walltime: 24:00:00 |
| zeus_long_q | 290 | 830 | 1120 | 2171 | 3112 | 5282 | 14 | 0 | 0 | Walltime: 336:00:00 |
| zeus_short_q | 1340 | 20 | 1360 | 6414 | 0 | 6414 | 1 | 0 | 0 | Walltime: 03:00:00 |
| zeus_new_q | 1494 | 42 | 1536 | 12066 | 20 | 12087 | 2 | 0 | 0 | Walltime: 72:00:00 |
| zeus_combined_q | 17714 | 134 | 17848 | 78247 | 1055 | 79302 | 26 | 0 | 0 | Walltime: 24:00:00 |
| zeus_comb_short | 11000 | 0 | 11000 | 49089 | 0 | 49089 | 0 | 0 | 0 | Walltime: 03:00:00 |
There are 2 combined heterogeneous queues on Zeus, which contain public nodes and various private nodes:
- zeus_combined_q with priority 80, wall time limit of 24 hours.
- zeus_comb_short queue with priority 80, wall time limit of 3 hours.
The limit on all public queues is 600 cores per user.
- PBS GPU queues
gpu_v100_q – 2 GPU nodes. Each GPU node contains 40 cores, 4 Tesla V100 graphics cards, and 378 GB RAM.
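In PBS Pro, GPUs are typically requested with the `ngpus` resource; a sketch of the relevant directives (the exact resource names and limits on this cluster may differ):

```shell
#PBS -q gpu_v100_q
#PBS -l select=1:ncpus=10:ngpus=1   # a quarter of a node: 10 cores, 1 V100 card
#PBS -l walltime=04:00:00
```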
- PBS billing
CPU usage – 0.015 NIS per core-thread per hour
GPU usage – 0.18 NIS per GPU card per hour
At the beginning of each month, all HPC users receive detailed reports on the PBS jobs they ran during the previous month.
The budget owners receive reports containing the CPU/GPU usage for each user in their group and the total fees for the previous month.
They also receive a message with payment instructions. The up-to-date Zeus usage can be checked by the command prj_usage.
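As a worked example of the tariff above (the job sizes are hypothetical), a job's cost is resources × hours × rate:

```shell
# Cost of a hypothetical 24-hour job using 32 core-threads and 4 GPU cards
awk 'BEGIN {
    cpu = 32 * 24 * 0.015   # core-threads * hours * 0.015 NIS/core-hour
    gpu = 4  * 24 * 0.18    # GPU cards    * hours * 0.18  NIS/GPU-hour
    printf "CPU: %.2f NIS, GPU: %.2f NIS, total: %.2f NIS\n", cpu, gpu, cpu + gpu
}'
# prints: CPU: 11.52 NIS, GPU: 17.28 NIS, total: 28.80 NIS
```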
Getting Started on MARS (special purpose cluster)
- The username and password are identical to your Technion mail credentials.
- Your default disk quota for the home directory is 300 GB, with 7 daily snapshots defined. You can check your personal quota with the command: quota-u
- You can access MARS via ssh from any Linux computer within the Technion network, using the command:
ssh username@mars.technion.ac.il or ssh username@zeus.technion.ac.il, with your Technion password.
- For Windows users it is recommended to download and install:
- MobaXterm home edition of the X-windows emulator and use its terminal to establish an SSH connection. (license)
- WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
- For WIFI connection in the Technion please use the TechSec network.
- Please note that there is no direct access to MARS outside the Technion network (e.g. from home), unless you are connected via VPN. To establish a VPN connection, please follow the instructions.
- Running interactive jobs on the login node is prohibited. To use graphical software and/or graphical post-processing, you can log in to the ZEUS-POST server using the command:
ssh username@zeus-post.technion.ac.il, with your Technion password.
- The Portable Batch System (PBS) must be used to run jobs. For help creating PBS scripts you can use the PBS script generator. Please note that it is strictly prohibited to use any applications or scripts that launch PBS jobs from compute nodes.
- PBS CPU queues
Please note that as a MAFAT (MARS) user you are entitled to submit your batch jobs from the MARS server to the following PBS queues:
mafat_new_q – 18 compute nodes, each compute node contains 256 core-threads/1 TB RAM
mafat10_q – 10 compute nodes, each compute node contains 384 core-threads/1.5 TB RAM
mafat14_512_q – 14 compute nodes, each compute node contains 512 core-threads/1.5 TB RAM
PBS jobs submitted to these queues are not charged for usage.
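A job that needs a full large-memory node might request it along these lines (the resource syntax is assumed; adjust `ncpus`/`mem` to the node type you target and leave some memory headroom for the OS):

```shell
#PBS -q mafat10_q
#PBS -l select=1:ncpus=384:mem=1400gb   # one full 384-thread, 1.5 TB node
#PBS -l walltime=24:00:00
```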
Getting Started on ATHENA (GPU-based computing)
- The Athena cluster consists of 9 GPU nodes, each with 8 Nvidia A100 graphics cards.
- The username and password are identical to your Technion mail credentials.
- Your HPC account is a member of the docker group, which grants you access to the Athena cluster.
- Your default disk quota for the home directory is 300 GB, with 7 daily snapshots defined. You can check your personal quota with the command: quota-u
- You can access ATHENA via ssh from any Linux computer within the Technion network, using the command:
ssh username@dgx-master.technion.ac.il, with your Technion password.
- For Windows users it is recommended to download and install:
- MobaXterm home edition of the X-windows emulator and use its terminal to establish an SSH connection. (license)
- WinSCP for file transfer to/from the HPC server. Note that text files transferred from Windows may contain unprintable characters; it is recommended to run dos2unix <filename> on such files.
- For WIFI connection in the Technion please use the TechSec network.
- Please note that there is no direct access to Athena outside the Technion network (e.g. from home), unless you are connected via VPN. To establish VPN connection, please follow the instructions.
- It is recommended to place your work files under the folder $HOME/work, which is part of your common group storage space /rg/groupname_prj and is limited by the group quota. The default quota is 2 TB. You can check your group quota with the command: quota-g.
Additional storage space can be purchased from the Technion CIS shop in increments of 1 TB; the annual price is 450 NIS per TB.
Only Technion academic staff members holding a research project group registered on the Athena cluster are eligible to purchase additional storage space.
Group ownership has been set for the $HOME/work directories on the Athena cluster. This means that each member of a project group
can grant read access to his/her /home/username/work directory for the other group members with the following command:
chmod -R g+rX $HOME/work
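The capital X in g+rX grants group execute (traversal) only on directories and on files that are already executable, so plain data files stay non-executable. A quick local demonstration (the directory and file names are examples):

```shell
mkdir -p work/sub && touch work/sub/data.txt
chmod -R g-rwx work     # start with no group permissions at all
chmod -R g+rX work      # group read everywhere; execute only on directories
ls -ld work/sub work/sub/data.txt
# the directory ends up with group r-x, the plain file with group r--
```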
- Your Technion/campus e-mail address is a member of the mailing list cis-gpu-users-l.
Please follow messages to this mailing list for updates regarding the Athena cluster.
Additional information regarding the Athena cluster and detailed user instructions can be found on the
Athena GPU Cluster page.