Interactive R on an HPC cluster

How to run R in an interactive session.
Bash, Linux, LSF, R, SLURM

Here we’re using the Harvard O2 (Orchestra 2) cluster as an example environment.

SSH config

First, set up an SSH keypair and save the public key on your remote host at ~/.ssh/authorized_keys.
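Key generation and copying can be sketched like this (`user@remote-host` is a placeholder for your actual O2 login, not a real address):

```shell
# Generate an RSA keypair; press Enter to accept the default path ~/.ssh/id_rsa
ssh-keygen -t rsa -b 4096

# Append the public key to ~/.ssh/authorized_keys on the remote host
ssh-copy-id user@remote-host
```

If ssh-copy-id isn't available, you can append ~/.ssh/id_rsa.pub to the remote ~/.ssh/authorized_keys manually.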

For R to work properly in an interactive session on an HPC cluster, you have to enable X11 forwarding. For O2, this requires adding some host options on your local machine at ~/.ssh/config (the example below assumes macOS with XQuartz; UseKeychain and the XAuthLocation path are macOS-specific):

Host *
    AddKeysToAgent yes
    Compression yes
    IdentityFile ~/.ssh/id_rsa
    ServerAliveInterval 10
    UseKeychain yes
    XAuthLocation /opt/X11/bin/xauth
    ForwardAgent yes
    ForwardX11 yes
    ForwardX11Trusted yes

Now let’s log in to the remote server. For X11 forwarding to work, make sure you pass the -X and -Y flags when logging in over SSH (-Y enables trusted X11 forwarding). The -C flag enables optional compression.

ssh -CXY user@remote-host

Once logged in, launch an interactive session using the SLURM srun command:

# Request memory in megabytes (SLURM's --mem takes MB by default)
ram_gb=8  # adjust to your needs
ram_mb="$(($ram_gb * 1024))"

srun -p interactive --pty --mem "$ram_mb" --time 0-12:00 --x11 /bin/bash

Before loading R, create a ~/.Renviron file and set up a user package library:
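A minimal ~/.Renviron for this purpose sets R_LIBS_USER to the library directory mentioned below:

```shell
# ~/.Renviron — point R at a personal package library
R_LIBS_USER="~/R/library"
```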


Make sure that the ~/R/library directory exists.

Now load up R. There are two options that work well on O2: the preconfigured R module or r-base managed with conda. Starting out we recommend working with the module:

# module spider R/3.4.1
module load gcc/6.2.0
module load R/3.4.1

Once R is loaded, check to make sure that the graphics are working properly.
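One quick way to check, assuming the R module is loaded and you are inside the X11-enabled srun session, is to print R's compiled-in capabilities from the shell:

```shell
# List which graphics/device capabilities this R build supports
Rscript -e 'capabilities()'
```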


Check to ensure that jpeg, png, tiff, X11, and cairo are all TRUE.

If this is the case, you should be all set to run R remotely.