Managing multiple SSH agents
Revision as of 13:31, 9 March 2018
Start the service with <code>launchctl start org.wmflabs.ssh-agent</code>.
This will start an ssh agent instance every time you log in, reachable at <code>/private/tmp/.ssh-agent-labscloud</code>.
Repeat the process for every domain you're connecting to.
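The launchd service referenced above is not included in this excerpt. The following is a hedged sketch of what a matching launchd agent plist (e.g. ~/Library/LaunchAgents/org.wmflabs.ssh-agent.plist) might contain; the exact ProgramArguments are assumptions, not the original page's file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Label must match the name passed to launchctl start -->
  <key>Label</key>
  <string>org.wmflabs.ssh-agent</string>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/bin/ssh-agent</string>
    <!-- -D: stay in the foreground so launchd can supervise the process -->
    <string>-D</string>
    <!-- -a: bind the agent socket at a fixed, known path -->
    <string>-a</string>
    <string>/private/tmp/.ssh-agent-labscloud</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
</dict>
</plist>
```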
This requires a Linux distribution that uses systemd as its init system (all current releases do, e.g. Debian jessie or Ubuntu 15.10 and later).
You can start multiple ssh-agents through systemd user units. For example, the following unit connects to Toolforge/CloudVPS; copy it to /usr/lib/systemd/user/ssh-labscloud.service (and create similar units for wherever else you want to connect):
ExecStart=/usr/bin/ssh-agent -a $SSH_AUTH_SOCK
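The ExecStart line above is only one directive of the unit. A minimal complete unit file might look like the sketch below; the Description text and the use of %t (which systemd expands to $XDG_RUNTIME_DIR in user units, matching the socket path described next) are assumptions:

```ini
[Unit]
Description=SSH agent for Toolforge/CloudVPS

[Service]
# ssh-agent daemonizes itself by default
Type=forking
# %t expands to $XDG_RUNTIME_DIR (usually /run/user/<uid>) in user units
Environment=SSH_AUTH_SOCK=%t/ssh-labscloud.socket
ExecStart=/usr/bin/ssh-agent -a $SSH_AUTH_SOCK

[Install]
WantedBy=default.target
```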
Then run the following command as your regular user (and similarly for the other agent(s)):
systemctl --user enable ssh-labscloud
This will create the agent socket ssh-labscloud.socket inside the $XDG_RUNTIME_DIR directory (which is created automatically and usually refers to /run/user/1000/, so the effective SSH agent socket would be /run/user/1000/ssh-labscloud.socket).
Start the agent as follows to check that the systemd user unit works properly. You only need to do this once; afterwards the unit will be started automatically at your first login.
systemctl --user start ssh-labscloud.service
Host *.wmflabs gerrit.wikimedia.org *.wmflabs.org
User foo
IdentityFile /home/foo/.ssh/id_labscloud
IdentityAgent /run/user/1000/ssh-labscloud.socket
IdentitiesOnly yes
ForwardAgent no
If you don't have OpenSSH 7.3 yet (which introduced IdentityAgent), you need to set the environment variable SSH_AUTH_SOCK to the respective socket before connecting, e.g.
export SSH_AUTH_SOCK="/run/user/1000/ssh-labscloud.socket"
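Exporting SSH_AUTH_SOCK affects every ssh invocation in that shell; you can instead scope it to a single command. The small wrapper below is an illustration only — the cloudssh name is made up here, and the socket path assumes the $XDG_RUNTIME_DIR layout described above:

```shell
# Hypothetical wrapper: use the Toolforge/CloudVPS agent socket for one
# ssh invocation only, leaving the rest of the shell untouched.
cloudssh() {
    SSH_AUTH_SOCK="$XDG_RUNTIME_DIR/ssh-labscloud.socket" ssh "$@"
}
```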
=== The simplest solution ===
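Both functions below call a persistent_agent helper that is not included in this excerpt. The following is a hedged sketch of what it plausibly does, reconstructed from the transcript output shown later ("triggering new agent", the "Agent pid" line, and the socket symlink); the exact implementation on the original page may differ:

```shell
# Sketch (assumption) of the persistent_agent helper used below: reuse the
# agent behind the given socket symlink if it still answers, otherwise
# start a fresh agent and symlink its socket at the stable path.
persistent_agent() {
    local link="$1"
    mkdir -p "$(dirname "$link")"
    # ssh-add exits with status 2 when no agent is reachable via the socket
    SSH_AUTH_SOCK="$link" ssh-add -l >/dev/null 2>&1
    if [ "$?" -eq 2 ]; then
        echo "triggering new agent"
        eval "$(ssh-agent -s)"           # prints "Agent pid NNN"
        ln -sfv "$SSH_AUTH_SOCK" "$link" # prints the symlink it creates
    fi
    export SSH_AUTH_SOCK="$link"
}
```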
Next we set up a function specifically for connecting to Toolforge/CloudVPS:
# ssh into Toolforge/CloudVPS with an isolated agent
function labscloud() {
    persistent_agent /tmp/$USER-ssh-agent/labscloud-agent
    # add the key if necessary
    if ! ssh-add -l | grep -q labscloud-key-rsa; then
        ssh-add ~/.ssh/labscloud-key-rsa
    fi
    ssh -A -D 8080 bastion.wmflabs.org
}
And one to copy files into Toolforge/CloudVPS with scp:
# scp into Toolforge/CloudVPS with an isolated agent
function labscpcloudcp() {
    persistent_agent /tmp/$USER-ssh-agent/labscloud-agent
    # add the key if necessary
    if ! ssh-add -l | grep -q labscloud-key-rsa; then
        ssh-add ~/.ssh/labscloud-key-rsa
    fi
    scp "$@"
}
For comparison, your default agent normally holds only your regular key:
2048 25:9e:91:d5:2f:be:73:e8:ff:37:63:ae:83:5b:33:e1 /Users/ben/.ssh/id_rsa (RSA)
The first time (in a given day) you connect to Toolforge/CloudVPS, you are prompted to enter the passphrase for your key, and when you get to bastion, it can only see your Toolforge/CloudVPS key:
ben@green:~$ labscloud
triggering new agent
Agent pid 32638
`/tmp/ben-ssh-agent/labscloud-agent' -> `/tmp/ssh-YfZWc32637/agent.32637'
Enter passphrase for /home/ben/.ssh/labscloud-key:
Identity added: /home/ben/.ssh/labscloud-key (/home/ben/.ssh/labscloud-key)
[motd excerpted]
ben@bastion:~$ ssh-add -l
2048 60:a2:b5:a5:fe:47:07:d6:d5:78:50:50:ba:50:14:46 /home/ben/.ssh/labscloud-key (RSA)
When connecting in subsequent shells (until the end of the day, when you log out of your workstation and all your agents are killed), you are connected without being prompted for your passphrase.
ben@green:~$ labscloud
[motd excerpted]
Copying files means just using labscpcloudcp instead of scp:
ben@green:~$ labscpcloudcp foo bastion.wmflabs.org:/tmp/
foo 100% 43KB 43.0KB/s 00:00
But when you log out of bastion (in any connection), your normal key is once again available for connecting to personal or other hosts.
Arturo Borrero Gonzalez