Managing multiple SSH agents
This describes a method for maintaining a separate ssh-agent to hold your ssh key for connecting to Toolforge/CloudVPS.
The problem
You use an ssh-agent to connect to your personal or company systems. You want to connect to Toolforge/CloudVPS using an agent, and you created a separate ssh key for that purpose, but you don't want to forward your personal key to Toolforge/CloudVPS systems. If you simply add both keys to your existing agent, both get forwarded to Toolforge/CloudVPS. Constantly removing your personal key from your agent each time you want to connect to Toolforge/CloudVPS is a pain. Additionally, you might be connected to both your personal systems and Toolforge/CloudVPS simultaneously, so just removing the key is insufficient; you must run a separate ssh-agent. You don't want to run one agent per connection, because then you have to type your passphrase on every connection (and you have a nice long secure passphrase on your key).
This page describes a method for getting your shell to maintain two agents, your primary agent and your Toolforge/CloudVPS agent. When you connect to Toolforge/CloudVPS you connect to the existing Toolforge/CloudVPS agent (or create one if it doesn't exist) and the rest of the time you use your default agent.
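The mechanism underneath can be sketched with a throwaway agent; the socket path and expected output below are illustrative, not the paths configured later on this page:

```shell
# Each ssh-agent is an independent process listening on a Unix socket, and
# SSH_AUTH_SOCK tells ssh/ssh-add which agent to talk to.
SOCK="$(mktemp -d)/cloud-agent.sock"   # throwaway socket path for this sketch

# Start a dedicated agent bound to that socket.
ssh-agent -a "$SOCK" > /dev/null

# Commands pointed at this socket see only this agent's keys; your default
# agent (whatever SSH_AUTH_SOCK normally points at) is unaffected.
SSH_AUTH_SOCK="$SOCK" ssh-add -l || true   # fresh agent: "The agent has no identities."
```

The rest of this page is about arranging for such a dedicated agent to be started once per login session rather than by hand.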
OS X solution
Using multiple agents via launchd (better)
This has been tested on macOS Catalina. It should also work on older releases; please update this text if it works with later versions of macOS.
You can start multiple ssh-agents through launchd user LaunchAgents.
To set this up, write the following plist to ~/Library/LaunchAgents/org.wmflabs.ssh-agent.plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>org.wmflabs.ssh-agent</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/ssh-agent</string>
        <string>-l</string>
    </array>
    <key>ServiceIPC</key>
    <true/>
    <key>Sockets</key>
    <dict>
        <key>Listeners</key>
        <dict>
            <key>SockPathName</key>
            <string>/private/tmp/.ssh-agent-cloud</string>
        </dict>
    </dict>
    <key>RunAtLoad</key>
    <true/>
    <key>EnableTransactions</key>
    <true/>
</dict>
</plist>
Then load the agent with launchctl load ~/Library/LaunchAgents/org.wmflabs.ssh-agent.plist and, if you want to start it immediately, run launchctl start org.wmflabs.ssh-agent.
This will start an ssh-agent instance every time you log in, reachable at /private/tmp/.ssh-agent-cloud.
Repeat the process, with a distinct label and socket path, for every domain you connect to.
You can then proceed as suggested in the Linux section below to configure ssh. Please note that at the time of writing, OpenSSH 7.3 is only available via Homebrew. However, do NOT use Homebrew's ssh-agent in the launch agent, as it does not interact well with launchd.
Run one agent per terminal
The default terminal application can be configured so that every tab runs its own ssh-agent: in Terminal's preferences, under the Shell settings of your profile, set the startup command to run an agent wrapping your shell (for example /usr/bin/ssh-agent /bin/zsh).
Also ensure 'Run in Shell' is checked.
New tabs you open will now use this setting. Note that each tab's agent is separate, so you will enter your passphrase once per tab.
Linux solutions
Using multiple agents via systemd
This requires a Linux distribution that uses systemd as its init system, which is the norm these days.
You can start multiple ssh-agents through systemd user units, creating one unit per instance. For example, to connect to Toolforge/CloudVPS, copy the following file to /etc/systemd/user/ssh-cloud.service to make it available to all users, or to $HOME/.local/share/systemd/user/ssh-cloud.service (creating the directory tree if necessary) to make it available only to your user:
[Unit]
Description=SSH authentication agent for Toolforge/CloudVPS

[Service]
Type=forking
Environment=SSH_AUTH_SOCK=%t/ssh-cloud.socket
ExecStart=/usr/bin/ssh-agent -a $SSH_AUTH_SOCK

[Install]
WantedBy=default.target
Then run the following command as your regular user (and similar for the other agent(s)):
systemctl --user enable ssh-cloud
This will create the agent socket ssh-cloud.socket inside the $XDG_RUNTIME_DIR directory (which is automatically created and usually refers to /run/user/1000/, so the effective SSH agent socket would be /run/user/1000/ssh-cloud.socket).
Start the agent as follows to check that the systemd user unit works properly. There is no need to do this later on; after this, the unit will be started automatically at your first login.
systemctl --user start ssh-cloud.service
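To check that the agent behind the socket actually answers, point ssh-add at it. This assumes the unit above is active and XDG_RUNTIME_DIR is set, as it normally is in a systemd login session:

```shell
# A freshly started agent reports no identities; "|| true" keeps the empty
# listing (exit status 1) from aborting a script run under "set -e".
SSH_AUTH_SOCK="$XDG_RUNTIME_DIR/ssh-cloud.socket" ssh-add -l || true
```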

Finally, whenever you want to connect to either Toolforge/CloudVPS or production via SSH, you need to point your SSH client at the respective agent socket:
If you're using OpenSSH 7.3 (available in Debian unstable since 7 August 2016), this is really simple: use the new IdentityAgent directive, so wherever you configure the IdentityFile, add the respective SSH agent socket created by the systemd user units above. Here's an example for configuring access to Toolforge/CloudVPS:
Host *.wmflabs *
    User foo
    IdentityFile /home/foo/.ssh/id_cloud
    IdentityAgent /run/user/1000/ssh-cloud.socket
    IdentitiesOnly yes
    ForwardAgent no
    # once the identity is used for the first time, add it to the agent
    AddKeysToAgent yes
If you don't have OpenSSH 7.3 yet, you need to set the environment variable SSH_AUTH_SOCK to the respective socket before connecting, e.g.
export SSH_AUTH_SOCK="/run/user/1000/ssh-cloud.socket"
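Either way, a small wrapper function saves you from exporting the variable by hand each time. The name cloud_ssh is just an example, not something defined elsewhere on this page:

```shell
# Run ssh against the dedicated Toolforge/CloudVPS agent without changing
# the SSH_AUTH_SOCK of the current shell. The socket path matches the
# systemd unit above; $(id -u) replaces the hard-coded uid 1000.
cloud_ssh() {
    SSH_AUTH_SOCK="${XDG_RUNTIME_DIR:-/run/user/$(id -u)}/ssh-cloud.socket" ssh "$@"
}
```

For example, cloud_ssh yourhost.wmflabs (hostname illustrative) then uses only the keys held by the ssh-cloud agent.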
Last edited on 2 November 2020, at 15:21
Content is available under CC BY-SA 3.0 unless otherwise noted.