Managing multiple SSH agents
This describes a method for maintaining a separate ssh-agent to hold your ssh key for connecting to Toolforge/CloudVPS.
 
== The problem ==
You use an ssh-agent to connect to your personal or company systems. You want to connect to Toolforge/CloudVPS using an agent, and you created a separate ssh key for it, but you don't want to forward your personal key to Toolforge/CloudVPS systems. If you just add both keys to your existing agent, they both get forwarded to Toolforge/CloudVPS. It's a pain to remove your personal key from your agent each time you want to connect to Toolforge/CloudVPS. Additionally, you might be connected to both your personal system and Toolforge/CloudVPS simultaneously, so just removing the key is insufficient; you must run a separate ssh-agent. You don't want to run one agent per connection because then you have to type your passphrase on every connection (and you have a nice long secure passphrase on your key).
 
This page describes a method for getting your shell to maintain two agents: your primary agent and your Toolforge/CloudVPS agent. When you connect to Toolforge/CloudVPS you connect to the existing Toolforge/CloudVPS agent (or create one if it doesn't exist), and the rest of the time you use your default agent.
 
== Linux solution ==
This requires the use of a Linux distribution using systemd as the init system (all current releases do that, e.g. Debian jessie or Ubuntu 15.10 and later).
 
You can start multiple ssh-agents through systemd user units. A unit along the following lines starts an agent dedicated to Toolforge/CloudVPS; copy it to /usr/lib/systemd/user/ssh-labs.service (and create a similar unit for each other place you want to connect to):
 
[Unit]
Description=SSH authentication agent for Toolforge/CloudVPS
Before=default.target

[Service]
# listen on a fixed socket under the user runtime directory (%t = /run/user/<uid>)
ExecStart=/usr/bin/ssh-agent -D -a %t/ssh-labs.socket

[Install]
WantedBy=default.target
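Then reload the systemd user manager and enable and start the unit (the unit name matches the file above):

systemctl --user daemon-reload
systemctl --user enable ssh-labs.service
systemctl --user start ssh-labs.service
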
Finally, whenever you want to connect to either Toolforge/CloudVPS or production via SSH, you need to point your SSH client to the respective agent socket:
 
If you're using OpenSSH 7.3 (available in Debian unstable since 7 August 2016), this is really simple: you can use the new ''IdentityAgent'' directive, so wherever you configure the IdentityFile, simply add the respective SSH agent socket created by the systemd user units above. Here's an example for configuring access for Toolforge/CloudVPS (the socket path assumes the unit above and UID 1000; check $XDG_RUNTIME_DIR for yours):
 
Host *.wmflabs gerrit.wikimedia.org *.wmflabs.org
    IdentityAgent /run/user/1000/ssh-labs.socket
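You can check that the dedicated agent is running and see which keys it holds by pointing ssh-add at its socket (same assumed path as above):

SSH_AUTH_SOCK=/run/user/1000/ssh-labs.socket ssh-add -l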
 
=== The simplest solution ===
There is an easy answer to this problem, though it's not very flexible. Run two terminals on your workstation. Load a fresh agent in one of them. Always use that one to connect to Toolforge/CloudVPS and the other to connect to other places.
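For example, in the Toolforge/CloudVPS terminal you might start a fresh agent holding only that key (the key path is illustrative):

eval "$(ssh-agent -s)"   # start a new agent just for this terminal
ssh-add ~/.ssh/id_labs   # load only the Toolforge/CloudVPS key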
 
=== A more complex solution ===
:'' The items listed here are entirely untested by current staff, and left over from the past.''
This solution has the advantage of being able to connect to Toolforge/CloudVPS or other hosts indiscriminately from any terminal running on your workstation (or in screen) etc. It protects you against accidentally attempting to authenticate against Toolforge/CloudVPS with the wrong key.
 
==== Setup ====
This solution assumes you are running bash as your local shell. It can probably be adapted for other shells with minimal effort. It involves creating a socket connected to your ssh-agent at a predictable location and using a bash function to change your environment to use the Toolforge/CloudVPS agent when connecting to Toolforge/CloudVPS.
 
This solution is also geared towards running [http://www.gnu.org/software/screen/ screen]. It's a little more complicated than necessary because when you disconnect and then reconnect to a screen session, SSH_AUTH_SOCK has usually changed. We override that with a predictable location so that, as the agent moves around, old screen sessions still have access to the current agent. Add the following to your shell startup file (e.g. ~/.bashrc):
if [ -f ~/.persistent_agent ]; then source ~/.persistent_agent; fi
persistent_agent /tmp/$USER-ssh-agent/valid-agent
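A minimal sketch of what the ~/.persistent_agent helper sourced above can look like, assuming it simply keeps a symlink at the given path pointing at the live agent socket (this body is illustrative, not the original):

# Illustrative sketch of ~/.persistent_agent: keep a symlink at a fixed path
# pointing to the live agent socket, and use that fixed path as SSH_AUTH_SOCK.
function persistent_agent() {
    local link="$1"
    mkdir -p "$(dirname "$link")" && chmod 700 "$(dirname "$link")"
    if [ -S "$SSH_AUTH_SOCK" ] && [ "$SSH_AUTH_SOCK" != "$link" ]; then
        ln -sf "$SSH_AUTH_SOCK" "$link"   # re-point the link at the current real socket
    fi
    export SSH_AUTH_SOCK="$link"
}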
Next we set up a function specifically for connecting to Toolforge/CloudVPS. The isolated agent socket path and the labs_agent helper used below are illustrative; the helper is sketched after the next block.
# ssh into Toolforge/CloudVPS with an isolated agent
function labs() {
    oldagent=$SSH_AUTH_SOCK
    export SSH_AUTH_SOCK=/tmp/$USER-labs-ssh-agent/valid-agent   # isolated agent socket (illustrative path)
    labs_agent                                                   # start the isolated agent and load the key if needed
    ssh "$@"
    export SSH_AUTH_SOCK=$oldagent
}
And one to copy content into Toolforge/CloudVPS (scp into Toolforge/CloudVPS):
# scp into Toolforge/CloudVPS with an isolated agent
function labscp() {
    oldagent=$SSH_AUTH_SOCK
    export SSH_AUTH_SOCK=/tmp/$USER-labs-ssh-agent/valid-agent   # same isolated agent socket as labs()
    labs_agent
    scp "$@"
    export SSH_AUTH_SOCK=$oldagent
}
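A minimal sketch of the labs_agent helper assumed above: it checks whether an agent is already listening on the isolated socket and, if not, starts one there and loads the Toolforge/CloudVPS key (the key path ~/.ssh/id_labs is an assumption):

# start an agent on $SSH_AUTH_SOCK if nothing is listening there yet
function labs_agent() {
    ssh-add -l >/dev/null 2>&1
    if [ $? -eq 2 ]; then                         # 2 means no agent could be contacted on this socket
        echo "triggering new agent"
        mkdir -p "$(dirname "$SSH_AUTH_SOCK")" && chmod 700 "$(dirname "$SSH_AUTH_SOCK")"
        rm -f "$SSH_AUTH_SOCK"                    # clear a stale socket left behind by a dead agent
        eval "$(ssh-agent -a "$SSH_AUTH_SOCK")" >/dev/null
        ssh-add ~/.ssh/id_labs                    # assumed name of the Toolforge/CloudVPS key
    fi
}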
Last, we make sure we clean up our old agents if we completely disconnect from the system; otherwise we'll wind up with the agent running even when we're not connected to Toolforge/CloudVPS. This is a little tricky because we don't want to kill the agent when we close the first connection we made to Toolforge/CloudVPS, but only when we're actually done working. As a proxy for 'done working', I use 'I log out of the last shell I have open on this system'. This is not a great solution because if the connection dies, or I just quit Terminal or something like that instead of specifically logging out, .bash_logout doesn't get run. Add to .bash_logout:
# if this is the last copy of my shell exiting the host and there are any agents running, kill them.
if [ $(w | grep $USER | wc -l) -eq 1 ]; then
    pkill -u "$USER" ssh-agent   # kill any ssh-agents we left running
fi
With all of this in place, your normal shells still see only your personal key:
ben@green:~$ ssh-add -l
2048 25:9e:91:d5:2f:be:73:e8:ff:37:63:ae:83:5b:33:e1 /Users/ben/.ssh/id_rsa (RSA)
The first time (in a given day) you connect to Toolforge/CloudVPS, you are prompted to enter the passphrase for your key, and when you get to bastion, it can only see your Toolforge/CloudVPS key:
ben@green:~$ labs
triggering new agent