ssh remote on cluster

chocosabrinax2017

Hi,

I need some enlightenment about SSH remoting between cluster nodes.
I had 3 nodes before, and every node could SSH into every other node. Then I added 2 more nodes. Now, from one node, I can only SSH into 3 of the other 4 nodes. That's why I'm confused.
Is it possible to SSH into all nodes?

Thank you.

# Sorry, my English is not good.
 
Did you edit, or change in some other way, the .ssh/authorized_keys file on that node?

.ssh/authorized_keys should be a symbolic link:

root@gar-ha-kvm11:~# ll .ssh/authorized_keys
lrwxrwxrwx 1 root root 29 Mar 29 15:12 .ssh/authorized_keys -> /etc/pve/priv/authorized_keys
root@gar-ha-kvm11:~#
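
If the symlink has been replaced by a regular file, a minimal sketch to restore it (assuming the standard Proxmox VE layout shown above; the backup filename is just an example):
Code:
# keep whatever is there now, then recreate the link to the cluster-wide key file
mv /root/.ssh/authorized_keys /root/.ssh/authorized_keys.local-backup
ln -s /etc/pve/priv/authorized_keys /root/.ssh/authorized_keys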
 
Already tried:

ssh-copy-id remote-account (the IP of my remote node)

After that I can SSH into that node, but not into another node.
 
Hi,

we slightly changed the way SSH auth is set up: the node name is now always used as the SSH hostname, to avoid problems when node IPs change or when different networks are used for certain tasks, e.g. a dedicated migration network.

You can always just use:
Code:
ssh-copy-id root@NODE-NAME-OR-IP

to add access for a specific node.

Or use the node names for connecting, not their IP addresses.
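
A related sketch, assuming a reasonably current Proxmox VE: pvecm updatecerts regenerates the node certificates and re-syncs the SSH files the cluster distributes, which can be worth a try after adding nodes:
Code:
# run on an affected node
pvecm updatecerts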
 
Did you edit, or change in some other way, the .ssh/authorized_keys file on that node?

.ssh/authorized_keys should be a symbolic link:

root@gar-ha-kvm11:~# ll .ssh/authorized_keys
lrwxrwxrwx 1 root root 29 Mar 29 15:12 .ssh/authorized_keys -> /etc/pve/priv/authorized_keys
root@gar-ha-kvm11:~#


Not yet.

What do I need to do?
Hi,

we slightly changed the way SSH auth is set up: the node name is now always used as the SSH hostname, to avoid problems when node IPs change or when different networks are used for certain tasks, e.g. a dedicated migration network.

You can always just use:
Code:
ssh-copy-id root@NODE-NAME-OR-IP

to add access for a specific node.

Or use the node names for connecting, not their IP addresses.

More information:
172.81.81.92 theironstone
172.81.81.76 kota

I tried using the server name, but the connection on port 22 was refused.
[attachment: ssh.png]

So, from the server that I could reach by IP, I tried connecting by node name; that connection was also refused, but connecting by IP still works.
[attachment: ssh2.png]

Does this happen because, when I added the nodes to my cluster, I used the IP address instead of the server name?

(# pvecm add 172.81.81.92)
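
Whether a node was joined by IP or by name is recorded in the cluster configuration; a sketch for checking this (assuming the standard Proxmox VE path, where each node's ring0_addr shows whether the cluster knows it by IP or by name):
Code:
grep ring0_addr /etc/pve/corosync.conf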
 
Does 'theironstone' resolve to the correct IP?
The "port 22: Connection refused" error is normally a network problem, SSH listening on another port, or sshd not running (the latter two can be excluded here, as otherwise connecting through the IP would not work...). It does not look like a missing SSH known-hosts entry error to me...
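
A quick sketch to answer the resolution question (getent hosts goes through the same system resolver ssh uses):
Code:
getent hosts theironstone   # should print 172.81.81.92
getent hosts kota           # should print 172.81.81.76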
 
Does 'theironstone' resolve to the correct IP?
The "port 22: Connection refused" error is normally a network problem, SSH listening on another port, or sshd not running (the latter two can be excluded here, as otherwise connecting through the IP would not work...). It does not look like a missing SSH known-hosts entry error to me...

All I know is that every node IP in my cluster is reachable on my network. I can ping all my nodes from any other node, and I could SSH into all nodes back when I had only 3 nodes in the cluster.
[attachment: ssh3.png]

I even tried to fix it with [ ssh-copy-id (IP of node) ]. That fixes access to that node, but another node in my cluster still can't connect.
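
If the keys really have to be pushed by hand, a brute-force sketch, run once on every node (only theironstone and kota are named in this thread; the remaining node names are placeholders):
Code:
# push this node's key to every other cluster node
for node in theironstone kota node3 node4 node5; do
    ssh-copy-id "root@$node"
done

Note that in a stock Proxmox VE cluster /root/.ssh/authorized_keys is a symlink into /etc/pve/priv (see above), so a key added on one node should be visible cluster-wide; if it is not, the symlink is the first thing to re-check.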
 
