[SOLVED] Can't migrate from node.

Esben Viborg

Member
Oct 12, 2017
Hi

I have a cluster with 7 nodes.
Almost everything works as it should; however, I'm having issues migrating from one of my nodes.

When I try migrating, I get this error:
Code:
2018-05-17 15:15:53 # /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=xxxxx' root@10.0.0.50 /bin/true
2018-05-17 15:15:53 Permission denied (publickey,password).
2018-05-17 15:15:53 ERROR: migration aborted (duration 00:00:01): Can't connect to destination address using public key
TASK ERROR: migration aborted

When I try to SSH from the troublesome node, I'm prompted for the password of the node I'm trying to SSH to.
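
For what it's worth, the exact check the migration runs can be reproduced by hand with verbose output, which shows which keys are offered and why the destination rejects them (xxxxx and 10.0.0.50 are the placeholders from the log above):
Code:
# Reproduce the non-interactive check from the migration log, with debug output.
ssh -vvv -e none -o 'BatchMode=yes' -o 'HostKeyAlias=xxxxx' root@10.0.0.50 /bin/true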

I can migrate from all other nodes without any issues.

Any ideas?
 
This problem is solved very simply.
Run the following on the node from which you migrate the LXC container / VM:

Code:
ssh -o 'HostKeyAlias=xxxxx' root@10.0.0.50
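
To confirm the fix took, you can re-run the non-interactive check that the migration performs (taken from the task log above, same placeholders); it should exit cleanly without prompting:
Code:
# Re-run the batch-mode check Proxmox runs during migration; prints OK on success.
ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=xxxxx' root@10.0.0.50 /bin/true && echo OK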

I hope this helps you get rid of the headache. :)

 
Unfortunately, that didn't do the trick. I'm still not able to migrate from the node, and I'm still prompted for a password whenever I try to SSH from the affected node to the other nodes.
 
Do all nodes have the same version of pve-manager? Please check. In the past, I had problems with container migration when using nodes with different pve-manager versions.
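
A quick way to compare versions (pveversion ships with Proxmox VE; run it on every node):
Code:
# Print the installed Proxmox VE / pve-manager version on this node.
pveversion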
 
Then verify that the authorized_keys file on the cluster nodes is correct, and that the connecting node has an SSH keypair that is allowed to connect to the rest of the cluster. You should not be prompted for a password.
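
A rough way to check this, assuming root's keypair is at the default path /root/.ssh/id_rsa (substitute the key printed by the first command for the placeholder in the grep):
Code:
# On the source (problem) node: print the public half of root's keypair.
ssh-keygen -y -f /root/.ssh/id_rsa
# On the destination node: check that the key printed above is present in
# root's authorized_keys (on PVE this is normally a symlink into /etc/pve/priv/).
grep 'AAAA...' /root/.ssh/authorized_keys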
 
I updated all my nodes to 5.2 and restarted them. After the restart, the problem was gone.

Thanks for all your inputs :)
 
For anyone who has this issue in 2022: I fixed it by copying /root/.ssh/known_hosts from a host that had no SSH issues to the host with issues.
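
For anyone wanting to reproduce that, a minimal sketch; problem-node is a placeholder hostname, and it may be worth backing up the existing file on the target first:
Code:
# On the working host: copy root's known_hosts over to the problem host.
scp /root/.ssh/known_hosts root@problem-node:/root/.ssh/known_hosts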
 
