[SOLVED] odd issue with pve-zsync disks

killmasta93

Hi,
I was wondering if someone else has had this issue before. I'm currently running pve-zsync every hour; the VM has 3 disks and the job should keep 28 snapshots, but when I check the snapshots on the target, disk-2 only shows one snapshot.

Thank you

Code:
0 * * * * root pve-zsync sync --source 197 --dest 192.168.7.150:rpool/data --name bakproxy --maxsnap 28 --method ssh --source-user root --dest-user root

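For reference, the same job could also be registered with pve-zsync's own create subcommand, which writes the cron entry itself under /etc/cron.d/pve-zsync (a minimal sketch mirroring the crontab line above; check pve-zsync's man page for the full option set):

Code:
pve-zsync create --source 197 --dest 192.168.7.150:rpool/data --name bakproxy --maxsnap 28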

Code:
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_15:07:51      69.8M      -     21.0G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_16:05:16      5.99M      -     21.0G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_17:02:42      6.72M      -     21.0G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_18:02:53      6.67M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_19:01:13      7.33M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_20:00:01      6.88M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_21:05:22      7.05M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_22:02:11      6.88M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-07_23:01:36      6.22M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_00:12:40      8.34M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_02:21:02      6.75M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_03:16:06      5.83M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_04:06:27      5.90M      -     21.1G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_05:10:23      6.86M      -     21.2G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_06:14:07      6.38M      -     21.3G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_07:10:29      6.62M      -     21.3G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_08:01:40      6.02M      -     21.3G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_09:00:19      6.00M      -     21.4G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_10:06:42      6.18M      -     21.4G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_11:06:51      6.90M      -     21.4G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_12:04:10      7.35M      -     21.4G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_13:02:09      6.32M      -     21.5G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_14:06:30      7.07M      -     21.5G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_15:05:25      6.62M      -     21.5G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_16:04:14      5.73M      -     21.6G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_17:00:02      6.40M      -     21.6G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_18:03:59      7.96M      -     21.6G  -
rpool/data/vm-197-disk-0@rep_bakproxy_2022-05-08_19:16:54      6.26M      -     21.6G  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_15:07:51      3.12M      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_16:05:16       760K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_17:02:42       768K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_18:02:53       768K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_19:01:13       816K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_20:00:01       824K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_21:05:22       808K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_22:02:11       820K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-07_23:01:36       828K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_00:12:40       828K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_02:21:02       756K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_03:16:06       760K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_04:06:27       748K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_05:10:23       744K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_06:14:07       752K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_07:10:29       736K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_08:01:40       712K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_09:00:19       708K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_10:06:42       804K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_11:06:51       732K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_12:04:10       752K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_13:02:09       828K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_14:06:30       740K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_15:05:25       748K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_16:04:14       640K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_17:00:02       644K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_18:03:59       828K      -      178M  -
rpool/data/vm-197-disk-1@rep_bakproxy_2022-05-08_19:16:54       728K      -      178M  -
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-08_19:16:54      3.18M      -     3.53G  -
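A quick way to tally the snapshots per disk on the target (a sketch, assuming the replicated datasets live under rpool/data as above):

Code:
# count the bakproxy snapshots of each replicated disk
for d in 0 1 2; do
  printf 'vm-197-disk-%s: ' "$d"
  zfs list -t snapshot -Ho name rpool/data/vm-197-disk-$d | grep -c '@rep_bakproxy_'
done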
 
Hi,
what is your pve-zsync version? Please share the output of qm config 197 on the source node and zpool history rpool | grep vm-197-disk-2 on both source and target.
 
Thanks for the reply, this is the info:

pve-manager/6.2-4/9824574a (running kernel: 5.4.34-1-pve)
 
Please use pveversion -v to see the version of pve-zsync in particular. Likely not related to your issue, but I'd recommend upgrading (at least to 6.4, but all of Proxmox VE 6.x will be end of life soon). Upgrade guide to 7.x is here.

Could you please also share the VM config and the output of
Code:
zfs list -r -t snapshot -Ho name -S creation rpool/data/vm-197-disk-2
 
Thank you so much for the reply,
Code:
proxmox-ve: 6.2-1 (running kernel: 5.4.34-1-pve)
pve-manager: 6.2-4 (running version: 6.2-4/9824574a)
pve-kernel-5.4: 6.2-1
pve-kernel-helper: 6.2-1
pve-kernel-5.4.34-1-pve: 5.4.34-2
ceph-fuse: 12.2.11+dfsg1-2.1+b1
corosync: 3.0.3-pve1
criu: 3.11-3
glusterfs-client: 5.5-3
ifupdown: 0.8.35+pve1
ksm-control-daemon: 1.3-1
libjs-extjs: 6.0.1-10
libknet1: 1.15-pve1
libproxmox-acme-perl: 1.0.3
libpve-access-control: 6.1-1
libpve-apiclient-perl: 3.0-3
libpve-common-perl: 6.1-2
libpve-guest-common-perl: 3.0-10
libpve-http-server-perl: 3.0-5
libpve-storage-perl: 6.1-7
libqb0: 1.0.5-1
libspice-server1: 0.14.2-4~pve6+1
lvm2: 2.03.02-pve4
lxc-pve: 4.0.2-1
lxcfs: 4.0.3-pve2
novnc-pve: 1.1.0-1
proxmox-mini-journalreader: 1.1-1
proxmox-widget-toolkit: 2.2-1
pve-cluster: 6.1-8
pve-container: 3.1-5
pve-docs: 6.2-4
pve-edk2-firmware: 2.20200229-1
pve-firewall: 4.1-2
pve-firmware: 3.1-1
pve-ha-manager: 3.0-9
pve-i18n: 2.1-2
pve-qemu-kvm: 5.0.0-2
pve-xtermjs: 4.3.0-1
pve-zsync: 2.0-3
qemu-server: 6.2-2
smartmontools: 7.1-pve2
spiceterm: 3.1-1
vncterm: 1.6-1
zfsutils-linux: 0.8.3-pve1

Code:
root@prometheus:~# zfs list -r -t snapshot -Ho name -S creation rpool/data/vm-197-disk-2
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-10_23:02:53
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_23:00:53
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_22:05:59
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_21:03:15
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_20:04:19
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_19:07:01
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_18:03:07
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-10_17:04:03

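Note that two jobs, bakproxy and drpproxy, are snapshotting this disk. Had the jobs been registered with pve-zsync create, their configuration and state could be inspected as below; with a hand-written crontab entry like the one at the top of the thread, check the crontab instead.

Code:
pve-zsync list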
Code:
bootdisk: scsi0
cores: 2
description: sci0 gitlab, OS, portrainer,onlyoffice,hiemdall%0Asci1 disk joplin docker%0Asci2 disk openvas
ide2: local:iso/ubuntu-20.04.1-legacy-server-amd64.iso,media=cdrom
memory: 8048
name: Proxy
net0: virtio=02:C8:9A:5B:91:8F,bridge=vmbr0,tag=3
numa: 0
onboot: 1
ostype: l26
scsi0: local-zfs:vm-197-disk-0,discard=on,size=128G
scsi1: local-zfs:vm-197-disk-1,discard=on,size=50G
scsi2: local-zfs:vm-197-disk-2,discard=on,size=80G
scsihw: virtio-scsi-pci
smbios1: uuid=d8d9d11d-d70a-4905-a78b-49c69ba253a9
sockets: 2
vmgenid: b97f43e6-ea2f-46b9-9790-ca1f9df8697e
 
Is the target of the drpproxy job the same as for the bakproxy job? That was not supported before version 2.0-4. But even if it isn't, there were some improvements since version 2.0-3. Please upgrade pve-zsync and see if the issue persists.
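On a standalone node, upgrading just the pve-zsync package is a one-liner via apt (a sketch; assumes the usual Proxmox package repositories are configured):

Code:
apt update
apt install pve-zsync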
 
Thank you so much for the reply, and sorry for the late reply.

The bakproxy job copies to one Proxmox machine and the drpproxy job copies to a different Proxmox machine.

I just updated to pve-zsync 2.2; let's wait and see what happens, and I'll post back.

Thank you
 
Hi @fabian, thank you so much.
I wanted to post back: it seems to be working correctly now. Thank you again.


Code:
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_17:02:31
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_17:00:01
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_16:04:45
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_16:01:05
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_15:06:08
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_15:04:01
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_14:03:48
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_14:02:24
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_13:04:00
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_13:00:46
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_12:09:16
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_12:07:23
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_11:06:47
rpool/data/vm-197-disk-2@rep_drpproxy_2022-05-14_11:03:05
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_10:07:01
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_09:07:07
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_08:03:27
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_07:06:13
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_06:05:57
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_05:04:12
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_04:00:50
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_03:01:07
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_02:00:15
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_01:02:44
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-14_00:05:21
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_23:03:01
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_22:06:04
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_21:07:55
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_20:03:52
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_19:03:07
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_18:07:49
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_17:07:00
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_16:06:15
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_15:06:21
rpool/data/vm-197-disk-2@rep_bakproxy_2022-05-13_14:05:36
 
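For completeness, a quick way to confirm that both jobs are now retaining their snapshots independently (a sketch using standard shell tools):

Code:
# count snapshots per job name on disk-2
zfs list -t snapshot -Ho name rpool/data/vm-197-disk-2 \
  | awk -F@ '{sub(/_[0-9].*/, "", $2); print $2}' | sort | uniq -c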