Duplicate /etc/logrotate.d/ceph-common files

Upgrading to PVE 5.1 and Ceph Luminous results in duplicate Ceph logrotate configuration files:

/etc/logrotate.d/ceph-common:
Code:
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mgr ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}

/etc/logrotate.d/ceph-common.logrotate:
Code:
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}


The 'ceph-common.logrotate' file is newer, so I assume it was provided as part of Ceph Luminous:
Code:
[root@kvm5b ~]# ls -l /etc/logrotate.d | grep ceph
-rw-r--r-- 1 root root 237 Sep 26 19:27 ceph-common
-rw-r--r-- 1 root root 228 Oct  4 17:18 ceph-common.logrotate


The older file, however, includes ceph-mgr in the list of daemons to signal, whereas ceph-common.logrotate doesn't. Was it correct for me to have added 'ceph-mgr' to the ceph-common.logrotate file and to subsequently have deleted the older one (ceph-common)?
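For reference, the edit asked about above could look something like this. This is only a sketch (the backup path and the sed expression are my own, not from the post), and which of the two files should ultimately be kept is discussed in the replies below:
Code:
# Back up the file before editing it
cp /etc/logrotate.d/ceph-common.logrotate /root/ceph-common.logrotate.bak

# Add ceph-mgr to the list of daemons signalled after rotation
sed -i 's/killall -q -1 ceph-mon ceph-mds/killall -q -1 ceph-mon ceph-mgr ceph-mds/' \
    /etc/logrotate.d/ceph-common.logrotate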

Herewith the error messages emailed to us by CRON each day:
Code:
/etc/cron.daily/logrotate:
error: ceph-common.logrotate:1 duplicate log entry for /var/log/ceph/ceph.audit.log
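The 'duplicate log entry' error simply means that more than one snippet under /etc/logrotate.d matches the same path. A quick way to see which snippets are involved, and to reproduce the complaint without rotating anything (my addition, assuming the standard logrotate layout):
Code:
# List every logrotate snippet that references the Ceph log directory
grep -l '/var/log/ceph' /etc/logrotate.d/*

# Dry run: logrotate reports the duplicate entries without touching any logs
logrotate -d /etc/logrotate.conf 2>&1 | grep -i duplicate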
 
I don't know where you got "ceph-common.logrotate" from - but our packages only contain "ceph-common" with
Code:
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mgr ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}
 
We have 3 Proxmox 4.4 clusters which were recently upgraded to PVE 5.1 with Ceph Luminous. Each node had both files, definitely not created by us...

Seems like that is a bug in Ceph's Jewel (10.2.10) package (again). Filed upstream, let's see whether there's an easy fix.
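One way to see which of the two files is actually shipped by a package and which one is a stray leftover, assuming the usual Debian/dpkg tooling on Proxmox (my addition, not part of the reply above):
Code:
# A file installed by a package shows up in dpkg's database; a stray copy does not
dpkg -S /etc/logrotate.d/ceph-common
dpkg -S /etc/logrotate.d/ceph-common.logrotate

# Configuration files registered for the ceph-common package
dpkg-query -W -f='${Conffiles}\n' ceph-common | grep logrotate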
 
I have these two files and not what fabian says Proxmox creates. Which one should I keep then? Their contents are correct and identical. pve-manager/4.4-18/ef2610e8 (running kernel: 4.4.83-1-pve)

Code:
s1:~# ls -l /etc/logrotate.d | grep ceph
-rw-r--r-- 1 root root 228 Oct  4 17:18 ceph-common.logrotate
-rw-r--r-- 1 root root 228 Apr 10  2017 ceph.logrotate
s1:~# cat /etc/logrotate.d/
apt       ceph-common.logrotate  criu-service  glusterfs-common  pve-firewall
aptitude  ceph.logrotate         dpkg          pve               rsyslog
root@s1:~# cat /etc/logrotate.d/ceph.logrotate
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}
s1:~# cat /etc/logrotate.d/ceph-common.logrotate.logrotate
cat: /etc/logrotate.d/ceph-common.logrotate.logrotate: No such file or directory
root@s1:~# cat /etc/logrotate.d/ceph-common.logrotate
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}
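To confirm the two files really are byte-for-byte identical, a quick comparison (my addition, not part of the original output):
Code:
# No output and exit status 0 means the files match exactly
diff /etc/logrotate.d/ceph.logrotate /etc/logrotate.d/ceph-common.logrotate && echo identical
md5sum /etc/logrotate.d/ceph.logrotate /etc/logrotate.d/ceph-common.logrotate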
 
Ah, on the latest added node (which was added using Proxmox 4.4), I only have...

Code:
sm1:~# cat /etc/logrotate.d/ceph-common.logrotate
/var/log/ceph/*.log {
    rotate 7
    daily
    compress
    sharedscripts
    postrotate
        killall -q -1 ceph-mon ceph-mds ceph-osd ceph-fuse radosgw || true
    endscript
    missingok
    notifempty
    su root ceph
}


So /etc/logrotate.d/ceph-common.logrotate is the correct one.
 
Keep the ceph-common.logrotate one for now (if you are on Jewel); it should be cleared up when upgrading to Luminous once a bug fix has been released (http://tracker.ceph.com/issues/22273).
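After the upgrade to Luminous (where, as noted above, the packaged file is ceph-common), something along these lines can check whether the leftover copy is still around and remove it by hand if the fixed package has not cleaned it up yet. This is illustrative only, not part of the reply above:
Code:
# Show which of the two snippet names exist on this node
ls -l /etc/logrotate.d/ceph-common*

# Remove the stray Jewel-era copy only if the packaged ceph-common file is present
if [ -f /etc/logrotate.d/ceph-common ] && [ -f /etc/logrotate.d/ceph-common.logrotate ]; then
    rm /etc/logrotate.d/ceph-common.logrotate
fi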
 
Any update on this? It's been over a year, I'm fairly current on Luminous (having upgraded from Jewel) and still get these errors daily...
 

post your:

> ceph versions
 