Search results

  1. TwiX

    Ceph OSD issue

    I still also have a tmpfs for the deleted OSD:

      root@dc-prox-11:~# df -h | grep tmpfs
      tmpfs      13G   11M   13G   1% /run
      tmpfs      63G   63M   63G   1% /dev/shm
      tmpfs     5.0M     0  5.0M   0% /run/lock
      ...
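
    A minimal sketch of cleaning up such a leftover mount, assuming the OSD has already been destroyed and that its tmpfs sits under /var/lib/ceph/osd (the OSD id below is a placeholder, not taken from the thread):

      # list any OSD tmpfs that is still mounted
      mount | grep /var/lib/ceph/osd
      # unmount the stale tmpfs and keep the unit from starting again on boot
      umount /var/lib/ceph/osd/ceph-<id>
      systemctl disable ceph-osd@<id>.service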
  2. TwiX

    Ceph OSD issue

    Hi, I'm upgrading a 5-node cluster from pve 6.2.6 to 6.2.11. Everything was OK until I rebooted the third one. 2 OSDs didn't start, with the message: osd 8 and 10 unable to obtain rotating service keys; retrying. I rebooted again and then osd 8 and 10 are OK, but osd 11 and 9 are KO with the same messages...
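
    A hedged sketch of the first checks such a symptom usually calls for; the OSD ids come from the post, but the commands are generic and none of them is confirmed as the fix in the thread (the "rotating service keys" error is often attributed to clock skew between the OSDs and the monitors):

      # did the OSD daemons come back up after the reboot?
      systemctl status ceph-osd@8.service ceph-osd@10.service
      # overall cluster / monitor health
      ceph -s
      # verify time synchronisation on the rebooted node
      timedatectl status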
  3. TwiX

    CEPH : SSD wearout

    Hi, you're right, this is Zabbix :) I get the wearout values via SNMP through the Dell iDRAC. I created an item under the disk discovery section. It works for iDRAC 8 or newer. The related OID is 1.3.6.1.4.1.674.10892.5.5.1.20.130.4.1.49.{#SNMPINDEX}; for HDDs the returned value is 255...
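
    A hedged sketch of querying that OID by hand (net-snmp) before wiring it into Zabbix; the community string, iDRAC address and index are placeholders, not values from the thread ({#SNMPINDEX} is normally filled in by Zabbix discovery):

      # read the wearout value for one disk; index 1 is assumed here
      snmpget -v2c -c public idrac.example.com \
          1.3.6.1.4.1.674.10892.5.5.1.20.130.4.1.49.1
      # HDDs return 255 on this OID, SSDs return the remaining endurance in percent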
  4. TwiX

    CEPH : SSD wearout

    :p Thanks for these URLs. So with still more than 80% remaining, I guess I have 2 or 3 years before replacing them. It decreases by about 1% every 2 months.
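
    A trivial sanity check of that estimate, assuming the rate of 1 point of wearout per 2 months stays constant (the 3-year horizon is just the poster's guess):

      # 3 years = 36 months => 18 points consumed, leaving roughly 62% from 80%
      echo $(( 80 - 36 / 2 ))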
  5. TwiX

    CEPH : SSD wearout

    It doesn't matter ;) I know what to check. But before buying lots of SSDs to replace them, I want to know the wearout value at which a drive must be replaced.
  6. TwiX

    CEPH : SSD wearout

    Hi, unfortunately, the right values are provided by another attribute for me. Cf. the values provided by the Dell iDRAC => 90% remaining. Here is the result of 'smartctl -a /dev/sda':

      smartctl -a /dev/sda
      smartctl 6.6 2016-05-31 r4324 [x86_64-linux-4.15.18-9-pve] (local build)
      Copyright (C) 2002-16...
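
    A hedged sketch of narrowing that output down to the wear-related attributes so they can be compared with what the GUI reports; the attribute names are drive-dependent and the grep pattern is an assumption:

      # -A prints only the vendor attribute table
      smartctl -A /dev/sda | grep -iE 'wear|media|endurance'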
  7. TwiX

    CEPH : SSD wearout

    The right wearout values are also confirmed by the Dell iDRAC:
  8. TwiX

    CEPH : SSD wearout

    Here are the S.M.A.R.T. values: 90% remaining for this drive. You can notice that the GUI indicates the wrong wearout.
  9. TwiX

    CEPH : SSD wearout

    Hi, my 'oldest' Proxmox Ceph cluster is based on Samsung SM863a drives. After 3 years, the wearout for some drives is less than "88% remaining". I don't know if these values are still safe. Below what wearout value is it recommended to replace an SSD drive? Thanks!
  10. TwiX

    Cloud-Init for beginners

    Thanks, I tried without the cloud-init drive and without removing the packages. It seems that everything is OK; the OS starts quite fast (as usual).
  11. TwiX

    Cloud-Init for beginners

    Hi, I just provisioned some new VMs based on a debian10 template with cloud-init. It works as expected. So, after the first boot, do I need to keep the cloud-init drive mounted, or can I delete it? Also, should I remove the cloud-init package? Thanks!
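
    A minimal sketch of detaching the cloud-init drive once the VM no longer needs it; the VM id and the ide2 slot are assumptions (the slot depends on how the drive was attached), and the thread itself does not settle whether keeping it is required:

      # detach the cloud-init disk from VM 100
      qm set 100 --delete ide2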
  12. TwiX

    Ceph and a Datacenter failure

    Thanks. Did you solve your issue? Was it related to the Ceph min replicas?
  13. TwiX

    Ceph and a Datacenter failure

    Hi, what is the required bandwidth between these 2 datacenters? :)
  14. TwiX

    Corosync 3 - Kronosnet - link: host: x link: 1 is down

    Hi, some nodes never show this message (for example dc-prox-22, dc-prox-24 and dc-prox-26). For the other nodes, it seems that it never shows up at the same time, as you can see:
  15. TwiX

    Corosync 3 - Kronosnet - link: host: x link: 1 is down

    Hello, I just built 6 new PVE v6 nodes (up to date) with the same hardware. There are 2 links per node (2 LACP bonds on 2 Intel X520 cards):

      bond0: 2x10 Gb (management and production VMs - MTU: 1500)
      bond1: 2x10 Gb (Ceph storage - MTU: 9000)

    bond0 is declared as the primary corosync link (link 0), bond1 as link 1...
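
    A hedged sketch of how two kronosnet links are usually declared in /etc/corosync/corosync.conf for such a setup; the cluster name and addresses are placeholders, not copied from the poster's configuration, and only one node block is shown:

      totem {
        version: 2
        cluster_name: dc-prox
        transport: knet
      }

      nodelist {
        node {
          name: dc-prox-22
          nodeid: 1
          ring0_addr: 10.0.0.22    # address on bond0 (link 0)
          ring1_addr: 10.10.0.22   # address on bond1 (link 1)
        }
        # ... one node {} block per cluster node
        # (quorum and logging sections omitted)
      }

    With corosync 3, 'corosync-cfgtool -s' then shows the per-node status of each of the two links, which helps correlate the "link: 1 is down" messages with a specific interface.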
  16. TwiX

    Alternative to Samsung 863a for Ceph

    Hi, take a look at the Intel DC S4610. They perform as well as the SM863a ones.
  17. TwiX

    Ceph nautilus - Raid 0

    Hi, first things first, I know it is not recommended to run Ceph on top of RAID 0 disks; however, that's what I did on 4 Dell R430 servers with a Perc 730 (with 6 15k SAS drives). I have pretty decent performance with it and absolutely no issues over the last 2 years. With full-SSD nodes I don't use...
  18. TwiX

    max cluster nodes with pve6?

    Hello, and about the bandwidth for a 36-node cluster, what kind of traffic (Mbps) should we expect?
  19. TwiX

    Corosync 3

    Thanks a lot :)
  20. TwiX

    Corosync 3

    Hi, Corosync 3 doesn't use multicast anymore; it uses unicast. OK, so I guess the cluster traffic should grow a lot for clusters involving more than 3 nodes? Thanks in advance, Antoine
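
    A small hedged check of that claim on a given node: kronosnet, the default transport in corosync 3, runs over unicast UDP, and the transport actually in use can be read from the configuration (the file path is the standard one, not quoted in the thread):

      # an empty result means the default transport (knet) is in effect
      grep transport /etc/corosync/corosync.conf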