High memory usage with ZFS

ca_maer

Hello,

I'm having an issue with RAM usage on our hypervisors: memory is going somewhere I can't account for.
The hypervisor has 70 GB of usable RAM in total. The current ARC usage is 9 GB according to arc_summary, and the total RAM used by all LXC containers on the host came out to 15 GB.
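
For reference, one way to get that per-container total is to read each container's memory cgroup directly. A minimal sketch, assuming the cgroup-v1 layout this kernel uses (the /sys/fs/cgroup/memory/lxc path may differ on other setups):

Code:
#!/bin/bash
# Sum the memory currently charged to each LXC container's memory cgroup.
# The cgroup-v1 path is an assumption for PVE 6.x / kernel 5.3.
total=0
for f in /sys/fs/cgroup/memory/lxc/*/memory.usage_in_bytes; do
    [ -r "$f" ] || continue
    bytes=$(cat "$f")
    printf '%-60s %6d MB\n' "$f" $((bytes / 1024 / 1024))
    total=$((total + bytes))
done
echo "Total: $((total / 1024 / 1024)) MB"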

Here's the view of the hypervisor's RAM in the web UI:
[Screenshot: hypervisor memory usage in the web UI, 2020-05-07]

So the total usage should be closer to 24 GB (15 + 9), yet free below shows 62 GB used (total - free - buff/cache = 70 - 5 - 3). Where are the other 38 GB?

Code:
              total        used        free      shared  buff/cache   available
Mem:             70          62           5           0           3           7
Swap:             0           0           0

The ZFS ARC is already limited to a minimum of 10% and a maximum of 30% of the server's total memory.
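
The bounds themselves are set through the zfs module options; a minimal sketch of that config (the byte values are just examples computed for ~70 GB of RAM):

Code:
# /etc/modprobe.d/zfs.conf -- example values, 10% / 30% of 70 GiB
options zfs zfs_arc_min=7516192768    # 7 GiB
options zfs zfs_arc_max=22548578304   # 21 GiB
# Takes effect after "update-initramfs -u" and a reboot; for a live change,
# echo the byte value into /sys/module/zfs/parameters/zfs_arc_max.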

I've read around a bit, and some people mention that the slab allocator might be taking it all for ZFS, but even with the slub_nomerge kernel parameter the problem persists.
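
In case it helps anyone check the same thing, slabtop (from procps) can show which slab caches are actually holding the memory; a one-shot look might go like this (flags as I understand them: -o prints once, -s c sorts by cache size):

Code:
# Largest slab caches, one-shot, sorted by total cache size (needs root)
slabtop -o -s c | head -n 20
# Raw counters for the ZFS-related caches
grep -E 'zio|dnode|arc|dmu' /proc/slabinfo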

Here's some more info:

Code:
  pool: rpool
 state: ONLINE
  scan: scrub repaired 0B in 0 days 00:37:05 with 0 errors on Sun Apr 12 01:01:06 2020
config:

    NAME                              STATE     READ WRITE CKSUM
    rpool                             ONLINE       0     0     0
      mirror-0                        ONLINE       0     0     0
        wwn-0x5000cca05935b164-part2  ONLINE       0     0     0
        wwn-0x5000cca05934a01c-part2  ONLINE       0     0     0
      mirror-1                        ONLINE       0     0     0
        wwn-0x5000cca05936d108-part2  ONLINE       0     0     0
        wwn-0x5000cca059346164-part2  ONLINE       0     0     0
    logs
      mirror-2                        ONLINE       0     0     0
        wwn-0x5002538050002f4a-part2  ONLINE       0     0     0
        wwn-0x5002538050002e22-part2  ONLINE       0     0     0
    cache
      wwn-0x5002538050002f4a-part3    ONLINE       0     0     0
      wwn-0x5002538050002e22-part3    ONLINE       0     0     0

errors: No known data errors

Code:
proxmox-ve: 6.1-2 (running kernel: 5.3.13-1-pve)
pve-manager: 6.1-7 (running version: 6.1-7/13e58d5e)
pve-kernel-5.3: 6.1-3
pve-kernel-helper: 6.1-3
pve-kernel-4.15: 5.4-12
pve-kernel-5.3.13-3-pve: 5.3.13-3
pve-kernel-5.3.13-1-pve: 5.3.13-1
pve-kernel-4.13: 5.2-2
pve-kernel-4.15.18-24-pve: 4.15.18-52
pve-kernel-4.15.18-21-pve: 4.15.18-48
pve-kernel-4.15.18-10-pve: 4.15.18-32
pve-kernel-4.15.18-9-pve: 4.15.18-30
pve-kernel-4.15.18-8-pve: 4.15.18-28
pve-kernel-4.15.18-7-pve: 4.15.18-27
pve-kernel-4.15.18-5-pve: 4.15.18-24
pve-kernel-4.15.18-4-pve: 4.15.18-23
pve-kernel-4.15.17-2-pve: 4.15.17-10
pve-kernel-4.13.16-4-pve: 4.13.16-51
pve-kernel-4.13.16-3-pve: 4.13.16-50
pve-kernel-4.13.16-2-pve: 4.13.16-48
pve-kernel-4.13.16-1-pve: 4.13.16-46
pve-kernel-4.13.13-6-pve: 4.13.13-42
pve-kernel-4.13.13-5-pve: 4.13.13-38
pve-kernel-4.13.13-4-pve: 4.13.13-35
pve-kernel-4.13.13-2-pve: 4.13.13-33
pve-kernel-4.13.13-1-pve: 4.13.13-31
pve-kernel-4.13.8-3-pve: 4.13.8-30
pve-kernel-4.13.8-2-pve: 4.13.8-28
pve-kernel-4.13.4-1-pve: 4.13.4-26
ceph-fuse: 12.2.11+dfsg1-2.1+b1
corosync: 3.0.3-pve1
criu: 3.11-3
glusterfs-client: 5.5-3
ifupdown: 0.8.35+pve1
ksm-control-daemon: 1.3-1
libjs-extjs: 6.0.1-10
libknet1: 1.14-pve1
libpve-access-control: 6.0-6
libpve-apiclient-perl: 3.0-2
libpve-common-perl: 6.0-11
libpve-guest-common-perl: 3.0-3
libpve-http-server-perl: 3.0-4
libpve-storage-perl: 6.1-4
libqb0: 1.0.5-1
libspice-server1: 0.14.2-4~pve6+1
lvm2: 2.03.02-pve4
lxc-pve: 3.2.1-1
lxcfs: 3.0.3-pve60
novnc-pve: 1.1.0-1
proxmox-mini-journalreader: 1.1-1
proxmox-widget-toolkit: 2.1-3
pve-cluster: 6.1-4
pve-container: 3.0-19
pve-docs: 6.1-4
pve-edk2-firmware: 2.20191127-1
pve-firewall: 4.0-10
pve-firmware: 3.0-4
pve-ha-manager: 3.0-8
pve-i18n: 2.0-4
pve-qemu-kvm: 4.1.1-2
pve-xtermjs: 4.3.0-1
pve-zsync: 2.0-2
qemu-server: 6.1-5
smartmontools: 7.1-pve2
spiceterm: 3.1-1
vncterm: 1.6-1
zfsutils-linux: 0.8.3-pve1

Any ideas?
 
Here it is (ps output, sorted by memory usage):

Code:
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root      7199  1.8  9.0 14305300 6715320 ?    Sl   Apr30 220:21 /usr/lib/jvm/java-11-openjdk-amd64/bin/java -Djava.io.tmpdir=/tmp -Djetty.home=/opt/PDFreactor/jetty -Djetty.base=/opt/PDFreactor/jetty -Xmx8192m -Dcom.realobjects.pdfreactor.webservice.conversionTimeout=300 -Dcom.realobjects.interceptConsoleOutput=true -Djavax.ws.rs.ext.RuntimeDelegate=org.apache.cxf.jaxrs.impl.RuntimeDelegateImpl -Duser.timezone=America/New_York -Dorg.apache.cxf.Logger=org.apache.cxf.common.logging.Slf4jLogger -Djava.awt.headless=true -cp /opt/PDFreactor/jetty/lib/ext/core/annotations-api.jar:/opt/PDFreactor/jetty/lib/ext/core/aopalliance-1.0.jar:/opt/PDFreactor/jetty/lib/ext/core/asm-5.2.jar:/opt/PDFreactor/jetty/lib/ext/core/asm-commons-3.3.1.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-core-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-manifest.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-bindings-soap-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-bindings-xml-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-databinding-jaxb-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-frontend-jaxrs-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-frontend-jaxws-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-frontend-simple-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-client-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-extension-providers-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-extension-search-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-cors-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-jose-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-jose-jaxrs-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-oauth2-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-oauth-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-sso-saml-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-security-xml-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-rs-service-description-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-transports-http-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-ws-addr-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-wsdl-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-ws-mex-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/cxf-rt-ws-policy-3.2.6.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-j2ee_1.4_spec-1.0.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-j2ee-management_1.1_spec-1.0.1.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-javamail_1.4_mail-1.8.4.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-jms_1.1_spec-1.1.1.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-jta_1.1_spec-1.1.1.jar:/opt/PDFreactor/jetty/lib/ext/core/geronimo-ws-metadata_2.0_spec-1.1.3.jar:/opt/PDFreactor/jetty/lib/ext/core/jackson-annotations-2.9.6.jar:/opt/PDFreactor/jetty/lib/ext/core/jackson-core-2.9.6.jar:/opt/PDFreactor/jetty/lib/ext/core/jackson-databind-2.9.6.jar:/opt/PDFreactor/jetty/lib/ext/core/jackson-jaxrs-base-2.9.6.jar:/opt/PDFreactor/jetty/lib/ext/core/jackson-jaxrs-json-provider-2.9.6.jar:/opt/PDFreactor/jetty/lib/ext/core/javax.annotation-api-1.3.jar:/opt/PDFreactor/jetty/lib/ext/core/javax.servlet-api-3.1.0.jar:/opt/PDFreactor/jetty/lib/ext/core/javax.ws.rs-api-2.1.jar:/opt/PDFreactor/jetty/lib/ext/core/jaxb-core-2.2.11.jar:/opt/PDFreactor/jetty/lib/ext/core/jaxb-impl-2.2.11.jar:/opt/PDFreactor/jetty/lib/ext/core/jaxb-xjc-2.2.11.jar:/opt/PDFreactor/jetty/lib/ext/core/jaxws-api-2.3.0.jar:/opt/PDFreactor/jetty/lib/ext/core/jcl-over-slf4j-1.7.25.jar:/opt
/PDFreactor/jetty/lib/ext/core/neethi-3.1.1.jar:/opt/PDFreactor/jetty/lib/ext/core/slf4j-api-1.7.25.jar:/opt/PDFreactor/jetty/lib/ext/core/slf4j-nop-1.7.25.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-aop-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-beans-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-context-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-core-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-expression-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/spring-web-4.3.18.RELEASE.jar:/opt/PDFreactor/jetty/lib/ext/core/stax2-api-4.1.jar:/opt/PDFreactor/jetty/lib/ext/core/woodstox-core-5.1.0.jar:/opt/PDFreactor/jetty/lib/ext/core/wsdl4j-1.6.3.jar:/opt/PDFreactor/jetty/lib/ext/core/wstools.jar:/opt/PDFreactor/jetty/lib/ext/core/xmlschema-core-2.2.3.jar:/opt/PDFreactor/jetty/lib/ext/pdfreactor.jar:/opt/PDFreactor/jetty/lib/ext/pdfreactor-webservice.jar:/opt/PDFreactor/jetty/lib/servlet-api-3.1.jar:/opt/PDFreactor/jetty/lib/jetty-schemas-3.1.jar:/opt/PDFreactor/jetty/lib/jetty-http-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-server-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-xml-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-util-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-io-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-security-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-servlet-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-webapp-9.4.11.v20180605.jar:/opt/PDFreactor/jetty/lib/jetty-deploy-9.4.11.v20180605.jar org.eclipse.jetty.xml.XmlConfiguration /tmp/start_1556003847369284563.properties /opt/PDFreactor/jetty/etc/console-capture.xml /opt/PDFreactor/jetty/etc/jetty-threadpool.xml /opt/PDFreactor/jetty/etc/jetty.xml /opt/PDFreactor/jetty/etc/jetty-webapp.xml /opt/PDFreactor/jetty/etc/jetty-deploy.xml /opt/PDFreactor/jetty/etc/jetty-http.xml /opt/PDFreactor/jetty/etc/jetty-requestlog.xml
nagios    5626  0.3  2.7 11111888 2049552 ?    Sl   Feb13 464:38 java -Drundeck.jaaslogin=true -Djava.security.auth.login.config=/etc/rundeck/jaas-loginmodule.conf -Dloginmodule.name=RDpropertyfilelogin -Drdeck.config=/etc/rundeck -Drundeck.server.configDir=/etc/rundeck -Dserver.datastore.path=/var/lib/rundeck/data/rundeck -Drundeck.server.serverDir=/var/lib/rundeck -Drdeck.projects=/var/lib/rundeck/projects -Drdeck.runlogs=/var/lib/rundeck/logs -Drundeck.config.location=/etc/rundeck/rundeck-config.properties -Djava.io.tmpdir=/tmp/rundeck -Drundeck.server.workDir=/tmp/rundeck -Dserver.http.port=4440 -Drdeck.base=/var/lib/rundeck -Xmx1024m -Xms256m -XX:MaxMetaspaceSize=256m -server -Dserver.web.context=/prd -jar /var/lib/rundeck/bootstrap/rundeck-3.2.2-20200204.war --skipinstall
root       581  0.7  1.5 1601804 1130244 ?     Ssl  Jan24 1112:03 /usr/sbin/clamd --foreground=true
Debian-+  4445  0.1  0.9 696548 669344 ?       Ss   Jan15 289:31 /usr/sbin/snmpd -Lsd -Lf /dev/null -u Debian-snmp -g Debian-snmp -I -smux mteTrigger mteTriggerConf -f -p /run/snmpd.pid
ntp       7789  0.0  0.6 4342008 475668 ?      Sl   Feb20  73:30 /usr/sbin/mysqld --daemonize --pid-file=/run/mysqld/mysqld.pid
Debian-+ 10503  0.1  0.5 3385040 380268 ?      Ssl  Jan15 173:18 /usr/sbin/mysqld
munin     6196  2.1  0.3 872576 252360 ?       Sl   Jan15 3505:12 /usr/sbin/mysqld --basedir=/usr --datadir=/var/lib/mysql --plugin-dir=/usr/lib/mysql/plugin --user=mysql --skip-log-error --pid-file=/var/run/mysqld/mysqld.pid --socket=/var/run/mysqld/mysqld.sock --port=3306
ntp       4143  0.0  0.3 1747020 229524 ?      Sl   Jan15 101:07 /usr/sbin/mysqld --daemonize --pid-file=/run/mysqld/mysqld.pid
root      7172  0.1  0.2 22417580 183036 ?     Sl   Apr30  12:02 /usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dinstall4j.jvmDir=/usr/lib/jvm/java-11-openjdk-amd64 -Dexe4j.moduleName=/opt/PDFreactor/bin/pdfreactorwebservice -Dinstall4j.launcherId=135 -Dinstall4j.swt=false -Di4jv=0 -Di4jv=0 -Di4jv=0 -Di4jv=0 -Di4jv=0 -Di4j.vpt=true -classpath /opt/PDFreactor/.install4j/i4jruntime.jar:/opt/PDFreactor/jetty/start.jar com.install4j.runtime.launcher.UnixLauncher start 10eb7511 0 0 org.eclipse.jetty.start.Main
root     27150  2.3  0.2 572704 177064 ?       SLsl Mar05 2145:51 /usr/sbin/corosync -f
Debian-+ 11341  0.0  0.2 1282988 164008 ?      Ssl  Jan15  97:40 /usr/sbin/mysqld
nagios   11425  0.1  0.2 920608 158052 ?       Ssl  Jan15 175:36 /usr/local/nagios/bin/nagios -d /usr/local/nagios/etc/nagios.cfg
systemd+ 14373  0.2  0.1 2429680 144192 ?      Ssl  May04  15:41 /opt/mattermost/bin/mattermost
Debian-+   336  0.0  0.1 291924 136780 ?       Ss   Jan15   4:24 postgres: checkpointer process
www-data  6981  0.0  0.1 349664 125628 ?       Ss   Jan15   3:17 pveproxy
www-data 22659  0.0  0.1 357892 124400 ?       S    00:00   0:04 pveproxy worker
www-data 22658  0.0  0.1 357880 124308 ?       S    00:00   0:03 pveproxy worker
www-data 22657  0.0  0.1 357752 124208 ?       S    00:00   0:03 pveproxy worker
root      4735  0.0  0.1 348068 123136 ?       Ss   Jan15   2:42 pvedaemon
root     22482  0.0  0.1 356288 122140 ?       S    09:30   0:01 pvedaemon worker
root     22078  0.0  0.1 356288 121952 ?       S    11:05   0:00 pvedaemon worker
root     24672  0.0  0.1 356292 121148 ?       S    05:00   0:04 pvedaemon worker
root      4946  3.1  0.1 693900 111856 pts/2   S+   11:29   0:00 php -t /var/www/html/daemons execDaemons.php
Debian-+  8217  0.0  0.1 292840 104472 ?       Ss   Jan23   0:57 postgres: mmuser mattermost 127.0.0.1(33624) idle
systemd+  7861  0.0  0.1 1271584 93636 ?       Ssl  Jan23  18:52 node node_modules/.bin/coffee node_modules/.bin/hubot --name Hubot --adapter matteruser
root      6637  0.0  0.1 330088 92572 ?        Ss   Jan15  17:48 pve-ha-crm
root      7054  0.0  0.1 329708 92132 ?        Ss   Jan15  35:15 pve-ha-lrm
root     30284  0.2  0.1 291780 86432 ?        Ss   Jan15 385:19 pve-firewall
root      4725  2.8  0.1 288608 85952 ?        Ss   11:29   0:00 /usr/bin/perl -T /usr/bin/pvesr run --mail 1
root     30603  2.4  0.1 290312 85648 ?        Ss   Jan15 4005:09 pvestatd
root     29997  0.0  0.1 115848 80912 ?        Ss   Mar05  41:54 /lib/systemd/systemd-journald
 
According to those figures, the container summary could be right.

Have you configured hugepages?
I have not, so it's still at the default. Should I be tweaking those?

Here's what I have in /proc/meminfo:
Code:
AnonHugePages:    120832 kB
ShmemHugePages:        0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
Hugetlb:               0 kB
 
So that's another 120 MB, but that's also not a lot. No, nothing to tweak, but this RAM usage is also not shown by ps.

While we're on /proc/meminfo, could you paste it as a whole?
 
Thanks for the help.

Here it is:

Code:
MemTotal:           70.7707     GB
MemFree:            3.98357     GB
MemAvailable:       6.47202     GB
Buffers:            0           GB
Cached:             2.46402     GB
SwapCached:         0           GB
Active:             17.8018     GB
Inactive:           0.937256    GB
Active(anon):       16.4149     GB
Inactive(anon):     0.278992    GB
Active(file):       1.38698     GB
Inactive(file):     0.658264    GB
Unevictable:        0.153385    GB
Mlocked:            0.153385    GB
SwapTotal:          0           GB
SwapFree:           0           GB
Dirty:              0.00187683  GB
Writeback:          0           GB
AnonPages:          16.4287     GB
Mapped:             0.8395      GB
Shmem:              0.48365     GB
KReclaimable:       1.25263     GB
Slab:               21.1368     GB
SReclaimable:       1.25263     GB
SUnreclaim:         19.8842     GB
KernelStack:        0.0310478   GB
PageTables:         0.223083    GB
NFS_Unstable:       0           GB
Bounce:             0           GB
WritebackTmp:       0           GB
CommitLimit:        35.3854     GB
Committed_AS:       31.1762     GB
VmallocTotal:       32768       GB
VmallocUsed:        1.34884     GB
VmallocChunk:       0           GB
Percpu:             19.5018     GB
HardwareCorrupted:  0           GB
AnonHugePages:      1.44727     GB
ShmemHugePages:     0           GB
ShmemPmdMapped:     0           GB
CmaTotal:           0           GB
CmaFree:            0           GB
HugePages_Total:    0
HugePages_Free:     0
HugePages_Rsvd:     0
HugePages_Surp:     0
Hugepagesize:       0.00195312  GB
Hugetlb:            0           GB
DirectMap4k:        52.9064     GB
DirectMap2M:        19.084      GB
 
Where do the GB values come from? I've never seen this output format; the default has been kB for decades.

Slab is indeed very high (21 GB, nearly all of it SUnreclaim); I normally see values around 1-3 GB. Could you please post the contents of /proc/buddyinfo? It lists the free blocks per allocation order, so we can see how fragmented memory is.
 
I formatted it using awk because I thought it would be easier to read:

Code:
awk '$3=="kB"{$2=$2/1024^2;$3="GB";} 1' /proc/meminfo | column -t

Here's /proc/buddyinfo:

Code:
Node 0, zone      DMA      0      0      0      1      2      1      1      0      1      1      3
Node 0, zone    DMA32   1792    516    466   1595   1976   1088    608    278    161    286    211
Node 0, zone   Normal  37463  23576  11998   6817   1085    223     41     11      1      0      0
Node 1, zone   Normal  43555  34648  12332   5444   3926   1971    936    303    106     83    206
 
Another shot: please install memstat and post the output of memstat -w (it may be long); that will give us a complete overview.
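
It's in the standard Debian repositories, so installing and running it should just be:

Code:
apt install memstat
memstat -w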
 
Here's df -h from the host:

Code:
Filesystem                       Size  Used Avail Use% Mounted on
udev                              36G     0   36G   0% /dev
tmpfs                            7.1G   17M  7.1G   1% /run
rpool/ROOT/pve-1                 811G  7.3G  803G   1% /
tmpfs                             36G   57M   36G   1% /dev/shm
tmpfs                            5.0M     0  5.0M   0% /run/lock
tmpfs                             36G     0   36G   0% /sys/fs/cgroup
rpool                            803G  128K  803G   1% /rpool
rpool/ROOT                       803G  128K  803G   1% /rpool/ROOT
rpool/data                       803G  256K  803G   1% /rpool/data
rpool/data/subvol-114-disk-0      10G  2.4G  7.7G  24% /rpool/data/subvol-114-disk-0
rpool/data/subvol-136-disk-0      10G  2.2G  7.9G  22% /rpool/data/subvol-136-disk-0
rpool/data/subvol-124-disk-0      25G   11G   15G  41% /rpool/data/subvol-124-disk-0
rpool/data/subvol-101-disk-0      15G  5.1G   10G  34% /rpool/data/subvol-101-disk-0
rpool/data/subvol-100-disk-1      20G  4.8G   16G  24% /rpool/data/subvol-100-disk-1
rpool/data/subvol-129-disk-1     350G  222G  129G  64% /rpool/data/subvol-129-disk-1
rpool/data/subvol-123-disk-1      10G  1.4G  8.7G  14% /rpool/data/subvol-123-disk-1
rpool/data/subvol-131-disk-1      15G  4.9G   11G  33% /rpool/data/subvol-131-disk-1
rpool/data/subvol-126-disk-1      10G  1.9G  8.2G  19% /rpool/data/subvol-126-disk-1
rpool/data/subvol-128-disk-0      25G  3.4G   22G  14% /rpool/data/subvol-128-disk-0
rpool/data/subvol-127-disk-1     8.0G  1.1G  7.0G  14% /rpool/data/subvol-127-disk-1
rpool/data/subvol-137-disk-0      10G  2.1G  8.0G  21% /rpool/data/subvol-137-disk-0
rpool/data/subvol-122-disk-0     4.0G  1.4G  2.7G  35% /rpool/data/subvol-122-disk-0
prd-nas-1:/mnt/Pool-MTL/Backups  9.9T  1.5T  8.5T  15% /mnt/prd-nas-1
prd-nas-1:/mnt/Pool-MTL/Proxmox  8.6T  131G  8.5T   2% /mnt/pve/prd-nas-1
tmpfs                            7.1G     0  7.1G   0% /run/user/0
rpool/data/subvol-130-disk-0      15G  1.6G   14G  11% /rpool/data/subvol-130-disk-0
/dev/fuse                         30M  132K   30M   1% /etc/pve
tmpfs                            7.1G     0  7.1G   0% /run/user/1001
 
