TrueNAS Storage Plugin

I think I see a few issues.

1. You have the MTU on ens3f0.11 set to 9216. Usually 9216 is what you set the MTU to on the switch, so header information can fit on top of a 9000-byte packet. Is it supposed to be like that? An MTU mismatch between hosts will give you issues.

You can try pinging different interfaces and hosts with the packet size set manually as well. For an MTU of 9000 it's usually safe to ping with a payload of 8972 bytes (8972 + 8 bytes of ICMP header + 20 bytes of IP header = 9000).

Example:

Bash:
# Send 100 pings with a large payload and the Don't Fragment bit set,
# so the test fails instead of silently fragmenting on an MTU mismatch
ping -c 100 -M do -s 8972 10.15.14.172
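You can also confirm what MTU each interface is actually configured with (a quick sketch, using the interface name from your config):

Bash:
# Show the configured MTU per interface
ip -br link show ens3f0.11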

2. Make sure you have the multipath service configured:
https://pve.proxmox.com/wiki/ISCSI_Multipath
Bash:
systemctl enable multipathd
systemctl start multipathd
systemctl status multipathd

Basically, make sure iscsid.conf is configured and restart multipath-tools.service.
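For reference, these are the iscsid.conf settings that guide typically calls out (double-check against the wiki, as defaults change between releases), plus the restart:

Bash:
# In /etc/iscsi/iscsid.conf (per the Proxmox multipath guide):
#   node.startup = automatic
#   node.session.timeo.replacement_timeout = 15
systemctl restart multipath-tools.service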

3. Make sure the Proxmox nodes can see and log in to the portals. The plugin is supposed to do this automatically, but maybe there's an issue.

Bash:
iscsiadm -m discovery -t sendtargets -p 172.16.80.1:3260
iscsiadm -m discovery -t sendtargets -p 172.16.81.1:3260

4. Make sure your portals are set up on TrueNAS; if either of the commands above fails, that might be the issue.
Under Shares > iSCSI > Portals, make sure your portal ID has both interfaces listening.

If they are, attempt to log in manually:

iscsiadm -m node -T YOURTRUENASBASEIQN:YOURTARGET --login

ex: iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:proxmox --login
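Once logged in, you can verify which portals the sessions actually attached to:

Bash:
# Show active sessions and the portal each one used
iscsiadm -m session -P 1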

---

If all of that still isn't working, you can enable debug mode by editing your storage.cfg and adding "debug 2" to the config.

ex:

INI:
    use_multipath 1
    portals 10.20.30.20:3260,10.20.31.20:3260
    debug 2
    force_delete_on_inuse 1
    content images

This will dump a LOT of info into the journalctl logs. If you let it run for about 10 minutes with debug 2 on and then run the diagnostics bundler in the diagnostics menu of the alpha branch's install.sh, that should give me a pretty good idea of what's going on. You can PM me the bundle, as there's some sensitive information in there, but I try to have the installer redact that information.
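If you want to watch it live while you reproduce the problem, something like this works (the grep filter is just a convenience; the plugin's exact log prefix may differ):

Bash:
# Follow the journal and filter for plugin-related lines
journalctl -f | grep -i truenas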

Let me know what you find out.
 
It appears that my Proxmox boxes are targeting the "mgmt" interface.
root@dlk0entpve801:~# ping -c 100 -s 8972 172.16.80.1
PING 172.16.80.1 (172.16.80.1) 8972(9000) bytes of data.
8980 bytes from 172.16.80.1: icmp_seq=1 ttl=64 time=0.284 ms
8980 bytes from 172.16.80.1: icmp_seq=2 ttl=64 time=0.347 ms
8980 bytes from 172.16.80.1: icmp_seq=3 ttl=64 time=0.235 ms
8980 bytes from 172.16.80.1: icmp_seq=4 ttl=64 time=0.298 ms
8980 bytes from 172.16.80.1: icmp_seq=5 ttl=64 time=0.212 ms
^C
--- 172.16.80.1 ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4102ms
rtt min/avg/max/mdev = 0.212/0.275/0.347/0.047 ms
root@dlk0entpve801:~# ping -c 100 -s 8972 172.16.81.1
PING 172.16.81.1 (172.16.81.1) 8972(9000) bytes of data.
8980 bytes from 172.16.81.1: icmp_seq=1 ttl=64 time=0.262 ms
8980 bytes from 172.16.81.1: icmp_seq=2 ttl=64 time=0.158 ms
8980 bytes from 172.16.81.1: icmp_seq=3 ttl=64 time=0.218 ms
^C
--- 172.16.81.1 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2044ms
rtt min/avg/max/mdev = 0.158/0.212/0.262/0.042 ms
root@dlk0entpve801:~# systemctl status multipathd
● multipathd.service - Device-Mapper Multipath Device Controller
Loaded: loaded (/lib/systemd/system/multipathd.service; enabled; preset: e>
Active: active (running) since Tue 2026-01-13 21:42:33 EST; 1 day 12h ago
TriggeredBy: ● multipathd.socket
Process: 523216 ExecStartPre=/sbin/modprobe dm-multipath (code=exited, stat>
Main PID: 523267 (multipathd)
Status: "up"
Tasks: 7
Memory: 36.1M
CPU: 59.629s
CGroup: /system.slice/multipathd.service
└─523267 /sbin/multipathd -d -s

Jan 14 11:13:33 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:13:33 dlk0entpve801 multipathd[523267]: reconfigure: setting up paths>
Jan 14 11:13:34 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:13:35 dlk0entpve801 multipathd[523267]: reconfigure: setting up paths>
Jan 14 11:17:29 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:17:29 dlk0entpve801 multipathd[523267]: reconfigure: setting up paths>
Jan 14 11:17:29 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:17:29 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:17:29 dlk0entpve801 multipathd[523267]: reconfigure all (operator)
Jan 14 11:17:30 dlk0entpve801 multipathd[523267]: reconfigure: setting up paths>
root@dlk0entpve801:~# iscsiadm -m discovery -t sendtargets -p 172.16.80.1:3260
172.16.80.1:3260,1 iqn.2005-10.org.freenas.ctl:vm
10.20.35.12:3260,1 iqn.2005-10.org.freenas.ctl:vm
172.16.81.1:3260,1 iqn.2005-10.org.freenas.ctl:vm
root@dlk0entpve801:~# iscsiadm -m discovery -t sendtargets -p 172.16.81.1:3260
172.16.81.1:3260,1 iqn.2005-10.org.freenas.ctl:vm
10.20.35.12:3260,1 iqn.2005-10.org.freenas.ctl:vm
172.16.80.1:3260,1 iqn.2005-10.org.freenas.ctl:vm
root@dlk0entpve801:~# iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:dev-stor --login
iscsiadm: No records found
root@dlk0entpve801:~# iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:vm --login
Logging in to [iface: default, target: iqn.2005-10.org.freenas.ctl:vm, portal: 10.20.35.12,3260]
iscsiadm: default: 1 session requested, but 1 already present.
iscsiadm: default: 1 session requested, but 1 already present.
Login to [iface: default, target: iqn.2005-10.org.freenas.ctl:vm, portal: 10.20.35.12,3260] successful.
iscsiadm: Could not log into all portals
 
@warlocksyno
After several hours of AI research, here is our conclusion:
The TrueNAS Python middleware is simply refusing to generate the correct PORTAL_GROUP syntax required for IP isolation.

Since the middleware keeps overwriting your manual fixes, we have to use the "Post-Init" back door to force the kernel into compliance after the middleware finishes its broken routine.


Part 1: The "Immediate" Solution (Post-Init Persistence)​

This bypasses the UI/Middleware bug by injecting the correct configuration directly into the kernel after the service starts.

  1. Navigate to: System Settings -> Advanced -> Init/Shutdown Scripts.
  2. Add a new script:
    • Type: Command
    • Description: Force iSCSI IP Isolation (Override Middleware Bug)
    • Command:
Bash:
# Wait for middleware to finish starting SCST
sleep 10

# Force the kernel to bind the Target to specific IPs only
echo "add_target_attribute iqn.2005-10.org.freenas.ctl:vm allowed_portal 172.16.80.1" > /sys/kernel/scst_tgt/targets/iscsi/mgmt
echo "add_target_attribute iqn.2005-10.org.freenas.ctl:vm allowed_portal 172.16.81.1" > /sys/kernel/scst_tgt/targets/iscsi/mgmt
    • When: Post-Init
  3. Save and Reboot (or run those commands manually now).
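If the writes succeed, the attributes should be readable back out of sysfs. A quick sanity check (this assumes the SCST sysfs layout used in the commands above):

Bash:
# Look for the allowed_portal attribute(s) under the target's sysfs directory
ls /sys/kernel/scst_tgt/targets/iscsi/iqn.2005-10.org.freenas.ctl:vm/
cat /sys/kernel/scst_tgt/targets/iscsi/iqn.2005-10.org.freenas.ctl:vm/allowed_portal*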

Part 2: Technical Report for TrueNAS Support​

Subject: SCST Template Generation Bug: Failure to create PORTAL_GROUP in scst.conf (SCALE 25.04)

Environment:

  • OS: TrueNAS SCALE 25.04 (Fangtooth)
  • Networking: iSCSI on tagged VLAN interfaces (vlan11, vlan21) with Jumbo Frames (MTU 9216).
  • Storage Target: SCST driver.
The Problem: The middlewared template generator is failing to produce a valid PORTAL_GROUP section in /etc/scst.conf despite the Portal (ID 1) being configured in the database with specific IP addresses (172.16.80.1, 172.16.81.1).

Observations:

  1. Middleware Failure: The generated /etc/scst.conf lacks the PORTAL_GROUP header. Instead, it places the IPs as metadata comments in the INITIATOR lines (e.g., INITIATOR iqn...#172.16.80.1).
  2. Kernel State: Because the PORTAL_GROUP is missing, the iscsi-scst driver defaults to a wildcard listener (0.0.0.0:3260), exposing the storage network to the Management and Public subnets.
  3. Validation Loop: Manually adding PORTAL_GROUP to scst.conf causes a service crash on restart, indicating that the middleware is likely not initializing the driver-level portals required for the target-level group to bind.
  4. Interface State: midclt call interface.query confirms the IPs are present and active on the VLAN interfaces. MTU is 9216. No bridge interface is present to conflict with the bind.
Expected Behavior: The middleware should generate a configuration that includes:

Plaintext:
TARGET iqn... {
    PORTAL_GROUP default {
        allowed_portal 172.16.80.1
        allowed_portal 172.16.81.1
        ...
    }
}
Steps to Reproduce:

  1. Create a Portal with specific IPs on VLAN interfaces.
  2. Associate the Portal with a Target and an Initiator Group.
  3. Observe that ss -tulpn shows 0.0.0.0:3260 instead of the specified IPs.

Summary of the "Tail Chasing"​

We confirmed that the TrueNAS Database (via midclt) is correct, and the Linux Kernel (via ip addr) is correct. The "Black Box" is the Python script that translates the Database into the scst.conf file. It is currently unable to handle specific portal bindings in this build, likely due to a regression in how it handles VLAN-backed portals.
 
Even after all of that:

truenas_admin@dlk0entsto801[~]$ ss -tulpn | grep 3260

tcp LISTEN 0 256 0.0.0.0:3260 0.0.0.0:*

tcp LISTEN 0 256 [::]:3260 [::]:*

This is the "Smoking Gun" of the TrueNAS iSCSI stack. With your background, you’ll recognize what’s happening here: SCST is not acting like a standard user-space daemon.

Because SCST is a kernel-level target, 0.0.0.0:3260 isn't necessarily a "socket" in the way Nginx or SSH uses one. In the current TrueNAS SCALE implementation, the iscsi-scst module loads a global listener into the kernel stack. Even when you tell a specific Target to only allow certain IPs, the driver itself continues to advertise port 3260 on all interfaces.

The allowed_portal attribute we injected earlier acts as a Layer 7 filter, not a Layer 3 bind. It tells the target: "If an initiator talks to me on the management IP, ignore the login request," but it doesn't stop the kernel from listening on that IP.

The Final "IT Pro" Verdict
On TrueNAS SCALE 25.04, true IP-level binding (restricting the listener itself) is currently broken or unsupported in the middleware.

If your security requirement is hard isolation (where the management IP must not even respond on 3260), you cannot rely on the iSCSI configuration alone. You have to move to the OS level:

1. The "Brute Force" Firewall Fix
Since we can't stop the kernel from listening, we block the traffic before it hits the SCST module. TrueNAS SCALE uses nftables.

Bash:
# Block iSCSI on the Management Interface (Assuming it's enp1s0 or similar)
# Replace 'mgmt_interface' with your actual management NIC name
sudo nft add rule inet fw-filter input iifname "mgmt_interface" tcp dport 3260 counter drop
2. Verify with a Port Scan
From your workstation or a Proxmox node on the management network:

Bash:
nmap -p 3260 10.20.35.12
If it shows Filtered or Closed, you have achieved your goal, even if the TrueNAS ss output still shows 0.0.0.0.
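The counter in the rule above also makes the drops visible, so you can confirm packets are actually hitting it:

Bash:
# The rule's packet/byte counters appear inline in the ruleset listing
sudo nft list ruleset | grep 'dport 3260'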

Summary for your Support Ticket
You should add this specific technical detail to the report I drafted for you:

"The iscsi-scst driver in SCALE 25.04 appears to be hard-coded to a global wildcard bind. Target-level allowed_portal attributes function only as initiator filters and do not restrict the kernel's listening sockets. True 'Portal Group' bind isolation is non-functional in the current middleware template."

One last thing you can try
If you want to see if it's even possible to drop the 0.0.0.0 listener manually:

Bash:
echo "del_target_portal 0.0.0.0 3260" | sudo tee /sys/kernel/scst_tgt/targets/iscsi/mgmt
If that returns Invalid argument or Not found, then the wildcard is compiled into the driver/module load options by iXsystems, and no amount of configuration will remove it.

The above "Post-Init" step works. It's ugly and a hack, but.
 
Hmm, that is actually default Linux behavior. It's something I ran into when configuring a multipath Proxmox cluster on a Pure Storage environment. VMware doesn't care if you have multiple interfaces on the same subnet and will only reply on the interface being addressed. Linux, by default, responds on any interface that has a route. So in your case it shouldn't be able to...

But I will investigate what's going on.

I do see that right here in your console output:
Bash:
root@dlk0entpve801:~# iscsiadm -m discovery -t sendtargets -p 172.16.80.1:3260
172.16.80.1:3260,1 iqn.2005-10.org.freenas.ctl:vm
10.20.35.12:3260,1 iqn.2005-10.org.freenas.ctl:vm
172.16.81.1:3260,1 iqn.2005-10.org.freenas.ctl:vm

---
edit:
---
My TrueNAS is only responding on the interfaces bound to a portal:
Code:
root@pve-m920x-1:~# iscsiadm -m discovery -t sendtargets -p 10.20.31.20:3260
10.20.31.20:3260,1 iqn.2005-10.org.freenas.ctl:proxmox
10.20.30.20:3260,1 iqn.2005-10.org.freenas.ctl:proxmox
10.20.31.20:3260,1 iqn.2005-10.org.freenas.ctl:iqn.2026-01.org.freenas.ctl:a8sd1
10.20.30.20:3260,1 iqn.2005-10.org.freenas.ctl:iqn.2026-01.org.freenas.ctl:a8sd1
10.20.31.20:3260,1 iqn.2005-10.org.freenas.ctl:iqn.2026-01.org.freenas.ctl:test-iscsi
10.20.30.20:3260,1 iqn.2005-10.org.freenas.ctl:iqn.2026-01.org.freenas.ctl:test-iscsi

Can you double check that your iSCSI service does not have any other Portal IDs that have your management interface?
[screenshot: Shares > iSCSI > Portals configuration]

If you have 0.0.0.0 in there too it will also respond on all available interfaces.


Another jank way of fixing it would be to just disallow the other networks on the iSCSI target in TrueNAS:
Something like this:
[screenshot: Authorized Networks on the iSCSI target]
You should be able to just have your high-speed networks on there.

Then tell iSCSI on Proxmox to log out of all sessions, then log back in. Only the interfaces allowed to communicate will be logged in.
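For example, using the target from earlier in the thread:

Bash:
# Log out of every iSCSI session, rediscover, and log back in
iscsiadm -m node --logoutall=all
iscsiadm -m discovery -t sendtargets -p 172.16.80.1:3260
iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:vm --login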
 
One of the adjustments I made early on. I hadn't thought about "disallowing" the 10.20.32.x network... :D
[screenshot: updated Authorized Networks]
 
I tried to set up the plugin last weekend, but unfortunately it is not working for me. I tried to use iSCSI and NVMe-TCP for my existing dataset.
This is what the log prints:
JSON:
[2026-02-02 01:43:39] [INFO] Enter your TrueNAS API key:
[2026-02-02 01:43:39] [INFO] (Generate one in TrueNAS: Settings → API Keys → Add)
[2026-02-02 01:43:53] [INFO] tn_api_call: method=system.info
[2026-02-02 01:43:54] [SUCCESS] Connected to TrueNAS successfully (version: 25.10.1)
[2026-02-02 01:43:54] [INFO] Configuration Progress:
[2026-02-02 01:44:02] [INFO] tn_api_call: method=pool.dataset.query
[2026-02-02 01:44:03] [ERROR] tn_api_call failed: JSON-RPC error: {
    "message": "Invalid params",
    "code": -32602,
    "data": {
        "trace": {
            "class": "ValidationErrors",
            "repr": "ValidationErrors([ValidationError('filters', 'Value error, Invalid operation: i', 22)])",
            "frames": [{
                    "method": "process_method_call",
                    "argspec": ["self", "app", "id_", "method", "params"],
                    "filename": "/usr/lib/python3/dist-packages/middlewared/api/base/server/ws_handler/rpc.py",
                    "locals": {
                        "app": "<middlewared.api.base.server.ws_handler.rpc.RpcWebSocketApp object at 0x7f46d85304d0>",
                        "self": "<middlewared.api.base.server.ws_handler.rpc.RpcWebSocketHandler object at 0x7f46ebfd3fd0>",
                        "e": "ValidationErrors([ValidationError('filters', 'Value error, Invalid operation: i', 22)])",
                        "method": "<middlewared.api.base.server.method.Method object at 0x7f46ecbec690>",
                        "id_": "2",
                        "params": "[['id', '=', 'NVME/test']]"
                    },
                    "lineno": 372,
                    "line": "                app.send_truenas_validation_error(id_, sys.exc_info(), list(e))\n"
                }, {
                    "line": "        result = await self.middleware.call_with_audit(self.name, self.serviceobj, methodobj, params, app,\n",
                    "filename": "/usr/lib/python3/dist-packages/middlewared/api/base/server/method.py",
                    "locals": {
                        "params": "[['id', '=', 'NVME/test']]",
                        "id_": "2",
                        "methodobj": "<bound method PoolDatasetService.query of <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>>",
                        "self": "<middlewared.api.base.server.method.Method object at 0x7f46ecbec690>",
                        "app": "<middlewared.api.base.server.ws_handler.rpc.RpcWebSocketApp object at 0x7f46d85304d0>",
                        "mock": "None"
                    },
                    "lineno": 57,
                    "argspec": ["self", "app", "id_", "params"],
                    "method": "call"
                }, {
                    "method": "call_with_audit",
                    "argspec": ["self", "method", "serviceobj", "methodobj", "params", "app"],
                    "keywordspec": "kwargs",
                    "line": "                await log_audit_message_for_method(success)\n",
                    "lineno": 965,
                    "filename": "/usr/lib/python3/dist-packages/middlewared/main.py",
                    "locals": {
                        "methodobj": "<bound method PoolDatasetService.query of <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>>",
                        "job_on_finish_cb": "<function Middleware.call_with_audit.<locals>.job_on_finish_cb at 0x7f46d8386fc0>",
                        "params": "[['id', '=', 'NVME/test']]",
                        "job": "None",
                        "app": "<middlewared.api.base.server.ws_handler.rpc.RpcWebSocketApp object at 0x7f46d85304d0>",
                        "method": "'pool.dataset.query'",
                        "serviceobj": "<CompoundService: <middlewared.plugins.pool_.dataset_recordsize.PoolDatasetService object at 0x7f46ec8dd190>, <middlewared.plugins.pool_.dataset_attachments.PoolDatasetService object at 0x7f46ec8dd150>, <middlewared.plugins.pool_.dataset_encryption_info.PoolDatasetService object at 0x7f46ec8dd310>, <middlewared.plugins.pool_.unlock.PoolDatasetService object at 0x7f46ec8dd1d0>, <middlewared.plugins.pool_.dataset_details.PoolDatasetService object at 0x7f46ec8dd090>, <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>, <middlewared.plugins.pool_.dataset_encryption_lock.PoolDatasetService object at 0x7f46ecde4ed0>, <middlewared.plugins.pool_.dataset_processes.PoolDatasetService object at 0x7f46ec90b550>, <middlewared.plugins.pool_.dataset_quota.PoolDatasetService object at 0x7f46ec90b4d0>, <middlewared.plugins.pool_.dataset_encryption_operations.PoolDatasetService object at 0x7f46ec8dd350>, <middlewared.plugins.pool_.snapshot_count.PoolDatasetService object at 0x7f46ecdfb2d0>, <middlewared.plugins.pool_.dataset_info.PoolDatasetService object at 0x7f46ece10e90>>",
                        "success": "False",
                        "log_audit_message_for_method": "<function Middleware.call_with_audit.<locals>.log_audit_message_for_method at 0x7f46d8384ae0>",
                        "kwargs": "{'message_id': 2}",
                        "audit_callback_messages": "[]",
                        "self": "<middlewared.main.Middleware object at 0x7f47298aed10>"
                    }
                }, {
                    "lineno": 782,
                    "filename": "/usr/lib/python3/dist-packages/middlewared/main.py",
                    "locals": {
                        "serviceobj": "<CompoundService: <middlewared.plugins.pool_.dataset_recordsize.PoolDatasetService object at 0x7f46ec8dd190>, <middlewared.plugins.pool_.dataset_attachments.PoolDatasetService object at 0x7f46ec8dd150>, <middlewared.plugins.pool_.dataset_encryption_info.PoolDatasetService object at 0x7f46ec8dd310>, <middlewared.plugins.pool_.unlock.PoolDatasetService object at 0x7f46ec8dd1d0>, <middlewared.plugins.pool_.dataset_details.PoolDatasetService object at 0x7f46ec8dd090>, <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>, <middlewared.plugins.pool_.dataset_encryption_lock.PoolDatasetService object at 0x7f46ecde4ed0>, <middlewared.plugins.pool_.dataset_processes.PoolDatasetService object at 0x7f46ec90b550>, <middlewared.plugins.pool_.dataset_quota.PoolDatasetService object at 0x7f46ec90b4d0>, <middlewared.plugins.pool_.dataset_encryption_operations.PoolDatasetService object at 0x7f46ec8dd350>, <middlewared.plugins.pool_.snapshot_count.PoolDatasetService object at 0x7f46ecdfb2d0>, <middlewared.plugins.pool_.dataset_info.PoolDatasetService object at 0x7f46ece10e90>>",
                        "kwargs": "{'app': <middlewared.api.base.server.ws_handler.rpc.RpcWebSocketApp object at 0x7f46d85304d0>, 'audit_callback': <built-in method append of list object at 0x7f46b836d740>, 'message_id': 2}",
                        "params": "[['id', '=', 'NVME/test']]",
                        "name": "'pool.dataset.query'",
                        "methodobj": "<bound method PoolDatasetService.query of <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>>",
                        "self": "<middlewared.main.Middleware object at 0x7f47298aed10>",
                        "prepared_call": "PreparedCall(args=[<_thread._local object at 0x7f47287143b0>, ['id', '=', 'NVME/test']], executor=<middlewared.utils.threading.IoThreadPoolExecutor object at 0x7f4728766590>, job=None, is_coroutine=False)"
                    },
                    "line": "        return await self.run_in_executor(prepared_call.executor, methodobj, *prepared_call.args)\n",
                    "keywordspec": "kwargs",
                    "argspec": ["self", "name", "serviceobj", "methodobj", "params"],
                    "method": "_call"
                }, {
                    "varargspec": "args",
                    "argspec": ["self", "pool", "method"],
                    "method": "run_in_executor",
                    "locals": {
                        "loop": "<_UnixSelectorEventLoop running=True closed=False debug=False>",
                        "method": "<bound method PoolDatasetService.query of <middlewared.plugins.pool_.dataset.PoolDatasetService object at 0x7f46ec8dd3d0>>",
                        "pool": "<middlewared.utils.threading.IoThreadPoolExecutor object at 0x7f4728766590>",
                        "kwargs": "{}",
                        "self": "<middlewared.main.Middleware object at 0x7f47298aed10>",
                        "args": "('***', '***')"
                    },
                    "filename": "/usr/lib/python3/dist-packages/middlewared/main.py",
                    "lineno": 665,
                    "line": "        return await loop.run_in_executor(pool, functools.partial(method, *args, **kwargs))\n",
                    "keywordspec": "kwargs"
                }, {
                    "argspec": ["self"],
                    "method": "run",
                    "lineno": 62,
                    "filename": "/usr/lib/python3.11/concurrent/futures/thread.py",
                    "locals": {
                        "self": "None"
                    },
                    "line": "            self = None\n"
                }, {
                    "line": "                args = list(args[:args_index]) + accept_params(accepts, args[args_index:])\n",
                    "lineno": 114,
                    "locals": {
                        "args_index": "2",
                        "accepts": "<class 'middlewared.api.v25_10_1.common.QueryArgs'>",
                        "args": "('***', '***', '***')",
                        "func": "<function PoolDatasetService.query at 0x7f46ed0bb920>"
                    },
                    "filename": "/usr/lib/python3/dist-packages/middlewared/api/base/decorator.py",
                    "method": "wrapped",
                    "varargspec": "args"
                }, {
                    "method": "accept_params",
                    "argspec": ["model", "args", "exclude_unset", "expose_secrets"],
                    "lineno": 25,
                    "filename": "/usr/lib/python3/dist-packages/middlewared/api/base/handler/accept.py",
                    "locals": {
                        "model": "<class 'middlewared.api.v25_10_1.common.QueryArgs'>",
                        "exclude_unset": "False",
                        "args": "(['id', '=', 'NVME/test'],)",
                        "args_as_dict": "{'filters': ['id', '=', 'NVME/test']}",
                        "expose_secrets": "True"
                    },
                    "line": "    dump = validate_model(model, args_as_dict, exclude_unset=exclude_unset, expose_secrets=expose_secrets)\n"
                }, {
                    "method": "validate_model",
                    "argspec": ["model", "data", "exclude_unset", "expose_secrets"],
                    "lineno": 84,
                    "locals": {
                        "exclude_unset": "False",
                        "data": "{'filters': ['id', '=', 'NVME/test']}",
                        "model": "<class 'middlewared.api.v25_10_1.common.QueryArgs'>",
                        "loc": "['filters']",
                        "error": "{'type': 'value_error', 'loc': ('filters',), 'msg': 'Value error, Invalid operation: i', 'input': ['id', '=', 'NVME/test'], 'ctx': {'error': ValueError('Invalid operation: i')}, 'url': 'https://errors.pydantic.dev/2.9/v/value_error'}",
                        "verrors": "ValidationErrors([ValidationError('filters', 'Value error, Invalid operation: i', 22)])",
                        "msg": "'Value error, Invalid operation: i'",
                        "expose_secrets": "True"
                    },
                    "filename": "/usr/lib/python3/dist-packages/middlewared/api/base/handler/accept.py",
                    "line": "        raise verrors from None\n"
                }
            ],
            "formatted": "Traceback (most recent call last):\n  File \"/usr/lib/python3/dist-packages/middlewared/api/base/server/ws_handler/rpc.py\", line 360, in process_method_call\n    result = await method.call(app, id_, params)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/api/base/server/method.py\", line 57, in call\n    result = await self.middleware.call_with_audit(self.name, self.serviceobj, methodobj, params, app,\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/main.py\", line 954, in call_with_audit\n    result = await self._call(method, serviceobj, methodobj, params, app=app,\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/main.py\", line 782, in _call\n    return await self.run_in_executor(prepared_call.executor, methodobj, *prepared_call.args)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/main.py\", line 665, in run_in_executor\n    return await loop.run_in_executor(pool, functools.partial(method, *args, **kwargs))\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3.11/concurrent/futures/thread.py\", line 58, in run\n    result = self.fn(*self.args, **self.kwargs)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/api/base/decorator.py\", line 114, in wrapped\n    args = list(args[:args_index]) + accept_params(accepts, args[args_index:])\n                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/api/base/handler/accept.py\", line 25, in accept_params\n    dump = validate_model(model, args_as_dict, exclude_unset=exclude_unset, expose_secrets=expose_secrets)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/usr/lib/python3/dist-packages/middlewared/api/base/handler/accept.py\", line 84, in validate_model\n    raise verrors from None\nmiddlewared.service_exception.ValidationErrors: [EINVAL] filters: Value error, Invalid operation: i\n\n"
        },
        "error": 22,
        "errname": "EINVAL",
        "extra": [["filters", "Value error, Invalid operation: i", 22]],
        "reason": "[EINVAL] filters: Value error, Invalid operation: i\n"
    }
}
at /usr/share/perl5/PVE/Storage/Custom/TrueNASPlugin.pm line 852.
[2026-02-02 01:44:03] [WARNING] Dataset 'NVME/test' not found or not accessible
[2026-02-02 01:44:03] [WARNING] Dataset verification failed. The dataset may not exist or may not be accessible.
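Side note: in the trace, the filters argument arrives as ['id', '=', 'NVME/test'] rather than the nested [["id", "=", "NVME/test"]] form the query API expects, so this may be a filter-syntax problem rather than a missing dataset. One way to confirm the dataset is queryable is to run the call directly from the TrueNAS shell:

Bash:
# Query the dataset with correctly nested filters (midclt ships with SCALE)
midclt call pool.dataset.query '[["id", "=", "NVME/test"]]'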
 

The error at the end indicates that the 'NVME/test' dataset is not present. Is that how it is configured on your TrueNAS?
 
The dataset (NVME/test) is definitely present. I switched back to the proxmox-truenas-native plugin, which has no problems finding datasets and is working without issues. I'd like to get going with NVMe-TCP though.
 
@warlocksyno
It would seem that adding a section to the documentation about creating an /etc/multipath.conf like this could eliminate some initial install configuration issues:

Code:
defaults {
    user_friendly_names yes
    path_grouping_policy multibus
    path_selector "round-robin 0"
    rr_min_io_rq 1
    failback immediate
    no_path_retry queue
    find_multipaths no
}

blacklist {
    devnode "^(ram|raw|loop|fd|md|dm-|sr|scd|st)[0-9]*"
    devnode "^hd[a-z]"
    devnode "^sda"
}

devices {
    device {
        vendor "TrueNAS"
        product "iSCSI Disk"
        path_grouping_policy multibus
        path_selector "round-robin 0"
        hardware_handler "0"
        rr_weight uniform
        rr_min_io_rq 1
    }
}

This one might not be optimized, but it seems to work with my configuration.
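After dropping that file in place, reloading multipath and listing the maps is a quick way to confirm the TrueNAS LUNs pick up the settings:

Bash:
# Reload the multipath configuration and list the resulting maps
systemctl restart multipathd
multipath -ll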