ZFS over iSCSI using TheGrandWazoo's script with TrueNAS SCALE

thaimichael

There seem to be a lot of posts about ZFS over iSCSI, but not really any that detail all the steps in one full tutorial.
Currently I have two networks set up: my main 1Gb LAN and a dedicated 40Gb storage network. I can create a VM with ZFS over iSCSI, and the disk appears on the TrueNAS machine, but when I start the VM it fails to start. SSH authentication works fine from Proxmox to TrueNAS. This is the first machine; I will be expanding to more, but there are not enough at the beginning for Ceph.


I can ping in both directions between the Proxmox server and TrueNAS SCALE without an issue.

storage.cfg ZFS over ISCSI

Code:
zfs: TruenasScaleISCSI
        blocksize 8k
        iscsiprovider freenas
        pool NOVA/ISCSI
        portal 10.168.69.8
        target iqn.2005-10.org.freenas.ctl:proxmox-tgt
        content images
        freenas_apiv4_host 10.168.69.8
        freenas_password <root_password>
        freenas_use_ssl 0
        freenas_user root
        nowritecache 0
        sparse 1

VM Conf

boot: order=virtio0;ide2;net0
cores: 6
ide2: TrunasScaleISO:iso/ubuntu-20.04.5-live-server-amd64.iso,media=cdrom,size=1373568K
memory: 9472
meta: creation-qemu=7.0.0,ctime=1665090170
name: TestICSI
net0: virtio=2E:E4:5A:AE:06:C2,bridge=vmbr0,firewall=1
numa: 0
ostype: l26
scsihw: virtio-scsi-pci
smbios1: uuid=9a24b130-ba2d-4a91-9bfb-6322d136c16e
sockets: 1
virtio0: TruenasScaleISCSI:vm-101-disk-0,size=100G
vmgenid: b346d4ff-1b08-41db-bb57-0963122e18ab
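For reference, here is a rough sketch of how the pieces above fit together: the ZFS-over-iSCSI plugin ends up handing QEMU a URL of the form iscsi://&lt;portal&gt;/&lt;target-iqn&gt;/&lt;lun&gt;, where the portal and target come from storage.cfg and the LUN id is what the plugin resolves via the TrueNAS API (0 for vm-101-disk-0 in the logs below).

```shell
# Sketch: compose the iscsi:// URL QEMU is given for virtio0, from the
# storage.cfg values in this thread plus the LUN id the plugin resolved (0).
portal="10.168.69.8"
target="iqn.2005-10.org.freenas.ctl:proxmox-tgt"
lun=0
url="iscsi://${portal}/${target}/${lun}"
echo "$url"  # iscsi://10.168.69.8/iqn.2005-10.org.freenas.ctl:proxmox-tgt/0
```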

TrueNAS showing the automatically created disk


syslog when starting the machine:

Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: start VM 101: UPID:proxmox1:002752DD:017A3D77:633F452A:qmstart:101:root@pam:
Oct 07 04:14:18 proxmox1 pvedaemon[2573200]: <root@pam> starting task UPID:proxmox1:002752DD:017A3D77:633F452A:qmstart:101:root@pam:
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : list_lu(/dev/zvol/NOVA/ISCSI/vm-101-disk-0)
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_check : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : REST connection header Content-Type:'text/html'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : Changing to v2.0 API's
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : REST connection header Content-Type:'application/json; charset=utf-8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_connect : REST connection successful to '10.168.69.8' using the 'http' protocol
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_check : successful : Server version: TrueNAS-SCALE-22.02.4
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_check : TrueNAS-SCALE Unformatted Version: 22020400
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_check : Using TrueNAS-SCALE API version v2.0
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_globalconfiguration : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_globalconfiguration : target_basename=iqn.2005-10.org.freenas.ctl
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : called with (method: 'list_lu'; result_value_type: 'name'; param[0]: '/dev/zvol/NOVA/ISCSI/vm-101-disk-0')
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : TrueNAS object to find: 'zvol/NOVA/ISCSI/vm-101-disk-0'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu 'zvol/NOVA/ISCSI/vm-101-disk-0' with key 'name' found with value: '/dev/zvol/NOVA/ISCSI/vm-101-disk-0'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : list_view(/dev/zvol/NOVA/ISCSI/vm-101-disk-0)
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_view : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Oct 07 04:14:18 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : called with (method: 'list_view'; result_value_type: 'lun-id'; param[0]: '/dev/zvol/NOVA/ISCSI/vm-101-disk-0')
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : TrueNAS object to find: 'zvol/NOVA/ISCSI/vm-101-disk-0'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_lu 'zvol/NOVA/ISCSI/vm-101-disk-0' with key 'lun-id' found with value: '0'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : list_extent(/dev/zvol/NOVA/ISCSI/vm-101-disk-0)
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_extent : called with (method: 'list_extent'; params[0]: '/dev/zvol/NOVA/ISCSI/vm-101-disk-0')
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '10.168.69.8'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_extent TrueNAS object to find: 'zvol/NOVA/ISCSI/vm-101-disk-0'
Oct 07 04:14:19 proxmox1 pvedaemon[2577117]: PVE::Storage::LunCmd::FreeNAS::run_list_extent 'zvol/NOVA/ISCSI/vm-101-disk-0' wtih key 'naa' found with value: '0x6589cfc00000083bbdffe8d0cdf4acc6'
Oct 07 04:14:22 proxmox1 systemd[1]: Started 101.scope.
Oct 07 04:14:22 proxmox1 systemd[1]: 101.scope: Succeeded.
Oct 07 04:14:22 proxmox1 pvedaemon[2577117]: start failed: QEMU exited with code 1
Oct 07 04:14:22 proxmox1 pvedaemon[2573200]: <root@pam> end task UPID:proxmox1:002752DD:017A3D77:633F452A:qmstart:101:root@pam: start failed: QEMU exited with code 1

The log appears good; the two hosts are talking back and forth from what I see, and the disk is created. If I remove the iSCSI disk, I can start the VM without an issue.

Please be gentle; this is my first time with iSCSI.
Thanks
Thaimichael
 
You can try perl -MCarp::Always /usr/sbin/qm start 101 to see if you get more data.

and/or run pvedaemon in debug mode:
Code:
NAME
       pvedaemon - PVE API Daemon

SYNOPSIS
       pvedaemon <COMMAND> [ARGS] [OPTIONS]

       pvedaemon help [OPTIONS]

       Get help about specified command.

       --extra-args <array>
           Shows help for a specific command

       --verbose <boolean>
           Verbose output format.

       pvedaemon restart

       Restart the daemon (or start if not running).

       pvedaemon start [OPTIONS]

       Start the daemon.

       --debug <boolean> (default = 0)
           Debug mode - stay in foreground

Beyond that, your best bet is probably to reach out to the author of the plugin, which you can do via GitHub issues.


Blockbridge : Ultra low latency all-NVME shared storage for Proxmox - https://www.blockbridge.com/proxmox
 
There seems to be a problem with the Perl environment; I got the following error when I tried the command:

root@proxmox1:~# perl -MCarp::Always /usr/sbin/qm start 101
Can't locate Carp/Always.pm in @INC (you may need to install the Carp::Always module) (@INC contains: /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.32.1 /usr/local/share/perl/5.32.1 /usr/lib/x86_64-linux-gnu/perl5/5.32 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl-base /usr/lib/x86_64-linux-gnu/perl/5.32 /usr/share/perl/5.32 /usr/local/lib/site_perl).
BEGIN failed--compilation aborted.
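(For reference: Carp::Always is not part of the base PVE install, which is what the @INC error above is saying. On Debian-based hosts it is packaged, so something along these lines should get it; the CPAN route works too.)

```shell
# Carp::Always is packaged for Debian/PVE hosts:
apt-get install libcarp-always-perl
# or, via CPAN:
# cpan Carp::Always
```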

I tried
root@proxmox1:~# apt-get update
Hit:1 http://download.proxmox.com/debian/pve bullseye InRelease
Hit:2 http://security.debian.org bullseye-security InRelease
Hit:3 http://ftp.debian.org/debian bullseye InRelease
Hit:4 http://ftp.debian.org/debian bullseye-updates InRelease
Hit:5 https://ksatechnologies.jfrog.io/artifactory/truenas-proxmox bullseye InRelease
Reading package lists... Done

root@proxmox1:~# apt-get dist-upgrade
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
Calculating upgrade... Done
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.

That does not seem to show any issues there.

PVE Manager Version
pve-manager/7.2-11/b76d3178

I looked in the bug tracker and did not see any similar issues there.


Thanks for your suggestions; I hope the problem is something related to the Perl error.
 
That makes more sense; I had to read further. The first few posts seemed to talk about cleaning up repositories and fixing dependencies, so I had gone down the wrong rabbit hole.

I got the module installed and ran the command: perl -MCarp::Always /usr/sbin/qm start 101

Looking at this, I am seeing "Failed to connect to LUN", "Failed to log in to target", and "Target not found". This makes me believe I have something misconfigured for authentication, probably on the TrueNAS side.

Code:
iscsiadm: No session found.
kvm: -drive file=iscsi://10.168.69.8/iqn.2005-10.org.freenas.ctl:proxmox-tgt/0,if=none,id=drive-virtio0,format=raw,cache=none,aio=io_uring,detect-zeroes=on: iSCSI: Failed to connect to LUN : Failed to log in to target. Status: Target not found(515)
start failed: QEMU exited with code 1 at /usr/share/perl5/PVE/QemuServer.pm line 5675.
        PVE::QemuServer::__ANON__() called at /usr/share/perl5/PVE/Tools.pm line 989
        eval {...} called at /usr/share/perl5/PVE/Tools.pm line 988
        PVE::Tools::run_fork_with_timeout(undef, CODE(0x55bac462e5b8)) called at /usr/share/perl5/PVE/Tools.pm line 1037
        PVE::Tools::run_fork(CODE(0x55bac462e5b8)) called at /usr/share/perl5/PVE/QemuServer.pm line 5677
        PVE::QemuServer::__ANON__() called at /usr/share/perl5/PVE/QemuServer.pm line 5702
        eval {...} called at /usr/share/perl5/PVE/QemuServer.pm line 5702
        PVE::QemuServer::vm_start_nolock(HASH(0x55bac40d4700), 101, HASH(0x55bac41188d8), HASH(0x55bac41220d8), HASH(0x55bac4121fb8)) called at /usr/share/perl5/PVE/QemuServer.pm line 5414
        PVE::QemuServer::__ANON__() called at /usr/share/perl5/PVE/AbstractConfig.pm line 299
        PVE::AbstractConfig::__ANON__() called at /usr/share/perl5/PVE/Tools.pm line 226
        eval {...} called at /usr/share/perl5/PVE/Tools.pm line 226
        PVE::Tools::lock_file_full("/var/lock/qemu-server/lock-101.conf", 10, 0, CODE(0x55bac4118f80)) called at /usr/share/perl5/PVE/AbstractConfig.pm line 302
        PVE::AbstractConfig::__ANON__("PVE::QemuConfig", 101, 10, 0, CODE(0x55bac4122180)) called at /usr/share/perl5/PVE/AbstractConfig.pm line 322
        PVE::AbstractConfig::lock_config_full("PVE::QemuConfig", 101, 10, CODE(0x55bac4122180)) called at /usr/share/perl5/PVE/AbstractConfig.pm line 330
        PVE::AbstractConfig::lock_config("PVE::QemuConfig", 101, CODE(0x55bac4122180)) called at /usr/share/perl5/PVE/QemuServer.pm line 5415
        PVE::QemuServer::vm_start(HASH(0x55bac40d4700), 101, HASH(0x55bac41220d8), HASH(0x55bac4121fb8)) called at /usr/share/perl5/PVE/API2/Qemu.pm line 2612
        PVE::API2::Qemu::__ANON__("UPID:proxmox1:0000ED19:0007C70D:6340632F:qmstart:101:root\@pam:") called at /usr/share/perl5/PVE/RESTEnvironment.pm line 614
        eval {...} called at /usr/share/perl5/PVE/RESTEnvironment.pm line 605
        PVE::RESTEnvironment::fork_worker(PVE::RPCEnvironment=HASH(0x55bac4118aa0), "qmstart", 101, "root\@pam", CODE(0x55bac40b3180)) called at /usr/share/perl5/PVE/API2/Qemu.pm line 2616
        PVE::API2::Qemu::__ANON__(HASH(0x55bac40dc5a0)) called at /usr/share/perl5/PVE/RESTHandler.pm line 451
        PVE::RESTHandler::handle("PVE::API2::Qemu", HASH(0x55bac3df87e0), HASH(0x55bac40dc5a0), 1) called at /usr/share/perl5/PVE/RESTHandler.pm line 866
        eval {...} called at /usr/share/perl5/PVE/RESTHandler.pm line 849
        PVE::RESTHandler::cli_handler("PVE::API2::Qemu", "qm start", "vm_start", ARRAY(0x55babfe53260), ARRAY(0x55bac41341d0), HASH(0x55bac4134218), CODE(0x55bac41078b0), undef) called at /usr/share/perl5/PVE/CLIHandler.pm line 591
        PVE::CLIHandler::__ANON__(ARRAY(0x55babe6e23c0), undef, CODE(0x55bac41078b0)) called at /usr/share/perl5/PVE/CLIHandler.pm line 668
        PVE::CLIHandler::run_cli_handler("PVE::CLI::qm") called at /usr/sbin/qm line 8
 at /usr/share/perl5/PVE/Tools.pm line 1031.
        PVE::Tools::run_fork_with_timeout(undef, CODE(0x55bac462e5b8)) called at /usr/share/perl5/PVE/Tools.pm line 1037
        PVE::Tools::run_fork(CODE(0x55bac462e5b8)) called at /usr/share/perl5/PVE/QemuServer.pm line 5677
        PVE::QemuServer::__ANON__() called at /usr/share/perl5/PVE/QemuServer.pm line 5702
        eval {...} called at /usr/share/perl5/PVE/QemuServer.pm line 5702
        PVE::QemuServer::vm_start_nolock(HASH(0x55bac40d4700), 101, HASH(0x55bac41188d8), HASH(0x55bac41220d8), HASH(0x55bac4121fb8)) called at /usr/share/perl5/PVE/QemuServer.pm line 5414
        PVE::QemuServer::__ANON__() called at /usr/share/perl5/PVE/AbstractConfig.pm line 299
        PVE::AbstractConfig::__ANON__() called at /usr/share/perl5/PVE/Tools.pm line 226
        eval {...} called at /usr/share/perl5/PVE/Tools.pm line 226
        PVE::Tools::lock_file_full("/var/lock/qemu-server/lock-101.conf", 10, 0, CODE(0x55bac4118f80)) called at /usr/share/perl5/PVE/AbstractConfig.pm line 302
        PVE::AbstractConfig::__ANON__("PVE::QemuConfig", 101, 10, 0, CODE(0x55bac4122180)) called at /usr/share/perl5/PVE/AbstractConfig.pm line 322
        PVE::AbstractConfig::lock_config_full("PVE::QemuConfig", 101, 10, CODE(
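For what it's worth, "Target not found(515)" is the iSCSI login step failing because the portal did not export the IQN QEMU asked for. A hypothetical sketch of the check you would do by hand against discovery output (the discovery line below is made up to reproduce the failing case):

```shell
# The IQN configured in storage.cfg:
wanted="iqn.2005-10.org.freenas.ctl:proxmox-tgt"
# In real life this comes from:
#   iscsiadm -m discovery -t sendtargets -p 10.168.69.8:3260
# Made-up discovery output where the configured IQN is absent:
discovered="10.168.69.8:3260,1 iqn.2005-10.org.freenas.ctl:other-tgt"

# Each discovery line is "<portal>,<tpgt> <iqn>"; compare field 2 to $wanted.
if printf '%s\n' "$discovered" | awk '{print $2}' | grep -qx "$wanted"; then
    status="exported"
else
    status="missing"   # QEMU login would then fail with: Target not found(515)
fi
echo "$status"
```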
 
Yes, as I suspected. I didn't see anything related to iSCSI in your original log output, but given that I am not familiar with this plugin, it was too early to point there. As usual, just because the error is not reported doesn't mean it's not there :)

Use "code" tags to post large swaths of long lines; it makes them much easier to read.

Good luck.


 
I just thought I would post an update here in case anyone else runs down the same rabbit hole. I was unsuccessful in getting ZFS over iSCSI working with TrueNAS SCALE using TheGrandWazoo's script. It appeared to me to be a permission issue, but the solution eluded me.
My original plan had been as follows:
PVE1 for Virtualization.
Truenas1 for shared storage.
PBS1 for a backup solution with my HP Tape Library.

The new plan is as follows:
PVE1: main virtualization host / Ceph.
PVE2: secondary virtualization host / Ceph.
PVEPBS3: can host VMs / Ceph, but mainly a backup server (I will install PBS on top of PVE).

I want to thank everyone who made suggestions for this issue.
Thaimichael
 
I was able to see the storage, but I wasn't able to write to it. It looks like a permission error, but I can't figure out how to fix it...
 
I'm surprised there is not more interest in getting a plugin working with Proxmox and TrueNAS.
Me too. There is not much info out there. I mean, this is obviously better than the traditional iSCSI setup if you have the required hardware. Are you using it?
 
