What open source solutions are available to use "ZFS over iSCSI with Proxmox"?

Discussion in 'Proxmox VE: Installation and configuration' started by al.semenenko88, Mar 26, 2018.

  1. al.semenenko88

    al.semenenko88 New Member

    Joined:
    Dec 25, 2017
    Messages:
    1
    Likes Received:
    0
    What open source solutions are available to use "ZFS over iSCSI with Proxmox"?
    Will FreeNAS 9 or 10 work for "ZFS over iSCSI"?
    What solutions are there on Linux?
     
  2. LnxBil

    LnxBil Well-Known Member

    Joined:
    Feb 21, 2015
    Messages:
    2,845
    Likes Received:
    220
    @mir is working on a FreeNAS storage plugin for Proxmox VE. It is currently in review.
     
  3. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    Refreshing.
    Any news on that?
     
  4. LnxBil

    LnxBil Well-Known Member

    Joined:
    Feb 21, 2015
    Messages:
    2,845
    Likes Received:
    220
  5. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    These patches added native Linux ZFS over iSCSI support. What about FreeNAS support?
    I've got problems running OmniOS on my hardware. What would be the better choice for NAS/SAN storage in the near future: a Linux server with LIO/targetcli, or FreeNAS?
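    For reference, the LIO/targetcli route means exporting a ZFS zvol as an iSCSI LUN roughly like this (pool name, size and IQNs below are made-up placeholders; when Proxmox's ZFS over iSCSI plugin drives LIO it performs the equivalent steps itself):
      # create a sparse zvol to back the VM disk
      zfs create -s -V 32G tank/vm-100-disk-0
      # expose the zvol as a block backstore and attach it to an iSCSI target
      targetcli /backstores/block create name=vm-100-disk-0 dev=/dev/zvol/tank/vm-100-disk-0
      targetcli /iscsi create iqn.2018-07.local.storage:proxmox
      targetcli /iscsi/iqn.2018-07.local.storage:proxmox/tpg1/luns create /backstores/block/vm-100-disk-0
      # allow the PVE node's initiator and persist the config
      targetcli /iscsi/iqn.2018-07.local.storage:proxmox/tpg1/acls create iqn.1993-08.org.debian:01:pve-node1
      targetcli saveconfig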
     
  6. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    Bump

    I've managed to apply the LIO patches in my virtual test environment. It looks like it works - I can create, delete and migrate offline KVM VMs. Great work by Udo Rader!!!
    I can't test it with running VMs yet, as VirtualBox doesn't allow nested virtualization. Right now I'm preparing to move my tests to bare-metal hardware.
    Is there any ETA for including the patches in an official Proxmox release?
     
  7. Knuuut

    Knuuut New Member

    Joined:
    Jun 7, 2018
    Messages:
    19
    Likes Received:
    3
  8. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    Cool. Now I need to choose between ZFS on Linux and FreeBSD :)
    I think FreeBSD would be a better choice.

    Are these patches going to be included in Proxmox in the near future?
     
  9. Knuuut

    Knuuut New Member

    Joined:
    Jun 7, 2018
    Messages:
    19
    Likes Received:
    3
    I like the way the connection from Proxmox to FreeNAS is set up: it's done over the API, not over SSH with config file manipulation.
    I'd also appreciate it if this were included in mainline Proxmox.
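    For anyone curious what "over the API" looks like in practice, the calls are plain REST against the FreeNAS web API - something along these lines (hostname, credentials and pool name are placeholders, and the v1.0 endpoint paths are quoted from memory, so check the FreeNAS API docs):
      # list the zvols on pool "tank" (v1.0 API, HTTP basic auth as root)
      curl -sk -u root:PASSWORD https://freenas.example.com/api/v1.0/storage/volume/tank/zvols/
      # create a 32 GiB zvol for a new VM disk
      curl -sk -u root:PASSWORD -H "Content-Type: application/json" \
           -X POST -d '{"name": "vm-100-disk-0", "volsize": "32G"}' \
           https://freenas.example.com/api/v1.0/storage/volume/tank/zvols/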
     
  10. stefanzman

    stefanzman Member
    Proxmox VE Subscriber

    Joined:
    Jan 12, 2013
    Messages:
    32
    Likes Received:
    0
    Raku - what kind of problems did you experience with OmniOS? It was just recommended to us as an iSCSI shared storage solution for a PVE cluster. Is FreeNAS a better choice in your opinion?
     
  11. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    Typical hardware compatibility problems:
    1) OmniOS doesn't recognize my SATA drives connected to an LSI-based RAID controller - only the SAS drives are visible
    2) OmniOS doesn't work with my Mellanox 10 Gbit network adapter

    Is FreeNAS better? I don't know. I've patched my cluster with freenas-proxmox from https://github.com/TheGrandWazoo/freenas-proxmox. I was able to connect the cluster to my FreeNAS storage (16 x 4 TB SAS). I can create, run, migrate and destroy KVM machines. I've tested sequential writes with dd and got transfers of about 1 GB/s on an empty cluster. Everything works well so far.
    What I miss is HTTPS access from Proxmox to the FreeNAS API. I'm working to fix that, and when it's ready I'll create a pull request to include my patches in the freenas-proxmox project.
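    For reference, the resulting ZFS over iSCSI entry in /etc/pve/storage.cfg looks roughly like the excerpt below (values are illustrative, and the freenas_* keys are specific to the freenas-proxmox plugin, so check its README for the exact option names), followed by the kind of dd one-liner used for the sequential-write test (parameters illustrative):
      # /etc/pve/storage.cfg excerpt (illustrative values)
      zfs: freenas-iscsi
          portal 192.168.10.5
          target iqn.2005-10.org.freenas.ctl:proxmox
          pool tank
          blocksize 8k
          iscsiprovider freenas
          freenas_user root
          freenas_password secret
          content images

      # sequential-write smoke test from inside a guest (O_DIRECT to limit page-cache effects)
      dd if=/dev/zero of=/root/ddtest.bin bs=1M count=8192 oflag=direct conv=fsync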
     
  12. mir

    mir Well-Known Member
    Proxmox VE Subscriber

    Joined:
    Apr 14, 2012
    Messages:
    3,425
    Likes Received:
    89
    That's strange, since the Mellanox 10 Gbit drivers in FreeNAS/FreeBSD are ported from OmniOSce. I have no problems with either Mellanox 10 or 20 Gbit in both OmniOS and FreeNAS.
     
  13. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
  14. stefanzman

    stefanzman Member
    Proxmox VE Subscriber

    Joined:
    Jan 12, 2013
    Messages:
    32
    Likes Received:
    0
    Thanks for the clarification, Raku - please keep me posted on your progress in this topic.

    We are trying to establish a standard / preferred shared cluster storage to recommend to our Proxmox customers and prospects, and I would like to have assurances that people have had consistent success with specific configuration(s).

    Thanks very much.
     
  15. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    Progress report:
    I configured the FreeNAS storage and found it faulting while cloning VMs and booting from a VirtIO SCSI hard drive. So I moved to Ubuntu 18.04 and iSCSI via LIO. At first glance, after patching Proxmox and configuring the ZFS pool and iSCSI target, everything worked smoothly. But a closer look revealed how immature the ZoL plus iSCSI solution is.

    I've tested storage performance inside a KVM VM:
    FreeNAS: about 450-700 MB/s, 3500 IOPS
    Ubuntu: about 80-120 MB/s, 700 IOPS
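    (For reference, throughput and IOPS figures like the above can be gathered inside the guest with fio; the job parameters below are illustrative, not the exact commands used.)
      # sequential writes, 1M blocks - roughly what the MB/s figures reflect
      fio --name=seqwrite --filename=/root/fio.bin --size=4G --bs=1M --rw=write --ioengine=libaio --iodepth=16 --direct=1
      # random 4k reads for an IOPS number
      fio --name=randread --filename=/root/fio.bin --size=4G --bs=4k --rw=randread --ioengine=libaio --iodepth=32 --direct=1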

    So I switched back to FreeNAS, and just a couple of minutes ago I solved my issues:
    https://github.com/TheGrandWazoo/freenas-proxmox/issues/9
    https://github.com/TheGrandWazoo/freenas-proxmox/issues/10

    I think I'm going to stay with the FreeNAS solution. Zvol over iSCSI on Linux sucks.
     
    #15 raku, Jul 18, 2018 at 23:42
    Last edited: Jul 19, 2018 at 00:18
  16. stefanzman

    stefanzman Member
    Proxmox VE Subscriber

    Joined:
    Jan 12, 2013
    Messages:
    32
    Likes Received:
    0
    OK, so the current version of FreeNAS combined with Proxmox plus the patches is a workable configuration for ZFS over iSCSI - and can be shared across a cluster?
     
  17. raku

    raku New Member

    Joined:
    Apr 16, 2016
    Messages:
    10
    Likes Received:
    0
    All the features I need work OK. But so far I've only run 1-2 VMs simultaneously. I need to do some more tests running about 50-100 VMs. If that works, I'll consider it production-ready on my site and start migrating from the old XenServer cluster.
     