TL;DR: novice needs help with a server setup for a charity hospital in rural Africa
Greetings,
So a bit of a primer: I'm a surgeon, not an IT professional. Unfortunately, I'm also one of the most technically capable individuals at my hospital in rural Burundi. There is a national push toward electronic health records, and we are slowly putting in place the pieces of hardware necessary to make that possible. I just finished setting up a UniFi network with 30 access points, with a Netgate box running pfSense and pfBlockerNG for firewalling and filtering. The network is running well. It's currently set up with infrastructure on VLAN 1, admin devices on VLAN 10, staff and medical students on VLAN 20, and an after-hours open network on VLAN 30, all of which are completely isolated from one another. Our 6 MB/s connection (yes, only 6) is now usable despite all of the concurrent users.
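For the server itself, my current thinking is to give Proxmox a single VLAN-aware bridge on one port of the NIC so the VMs can be tagged onto the right VLANs. Just a sketch of what I mean (interface name, addresses, and VLAN range are placeholders from my reading, not a tested config):

    # /etc/network/interfaces on the Proxmox host (placeholder names/addresses)
    auto eno1
    iface eno1 inet manual

    auto vmbr0
    iface vmbr0 inet static
        address 10.10.10.5/24      # management IP (placeholder)
        gateway 10.10.10.1
        bridge-ports eno1
        bridge-stp off
        bridge-fd 0
        bridge-vlan-aware yes
        bridge-vids 2-4094         # pass tagged VLANs through to the VMs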
Here's what I have on hand to configure for our server:
Dell PowerEdge R720xd
2x Xeon E5-2670 (8 cores each)
24x 8GB RAM (192GB total)
H710P Mini RAID controller w/ 1GB cache
4-port gigabit NIC
4x 2TB Hitachi SAS drives
4x 4TB Seagate Exos SATA drives
2x 240GB SSDs
Here's what we plan to use:
1 Ubuntu VM to run the electronic health record - OpenClinic GA
1 Ubuntu VM to run an imaging server for our digital X-ray and ultrasound images - Orthanc
1 Ubuntu VM to run a file server, print server, and any other less mission-critical services (I've sketched roughly how I'm picturing creating these below)
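To give a sense of what I have in mind, here's roughly how I was picturing creating the imaging VM on Proxmox. The VM ID, storage name, ISO name, VLAN tag, and sizes are all placeholders; this is a sketch rather than a settled plan:

    # rough qm sketch for the Orthanc VM (all IDs, names, and sizes are placeholders)
    qm create 101 --name orthanc --memory 16384 --cores 4 \
        --scsihw virtio-scsi-pci --scsi0 local-zfs:200,discard=on \
        --net0 virtio,bridge=vmbr0,tag=10 \
        --ostype l26 \
        --cdrom local:iso/ubuntu-22.04-live-server-amd64.iso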
Theoretical number of concurrent users:
~35 doctors
~62 nurses
~20 ancillary staff
(Caveat: we have nowhere near this many computers available, many if not most of our staff do not have their own devices to bring to the table, and we are constantly trying to hire more staff to meet the needs of the hospital. The numbers are constantly in flux, but on any given weekday we have around 80 employees on site.)
I'm wondering how best to use the above hardware to set up a system that's robust but still speedy. Entropy seems to work a little harder in our context, so data security is important. The integrated backup solution is one of the reasons I'm leaning toward Proxmox over something like ESXi. Multiple nodes are way outside our budget (though we are working toward purchasing a second server to run a production/backup setup), so from what I've read Ceph is a bad idea despite its data integrity and data restoration benefits.
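For context, the kind of backup I'm hoping Proxmox makes easy is a simple scheduled vzdump of the three VMs to a separate disk or share, something like the following (VM IDs and storage name are placeholders):

    # nightly snapshot-mode backup of the three VMs (IDs and storage are placeholders)
    vzdump 100 101 102 --storage backups --mode snapshot --compress zstd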
Should I go for hardware RAID using the built-in controller? Or should I head down the ZFS rabbit hole, and if so, should I use one of the SSDs for write caching, or would it be better to set the two of them up in RAID 1?
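To make the ZFS option concrete, this is the sort of layout I keep seeing suggested in my reading: striped mirrors on the big drives, with the SSDs either mirrored for the OS or used as a log device. Device names below are placeholders, and I'm assuming the controller can be set up to present the disks to the OS individually:

    # possible pool of striped mirrors on the 4x 4TB drives (device names are placeholders)
    zpool create -o ashift=12 tank \
        mirror /dev/disk/by-id/ata-EXOS_1 /dev/disk/by-id/ata-EXOS_2 \
        mirror /dev/disk/by-id/ata-EXOS_3 /dev/disk/by-id/ata-EXOS_4
    # versus adding the two SSDs as a mirrored log device instead of a RAID 1 boot pair
    zpool add tank log mirror /dev/disk/by-id/ata-SSD_1 /dev/disk/by-id/ata-SSD_2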
Any guidance would be most appreciated. We're going to do a progressive build-out (imaging first, so we can start using our digital X-ray panel), and I don't want to mess up the initial configuration and then have to backpedal and fix things while still needing the system to be up.
Many thanks,
Michael
A couple of notes:
Procurement is really difficult here, but if there is something essential that's missing, we can work toward obtaining it before I attempt an install.
I'm a novice, but hopefully not an idiot. I've been into computers since the Pentium II days and studied biomedical engineering before moving on to medical school and surgical training.