Hi everyone
My hardware + software, running 24/7 as a home server:
CPU i7-4790K, 32GB RAM (if not all of it is needed, I can also run with only 16GB).
HDD: 2x 1TB HDD (Seagate Enterprise Capacity, ST1000NM0008) with software RAID.
Latest Debian + software RAID + latest Proxmox install; everything works fine.
I know it would be faster with an SSD. If it turns out to be too slow, I may replace the HDDs with SSDs, but for a start I'll try this.
These services are needed:
samba: file sharing of movies and music (for Raspberry Pi music players and for Windows notebooks), about 200GB in total
nfs and tftp: network boot for 4 diskless Raspberry Pis with sensors, which post their data to MQTT
mysql: for Zabbix 5 monitoring, for Home Assistant, and for sensor data
redis: for emoncms
mqtt: message broker for the Raspberry Pi sensor data
influxdb: for sensor data
rrdtool: for sensor data visualization
grafana: for sensor data visualization
emoncms (PHP 7.0.30): for power usage statistics and visualization
zoneminder (PHP 7.4): for security, with 3 USB webcams
zabbix 5 (PHP 7.4): monitoring devices and alarms
home assistant: home automation, sensor data triggers
qbittorrent: for downloading
My plan is to use the minimum possible number of LXC containers; my current setup:
4 cores, 12GB RAM, CT 100 (file sharing): samba, tftp, nfs, qbittorrent
4 cores, 20GB RAM, CT 101 (databases): mysql, redis, mqtt, influxdb
4 cores, 12GB RAM, CT 102 (monitoring): zabbix 5, zoneminder, emoncms, home assistant (Python), grafana, rrdtool
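For reference, my planned CT 100 would look roughly like this as a Proxmox container config (the disk size, storage name, bridge, and hostname are just placeholders, not final values):

```
# /etc/pve/lxc/100.conf -- CT 100, file sharing (sketch only)
arch: amd64
cores: 4
memory: 12288
swap: 2048
hostname: filesharing
net0: name=eth0,bridge=vmbr0,ip=dhcp,type=veth
rootfs: local-lvm:vm-100-disk-0,size=32G
```

CT 101 and CT 102 would be the same apart from `memory: 20480` / `memory: 12288` and their hostnames.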
But now I see a problem: NFS doesn't work in the same container as Samba, etc.
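From what I have read so far (not sure, this is just what I found, please correct me), the NFS kernel server is blocked by the default LXC AppArmor profile, while Samba in the same container is unaffected. A commonly suggested workaround is a privileged container with a relaxed profile:

```
# added to /etc/pve/lxc/100.conf -- this weakens container isolation,
# so probably only acceptable for a trusted LAN-only container
lxc.apparmor.profile: unconfined
```

Is this the right approach, or is it better to run the NFS server on the Proxmox host itself?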
Please recommend an ideal container count and which services should share the same container.
Also, how much RAM and CPU would you recommend for each, or any other settings that would be optimal?
Would you recommend using only 16GB of RAM instead of 32GB?
The reason I allocate more RAM than I physically have: if a container isn't using all of its RAM, the others can use it. Is this a good idea or not?