Best products from r/unRAID
We found 76 comments on r/unRAID discussing the most recommended products. We ran sentiment analysis on each comment to gauge how redditors feel about different products, then ranked the 178 products we found by the number of positive reactions they received. Here are the top 20.
2. I/O Crest 4 Port SATA III PCI-e 2.0 x1 Controller Card Marvell 9215 Non-Raid with Low Profile Bracket SI-PEX40064
- 4 Internal SATA 6Gb/s Ports
- Compatible with SATA 6G, 3G and 1.5G Hard Drives
- PCI-Express x1 Interface is Compatible with PCI-Express x2, x4, x8, and x16 slots
- HyperDuo pairs at least 1 hard disk drive (HDD) with up to 3 solid state drives (SSDs), embedding automated tiering technology in the chipset
3. Rosewill 4U Server Chassis/Server Case/Rackmount Case, Metal Rack Mount Computer Case Support with 15 Bays & 7 Fans Pre-Installed (RSV-L4500)
Superb Scalability: supports up to 15 internal 3.5" HDDs and seven expansion slots, so users can expand their server system easily. Unmatched Cooling: 2x 80mm rear fans, 3x 120mm front fans and 3x 120mm middle fans; 8 cooling fans in total deliver exceptional thermal performance you can rely on.
4. Intel Xeon X5650 CPU 2.66GHz 12MB 6.4GT/s Hexa 6 Core Server Processor SLBV3
Part Number: X5650 | Spec Code: SLBV3 | Cores: 6 | Base Frequency: 2.66GHz | Cache: 12MB
5. Samsung M393B1K70CH0-YH9 8GB PC3L-10600R DDR3-1333 ECC Registered 2RX4 Server Memory
Samsung 8GB (1x8GB) 1333MHz PC3-10600 CL9 dual-rank ECC DDR3 SDRAM, 240-pin memory module for servers
6. EVGA SuperNOVA 650 G1, 80+ GOLD 650W, Fully Modular, 10 Year Warranty, Includes FREE Power On Self Tester Power Supply 120-G1-0650-XR
EVGA 650 G1: Performance, Price, Perfect. Operating temperature: 0° to 50°C. 80 PLUS Gold certified, with up to 90% efficiency under typical loads. Fan size/bearing: 135mm double ball bearing. Heavy-duty protections, including OVP, UVP, OCP, OPP, SCP, and OTP. 10-year warranty.
7. Addonics AD4SA6GPX2 AD4SA6GPX2 6GB/S 4-Port SATA Controller
Addonics is a privately held company located in Silicon Valley, California. Its mission is to provide professionals and business users a complete family of innovative storage solutions with the highest quality and best compatibility in mind.
8. Fractal Design Define R5 - Mid Tower Computer Case - ATX - Optimized for High Airflow and Silent - 2X Dynamix GP-14 140mm Silent Fans Included - Water-cooling Ready - Black
Optimally designed for silent computing, with high-density noise-reducing material throughout the case for maximum sound absorption, without compromising airflow and cooling capabilities. Extensive water-cooling support for a case of this size, housing radiators up to 420mm in the top.
9. Mikrotik CSS326-24G-2S+RM 24 port Gigabit Ethernet switch with two SFP+ ports
The Cloud Smart Switch CSS326-24G-2S+RM is a SwOS-powered 24-port Gigabit Ethernet switch with two SFP+ ports. It gives you all the basic functionality of a managed switch, plus more: it lets you manage port-to-port forwarding, apply MAC filters, configure VLANs, mirror traffic, and apply bandwidth limitations.
10. Inateck Superspeed 7 Ports PCI-E to USB 3.0 Expansion Card - 5 USB 3.0 Ports and 2 Rear USB 3.0 Ports Express Card Desktop with 15 Pin SATA Power Connector, Including Two Power Cables (KT5002)
SuperSpeed USB 3.0 supports transfer rates of up to 5Gbps (actual speed is limited by the connected device). 7 USB 3.0 downstream ports for standard desktop PCs. Supports hot plug, plug & play, and LPM for low energy consumption.
11. iStar D Value D-416 4U Rackmount Server Chassis (Black)
Case type: 4U rackmount. Material: aluminum front bezel and handle, zinc-coated steel main chassis. Motherboard support: ATX, Micro-ATX, up to 12.0 x 9.6 inch. External bays: 6x 5.25" drive bays. Internal bays: 5x 3.5" drive bays.
12. Noctua NF-A14 iPPC-3000 PWM, Heavy Duty Cooling Fan, 4-Pin, 3000 RPM (140mm, Black)
- Heavy duty cooling fan, 140x140x25 mm, 12V, 4-pin PWM, max. 3000 RPM, max. 41.3 dB(A), >150,000 h MTTF
- Award-winning 140x25mm A-series fan with Flow Acceleration Channels and Advanced Acoustic Optimisation frame for superior cooling efficiency
- 12V 3000rpm model (41.3dB) with 4-pin PWM connector for automatic speed control via 4-pin PWM fan headers, broad 800-3000rpm speed range
- High-speed industrial iPPC version for heavy-duty applications that require extreme cooling performance and advanced ingress protection
- Ruggedised fibre-glass reinforced polyamide construction and IP52 certified water and dust protection
13. ICY DOCK Dual Tool-Less Dual 2.5 to 3.5 HDD Drive Bay SSD Mounting Bracket Kit Adapter - EZ-Fit Lite MB290SP-B
Fits 2x 2.5" SATA/IDE SSDs/HDDs into a single internal 3.5" drive bay. Compatible with 2.5" SSDs & HDDs from 7mm to 9.5mm drive height. Quick-release eject mechanism for hassle-free, tool-less drive installation. Universally designed screw placement with 60 screw holes for maximum compatibility.
14. Icy Dock MB074SP-B Black Vortex Removable HDD 4 in 3 Module Cooler Cage
- Fits 4x 3.5" HDDs into 3x external 5.25" bays
- Fits 3.5" IDE, SATA & SAS HDDs, including our 2.5" to 3.5" SSD converters
- Front access with EZ-Grip design for easily installing/removing drives without the hassle
- EZ-Grip handles allow easy installation and are compatible with HDD docking stations
- Huge 120mm cooling fan effectively keeps the drives cool and decreases overall system temperature
15. MSI Gaming GeForce GT 710 2GB GDDR3 64-bit HDCP Support DirectX 12 OpenGL 4.5 Heat Sink Low Profile Graphics Card (GT 710 2GD3H LP)
- Chipset: NVIDIA GeForce GT 710
- Recommended PSU: 300W. DirectX 12 API support
- Video Memory: 2GB DDR3. Core Clock: 954 MHz. Thermal: Fanless
- Memory Interface: 64-bit. Connectors: VGA, DVI-D Dual Link, HDMI
- Max. Resolution: 4096 x 2160, Support 2x Display Monitors
16. ZOTAC GeForce GT 730 Zone Edition 4GB DDR3 PCI Express 2.0 x16 (x8 lanes) Graphics Card (ZT-71115-20L)
Supported OSes: Windows 10 / 8 / 7 / Vista / XP. Card length: approximately 145.79mm x 111.15mm. Outputs: 1x DL DVI-D, 1x VGA, 1x HDMI; triple simultaneous display capable; HDCP compliant. 300 watt power supply recommended; 25 watt max power consumption. HDTV ready: ED 480p.
17. Gigabyte GeForce GT 710 2GB Graphic Cards and Support PCI Express 2.0 X8 Bus Interface. Graphic Cards GV-N710D5-2GL
Powered by the NVIDIA GeForce GT 710 GPU. Integrated with 2GB GDDR5 64-bit memory. Core clock: 954MHz. Features dual-link DVI-I / HDMI. Supports PCI Express 2.0 x8 bus interface. Form factor: low profile.
18. QNINE Dual M.2 PCIe Adapter, M.2 NVME SSD M Key or M.2 SATA SSD B Key 22110 2280 2260 2242 2230 to PCIe 3.0 x4 Host Controller Expansion Card with Low Profile Bracket for PC Desktop
You can use an M.2 NVMe SSD and an M.2 SATA SSD at the same time with this adapter on a motherboard without an M.2 slot. Up to 3200+ MB/s read and 1500+ MB/s write with an NVMe SSD, a perfect upgrade kit for a PC. Up to 420 MB/s read and 350 MB/s write with an M.2 SATA SSD. Supports both SATA and PCIe M.2 SSDs.
19. GL.iNet GL-AR750S-Ext (Slate) Gigabit Travel AC VPN Router, 300Mbps(2.4G)+433Mbps(5G) Wi-Fi, 128MB RAM, MicroSD Support, Repeater Bridge, OpenWrt/LEDE pre-Installed, Cloudflare DNS
- 【DUAL BAND AC WIRELESS ROUTER】 Simultaneous dual band with wireless speeds of 300Mbps (2.4G) + 433Mbps (5G). Tethering compatible. Converts a public network (wired/wireless) to a private Wi-Fi for secure surfing.
- 【OPEN SOURCE & PROGRAMMABLE】 OpenWrt/LEDE pre-installed, backed by software repository.
- 【VPN CLIENT & SERVER】 OpenVPN and WireGuard pre-installed, compatible with 30+ VPN service providers.
- 【LARGER STORAGE & EXTENSIBILITY】 128MB RAM, 16MB NOR Flash and 128MB NAND Flash, up to 128GB MicroSD slot, USB 2.0 port, three Gigabit Ethernet ports (1 WAN and 2 LAN).
- 【PACKAGE CONTENTS】 GL-AR750S-Ext (Slate) router with 1-year limited warranty, power adapter (US plug), USB cable, Ethernet cable and user manual. Please update to the latest firmware from our website.
20. Vantec NexStar TX Dual Bay USB 3.0 Hard Drive Dock (NST-D428S3-BK)
- Clear LED activity indicator
- SuperSpeed USB 3.0 transfer rates of up to 5Gbps, backward compatible with USB 2.0/1.1
- Hot-swappable: plug & play without rebooting
- Supports two large-capacity drives
- Insert and use 2.5"/3.5" SATA HDDs/SSDs in the dual-bay dock for ultimate convenience
Sorry in advance if this is a bit of a spammy post, it's been growing in each topic I put it in as I assemble more and more info for people.
I've posted about this in a couple different threads so I'll just copypasta some of it here.
The cost for my setup, not including the drives (of which I had quite a few laying around from other builds) and an unRAID Pro License, was about $800 all together.
(UPDATE 11/27/17 - Prices have fluctuated a bit higher since it seems that these setups are in somewhat high demand right now. That may or may not be due to me posting this info in several places for people, but hopefully my attempts to help people aren't pricing this setup out of their reach)
It can do pretty much everything except maybe live TV PVR, but that's only because of a limitation in the Plex unRAID Docker itself. So if you plan on running Plex in Docker, that's going to be the case no matter what hardware you end up running it on.
I have about 40 friends around the globe who regularly access my server and the only real bottleneck I've encountered is my upload speed when too many streams are pushing out at once.
__
You can make a good unRAID rig for FAR cheaper if you simply use older server components, for example:
SuperMicro X8DT3-LN4F Motherboard ($89.99) ($115.99 as of 11/27/17)
2x Intel Xeon X5650 LGA1366 CPUs ($43.48 each) ($40.00 each as of 11/27/17)
EVGA SuperNOVA 650 Watt 80 Plus Gold Modular ATX Power Supply ($79.99) ($93.57 as of 11/27/17)
Some DDR3 ECC server RAM, usually pretty cheap ($24.00) ($30.00 as of 11/27/17)
Then you just need any EATX compatible case, any two LGA1366 coolers, and any drives you want. All together you're probably looking at no more than $600-700 for a system that will likely perform the same if not better than the setup you posted, but will have 12 cores (24 hyper-threaded) @ 2.66GHz.
I should point out that I ALREADY have this style of setup working with unRAID, so this is not theory but a proven concept. I found as many of the original sources that I used as I could, but I made this a while ago so not all were current. Either way though, the price for doing this kind of setup only tends to go DOWN over time, so it will only get easier to put together. Heck, I've seen some sales of the X8DT3-LN4F motherboard that come with RAM & CPUs already, so you might be able to pick up a complete setup for about the same cost as getting it piece by piece.
__
The only thing I would really add to the above is that the SAS module on that motherboard actually has issues operating drives larger than 2TB, so if you want more than the 6 SATA 3TB+ drives that the mobo natively supports, you're going to need a PCIe SATA expansion card, such as the one that I ended up using.
I also have had some boot issues with the SAS function enabled in general, so if you do end up going with the above board I would just leave it disabled.
Also I would highly recommend watching most of Space Invader One's unRAID tutorial videos. Especially the ones about Docker CPU pinning and optimization plugins.
__
On my current setup I am running:
Dockers --
Plex, PlexPy (Plex statistics and notifications; I'm using it to run a Discord bot that announces to friends when new things are added on the server), Ombi (Plex request system), Radarr, Sonarr, Jackett (lets Radarr & Sonarr search private & public torrent trackers automatically), Deluge (BitTorrent client), OpenVPN (for secure remote access over VPN) & Krusader (file manager, booted on demand)
VMs --
Windows 8.1 VM (as a VNC GUI remote interface, and to run ExtractNow to automatically deal with RARed or zipped media torrents) & Windows 10 VM (passing through a GTX 970 and used as a Steam In-Home Streaming client; hooked directly up to a 4K TV so I can stream games from my main high-end gaming rig to my TV; booted on demand)
Plugins --
A Bunch from the Community Applications Suite (Auto Turbo Write Mode, Auto Update Applications, Backup/Restore Appdata, Cleanup Appdata, Config Editor), Several Dynamix Plugins (Cache Directories, File Integrity, SSD TRIM, System Information, System Statistics), Fix Common Problems, Nerd Tools, Tips and Tweaks, Unassigned Devices, unBALANCE, User Scripts
Drives --
Nine 3TB HDDs (1 For Parity & 8 For Storage), One 120GB SSD (Cache)
__
Looks like you can get a refurbished X8DT3-F for about $120. The only real difference between the X8DT3-F and X8DT3-LN4F is if they have 2 or 4 Gigabit Ethernet Ports.
I actually have all 4 of mine connected to a high-speed switch and then into my router as a load-balanced bond (effectively getting 4x Gigabit speeds, at least within my LAN, which IS useful when streaming 4K games from my gaming PC to the server's client VM), but for most setups both boards are effectively equivalent.
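For reference, bonding in unRAID is normally set up from Settings > Network Settings; on disk it ends up in `/boot/config/network.cfg`. A rough sketch of what the relevant keys look like (key names are from memory and may differ by unRAID version):

```
# /boot/config/network.cfg (fragment) - hypothetical example
BONDING="yes"
BONDNICS="eth0,eth1,eth2,eth3"
BONDING_MODE="4"   # 802.3ad / LACP; the switch must support LACP
```

Note that a single TCP stream still tops out at 1 Gigabit with any bonding mode; the aggregate bandwidth helps when multiple clients hit the server at once.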
Story Time
Initially, I started with this case ( https://www.amazon.com/gp/product/B00Q2Z11QE ) as I was thinking of throwing something together like what you're talking about. Then my "project" began to grow.
That's when I ended up ordering this case instead ( https://www.amazon.com/gp/product/B005KQ66ZC ). That said, my server consists of a Threadripper 2990WX with an AIO water cooler. Well...this case wasn't made for that. So my father-in-law machined a hole in the top to mount the radiator on top of the case like a blower on a car. This worked VERY well for a couple of weeks, but I just wasn't happy with it.
Finally, I ordered this case ( https://www.amazon.com/gp/product/B0091IZ1ZG ), in which I was able to fit everything with a few extra bolts that still need to be trimmed. Here's a pic of the inside of mine and the temp my 32 cores run at ( https://imgur.com/tek9ID0 - https://imgur.com/vEPFLv5 ); do excuse the dust.
As far as SSDs go, just do something like this ( https://www.amazon.com/gp/product/B00GMGZBP0 ). It saves space and doesn't hurt them, as they only take a single HDD slot. Taping them to the side of the case doesn't hurt either, if you don't care about looks. Also, I want to boast about these fans for a minute ( https://www.amazon.com/gp/product/B00KFCRF1A ). They move a lot of air and aren't as loud as you'd think. The 120mm variant is a good bit louder, but still well worth it.
Thanks, this is great info. A few things:
>1 - CPU pinning, which is what you are doing, is not dedicating cores to the VM. The Linux scheduler (unRAID) is free to run other tasks on those cores if it needs to, but it will only run the VM processes on those 2 cores. If you truly wanted to dedicate those 2 cores to the VM, you would isolate them, then pin the VM to those cores. RAM is treated like any other process's RAM on Linux, and as long as there is space, it won't page it out to disk.
Sounds like there is no real disadvantage for me to up the cores, then. Because if the VM isn't under load (which it won't be, since I'm using it to remote into for desktop tasks, not long-lived hardcore jobs or intense gaming), the cores are available for the server to use for other things outside the VM. Do I have that right?
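For anyone following along, the two mechanisms live in different places on unRAID: isolation is the `isolcpus=` kernel parameter in the syslinux config (newer versions expose it in the settings UI), while pinning is set per-VM in the libvirt XML. A minimal sketch, assuming host cores 2 and 3 are the ones handed to the VM (the core numbers here are just an example):

```xml
<!-- VM XML fragment: pin the guest's two vCPUs to host cores 2 and 3.
     With isolcpus=2,3 also set at boot, those cores are dedicated;
     without it, the host scheduler may still place other tasks on them. -->
<vcpu placement='static'>2</vcpu>
<cputune>
  <vcpupin vcpu='0' cpuset='2'/>
  <vcpupin vcpu='1' cpuset='3'/>
</cputune>
```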
>3 - I would start with adding a cheap GPU and passing it through. You may need a dongle to trick it into thinking there is a monitor connected if you are going purely Remote Desktop, though.
I have a Supermicro X10SLM-F, which has 3 PCIe slots: one PCIe 2.0 x8, one PCIe 3.0 x8 and one PCIe 3.0 x16.
In the PCIe 2.0 x8 slot, I have an LSI 9201 SATA card.
In the PCIe 3.0 x16, I have the Nvidia P4000 GPU for Plex hardware transcoding.
So this leaves the PCIe 3.0 x8. Any suggestions for what kind of GPU to get? A quick look on Amazon came up with these:
MSI GeForce GT 710 2GB ($45): https://www.amazon.com/MSI-GT-710-2GD3H-LP/dp/B01AZHOX5K/r
Zotac GeForce GT 730 4GB ($80): https://www.amazon.com/ZOTAC-GeForce-Express-Graphics-ZT-71115-20L/dp/B00R5UW038/
Gigabyte GeForce GT 710 2GB ($55): https://www.amazon.com/Gigabyte-GeForce-Graphic-Interface-GV-N710D5-2GL/dp/B073SWN4ZM/
>From there I would look at a new CPU/Motherboard. If you are wanting more VM's, now is a great time with AMD. You could triple your core count with a Ryzen 3900x, wait for the 3950x and quadruple it, or wait even a bit longer and get 24+ cores with Threadripper 3 that is expected soon.
I'll start w/ GPU first like you suggested and go from there. I considered Ryzen originally when I did some work to upgrade my server recently, but opted to stay with my current mainboard as I read there were issues with Ryzen motherboard BIOSes and unRAID. Is this resolved?
The I/O Crest works wonders; it's x1, with 4 SATA III ports, and can be had at $35. I/O Crest SI-PEX40064; I'm sure you can find them cheaper.
they are also known as SYBA SI-PEX40064 aka. IOCrest IO-PCE9215-4I
(from unraid HW comp list: 4 port, PCIex1, SATA III, Marvell 88SE9215, bootable, working out of the box, supports drives > 2.2 TB)
I use that on my low-power box with four 2TB WD Greens and don't have any issues. If you want to go with something better, a SAS2008/LSI 9201/9211 HBA card in IT mode is the clear-cut winner for ease and compatibility. Cons: they're a little more expensive ($65 + price of cables).
I just went through this nightmare.
My setup uses a Supermicro X9DRi-F that "DOES" support bifurcation.
So mistake number 1 was ordering this cheap adapter ( https://www.amazon.com/QNINE-Adapter-Controller-Expansion-Profile/dp/B077YHFJZM/ ), assuming it would work with 2x NVMe SSDs. It supports 1 SSD as NVMe and the other as SATA via the built-in SATA port. Do not get this.
So, okay, I figured out what I did wrong, and found that Supermicro makes a simple dual NVMe adapter that should work with bifurcation no problem! The part number is AOC-SLG3-2M2: https://www.supermicro.com/en/products/accessories/addon/AOC-SLG3-2M2.php
What I didn't notice is that the X9 boards are not on the supported list. But looking through the PDF manual and in my BIOS, I found that it does support bifurcation.
Welp, I could never get it working. I looked through some sites that did custom BIOSes, but most of those guys were trying to boot from NVMe, not just use the drives as additional storage.
So I gave up and am now using BOTH cards with 1 NVMe SSD installed in each. If I were to do it again, and you can spare the extra PCIe slot, I'd just get 2 of the cheap NVMe adapters and call it a day.
I also get a lot of notifications of the NVMe drives "overheating", so some heatsinks aren't a bad idea, and you might also want to turn the notification threshold up a bit. NVMe drives will be okay for a little while at warmer temps, but the heatsinks are actually nice for extended loads.
TLDR: save yourself some headache and just get 2x of these https://www.amazon.com/GLOTRENDS-Adapter-Aluminum-Heatsink-PA09_HS/ if you can spare the pcie slots.
+1 for LSI SAS92xx-8i HBA cards; they work great with unRAID. I purchased mine from eBay seller theartofserver, who flashes them to IT mode and thoroughly tests all ports. He has lots of informational YouTube videos about these cards and others. Sometimes you can even find a 9201-16i card like this:
https://www.amazon.com/gp/product/B07JFFSZ1M/ref=ppx_yo_dt_b_asin_title_o03__o00_s00?ie=UTF8&psc=1
If you need more than 8 (or 16) drives, or want to expand later, get an hp expander card; very inexpensive.
Get 2x SFF-8087-to-SFF-8087 cables to connect the two cards together; then you can connect up to 24 drives to the expander card using SFF-8087-to-SATA forward breakout cables.
If you're using SSDs for cache and/or spinning drive(s) for parity, connect them to your motherboard's SATA3 connectors so those drives can negotiate at up to 6Gb/s. When connected to the HP expander, normal SATA drives only negotiate at 3Gb/s, common SATA2 speed. You don't need more than that for data drives.
The HP expander card doesn't need to be connected to your motherboard if you're short on PCIe slots. All it needs is PCIe power, so you can use one of the PCIe riser cards that cryptocurrency miners use. Purchase one that has a power connector that fits your needs. I use one with a PCIe 6-pin connector of the kind used for video cards, since most newer power supplies come with extra cables of that type.
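As a back-of-envelope check on that layout: each SFF-8087 cable carries four 6Gb/s SAS2 lanes, so dual-linking the expander gives 48Gb/s of shared uplink, or about 2Gb/s per drive with all 24 busy, comfortably above what a spinning drive sustains. A quick sketch of the math:

```shell
# Uplink budget for an HP SAS expander dual-linked to the HBA
cables=2; lanes_per_cable=4; gbps_per_lane=6; drives=24
total=$(( cables * lanes_per_cable * gbps_per_lane ))
echo "uplink: ${total} Gb/s; per-drive share: $(( total / drives )) Gb/s"
```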
I have a travel setup that I take with me: a small router that connects to my VPN at home and bridges the local Wi-Fi (or lets me plug in), plus a Synology DS416slim. This all goes into a case with my DSLR gear.
It covers all the bases: it allows me to sync all my cloud accounts, and gives me a secure connection to the home network, where I can check on the servers and security cameras, etc.
I've tried the home-build route before, but was never able to get something as robust and "slick".
This is the latest router that I'm using - https://www.amazon.ca/gp/product/B07GBXMBQF/ (not an affiliate link)
I put together almost the exact same build a year or so ago to replace my Drobo. Like your case selection better than mine. The only thing I might suggest is springing for an i5 if you're going to be transcoding multiple streams.
I recently decided to add more HDDs to my build and ran out of SATA ports. Expanded with this. Good luck!
Depends on what you have available. Easiest would be to copy the data across the network, but it will take a while, and you risk losing progress if the network drops (take a look at using rclone to mitigate this).
If you're going to put the drives in the array anyway, I'd suggest putting the old drives in the system, but DO NOT assign them to the array. Instead, take a look at the Unassigned Devices plugin. This will allow you to mount the drives in the unRAID OS separate from the array, and access the data on them. Then you can use something like Krusader, or just the standard command line, to copy the data off the drives to shares created on your array. Once the data has been migrated off, unmount the drives, stop the array, and then add the old drives as new drives to the array. Since you've been using them already, there's no need to run pre-clear on them to test for premature failure. Instead, once you bring the array back online, you'll be able to access your shares, and unRAID will prep the drives in the background and bring them online when they're ready.
If you don't want to mount them in your system until you're ready for them, another option is to use an external USB enclosure or dock (e.g. https://www.amazon.ca/Vantec-NexStar-Dual-Drive-NST-D428S3-BK/dp/B01JNLCFQI/ref=sr_1_3?ie=UTF8&qid=1541694017&sr=8-3&keywords=usb+drive+dock) to attach the drives to your system, but still use Unassigned Devices to mount them and access the data.
I ordered this one and am pretty happy with it.
Rosewill 4U Server Chassis / Server Case / Rackmount Case, Metal Rack Mount Computer Case support with 15 bays & 7 Fans Pre-Installed (RSV-L4500) https://www.amazon.com/dp/B0091IZ1ZG/ref=cm_sw_r_cp_api_x5PrzbJ9MRYFP
My buddy and I each built unRAID servers in the past month. He went higher-spec with a Xeon E3-1250v3 and a higher-end consumer motherboard. He's going to get an AMD RX 480 video card for it so he has a second gaming computer for anyone that comes over, plus 16 gigs of ECC RAM. I went more power-efficient and bought a Supermicro board with an Intel Avoton C2750 CPU. It's essentially a server Atom CPU: it uses 20 watts and has eight cores, and I have 16 gigs of ECC RAM too. The motherboard has the features I wanted: IPMI built in, four NICs and some other stuff. I was worried the CPU would be underpowered, but it packs plenty of power for my Docker containers: Sonarr for automatic TV downloading, CouchPotato, a Nextcloud server, web server, MySQL server, modded Minecraft server, CrashPlan backup server, and others. I barely eat up 30% CPU when everything is running and actually doing something; idle is below 5%. I don't have Plex on it because my Nvidia Shield does that. It's surprised me a lot how much power it has. If you want gaming, it's not for you, but it is more than enough as a file server for the applications it's running and plenty more.
Motherboard/CPU
16GB RAM
SATA Controller Card (needed more sata ports than motherboard had)
Power Supply
[2x SSD for Cache/Pool setup](https://www.amazon.com/gp/product/B01FJ4UN76/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1)
5x WD Red 3TB
Better fans for case
Case (LOVE the case)
Yeah, I use it as my main PC/gaming case and love it. I gutted it for gaming (better airflow and graphics card radiator positioning). I love that they use screws instead of rivets for modularity. I had 0 hesitations buying it again for my unraid rebuild.
As /u/Douglas_D pointed out, there can be some issues with the Rosewill cage I linked. You may consider the Icy Dock cage (https://www.amazon.com/MB074SP-B-Vortex-Removable-Module-Cooler/dp/B00GSQMYY0) instead. Same price and same basic function (minus the hot-swap).
Those cages fit into 3x 5.25" bays. I had no issues sliding it into my case. It just...sticks in there. It's recessed into the case a bit but is otherwise solid and stable.
The question with PCI-E SATA cards is how much you are willing to spend and what PCI-E slots you have available on your motherboard.
The cheapest I've tried (with slowest throughput) when you only have PCI-E 1x slots free is to use four port SATA cards like this Marvell 88SE9215 chipset based card for $33 on Amazon:
(http://www.amazon.com/gp/product/B00AZ9T3OU)
If you've got at least a PCI-E 4x slot, you can get something faster for $100 - $160, such as (note these are 8 port cards):
On eBay used:
A number of the above solutions are not as fast as you can go, since they use PCI-E 4x slots, but 8x slot cards can cost a lot more. Personally I don't notice the slowdown much, since I'm really using these drives to stream, and I don't mind that parity checks and moving data from cache to permanent drives take longer.
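The trade-off above is easy to quantify: a PCI-E 2.0 x1 link moves roughly 500 MB/s after encoding overhead, so a 4-port card in that slot leaves each drive about 125 MB/s when all four are read simultaneously, as during a parity check. That is below the outer-track speed of a fast modern HDD, which is why only the parallel operations feel slower:

```shell
# Rough per-drive ceiling for a 4-port SATA card in a PCI-E 2.0 x1 slot
slot_mbps=500   # ~usable bandwidth of PCI-E 2.0 x1 after 8b/10b overhead
ports=4
echo "per-drive ceiling: $(( slot_mbps / ports )) MB/s"
```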
https://www.amazon.com/Fractal-Design-Define-Gaming-FDCADEFR5BK/dp/B00Q2Z11QE
unRAID OS is installed on and always runs from the USB drive. The drive is never removed, and the OS cannot be installed on a hard drive. Just make sure it is a trustworthy brand-name drive and that you can easily read the serial number etched on the metal.
Just a data point from here: I had that Rosewill cage and am moving away from it, because if the server is jostled at all, it can knock one of my drives offline. The connectors don't seem super secure in mine, and any little bump is a potential parity-rebuild scenario :/ I wound up with this Icy Dock cage instead, since I don't really need the hot-swap feature and the connectors go straight into the drive instead of through a backplane. I also get better airflow with the Icy Dock cage.
I'm fairly certain it's 10Gbit all the way. Mellanox Connect-X 2 in my PC, one of these SFP+ modules, LC fiber to the other SFP+ in the switch, this from the switch to the other Mellanox card in my unRAID server.
Oh I know; it's more of a learning exercise with the benefit of at least getting more than 1Gb/s between my PC and unRAID server for copying files.
I’ve used [this one](https://www.amazon.com/dp/B00FPIMICA/ref=cm_sw_r_cp_api_JfQ.BbX8YEN77) (the Inateck Superspeed 7 Ports PCI-E to USB 3.0 Expansion Card, KT5002) without issue in my Win 10 gaming VM; it's also natively recognized in my macOS VM. There is a 4 port version as well.
I set up a 10 Gbps backbone for my home network this year, with 3 10 gig devices connected to it. FreeNAS server, unRAID server, and one Windows desktop. I don't use pfsense, so you should double-check that pfsense includes drivers for the cards you pick, or you could be in for some pain.
While you can achieve 10 Gbps over quality copper network cables, I went with fiber optic. Fiber optic networking has been around for a long time in many forms, so there are a lot of standards. There are two main types of cable, multimode and single-mode, and the cable type must match the fiber optic transceivers you use on each end. Then there are different grades of cable: OS1 and OS2 for single-mode, and OM1, OM2, OM3, OM4 for multimode. Higher numbers indicate better cable quality; read up on the limitations of each. Finally there are a bunch of different connector types. LC is the most common from what I have seen. There are actually two kinds of LC, and one of them has an angled end, but those are a lot less common than ends that are cut off at 90 degrees. I'm not really clear on why two kinds exist.
ANYWAY these are what I bought and they all work fine together:
8x transceiver: https://www.ebay.com/itm/Finisar-FTLX8571D3BNL-10GB-SFP-SR-850nm-Transceiver/173943155751?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649
2x NIC card: https://www.ebay.com/itm/Mellanox-MHZH29-XTR-ConnectX-2-VPI-Standard-Profile-Network-Adapter/333292618107?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649
1x NIC card: https://www.ebay.com/itm/MHZH29-XTR-MELLANOX-CONNECTX-2-VPI-DUAL-PORT-NETWORK-ADAPTER-CARD/223585259766?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649
1x switch: http://amzn.com/B0723DT6MN
1x switch: http://amzn.com/B07LFKGP1L
1x long armored cable (Multimode, LC-LC duplex, OM3): http://amzn.com/B07JHKKCVY
Plus a bunch of different length patch cords (Multimode, LC-LC duplex, OM3) from fs.com
I specifically chose new Mikrotik switches as opposed to buying older used enterprise switches because the price difference isn't that great, and the Mikrotik switches are fanless.
Saved a boatload of money buying used NICs, and also quite a bit buying used fiber optic transceivers. New 10 GBE transceivers can easily run $20+ each, and new NICs can easily be $100+ USD each.
Total cost was still a few hundred USD, but that is a LOT lower than it could have been!
It may be too late, but on my third server I use this:
https://www.amazon.com/Inateck-Superspeed-Ports-PCI-Expansion/dp/B00FPIMICA/ref=sr_1_3?keywords=USB+3.0+PCIe+adapter&qid=1554238644&s=gateway&sr=8-3
It may not be what you're after, but it allows me to set up each VM with a dedicated external USB slave drive.
I know for a fact that this one works:
https://www.amazon.com/gp/product/B00AZ9T3OU/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
and this one does not work
https://www.newegg.com/Product/Product.aspx?Item=N82E16816132018
Hope this helps
edit: Pretty sure the Marvell chipset makes the difference
This only has 15 proper bays, but you could probably shoehorn a few extra in there if you're creative. It is technically a rackmount chassis, but it can be put on its side if you felt so inclined. https://www.amazon.com/Rosewill-Rackmount-Computer-Pre-Installed-RSV-L4500/dp/B0091IZ1ZG
I can't vouch for this specific model, but you'll need a SATA expansion card like this...
https://www.amazon.com/IO-Crest-Controller-Non-Raid-SI-PEX40064/dp/B00AZ9T3OU
Before I switched to a Rosewill RSV-L4500 I was using an Azza Solano 1000R full tower case. It had a ton of 5.25" bays, and I used some cheap Cooler Master 4-in-3 bays to stuff it full of drives. It worked fairly well and I didn't need to modify the case at all. This was handy because I reused it down the line after migrating unRAID to the Rosewill.
The Rosewill case was the cheapest rackmount case ($80) I could get that fit my drives. I have considered upgrading to a hotswap-type of case like the Norcos but so far it has been more economical to just upgrade my drives to larger capacity rather than expand my capacity to hold drives. I swap drives so rarely that the hotswap feature isn't necessary. The Rosewill is annoying to work with when I have to swap a drive though (and I've removed the center partition).
I ended up with this card and got nothing but errors.
https://www.amazon.com/gp/product/B00AZ9T3OU/ref=ppx_od_dt_b_asin_title_s00?ie=UTF8&psc=1
This is the one I use and it works fine - https://www.amazon.com/IO-Crest-Controller-Non-Raid-SI-PEX40064/dp/B00AZ9T3OU/ref=sr_1_3?ie=UTF8&qid=1543180924&sr=8-3&keywords=IOCrest+SI-PEX40064
Although it costs a lot more than it should for some reason; I paid $15 for it on Newegg.
I currently have 3 4-in-3 bays - https://www.amazon.com/MB074SP-B-Vortex-Removable-Module-Cooler/dp/B00GSQMYY0/ref=sr_1_10 - with a fan controller (which is just filling a gap in the case) and a 4x2.5"-in-1 for cache SSDs, but I'm up to 11 3.5" drives, so it's pretty close to full capacity unless I start swapping out drives for more expensive models :)
I have a 4-port PCI-e expansion card, using it without any issues on 2 1TB drives, and 1 500GB drive. Model number is SI-PEX40064.
https://www.amazon.com/gp/product/B00AZ9T3OU/
I have this adapter and it doesn't show up in BIOS on my B450M board.
Should I get a riser then switch adapters?
This is what I am currently using.
https://www.amazon.com/Rosewill-Server-Chassis-Rackmount-Metal/dp/B0091IZ1ZG