(Part 2) Best products from r/unRAID

We found 43 comments on r/unRAID discussing the most recommended products. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 178 products and ranked them based on the number of positive reactions they received. Here are the products ranked 21-40. You can also go back to the previous section.

34. Rosewill 4U Server Chassis/Server Case/Rackmount Case, Metal Rack Mount Computer Case with 8 Bays & 4 Fans Pre-Installed (RSV-R4000)

    Features:
  • Superb scalability: With three 5.25-inch external bays (which can be converted into a 4x 3.5-inch HDD module), eight 3.5-inch internal drive bays, and seven expansion slots, you can expand your server easily.
  • Excellent cooling design with 4 included case fans: The rackmount server chassis is engineered with optimal cooling in mind. Two 120mm front fans and two 80mm rear fans are included to keep the whole system well ventilated.
  • Clever extras: The RSV-R4000 features dual USB 2.0 connectors on the front panel for easy connectivity.
  • Motherboard compatibility: The rackmount chassis supports CEB (12" x 10.5") and ATX (12" x 9.6") motherboards and smaller.
  • Front panel lock: The stylish black front panel locks with a key, providing better security for your rackmount server case.
  • Solid and steady structure: A solid 4U rackmount industrial server case that combines generous room, security, and expandability.
  • Screwless HDD design: Users can easily remove hard drives thanks to the screwless cage and modular release buttons.
  • Tremendous capacity: The RSV-R4000 provides vast room for an outstanding system. Dimensions (H x W x D): 7 x 16.8 x 21 inches (including panel); 7 x 16.8 x 23 inches (with handles, including panel).

Top comments mentioning products on r/unRAID:

u/jsdoc · 3 points · r/unRAID

First off, one thing to consider is that you may have solved this by simply moving the boot USB to another port that is on another IOMMU group -- i.e. front panel USB to back or vice versa (I see you tried that, but it's worth noting that it's not just changing ports, but trying to get to a different SET of USB ports). But then you correctly figured out that safe mode keeps you from starting the array, doesn't load any of your plugins, puts you in failsafe settings, and lets you access the USB directly.

BUT:
For future reference, I'd highly suggest you implement a process of backing up your USB drive (including a separate backup of a recent stable build in case the drive fails) so that you can avoid this in the future. It's easy to use a plugin to make regular backups automatically: go to the Apps tab and install the CA Backup / Restore Appdata plugin, then define a location where your USB backup resides. Be sure to run it just BEFORE you make a VM or passthrough change like this, disable automatic USB backups until you're sure the array is stable and the way you want it, then turn auto-backups back on once you're happy that the changes are stable.
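If you'd rather see what the plugin is doing, the manual version is tiny. Here's a minimal Python sketch, assuming the flash is mounted at /boot (the unRAID default) and that /mnt/user/backups/flash is an existing share path you've chosen for backups -- the plugin adds scheduling and retention on top of this:

    #!/usr/bin/env python3
    # Minimal sketch: zip the unRAID boot flash (/boot) into a dated archive.
    # DEST is an assumed path that must already exist -- point it at whatever
    # backup share you actually use.
    import shutil
    import time

    DEST = "/mnt/user/backups/flash"
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = shutil.make_archive(f"{DEST}/flash-{stamp}", "zip", root_dir="/boot")
    print(f"Flash backup written to {archive}")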

Of note, it's also a VERY wise idea to back up the other important stuff, such as the Docker config share (usually the appdata share) and perhaps the domains share if you value your VMs. I'd advise the same backup process just before unRAID version upgrades, in case a plugin or the like hoses things, or of course in case the upgrade corrupts the USB.

As you noted, in terms of an oops on a config change, safe mode will generally at least let you boot unRAID, get to an attached login screen, and save things. If really bad things are happening, at that point I'd stop, clone the USB drive, and contact the unRAID team or the forums.

You did this, but just to mention for the post: if you haven't updated the BIOS on Ryzen boards, doing so often improves the IOMMU options and splits out the USB ports more, which can really help. Early BIOS releases seem to have all sorts of stuff glommed together in the same IOMMU group. Also, as above, you can strategically move the boot USB to another port (front to back or vice versa). Since you're on Ryzen, I'd suggest you watch SpaceInvader's other video discussing this exact topic: https://youtu.be/T_BmK9vSjPA?t=11m18s
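To see exactly how your board groups things (before and after a BIOS update), the kernel exposes the layout under sysfs. Here's a quick sketch in plain Python -- nothing unRAID-specific, it just walks /sys/kernel/iommu_groups, which is empty unless IOMMU is enabled in the BIOS and kernel:

    #!/usr/bin/env python3
    # Print every IOMMU group and the PCI devices inside it.
    import os

    ROOT = "/sys/kernel/iommu_groups"
    for group in sorted(os.listdir(ROOT), key=int):
        for dev in sorted(os.listdir(os.path.join(ROOT, group, "devices"))):
            print(f"group {group}: {dev}")

If two USB controllers show up in different groups, you can boot from one and pass the other through cleanly.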

Final thought - if there are spare USB headers on the motherboard that are unused, then by definition there's a separate, isolated IOMMU USB group you could use -- it's easy to get a short header cable with a USB plug so you can simply leave the boot USB plugged in internally. Like this https://www.amazon.com/TRIPP-Motherboard-Header-6-Inch-U024-06N-IDC/dp/B00QVTVB84/ref=pd_lpo_vtph_147_tr_t_2?_encoding=UTF8&psc=1

u/letrainwa · 3 points · r/unRAID

It's very possible with unRAID. Do you actually need the "RAID" part, or are you just looking to have a single PC with two video cards act as two computers via virtual machines and GPU passthrough? unRAID has a trial period. If you look up Space Invader One, he has some videos about virtual machines with GPU passthrough. I've done this exact thing for the exact purpose you describe. However, I also use many other features of unRAID. unRAID makes the process easy but costs $$$. unRAID uses KVM for its virtual machines. Everything unRAID does can be found for free; it just takes more effort depending on skill. Ubuntu Server with KVM is pretty easy to set up for what you're describing. I've set up everything you're asking for and more using free Linux software. unRAID is just easier, in my opinion, especially when I wanted to move hardware.

Edit: it's not over FireWire, it's over Cat 5. https://www.amazon.com/dp/B07DJ56875/ref=cm_sw_r_cp_awdb_t1_XpzMDb1TVQSD4

u/Cyromaniap · 2 points · r/unRAID

If you are going to go with the LSI 9211-8i, I'd pair it with an Intel RES2SV240. There are two advantages: it supports SATA III, and you don't need a second PCIe slot to house the card since it can be powered by a single Molex connector.

The 9211-8i is plenty capable of running a ton of drives. Each SAS channel is 6 Gbps and the card has 8 of them, so effectively there is 48 Gbps of raw bandwidth on the drive side. PCIe 2.0 x8 (about 4 GB/s) cannot even handle the full bandwidth of the card. Given that a spinning-rust hard drive might give you 130 MB/s at the best of times, with PCIe x8 you would need about 30 HDDs running at full bandwidth to saturate the bus.

Source
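To sanity-check the arithmetic above, here is the back-of-envelope version (the 130 MB/s figure is the quote's optimistic per-drive number, not a measurement):

    # Back-of-envelope check of the 9211-8i bandwidth claim.
    lanes = 8                         # 2x SFF-8087 ports, 4 SAS lanes each
    card_raw_gbps = lanes * 6         # SAS2/SATA III is 6 Gbps per lane -> 48 Gbps
    pcie2_x8_mb_s = 8 * 500           # PCIe 2.0 moves ~500 MB/s usable per lane
    hdd_mb_s = 130                    # optimistic spinning-disk throughput
    print(card_raw_gbps)              # 48 -- more than the bus can carry
    print(pcie2_x8_mb_s / hdd_mb_s)   # ~30.8 drives to saturate PCIe 2.0 x8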

You will need one SFF-8087 to SFF-8087 cable from the LSI 9211-8i to the Intel RES2SV240, and then you will need 6x SFF-8087 to SATA breakout cables. That would give you 24 SATA connections.


If you wanted gear capable of more bandwidth, it would cost a bit more and would require a motherboard with PCIe 3.0, as well as an HBA that supports PCIe 3.0. I believe that card is the LSI 9311-8i.


u/polopollo85 · 1 point · r/unRAID

Thanks very much!!

> You don't really need 2 parity drives if your array is not that large. What are the chances of 2 drives failing out of 6 at the same time?

OK, the 2 parity drives were mostly because I plan to "re-use" all my current external 2.5" drives (between 2 and 4TB of storage per drive) until they die, then replace them with 3.5" drives (or sooner if I just need more space). And honestly, I have no idea if they are "good" or not. The unRAID "cleaning/set to 0" feature might say they have some errors, so I would know from the start which ones are potentially prone to failure and not put good data on them (unless the feature tells me that I should not use the disk at all? All this is pretty new to me).


Overall, as I said, I am new to all RAID/unRAID systems, so I need to understand better how to detect when a drive needs to be replaced right away.

> here's what I'd suggest [...]

THANKS a lot! That's the kind of advice I am looking for right now!

Is there a mobo you recommend? I saw you can add a PCIe card with SATA connectors, but a mobo that includes them would be better! Your case seems pretty small and cute. My aim was maybe to go with a bigger one just to be able to expand as much as I want, but I might just go with a small one indeed and trust you guys on a choice of just having 6 drives in it.

I did a bit of research earlier, and I saw that I need ECC RAM.

Any specific fan I need for the processor?

Also, I was reading this post about a hard drive: https://www.reddit.com/r/DataHoarder/comments/7fx0i0/wd_easystore_8tb_compendium/ but I went to my local Best Buy yesterday and they don't have any of these promos, and this "NEBB VS NESN" business is kind of annoying to figure out. It seems like a gamble from what I understand, unless I'm ready to order the drive from Amazon directly. So much to experiment with; I'm excited!

u/Polaris2246 · 1 point · r/unRAID

My buddy and I each built unRAID servers in the past month. He went with higher specs: a Xeon E3-1250v3, a higher-end consumer motherboard, and 16 gigs of ECC RAM. He's going to get an AMD RX 480 video card for it so he has a second gaming computer for anyone who comes over. I went more power-efficient and bought a Supermicro board with an Intel Avoton C2750 CPU, essentially a server-grade Atom. It uses 20 watts, has eight cores, and also has 16 gigs of ECC RAM. The motherboard has the features I wanted: IPMI built in, four NICs, and some other stuff. I was worried the CPU would be underpowered, but it packs plenty of power for my Docker containers: Sonarr for automatic TV downloads, CouchPotato, a Nextcloud server, a web server, a MySQL server, a modded Minecraft server, a CrashPlan backup server, and others. I barely eat up 30% CPU when everything is running and actually doing something; idle is below 5%. I don't have Plex on it because my Nvidia Shield handles that. It has surprised me a lot how much power it has. If you want gaming, it's not for you, but it is more than enough as a file server running these applications and plenty more.

Motherboard/CPU

16GB RAM

SATA Controller Card (needed more SATA ports than the motherboard had)

Power Supply

2x SSD for Cache/Pool set up: https://www.amazon.com/gp/product/B01FJ4UN76/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1

5x WD Red 3TB

Better fans for case

Case (LOVE the case)

u/n0llbyte · 1 point · r/unRAID

Okay thanks!
Ordered this one from Amazon:
https://www.amazon.com/gp/product/B01E9Z2D60

We'll know if it works in 2 weeks' time :)

u/ronfar623 · 2 points · r/unRAID

I can't help with the errors, but I can recommend this card as it works well with my two Windows 10 VMs under 6.2.4: https://www.amazon.com/gp/product/B00FPIMJEW/

Can pass the entire card to the VM and just plug in USB peripherals normally. 100x easier.

u/pcbuilder1907 · 1 point · r/unRAID

I have the R6 and it works great. I wish it supported EEB/CEB motherboards (i.e. was a little deeper), which is why I'm shopping for new cases, but I haven't decided on a system yet and may stick with it if I can find a compelling EATX or ATX board and single-socket Xeon that fits my needs and is affordable.

The only problem I have is that with EATX, the board covers up most of the grommets where your 24-pin power goes.

You can also buy more trays from Fractal: https://www.amazon.com/Fractal-Design-HDD-Drive-Tray/dp/B07HY1BFCJ/ref=sr_1_3?keywords=fractal+3.5+hard+drive+tray&qid=1570833146&sr=8-3

u/Butrdtost · 1 point · r/unRAID

https://www.amazon.com/gp/product/B006LL9YL8/ref=ox_sc_act_title_3?smid=A38NZOGK1EZGST&psc=1

Here ya go. I hope I can get some money together and max out this board before all those sell out >.< At least I can help someone else out, right? :D

u/av1982 · 1 point · r/unRAID

I see people recommending Raspberry Pi monitors. I've tried one like this in the past. They work well, but they can be a little sketchy in a pocket without a good case, and at some point you're going to run into an issue where the PC only has a VGA port. That being said, just about every Goodwill thrift store I've gone to has old 15" LCD monitors for under $10.

Edit: If you need a portable monitor to throw in a backpack check this one out.

https://www.amazon.com/gp/product/B07HMTMV68/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

I bought mine in June and it has been awesome. Keep in mind: no VGA.

u/chip6439 · 1 point · r/unRAID

I've used a Molex to SATA breakout cable and haven't had any issues with the 8TB drives I'm currently using. It may not be an ideal solution for a large number of drives, but for my use case it was as easy as plug and boot. All of this assumes you have Molex available, of course. https://www.amazon.com/gp/product/B009GULFJ0/ref=ppx_yo_dt_b_asin_title_o03_s02?ie=UTF8&psc=1

u/Cebb · 5 points · r/unRAID

I set up a 10 Gbps backbone for my home network this year, with three 10-gig devices connected to it: a FreeNAS server, an unRAID server, and one Windows desktop. I don't use pfSense, so you should double-check that pfSense includes drivers for the cards you pick, or you could be in for some pain.

While you can achieve 10 Gbps over quality copper network cables, I went with fiber optic. Fiber optic networking has been around for a long time in many forms, so there are a lot of standards. There are two main types of cable, multi-mode and single-mode, and the cable type must match the fiber optic transceivers you use on each end. Then there are different grades of cable: OS1 and OS2 for single-mode, and OM1, OM2, OM3, and OM4 for multi-mode; higher numbers indicate better cable. Read up on the limitations of each. Finally, there are a bunch of different connector types; LC is the most common from what I have seen. There are actually two kinds of LC, one of which has an angled end, but those are a lot less common than ends cut off at 90 degrees. I'm not really clear on why two kinds exist.

ANYWAY, these are what I bought and they all work fine together:

8x transceiver: https://www.ebay.com/itm/Finisar-FTLX8571D3BNL-10GB-SFP-SR-850nm-Transceiver/173943155751?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

2x NIC card: https://www.ebay.com/itm/Mellanox-MHZH29-XTR-ConnectX-2-VPI-Standard-Profile-Network-Adapter/333292618107?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

1x NIC card: https://www.ebay.com/itm/MHZH29-XTR-MELLANOX-CONNECTX-2-VPI-DUAL-PORT-NETWORK-ADAPTER-CARD/223585259766?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

1x switch: http://amzn.com/B0723DT6MN

1x switch: http://amzn.com/B07LFKGP1L

1x long armored cable (Multimode, LC-LC duplex, OM3): http://amzn.com/B07JHKKCVY

Plus a bunch of different length patch cords (Multimode, LC-LC duplex, OM3) from fs.com

I specifically chose new MikroTik switches, as opposed to buying older used enterprise switches, because the price difference isn't that great and the MikroTik switches are fanless.

Saved a boatload of money buying used NICs, and quite a bit buying used fiber optic transceivers. New 10 GbE transceivers can easily run $20+ each, and new NICs can easily be $100+ USD each.

Total cost was still a few hundred USD, but that is a LOT lower than it could have been!

u/CodeGrunt · 1 point · r/unRAID

I'm currently using a PC case from 1992. It has 8 HDD bays, and cases seem to support fewer HDD bays these days. When/if I buy a new case, it will probably be something like the Rosewill 4U: https://www.amazon.ca/Rosewill-Rackmount-Chassis-Internal-RSV-R4000/dp/B0055EV30W/ref=mp_s_a_1_3

u/BE_chems · 4 points · r/unRAID

That USB PCIe card is pretty amazing!
https://www.amazon.com/Sonnet-Allegro-Pro-PCIe-card/dp/B00XPUHO10?th=1

Not cheap, but a cool find!

But I can't see myself dropping $2000 on a CPU...

u/jdrtechnology · 1 point · r/unRAID

I recently put in an LSI card to attach 8 HDDs to my array (I have 5 SSDs attached making up my cache - not ideal, but I had the parts, so... ;-). It worked out of the box: no flashing, no updates. I ordered mine from Amazon.com. It was $75, but I did not want to risk it, as this is my server (worth the extra $25 to me for simple peace of mind).

https://www.amazon.com/gp/product/B0085FT2JC/ref=ppx_yo_dt_b_asin_title_o04_s00?ie=UTF8&psc=1

Combined that with the splitter cables (I used these: https://www.amazon.com/gp/product/B07CKX6HVV/ref=ppx_yo_dt_b_asin_title_o03_s00?ie=UTF8&psc=1 ) and I have had 0 issues.

It was by far the most highly recommended card, and I did not want to deal with a bunch of random issues to save $25.

u/jaxder_jared · 1 point · r/unRAID

I've got a Thor V2, and when I move my unRAID system into the case I'll be adding two Rosewill HDD cages (second link). This will put me at a capacity of 10 drives. The case is also HUGE: it has plenty of room for two AIOs (140 back, 240 top) and as many PCI cards as you want.

My case: https://www.newegg.com/p/N82E16811147053

HDD Cage: https://www.amazon.com/Rosewill-5-25-Inch-3-5-Inch-Hot-swap-SATAIII/dp/B00DGZ42SM

u/bu2d · 2 points · r/unRAID

I use this case but there are a few similar ones with different options for drive bays.

https://www.amazon.com/Rosewill-Rackmount-Computer-Pre-Installed-RSV-R4000/dp/B0055EV30W/ref=mp_s_a_1_5?keywords=rosewill%2Bcase&qid=1562677652&s=gateway&sprefix=rosewill&sr=8-5&th=1&psc=1

Your NVMe drive can be passed through as an unassigned drive. I do this with the two VMs that I run. I also have Plex and everything else running on it.

A major factor to consider is the CPU and how it's set up. If you don't properly isolate the cores needed for your VM, Plex will cause lots of lag due to CPU spikes. If you have a newer 4-core/8-thread processor, 4 threads for the VM and 4 for Plex and unRAID should be fine. I wouldn't try this on anything less. I had my VMs on an FX-8350 for a while and it worked, but I would get random lockups.

My current setup is:

Ryzen 7 2700x

32GB Ram

2x 240GB SSD cache drives (one will work fine)

2x 500GB SSDs passed through to the VMs (one for each VM)

2x GTX 1650 (one for each vm)

12 other drives for storage


I have 3 cores/6 threads assigned to each VM and 2 cores/4 threads for Plex, unRAID, and everything else. Each VM has also been given 12GB of RAM.
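For reference, the core isolation described above comes down to a single kernel parameter. A minimal sketch of the boot stanza in /boot/syslinux/syslinux.cfg, assuming a 2700X (threads 0-15) with threads 4-15 reserved for the two VMs -- the exact numbers depend on your CPU topology, and recent unRAID releases can manage pinning and isolation from the web UI instead:

    label unRAID OS
      menu default
      kernel /bzimage
      append isolcpus=4-15 initrd=/bzroot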

It’s a fun project that never seems to end as I can always find something new that I want UnRaid to do.

u/sureguy · 7 points · r/unRAID

Generally, when people are discussing USB passthrough, they're passing through the controller so that it is transparent to the guest OS (the guest OS is responsible for drivers, etc., and has direct hardware access). For hot-plug to work, the controller would need to be passed through.

Any hub/extender that connects to a USB port you choose to pass through would be passed through in its entirety to a single guest OS.

There is this card, which has a separate controller for each port, meaning you could have 4 VMs, each with its own host controller:

https://www.amazon.com/Sonnet-Allegro-Pro-PCIe-card/dp/B00XPUHO10?th=1

Then you could add a hub to each of the ports if you want more devices connected.
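For the curious, passing a whole controller like this boils down to a PCI hostdev entry in the VM's libvirt XML, which unRAID generates when you select the controller in the VM settings. A minimal sketch, with 03:00.0 as a made-up example address:

    <!-- Pass the entire USB controller at (assumed) PCI address 03:00.0 -->
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <source>
        <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
      </source>
    </hostdev>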

u/porksandwich9113 · 2 points · r/unRAID

Make sure it's a crimped one, not molded. The QC issues with molded connectors have led to shorts/burns/fires for people in the past.

Example of a crimped one.

Example of molded.

u/gilahacker · 1 point · r/unRAID

I'm using two of these, myself:

https://www.amazon.com/gp/product/B0085FT2JC

They work great with my 4 and 10 TB HGST NAS drives, but I did have a problem with my Samsung 850 EVO SSD. There is a firmware update available for them that I haven't tried yet (I just moved the EVO to on-board SATA ports and it's fine).

Edit: You'll need cables like these (it doesn't come with them): https://www.amazon.com/gp/product/B013G4EMH8

u/KorYi · 2 points · r/unRAID

I use an Inateck KT5001 (5x USB 3.0, https://www.amazon.com/Inateck-Express-15-Pin-Connector-KT5001/dp/B00FPIMJEW ), and it works like a dream (I've got an Xbox/Steam controller and an Oculus Rift on it). This is on my main rig running Arch, but it shouldn't be any different on unRAID.

u/phenger · 1 point · r/unRAID

Just rebuilt my server in a Phanteks Enthoo Pro (http://www.phanteks.com/Enthoo-Pro.html) case and love it. If you get the Rosewill cage (https://www.amazon.com/dp/B00DGZ42SM/ref=cm_sw_r_cp_api_i_16mjDb80Q86DD), it'll hold 10x 3.5" drives and has separate mounts for up to 4 SSDs. It also fits a Noctua air cooler no problem, same with massive cards.

u/Liwanu · 2 points · r/unRAID

I use the Mellanox ConnectX cards; they are cheap and work with almost all Linux distros and with Windows without any special drivers.
You should be able to use a DAC (direct attach cable) between your server and desktop machine for up to 10 Gbps transfer speeds.
I haven't done it personally, but it should work.
There are also a few cheap switches out there with 10Gb capabilities.
https://www.amazon.com/MikroTik-CRS305-1G-4S-Gigabit-Ethernet-RouterOS/dp/B07LFKGP1L
https://www.youtube.com/watch?v=MDiiHN0MPdA