Best products from r/unRAID

We found 76 comments on r/unRAID discussing the most recommended products. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 178 products and ranked them based on the number of positive reactions they received. Here are the top 20.

Top comments mentioning products on r/unRAID:

u/Timboman2000 · 7 pointsr/unRAID

Sorry in advance if this is a bit of a spammy post; it's been growing with each topic I put it in as I assemble more and more info for people.

I've posted about this in a couple of different threads, so I'll just copypasta some of it here.
The cost for my setup, not including the drives (of which I had quite a few lying around from other builds) and an unRAID Pro license, was about $800 altogether.

(UPDATE 11/27/17 - Prices have fluctuated a bit higher since it seems that these setups are in somewhat high demand right now. That may or may not be due to me posting this info in several places for people, but hopefully my attempts to help people aren't pricing this setup out of their reach)

It can do pretty much everything except maybe live TV PVR, but that's only because of a limitation in the Plex unRAID Docker itself. So if you plan on running Plex in the Docker, that's going to be the case no matter what hardware you end up running it on.

I have about 40 friends around the globe who regularly access my server and the only real bottleneck I've encountered is my upload speed when too many streams are pushing out at once.

__

You can make a good unRAID rig for FAR cheaper if you simply use older server components, for example:

SuperMicro X8DT3-LN4F Motherboard ($89.99) ($115.99 - as of 11/27/17)

2x Intel Xeon X5650 LGA1366 CPUs ($43.48 each) ($40.00 each - as of 11/27/17)

EVGA SuperNOVA 650 Watt 80 Plus Gold Modular ATX Power Supply ($79.99) ($93.57 - as of 11/27/17)

Some DDR3 ECC Server RAM, Usually pretty cheap ($24.00) ($30.00 - as of 11/27/17)

Then you just need any EATX-compatible case, any two LGA1366 coolers, and any drives you want. Altogether you're probably looking at no more than $600-700 for a system that will likely perform the same if not better than the setup you posted, but will have 12 cores (24 hyper-threaded) @ 2.66GHz.


I should point out that I ALREADY have this style of setup working with unRAID, so this is not theory but a proven concept. I found as many of the original sources that I used as I could, but I made this a while ago, so not all were current. Either way, though, the price for this kind of setup only tends to go DOWN over time, so it will only get easier to put together. Heck, I've seen some sales of the X8DT3-LN4F motherboard that come with RAM & CPUs already, so you might be able to pick up a complete setup for about the same cost as getting it piece by piece.

__

The only thing I would really add to the above is that the SAS module on that motherboard actually has issues operating drives larger than 2TB, so if you want to add more than the 6 SATA 3TB+ drives the mobo natively supports, you're going to need a PCIe SATA expansion card, such as This One that I ended up using.

I've also had some boot issues with the SAS function enabled in general, so if you do end up going with the above board, I would just leave it disabled.

Also, I would highly recommend watching most of Space Invader One's unRAID tutorial videos, especially the ones about Docker CPU pinning and optimization plugins.
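
Those pinning videos boil down to CPU affinity, which unRAID manages through its GUI; for the curious, the same settings can be inspected or tweaked from the console. A rough sketch (the VM and container names here are hypothetical):

```
# show the current vCPU-to-host-core pinning for a VM
virsh vcpupin "Windows10"

# pin vCPU 0 to host core 2 and vCPU 1 to host core 3
virsh vcpupin "Windows10" 0 2
virsh vcpupin "Windows10" 1 3

# Docker containers can be pinned similarly (unRAID passes --cpuset-cpus)
docker update --cpuset-cpus="2,3" plex
```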

__

On my current setup I am running:

Dockers --
Plex, PlexPy (Plex Statistics and Notifications; I'm using it to run a Discord Bot that announces to friends when new things are added on the server), Ombi (Plex Request System), Radarr, Sonarr, Jackett (Lets Radarr & Sonarr search private & public torrent trackers automatically), Deluge (BitTorrent Client), OpenVPN (For secure remote access over VPN) & Krusader (File Manager, Booted on Demand)

VMs --
Windows 8.1 VM (As a VNC GUI remote interface and to run ExtractNow to automatically deal with RARed or zipped media torrents) & Windows 10 VM (Passing through a GTX 970 and being used as a Steam In-Home Streaming client. Hooked directly up to a 4K TV so I can stream games from my main high-end gaming rig to my TV. Booted on Demand)

Plugins --
A Bunch from the Community Applications Suite (Auto Turbo Write Mode, Auto Update Applications, Backup/Restore Appdata, Cleanup Appdata, Config Editor), Several Dynamix Plugins (Cache Directories, File Integrity, SSD TRIM, System Information, System Statistics), Fix Common Problems, Nerd Tools, Tips and Tweaks, Unassigned Devices, unBALANCE, User Scripts

Drives --
Nine 3TB HDDs (1 For Parity & 8 For Storage), One 120GB SSD (Cache)

__

Looks like you can get a refurbished X8DT3-F for about $120. The only real difference between the X8DT3-F and the X8DT3-LN4F is whether they have 2 or 4 Gigabit Ethernet ports.

I actually have all 4 of mine connected to a high-speed switch and then into my router as a load-balanced bond (effectively getting 4x Gigabit speeds, at least within my LAN, which IS useful when streaming 4K games from my gaming PC to the server's client VM), but for most setups both boards are effectively equivalent.
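
For anyone replicating the 4-NIC bond, the state of the aggregate link is easy to check from the unRAID console. A quick sketch, assuming the default bond0 interface name:

```
# shows the bonding mode (e.g. balance-rr or 802.3ad) and each slave NIC's status
cat /proc/net/bonding/bond0

# one-line summary of the bonded interface
ip -br link show bond0
```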

u/sirastrix · 2 pointsr/unRAID

Story Time


Initially, I started with this case ( https://www.amazon.com/gp/product/B00Q2Z11QE ) as I was thinking of throwing something together like what you're talking about. Then my "project" began to grow.

That's when I ended up ordering this case instead ( https://www.amazon.com/gp/product/B005KQ66ZC ). That said, my server consists of a Threadripper 2990WX with an AIO water cooler. Well... this case wasn't made for that. So my father-in-law machined a hole in the top to mount the radiator on top of the case like a blower on a car. This worked VERY well for a couple of weeks, but I just wasn't happy with it.

Finally, I ordered this case ( https://www.amazon.com/gp/product/B0091IZ1ZG ), which I was able to fit everything inside of, with a few extra bolts that still need to be trimmed. Here's a pic of the inside of mine and the temp 32 cores run at ( https://imgur.com/tek9ID0 - https://imgur.com/vEPFLv5 ); do excuse the dust.


As far as SSDs go, just do something like this ( https://www.amazon.com/gp/product/B00GMGZBP0 ). Saves space and doesn't hurt them, as they only take a single HDD slot. Taping them to the side of the case doesn't hurt either if you don't care about the looks. Also, I want to boast about these fans for a minute ( https://www.amazon.com/gp/product/B00KFCRF1A ). They move a lot of air and aren't as loud as you'd think. The 120mm variant is a good bit louder, but still well worth it.

u/bobby-t1 · 1 pointr/unRAID

Thanks, this is great info. A few things:

>1 - CPU pinning, which is what you are doing, is not dedicating cores to the VM. The Linux scheduler (unRAID) is free to run other tasks on those cores if it needs to, but it will only run the VM processes on those 2 cores. If you truly wanted to dedicate those 2 cores to the VM, you would isolate them, then pin the VM to those cores. RAM is treated like any other process's RAM on Linux, and as long as there is space, it won't page it out to disk.

Sounds like there's no real disadvantage, then, to me upping the cores. Because if the VM isn't under load (which it won't be, since I'm using it to remote into for desktop tasks, not long-lived hardcore jobs or intense gaming), the cores are available for the server to use for other things outside the VM. Do I have that right?
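
For reference, the isolate-then-pin approach described in the quote above looks roughly like this on unRAID. Older builds required a hand edit to syslinux.cfg, while newer ones expose isolation in the GUI; a sketch, with the core numbers being illustrative:

```
# /boot/syslinux/syslinux.cfg (excerpt)
# isolcpus keeps the Linux scheduler off cores 2-3 entirely,
# so nothing but the VM pinned to them ever runs there
append isolcpus=2,3 initrd=/bzroot
```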

>3 - I would start with adding a cheap GPU and passing it through. You may need a dongle to trick it into thinking there is a monitor connected if you are purely going Remote Desktop though.

I have a Supermicro X10SLM-F, which has 3 PCIe slots: one PCIe 2.0 x8, one PCIe 3.0 x8, and one PCIe 3.0 x16.

In the PCIe 2.0 x8, I have an LSI 9201 SATA card.

In the PCIe 3.0 x16, I have the Nvidia P4000 GPU for Plex hardware transcoding.

So this leaves the PCIe 3.0 x8. Any suggestions for what kind of GPU to get? A quick look on Amazon came up with these:

MSI GeForce GT 710 2GB ($45): https://www.amazon.com/MSI-GT-710-2GD3H-LP/dp/B01AZHOX5K/r

Zotac GeForce GT 710 2GB ($80): https://www.amazon.com/ZOTAC-GeForce-Express-Graphics-ZT-71115-20L/dp/B00R5UW038/

Gigabyte GeForce GT 710 4GB ($55): https://www.amazon.com/Gigabyte-GeForce-Graphic-Interface-GV-N710D5-2GL/dp/B073SWN4ZM/

>From there I would look at a new CPU/Motherboard. If you are wanting more VM's, now is a great time with AMD. You could triple your core count with a Ryzen 3900x, wait for the 3950x and quadruple it, or wait even a bit longer and get 24+ cores with Threadripper 3 that is expected soon.

I'll start with the GPU first like you suggested and go from there. I considered Ryzen originally when I did some work to upgrade my server recently, but opted to stay with my current mainboard, as I read there were issues with Ryzen motherboard BIOSes and unRAID. Is this resolved?
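
Whichever GT 710 wins out, passthrough generally starts with stubbing the card so unRAID's host driver leaves it alone. A sketch; the vendor:device ID below is an example for a GT 710, so verify it against your own lspci output:

```
# find the card's vendor:device ID
lspci -nn | grep -i nvidia

# /boot/syslinux/syslinux.cfg (excerpt): bind that ID to vfio-pci at boot
append vfio-pci.ids=10de:128b initrd=/bzroot
```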

u/darkslyde27 · 2 pointsr/unRAID

I/O Crest works wonders: it's an x1 card with 4 SATA III ports that can be had for $35. I/O Crest SI-PEX40064. I'm sure you can find them cheaper.


They are also known as SYBA SI-PEX40064, a.k.a. IOCrest IO-PCE9215-4I
(from the unRAID HW compatibility list: 4 ports, PCIe x1, SATA III, Marvell 88SE9215, bootable, working out of the box, supports drives > 2.2 TB)


I use that on my low-power box with four 2TB WD Greens and don't have any issues. If you want to go with something better, a SAS2008/LSI 9201/9211 HBA card in IT mode is the clear-cut winner for ease and compatibility. Cons: they're a little more expensive ($65 + the price of cables).
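
If you go the HBA route and want to confirm a card really is running IT-mode firmware, LSI's sas2flash utility will report it. A sketch, assuming the utility is present on the box:

```
# list every SAS2008-family controller the tool can see
sas2flash -listall

# detailed info for controller 0; IT-mode cards report an IT firmware product
sas2flash -list -c 0
```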

u/IMI4tth3w · 6 pointsr/unRAID

I just went through this nightmare.

My setup uses a Supermicro X9DRi-F that "DOES" support bifurcation.

So mistake number 1 was ordering this ( https://www.amazon.com/QNINE-Adapter-Controller-Expansion-Profile/dp/B077YHFJZM/ ) cheap adapter, assuming it would work with 2x NVMe SSDs. It supports one SSD as NVMe and the other as SATA via the built-in SATA port. Do not get this.

So okay, I figured out what I did wrong, and found that Supermicro makes a simple dual NVMe adapter that should work with bifurcation, no problem! The part number is AOC-SLG3-2M2: https://www.supermicro.com/en/products/accessories/addon/AOC-SLG3-2M2.php

What I didn't notice is that the X9 boards are not on the supported list. But looking through the PDF manual and in my BIOS, I found that it does support bifurcation.

Welp, I could never get it working. I looked through some sites that did custom BIOSes, but most of those guys were trying to boot from NVMe, not just use the drives as additional storage.

So I gave up and am now using BOTH cards with one NVMe SSD installed in each. If I were to do it again, and if you can spare the extra PCIe slots, I'd just get 2 of the cheap NVMe adapters and call it a day.

I also get a lot of notifications about the NVMe "overheating", so some heatsinks aren't a bad idea, and you might also want to turn the notification threshold up a bit. NVMe drives will be okay for a little while at warmer temps, but the heatsinks are actually nice for extended loads.
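
Before and after adding heatsinks, the drives' own temperature readings are easy to pull from the console. A sketch; smartctl ships with unRAID, while nvme-cli here is an assumption, and the device name may differ on your box:

```
# SMART data for the first NVMe drive
smartctl -a /dev/nvme0 | grep -i temperature

# or with nvme-cli, if installed
nvme smart-log /dev/nvme0 | grep -i temperature
```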

TLDR: save yourself some headache and just get 2x of these https://www.amazon.com/GLOTRENDS-Adapter-Aluminum-Heatsink-PA09_HS/ if you can spare the PCIe slots.

u/ClintE1956 · 5 pointsr/unRAID

+1 for LSI SAS92xx-8i HBA cards; they work great with unRAID. I purchased mine from eBay seller theartofserver, who flashes them to IT mode and thoroughly tests all ports. He has lots of informational YouTube videos about these cards and others. Sometimes you can even find a 9201-16i card like this:

https://www.amazon.com/gp/product/B07JFFSZ1M/ref=ppx_yo_dt_b_asin_title_o03__o00_s00?ie=UTF8&psc=1

If you need more than 8 (or 16) drives, or want to expand later, get an HP expander card; they're very inexpensive.


Get 2x SFF-8087-to-SFF-8087 cables to connect the two cards together; then you can connect up to 24 drives to the expander card using SFF-8087-to-SATA forward breakout cables.

If you're using SSDs for cache and/or spinning drive(s) for parity, connect them to your motherboard's SATA3 connectors so those drives can negotiate at up to 6Gb/s. Normal SATA drives connected to the HP expander only negotiate at 3Gb/s, common SATA2 speed. You don't need more than that for data drives.
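
An easy way to confirm what each drive actually negotiated (a sketch; the drive device name is illustrative):

```
# the kernel log records the negotiated speed per port,
# e.g. "SATA link up 6.0 Gbps" vs "SATA link up 3.0 Gbps"
dmesg | grep -i 'SATA link up'

# or check a single drive via SMART
smartctl -a /dev/sdb | grep -i 'SATA Version'
```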

The HP expander card doesn't need to be connected to your motherboard if you're short on PCIe slots. All it needs is PCIe power, so you can use one of the PCIe riser cards that cryptocurrency miners use. Purchase one that has a power connector that fits your needs; I use one with a PCIe 6-pin connector of the kind used for video cards, since most newer power supplies come with extra cables of that type.

u/stoploafing · 2 pointsr/unRAID

I have a travel setup that I take with me; it consists of a small router that connects to my VPN at home and bridges the local WiFi (or lets me plug in directly), plus a Synology DS416slim. This all goes into a case with my DSLR gear.

It covers all the bases: it lets me sync all my cloud accounts and gives me a secure connection to the home network, where I can check on the servers, security cameras, etc.

I've tried going the home-build route before, but was never able to get something as robust and "slick".
This is the latest router that I'm using - https://www.amazon.ca/gp/product/B07GBXMBQF/ (not an affiliate link)

u/Something_Funny · 1 pointr/unRAID

I put together almost the exact same build a year or so ago to replace my Drobo. I like your case selection better than mine. The only thing I might suggest is springing for an i5 if you're going to be transcoding multiple streams.

I recently decided to add more HDDs to my build and ran out of SATA ports. I expanded with this. Good luck!

u/Caldorian · 2 pointsr/unRAID

Depends on what you have available. Easiest would be to copy the data across the network. But it will take a while, and you risk losing progress if the network drops (take a look at using rclone to mitigate this).

If you're going to put the drives in the array anyway, I'd suggest putting the old drives in the system but DO NOT assign them to the array. Instead, take a look at the Unassigned Devices plugin. This will allow you to mount the drives in the unRAID OS separately from the array and access the data on them. Then you can use something like Krusader, or just the standard command line, to copy the data off the drives to shares created on your array. Once the data has been migrated off, unmount the drives, stop the array, and then add the old drives as new drives to the array. Since you've been using them already, there's no need to run preclear on them to test for premature failure. Instead, once you bring the array back online, you'll be able to access your shares, and unRAID will prep the drives in the background and bring them online when they're ready.
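
For the command-line copy, a minimal rsync sketch, assuming Unassigned Devices mounted the old drive at /mnt/disks/old1 and the destination share is called media (both names hypothetical):

```
# -a preserves ownership, permissions, and timestamps; -h prints human-readable sizes.
# Re-running the same command after an interruption picks up where it left off.
rsync -avh --progress /mnt/disks/old1/ /mnt/user/media/
```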


If you don't want to mount them in your system until you're ready for them, another option is to use an external USB enclosure or dock (e.g. https://www.amazon.ca/Vantec-NexStar-Dual-Drive-NST-D428S3-BK/dp/B01JNLCFQI/ref=sr_1_3?ie=UTF8&qid=1541694017&sr=8-3&keywords=usb+drive+dock) to attach the drives to your system, but still use Unassigned Devices to mount them and access the data.

u/mrbeck1 · 3 pointsr/unRAID

I ordered this one and am pretty happy with it.

Rosewill 4U Server Chassis / Server Case / Rackmount Case, Metal Rack Mount Computer Case support with 15 bays & 7 Fans Pre-Installed (RSV-L4500) https://www.amazon.com/dp/B0091IZ1ZG/ref=cm_sw_r_cp_api_x5PrzbJ9MRYFP

u/Polaris2246 · 1 pointr/unRAID

My buddy and I each built unRAID servers in the past month. He went higher-spec with a Xeon E3-1250v3 and a higher-end consumer motherboard. He's going to get an AMD RX 480 video card for it so he has a second gaming computer for anyone who comes over, plus 16 gigs of ECC RAM. I went more power-efficient and bought a Supermicro board with an Intel Avoton C2750 CPU. It's essentially a server Atom CPU: it uses 20 watts and has eight cores, and I have 16 gigs of ECC RAM too. The motherboard has the features I wanted: IPMI built in, four NICs, and some other stuff. I was worried the CPU would be underpowered, but it packs plenty of power for my Docker containers: Sonarr for automatic TV downloading, CouchPotato, a Nextcloud server, a web server, a MySQL server, a modded Minecraft server, a CrashPlan backup server, and others. I barely hit 30% CPU when everything is running and actually doing something; idle is below 5%. I don't have Plex on it because my Nvidia Shield handles that. It's surprised me a lot how much power it has. If you want gaming it's not for you, but it is more than enough as a file server running those applications and plenty more.

Motherboard/CPU

16GB RAM

SATA Controller Card (needed more sata ports than motherboard had)

Power Supply

[2x SSD for Cache/Pool setup](https://www.amazon.com/gp/product/B01FJ4UN76/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1)

5x WD Red 3TB

Better fans for case

Case (LOVE the case)

u/phenger · 1 pointr/unRAID

Yeah, I use it as my main PC/gaming case and love it. I gutted it for gaming (better airflow and graphics card radiator positioning). I love that they use screws instead of rivets for modularity. I had 0 hesitations buying it again for my unraid rebuild.

As /u/Douglas_D pointed out, there can be some issues with the Rosewill cage I linked. You may consider the Icy Dock (https://www.amazon.com/MB074SP-B-Vortex-Removable-Module-Cooler/dp/B00GSQMYY0) cage instead. Same price and same basic function (minus the hotswap).

Those cages fit into 3x 5.25" bays. I had no issues sliding it into my case. It just...sticks in there. It's recessed into the case a bit but is otherwise solid and stable.

u/mvillar24 · 2 pointsr/unRAID

The question with PCIe SATA cards is how much you're willing to spend and what PCIe slots you have available on your motherboard.

The cheapest I've tried (with the slowest throughput), when you only have PCIe x1 slots free, is a four-port SATA card like this Marvell 88SE9215-based card for $33 on Amazon:
(http://www.amazon.com/gp/product/B00AZ9T3OU)

If you've got at least a PCIe x4 slot, you can get something faster for $100 - $160, such as (note these are 8-port cards):

1. HighPoint RocketRAID 2720SGL 8-Port
2. Supermicro AOC-SASLP-MV8
3. Dell HV52W PERC H310 (used, on eBay)

A number of the above solutions are not as fast as you can go, since they use PCIe x4 slots, but x8 cards can cost a lot more. Personally I don't notice the slowdown much, since I'm really using these drives to stream, and I don't notice that parity checks and moving data from cache to the permanent drives take longer.

u/ClamatoChutney · 2 pointsr/unRAID

https://www.amazon.com/Fractal-Design-Define-Gaming-FDCADEFR5BK/dp/B00Q2Z11QE


unRAID OS is installed on and always runs from the USB drive. It never gets removed and cannot be installed on a hard drive. Just make sure it's a trustworthy brand name and that you can easily read the serial number etched on the metal.

u/Douglas_D · 1 pointr/unRAID

Just a data point from here: I had that Rosewill cage and am moving away from it because if the server is jostled at all, it has the potential to knock one of my drives offline. The connectors don't seem super secure in mine, and any little bump is a potential parity-rebuild scenario :/ I wound up with this Icy Dock cage instead, since I don't really need the hot-swap feature and the connectors go straight into the drive instead of through a backplane. I also get better airflow on the Icy Dock cage.

u/PM_Me_Santa_Pics · 1 pointr/unRAID

I'm fairly certain it's 10Gbit all the way: a Mellanox ConnectX-2 in my PC, one of these SFP+ modules, LC fiber to the other SFP+ in the switch, and this from the switch to the other Mellanox card in my unRAID server.

Oh I know; it's more of a learning exercise with the benefit of at least getting more than 1Gb/s between my PC and unRAID server for copying files.
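
A quick way to verify that end-to-end throughput is an iperf3 test. A sketch; iperf3 on unRAID is an assumption (e.g. installed via a community plugin), and the IP is illustrative:

```
# on the unRAID server
iperf3 -s

# from the desktop; a healthy 10Gbit path should report somewhere around 9+ Gbits/sec
iperf3 -c 192.168.1.50
```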

u/dandruski · 1 pointr/unRAID

I've used [this one](https://www.amazon.com/dp/B00FPIMICA/ref=cm_sw_r_cp_api_JfQ.BbX8YEN77) (the Inateck Superspeed 7-Port PCI-E to USB 3.0 Expansion Card, KT5002, with 5 internal and 2 rear USB 3.0 ports and a 15-pin SATA power connector) without issue in my Win 10 gaming VM, and it's also natively recognized in my macOS VM. There is a 4-port version as well.

u/Cebb · 5 pointsr/unRAID

I set up a 10 Gbps backbone for my home network this year, with three 10-gig devices connected to it: a FreeNAS server, an unRAID server, and one Windows desktop. I don't use pfSense, so you should double-check that pfSense includes drivers for the cards you pick, or you could be in for some pain.

While you can achieve 10 Gbps over quality copper network cables, I went with fiber optic. Fiber optic networking has been around for a long time in many forms, so there are a lot of standards. There are two main types of cable, multimode and single mode, and the cable type must match the fiber optic transceivers you use on each end. Then there are different grades of cable: OS1 and OS2 for single mode, and OM1, OM2, OM3, and OM4 for multimode; higher numbers indicate better cable quality. Read up on the limitations of each. Finally, there are a bunch of different connector types. LC is the most common from what I have seen. There are actually two kinds of LC, and one of them has an angled end, but those are a lot less common than ends that are cut off at 90 degrees. I'm not really clear on why two kinds exist.


ANYWAY these are what I bought and they all work fine together:

8x transceiver: https://www.ebay.com/itm/Finisar-FTLX8571D3BNL-10GB-SFP-SR-850nm-Transceiver/173943155751?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

2x NIC card: https://www.ebay.com/itm/Mellanox-MHZH29-XTR-ConnectX-2-VPI-Standard-Profile-Network-Adapter/333292618107?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

1x NIC card: https://www.ebay.com/itm/MHZH29-XTR-MELLANOX-CONNECTX-2-VPI-DUAL-PORT-NETWORK-ADAPTER-CARD/223585259766?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649

1x switch: http://amzn.com/B0723DT6MN

1x switch: http://amzn.com/B07LFKGP1L

1x long armored cable (Multimode, LC-LC duplex, OM3): http://amzn.com/B07JHKKCVY

Plus a bunch of different length patch cords (Multimode, LC-LC duplex, OM3) from fs.com


I specifically chose new Mikrotik switches as opposed to buying older used enterprise switches because the price difference isn't that great, and the Mikrotik switches are fanless.

Saved a boatload of money buying used NICs, and quite a bit buying used fiber optic transceivers. New 10GbE transceivers can easily run $20+ each, and new NICs can easily be $100+ USD each.


Total cost was still a few hundred USD, but that is a LOT lower than it could have been!

u/ceyoung75 · 1 pointr/unRAID

It may be too late, but on my third server I use this:

https://www.amazon.com/Inateck-Superspeed-Ports-PCI-Expansion/dp/B00FPIMICA/ref=sr_1_3?keywords=USB+3.0+PCIe+adapter&qid=1554238644&s=gateway&sr=8-3

It may not be what you're after, but it allows me to set up each VM with a dedicated external USB slave drive.

u/zSars · 1 pointr/unRAID

I know for a fact that this one works:

https://www.amazon.com/gp/product/B00AZ9T3OU/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


and this one does not work

https://www.newegg.com/Product/Product.aspx?Item=N82E16816132018

Hope this helps

edit: Pretty sure the Marvell chipset makes the difference

u/eyecrax · 1 pointr/unRAID

This only has 15 proper bays, but you could probably shoehorn a few extra in there if you're creative. It is technically a rackmount chassis, but it can be put on its side if you feel so inclined. https://www.amazon.com/Rosewill-Rackmount-Computer-Pre-Installed-RSV-L4500/dp/B0091IZ1ZG

u/CuedUp · 2 pointsr/unRAID

Before I switched to a Rosewill RSV-L4500, I was using an Azza Solano 1000R full tower case. It had a ton of 5.25" bays, and I used some cheap Cooler Master 4-in-3 bays to stuff it full of drives. It worked fairly well and I didn't need to modify the case at all. This was handy because I reused it down the line after migrating unRAID to the Rosewill.

The Rosewill case was the cheapest rackmount case ($80) I could get that fit my drives. I have considered upgrading to a hotswap-type of case like the Norcos but so far it has been more economical to just upgrade my drives to larger capacity rather than expand my capacity to hold drives. I swap drives so rarely that the hotswap feature isn't necessary. The Rosewill is annoying to work with when I have to swap a drive though (and I've removed the center partition).

u/Sl0rk · 3 pointsr/unRAID

This is the one I use and it works fine - https://www.amazon.com/IO-Crest-Controller-Non-Raid-SI-PEX40064/dp/B00AZ9T3OU/ref=sr_1_3?ie=UTF8&qid=1543180924&sr=8-3&keywords=IOCrest+SI-PEX40064

Although it costs a lot more than it should for some reason. I paid $15 for it on Newegg.

u/Kelarik · 2 pointsr/unRAID

I currently have three 4-in-3 bays - https://www.amazon.com/MB074SP-B-Vortex-Removable-Module-Cooler/dp/B00GSQMYY0/ref=sr_1_10 - with a fan controller (which is just filling a gap in the case) and a 4x 2.5"-in-1 for cache SSDs, but I'm up to eleven 3.5" drives, so it's pretty close to full capacity unless I start swapping out drives for more expensive models :)

u/manifest3r · 2 pointsr/unRAID

I have a 4-port PCIe expansion card and am using it without any issues on two 1TB drives and one 500GB drive. The model number is SI-PEX40064.

https://www.amazon.com/gp/product/B00AZ9T3OU/

u/GT_YEAHHWAY · 1 pointr/unRAID

I have this adapter and it doesn't show up in the BIOS on my B450M board.

Should I get a riser then switch adapters?