Reddit mentions: The best raid controllers

We found 344 Reddit comments discussing the best raid controllers. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 82 products and ranked them based on the amount of positive reactions they received. Here are the top 20.

7. Supermicro AOC-SLG3-2M2 PCIe Add-On Card for up to Two NVMe SSDs

  • Height: 0 inches
  • Length: 5.24 inches
  • Width: 2.71 inches
  • Weight: 0.2 kilograms
  • Number of items: 1

🎓 Reddit experts on raid controllers

The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes their comments received in subreddits where RAID controllers are discussed. For your reference, and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
Rank | Total score | Number of comments | Relevant subreddits
1 | 26 | 3 | 2
2 | 12 | 5 | 1
3 | 10 | 4 | 1
4 | 8 | 5 | 3
5 | 6 | 4 | 1
6 | 6 | 3 | 2
7 | 5 | 3 | 3
8 | 4 | 4 | 1
9 | 2 | 2 | 1
10 | 2 | 2 | 1


Top Reddit comments about RAID Controllers:

u/dragontamer5788 · 5 pointsr/hardware
  1. ECC support -- If the RAM of your QNAP gets corrupted, your data can be corrupted in transit. By buying ECC memory, I virtually guarantee that this will not happen to me. (ECC RAM is similar in concept to RAID 5 or RAID 6, except it protects RAM instead of disks.) Because the entire computer I built uses ECC RAM, I have one more layer of assurance that the data is safe.

    Update: ECC support is unconfirmed -- error correction does not work on this motherboard as I had hoped.

  2. ZFS Support -- ZFS is an enterprise filesystem designed to store data and store data well. Bitrot can destroy your data EVEN if you are running RAID drives. By using ZFS (which is constantly scrubbing, checksumming, and double-checking the data), my system is immune to bitrot. Your typical NAS is not.

  3. The Motherboard immediately supports 6 hard drives. The QNAP only supports 2-drives. In the future, when I buy more drives, I can easily expand my computer. The QNAP is stuck with 2-bay at the maximum.

  4. I'm comfortable with FreeBSD -- This is a soft advantage, but I work with Linux systems at work (and Windows at home and work). So I'm very comfortable with tools like RSync and the command line in general. In any case, I have a clear backup strategy for the NAS: insert an external hard drive (probably NTFS formatted) and then RSync the data to the hard drive, and then store the hard drive elsewhere.

  5. ZFS Snapshots -- ZFS has a lot of advantages. Another major advantage that I plan to take advantage of is snapshots. The entire disk can be stored as a snapshot that only takes up space when files are modified. With ZFS Snapshots, I can rollback the filesystem very easily.

  6. I have a full PC -- This box is a fully functioning PC. If I decided to splurge, I can buy a SAS Card and then start chunking out LTO6 tapes (Which are only $30 for 2.5TB of storage). Granted, a LTO6 Tape Drive is extremely expensive, but a "full PC" has almost no limit to the customization options available to me. A more realistic option is to just buy a cheap expansion card and support maybe... 4 more hard drives in my case for only a $40 upgrade.


    So basically, my points come down to:

  1. Reliability (ECC RAM)
  2. Reliability (ZFS protection vs. bitrot)
  3. ZFS snapshots and cloning
  4. Expandability (6 SATA drives easily; more with a cheap expansion card)
  5. Familiarity: I personally know the *nix command line and can comfortably do advanced tasks on NAS4Free beyond what is even available in the WebGUI.

    Bitrot is a very simple problem to understand: what happens if, instead of failing outright, a hard drive starts returning bad data to you? In traditional RAID the drive has NOT crashed, so parity is never checked, and a file can be corrupted despite RAID "protecting" you. ZFS adds extra checks to catch this problem; traditional RAID (which most NAS boxes use) does not.
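The bitrot scenario above can be sketched in a few lines: store a checksum alongside each block and verify it on every read, as checksumming filesystems do (a minimal illustration in Python, not ZFS's actual on-disk mechanism; ZFS uses fletcher4 or SHA-256 checksums stored in its block pointers):

```python
# Sketch: detecting silent corruption by keeping a checksum with each block.
import hashlib

def write_block(data):
    return {"data": bytearray(data), "sum": hashlib.sha256(data).hexdigest()}

def read_block(block):
    # Verify the stored checksum on every read, ZFS-style.
    if hashlib.sha256(bytes(block["data"])).hexdigest() != block["sum"]:
        raise IOError("checksum mismatch: silent corruption detected")
    return bytes(block["data"])

blk = write_block(b"important data")
blk["data"][0] ^= 0x01          # flip one bit: the drive returned bad data
try:
    read_block(blk)
except IOError as e:
    print(e)   # plain RAID would have returned the corrupt bytes silently
```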

    There are additional features that are interesting (Webserver support, Bittorrent support, DLNA server, Headless Virtualbox). But I don't plan to do anything complicated. So I'm mostly focused on reliability.

    Of course, NAS4Free supports the standard NAS features. You can easily add hard drives to zpools which can then be added to datastores. Volumes can be exported with iSCSI. Datastores can be exported using CIFS / Samba for Microsoft support, NFS for Linux Support, AFP support to support Mac OSX... or all three if you got a complicated setup. QNAP, Synology and all the commercial solutions will get you at least this much, which is hugely useful.
u/winglerw28 · 3 pointsr/homelab

I have a DL380G6 myself with:

  • (2) Xeon X5670 Processors
  • (12) 4GB DDR3 ECC Memory Modules
  • (6) 450GB SAS 15k RPM Drives
  • (2) 450W HP PSUs
  • (1) P410i SAS/RAID Controller w/ 512MB Cache

    It's a good machine that should serve you well, but HP is a pain in the rear end about their updates because you need to register with them. This advice is based on my experience from the past month, which is as long as I've owned the machine thus far (for whatever size grain of salt you wish to take my advice with):

  • Here is another post with the magnet link to the Proliant service pack; please leave that guy an upvote, because I was losing my mind trying to get HP to play nice and let me download it for a registered server, let alone one from a source I couldn't verify

  • I put links to all the parts I used to upgrade the server above. You'll find even better deals on eBay than Amazon, but eBay links seem like a poor choice since they will be less useful when the listing ends.

  • I chose those HDDs literally because they are the official HP drives specified in the later revisions of the DL380G6's configuration options.

  • The PCIe slots are rated for 150W, but server machines can be fickle when it comes to graphics cards. Whether or not the GPU will work for you is definitely not something I'd be super optimistic about. Also, putting anything in the PCI/PCIe slots causes the fans to max out and they are LOUD in that state.

    AAAAAAAND, since I'm rambling as it is, I might as well give some of my opinions:

  • Use Proxmox VE or Xenserver if you don't want to pay for ESXi; without a VMWare subscription you will be limited in what hardware you can provide per VM. I installed all three recently and am currently happy with Proxmox VE 5.0 (beta, I think?)

  • Learning how to use the ILO2 port might be useful, though I have found I haven't needed mine for anything really

    Hopefully this post was helpful for you and you can benefit from my foray into this. If anything, don't be afraid to experiment if you aren't sure about something and have fun! ;)
u/JamesGibsonESQ · 3 pointsr/DataHoarder

The answer to these is unfortunately hours of information... To sum up as best I can, you can run these (or any server or home setup) in several ways... I go with JBOD plus a backup; JBOD is Just a Bunch Of Disks... It's like certain RAID setups in that the drives all get pooled into one virtual mega-drive... You can also have disk redundancy with a more traditional RAID setup, where disks are mirrored and checked to make sure no bits get corrupted... Both JBOD and RAID can be supported by these boxes, but before you take this leap, I'd suggest building a test box first... Use either your current motherboard or a different computer, it doesn't matter... Most current boards support RAID, so you can play around with the onboard SATA ports to test different RAID setups... Grab some cheap 500GB drives (5, to test all RAID modes) and see how striping can speed up your file-copy speeds dramatically, and how RAID redundancy works... Striping and mirroring together would be a RAID 10 setup, but there are many...
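The redundancy idea sketched above can be made concrete: in a RAID 5-style layout, a parity block is the XOR of the data blocks, so any single missing block can be rebuilt from the rest (an illustrative Python sketch, not how a real controller is implemented):

```python
# RAID 5-style parity in miniature: parity = XOR of all data blocks,
# so if any one block is lost, the survivors XOR back to the missing data.
from functools import reduce

def xor_blocks(blocks):
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]   # data striped across three "disks"
parity = xor_blocks(data)            # written to a fourth "disk"

# "Disk" 1 dies; rebuild its contents from the survivors plus parity:
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == data[1]
```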

WHILE you do that, what I'd truly suggest is to get a sas hba controller card... With this, you can also get expander cards to open up 4-40 new sata connections... Sata drives take little power, 1-5w idle and maybe 20-30w max when spinning, so your power supply doesn't need to be a huge wattage...

This way, you can truly learn all the things the Storinators can do, and it's surprisingly easy... All for a combined 300-600 dollars worth of gear to start you off... These professional builds are for mega companies that just need the numbers and have more money than sense, or for those who don't get computers but need the tech... You can max out a system with 4 of these cards, and that's 160 SATA ports (each card supports 40 SATA drives)... That and a 1500-2000W PSU and you have a makeshift Storinator... As long as you're not accessing all drives at the same time, this will work... There is NO easy answer for the best way to store data... If you want the most overkill, get a bunch of those Rocket 750 cards I posted, set up a RAID 10 or JBOD with parity check, then double that on a backup system... Then also invest in a Google $10-a-month cloud account and back it up online... It's part of the "3-2-1" backup strategy...

At risk of making a word wall, to answer sas or sata, sas drives are faster enterprise drives... Not needed for our needs, but they are better built if you want more overkill... The BEST thing about sas is, it's compatible with sata drives... That's a one way thing; sata controllers can't read sas drives, but sas controllers can read sata... And each sas port can break out to 3-4 sata ports, hence why the rocket750 can do 40 sata drives... It's also safe to use sata power splitters if you need the extra connections, however stay away from molded connectors... You'll want the kind that look like they snapped together... Yeah, it's a lot to take in, but the amount of choices is silly...

I'd say get a big ass case, or even a 4U rack, no tower or cage needed as you can rest a 1-4U rack just like a computer case, and put in any motherboard and cpu, but focus on maxing out pcie lanes... Get as many sas hba and expander cards as you want and skip on a graphics or sound card... Run full onboard ... Stay away from thunderbolt connections as well as they use pcie lanes... Heck, run with no video at all and just admin it remotely... Max out your psu to a 1200-2000w monster as you'll need it once you get up there in drives... I find the power balancing is better on them... Get an LTO7 or LTO8 drive, any one, as they're all made by IBM, and backup your data to both a hard drive backup and a tape backup, and also backup to cloud ... From there, it's a horrible addiction of buying hard drives in some mad need to have enough space to download the internet...
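The PSU sizing advice above can be sanity-checked with back-of-envelope arithmetic (illustrative figures only; spin-up draw varies by drive, so check the datasheets):

```python
# Back-of-envelope PSU sizing for a many-drive box. Spin-up draw is the
# worst case; figures here are typical 3.5" HDD values, not datasheet data.
def psu_estimate(n_drives, spinup_w=25, base_system_w=250, headroom=1.3):
    peak = base_system_w + n_drives * spinup_w
    return peak * headroom    # recommended PSU wattage with 30% headroom

for n in (10, 20, 40):
    print(n, "drives -> ~", round(psu_estimate(n)), "W PSU")
```

At 40 drives this lands in the 1500-2000W range mentioned above; staggered spin-up (which many HBAs support) lowers the peak considerably.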

TL;DR: honestly, get a Rocket 750 and start from there. Learn about RAID, JBOD, and the basics of redundancy and backups; you're the only one who, in time, will know what you need: speed, total space, access. Choice is the spice of life, and your meal is 1,000,000 Scoville in this game.

u/thenicnet · 1 pointr/homelab

Hey there. I must admit I've had the solution for a few weeks but I finally have some free time to post! Looks like good timing.

I used the HighPoint RocketRAID 2720SGL (8-port SAS 6Gb/s PCIe 2.0 x8 RAID HBA). Worked like a charm. Don't freak out because the controller itself won't see the drives (in BIOS). Boot into your favorite flavor of Linux; there are a bunch of commands to get the TRUE byte size. I had to try 3-4 different commands to get the right answer.

Now the sg_format will take a loooong time depending on your drive size. There is a flag to just resize instead of formatting, however since these were second hand drives I really didn't mind cleaning it all with fire and also stress testing the drives a little.

The best part about that controller is with a little wiggle and giggle I fit it right into the 710 and used the same sas cables.

Hit me up if you have more questions. Hope this helps!

u/namelessted · 1 pointr/buildmeapc
PCPartPicker part list / Price breakdown by merchant

CPU | AMD - Ryzen 5 1600 3.2GHz 6-Core Processor | $196.44 @ OutletPC
Motherboard | Gigabyte - GA-AX370-Gaming ATX AM4 Motherboard | $111.87 @ OutletPC
Memory | Team - Vulcan 16GB (2 x 8GB) DDR4-2400 Memory | $122.99 @ Newegg
Storage | Crucial - MX300 275GB M.2-2280 Solid State Drive | $92.99 @ Newegg
Video Card | EVGA - GeForce GTX 1060 3GB 3GB GAMING Video Card | $184.99 @ Newegg
Case | Fractal Design - Define R5 (Black) ATX Mid Tower Case | $79.99 @ Newegg
Power Supply | SeaSonic - FOCUS Plus Gold 550W 80+ Gold Certified Fully-Modular ATX Power Supply | $79.90 @ Amazon
Optical Drive | LG - WH16NS40 Blu-Ray/DVD/CD Writer | $49.99 @ Amazon
| Prices include shipping, taxes, rebates, and discounts |
| Total (before mail-in rebates) | $949.16
| Mail-in rebates | -$30.00
| Total | $919.16
| Generated by PCPartPicker 2017-09-26 14:53 EDT-0400 |

Here is a slightly different build to consider. One main factor is that while you have 3 HDDs and an optical drive now, you may need to add more storage in the future. Given this, I have chosen both a case that can hold all that storage, as well as a motherboard that has enough SATA ports on it to make sure you can connect them all.

In doing this, the motherboard is definitely more expensive. The alternative would be spending less now but being required to purchase a SATA expansion card later. If you are fine with adding an expansion card later, or don't think you will need many more HDDs, there are plenty of mobos available in the $60-70 range that have 6 SATA ports, which would allow for 5 HDDs plus an optical drive.

Given the potential limit of available SATA ports, I have also chosen an M.2 SSD. As for the optical drive, you mentioned needing one but weren't sure if it was just for DVD, so I picked a Blu-ray burner drive just in case. This can obviously be changed if needed.

The PSU selection also has 10 SATA power connectors; something to watch out for with some PSU models is having only 6 SATA connectors and needing to use Molex adapters. The unit is also fully modular and has a 10-year warranty.

It's also possible to save money on the GPU and drop down to a GTX 1050 Ti, depending on what software you use and how you use it. In some scenarios software like Adobe Premiere will take advantage of a GPU, but which GPU isn't as important; in other situations the more powerful GPU actually yields faster render times. Linus has a video and Puget has a fairly detailed write-up. But that is just for Adobe Premiere; it could well be different for other software.

In general, I would recommend doing some research on performance for whatever software you use, or are planning on using. All in all, you are going to want a well balanced system, but sometimes it makes sense to spend more or less on certain aspects if you know specific use-case scenarios will benefit from it or not.
u/infiniteGOAT · 1 pointr/PleX

After some research, it looks like performance would be better overall if the card were PCI-E 2.0 x2 or x4. The one you linked is x1 and will definitely work, but all 4 hard drives connected to it will share the same PCI-E x1 lane (assuming they were all in use at the same time). Looks like this card may be better?

My use case scenario for this server is just to install Unraid and storing several TBs of media for plex streaming (plex server located on same network but different machine btw). So, that being said - in your opinion does the speed difference there even matter for the most part? The drives will all be WD RED. I may add an SSD or two for caching down the road but I would connect those straight to the onboard SATAs if I went that route.

Thanks in advance for helping with all this and sorry if I missed something obvious.
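For what it's worth, the shared-lane concern in the question above is easy to put numbers on (a rough sketch; the 500 MB/s per-lane figure and 50 Mbit/s stream bitrate are illustrative assumptions):

```python
# Four HDDs sharing one PCIe 2.0 x1 lane: is that enough for Plex streams?
lane_MBps = 500                      # PCIe 2.0 x1, roughly, after encoding
per_drive_MBps = lane_MBps / 4       # worst case: all four drives busy at once
stream_MBps = 50 / 8                 # a heavy 50 Mbit/s remux ~= 6.25 MB/s

print(per_drive_MBps, "MB/s per drive in the worst case")
print(int(per_drive_MBps // stream_MBps), "such streams per drive")
```

Even in the worst case, each drive's share of the lane covers many simultaneous streams, so for pure media serving the x1 card is rarely the bottleneck.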

u/M3rc_Nate · 1 pointr/buildapc

Wow, thanks! This is exactly the type of response I was looking for.

If I were to go with those options, it looks like I would save a LOT of money. Any recommendation on what PCIe SATA card to get if I'm looking to add, say, 4 ports?

That's just $130 for the case, mobo and CPU. Do you have a RAM (16GB) recommendation? I assume if my mobo/CPU choice was overkill, the ECC RAM was too. Still, if the RAM is $80, that's $210... then a PSU (again, a recommendation?) which is probably $50, and the PCIe card is $40... a grand total of around $300 for a DIY NAS. Pretty awesome.

u/teh_fearless_leader · 1 pointr/DataHoarder

As /u/just_insane has mentioned, Plex is a good option for streaming. I'm an opponent of FreeNAS, in favor of something homebrew (ZFS on Linux with Debian or Ubuntu; in my case, Gentoo with ZFS) to do what you need. I don't like how finicky FreeNAS can be, even with server-grade hardware. It's just not my thing.

That said, for someone who's new, it may be a good idea to try out freenas or nas4free. I just finished building a 16TB usable (20.5TB raw) system last week. I'll link my items below.

2x iocrest controllers

1x16GB kingston ECC ram

1xNZXT source 210

5xHGST 4TB deskstar NAS

1xsupermicro mbd-x9scm-f-o - Great board. Loving it so far. dual onboard nic is nice.

2x850 Pro 256GB that I had laying around

1x 550W PSU laying around.

Total ran me about $1,300, and I'm able to max out a 2x1Gb LACP setup writing and reading directly on rust.

EDIT: my recommendation, in most cases, is to at least do raidz1 (RAID5). RAID is no substitute for backups though, so invest in something offsite and make sure it's staying backed up. I use CrashPlan for offsite and local backups and it works like a charm.

u/doggxyo · 1 pointr/homelab

It's not pretty. I'm actually pretty ashamed of how they're in there, being a server admin for work - and honestly the drive layout was part of the reason I'm moving to a real server. I got the 9020 for free and started using it as a test box to play with FreeNAS. It was lovely, and I just continued to use it without thinking of upgrading the host.

The 9020 has two bays for internal disks. Two are there.

The 9020 also has room for two optical drives - in each of those two bays, I can fit two 3.5-inch drives. I've separated them a bit with cardboard to reduce any vibration one disk may cause the one next to it. Four drives are there.

Lastly, there's some weird clip thing underneath the optical disk trays that I have no clue what its purpose is. Oddly enough, it's roughly 3.5" so I was able to park the last disk there.

I had run out of SATA ports on the motherboard, so I had to install this bad boy to get those drives connected. It actually works beautifully for my need. Also unfortunately needed one of these to get the final drive some power.

The server is still running great, it's definitely 'hacked' to work, and I don't want to continue to keep my important files here on a risky system.

I'll shut down my server to take a picture of the internals for you, if you're interested!

u/JDM_WAAAT · 1 pointr/PleX

Edit: Refer to velogeek's comment below, this motherboard has 14 SATA natively.

I don't have a motherboard recommendation, but it's pretty easy to add more.

Cheap and dirty way: 4 port SATA PCI-E card

IMO the better way: a 2-port Supermicro SAS 1 card.
Flash it to IT mode (a very easy process with the LSI MegaRAID software on Windows) and use these or similar cables to break out into 4 SATA per SAS port, in this case giving you 8 ports total.
The card listed above is a SAS 1 card; while old, it will still give better performance than the SATA card. If you want even more performance (for SSDs and fast HDDs), check out SAS 2 cards. A little bit more money, but the same concept. This guide should help you.

u/nullx · 1 pointr/freenas

Yes, FreeNAS-11.2-U5, was also working on U4.

I literally just plugged it in, plugged a drive into it, and it worked. I have also switched drives between the controller and the motherboard, and that works too without breaking my pools. Support for the Marvell 9215 chipset was apparently added in FreeBSD 9.2... Not sure why you would be having trouble; does the card work in a different PC?

Oh fuck, my bad - I just double-checked and THIS is the one I got... Slightly different but super close image-wise. The one I got has the Marvell 9215; the one OP linked has the Marvell 9235. But based on a quick Google, it looks like support for the 9235 was added in FreeBSD 9 as well...

u/Cyromaniap · 2 pointsr/unRAID

If you are going to go with the LSI 9211-8i, I'd pair it with an Intel RES2SV240. There are two advantages to it: it supports SATA III, and you don't need a second PCI-E slot to house the card, since it can be powered by a single Molex connector.

The 9211-8i is plenty capable of running a ton of drives. Each SAS channel is 6Gbps and the card has 8 of them, so effectively there is 48Gbps available on the card. PCI-E 2.0 x8 cannot even handle the full bandwidth of the card. Given that a spinning-rust hard drive might give you 130MBps at the best of times, you would need about 30 HDDs at full bandwidth to saturate the PCI-E x8 bus.
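That saturation estimate checks out with quick arithmetic (a sketch; 500 MB/s per PCIe 2.0 lane and 130 MB/s per drive are the round numbers used above):

```python
# How many spinning disks does it take to saturate the HBA's host link?
pcie2_lane_MBps = 500            # PCIe 2.0: ~500 MB/s per lane after encoding
lanes = 8                        # the 9211-8i is PCIe 2.0 x8
hdd_MBps = 130                   # generous sequential rate for spinning rust

bus_MBps = pcie2_lane_MBps * lanes
print(bus_MBps // hdd_MBps, "drives at full tilt to saturate the x8 link")
```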


You will need one SFF-8087 to SFF-8087 cable to go from the LSI 9211-8i to the Intel RES2SV240, and then you will need 6x SFF-8087-to-SATA breakout cables. That would give you 24 SATA connections.

If you wanted more bandwidth-capable gear, it would cost a bit more and would require a motherboard with PCI-E 3.0, as well as an HBA that supports PCI-E 3.0. I believe that card is the LSI 9311-8i.

u/Ucla_The_Mok · 2 pointsr/homelab

I bought one with 24GB RAM and no hard drives for $299 from eBay back in 2015.

The one on Craigslist is overpriced, for sure. In fact, here's a hex-core Xeon with 12GB RAM for $189.99 (add $43.08 for shipping to a Denver zip code).

For just under $500 (including shipping to Denver), you could build one with dual hex-core Xeons and 48GB RAM from the same eBay seller.

As far as the machine itself, I have a Xeon x5687 in mine and it's my daily driver (Windows 10). Plex only uses 1-3% CPU if it's not transcoding and I don't even notice it. I run VMWare Workstation 12 Pro and it doesn't break a sweat running multiple VMs at once, so I'm sure it would do fine running ESXI.

(Transcoding is another story due to the lack of cores, which is why I converted all my media to x264/AAC for playback on a Roku 3. I've since upgraded to the Nvidia Shield and haven't come across a file yet that requires transcoding. There's definitely better CPUs out there for transcoding, but it would be an upgrade to your NUC. If you did the dual 6-cores, it would do really well, I think.)

It's not a terrible machine by any means. Keep in mind the tower is huge though. I use an old Samsung 32 inch 720p TV as one of my monitors and it is taller than that.

It runs pretty quiet but I did install SpeedFan so I could manually ramp up the fans when doing CPU intensive stuff such as video conversion (SpeedFan didn't detect the fans until I went into options and clicked a box that says "Enable DELL support (use this function only on DELL notebooks)". After checking that, clicking OK, and closing/relaunching SpeedFan, it detected the fans and allowed me to control them).

You can install a very cheap storage controller and plug it into one of the PCIe slots for SATA III capability. I've got an 8TB drive and my 1TB SSD connected to a cheap (under $30 new) Syba card. (If you decide to go this route, you will need to install Windows on an SSD using the onboard controller, install the drivers for the card, and then disconnect/reconnect the drive to the PCIe SATA card. Not sure how that card would work with ESXi, though.)

I filled the onboard controller with 3 X 2TB refurbished HGST Enterprise drives off eBay that sold for $40 each back in 2015 and have 15TB total storage. Not including the SSD and a 750TI I slapped in for light gaming capability, I spent under $650 for 24GB RAM and 14TB of storage.

Would be even cheaper today, especially with the used workstations with E5v2 Xeons hitting eBay now. Don't spend $400 on this one.

u/averam · 0 pointsr/DataHoarder

How many drives did you connect to it? From the post I get the feeling that there are two connected. This card uses the ASM1061 chipset, which uses only one PCI-E lane for data, so maybe that's where the bottleneck is?

The second card you linked looks a bit better to me. It uses a Marvell 88SE9215 chipset, which has some pretty solid reviews; for most people "it just works". But I would never try to connect four drives to it - it uses only a PCI Express x1 connector.

I was searching for an inexpensive card for my home NAS, to connect one additional drive now and a second in the future. For my server I chose a Delock PCI Express card that uses a Marvell 88SE9230 chipset, which is giving me normal, stable SATA 3 performance. The disk is visible in BIOS and is bootable. This card uses a PCI-E x4 connector, so it has more bandwidth to work with. Be advised that I'm currently only using it for one drive, so it is not throttled by anything else, and I'm only planning to keep at most two disks on it.

As u/daericg said: "You get what you pay for". If you want high, stable speeds, then you should invest in e.g. a Perc H310. You can probably find many of them at good prices because (as far as I remember) Dell used them in some of their workstations, and some shops are reselling them. If you want an inexpensive card for SATA disks, I would look at the Delock card linked above.

EDIT: links

u/PubliusPontifex · 1 pointr/DataHoarder

This is the card I have. It has some issues on Linux (drives go into SCSI error recovery sometimes, but always come back quickly), but on FreeBSD it's been glorious. The one you pictured looks like mine but with 4x PCIe / SATA 3Gbps, which should be fine for most purposes. I'm running ZFS raidz2; the card is just in passthrough, so JBOD is definitely fine.

u/darkslyde27 · 2 pointsr/unRAID

I/O Crest works wonders: it's x1, with 4 SATA III ports, and can be had at $35. I/O Crest SI-PEX40064 - I'm sure you can find them cheaper.


they are also known as SYBA SI-PEX40064 aka. IOCrest IO-PCE9215-4I
(from unraid HW comp list: 4 port, PCIex1, SATA III, Marvell 88SE9215, bootable, working out of the box, supports drives > 2.2 TB)

I use that on my low-power box with four 2TB WD Greens and don't have any issues. If you want to go with something better, a SAS2008-based LSI 9201/9211 HBA card in IT mode is the clear-cut winner for ease and compatibility. Cons: they're a little more expensive ($65 + the price of cables).

u/LoLFirestorm · 2 pointsr/pcmasterrace

Well, the motherboard not supporting SSDs isn't stopping you from using an SSD. If you want AHCI support (TRIM and all that SSD-related stuff) but your current mobo doesn't have it, consider buying a PCIE x1 controller card like this one. People are saying good things about it around the internet, and it's just 10 or so dollars, so even if you're only going to use it temporarily it won't set you back a significant amount. USB 3.0 you can probably live without. The 212 Evo is legendary, but better alternatives now exist, namely the Cryorig H7 and SilentiumPC Fortis 3. They cost about the same, though I'm not sure about Fortis 3 availability in the USA (if that's where you live). If you can cancel the 212 order to get a Fortis 3 instead, do it; otherwise don't bother. It won't make a huge difference. You have a 95W Xeon, not a 220W FX. The X5670 has good OC potential btw, according to HWBot, but be careful with voltages if your motherboard's VRM isn't very strong.

u/peterwemm · 5 pointsr/freebsd

Here's what I would do in your situation:

Put the standalone SSD devices on 6Gb+ AHCI motherboard connectors. These will do quite nicely. Motherboard AHCI slots are pretty well connected.

I'd grab a LSI SAS 9207-8i (about $100 on Amazon) and 2 x SFF-8087-SATA fanout cables (about $10 on amazon). It uses the mps driver in the base system. This combination is very, very solid and reliable. I use it myself for a media server.

You can add a second 9207-8i if you need more ports. I've found the AHCI pci cards work well too but watch the PCIe connectivity.

This device cost $15:

ahci0: <ASMedia ASM1061 AHCI SATA controller> ...
ahci0: AHCI v1.20 with 2 6Gbps ports, Port Multiplier supported

Keep in mind the PCIe lane bandwidth: 1 x PCIe lane is: PCIe 1.x: 250MByte/sec, 2.x: 500MByte/sec, 3.x 985MByte/sec.

That 2 port AHCI card I linked above is 1 lane PCIe2.0. If you put 2 x SSDs on it that could do 600MB/sec each, the most it can shuffle through the motherboard connection is 500MB/sec. The LSI card is 8 lane PCIe 3.0 so that choke-point isn't there.
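The choke-point reasoning above, in code (same per-lane figures as quoted; the 600 MB/s SSD rate is an assumption for illustration):

```python
# PCIe per-lane bandwidth vs. attached devices: where is the choke point?
lane_MBps = {"1.x": 250, "2.x": 500, "3.x": 985}   # approx MB/s per lane

def choke_point(gen, lanes, devices, dev_MBps=600):
    host = lane_MBps[gen] * lanes      # what the slot can move
    demand = devices * dev_MBps        # what the devices could deliver
    return min(host, demand)           # effective aggregate throughput

# The 2-port PCIe 2.0 x1 AHCI card with two 600 MB/s SSDs:
print(choke_point("2.x", 1, 2))   # the single lane is the limit
# The LSI 9207-8i (PCIe 3.0 x8) with eight SSDs:
print(choke_point("3.x", 8, 8))   # the devices, not the bus, are the limit
```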

I'd add a second 9207-8i if I wanted to do any non-trivial amount of IO on more than 8 ports.

Also, don't set your expectations too high for L2ARC. My personal observations lead me to believe that the overheads of running it don't really pay off until you have an L2ARC device with a good 5x to 10x performance advantage over the backend devices. YMMV of course, but I've always come away disappointed with L2ARC setups.

Personally, I over-spec system ram in preference to L2ARC.

u/nerplederple · 3 pointsr/freenas

If it's just a data drive and you're not looking to do anything super fancy with it, these work great.

However, be advised that, because the card is PCI-E x1, if you were to actually plug in 4 hard drives or SSDs, you're gonna run smack against bandwidth limitations if you start trying to hammer I/O on the drives connected to the card all at once.

I have this exact card as well as the 2-port PCI-E x2 slot version in use and they work very well for supplementing on-board headers when you're a few short.

I would not attempt to use these cards to run HDDs/SSDs that were going to be datastores for VMs nor as the HBA for something like FreeNAS. If your goal is along those lines, you'd be much better off looking for an HBA like the 9207-8i. You can get those way, way cheap on ebay, and then you just need the correct cables for 'em.

u/Remo_253 · 1 pointr/techsupport

What make/model is the PC? While I agree with /u/LetsGetBlotto in general many of the less expensive PCs from the likes of Dell, HP, etc. use custom motherboards and it wouldn't surprise me a bit if, to save a few pennies per board, they cut out any extra ports.

If that's the case, do you have an open PCI-E slot? If so you can add an expansion adapter.

Taking a step back though, the main reason for adding an SSD is to speed up the PC, speeding up any reads and writes to disk. That means the best use is as the main C: drive where the OS is installed. You can replace the existing drive with an SSD instead of adding it in addition to the HDD. There's a process called cloning that copies everything from the HDD to the SSD. Although many will recommend a clean Windows install, cloning is simpler, no need to reinstall all your programs.

u/rootb33r · 3 pointsr/news

Yeah, better example. Paper towels were the first thing that came to mind lol.

So in your case, are you sure it said the seller was HPE? Because there are two things on an Amazon listing that matter:

(1) brand - this is listed under the description. This is usually going to be correct, so in your case it will say "HP" as the brand, or maybe "HPE" - I'm not sure how they brand their stuff (example listing that I quickly found). You get this brand association by listing a product and simply putting the brand name in the listing. It's not hard to fake.

(2) seller - this is in the right-hand side under the "buy buttons". This is the reseller and could be almost anyone. Theoretically these are approved resellers/dealers for the product, but that is by no means a guarantee.

For example, my company (let's call it "ABC Technologies") has 300+ approved dealers/resellers of my products. The brand (1) will be labeled as "ABC Technologies" but any one of those 300+ dealers could list our products on Amazon (2). Also -- and this is where the huge problem is -- literally anyone else could list the product as well... you don't have to be one of the 300+ approved dealers.

u/Arm-the-homeless · 1 pointr/buildapc
What do they have now? I have the Xeon equivalent of a Q9650 in my office PC (mostly my daughter's computer these days) and it's not slow for web browsing or word processing. Hell it runs Fallout 4 @ 1080p on medium settings.

Getting an SSD does make a huge difference. In fact, if your parents' PC has a Core 2 Duo or Quad in it already, they probably don't need a new computer. You're better off getting a PCI Express SATA3 controller card and a cheap SSD. That plus a fresh install of Windows would probably do wonders.

It probably is their aging hard drive that's causing the slowness, and maybe a lack of memory for having Chrome tabs open. Really, you don't need much processing power to browse the web or use office software. I honestly can't tell the difference between my office PC and my gaming PC for those tasks, and my gaming PC is an i7 4790K w/ 16gb of DDR3 while my office PC is the aforementioned C2Q Q9650 w/ 8gb of DDR2. Web browsers and office suites just don't put enough of a load on a processor for it to matter.

Edit: If I had your budgetary constraints and I had to get my parents a new computer, this is what I would build.

PCPartPicker part list / Price breakdown by merchant

| Type | Item | Price |
|:---|:---|:---|
| CPU | AMD A10-5800K 3.8GHz Quad-Core Processor | $87.99 @ SuperBiiz |
| Motherboard | Gigabyte GA-F2A68HM-DS2H Micro ATX FM2+ Motherboard | $46.99 @ SuperBiiz |
| Storage | A-Data Premier Pro SP600 128GB 2.5" Solid State Drive | $44.99 @ Newegg |
| Case | Fractal Design Core 1000 USB 3.0 MicroATX Mid Tower Case | $28.99 @ NCIX US |
| Power Supply | SeaSonic 300W 80+ Bronze Certified ATX Power Supply | $35.99 @ SuperBiiz |
| | Prices include shipping, taxes, rebates, and discounts | |
| | **Total** | **$244.95** |

Generated by PCPartPicker 2015-12-17 06:16 EST-0500

And I would save up the extra 45 dollars to make it happen because they're my parents. Your parents deserve a power supply that isn't a fire hazard, they deserve a motherboard that isn't garbage, they deserve a case with some USB 3.0 ports and it's about to be 2016 so be a good child and get them a quad core while you're at it.

Edit2: And just to drive the point home about buying used, this is a much better computer for cheaper. Stick an SSD in it and your 8gb of RAM and it's good to go.
u/gingerkidsrage · 2 pointsr/AskTechnology

I opted for a Windows Server with lots of storage bays.
Server 2012 has a "storage pools" feature, if you are fairly tech savvy and interested in that type of thing. I found it super easy to set up and configure.



Edit: For personal, non-commercial use you can get Server 2012 R2 from Microsoft for free, via DreamSpark.

u/6x9equals42 · 1 pointr/buildapc

Blue Iris looks like popular software. Those cams are a bit less than 4k, so you might not need 30TB, but the WD Reds are the drives to get, and you might want a small SSD for the OS. I'd get a 6800k/X99/DDR4 system with a RAID card (here is a lower end unit, but it should work for this using a mini-SAS->SATA adapter). You don't need a DVR card b/c the cameras are digital, and that CPU should be strong enough for all of them and the RAID processing. Just get a cheap Nvidia card (750ti or less); the GPU shouldn't make much of a difference.

u/cakepodharry · 2 pointsr/burstcoin

Nope. The single connector on that cable is a SFF-8087 SAS Connector (NOT a SATA connector which your motherboard will have!).

SAS (Serial Attached SCSI, server grade / enterprise) controllers are backwards-compatible with SATA (Serial ATA, "mainstream" / consumer) Drives, but SATA is not forwards-compatible with SAS. If you have a SAS Controller, yes you can hook up 4 drives to a single SAS port, but if you have SAS Drives (And for this purpose that cable can be thought of as turning your drives into SAS Drives even though technically it really isn't) you can't connect them to a SATA Controller.

SAS Controllers are server-grade gear, and come in at several hundreds of dollars for the "cheap" ones.

Get a cheap HBA (Host Bus Adapter) card to go in a PCIe port, like the ones everyone else is suggesting.

This one is 4 port and (allegedly) has hot swap support (no rebooting when plugging / unplugging drives) which your motherboard probably won't have (Some nice ones do though):
But I'm sure you could find cheaper.

u/Arkydo · 1 pointr/pcmasterrace

It really depends on the interface: SATA or M.2.

For the more common SATA-based connectors, the Samsung 850/860 are industry leading and retain their read and write speeds (550+ MB/s).

For M.2 (if your motherboard or laptop has the slot), the speeds are even faster, reaching 3.0-3.5x those speeds (upwards of 1700MB/s). I recommend WD-Black or WD-Blue for decent budgets. If your PC has a free PCI-E slot, you can install an adapter pretty easily.

u/wolffstarr · 1 pointr/DataHoarder

If all you want is a 4-port SATA card that will only take a PCIe x1 slot, look for something like this Syba card.

Now, going with a PCIe x1 slot is going to bottleneck you a little - max speed is 500MB/sec, and a 7.2k drive can usually give you upwards of 150MB/sec, so four of those could see some slowdown, but overall there won't be a ton.
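A quick back-of-envelope check of those numbers (figures taken from the comment above, not measured):

```shell
slot_mb=500          # usable bandwidth of a PCIe 2.0 x1 link, MB/s
per_drive_mb=150     # sustained throughput of a typical 7200 rpm drive, MB/s
drives=4
demand=$((per_drive_mb * drives))
echo "aggregate demand: ${demand} MB/s, slot ceiling: ${slot_mb} MB/s"
[ "$demand" -gt "$slot_mb" ] && echo "the x1 slot is the bottleneck under full load"
```

In practice all four drives rarely stream sequentially at the same time, which is why the slowdown is usually modest.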

That being said, if you've got the room for an x8 card, going with a SAS HBA is probably a better bet.

From what I've heard, some people have had some issues with Marvell chipsets for NAS usage. If you're considering FreeNAS, you should really look at their hardware compatibility list to find out what the best choices are.

u/mattbuford · 3 pointsr/HomeNetworking

I've been doing this for many years. I just run a regular Linux distro and do not run a special NAS OS. My router is the same way - just a regular Linux distro.

My general goals are low power and cheap with lots of drives. I don't generally care about size or performance.


Case: I have a full tower case I've been using for 20-ish years. For you, anything you want with 5.25" bays. Here's my old one, which is huge, but has a ton of 5.25" bays:

SATA hotswap bays: I've been using 3 of these for 10+ years to give me 9 bays:

3 of those gives me 9 hotswap bays. However, one thing I don't like about them is the small 40mm fans on the back. Those things seem to last 2-3 years, and with 2*3=6 of them in my case, there's always one dying and making horrible grinding noises (or completely stopped). Even worse, replacing them requires removing the bay from my case, then using a screwdriver to take apart the back of the bay. Other than the fan, the bays work great.

/u/bigdizizzle posted this, and if I were doing it over again I'd probably give it a try, hoping the larger fan dies less often (and half as many fans means fewer failures per year).

Motherboard + CPU: I have this:

I went with this because it is a moderately powerful system (quad core with decent IO), has 3 PCIe slots (all are 2.0 1x, but one is physically 16x), USB3.0 for any external drives I might want to add, it uses normal desktop RAM, and it is very low power. The completed system, with an SSD but no hard drives, pulls 14 watts. There are smaller mITX versions, but those only give you one PCIe slot plus you have to use SODIMMs, so I suggest going with the mATX versions. Best of all, the price was right at $70 for the motherboard+CPU. Oh, and it is fanless. Some options I looked at had 40mm fans on the motherboard/CPU and past experience (my previous NAS build was with Atom330 motherboards) taught me that 40mm fans on the motherboard die or make horrible sounds quickly.

But, that motherboard is now old. Some newer, similar options, which I have not fully researched are:

My J1900 and the 2 above CPUs compared:

Weirdly, the last and best one (J4105) is limited to 8GB RAM, while the lower end J3455 and my old J1900 both handle 16GB. I don't know what's up with that.

Biostar makes a nearly identical J1900 board called the J1900MH2 and I bought one of those too (for $50 open box). It's basically identical, but hotswap does not work on the two internal SATA ports. So, I used that one for my router and the Asrock (with working hotswap) became my NAS. So, beware of Biostar BIOS possibly not supporting hotswap on the internal ports. I'd stick to Asrock.

More SATA ports: The motherboard comes with 2, and then I added 3 of these:

This isn't high performance, but we're talking PCIe 2.0 1x anyway. Hard drives tend to be slow anyway. These plus the internal ports gives me 14 SATA ports. These have proven reliable under Linux.

End result:

I have a NAS with 14 SATA ports, 9 hotswap bays, gigabit Ethernet, an OK (not great, not terrible) CPU, and I pull only 14 watts plus the hard drives. The USB 3.0 lets me connect a bunch of USB hard drives too.

Currently, I have 7 3.5" drives in there. I also have 9 external drives connected by USB 3.0. Those external drives are for backups, and they power on every night, perform a backup, and then power off. Those USB enclosures contain all my old junk hard drives from old computers. There is even a PATA drive still in use there. The backups array is where my old drives go to be run until they die.

My system has enough ram and CPU to run some VMs. I have one VM running a torrentbox that VPNs itself to a VPN provider, and another VM running as a Bitcoin full node. Then, of course, the NAS is serving SMB to my regular PCs. This also runs as my "server" for various random things like MRTG, Munin, and so on.

Finally, I like that it is power efficient. That helps heat generation too.

Just within the past couple weeks, I got one of these:

I haven't installed it yet though. This is the same idea as the 3.5" bays, but for 6 hotswap 2.5" drives (SSDs) into a single 5.25" bay. My boot drive array is 5x1TB drives (RAID6) that are at 84334 hours of runtime (9.6 years) so I'm thinking it's probably about time to replace them - with SSDs. I also have 2x8TB drives in RAID1 used for mass storage (videos, etc).

Just as a suggestion, what I did was put /boot on a USB thumb drive and then my OS right on the main RAID6 array. That way I get great high reliability for my OS drive, but the thumb drive is there to take care of booting and the initramfs (required to assemble mdadm + LVM before trying to mount /). I didn't want to put /boot on a hard drive, since they're much more likely to fail than a USB flash drive sitting there almost never getting written to (only on kernel upgrades).
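A minimal sketch of what that split looks like in /etc/fstab (the device paths come from the output below, but the UUID here is a hypothetical placeholder, not taken from the actual system):

```
# /boot lives on the USB thumb drive, so GRUB and the initramfs are reachable
# before mdadm and LVM are assembled
UUID=0000-0000-example           /boot   ext2    defaults        0 2
# root is LVM on top of the mdadm RAID6 array
/dev/mapper/storage1-storage1    /       ext4    defaults        0 1
```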

OS + old storage

cat /proc/mdstat

Personalities : [raid0] [raid1] [raid10] [raid6] [raid5] [raid4]
md127 : active raid6 sdb[2] sda[0] sdc[1] sdg[4] sdd[3]
      2930279232 blocks level 6, 64k chunk, algorithm 2 [5/5] [UUUUU]
# df -h /
Filesystem                     Size  Used Avail Use% Mounted on
/dev/mapper/storage1-storage1  2.5T  2.2T  194G  92% /

New mass storage:

btrfs fi show /storage/

Label: 'storage'  uuid: a6864940-727c-4740-a20d-1f37a202006b
        Total devices 2 FS bytes used 4.54TiB
        devid    1 size 7.28TiB used 4.74TiB path /dev/sde
        devid    6 size 7.28TiB used 4.74TiB path /dev/sdf

Backups pile of old drives:

btrfs fi show /backups/

Label: 'backups'  uuid: 81f5c405-9864-4178-b964-ed60149caa82
        Total devices 9 FS bytes used 3.79TiB
        devid    1 size 931.51GiB used 903.00GiB path /dev/sdk
        devid    2 size 931.51GiB used 904.00GiB path /dev/sdl
        devid    4 size 111.76GiB used 86.00GiB path /dev/sdq
        devid    5 size 465.76GiB used 440.00GiB path /dev/sdp
        devid    6 size 465.76GiB used 440.03GiB path /dev/sdm
        devid    7 size 1.82TiB used 1.79TiB path /dev/sdn
        devid    8 size 2.73TiB used 2.70TiB path /dev/sdi
        devid    9 size 465.76GiB used 440.00GiB path /dev/sdj
        devid   10 size 931.51GiB used 22.00GiB path /dev/sdo

My router, as mentioned earlier, uses the Biostar version of the same MB as my NAS, but obviously without the SATA bays or the SATA cards. One of the PCIe slots has a dual port gig Ethernet card. This lets me do dual WAN. Then, boot is a cheap 120 GB SSD. Total power usage is 14 watts.

u/Jibco · 1 pointr/buildapc

Yes, they are. A SATA card simply provides more ports so that more drives can be connected individually. It's possible to set up software RAID, but that can be a pain and is slower. A RAID card actually controls the RAID setup with its own chip. That card you linked would NOT work in a PCI-E slot; it is a slow PCI card which is different.

This card is basically the same as the other Amazon link, but it adds support for RAID 0 and RAID 1. Anything that supports more RAID levels will likely be more expensive. This is a good example.. It supports RAID 0, 1, 5, 10, 50, and JBOD.

u/DZCreeper · 1 pointr/buildapc

Those prices are very good. If you want the performance boost as well as the extra ports then go for it. Just keep in mind that third party chipset ports aren't much slower than Intel ports, so a cheap expansion card can serve your purpose equally well.

That card for example can hook up to 8 SATA drives using Mini-SAS to quad SATA cable.

u/Duamerthrax · 3 pointsr/hackintosh

Ok, it's been a while since I researched this. I can tell you that all the cards work though. If I remember correctly, this card is the best for 4 internal ports. This one if you only have a 1x slot available. And this one if you also want some eSATA ports.

I have all these cards in my Hackintosh I'm typing on right now without any drivers. They all support hot swap as well.

u/klondike_barz · 3 pointsr/buildapc

IMO if you don't want to do a $400 upgrade (RAM, CPU, mobo), then your best bet is to keep your current setup and just spend $40 on a better CPU fan and/or case fan. That Arctic Freezer looks more than sufficient.

As for the SATA, just get a PCIe adapter card.

u/Something_Funny · 1 pointr/unRAID

I put together almost the exact same build a year or so ago to replace my Drobo. Like your case selection better than mine. The only thing I might suggest is springing for an i5 if you're going to be transcoding multiple streams.

I recently decided to add more HDDs to my build and ran out of SATA ports. Expanded with this. Good luck!

u/pandorafalters · 2 pointsr/buildapcsales

It's a bit more expensive than some, but I recently used this adapter to upgrade my 256 GB boot drive to a 1 TB drive. Seemed to run at full speed until the caches ran out, which is about the best you can really hope for.

It does require a #00 Phillips driver for the M.2 screw, though.

u/NiBuch · 12 pointsr/DataHoarder

Depends on your use case. If you're just looking to add SATA ports, aren't planning anything crazy, and don't want to shell out extra for something enterprise-grade, they should work fine.

I have a similar one in my NAS, along with a free 2-port card that came bundled with it. They work pretty well, and I haven't had any issues with either.

u/CyberSKulls · 2 pointsr/DataHoarder

If you want something dead simple, something like this would work just fine:

Syba SD-PEX40099 4 Port SATA III PCIe 2.0 x1 Controller Card

I actually use these in unRAID. These are not gonna be blazing fast SAS3 speeds capable of maxing out read performance across 4 disks all at the same time in a software raid environment but they will work perfectly well for what you described and are cheap!

Edit: I linked the 4 port as that's the one I use. You can go even cheaper and get a 2 port or go extreme and get a 10 port. For me, I don't run raid or use parity so as long as a given controller can max out my drive, that's all I need. It's not like we're running Plex servers full of 1TB M.2 drives :)

In the past I ran LSI SAS 9201-16E's in my JBOD rack chassis. I'm simplifying everything, going down to tower chassis with larger drives. No more SAS cards, cables, expanders.. I wanted less drives, less chassis, less complex.

u/cookiez · 1 pointr/DataHoarder

Lots of good hardware recommendations in there.

Edit: ok, maybe not lots, they just use the card and it's definitely out of our league.

In the last version they were using CFI-B53PM port multipliers, those are cheap and very good apparently.

u/mmartinutk · 1 pointr/buildapc

Edit: Done. I don't know why, but it didn't work until I connected my Optical Drive to the PCI-e adapter and had both hard drives in the standard SATA ports. Did a clean install. Thanks for all who helped.


Just bought this computer and a Samsung 500 GB SSD to put in it. I cannot for the life of me migrate this fucking OS to my SSD.

I've tried cloning using EaseUS and MiniTool Partition Wizard. When I re-boot and go into the BIOS, I'm not even given the option to boot from my SSD. Just from Windows Boot Manager. Windows came pre-installed on the HDD.

It's worth noting my SSD is connected via PCI-E. Not sure if that matters.

I've deleted all my previous partition efforts. Here's what I'm looking at in Disk Management

Just tell me what to do man. Every guide online doesn't help.

Edit: Starting a clean install as suggested by another user. Idk why I've been trying to migrate over install from USB.

u/shiggitay · 2 pointsr/hackintosh

I've used several SATA PCI-E x1 cards in my hacks. Ppl on TMx86 can tell you which ones work, but here's one I got recently and it works great. There's also an 8 port version, but it's like $90, which wasn't in my budget when I got it.

u/AK-Brian · 1 pointr/Amd

Yeah, that's the one distinct disadvantage of the consumer AM4/Z390 style platforms. They offer enough lanes for most typical users, but when you start tacking on a lot of I/O, GPUs or networking they quickly find themselves staring at a PCI Express lane wall. Platforms like X399/TRX40/X299 offer more lanes, but come with their own set of disadvantages (primarily price).

I did a little bit of looking around and could not find any information indicating that your specific board supports lane bifurcation. It's possible that there's a modified BIOS floating around which would enable it, but I didn't see anything on the sites I checked. My thinking was that if it were supported, you might be able to utilize a 2x NVMe M.2 -> x8 PCI Express card such as this ~$50 one from Supermicro, which would leave you with a fairly adequate x8 configuration on your GPU as well. The downside is that it's "only" two ports and requires that bifurcation support to operate. You could also plug a generic $15-20 single NVMe M.2 -> PCI Express x4 adapter into that second slot, for use with a single drive. It'd be at full speed, and your GPU would again operate at the lower x8 link state.

It's still worth exploring the use of additional SATA SSDs on any remaining open ports you may have - they can be configured in RAID 0 through the BIOS or through Windows itself.

u/phosix · 2 pointsr/freebsd

I use an I/O Crest 8 Port SATA III in my NAS box.


Pros:

  • Pretty cheap, you can generally find one for under $70.

  • Supports up to eight drives on one card. It does this by being two Marvell 88SE9705 chips on one board.

  • FreeBSD likes it! I've been running this card since FreeBSD 10.x, currently running it on FreeBSD 12.1

Cons:

  • Only the first four disks are visible to BIOS/UEFI; the second Marvell chip isn't visible until the OS brings it up. So it's no good for a 5+ disk boot array.
u/HeavyHDx · 2 pointsr/pcmasterrace

An SSD on SATA 2 will still be vastly faster than a mechanical hard drive. Plus, you can just continue using it when you finally upgrade the rest of the PC, these things don't really break. But yeah, ideally you'd want SATA 3. If you really wanted to keep the board, you could just get a PCIe SATA 3 card.

Or something more fancy like this.

This will give you the full speeds for your SSD, as well as another SATA 3 port. Much cheaper than upgrading your mainboard just for the ports.

u/Polaris2246 · 1 pointr/unRAID

My buddy and I each built unRAID servers in the past month. He went higher specs with a Xeon e3-1250v3 and a higher end consumer motherboard. He's going to get an AMD RX 480 video card for it so he has a second gaming computer for anyone that comes over. 16 gigs of ECC RAM. I went more power efficient and bought a Supermicro board with an Intel Avoton C2750 CPU. It's essentially a server Atom CPU. It uses 20 watts, has eight cores, and 16 gigs of ECC RAM too. The motherboard has the right features I wanted: IPMI built in, four NICs and some other stuff. I was worried the CPU would be underpowered but it packs plenty of power for my docker containers. Sonarr for auto TV downloading, CouchPotato, Nextcloud server, web server, MySQL server, modded Minecraft server, CrashPlan backup server, and others. I barely eat up 30% CPU when everything is running and actually doing something. Idle is below 5%. I don't have Plex on it because my Nvidia Shield does that. It's surprised me a lot how much power it has. If you want gaming, it's not for you, but it is more than enough as a file server and for the applications it's running, and plenty more.



SATA Controller Card (needed more sata ports than motherboard had)

Power Supply

[2x SSD for Cache/Pool set up]

5x WD Red 3TB

Better fans for case

Case (LOVE the case)

u/meemo4556 · 1 pointr/techsupport

They most definitely use hardware raid these days, here's why:

Basically universal compatibility

More reliable than software

Better performance (not taxing CPU constantly)

More SATA ports available with them.

To answer your question, don't risk it.

If you really want the security/performance from RAID, get a controller like:

u/Mr_T0ad · 1 pointr/DataHoarder

I have a similar question. I am running Windows 10 and am out of ports. I was looking into getting an IO Crest card from Amazon. Would something like the 9211 be recommended over the IO card for me?

IO Crest SI-PEX40062 4 Port SATA III PCIe 2.0 X2 Controller Card

I currently have 4 6tb red drives and am using Drivepool with the mirror set to X2. I just purchased 2 of the 8tb drives from best buy. ^I should probably move onto one of the other software raid systems with parity drives.

u/VMFortress · 3 pointsr/HomeServer

Yeah, that should be all you need. I have a similar HBA that I just flashed to IT mode and it runs fine. And yes, two of those cables, but power for the backplane will be supplied by 4 or 5 Molex connectors.

As for speed, I think with the expander you're going to be limited by the SAS bandwidth, which when split across 24 drives seems to be about 400MBps per drive. So if you're filling the system up with SSDs, you might hit a bottleneck, but with hard drives you'll be fine.

Can't remember if the backplane even supports more connections than the two, so you might be limited in that way regardless, unless you directly wired each drive with something like the original HBA you linked.

u/nealbscott · 1 pointr/buildapc

What you will want is a card like the one below. It's a SAS/SATA HBA (host bus adapter) card. SAS is the big brother to SATA. From the photo you see what looks like 2 ports. Those are not SATA. Look close and you will see a special type of cable called a 'breakout' cable: one plug on one end which turns into 4 SATA plugs on the other. The card below supports 8 SATA drives.

The card:

The cable:

I like the LSI cards, but certainly there are lots of others.

u/sivartk · 2 pointsr/PleX

If you have any open PCIe slots you can always add more SATA ports. I'd just be careful about swapping into a non-Dell case. It can be done but it is kind of ugly (you can see the front panel inside and the power button as a periscope 😀). Although I did later clean it up. I figure when I add a few more drives, I'll upgrade everything. Still have about 6TB of free space so that will take a while to fill up.

u/H1Tzz · 1 pointr/buildapc

man im in pretty much the same boat as you. I currently have a 4790k also, and my mobo is dying; it's in worse condition than yours (the motherboard's VRMs are failing, which causes random shutdowns). I'm saving for a Ryzen system, because in my country there are pretty much zero good used Haswell motherboards, and getting one from eBay or something similar is pretty expensive, considering I could just sell my 32gb Vengeance RAM along with the CPU and use those extra 200-400 euros for a new system which will have upgradeability, NVMe, better performance and so on. Also I have a 1070 which I'm saving too, ofc. I think you have 2 options: first would be to get a SATA-PCIe expansion card such as this, though I personally never tried it. Or, like me, sell your current RAM and CPU and just upgrade.

u/Covecube-Christopher · 1 pointr/DataHoarder

x8/x8 is fine.

PCIe 2.0 x8 is enough to run ~20 drives at 120MB/s (each) on a single card. And to even get that many drives, you'd need to use a SAS Expander.
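As a rough sanity check of that figure (a back-of-envelope sketch, not a measurement: PCIe 2.0 moves roughly 500 MB/s of usable data per lane, and the drive speed comes from the comment above):

```shell
lanes=8
lane_mb=500          # approx. usable MB/s per PCIe 2.0 lane
per_drive_mb=120     # sustained MB/s assumed per spinning drive
slot_mb=$((lanes * lane_mb))
echo "x8 slot: ${slot_mb} MB/s, drives at full speed: $((slot_mb / per_drive_mb))"
```

The raw math gives around 33 drives; the comment's ~20 is the conservative real-world figure once protocol overhead and expander oversubscription are accounted for.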

Honestly, an 8i card with an expander may be a better idea. It gets you about 24 drives (or 40 with two expanders), and may be cheaper than the 16 port card.

Heck, it actually recommends a SAS9211-8i card here. That's exactly what you'd want.

u/ShdewZ · 1 pointr/buildapc

Doesn't seem to have an m.2 slot but if you have free pcie x4 slots you could get an adapter like this.

u/ChaiGong · 1 pointr/freenas

> If it's just a data drive and you're not looking to do anything super fancy with it. These work great.

I had a card with the same chipset and it was utter shite. Seemed to work fine, but I got all kinds of SMART errors (related to data transfer, not the drives themselves), drives would spontaneously be kicked from their vdev array, drive commands would fail, etc.

I recommend never using PCI SATA expanders. You can get an LSI HBA for the same price plus rock solid performance and better speed.

u/velogeek · 1 pointr/DataHoarder

The issue is that SAS1 was created during the era of PCIE 1.0. Even at 12Gbps per cable (of 4 channels), you would still saturate an x4 card with a single port. It was necessary to use an x8 and even then, it was possible to saturate that link with a dual connector card.

PCIE 2.0 was released just before SAS2, and this basically put it in the same scenario, where you can get 24Gbps per connector but only 500MB/s per PCIe lane. So, a dual connector card can do a theoretical 48Gbps whereas an x8 PCIe slot can only do 40Gbps raw. Again, not a terrible bandwidth issue because workloads are rarely that sequential.
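The comparison can be tallied directly, using the standard raw signalling rates (SAS2 at 6 Gbps per lane with 4 lanes per connector, PCIe 2.0 at 5 GT/s raw per lane); this is a sketch of raw rates, not usable throughput:

```shell
sas2_lane_gbps=6         # SAS2 raw rate per lane
lanes_per_connector=4
connectors=2             # dual-connector HBA
pcie2_lane_gts=5         # PCIe 2.0 raw rate per lane (GT/s)
pcie_lanes=8             # x8 slot
sas2_gbps=$((sas2_lane_gbps * lanes_per_connector * connectors))
pcie_gbps=$((pcie2_lane_gts * pcie_lanes))
echo "SAS2 dual-connector: ${sas2_gbps} Gbps vs PCIe 2.0 x8: ${pcie_gbps} Gbps (raw)"
```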

So in reality, there's just never been a business need for an x1 HBA. On the other hand, cards like this one exist in the consumer space for just adding SATA ports to an x1. If OP has a 2.0 slot (possible but not sure how likely since standard PCI was given as an option...) then that card can add a few ports - it just won't be expandable and the jury is out on whether or not it's a true HBA in regards to passing drive data.

u/mvillar24 · 2 pointsr/unRAID

The question about PCI-E SATA cards is how much you are willing to spend and what available PCI-E slots do you have on your motherboard.

The cheapest I've tried (with slowest throughput) when you only have PCI-E 1x slots free is to use four port SATA cards like this Marvell 88SE9215 chipset based card for $33 on Amazon:

If you got at least a PCI-E 4x slot you can get something faster for $100 - $160, such as (note these are 8 port cards):

1. HighPoint RocketRAID 2720SGL 8-Port
2. Supermicro AOC-SASLP-MV8

Or used on eBay:

3. Dell HV52W PERC H310

A number of the above solutions are not as fast as you can go since they use PCI-E 4x slots. But 8x slot cards can cost a lot more. Personally I don't notice the slowdown as much since I'm really using these drives to stream, and don't notice that parity checks and moving data from cache to permanent drives take longer.
u/BWC_semaJ · 3 pointsr/buildapc

Now what is interesting, there are adapters that take a PCI Express slot and convert it to SATA 3 ports, which I highly recommend looking into.

I'll just link an example. Please do your own research on this though. I might link you one that might not work for your motherboard.

Also, it might be a bit expensive. There are a lot of these adapters out there so best of luck.

EDIT: Another thing to look into is if you can have the OS on one of these adapters. I personally don't have that much experience with them so I don't know as of now but I'm sure if you do a bit of research you could find the answer.

u/mcracer · 2 pointsr/homelab

Depends on how many drives you want to support. If only 4 or less you can get by with a cheapo PCIe SATA card like this one

If you want to go bigger, then you are looking at your original list... You can get them pre-flashed like this one

u/jcpb · 17 pointsr/shittykickstarters

Software RAID, simply because it's the only way they can fit everything into a USB thumb drive the size of a pack of Wrigley's chewing gum.

Real, actual, hardware RAID requires a fully dedicated controller designed for said purpose, along with its own RAM cache and heatsinks (Example 1, Example 2). I'd like to see these clowns attempt to shrink-ray one of these server cards into something small and power-efficient enough to run off USB 3.0.

u/nobearclaw · 1 pointr/freenas

I use one of these: IO Crest 4 Port SATA III PCI-e 2.0 x1 Controller Card Marvell Non-Raid with Low Profile Bracket SI-PEX40064

It works well, but I don't do tons of read/write as I only use mine for backups. I would recommend getting an HBA if you're going to use it for a lot... going to be better for you in the long run.

u/godzplague89 · 1 pointr/DataHoarder

Hmm, will it still perform if the only other PCIe slot on my mobo that can fit an x8 card will only run in x2 mode? Wonder if something like this would be better?

u/TwoAprilFools · 1 pointr/homelab

They disabled the M.2 slot on the motherboard for firmware over 1.0.2. You can downgrade, but it comes at the price of having the Intel bug.

I do not believe that there is a card that will allow two NVMe drives on the same PCIe slot, but I'd love to be proven wrong.


I have a Dell T30 with this card. And it works very well, I haven't done a speed test, but it boots Proxmox in seconds.

u/hertzsae · 1 pointr/freenas

I bought this and hooked up mirrored SSDs. It's not one of the "recommended" controllers, but those are all expensive, and it can't be worse than USB. I'm also not using it for my data, just boot disks. Been running great for a while now on 12.2

I/O Crest 4 Port SATA III PCI-e 2.0 x2 Non RAID Hard Drive Controller Card Marvell 9235 Chipset

u/lawpetex · -1 pointsr/DataHoarder

Besides what others have said, I've used this for almost a year without problems

Might as well plug up those pcie x1 slots that you'll never use

It's slow in theory, but in a 5400rpm RAID environment it doesn't make that much of a difference. The ASM1061 chip is pretty solid; Asrock has those on many of their MBs.

u/SuperPunnyRedditName · 1 pointr/Proxmox

This is my server case that I use. I use these cables to go from the backplane to these PCIe SATA cards. Back when I used to have my server running Windows I had bought a RAID controller. It wasn't until after I switched to Proxmox that I found out the controller wasn't compatible. I think this is a much cheaper option, and I already have multiples of these so my server is pretty much already filled. That is a good idea though. I just didn't really find a good authoritative list of what RAID controllers actually work well with Proxmox a few years ago, which is why I went this route. Thanks for the idea though

u/Iron_Yuppie · 1 pointr/freenas

> lsi sas

Thank you so much! I was looking at this one (a 4-port SATA card) - any feedback? What's known to be supported in FreeNAS? Browsed around, but didn't see any listing specifically for support.

u/Big_Papi_Knows · 2 pointsr/buildapcsales

I have one of these that has worked well for me. I think there are better performance devices out there, but for a HDD that's just storing media for me it has worked like a charm.

IO Crest 4 Port SATA III PCI-e 2.0 x1 Controller Card Marvell Non-Raid with Low Profile Bracket SI-PEX40064

u/dhess · 2 pointsr/haskell

I run NixOS on both a Jetson TK1 (armv7l) and a Jetson TX1 (aarch64). It works great on both of those boards. On the TX1, I use a Mailiya M.2 PCIe to PCIe 3.0 x4 adapter along with a Samsung EVO 960 NVMe drive to host the entire NixOS filesystem, and it really flies.

Out of the box, Nixpkgs does not support GHC on any ARM architecture, because -- believe it or not -- the GHC bootstrap process on Nixpkgs starts by downloading the binary distributions for GHC 7.4.2, and there is no binary distribution of 7.4.2 for any ARM. (Even if there were, you would not want to wait for it to bootstrap through 3 or 4 versions of GHC on ARM! Building even a single version of GHC on the TK1 is brutally slow.)

However, I've created a Nixpkgs overlay that downloads the Debian package for 8.0.1 and uses that to bootstrap a Nix derivation for 8.0.2. I posted links to a rough version of that overlay in the comments there. I've been using this overlay to build Haskell packages for armv7l on my TK1 for months now with great success. I thought it worked on aarch64 as well, but based on the feedback from a tester in that GitHub issue, it sounds like it doesn't work anymore for that platform. In any case, over the next week or so I'll try to post a working version of the overlay somewhere on GitHub.

Re: the Jetson TX2, I can't get NixOS to boot on it, which is odd given how similar the platform is to the TX1. It can't find the root filesystem from the initrd. I even tried the official linux-tegra kernel, which is maintained by Nvidia devs and has bleeding-edge support for Nvidia's Tegra platforms, but to no avail. I haven't tried the recently-released 4.14 kernel yet, but I will soon.

u/Thaurane · 1 pointr/windows

It sounds like you want RAID 10. If your motherboard doesn't support it, you will need to buy a RAID card. I would recommend the LSI brand (I'm currently using an LSI card for RAID 6). This one appears to support RAID 10; in the Q&A there is a user who explains how to set it up.

u/DARKZIDE4EVER · 1 pointr/PleX

&gt; SI-PEX40064

Thanks, I initiated a return with refund to Amazon for the Syba I ordered, which had the RAID option, and got the IO Crest of that same model you mentioned. Hope this works.

Here is the link:

u/tastemakerchuck · 1 pointr/homelab

Thank you, do you have those fit into a regular PCIe card like this:

u/johnnyp42 · 1 pointr/computers

SATA data cables can't be split; it sounds like that is what you're looking for. If you need more SATA ports than your motherboard provides, you can add more with a PCI-E card.

The thing you posted a picture of is supposed to connect to a RAID card or a PCI-e controller - that's why it has that weird connector you don't recognize.

Adding a PCI-E SATA controller is going to be the cheapest and easiest way to add more ports. Something like this.

u/jaxspider · -1 pointsr/DataHoarder

SAS really isn't for the average joe. I would highly recommend against it unless you are willing to go all in. If you really want to do SAS, then do it right: with RAID 5 and a good RAID card.

I recommend the LSI 9260-8i. It's only $40~$50 more than the one you picked, but it's the cheapest SAS RAID card I can personally vouch for.

For RAID redundancy you can actually count on (so the RAID doesn't die on you), get the BBU battery kit. It costs around $160~$177 on its own.

The 9260-8i and its BBU were used heavily at my last job. My former employers put this specific RAID solution in their 2U Supermicro 2010/11 server models. They have since upgraded to much more expensive SAS RAID solutions, which is how I know it actually is durable and worth the investment. The RAID card has a really good 3-year warranty, but be warned that the battery is basically out of warranty after a year. For home NAS usage this is well beyond overkill... but we hoarders love that kind of shit.

Furthermore, I don't know anything about IT firmware flashing. My opinion is strictly regarding the SAS raid card itself.

u/lordmycal · 1 pointr/Windows10

I used to have a sata controller that would do what you're talking about. You'd attach the SSD and the hard drive to the same controller and set it up so that the SSD would cache reads. I think it was called HyperDuo...

I think it was something like this:

This looks like it would also fit the bill for you:

u/kupan787 · 1 pointr/DataHoarder

Good question. I'm running this on a white box/DIY NAS. It's got an older AMD A10-7860K processor, 32 GB of DDR3, 6 SATA III ports on the motherboard, and 4 SATA ports on an IO Crest SI-PEX40062 SATA III PCIe 2.0 x2 card.

I’m not sure the best way to diagnose the iowait issue. I just know that when Duplicati starts a database rebuild, the CPU is pretty idle (5% in use), but the load average is 3.4/3.7/3.8, and WA shows around 20-30%.

What is the best way to find what is blocking IO?
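For reference, a few stock Linux tools will usually point at the blocker. A sketch (`iostat` and `pidstat` come from the sysstat package; the intervals are examples):

```shell
# 1. Per-device stats: a disk sitting near 100 %util is the bottleneck
iostat -x 5 3

# 2. Per-process disk IO activity
pidstat -d 5 3

# 3. Processes currently in uninterruptible sleep (state D = waiting on IO)
ps -eo state,pid,comm | awk '$1 == "D" {print $2, $3}'
```

If the D-state list keeps showing the Duplicati process while one disk pegs near 100 %util, that disk (or the controller it hangs off) is the choke point.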

u/port53 · 1 pointr/homelab

$20 cheaper, not in an external case (less effort, easier to return if it goes bad)

Also, if you're serious about storage, spend the money on a good hardware RAID card. This card is what I use.

u/usb_mouse · 1 pointr/HomeServer

thanks for those answers,

3) I looked around; an HBA is apparently a Host Bus Adapter. I found some crazy expensive prices and some not that high. Is this good for the job?

Unraid looked nice and all, but I would like to run free/open-source software, and to my comprehension it isn't. Also, I don't mind spending some time getting to know a new tool, and I'm decently familiar with the Linux CLI.

5) that makes a lot of sense, separate box it will be.

u/m3ki · 1 pointr/freenas

Interesting thank you I will have to do more research on cases.

So regarding sas expander:
does this work with the motherboard I specified?

Do I just take a cable and plug one end into sas port on the mobo and another into the sas expander?

u/JohnDF85 · 1 pointr/homelab

Lol, "utilize full speed of my SSDs" = I meant whether PCIe x16 vs x4 slows it down or anything like that. This adapter that you posted - would you recommend that one over the one posted earlier in this thread by @twoaprilfools?

u/Stingray88 · 1 pointr/freenas

I did some research and apparently that's not universally true. If you have a motherboard that supports bifurcation on the pcie slot, you can use a much much cheaper adaptor like this one

Now I just need to figure out if my motherboard supports bifurcation... Not sure where to find that...
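There's no universal software query for bifurcation support; the board manual or the BIOS (look for a "PCIe bifurcation" or "x4x4" option) is the authoritative source. Once a dual-M.2 card is installed, though, you can at least verify it worked, since each drive should enumerate as its own PCI device (a sketch; the PCI address is an example):

```shell
# Each NVMe drive behind a bifurcated slot shows up as a separate PCI function.
# If this counts only one drive when two are installed, the slot isn't splitting lanes.
lspci -nn | grep -ci 'Non-Volatile memory controller'

# Optionally check the negotiated width per drive (expect x4 each on a 4+4 split)
sudo lspci -vv -s 03:00.0 | grep LnkSta
```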

u/GeekMania · 2 pointsr/PleX

With regards to additional SATA connections, a SAS backplane to an HBA card is out of the question at the moment, but to start with, would something like this be fine to use:

If so, would it be fine to connect the drives that Plex uses to that, or will that cause issues with the Plex server? (If I'm watching something on an HDD connected to the PCI card and my partner wants to watch something else on the iPad, but that is on a different HDD also connected to the PCI card.)

u/CookieLinux · 3 pointsr/DataHoarder

You could get a SAS RAID card and cables all for ~$27 on Amazon. That would give you 8 ports with those cables, and if you wanted, you could use a SAS expander cable to get more.


Or get a $20 4-port SATA RAID card and go with that.

FYI, you can plug a SATA hard drive into a SAS RAID card, just not the other way around - just like how you can fit a small box inside a big box but not the other way around.

My recommendation would be to get the SAS RAID card so you have a little expansion room if you want more drives.

u/AFellowOfLimitedJest · 2 pointsr/buildapc

Just finished my first build, and realised I need more SATA III ports than the 2 on my GA-AB350M-Gaming-3.
In lieu of a motherboard change, I'm looking to add more with a PCI-e card.

Am I right in thinking that this "Syba SI-PEX40064 SATA III 4 Port PCI-e x1 Controller Card" will:

a) fit into a PCI Express x16 slot, running at x4 (PCIEX4) (PCI Express 2.0 standard) on the GA-AB350M-Gaming-3?
b) work just like SATA III ports on the motherboard (same speed, function)?

2 SSDs are already attached to the 2 SATA III ports on the motherboard.
I intend to attach 2 old HDDs (1.5 TB + 500 GB, 5900 rpm SATA III 6 Gb/s) to the card, and another SSD in the future.

u/zSars · 1 pointr/unRAID

I know for a fact that this one works:

and this one does not work

Hope this helps

edit: Pretty sure the Marvell chipset makes the difference

u/Martelol · 2 pointsr/buildapc

Pretty much this.

As for your not-enough-SATA-ports issue, you'll need an internal PCIe card or a USB-to-SATA adapter.



There's really zero reason to go the USB adapter route though, at that point you might as well just get an external dvd drive:

u/thatoneguyyouknow3 · 2 pointsr/DataHoarder

I don't have enough money or space to put them in a server like I'd like to, so for now they're just in my PC. I did have to get myself a RAID card though, which will someday be taken out and go in a proper server.

Re-purposing my old 8TB drive for backups.

u/Antrasporus · 4 pointsr/DataHoarder

It depends on how many more ports you need and whether you have some PCIe slots free. I have two Syba cards with 4 ports each, similar to this one here - amazon link

u/Astealoth · 2 pointsr/pathofexile

Are you using a Z87 or Z97? If you have one of those platforms you can crank your RAM up to 2133MHz or so, and if you have a copper heatsink on your CPU you can push a 4690K to around 4300-4400MHz, even on boards like B85 and H97 that might be limited to 1.2V of CPU power delivery. And you could get a PCI-E storage device these days really cheap: the 256GB Intel 600p is down to $100 on Newegg, and even if your board doesn't have M.2 slot support you can get a PCI-E M.2 NVMe riser card for around 20 bucks.

u/Bcron · 3 pointsr/pcmasterrace

The lowly and forgotten PCI-E x4 to m.2 adapter.

It's an interesting bit of tech for 20 bucks, because it is precisely what it says - it allows one to run an M.2 drive over 4 lanes of PCI-E.

It isn't really difficult to make these things, since each PCI-E lane needs a trace to the corresponding pin on the M.2. It allows for one to put an NVMe drive into a system that otherwise wouldn't have that functionality. All one needs is a free PCI-E spot that isn't forced to x8 electrically by cutting the front 8 lanes (the front 8 are nearest to the left notch, and some motherboards make bottom-most PCI-E slots dead on those 8 lanes by not putting traces on those lanes, so that a graphics card will be forced to x8 on the back 8 lanes).

I think it's just marvelous and illustrates how well thought-out the PCI-E protocol is - my X79 rig from 2012 now has a boot drive that wasn't even a concept 5 years ago, thanks to PCI-E, a simple BIOS mod, and this adapter.
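If you want to confirm such an adapter really negotiated all four lanes rather than falling back to x1, lspci reports both the slot's capability and the live link state. A sketch (assumes one NVMe drive in the system):

```shell
# Find the NVMe drive's PCI address, then compare capability vs. negotiated state
addr=$(lspci -D | awk '/Non-Volatile/ {print $1; exit}')
sudo lspci -vv -s "$addr" | grep -E 'LnkCap|LnkSta'
# LnkSta reporting "Width x4" means the adapter got its full four lanes
```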

u/MutedProfessional · 3 pointsr/unRAID

Any AsMedia, Marvel, etc. SATA card will do.

This one on Amazon is a PCIe x2 card, should do fine for a low-end setup:

If you want something better, take a look at LSI cards, but that might be overkill for your needs.

u/zax9 · 1 pointr/techsupport

This card on Amazon is only $16. This card on NewEgg (sold by a third party) gets you 2x SATA-III and 2x ESATA-III ports for $12. If you don't mind the wait, it looks like the same no-name card on NewEgg can be had direct from Hong Kong via this ebay listing for about $7.

u/cf18 · 1 pointr/buildapc

What you need is an M.2 SSD to PCIe slot adapter like that. Your onboard M.2 has a 10Gb/s limit. If you put that card into your x16_3 slot, which operates at PCIe 2.0 x4, it can get 20Gb/s, assuming it is not reduced to x1 speed due to other things being used. From the mobo spec:

&gt; The PCIe x16_3 slot shares bandwidth with USB3_E12 and PCIe x1_4. The PCIe x16_3 is default at x1 mode.

Or you can put it in the x16_2 slot and it can get the full PCIe3.0 x4 speed at 32Gb/s. But it will reduce the video card slot from x16 to x8 speed, so it can reduce the gaming performance by a little bit:

u/ctrlaltd1337 · 2 pointsr/bapcsalescanada

I have a server motherboard with SAS -> SATA ports, so I can plug in one of these and get 4x SATA ports from one port. For power, I use these power splitters that allow me to neatly add power to a column of HDDs. I run one of the splitters per power line from the PSU. Before I had a server motherboard, I used this PCIe SATA card.

u/LanZx · 2 pointsr/buildapc

Well, it's a PCIe lane, so you shouldn't have any speed issues, especially with HDDs. These have SATA3 ports, which can handle SSDs.

Link: Maybe something like this

u/PeterC18st · 2 pointsr/macpro

The cMP 5,1 uses SATA II, which will limit your I/O throughput to 200-300 MB/s. USB 3 is fast enough. Honestly, get yourself an SSD RAID card and enjoy NVMe speeds, along with a USB 3.1 PCIe card. NVMe RAID card: HighPoint SSD7101A-1 4X

USB 3 card: Sonnet Allegro Pro USB 3.1 Type A PCIe Card (Four SuperSpeed 10Gbps USB Connectors)

Also, you can get Thunderbolt 3 on this machine, but it involves a reboot from Boot Camp.

Thunderbolt 3 card: Gigabyte GC-Titan Ridge

how to get thunderbolt 3 working

u/DMRv2 · 1 pointr/homelab

Yeah, I have 32GB ECC RDIMMs as well - I got it for $70-75/stick and have seen similar prices.

This is probably the AOC "HBA" too:

u/12sub · 1 pointr/DataHoarder

This is one of them. Can't remember the other model; pretty much the same thing though.

u/wagon153 · 17 pointsr/buildapc
How does this look? Board checks all the requirements, and the CPU comes in at a very low 25w TDP, which is great for a system that will be running 24/7. In addition, the board comes with a PCI-E slot, so if you need more SATA ports than the 4 it comes with, you can buy one of these.

PCPartPicker part list / Price breakdown by merchant

Type | Item | Price
:----|:-----|:------
**CPU** | AMD 5350 2.05GHz Quad-Core Processor | $34.99 @ SuperBiiz
**Motherboard** | ASRock AM1B-ITX Mini ITX AM1 Motherboard | $34.99 @ Micro Center
**Memory** | Crucial 4GB (1 x 4GB) DDR3-1600 Memory | $12.99 @ SuperBiiz
 | Prices include shipping, taxes, rebates, and discounts |
 | **Total** | **$82.97**
 | Generated by PCPartPicker 2016-06-28 00:01 EDT-0400 |

u/Josey9 · 1 pointr/DataHoarder

Thanks for the warning. If I buy this directly from Amazon, it should be fine, right?

u/SysAdmin907 · 2 pointsr/sysadmin

High Point? Skip it and look for an LSI 9211-8i for $80 on Amazon. Hmm... you're getting screwed. Try this (an LSI SAS 9207-8i). Also look at Runtime Software for data recovery (especially if it was a RAID drive).

Don't forget the cable.. If it's a Dell drive, you might need a Dell PERC controller..

u/nukee26 · 1 pointr/freenas

I've been using this SATA card for my boot device since March and haven't had any issues. It's a good bit cheaper than what you posted too.

u/cetteup · 1 pointr/Proxmox

It's honestly way easier to get a cheap SATA PCIe controller, attach the BD drive to it, and pass the controller through to the VM. You can get controllers for around 20 bucks (e.g. this one). PCI passthrough on Proxmox is experimental, but the setup is well documented.
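A rough outline of that setup (a sketch: VM ID 100 and PCI address 05:00.0 are placeholders, and IOMMU must be enabled in the BIOS as well):

```shell
# 1. Enable IOMMU in /etc/default/grub (use amd_iommu=on on AMD CPUs),
#    then run update-grub and reboot:
#    GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on"

# 2. Find the add-in SATA controller's PCI address
lspci -nn | grep -i 'sata'

# 3. Hand the whole controller to the VM; the guest then sees the BD drive natively
qm set 100 -hostpci0 05:00.0
```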

u/mmm_dat_data · 1 pointr/DataHoarder

yea I think a good amount of people on here, including myself, use this HBA

amazon has em and there's lots of tutorials if you do some googling...

u/-XorCist- · 1 pointr/AMDHelp

This one matches my mostly black interior. It looks like it'll work; can you look at it and let me know what you think?

black adapter

u/PearsonFlyer · 1 pointr/radarr

You're going to want something like this:

This is a hardware issue though, not really related in any way to Radarr.

u/Lev00 · 1 pointr/JDM_WAAAT

I have that board and can confirm the option exists in the BIOS. I am considering this card to try for a few reasons... it's an x8 form factor, so it'll fit the motherboard. I'm using an 847 chassis, so I can only fit half-height cards.

u/kanid99 · 6 pointsr/homelab

HP 462919-001 Smart Array P410 8-Port SAS RAID Controller 256MB ECC DDR2 SDRAM PCI Express x8 300MBps 2 x Mini-SAS SAS 300 Serial Attached SCSI Internal

u/ethraax · 1 pointr/linux

Okay, I looked up which exact cards I have. I have three HighPoint RocketRAID 2720SGL in my three Ceph storage nodes. They operate in JBOD out-of-the-box on Arch Linux. I'll probably be installing Gentoo to them at some point in the future, so then I'll really find out if support for them is included in the kernel. Either way, they perform just fine under JBOD and behave as expected. They even support hotswap, although you need a script to clean up after removing one (Linux keeps the device around - it doesn't get removed "cleanly").

I only ever wanted the SAS connectors for my backplane, so I completely ignored the RAID features of the card.

One annoying thing is that they seem to hijack the boot process, to show you the status of your "RAID array" (just disks in my case). It adds about 20 seconds to the boot time, which is annoying, but they're servers and are rarely restarted, so that's fine.
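The cleanup after pulling a disk amounts to poking sysfs; a minimal sketch (device names are examples, and this assumes nothing from the pulled disk is still mounted):

```shell
# Tell the kernel to forget the yanked disk's stale device node (here /dev/sdc)
echo 1 | sudo tee /sys/block/sdc/device/delete

# After inserting a replacement, rescan every SCSI host so the new disk appears
for s in /sys/class/scsi_host/host*/scan; do
    echo '- - -' | sudo tee "$s" > /dev/null
done
```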

u/bothunter · 1 pointr/techsupport

You'll need this for the power:

And maybe a SATA controller like this if your motherboard doesn't have a SATA port on it already:

If your computer is old enough to not have the PCI express and only has PCI ports, you'll need this instead:

u/Deemo13 · 1 pointr/buildapc

If SATA ports is all you need, you could always just add a SATA card like so:

PCI-E SATA expander thing

This is a non-RAID one though, so if you need RAID, consider getting a RAID one.

u/epistaxis64 · 1 pointr/unRAID

Usually people will buy a separate SATA PCI-e card like this:

u/Y0tsuya · 0 pointsr/DataHoarder

A single Rocket 750 supports 40 ports.

u/mightychicken · 1 pointr/techsupport

I bought this card, an M.2 to PCI-express adapter. The drive is in a PCI-Express x16 slot.

Another screenshot. The M.2 SSD is "D:"/Disk 1.

Before installing, I updated my BIOS to 2.90P, which supports NVMe according to the patch notes on ASRock's website.

u/candre23 · 66 pointsr/DataHoarder

If you don't have a motherboard that supports the asus card, you can still buy the bare high point card for $400 and fill it yourself for less than a third the cost of the 4TB preconfigured high point.

u/Balmung · 1 pointr/techsupport

You sure a cheap PCIe SATA card wouldn't give better performance? The reviews say it does get around 350-400MB/s, so it is better, but not the full 500.

u/jeffrife · 1 pointr/buildapcsales

&gt; LSI 9211

Are there certain specs to look for? Trying to find out the difference between this card and something like this that is a third of the price.

u/Puptentjoe · 1 pointr/DataHoarder

I got these when they were $39. Then I got the top cages when those were on sale for like $50 back in the day. Go with the iStarUSA; it's sturdier to me. I had to send one of the Icecage (top ones) back because of a broken connection.

No HBA, just a bunch of these: IO Crest 4 Port SATA III PCI-e 2.0 x1 Controller Card Marvell Non-Raid with Low Profile Bracket SI-PEX40064. Not the best, but it's all I knew at the time that would work. Next build I'll go a better route.

u/blaziecat1103 · 1 pointr/buildapc

If your core problem is running out of SATA ports, a cheap expansion card could solve this problem more elegantly than external enclosures.

u/tms10000 · 1 pointr/DataHoarder

All HBAs are going to work out of the box with a modern Linux distribution. I use super crappy IOCrest- or Syba-branded HBAs like this one:

But I learned from this sub that those should be avoided. The ones that tend to be recommended are the ones with LSI chipsets, but with those I have little experience (though many recommendations exist on this sub).

u/mrfixitx · 6 pointsr/DataHoarder

If you just want something inexpensive that will still allow you to saturate gigabit LAN, you could get this card that uses PCIe x1.

u/mmelvin0 · 1 pointr/buildapc

My current setup is 850 Pro SATA on an ASUS Sabertooth mk1. (EDIT: Mobo has no m.2 slot.)

I was wondering about upgrading to an 960 EVO or Pro.

Is it worth it to get an m.2 adapter like this?

u/Raymich · 2 pointsr/unRAID

Oh, I know this crappy card; I had to return it before I even opened it fully. See that massive capacitor near the PCIe connector? Check if yours is still attached and not wobbly; mine had fallen off during delivery. I have noticed that Amazon is cluttered with rebranded versions of these cards.

Edit: if you are looking for a cheap alternative, this is the one I got in its place back in 2018; still alive and kicking in my unRAID box:

Syba SATA III 8 Port PCI-E 2.0...

u/phrekysht · 3 pointsr/homelabsales

You should check out this one; it's only x8, so two M.2 slots, but it's cheap enough to buy two. I picked mine up for $40 on Amazon.

u/SeaNap · 1 pointr/PleX

DrivePool doesn't prevent data loss; the SnapRAID parity drive does. Check out their FAQ for how it all works, pretty cool stuff. But yeah, you're right: if your drive fails you have two options - fix it under warranty (deleting everything on it, or getting a new drive), or throwing it out. You buy a new drive to replace it, and the parity rebuilds the drive with all the data that was on the failed one.

I'm powering 16 through a SAS backplane to an HBA card, 6 on the onboard SATA, and 4 in a cheap PCI SATA expander (24 data and 2 SSD OS drives).
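For reference, the parity workflow described above boils down to a few SnapRAID commands (a sketch; the disk name `d1` is a placeholder from a hypothetical config):

```shell
# Parity is only as fresh as the last sync, so run this after content changes
snapraid sync

# Periodically verify data against parity to catch silent corruption early
snapraid scrub

# After swapping in a replacement disk, rebuild everything that lived on it
snapraid fix -d d1
```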

u/svenge · 1 pointr/buildapc

Hmm, that is problematic. Perhaps something like this PCIe to SATA add-on card would be a solution.

u/ThinkMention · 1 pointr/buildapcsales

It always requires PCIe, since it connects to the CPU using that; however, to use it as a boot drive requires NVMe support on the motherboard.

On old motherboards that predate NVMe, an adapter can be installed into a PCIe slot. With it, an NVMe drive can run on an old motherboard, but only as a secondary drive; you can't install an OS on it or boot from it.

NVMe drives span a wide range of performance, from around SATA SSD speeds up to many times faster, and the price usually reflects that: the fastest NVMe drives are pricey.

u/Sl0rk · 3 pointsr/unRAID

This is the one I use and it works fine - the IOCrest SI-PEX40064.

Although it costs a lot more than it should for some reason. I paid $15 for it on Newegg.

u/datahoarderguy70 · 1 pointr/DataHoarder

A few things to understand: hardware RAID involves using hard drives that are the same size, otherwise you can't do it. You can, however, connect a bunch of hard drives to an HBA, or Host Bus Adapter, so they all show up as individual drives; it depends on the OS you are using to do this. Cache on a RAID controller only matters if you are doing hardware RAID. You could start with something like this IO Crest card to keep it simple, or you could go with a Dell H310 and configure your drives as JBOD (Just a Bunch Of Disks), which means they are all individual drives. Either way you won't have any redundancy, something hardware RAID offers.

u/Raffy_ruck · 1 pointr/DataHoarder

> The issue was these WD My Book drives come as exFAT. That prevented the boot of the Windows OS for some reason.

That explains my troubles last night after shucking.

It wasn't able to boot with the My Book plugged into the motherboard, but was able to after plugging it into my PCI-Express Sata card.

u/whoisstewiegriffin · 1 pointr/buildapc

Hi folks - I need to buy a new controller card - I currently have this controller. Wondering if there is a newer model that I should consider or another one you might recommend? This is the motherboard. It does need to be low profile.

u/daemen · 2 pointsr/freenas

Like the other commenters have said, you don't want a RAID card, but either an HBA or SATA expansion card.

Here's a 4-port one:

There are plenty of reviews of the same model saying it works out-of-the-box with FreeNAS.

u/IvivAitylin · 1 pointr/Amd

I just grabbed a cheap one off Amazon, this one specifically.

u/presler · 1 pointr/hackintosh

I also have a Z800 hackintosh (X5675, 48GB RAM) and would recommend the following, which work OOB in Sierra 10.12.6, no extra kexts needed.

  1. Add USB3.0 ports and possible future expansion to a front panel using this

  2. Add SATA 3 (6Gb/s) capability to this monster using this.
u/095179005 · 2 pointsr/buildapc

Nope. Electricity works that way, data doesn't.

You will need to plug the 960 EVO into the topmost M.2 slot, labelled M.2_3.

That will leave you with 7 SATA ports.

You will then need a PCIe to SATA card, to give you the extra 2 ports you need.

You'll plug it into the PCIe_4 slot.

u/cjlee89 · 1 pointr/unRAID

4-port SATA card:

Get some good SATA cables as well. One of the ones that came with the controller was bad for me.

u/LimoncelloOnIce · 1 pointr/Amd

I bought a few of these from Amazon, there are non-RAID versions also.



Either should work; I have some B350 boards (Gigabyte and Asus, no ASRock), but nothing is set up for the 3000 series yet - they need flashing. I have a 3600; I can test one and let you know if you want.

Also, does it matter what slot your Syba card is in?

u/crdmiller · 1 pointr/HomeServer

No ECC, no go (especially for ZFS/storage cache). Years ago I went AMD for this reason; the FX 8 series is dirt cheap and supports ECC.

I don't think the Asus storage solutions are any better than just having more PCIe slots and buying cheap IT-mode cards.

Your board specs:
Total Slots: 8 (4-channel per CPU, 4 DIMM per CPU)
Capacity: Maximum up to 256GB LRDIMM
Memory Type:
DDR3 1866/1600/1333/1066 RDIMM
DDR3 1866/1600/1333/1066 UDIMM
DDR3 1866*/1333/1066 LRDIMM
Memory Size:
32GB, 16GB, 8GB, 4GB, 2GB RDIMM

u/kcehlers · 2 pointsr/PCBuilds

IIRC SATA (data) cables can’t be split. If you’re talking about just the SATA power cable, I wouldn’t worry about anything. If you’re talking about data, I ran into this same issue a while back. Found a pci-e card with two sata ports on it pretty cheap and it worked right out of the box. I believe this is the one I have.

u/Jordanl91 · 2 pointsr/PleX

How many PCIE slots do you have open? You can always just get one of these
IO Crest 4-port SATA III PCIe 2.0 x2 Controller Card Green, SI-PEX40064
Rebuild your case if you don’t have enough space and get hot swappable 3.5 bays. It all just depends on budget and what you want / have.

u/manifest3r · 2 pointsr/unRAID

I have a 4-port PCI-e expansion card, using it without any issues on 2 1TB drives, and 1 500GB drive. Model number is SI-PEX40064.

u/Abyssul · 1 pointr/buildapc

This is what I used. I even have my before-and-after benchmarks in the top review (that's me). I'm not sure how much the PCI-E x1 bandwidth limit bottlenecks my current speeds, but I'm experiencing typical speeds with it.

u/smokehidesstars · 1 pointr/buildapc

You could always add a cheap SATA3 PCI Express card:

u/kevin_ol · 1 pointr/pcmasterrace

This plugs into a SAS port, you're probably looking for this

u/bandman2016 · 1 pointr/buildapc

There are PCIe to SATA cards out there; naturally there are many variants, so the best thing to do would be to google around, look at reviews, etc.

Example of such:

u/Direwar · 1 pointr/starcitizen

Do you have any spare PCI slots?

u/largepanda · 2 pointsr/linuxquestions

Using this card with Linux seems to be a very mixed bag of results.

If it's still within the return period you might just return it and get one with a different chipset.

u/betstick · 1 pointr/DataHoarder

One is on an MSI X99-A(?) motherboard, and the other is on this guy: Link

u/Fiberton · 2 pointsr/zfs

Shove one of these cheap ones in. Plug some new drives into it. Start replacing the drives. Once 4 of the drives are replaced, just power down and pull the other drives out if that is what you need to do. Then finish the rest, or get two of those cheap cards.

u/dannybuoyuk · 1 pointr/DataHoarder

I got this, looks identical to the one already posted:

Also have one on the way that hooks up to the mini-PCI-E card slot on my mobo designed for wifi cards:

Bear in mind PCI-E x1 doesn't have enough bandwidth to use all 4 drives at their peak (although copying from drive to drive attached to the same card might be another story).
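The back-of-envelope numbers behind that caveat (assuming a PCIe 2.0 x1 card, like the ones linked above):

```shell
# PCIe 2.0 x1: 5 GT/s raw, 8b/10b encoding leaves ~4 Gb/s usable = ~500 MB/s,
# and that budget is shared by all four ports.
total=$(( 5000 * 8 / 10 / 8 ))   # MB/s usable on the x1 link
echo "$total"                    # 500
echo $(( total / 4 ))            # 125 MB/s per drive with all four busy

# A modern 3.5" HDD peaks at roughly 150-250 MB/s on outer tracks, so four
# simultaneous sequential streams will hit the link limit before the disks'.
```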

u/GT_YEAHHWAY · 1 pointr/unRAID

I have this adapter and it doesn't show up in BIOS on my B450M board.

Should I get a riser then switch adapters?

u/Singular_Brane · 2 pointsr/hackintosh

PCI SATA ports

PCI extension

Then you can always get a PCIe SSD adapter.

u/til_you_rock · 1 pointr/DataHoarder

For $26.99, add four more ports via a PCI-E x1 expansion card.

u/ihoman202 · 1 pointr/homelab

Looking at this for an alternative to going to a totally new server, since I have an old Pentium III that does server-related things. Would this 8-port SATA controller work?