#19 in Network I/O port cards
Reddit mentions of HighPoint RocketRAID 2720SGL 8-Port SAS 6Gb/s PCIe 2.0 x8 RAID HBA
Sentiment score: 7
Reddit mentions: 8
We found 8 Reddit mentions of HighPoint RocketRAID 2720SGL 8-Port SAS 6Gb/s PCIe 2.0 x8 RAID HBA. Here are the top ones.
Replacement model: RocketRAID 2720A
- 6Gb/s (600MB/s) data transfer rate
- Major OS support includes Windows, Linux, FreeBSD and Mac OS X 10.7.x Lion (only for RocketRAID 2722 and 2744)
- PCI-Express 2.0 x8
- 2 internal Mini-SAS ports
- RAID levels 0, 1, 5, 6, 10, 50 and JBOD
- Certified cable: HighPoint Internal Mini-SAS to 4x SATA Cable (HighPoint Int-MS-1M4S)
Number of items: 1
I opted for a Windows Server with lots of storage bays.
Server 2012 has a "storage pools" feature, if you are fairly tech savvy and interested in that type of thing. I found it super easy to set up and configure.
Edit: For personal, non commercial use you can get Server 2012 R2 from Microsoft for free, via dreamspark. https://www.dreamspark.com/
The question with PCI-E SATA cards is how much you are willing to spend and which PCI-E slots you have available on your motherboard.
The cheapest I've tried (with slowest throughput) when you only have PCI-E 1x slots free is to use four port SATA cards like this Marvell 88SE9215 chipset based card for $33 on Amazon:
If you have at least a PCI-E 4x slot free, you can get something faster for $100 - $160, such as (note these are 8 port cards):
On eBay used:
A number of the above solutions don't reach maximum speed, since they use PCI-E 4x slots; 8x slot cards can cost a lot more. Personally I don't notice the slowdown much, since I'm mostly using these drives for streaming, and I don't mind that parity checks and moving data from cache to permanent drives take longer.
Okay, I looked up which exact cards I have. I have three HighPoint RocketRAID 2720SGL in my three Ceph storage nodes. They operate in JBOD out-of-the-box on Arch Linux. I'll probably be installing Gentoo to them at some point in the future, so then I'll really find out if support for them is included in the kernel. Either way, they perform just fine under JBOD and behave as expected. They even support hotswap, although you need a script to clean up after removing one (Linux keeps the device around - it doesn't get removed "cleanly").
I only ever wanted the SAS connectors for my backplane, so I completely ignored the RAID features of the card.
One annoying thing is that they seem to hijack the boot process, to show you the status of your "RAID array" (just disks in my case). It adds about 20 seconds to the boot time, which is annoying, but they're servers and are rarely restarted, so that's fine.
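The hotswap cleanup the poster above alludes to can be sketched as a small shell function. The function name and the `sdX` argument are assumptions, but writing `1` to a disk's sysfs `delete` node is the standard way to make the Linux SCSI layer drop a pulled device:

```shell
# Sketch of a post-hotswap cleanup on Linux. After physically pulling a
# disk, the kernel keeps /dev/sdX around; writing 1 to its sysfs "delete"
# node removes it cleanly.
remove_disk() {
    dev="$1"                                  # e.g. "sdc"
    node="/sys/block/$dev/device/delete"
    if [ ! -e "$node" ]; then
        echo "no such block device: $dev" >&2
        return 1
    fi
    echo 1 > "$node"                          # tell the SCSI layer to drop it
}
# Usage (as root), after pulling the drive from the backplane:
#   remove_disk sdc
```

Run it as root after the drive has already been pulled; on re-insert, the controller and kernel rediscover the disk on their own.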
Yes, they are. A SATA card simply provides more ports so that more drives can be connected individually. It's possible to set up software RAID, but that can be a pain and is slower. A RAID card actually controls the RAID setup with its own chip. That card you linked would NOT work in a PCI-E slot; it is a slow PCI card which is different.
This card is basically the same as the other Amazon link, but it adds support for RAID 0 and RAID 1. Anything that supports more RAID levels will likely be more expensive. This is a good example. It supports RAID 0, 1, 5, 10, 50, and JBOD.
Those prices are very good. If you want the performance boost as well as the extra ports, then go for it. Just keep in mind that third party chipset ports aren't much slower than Intel ports, so a cheap expansion card can serve your purpose equally well.
That card for example can hook up to 8 SATA drives using Mini-SAS to quad SATA cable.
http://www.amazon.com/HighPoint-RocketRAID-2720SGL-8-Port-PCIe/dp/B0050SLTPC is mine. Has some issues on Linux (drives go into SCSI error recovery sometimes, but always come back quickly), but on FreeBSD it's been glorious. The one you pictured looks like mine but with 4x PCIe/SATA 3Gbps, which should be fine for most purposes. I'm running ZFS raidz2; the card is just in passthrough, so JBOD is definitely fine.
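The passthrough-plus-raidz2 setup described above can be sketched as below. The pool name `tank` and the FreeBSD-style `da0`..`da7` device names are assumptions, and the `zpool` command is echoed rather than executed so the sketch is safe to run without eight spare disks:

```shell
# Sketch: 2720SGL in JBOD/passthrough, ZFS raidz2 on top. Pool name "tank"
# and FreeBSD device names da0..da7 are assumptions, not from the post.
create_pool() {
    pool="$1"; shift
    # Echoed rather than executed so this runs safely anywhere;
    # drop the "echo" to actually create the pool (as root).
    echo zpool create "$pool" raidz2 "$@"
}
create_pool tank da0 da1 da2 da3 da4 da5 da6 da7
# prints: zpool create tank raidz2 da0 da1 da2 da3 da4 da5 da6 da7
```

An 8-drive raidz2 vdev survives any two disk failures, which matches the redundancy level the card's own RAID 6 mode would give, while keeping the card as a dumb HBA.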
Hey there. I must admit I've had the solution for a few weeks but I finally have some free time to post! Looks like good timing.
[I used this HighPoint RocketRAID 2720SGL](https://www.amazon.com/dp/B0050SLTPC/ref=cm_sw_r_cp_apa_ASH1BbFM218R3). Worked like a charm. Don't freak out because the controller itself won't see the drives (in bios). Boot into your favorite flavor of Linux and then there are a bunch of commands to get the TRUE byte size. I had to try 3-4 different commands in order to get the right answer.
Now the sg_format will take a loooong time depending on your drive size. There is a flag to just resize instead of formatting, however since these were second hand drives I really didn't mind cleaning it all with fire and also stress testing the drives a little.
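The capacity check and the `sg_format` step can be sketched with sg3_utils. The helper name and the `/dev/sg2` path are placeholders of mine, not the poster's exact commands:

```shell
# Sketch using sg3_utils (assumed installed); /dev/sg2 is a placeholder.
check_capacity() {
    dev="$1"
    if [ ! -e "$dev" ]; then
        echo "no such device: $dev" >&2
        return 1
    fi
    sg_readcap --long "$dev"   # reports block count and logical block length
}
# Destructive low-level format back to 512-byte sectors (takes hours):
#   sg_format --format --size=512 /dev/sg2
# sg_format also has a --resize option for changing the capacity
# without a full (slow) format pass.
```

`sg_readcap --long` is one way to get the true logical block size; comparing its output before and after the format shows whether the 520/528-byte sectors were actually converted.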
The best part about that controller is with a little wiggle and giggle I fit it right into the 710 and used the same sas cables.
Hit me up if you have more questions. Hope this helps!
Blue Iris looks like popular software. Those cams are a bit less than 4k, so you might not need 30TB, but the WD Reds are the drives to get, and you might want a small SSD for the OS. I'd get a 6800k/X99/DDR4 system with a RAID card (here is a lower end unit, but it should work for this using a mini-SAS->SATA adapter https://www.amazon.com/HighPoint-RocketRAID-2720SGL-8-Port-PCIe/dp/B0050SLTPC/ref=sr_1_1?ie=UTF8&qid=1467390211&sr=8-1&keywords=raid+6). You don't need a DVR card b/c the cameras are digital and that CPU should be strong enough for all of them and the RAID processing. Just get a cheap nvidia card (750ti or less), the GPU shouldn't make much of a difference.