#65 in Computer accessories & peripherals

Reddit mentions of Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet

Sentiment score: 26
Reddit mentions: 55

We found 55 Reddit mentions of Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet. Here are the top ones.

Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet
Buying options
View on Amazon.com
    Features:
  • Internal Mini SAS data cable connects a RAID or PCIe controller with an SFF-8087 port to 4 discrete SATA drives; Mini SAS to SATA adapter provides reliable internal connectivity between a Serial Attached SCSI controller card in a computer system and direct-attached storage devices with a SATA connector
  • Leverage hardware RAID performance with this SATA multi-lane cable; Two cables can connect up to 8 SATA drives to span RAID controller arrays and share performance across two PCIe 2.0 x8 lanes with compatible host bus adapters; Supports up to a 6 Gbps data transfer rate per drive
  • DIY or pro installers both appreciate the convenience of a forward fan-out cable with an internal mSAS connector when expanding storage needs; 3 foot cable harness of SAS to SATA cable provides sufficient length for internal cable management; Slim ribbon cables minimize airflow impact in a computer case
  • Flexible design of SAS breakout cable includes acetate cloth tape over slim ribbon cables for strain relief to protect cables without rigidity; Woven mesh sheath covers half of the cable for easy routing; P1 to P4 markers provide easy ID after installation; Low profile SATA connectors have easy-grip treads with stainless steel latches to prevent accidental disconnection and reduce vibration disconnection
Specs:
Height: 0.3 inches
Length: 8 inches
Size: 3.3 feet
Width: 6.2 inches


Found 55 comments on Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet:

u/Droid126 · 9 pointsr/DataHoarder

I use these splitters for more SATA power connectors and these hotswap cages for housing the drives. They are often on sale at Newegg for $40-60. This card, flashed to IT mode, will add another 8 SATA connections via its two SAS connectors (SFF-8087) and breakout cables.

Currently I am running 8x 3TB drives in my PC with a GTX 970, and my 550 watt PSU handles it just fine.

u/LeKKeR80 · 8 pointsr/HomeServer

You can get SAS to SATA breakout cables. SATA hard drives can work with SAS and SATA backplanes/ports, but SAS drives cannot work with SATA backplanes/ports (only SAS).

A SAS controller will work just fine alongside any controller running on your motherboard. You can have multiple controller cards running along with the built-in controller. I don't have experience with the card you listed, but you can get fairly cheap used/refurbished cards off eBay. I prefer LSI SAS chipsets. There are many rebranded versions of the LSI SAS cards.

More than you need now, but you can also get SAS expanders that work kinda like a network switch, but for hard drives.

u/candre23 · 7 pointsr/DataHoarder

Good news and bad news:

The bad news is that the onboard 3008 is SAS1. The good news is that the backplane is a TQ, which means it's just passthrough. It will work with any size drive; you just need a SAS2 (or better) HBA that can supply 8 separate SATA connections. I recommend finding a cheap H310 (usually around $40 used on eBay), flashing it to IT mode, and getting a pair of breakout cables.

u/Ayit_Sevi · 7 pointsr/DataHoarder

You can purchase something like this and then buy two of these breakout cables to add 8 HDDs without using any of the sata ports.

u/zonedguy · 6 pointsr/DataHoarder

You can definitely stick with the Fractal series. I did because I couldn't have a loud, unsightly machine set up anywhere in my home. I have my main system w/ 10 drives + 2 SSDs + 3 NVMe drives in an R6. That has a DAS connected with 19 drives inside an R5: 8 stock bays + 3 in a 2x5.25 bay adapter + an extra 3-drive cage + an extra 5-drive cage.

As you are in Europe, you might not even have to pay crazy shipping charges to buy spare drive cages from https://www.fractal-design-shop.de/Define-R5_1. In the US I had to source the extra drive cages from r/hardwareswap but that proved to be easier than I expected. Here is a pic I took before I added the 2nd 5-bay drive cage: https://imgur.com/a/TWL8IB1

Edit: Request for more info...

I have not done a build log as I am not yet "finished" with the build, but it looks like there is sufficient demand for parts info so here it goes:

I have an R6 for my main NAS server loaded with the motherboard, 10 3.5" drives, and one SSD. The R5 has two extra drive cages (3 + 5) as well as a 2x5.25-to-3x3.5 bay adapter.

The expansion cards I use are:

  • 1x LSI 9210-8i with SAS to SATA cables for 8 of the 10 internal drives in the R6. The other 2 + SSD use SATA ports on the motherboard.

  • 1x LSI-9207-8e connected via 8088 cables to two HP SAS expanders powered in the R6 by riser cards which connect to the drives with the same SAS to SATA cables as above.

    Additional parts I used:

  • An SFX PSU is important so you can fit the extra drive cages. Don't skimp on this one. You don't need a ton of watts (I'm using a 600W Gold) but you need quality; you are hooking up thousands of dollars of drives to it!

  • Power splitters: One & Two

  • Power switch to turn on the DAS PSU and reset it any time you need to take the NAS offline (DAS always must be powered on first)
  • Fan controller for powering fans in the DAS

    More inspiration can be found here: https://www.serverbuilds.net/16-bay-das
u/_kroy · 4 pointsr/homelab

That 9211 is like the gold standard. You shouldn’t have to flash to IT mode, but you do want to upgrade the firmware (which accomplishes the same thing). The real ones are trivial to flash, versus like an H200, so I wouldn’t sweat that.

If you want the “modern” version, the LSI-9207-8i has the most recent chipset.

You can get them new, and quite a bit cheaper, on eBay .

Then you just need a pair of breakout cables

u/completion97 · 3 pointsr/DataHoarder
  • Drivepool and snapraid are usable with an arbitrary number of disks and disk sizes, and it's very easy to add more drives. Tomorrow I could decide to add a random drive and everything would be set up in under 5 minutes.
  • Also, both drivepool and snapraid are usable with drives that have existing data. Just point snapraid at a used drive and it will incorporate it into the parity. With drivepool you add the drive to the pool, stop the drivepool service, go to the new drive in Explorer, show hidden files, find the folder called PoolPart*, move the files into that folder, and start the service. You can still store files outside the PoolPart folder; they just won't show up in the pooled drive.
  • Drivepool's duplication is unneeded and inferior to snapraid parity. Drivepool just duplicates everything while snapraid creates 1+ parity disks. How many parity disks you create determines how many drives can fail before you lose data. So instead of losing 1/2 of your space with drivepool, you only lose 1/5 if you use one parity disk.
  • Note on snapraid: it's best used on drives storing large files that change rarely. Also it's not RAID, which means it's not real-time redundancy, and if you lose a drive you will have downtime. I have a task set up to run snapraid every day to sync any changes I made, so if a drive fails I'll lose at most a day's worth of stuff. Also, if you aren't a CLI person, look into Elucidate; it's a GUI for snapraid. snapraid is very intuitive to set up just by editing the config file, but I still like the GUI to run the different tasks. (See the config sketch just after this comment.)
  • I still use drivepool even though I don't use the duplication, because it pools my disks. I mounted all my drives to folders instead of letters (just so they don't show up in Explorer). Then I added them to a pool. Now I can access all my media drives from one drive letter. Then I pointed snapraid at the individual drive mount points.
  • I bought this HBA (Host Bus Adapter). So far it's been exactly what I wanted. It has SAS ports on it, so you'll need some breakout cables to connect SATA drives. This card allows 8 SATA drives to be connected, which I think is more than your average SATA card. I was able to plug the card in, it did all the drive install stuff automatically, and I was able to use it.

    edit: added more info
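
For reference, the snapraid setup described above boils down to a short config file plus a scheduled sync. A minimal Linux-flavored sketch (mount points and file names are assumptions for illustration, not taken from the comment; on Windows the same directives are used with drive letters or mounted folders instead):

```bash
# Write a minimal snapraid.conf: one parity drive, two content copies, three data drives.
sudo tee /etc/snapraid.conf >/dev/null <<'EOF'
parity /mnt/parity1/snapraid.parity
content /var/snapraid/snapraid.content
content /mnt/disk1/snapraid.content
data d1 /mnt/disk1/
data d2 /mnt/disk2/
data d3 /mnt/disk3/
exclude *.tmp
exclude /lost+found/
EOF

# Run daily from cron / Task Scheduler so at most a day's changes are unprotected.
snapraid sync
# Periodically verify data against parity.
snapraid scrub
```
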
u/NewYearNewAccount_ · 3 pointsr/buildapc

LSI 9211-8I 8PORT Int 6GB Sata+sas Pcie 2.0

Get a card like this and you can add plenty of drives. Note: you can find these much cheaper on eBay, and they will often include a couple of SAS-to-4-SATA adapter cables. Also bear in mind they come in two flavors: one is RAID-controller mode and the second is a simple expansion mode. But you can change that depending on how you plan to use it.

Consider a cheap SSD for your boot drive. Not necessary considering your needs but booting from an HDD gives me a migraine :)

u/rogerairgood · 3 pointsr/DataHoarder

I would suggest an LSI 9211-8i flashed to what is known as IT mode. This mode does not use any RAID and passes the disks directly to the operating system. The 9211-8i has 2 internal SFF-8087 SAS ports. SAS controllers support SAS as well as SATA drives. You can buy a breakout cable like this one, which has 4 SATA connectors on it.

Here's a link to the 9211-8i itself, already flashed to IT mode. You can flash IT mode yourself, it is just a little "involved".
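
The "involved" part is typically erasing the stock IR firmware and writing the IT firmware with LSI's sas2flash utility from a DOS or EFI boot disk. A rough sketch of the commonly described procedure (the firmware file names below are examples from the 9211-8i package and are assumptions; check the flashing guide for your exact card before running anything):

```bash
# Note the controller's SAS address before erasing anything.
sas2flash -listall

# Erase the existing (IR) firmware. Do not reboot or power off until the new firmware is written.
sas2flash -o -e 6

# Flash the IT firmware and, optionally, the boot ROM.
sas2flash -o -f 2118it.bin -b mptsas2.rom

# Re-apply the SAS address recorded earlier (16 hex digits, usually on the card's sticker).
sas2flash -o -sasadd 500605bxxxxxxxxx
```
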

u/puma987 · 3 pointsr/PleX

I've used PCIe SATA expanders with mixed success; sometimes the hard drives would disappear and reappear on a reboot, and you definitely don't want that to happen on a RAID setup. An HBA flashed to IT mode with breakout cables works really well; I use it in my server and it is rock solid.

HBA

Cable

u/zirus1701 · 3 pointsr/PleX

You get a PCI Express SAS controller to install. They come in 2 and 4 port varieties, and you get SAS to SATA cables (which turn one SAS port into 4 SATA connectors) to plug into it to connect your hard drives. That could be 16 drives per card; with a couple of PCI Express slots you'll have more SATA connectors than you have room for hard drives.

Edit: I'm in no way recommending this specific one, but here is one example of what I'm talking about:

https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC/ref=sr_1_4?keywords=SAS+controller&qid=1567797015&s=gateway&sr=8-4 (it's a 2 port variety).

and for cables:

https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=sr_1_3?keywords=SAS+SATA+cables&qid=1567797027&s=gateway&sr=8-3

u/xTheDeathlyx · 3 pointsr/DataHoarder

It would really be more beneficial to just shell out 300 bucks for an R710. I'm pretty sure you'll save more money in the long term since, like you already know, that thing uses a ton of power. Around 300W idle, which, depending on your electricity rates, adds up! An R710 can idle at a third of that.

If you insist on keeping it, the H200 is a great card and can be crossflashed if needed. You'll just need some breakout cables like these: https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC

u/nealbscott · 3 pointsr/buildapc

Assuming you have PCI Express 3 in your computer's expansion slots, get a card like this:

LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/ref=cm_sw_r_cp_apa_i_xQ-jDbVZ3R30V

It will feed data connections to 8 SATA drives all by itself.

It has 2 SFF-8087 ports. Then get the special 'forward breakout' cable. Well, two really. One end goes into the SFF-8087 port and it splits out into four SATA data cables, which go into the hard drives of course.
https://www.amazon.com/dp/B012BPLYJC/ref=cm_sw_r_cp_apa_i_2T-jDbHFP0FHC

The card can support hardware RAID, but fewer and fewer folks do that. After all, hardware RAID usually requires identical drives, and us folks at home often have a motley collection of drives of various sizes, speeds, and geometries. So software RAID it is. In Linux, folks often use FreeNAS or Unraid. In Windows 10 you can use something called 'Storage Spaces'. Using RAID will allow you to treat all those drives like one device... and have some tolerance for failure (which happens when you have so many).
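
One plain-Linux flavor of the software-RAID route is mdadm; this is only a sketch under assumed device names (sdb-sde), not what FreeNAS/Unraid use under the hood. Note that mdadm, like hardware RAID, sizes the array to its smallest member, which is why pooling tools (ZFS, SnapRAID, Storage Spaces) are popular with mixed-size drive collections:

```bash
# Build a 4-drive software RAID5 from the drives on the HBA (example device names).
sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 \
    /dev/sdb /dev/sdc /dev/sdd /dev/sde

cat /proc/mdstat                 # watch the initial resync
sudo mkfs.ext4 /dev/md0          # format the array as one big device
sudo mount /dev/md0 /mnt/array

# Persist the array definition so it assembles at boot.
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
```
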

Next question.. does your case have room? Do you have enough power connectors?

u/cjalas · 3 pointsr/homelab

Cables: 2x of these: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/

SAS/SATA Controller Card: https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M/

You might find it on eBay for less. Just posting the Amazon links for clarity.

Plug the card into a free PCI-e slot on your mobo.

Plug the Mini SAS SFF-8087 connectors into the two ports on the HBA card.

Plug the SATA connectors into the back of your 5-drive hotswap bay cage.

Insert the HDDs into the hotswap trays (if it uses trays).

Turn things on. Bob's your uncle.

P.S. if you want PCI-e 3.0 version of the HBA card, you'll need to look for "LSI 93xxx" versions of the card. They're more expensive. Also, some others go for different manufacturer cards. I prefer LSI brand.

If you just want to RAID the whole thing, there are cheaper alternatives, but hardware level RAID HBA cards suck IMHO. With this type of HBA SAS/SATA Controller, you can basically pass-through the drives straight to your computer, and they'll show up as individual drives. Later you can then RAID them via software, or not.

u/Excal2 · 2 pointsr/buildapcsales

For sure dude get after it, I've been having a ball with a stack of 170GB 15K RPM drives that I won from /u/storagereview over on the /r/homelabsales sub, still getting my post together for them like 3 months later lol.

Picked up one of these for the machine housing everything: https://www.amazon.com/gp/product/B01C5TG82C as a hot swap rack, it's pretty excellent. Then you just need something like these: https://www.amazon.com/gp/product/B012BPLYJC breakout cables and you're ready to rock. Do yourself a favor and get the 1.5 ft. ones though, 1 meter is too damn long.

Have been having so much fun with this project and I don't even store any data on this array lol, just building and breaking and rebuilding RAID configs.

u/logikgear · 2 pointsr/freenas

Here is the HBA I use with FreeNAS.

LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/

You will also need these to connect drives to that card.
Cable Matters Internal Mini SAS to SATA Cable https://www.amazon.com/dp/B012BPLYJC/

u/shysmiles · 2 pointsr/DataHoarder

That card you linked has 2 SFF-8087 connectors; if that's what you have, you can use it if you want. Each SFF-8087 SAS port is basically 4x SATA ports. Both of your SSDs (and 2 more) should be able to hook up to just one of those ports. The breakout cable that goes from 1 to 4 looks like this: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC

You can get an SFF-8087 to SFF-8088 adapter that converts the internal 4x connection to an external one to connect something like that DS4243, but for about the same money I'd still recommend the LSI adapter if you can verify you can use one more of the PCIe slots. It's better to have a non-RAID card if you're going to use something like the ZFS file system.

What I mean about the eSATA is that it's a single SATA port, vs the SAS ports that are 4x SATA ports. The DS4243 and others like it have a multiplexer that lets all 24 drives connect over that 4x connection. Those iStar eSATA boxes have some kind of multiplexer or a controller as well, but how good/reliable they are vs something enterprise grade I don't know.

There are lots of other used SAS disk shelves around as well (Dell etc); it's just about finding a good deal on one that has all its caddies etc. If you're lucky maybe you can find one local on Craigslist, since they are so heavy and shipping is usually half the price.

u/uselessaccount129 · 2 pointsr/homelab

They are SAS/SATA compatible on the HDD side. You would need 2 of those LSI cards with 3 SFF-8087 to SATA plugs.

Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet https://www.amazon.com/dp/B012BPLYJC/ref=cm_sw_r_cp_apa_i_bVg4Db0FS37N2

u/sbeck14 · 2 pointsr/homelab

This is the cable you'll be looking at, and you'll need two of them because they have 4 SATA cables each - https://www.amazon.com/dp/B012BPLYJC

As far as the LSI 9211-8i goes, it's one of the most recommended RAID cards. You can also look at the PERC H310 as it is just a re-branded version of that card and may be a bit cheaper. What RAID configuration are you looking at?

u/spudlyo · 2 pointsr/homelab

So if you look at the R720 Owner's Manual, you'll see that there are two SATA ports on the board. One is labeled J_SATA_CD and one is J_SATA_TBU, numbered 2 and 3 respectively. These are both standard SATA ports, but they're unfortunately SATA II, so only 3Gb/s. There is also a spot on the board listed as J_SAS_PCH (24) into which you can plug an SFF-8087 breakout cable to give you an additional 4 SATA connections. This port is attached to the built-in S110 "RAID" controller. This is sadly also SATA II.

You can buy a SAS9211-8i card for under $100 that will allow you to connect 8 SATA III devices (you'll need a breakout cable), but you'll have to figure out how to power those internal 2.5" SSDs -- I didn't have to. I already had an M.2 SATA SSD, so I bought a StarTech PCI card which has two M.2 SATA slots on it. Because this thing is bus powered, I didn't have to worry about how to power it.

u/bobj33 · 2 pointsr/DataHoarder

I have 2 LSI SAS cards.

The first is this 9212-4i4e card which has 4 internal SATA ports and an external mini-SAS port. I use normal SATA cables to 4 internal SATA drives and then an SFF-8088 external cable to my SAS expander card in another case.

https://www.newegg.com/Product/Product.aspx?Item=N82E16816118133


I ended up changing to this card instead which is an LSI 9201-16i which has 4 internal SFF-8087 SAS connectors.

https://www.newegg.com/Product/Product.aspx?Item=N82E16816118142

I then use these SFF-8087 to 4-SATA breakout cables.

https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC


I spent almost a day trying to flash the firmware on my 9212-4i4e card and finally gave up after having so many issues. The card works perfectly fine as individual disks. I can pull a drive out and put it in a USB dock and it works just fine. No hidden hardware RAID formatting or anything.

When I got the new 16i card I didn't even bother to check the firmware. I put it in and the drives work fine as individual drives.

u/Berzerker7 · 2 pointsr/DataHoarder

Something like this, but bought on eBay for cheaper and flashed to IT mode to just "passthrough" the drives to the OS and not do any management by itself.

And a couple of these to connect your hard drives. :)

u/MatthewSerinity · 2 pointsr/buildapcsales

So, in the server world, they're obviously not using SATA for high-density storage. One solution they use is SAS (Serial Attached SCSI). There are many different types of SAS ports, the most common in the homelab community (and with specific types of servers) being the SFF-8087 connector (mini-SAS) for internal storage. HBAs / RAID controller cards usually have 2 SAS connectors on them. They can be flashed (or bought pre-flashed) to what's called IT mode, which allows them to operate as JBOD (just a bunch of disks). Something like this. If you shop around a bit you can find better deals on used ones (which you shouldn't be afraid to buy; these things are rugged as hell and kept in nice server environments). You can then pair this with one of two cables, Mini-SAS 8087 to SATA or Mini-SAS 8087 to SFF-8482. If you buy the latter of the two, it will work with any SATA drive you have as well, with the added compatibility for SAS drives (2 in 1!). SAS drives sometimes come up in good deals on eBay @ 4TB for $50, so I'd go with the latter if you ever feel like you want bulk storage for cheap. No real harm in it.

u/monnon999 · 2 pointsr/DataHoarder

One of my debian setups is still in an old desktop case too :)
I run this raid card: http://www.ebay.com/itm/DELL-HV52W-RAID-CONTROLLER-PERC-H310-6GB-S-PCI-E-2-0-X8-0HV52W-/201657131656
I flashed mine to be in IT mode so that it doesn't act like a RAID card anymore, just acts like a bunch of lonely SATA ports: https://techmattr.wordpress.com/2016/04/11/updated-sas-hba-crossflashing-or-flashing-to-it-mode-dell-perc-h200-and-h310/ Help with this can be sought in the #DataHoarder IRC room, there are a few of us there who have done this on a few different models of cards now.
Got 2 of these cables so I can slap 8 disks in that sucker: https://www.amazon.com/dp/B012BPLYJC/ref=cm_cr_ryp_prd_ttl_sol_0
Then I installed ZFS as my filesystem and run my disks in a glorious 50TB array: https://github.com/zfsonlinux/zfs/wiki/Debian
I even slapped an SSD off a mobo SATA channel as a caching disk. Happy building! :)
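
For anyone following the same path, the ZFS step amounts to creating one pool from the disks behind the IT-mode HBA and optionally attaching the SSD as a read cache. A minimal sketch (pool name, disk IDs, and the raidz2 layout are placeholders; the comment doesn't say which vdev layout was actually used):

```bash
# Create a raidz2 pool ("tank") from the 8 disks on the HBA, referenced by stable by-id names.
sudo zpool create -o ashift=12 tank raidz2 \
    /dev/disk/by-id/ata-DISK1 /dev/disk/by-id/ata-DISK2 \
    /dev/disk/by-id/ata-DISK3 /dev/disk/by-id/ata-DISK4 \
    /dev/disk/by-id/ata-DISK5 /dev/disk/by-id/ata-DISK6 \
    /dev/disk/by-id/ata-DISK7 /dev/disk/by-id/ata-DISK8

# Attach the SSD on the motherboard SATA channel as an L2ARC read cache.
sudo zpool add tank cache /dev/disk/by-id/ata-SSD1

# Enable cheap compression and confirm the pool is healthy.
sudo zfs set compression=lz4 tank
zpool status tank
```
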

u/Thaurane · 2 pointsr/windows

Try an LSI raid controller https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M/ref=sr_1_2?crid=P5PBSXR61CYV&keywords=lsi+raid+controller&qid=1569558454&sprefix=lsi%2Caps%2C171&sr=8-2

You will also need these https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=pd_bxgy_147_img_2/136-5105212-8833038?_encoding=UTF8&pd_rd_i=B012BPLYJC&pd_rd_r=89475179-7e61-4fe7-915a-198096ed13b8&pd_rd_w=QQkz9&pd_rd_wg=B6Ezg&pf_rd_p=479b6a22-70ae-47a0-9700-731033f96ce8&pf_rd_r=0A6F6Y69MVNJQX5XC257&psc=1&refRID=0A6F6Y69MVNJQX5XC257

Be sure to back up the data on your current RAID because the drives will get formatted. After installing it you will see a different boot process from the card when starting up your computer. It should tell you to hit Ctrl-H (I think). After that just read carefully, choose the hard drives that you want to combine, and choose the RAID level you want. After that, boot your computer normally and install the software I linked below. Be sure to extract it before installing and use the complete installation. It might give you a login screen for the software; it will request your Windows login credentials. I was wary of it at first too, but that's what it wants. My memory is a bit fuzzy, but I believe this is where you finish setting up the RAID for Windows to be able to format it.

I'm using an LSI Logic SAS9260-4i for RAID 6. The only issue I've had with it is that I had to disconnect it while I was installing Windows. But once that was done, I reconnected it and moved on like normal.

edit: Went to the website for you and searched for the card's management software https://docs.broadcom.com/docs/12354760 - that should be it.

edit2: added more information.

u/Lego_Engineer · 2 pointsr/DataHoarder

I got these forward breakout cables.

I'm thinking I may get an SFF8088 to SFF8087 (like this) converter and try running through port #8. If that still doesn't fix it, at least I can still use the cable to double my bandwidth once I fix the other problems.

u/DZCreeper · 2 pointsr/buildapc
Board and CPU combo is good: enough single-thread performance for the Minecraft server, enough multi-thread for transcoding 3-4 1080p streams in Plex. (Rule of thumb is a 2K PassMark score per 10 Mb/s of video.)

The board is just standard ATX size. It does only have 6 SATA ports, so you will need to buy an HBA card to add more ports, or use fewer storage devices.

https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M

That card can handle 8 drives total, 4 per cable.

https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC

PCPartPicker Part List

Type|Item|Price
----|:----|:----
CPU Cooler | ARCTIC Freezer 34 CO CPU Cooler | $31.95 @ Amazon
Memory | G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 Memory | $64.99 @ Newegg
Video Card | Zotac GeForce GT 1030 2 GB Video Card | $84.99 @ Newegg
Case | Antec Three Hundred Two ATX Mid Tower Case | $94.58 @ Walmart
Power Supply | SeaSonic FOCUS Gold 650 W 80+ Gold Certified Semi-modular ATX Power Supply | $79.90 @ Amazon
Case Fan | ARCTIC ACFAN00119A 56.3 CFM 120 mm Fan | $8.52 @ Amazon
Case Fan | ARCTIC ACFAN00119A 56.3 CFM 120 mm Fan | $8.52 @ Amazon
| Prices include shipping, taxes, rebates, and discounts |
| Total | $373.45
| Generated by PCPartPicker 2019-07-26 08:45 EDT-0400 |

CPU cooler to keep the CPU quiet. Bit of overclocking headroom if you want the extra performance. Compatible RAM. Basic GPU that will be able to handle any 4K 60Hz HEVC video decoding. Case with tons of storage room. Efficient power supply for low noise, and a long warranty. Extra 120mm fans for front intakes, to keep the storage cool.
u/michrech · 1 pointr/homelab

Well, I only have one "gaming VM" (it has a Radeon HD 6970 and a pair of USB ports passed through, and I've assigned it four vCPU / 6GB RAM), but I'm doing a lot of the rest of your desires. This is going to be a somewhat long post, and I'm not terribly well known for being overly organized with my ramblings, so bear with me... ;)

My host is an HP Z800, and the OS is ESXi 6.0.u2 (with the free license). It has two Xeon X5677's with 32GB of DDR3 (8 4GB Corsair CMZ8GX3M2A1600C9B, if that happens to matter to you at all). Because of the memory ventilation duct, I had to remove the flimsy heat spreaders. It has two fans that blow directly over both RAM banks, and I've not had any issues without the heat spreaders at all. This is the only physical PC in my house, if you don't count my rarely used laptop (it primarily gets used on the rare occasion that I travel, and on game nights to control the RPG mapping VM).

For my primary datastore (where the VMs live), I have an LSI 9260-8i, with a Mini-SAS to 4 SATA (forward) breakout cable connected to one of these, populated with four PNY CS1131 SSDs configured in a RAID5 array. Within my Windows 10 VM, I ran CrystalDiskMark (with its defaults - I'm not terribly familiar with benchmarking), and this was the result. I suspect the slow write speeds are due to 1) parity calculations and 2) write-back cache being disabled due to my not (currently) having a BBU to connect to my 9260-8i.

At any rate, onto the VM's!

VM1 - "Gaming" / primary usage - Windows 10. As previously noted, it has four vCPUs assigned, 6GB RAM, and 256GB vHDD on the afore mentioned primary datastore. It has a Radeon HD 6970 and a pair of the host USB ports attached via 'pcipassthrough'. As the host lives in an electronics / networking closet in my spare bedroom, I use some Cables2Go RapidRun digital cabling (the specific part numbers I used are now discontinued) to bring the HDMI signal from that space to a spot on one of my living room walls, where the monitor is mounted. I used a cheap USB<->Cat5 extender to bring a USB port out to a cheap USB hub, to which is connected the Logitech universal receiver for my keyboard / mouse, and a crappy USB 'sound card' (which is only used for its MIC input). Before you ask, no, I don't notice any input / display lag with the 50' cabling between my keyboard / monitor / mouse.

VM2 - Media server, "nas" - Windows Home Server 2011. This VM also has four vCPUs assigned, along with 6GB of RAM, but only a 160GB HDD (the minimum WHS2011 required for installation). This VM has the onboard Intel six port SAS/SATA controller attached, along with a USB3 PCIe controller. I have an 4-in-1 IcyDock (different model to the one I linked previously, but very similar build), in which live three Samsung / Seagate 2TB 2.5" HDDs. These are controlled / presented to the OS by StableBit's DrivePool. All of my media / other data are stored on this pool. As this VM also handles my media services, it has Plex Media Server, Sonarr, and sabnzbd installed. All downloads / unpacks / media rename / etc happens on the DrivePool, since I don't care how long those operations take (I'm the only one that accesses my media).

VM3 - RPG mapping - Windows 7 - This VM is very basic : two vCPUs and 2GB RAM. It has a Radeon HD 7470 attached, which is connected via a 50' RapidRun analog (yellow, also discontinued) VGA cable. This VM is only powered on / used when I have an RPG group at my house.

All three VMs have Chrome Remote Desktop installed so I can access them from anywhere. The media / RPG VM's are exclusively controlled via this method.

I have a Nexus Player installed at both of my TVs. Each has the Plex app installed so I can watch whatever is on the server.

If you have any specific questions, please feel free to ask. :)

u/slippery_salmons · 1 pointr/PleX

I added this with these cables when I ran out of sata ports.

u/spicyrazz · 1 pointr/unRAID

I have bought this cable which should be a forward breakout https://www.amazon.com/gp/product/B012BPLYJC/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

I have tried to plug it into another port on the HBA, no luck either.
Yes, all drives have SATA power and SATA data, and I can hear/feel them spinning.

u/drewtcjones · 1 pointr/unRAID

I need new cables to connect the back panel to the LSI don't I?

I currently have https://www.amazon.com/gp/product/B012BPLYJC/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

u/Hollow_in_the_void · 1 pointr/freenas

I'm going to give this one a whirl; hopefully there isn't some issue with my motherboard preventing it from working. Got an H310 pre-flashed off eBay and ordered two of these, so let's hope this fixes all my issues.

I tried to put another 8TB drive in my server this morning and it wouldn't work even on a motherboard port. Not sure what's up.

u/gd2246 · 1 pointr/DataHoarder

Yes you would put the LSI 9201-16e into a PCIe slot on your current HTPC. Then you would run up to four SAS to SATA cables inside the back of the new case. There will be a big rectangular hole in the back because the tiny power board doesn't have an IO shield (or any IO in the back at all). The cables are a meter long so they should reach just fine.

As for some kind of pre-built all-in-one unit, the only thing that comes to mind is something like the NetApp DS4243. These can be found pretty cheap on eBay but I don't recommend one because they only use server power supplies that are VERY loud. Like jet-engine loud. Seriously, unless you keep both computers in a garage or basement you don't want to actually live anywhere near these things.

If you do have a basement or something though and want to get one, you would still use the LSI 9201-16e but instead use SAS to SAS cables as they have four SAS ports on the back and you would just slide the drives into the front. Everything inside is already connected.

Be aware though that most don't come with HDD trays, so you'll have to buy them separately, and the ones that do usually have old 1 TB drives in them already, which drives up the price. But even if you think 24 TB extra is good for whatever they're asking, you have to remember they have been heavily used in a server environment and are likely to die soon, not to mention the electricity costs of powering all those drives with 4 server PSUs, and again the NOISE.

Plus there are compatibility issues that even I don't fully understand. You definitely should read up on them before buying one, but really it's not worth the trouble IMO.

I think you're better off going with one of the other options.

EDIT: NetApp DS4243 SAS Disk Shelf Fan Noise

u/TomatoCo · 1 pointr/homelabsales

I don't have one to sell, but you may want to consider an IBM M1015. It has 2x SAS ports which can be turned into 8x SATA via two of these.

If the $250 price on Amazon is too much you can get them used (or "new" sometimes) on Ebay for as little as $50. Which also opens up some more options for faster shipment, I suppose.

u/12_nick_12 · 1 pointr/DataHoarder

I second this also. I have 4 LSI 9211-8i's and I love them. The only issue I have is that my case needs 5 to be able to use all the bays, but the LSI BIOS will only let me use 4 cards. I paid ~$100 per card, well worth every penny. [LSI 9211-8i](http://www.ebay.com/itm/LSI-Internal-SATA-SAS-PCI-e-RAID-Controller-Card-SAS9211-8i-8-PORT-HBA-/111834008063?hash=item1a09d389ff:g:YhoAAOSwrklVEHVu), [Mini-SAS to SATA Cable](http://www.amazon.com/Cable-Matters%C2%AE-Mini-SAS-Forward-Breakout/dp/B012BPLYJC/ref=sr_1_2?ie=UTF8&qid=1451574103&sr=8-2&keywords=mini+sas+to+sata)

u/Redditenmo · 1 pointr/buildapc

>My only gripe it's lacking in SATA III ports. Any tweeks to meet my above goals would be greatly appreciated.

Flash an LSI (or similarly branded) SAS RAID controller and you'll get 8 SATA 3 ports at your disposal (note: requires SAS-to-4-SATA cables [eg. this]). You can get ex-server ones quite cheap. /r/homelab could probably point you at which ones are worth getting now; I've not looked into it since buying my own 6 years ago.

u/Xajel · 1 pointr/Amd

Onboard video chip or an APU.

An APU can save you from having an onboard video chip or using a dGPU (and losing the lone PCIe slot on mITX). But being an APU means less CPU power. This will be okay for most NAS usage, but when someone wants more, more cores are better. I've always asked here about an 8C/16T mobile APU with a very small iGPU for high-end laptops and such applications. These applications either don't need a powerful GPU, like a server/NAS, or will already have a dGPU, like an AIO, high-end laptop or SFF system.

Zen actually supports ECC, but it's up to the motherboard maker whether to implement and fully support it or not.

8x SATA ports on mITX can be hard to find (although they exist). But things can be compact if we go for something more server-like: two mini SAS ports, each of which can handle 4x SATA with simple & low-cost adapters.

These boards should really have at least 2x 1GbE, or better, 1x 10GbE + 1x 1GbE, or 2x 10GbE for more high-end versions.

u/Rilnac · 1 pointr/homelab

I looked closer at our chassis setup and it is 4x 5 slot boards, so I'm not actually sure what protocol they run because SAS breakouts should max at 4. We're definitely proprietary compared to the options I am seeing online.

Closest equivalent commercially available part would be something like https://www.amazon.com/Mini-SAS-SFF-8087-Inch-Frame/dp/B00M36C2KK, which effectively breaks out an internal SAS port to 4x SAS/SATA interfaces. Looking online, the DL320 should have an unused onboard port.

Alternatively, https://www.amazon.com/Aplicata-Quad-NVMe-PCIe-Adapter/dp/B01MTU75X4 or https://www.aliexpress.com/item/NEW-The-adapter-card-PCI-E-16X-TO-4P-NVMe-SSD-Support-RAIDO-PCI-E-16X/32951136605.html will let you run NVMe directly off the PCIe slot, assuming there isn't some other expansion already there.

So in a dl320 you could probably do one of each so long as you have physical space left and you don't run out of power.

Forgot one other option, which assumes OP can find power and mounting points on their own. https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC

u/wrtcdevrydy · 1 pointr/homelab

> are there any potentially slightly older and used RAID controller that can be had for cheap (like <$50), that will let me attach 5, maybe 6 drives?

Yes, but normally RAID requires empty drives so you'd have to use something like drivepool instead if you want to keep your data intact.

> I've come across the Dell PERC H700

The H700 isn't a bad card, but if you can get an H310, you can flash it into IT mode (https://www.ebay.com/itm/Dell-H310-IT-mode-Adapter-8-Port-6Gb-s-SAS-SATA-Raid-Card-9211-8i-P20-IT/192642240732?hash=item2cda5f50dc:g:EE0AAOSwNgNbiBGM:rk:1:pf:0) and just use Mini-SAS to SATA cables (https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC). Each cable will allow you to plug in 4 drives.

u/ryaniskira · 1 pointr/PleX

>I was mainly thinking I could use the PCI express slot for another SATA card when the time came for upgrading, but again thats a future issue.

Don't go with a SATA card, use SAS. SATA cards are usually stuck to 1x slots and can only connect 4 drives (and even then the 1x slot can start bottlenecking you). SAS cards can connect SATA drives and they usually have more PCIe lanes so they will not be a bottleneck. All you need is a SAS card and SAS to SATA breakout cables.

u/Nyteowls · 1 pointr/DataHoarder

Pretty much this. There are lots of cheap HP 24-port PCI-E 3Gb SAS expanders on eBay, and some are even sold with cables, but probably not the cables you'd need. You'd probably want forward breakout cables. You also need a way to power them; PCIe power attachments, the kind commonly used on mining rigs, might work. I think the USB portion is only for data, so you'd need to find molex to SATA (15-pin?) power cables. You could run 3-4 of these expanders into a cheap external-port card like a 9200-16E. You'll also need multiple power splitters to spread power to all these HDDs.
https://www.ebay.com/itm/New-HP-SAS-Expander-Card-24-Port-SAS-PCI-E-Expander-Board-468405-001/171532956108
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/
https://www.amazon.com/dp/B073RBP3V6
https://www.ebay.com/itm/372102178384
https://www.amazon.com/gp/product/B0086OGN9E/

Are these 2.5 or 3.5 HDDs? Without a case the 3.5 HDDs will vibrate, which will probably cause connection and data issues, so definitely have a backup and expect the HDDs to quit working here and there. This is also rather power consuming for how much storage space you get.

Even though the price of data cables, expanders, and power cables starts to creep up, I still think if you construct your own case it would be cheaper than anything prebuilt. Buying anything to hold the drives would be costly; even buying 12 of these to hold 48 drives would cost $204 alone, and you'd have no cables or anything. Some have built their own by fastening metal brackets together (standalone or within select cases), but this lacks the rubber needed for vibration protection... This also doesn't account for how you'll cool the HDDs, but perhaps one big fan could get you by. Post some updates and pictures if you decide to commence on "Project Janky", good luck.
https://www.amazon.com/Rosewill-5-25-Inch-3-5-Inch-Hot-swap-SATAIII/dp/B005FHHOXE
https://imgur.com/3xsabQU
https://www.reddit.com/r/DataHoarder/comments/aceglg/new_build_in_progress/

u/blueman541 · 1 pointr/sffpc

https://imgur.com/a/OzIi5

I'm currently using an ASRock E3C224D4I-14S which has an integrated LSI 2308 controller. It has 3x mini SAS ports which I use with breakout cables, giving me 12 drives. To get more ports I can use a SAS expander card or an extra PCI controller.

u/SirLagz · 1 pointr/homelab

Maybe getting some of these sorts of SATA cables would make it easier - https://www.aliexpress.com/item/4PCS-Free-Shipping-DIY-Black-sata-3-SATA-III-3-Data-Cable-Dual-channel-aluminum-foil/1582341251.html


Or get a SATA controller that uses Mini-SAS to SATA cables and get these - https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC


Either would make running separate SATA cables a bit easier and more manageable.

u/ru4serious · 1 pointr/homelab

So I just realized the P812 has Mini SAS ports on the board, whereas the P800 has the larger SAS ports. Therefore, you'll actually need these breakout cables.

https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC

If you get two of those, you attach them to the internal ports and then that gives you 8 total internal drives. If you needed more than that, then you would get the SAS Expander, run SAS cables from the P812 to the SAS expander, and then use more of those breakout cables on the SAS expander to get more internal drives.

I haven't used the SAS expander for HP so I am not sure how well it works or what additional configuration you will need.

You would need those other cables I listed if you were going to use the P800, but I wouldn't recommend it since that card only supports drives up to 2TB, whereas the P812 supports MUCH larger drives and up to 108 total drives (if you really wanted to).

u/mke5271 · 1 pointr/homelab

Not too worried about which type of RAID, as long as it supports the drives at full throughput. The 9211-8i looks pretty nice.

Would this work as a breakout cable for the drives?
https://www.amazon.com/dp/B012BPLYJC

u/ixidorecu · 1 pointr/freenas

Maybe an H200 flashed to IT mode, then a forward breakout cable, then just figure out how to power it all.