Reddit mentions: The best computer internal SCSI port cards

We found 170 Reddit comments discussing the best computer internal SCSI port cards. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 58 products and ranked them based on the number of positive reactions they received. Here are the top 20.

12. LSI 9200-8E 8-Port 6Gb/s SAS/SATA PCI-Express x8 External Host Bus Adapter

Specs:
  • Height: 10.6 inches
  • Length: 6.3 inches
  • Weight: 0.7 pounds
  • Width: 2 inches

17. QLogic QLE4060C-CK 1Gb PCIe iSCSI Single Port HBA

    Features:
  • Type: Fibre Channel Host Bus Adapter
  • Host Interface: PCI Express
Specs:
  • Height: 2.5 inches
  • Length: 10.9 inches
  • Weight: 0.7 pounds
  • Width: 8 inches
  • Number of items: 1

20. IBM 6GB SAS HBA

    Features:
  • Product Type: SAS Controller
  • Form Factor: Plug-in Card
  • Host Interface: PCI Express 2.0 x8
Specs:
  • Height: 2.72 inches
  • Length: 0 inches
  • Weight: 0.21 pounds
  • Width: 6.61 inches
  • Number of items: 1

🎓 Reddit experts on computer internal SCSI port cards

The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes received across comments on subreddits where computer internal SCSI port cards are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
Total score | Number of comments | Relevant subreddits
---|---|---
16 | 8 | 3
13 | 6 | 2
5 | 2 | 1
5 | 2 | 1
4 | 4 | 2
4 | 4 | 1
4 | 2 | 2
4 | 2 | 1
3 | 2 | 1
2 | 2 | 1


Top Reddit comments about Computer Internal SCSI Port Cards:

u/BrewingHeavyWeather · 0 pointsr/buildapc
To preempt any pro-Ryzen talk: the Intel platform will be worth it, if you run Windows as the host, just for the Intel SSD RAID 1 to boot from. If the main OS will be Linux, though, a Ryzen 2600 or 2700 with a B450 board will do the job, with a software MBR RAID 1 for boot.

I see five 8TB drives. That's a red flag. RAID 10 is your friend. With all those things running on it, UnRAID will be slow as molasses whenever you have to do anything that stresses your storage, almost as bad as RAID 5. RAID 6, or UnRAID, will be fine for basic file serving, but if you ever actually do anything with VMs, or have to go back through footage on your NVR, it will be so much slower than RAID 10.
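
For rough context on the trade-off described here, a quick back-of-the-envelope sketch (the six-drive count and 8 TB size are assumptions taken from the part list below; the write-penalty figures are textbook rules of thumb, not measurements):

```python
# Rough capacity/write-penalty comparison for six 8 TB drives (assumed figures).
drives, size_tb = 6, 8

raid10_usable = (drives // 2) * size_tb      # mirrored pairs: half of raw capacity
raid6_usable = (drives - 2) * size_tb        # two drives' worth of parity

raid10_write_io = 2    # each small random write hits 2 disks (the mirror pair)
raid6_write_io = 6     # read data + 2 parity, then write data + 2 parity

print(f"RAID 10: {raid10_usable} TB usable, {raid10_write_io} I/Os per small write")
print(f"RAID 6:  {raid6_usable} TB usable, {raid6_write_io} I/Os per small write")
```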

750W is crazy overkill. A good 450W would be...still a bit high, TBH, but it's hard to find good ATX supplies under 500W, these days.

That CPU cooler is crazy expensive (if you want quiet, get a Freezer E34). So is that motherboard. The memory is OK, but for home/small office use, cheaper and slower RAM would be fine.

Fan noise should not be a problem. Put both included case fans on the intake side, adjust the fan curves a bit, and it'll be pretty darned quiet, if not practically inaudible.

You can save about $45 going to an i7-8700, though the clock speed reduction might actually matter for some uses, once the machine gets to be a few years old, so I didn't change it.

PCPartPicker Part List

Type|Item|Price
----|:----|:----
CPU | Intel Core i7-8700K 3.7 GHz 6-Core Processor | $354.99 @ Newegg
CPU Cooler | ARCTIC Freezer 34 CPU Cooler | $32.17 @ Amazon
Motherboard | ASRock H370 Pro4 ATX LGA1151 Motherboard | $89.95 @ Amazon
Memory | Patriot Viper Steel 16 GB (1 x 16 GB) DDR4-3000 Memory | $57.99 @ Amazon
Memory | Patriot Viper Steel 16 GB (1 x 16 GB) DDR4-3000 Memory | $57.99 @ Amazon
Memory | Patriot Viper Steel 16 GB (1 x 16 GB) DDR4-3000 Memory | $57.99 @ Amazon
Memory | Patriot Viper Steel 16 GB (1 x 16 GB) DDR4-3000 Memory | $57.99 @ Amazon
Storage | Western Digital Blue 1 TB 2.5" Solid State Drive | $109.89 @ OutletPC
Storage | Western Digital Blue 1 TB 2.5" Solid State Drive | $109.89 @ OutletPC
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Storage | Western Digital Purple 8 TB 3.5" 7200RPM Internal Hard Drive | $229.99 @ Amazon
Case | Fractal Design Define R6 USB-C ATX Mid Tower Case | $141.99 @ Amazon
Power Supply | Corsair RMx (2018) 550 W 80+ Gold Certified Fully Modular ATX Power Supply | $99.99 @ Amazon
| Prices include shipping, taxes, rebates, and discounts |
| Total (before mail-in rebates) | $2570.77
| Mail-in rebates | -$20.00
| Total | $2550.77
| Generated by PCPartPicker 2019-10-15 16:15 EDT-0400 |

Then look on eBay for a used LSI MegaRAID card, preferably a 92xx, with a RAID firmware, support for 8 native drives, RAM onboard (or a DIMM slot), and a battery (lots out there right now under $75, some under $40), if running Windows for the host. If it doesn't come with the SFF-whatever to SATA cables, buy those, too. Boot Windows from the Intel RAID 1, then put your storage on the LSI. It might require legacy bootup. While you can flash from HBA (IT) to RAID (IR), technically, doing so is a royal PITA (I did it once, thinking it wouldn't be so hard...). Oh, and many non-server boards will not boot with a drive controller in the main PCIe 16x, or will not see it, so just use the 2nd one from the start.

Alternatively, if you want to stick to UnRAID, or go with software RAID in Linux, grab one of these, for +4 SATA ports: https://www.amazon.com/Ziyituod-Controller-Expansion-Profile-Non-Raid/dp/B07SZDK6CZ.

Highlights: 6 HDDs, for RAID 10, a cheaper quieter cooler, a cheaper motherboard that still supports RAID, cheaper RAM (with all 4 DIMM slots populated, getting 3000MHz slightly increases the chances they'll all run at 2666), a good quality PSU that isn't overkill, and right-sized motherboard.

That said, I think 32GB RAM would do the job, if running the NVR on metal, since you don't need much RAM per Linux VM. 64GB might be worth it if you plan to split each service up into its own VM for management purposes, though.
u/Sweet_Vandal · 1 pointr/DataHoarder

Yeah, but with one minor correction: I am not using a breakout in the PC. MB SATA -> 8088 adapter -> 8088-to-8087 adapter -> SATA breakout (the listing doesn't actually specify that it's forward, but the description would make me think so) -> HDDs

Yes, all layer one. Every adapter is totally passive.

Expensive? Yeah, probably, if I had used two of the dual adapters (which, honestly, now that I'm typing this out I feel like a dingus for not having done that - I'm not sure what I was thinking). This was a cheaper alternative to purchasing a 4-bay Mediasonic and would potentially support up to six drives (assuming I get it working). I could have just run a bunch of long SATA cables between chassis, but that would be really messy, cable-wise, and there's no way I'd be able to move both enclosures at the same time. Unless there's some kind of SATA aggregate option, this seemed like my best way to go (if that is a thing, I'd be interested in that route too, but some quick-ish googling didn't turn much up).

I was reading about some of those changes in the BIOS, IDE vs AHCI - is that what you're referring to? That certainly could be it, since I did see one drive initially. I'll look into that (and MB support...) tonight while I wait on the PSU replacement.

No intention of using the port multipliers. If I need more than four, I'll probably focus on just running another SATA->SAS adapter and use the second port on the 8088->8087 bracket.

u/EatingPattern · 1 pointr/buildapc

I'm not the best guy for a definitive answer on hardware, but to the best of my understanding the 6Gb/s SAS cards guarantee 6Gb/s-capable speeds per x4 SAS port, which then has to be shared by however many of the four drives on that SAS port are in use at any given time.

Now how many drives in use via however many separate SAS ports at the same time will cause a bottleneck via the PCIe 2.0 controller operating at 8x speeds... I’m a little fuzzy on...
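
For rough context on that bottleneck question, a back-of-the-envelope sketch (nominal per-lane and per-link figures, plus an assumed ~200 MB/s per spinning disk, so treat the numbers as ballpark only):

```python
# Nominal ceiling check for a 6 Gb/s HBA in a PCIe 2.0 x8 slot (rule-of-thumb numbers).
pcie2_lane_mb = 500                          # ~500 MB/s usable per PCIe 2.0 lane
card_ceiling_mb = pcie2_lane_mb * 8          # ~4000 MB/s for the whole x8 card

sata_link_mb = 600                           # per-port 6 Gb/s link maximum
hdd_seq_mb = 200                             # typical 5400/7200 rpm sequential rate

for drives in (8, 16, 24):
    print(f"{drives} drives: link max {drives * sata_link_mb} MB/s, "
          f"realistic HDD load ~{drives * hdd_seq_mb} MB/s, "
          f"card ceiling ~{card_ceiling_mb} MB/s")
```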

I was just in your shoes a week and a half ago and spent days looking at the field of cards. This one was really an outlier as to what it can do for the cost. There's also supply and demand involved here. If you noticed, this is an external SAS card, and the number of people looking for them is much lower than those looking for internal connections. This same card with an internal port will run you over $200!

But all you have to do is run the cords back inside the case using the pci slot above/below wherever you have the card installed. Or you can use this if you’d like, but I prefer a straight single connection, less chance of problems.

CableDeconn Dual Mini SAS SFF-8088 To SAS36P SFF-8087 Adapter In PCI bracket https://www.amazon.com/dp/B00PRXOQFA/ref=cm_sw_r_cp_api_Nf9PAb5TWXPZQ

But back to speed: I can't tell you how far you'd have to push the card to hit a bottleneck, but I've got 24 SATA drives in mine (16 on this card) in a Windows 10 box, and the 5400 RPM drives are still operating at the same speeds they always have with a direct MB connection.

Good luck with your build! Let me know if you have any other questions!

u/SnappyCrunch · 1 pointr/techsupport

For anyone who finds this later - I decided to go with the RAID array despite it not being a financial slam dunk. What happened is that I lucked out finding an adapter that allows you to put two 2.5" drives in a 3.5" front drive bay (link) on sale at Newegg for $10, so I bought two. My system board has 6 SATA ports and a built-in RAID controller. I was looking at that vs software RAID solutions and I found Storage Pools in Windows. Hardware RAID solutions are traditionally inflexible (pun intended), and Storage Pools allows you to add drives to the pool as you go. So I set up a 4x500GB storage pool with parity. Turns out the performance of Storage Pools in Windows with parity is god awful, and some more searching led me to r/DataHoarder, which recommends the software solutions of DrivePool ($30) for concatenating the disks and SnapRAID (free) for writing parity information. So that's what I'm going with for now.

Like Storage Spaces, Drivepool allows you to add more disks to the pool at any time, which means I can use some of my smaller disks as well if I can get the ports to plug them in. I'll need a PCI SATA adapter, but those are pretty cheap. I'll then need a place to mount them. I can buy more drive bay adapters, some slot mounting adapters, or make my own.

So far I'm at least $50 in the hole, and maybe more depending on whether I get that expansion card and/or mounts. So let's say $90 for an array that'll be about 2.5TB, maybe. I can get a 4TB HDD for $120 or an 8TB for $200 when they're on sale, which seems to make them clearly better deals. I get slightly more data security with the RAID array, but it seems like old laptop hard drives still don't have a specific use.
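
The cost comparison in that last paragraph works out roughly like this, using only the dollar figures quoted above (a sketch, not exact pricing):

```python
# Cost per terabyte for the options weighed above, using the figures as quoted.
options = [
    ("DIY pool of old laptop drives (~$90, ~2.5 TB)", 90, 2.5),
    ("4 TB HDD on sale", 120, 4.0),
    ("8 TB HDD on sale", 200, 8.0),
]
for name, price, tb in options:
    print(f"{name}: about ${price / tb:.0f}/TB")
```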

u/firejup · 1 pointr/DataHoarder

YUP! This. USB 3.0 works just fine. The Mediasonic ProBox is fairly popular around here. It supports both USB 3.0 and eSATA. You'll need to make sure your eSATA supports port multipliers, or get an add-on card. Here is the one that I use. Works great and can support 2 of the Mediasonic ProBoxes.

u/Beaston02 · 3 pointsr/DataHoarder

Sure, although my system is COMPLETE OVERKILL, so you don't need anything nearly as serious. I wouldn't be surprised if I get plenty of people criticizing my build specs because most of it is unused/unneeded. You could use a basic CPU, which could be found for probably as low as about $50, and get by if all you're doing is hosting files (still supporting ECC), and most anyone doesn't need 4x gigabit LAN, or support for 512GB RAM, or dual CPUs like I'm running. You could do it for way cheaper than I did, so don't let my build scare you away. I would recommend server grade components, but even entry level (or close to it) server grade would probably get you by. I got some parts for next to nothing, which helped push me to a complete overkill build as opposed to a more practical one at a little less $$. Browse through the FreeNAS forums and /r/freenas to get better ideas, or even post asking for build suggestions.

Anyway, here are the specs of my build:

CPU: 2x Intel Xeon E5-2620 v2 (I got these for $100 each from a friend, the only reason I went with them... complete overkill)

motherboard: ASRock EP2C602-4L/D16 SSI EEB Server Motherboard Dual LGA 2011 Intel C602 DDR3 1600/1333/1066

Memory: 2x Kingston 32GB (4 x 8GB) 240-Pin DDR3 SDRAM ECC Registered DDR3 1600

Case: Norco RPC-4224

SATA/SAS controller: HighPoint RocketRAID 2782 32-port. I went with this card because Amazon had some sort of a pricing error and I grabbed it at ~$180. Reviews aren't great that I've seen, but it's been solid for me.
My power supply is one I had laying around; I can't remember the model, but I believe it's about 750 watts (maybe 850... can't remember... it's more than enough either way).

And I'm currently running FreeNAS on a USB stick, with 12x Western Digital WD40EFRX 4TB Red drives, and 2 500GB 2.5" drives I had laying around (WD - can't remember which model) which are running the jails (programs/addons), mounted inside the case.

I run Plex Media Server /r/plex and have several people running transcoded streams at once, so I wanted plenty of CPU, but I've never used as much as 50% of my computing power even when I've had as many as 7 people streaming transcoded streams at one time. Other than that, most other services I run on it are rather low demand. If you want to run something like Plex (or anything that is CPU hungry), building is a huge benefit, since any prebuilt system will have whichever CPU they chose, and it's usually not intended for much more than shooting files to a few different systems (some do have better CPUs, though, at a cost of course).

Also, another option is NAS4Free (which was forked off of FreeNAS when FreeNAS developed into something more feature-filled and not just a NAS OS). I've never used NAS4Free as it was more limited than I wanted to get into, but it's supposed to be more user friendly if you're looking for a basic NAS and not trying to use it for much more in the way of server purposes. At least that's my understanding; I never dove into it much, so I could be mistaken.

u/Dopamin3 · 2 pointsr/DataHoarder

CT16G4WFD8266. Edit: also make sure to only run it in pairs. An odd number of ECC sticks does not work well, if at all, in Ryzen for whatever reason.

I'm running 4 of these sticks in an ASRock X370 Taichi and Ryzen 1700 (another user reported they work with the 2700X and another board). First boot, ECC was enabled and automatically set to 2666 MHz.

Generally, most ASUS/ASRock motherboards have great support for ECC. I wouldn't be too concerned about the number of SATA ports, though, and would opt for a higher quality motherboard like the aforementioned ASRock Rack or the X370 Taichi (12k capacitors, debug LED, stupid strong VRM, 10 SATA ports). With any board, though, you can always add an HBA like this: https://www.amazon.com/gp/product/B0085FT2JC/ref=ox_sc_saved_title_1?smid=A3SCB8M3AWX80L&psc=1

Last edit: the Taichi is ATX; you're looking for MicroATX. If you're not going the ASRock Rack route (IPMI is a nice selling feature), I would definitely opt for the B450M Steel Legend. It has THE highest build quality of any consumer MicroATX board on AM4.

u/CollateralFortune · 1 pointr/homelab

I highly recommend Fractal Design or Lian Li cases. After going through a number of cases, external DASs, etc. for my 250TB+, those are the only cases I've been truly satisfied with. I highly regret the SilverStone cases I've purchased. This is based on the factors of cooling ability, noise level, and ultimate cost.

From a cooling perspective, the FD cases can't be beat. You can stuff those so full of huge (and quiet) fans that your drives will be positively arctic.

For building a DAS, all you need is a JBOD power board. They go from really simple to really complex, with IPMI, fan headers and all the fixens.

Then something like this with some breakout cables and you are set.

Obviously that's a bit more than the $70 you are talking about, though. But your duplicator case method would be limited on bandwidth vs SAS, which would run the drives at full speed. That's the method I use to put together my DASs now.

u/ndboost · 1 pointr/DataHoarder

I have two DS4243's in my lab at home both full of 2TB Hitachi spinning rust (potentially two more shelves coming if I win the auctions) connected via adapter cables (QSFP+ to SFF-8088) to an IBM M1015 card internally.

The NetApp DS4243 is QSFP+, and it is 3Gb/s (the last number in the model is the transfer rate). You will need to convert that to SFF-8088, and then from SFF-8088 to SFF-8087 if your JBOD card doesn't have SFF-8088 ports externally. Or you can find a compatible QSFP+ card for your NAS.

I have heard murmurs that the DS4243 can be picky about the disks you put in them. I got the two I have with disks already, but IIRC you just need to reformat them into a specific format or something to get them to show up with the DS4243 so be aware of that.

FWIW, if you aren't worried about noise so much, the DS4243 can be picked up for about $100-$150 without disks, and I'd argue it's the best bang for the buck. Even then, it's the quietest thing in my lab unless they're spinning up from a cold boot.

----

So to recreate my setup you'd need the following:

  • 1x - CableDeconn SFF-8088 to SFF-8087 Adapter Bracket
  • 2x - SFF-8087 to SFF-8087 cables
  • 1x - IBM M1015 flashed to IT Mode
  • 2x - NetApp DS4243 JBOD Shelves
  • 2x - QSFP+ to SFF-8088 Cables ~1m - 3m in length
  • 2x - QSFP+ to QSFP+ cables

    You'd connect it all up like this in order from top to bottom:

  • IBM M1015
  • 2x SFF-8087 to SFF-8087 cables
  • SFF-8087 to SFF-8088 adapter bracket
  • 2x QSFP+ to SFF-8088 cables
  • 1st DS4243
  • 2x QSFP+ to QSFP+ cables
  • 2nd DS4243

    FYI, you can chain the DS4243's together (I think the max is two).
u/sh3llm4n · 1 pointr/pcmasterrace

>That's PCIe. I don't see a little green connector.

Perfect. It might just be a cutout to make it fit the systems that I mistook for a connector. I'm not sure how they were mirrored; I just know there were 2 drives in total, of which I now have one (for forensic purposes). They were in an old HP server (the model eludes me, unfortunately). The sysadmin rebuilt the server on top of the second one, so it's temporarily running without a mirror until I can get the data off and get it back to them so he can rebuild the RAID. I feel the sysadmin knows what he's doing, though, so the fact that he said it should work standalone makes me a bit more confident.

I understand there's unknowns but appreciate the help. I think I might just bite the bullet and purchase this one https://www.amazon.ca/IOCrest-SI-PEX40097-Port-PCIe-Controller/dp/B00XI4OL82/ref=sr_1_3?keywords=SAS+HBA&qid=1567090932&s=gateway&sr=8-3 from amazon, if things don't work out I can always just return it and say it didn't work.

u/nerplederple · 3 pointsr/freenas

If it's just a data drive and you're not looking to do anything super fancy with it, these work great.

However, be advised that, because the card is PCI-E x1, if you were to actually plug in 4 hard drives or SSDs, you're gonna run smack against bandwidth limitations if you start trying to hammer I/O on the drives connected to the card all at once.
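
A quick estimate of what that single-lane limitation means in practice (nominal per-lane figures; real-world throughput will be somewhat lower):

```python
# Shared-bandwidth estimate for a 4-port SATA card in a PCIe x1 slot.
lane_mb = {"PCIe 2.0 x1": 500, "PCIe 3.0 x1": 985}   # ~usable MB/s per single lane
drives = 4
for slot, ceiling in lane_mb.items():
    print(f"{slot}: ~{ceiling} MB/s total, ~{ceiling // drives} MB/s per drive "
          f"with all {drives} drives busy at once")
```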

I have this exact card as well as the 2-port PCI-E x2 slot version in use and they work very well for supplementing on-board headers when you're a few short.

I would not attempt to use these cards to run HDDs/SSDs that were going to be datastores for VMs nor as the HBA for something like FreeNAS. If your goal is along those lines, you'd be much better off looking for an HBA like the 9207-8i. You can get those way, way cheap on ebay, and then you just need the correct cables for 'em.

u/mastigia · 1 pointr/linuxquestions

>Why would you consider software raid for enterprise level applications / software and not hardware raid?

If he could afford hardware raid this crappy little file server would not even exist haha. Of course that would be ideal, just not in the budget. And this isn't an enterprise level arrangement, this is just a small business running one small application using consumer grade equipment.

I am actually planning on using this "Semlos New PCI SATA Internal Ports RAID Controller" for connecting the drives, which is not a RAID controller at all despite the description. It is only SATA II, but I think the drives I got won't even max that out, much less SATA III. It has 2x1.5Gbps channels, so each drive is getting 750Mbps. The drive specs say they are doing 554 MB/s / 512 MB/s sequential. Unless I misunderstand, there is plenty of room there? But, feel free to correct me if I have that wrong, this is all kinda new to me.
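
One unit check worth flagging on the figures above: 750 Mb/s (megabits) is roughly 94 MB/s (megabytes), which sits well below the SSDs' rated 554 MB/s sequential. A quick sketch using only the numbers quoted in that paragraph:

```python
# Unit check: 750 Mb/s per drive (megabits) vs. the SSDs' rated MB/s (megabytes).
per_drive_mbit = 750
per_drive_mbyte = per_drive_mbit / 8         # ~94 MB/s per drive
ssd_seq_mbyte = 554                          # SSD rated sequential read

print(f"Per-drive budget on the card: ~{per_drive_mbyte:.0f} MB/s")
print(f"SSD rated sequential read:    {ssd_seq_mbyte} MB/s")
```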

>Also, you sure it's the disk IO that's the issue, and not say network since you are using SSD's (Assuming your motherboard has sata 3.0 ports for those SSD's and not sata 2.0)?

Nope, I am pretty sure the CPU is bottlenecking in addition to the NIC. I ordered a new Gigabit NIC, as stated in my post, and that should also help things I believe. The rest of the network is on CAT6 with gigabit router and switching, but also consumer grade. The router is running DD-WRT though.

>Also, Access is horribly slow anyways, not really meant to be an enterprise level database software handling large volume.

Completely agree with you, but once again this is a small database with not very many users. It isn't heavily accessed all the time, which is why I have been trying to figure out why it is so slow. Some forms just take forever to load, and I didn't build it and am not allowed to modify it. If I came up with very clear and specific design modifications that would increase the performance I could definitely get the author of the DB to implement the changes. But due to my unfamiliarity with Access as a whole and this DB in particular, I am unqualified to do so.

u/mercenary_sysadmin · 1 pointr/zfs

Can you link me to a good example? Preferably one suited for a homelab, ie not ridicu-enterprise-priced to the max? This is something I'd like to play with.

edit: is something like this a good example? How is the initial configuration done - BIOS-style interface accessed at POST, or is a proprietary application needed in the OS itself to configure it, or...?

u/lordbob75 · 1 pointr/homelab

I'll toss in a plug for UnRAID here; it would do what you want. I use it and love it. It may be the simplest option for you, as it is a file server and can also manage VMs and Docker containers. There are Dockers for Plex, backup software (CrashPlan, etc.), and many other useful things (UniFi AP controller, OpenVPN server, web server, FTP, etc.).

I also just picked up two of these for my new server:
https://www.amazon.com/gp/product/B0085FT2JC/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

They work with JBOD and can be used with HDDs and SSDs. Not sure what the T30 and R710 have currently, but you may be able to flash them to IT mode for JBOD support.

If the 9207-8i is too pricey, you can get an H310 for around $50-100 and flash it to IT mode yourself. Amazon has them but ebay may be better.

u/bagofwisdom · 1 pointr/pcmasterrace

That's an 80 pin SCSI drive. Making this work in your PC isn't worth the expense. There is no adapter to go from SATA controller to SCSI drive. You'd need a SCSI controller, which was obsolete when PCIe came out. However, you're in luck as controllers are still made. If this is a 10k or 15k RPM drive it's going to be super loud and super hot. I also never trust used hard drives to last long at all. You'll need a SMART report on the drive and hope it logs its power up/down counts and run times.

u/omeglidan · 0 pointsr/intel

I also currently use an i7 920. I have an Asus P6T Deluxe (LGA1366). The x58 is an enthusiast class platform with 40 PCIe lanes. So I put in a PCIe USB 3.0 card with 4 ports for $18. https://www.amazon.com/gp/product/B011LZY20G/ref=oh_aui_search_detailpage?ie=UTF8&psc=1

Also I got a SATA 3.0 PCIe controller card (will give you AHCI) for $68 https://www.amazon.com/gp/product/B007EM7N70/ref=oh_aui_search_detailpage?ie=UTF8&psc=1

I then added 2 Samsung 850 Pro drives onto the new controller as well as a Samsung 950 M.2 SSD through a PCIe card https://www.amazon.com/gp/product/B018U79YQK/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
Also added 24GB of RAM over the years.

I run an EVGA 1080 FTW and game at 2560x1440. I get the following average framerates: Witcher 3 = 57fps, Trine 2 = 76fps, Starcraft 2 = 50fps, Tomb Raider 2013 = 81fps.

Personally, I am wanting to replace it with an equal or better system, which Skylake is NOT because it only has 16 PCIe lanes to the CPU. I'm eagerly waiting for Skylake-E around the middle of 2017; that's the next big enthusiast-level upgrade.
Any questions just ask.

u/FzzTrooper · 1 pointr/buildapc

Ah, thank you. Today I've learned the differences between all the types of PCIe slots.

Now, what stops me from installing this controller in my bottom most PCI Express 2.0 x16 slot?

Will I get more bandwidth overall and be able to use more SSDs?

From my P8Z77-V LK manual:

>2xPCI Express 3.0* / 2.0 x16 slots (single at x16 or dual at x8 mode)

>1xPCI Express 2.0 x16 slot (black) (max at x4 mode, compatible with PCIe x1 and x4 devices)*

>2xPCI Express 2.0 x1 slots

>2xPCI Slots

> *PCIe 3.0 speed is supported by Intel 3rd generation Core processors.

>**PCIe X16_3 slot shares the bandwidth with the PCIe X1_2 slot. The default setting is x2 mode. Go to the BIOS setup to change the settings.

So from what I can gather, if I put any card in the PCIe x16 slot just below the GPU, it will drop down to x8 mode, correct? That's the slot for SLI. That sounds like it'll slow down the bandwidth from the GPU...? Avoid that.

The x4 card won't fit into PCIe x1 slots or PCI slots.

The x4 card should fit in the 1x PCI Express 2.0 x16 slot (black) just fine, correct? The bottom black PCIe slot. Would that be better for more bandwidth, which would let me use more than two SATA 3 drives?

I think I'm starting to understand how it all works now, hah.
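
To put rough numbers on that black-slot question, a sketch based on the manual excerpts above (the x2 default vs. x4 BIOS setting), using nominal PCIe 2.0 per-lane and SATA III figures:

```python
# Rough bandwidth check for the black PCIe 2.0 x16 slot (electrically x2/x4).
per_lane_mb = 500        # nominal usable MB/s per PCIe 2.0 lane
sata3_mb = 600           # SATA III tops out around 600 MB/s per port
for mode, lanes in (("x2 (board default)", 2), ("x4 (set in BIOS)", 4)):
    ceiling = lanes * per_lane_mb
    print(f"{mode}: ~{ceiling} MB/s, roughly {ceiling // sata3_mb} SATA III SSD(s) "
          f"running flat out at once")
```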

u/wrtcdevrydy · 6 pointsr/DataHoarder

Okay, here's what you're going to want to learn.

Mini-SAS comes in two versions (internal - 8087 or external - 8088).

If you want to connect drives internally, you get an LSI card with internal (8i, 16i)

If you want to connect drives externally, you get an LSI card with external (8e, 16e)

Say you have two boxes, you need one external LSI card with 8088 and one passthrough 8088-8087 card.

You'll need 8087 to SATA cables (an 8i card will have two ports for 2 cables, where each supports 4 SATA cables)

You'll need 8088 cables to connect the external cards together

Figure out how many SATA hard drives you want to support.

8e - 8 SATA drives per external card.

16e - 16 SATA drives per external card.

Shopping List for 16 External Hard Drives from one computer to another:

External Card ($30): https://www.ebay.com/itm/LSI-6GB-16-Port-SAS-SATA-HBA-Controller-Card-SAS9201-16e-H3-25379-01G-Grade-A/273461892263?hash=item3fab9954a7:g:CSMAAOSwfkFbm-XI:sc:USPSFirstClass!33175!US!-1

Mini-SAS Passthrough (2 x $30): https://www.amazon.com/CableDeconn-SFF-8088-SFF-8087-Adapter-bracket/dp/B00PRXOQFA

8087 to SATA (4 x $8): https://www.amazon.com/CableCreation-SFF-8087-female-Internal-Splitter/dp/B013JP7YI8/ref=pd_lpo_vtph_147_bs_lp_t_1?_encoding=UTF8&psc=1&refRID=AYXPARRHH92MDMM64NJJ

8088 to 8088 (4 x $15): https://www.amazon.com/CableDeconn-SAS26P-SFF-8088-External-Attached/dp/B00S7KTXW6/ref=sr_1_3?s=electronics&ie=UTF8&qid=1537045400&sr=1-3&keywords=8088+to+8088
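
As a sanity check on the quantities in that list, a small helper (assuming the 1-to-4 breakout cables and dual-port 8088-to-8087 brackets named above; treat it as a rough tally, not a definitive bill of materials):

```python
# Parts needed to hang N SATA drives off one external card, per the topology above.
import math

def parts_for(n_drives):
    breakouts = math.ceil(n_drives / 4)      # each SFF-8087 breakout feeds 4 drives
    brackets = math.ceil(breakouts / 2)      # each pass-through bracket carries 2 ports
    ext_cables = breakouts                   # one 8088-to-8088 cable per port in use
    return breakouts, brackets, ext_cables

for n in (8, 16):
    b, k, e = parts_for(n)
    print(f"{n} drives: {b} breakout cables, {k} brackets, {e} external 8088 cables")
```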

Edit: Please don't hesitate to ask questions before spending money, just make us a diagram showing where your disks are and where you want to hook them up.

u/nealbscott · 3 pointsr/buildapc

Assuming you have PCI Express 3 in your computer expansion slots, get a card like this:

LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/ref=cm_sw_r_cp_apa_i_xQ-jDbVZ3R30V

It will feed data connections to 8 SATA drives all by itself.

It has 2 SFF-8087 ports. Then get the special 'forward breakout' cable (well, two really). One end goes into the SFF-8087 port and it splits out into four SATA data cables, which go into the hard drives, of course.
https://www.amazon.com/dp/B012BPLYJC/ref=cm_sw_r_cp_apa_i_2T-jDbHFP0FHC

The card can support hardware RAID, but fewer and fewer folks do that. After all, hardware RAID usually requires identical drives, and us folks at home often have a motley collection of drives of various sizes, speeds, and geometries. So software RAID it is. On Linux, folks often use FreeNAS or unRAID. In Windows 10 you can use something called 'Storage Spaces'. Using RAID will allow you to treat all those drives like one device... and have some tolerance for failure (which happens with so many).

Next question: does your case have room? Do you have enough power connectors?

u/LeKKeR80 · 1 pointr/PleX

Pretty much. You'll also need a power supply. I used the build I linked above as inspiration for the two DAS I've built.

I've also added a SAS expander to each DAS to reduce the number of cables running from my server to the DAS. The right expanders can also be daisy chained for nearly unlimited storage.

My most recent DAS build for a total of 12x3.5 HDD slots and 4x2.5 SSD slots:

  • SAS PCIe card 9201-16e
  • mATX case [InWin Mana 137]
  • 3x 5.25" to 5x 3.5" HDD adapter [or this one]
  • PCIe HDD adapter
  • power supply
  • fans
  • fan controller
  • cables [SAS, SAS to SATA forward breakout, SATA power, etc.]
  • Optional - PCIe adapter for easier cable connect/disconnects
  • Optional - SAS expander

u/IsimplywalkinMordor · 1 pointr/freenas

For a case, maybe a Node 804? Or a SilverStone DS380? If you can't find a micro ATX or mini ITX motherboard with enough SATA ports, you'll need a controller in IT mode. The mini ITX is smaller, but you'll most likely be limited on memory slots and SATA ports. I suggest building it with PCPartPicker to ensure compatibility. You can also filter for various qualities like the number of SATA ports or the number of 3.5 inch hard drives, things like that.

u/3Vyf7nm4 · 1 pointr/sysadmin

FreeNAS

It's Free Open Source software, available at no cost (the commercial, paid version is TrueNAS).

It's built on FreeBSD, and uses ZFS filesystem - a Copy-On-Write filesystem that completely avoids the URE/Write Hole problem that RAID5/6/etc. has.

It supports Windows File Sharing, NFS, and iSCSI, and works very well with VMWare. It also directly connects and shares your AWS storage.

---
e1: non-affiliate links, here's how to build a SAN on the cheap:



Qty | Component | Part Number | Amazon | Newegg
---|---|----|----|----
1 | Chassis | SUPERMICRO SYS-5029S-TN2 | $434.90 | $599.99
1 | CPU | Intel Core i5-6500 | $199.99 | $204.99
2 | RAM | Crucial 8GB DDR4(PC4-19200) Unbuffered SODIMM | $70.95 | $69.99
1 | USB drive (OS drive)| Corsair Voyager Vega 16GB | $18.99 | $13.99
1 | SSD (ZFS Cache drive) | Samsung 850 PRO - 256GB | $117.44 | $119.99
4 | SATA Drive | Seagate 8TB (ST8000VN0022) | $259.79 | $259.99
1 | OS | FreeNAS | n/a | n/a
| | Total | $1,952.38 | $2,418.89

The drives are half the cost of the system. If you reduce the size of the disks, you can significantly lower cost. However, keep in mind that RAIDZ will consume 1 disk, and of the remaining 3, you only get 80% with ZFS. With 8TB drives, this is 19.2TB
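
The 19.2 TB figure comes from this arithmetic (four 8 TB drives, RAIDZ1, and the usual guideline of keeping a ZFS pool no more than ~80% full):

```python
# Usable-capacity estimate for the 4 x 8 TB RAIDZ build above.
drives, size_tb = 4, 8
after_parity = (drives - 1) * size_tb        # RAIDZ1 loses one drive to parity
practical = after_parity * 0.8               # ~80% fill guideline
print(f"After parity: {after_parity} TB, practical usable: {practical:.1f} TB")
```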

e2: updated the above to be SATA instead of SAS - the onboard controller does not support SAS

---

e3: Here's the same basic setup, but with a Xeon processor, ECC memory, 2x1Gbps, 2x10GigE, and SAS support:

Qty | Component | Part Number | Amazon
---|---|----|----
1 | System | Supermicro SuperServer 5028D-TN4T | $1,220.00
1 | CPU | Intel Xeon-D-1541 | n/a (included in server)
1 | RAM | Crucial 8GB Single DDR4 (PC4-2133) ECC Registered | $109.57
1 | USB drive (OS drive)| Corsair Voyager Vega 16GB | $18.99
1 | SSD (ZFS Cache drive) | Samsung 850 PRO - 256GB | $117.44
4 | SAS Drive | Seagate 8TB (ST8000NM0075) | $299.79
1 | SAS controller | LSI SAS 9207-4i4e | $114
1 | OS | FreeNAS | n/a
| | Total | $2,888.73

Of the two above systems, the first I would use for home/media, the second is appropriate for business (though it does not have a redundant power supply)

u/Virtualization_Freak · 1 pointr/CableManagement

It's definitely a Rosewill 4U.

However, he has the 12 bay hotswap version.

I'm pretty sure he's using this controller. There are not too many 16-port SATA cards; most people use SFF-8087 to SATA breakout cables.

That looks like an OEM board with an OEM AMD heatsink. But I'm just guessing at this point.

u/raj_prakash · 1 pointr/DataHoarder

For home server applications, something cheaper like this would work? I know the LSI cards have a great reputation for solid build quality, plug and play, and longevity. Just proposing a cheaper option for OP.

https://www.amazon.com/SHINESTAR-Splitter-Controller-Expansion-Non-Raid/dp/B07KNXZFRH/

u/anonymous_opinions · 1 pointr/DataHoarder

It's a QNINE. This one to be exact: https://www.amazon.com/gp/product/B01N5LQ7Z3/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

Reviews are hit or miss but I guess I got a pretty good one.....so far.

u/mvillar24 · 2 pointsr/unRAID

The question about PCI-E SATA cards is how much you are willing to spend and what available PCI-E slots do you have on your motherboard.

The cheapest I've tried (with the slowest throughput), when you only have PCI-E x1 slots free, is to use four-port SATA cards like this Marvell 88SE9215 chipset-based card for $33 on Amazon:
(http://www.amazon.com/gp/product/B00AZ9T3OU)

If you have at least a PCI-E x4 slot, you can get something faster for $100 - $160, such as (note these are 8-port cards):

  1. HighPoint RocketRAID 2720SGL 8-Port
  2. Supermicro AOC-SASLP-MV8

On eBay used:

  3. Dell HV52W PERC H310

A number of the above solutions are not as fast as you can go, since they use PCI-E x4 slots, but x8 slot cards can cost a lot more. Personally I don't notice the slowdown as much, since I'm really using these drives to stream, and don't notice that parity checks and moving data from cache to permanent drives take longer.
u/TheCheapNinja · 4 pointsr/DataHoarder

LSI LOGIC SAS 9207-8i Storage Controller LSI00301


Mini-SAS to 4x SATA Forward Breakout Cable


I picked up one of these cards and the breakout cables; it handles 8TB drives and was easy to install. Works great.

u/darkciti · 1 pointr/homelab

Thanks. I'm thinking I can use a 4 port card (but the H200 is only 2 ports) and break 2 of them off to an external SAS adapter like this.

Now I'm just wondering if the performance would be better with 2 cards or 1 card with 4 ports.

u/Sparkybear · 1 pointr/buildapc

I don't have an answer for you. Once you get into controllers that have more than 4 ports, you're looking at a lot of server hardware. The Rampage V Extreme won't allow you to create a 6-disk RAID, nor would you want it to. The purpose of a RAID controller is to offload the work of creating and managing the array to a hardware controller, instead of doing it through software.

I've found an okay-ish place for you to start looking:
https://forums.servethehome.com/index.php?threads/lsi-raid-controller-hba-equivalency-mapping.19/

This is probably one of the better controllers I've found:

https://lenovopress.com/tips0776-6gb-sas-hba?redbooks-divestiture

and on Amazon: https://smile.amazon.com/IBM-46M0907-6GB-Sas-Hba/dp/B003OYV9D6?sa-no-redirect=1

u/DerBootsMann · 2 pointsr/sysadmin

Nice! Here's one for free! I mean, I don't see how I could charge you for pointing something out!

https://www.amazon.com/Adaptec-2248700-R-Express-1-Channel-Adapter/dp/B000NX3PII

u/brains_are_fun · 3 pointsr/WhatsInThisThing

They did, but I believe they were only made to operate SCSI peripherals (scanners and such). Your best bet would be to find a desktop with a regular PCI slot and install a SCSI card.

It's not that hard or expensive:

http://www.amazon.com/Adaptec-1822300-R-39160-Ultra160-Controller/dp/B00005148W/

u/zeblods · 1 pointr/zfs

Do you know if there's a list of supported chipsets for the ROCKPro64 PCIe slot?

With something like this I could connect up to 16 SATA discs and run ZFS with it.

I currently have an old server: Tyan motherboard, Intel Core 2 Duo, 8GB DDR2 RAM, PCI-X (not Express, the old PCI-X format), 10 3TB hard drives in RAID-Z2... It's massive, puts out a lot of heat for not much performance, and I'd like to replace it with a lighter config like a ROCKPro64...

Performance-wise, I only need gigabit speed, about 100MB/s tops.

u/WetVape · 1 pointr/unRAID

So I would want 2 of something like this? IO Crest 16 Port SATA III PCIe 2.0 x2 Controller Card Green, SI-PEX40097 https://www.amazon.com/dp/B00XI4OL82/ref=cm_sw_r_cp_api_i_pClzDbHH2EXC0

Is there a specific card that is popular in the unRAID community?

u/matjeh · 1 pointr/DataHoarder

For best performance, 8 of these:
mSATA x4 to PCI-e 2.0 x4 adapter:
https://www.amazon.com/Syba-HyperDuo-Controller-Marvell-Chipset/dp/B00KKO6N98
price: 62 USD each.

and a motherboard that can take 8 PCI-e slots, like one of these:

MSI Big Bang Marshal : https://bit-tech.net/previews/tech/motherboards/msi-big-bang-marshal-preview/1/

or: Colorful B250A-BTC PLUS : https://i.redd.it/9c1di1ndfv901.jpg

I'd recommend using ZFS and disabling any RAID functionality on those adapters.

u/deusxanime · 2 pointsr/PleX

That all depends on what you want to do with it. I use a Supermicro AOC-SAS2LP-MV8 that I got for ~$130 on Amazon (I actually have the older version as well, the AOC-SASLP-MV8, but that doesn't support drives quite as large, so stick with the SAS2 now). It supports 8 SATA drives via 2 SAS to SAS connectors. Much easier to run 2 cables than 8. It doesn't have any hardware RAID support, but I don't want that anyway. Trying to rebuild a hardware RAID built out of 8 or more 2+ TB drives would be incredibly long and painful. I use SnapRAID instead for parity inside the OS.

u/mahkra26 · 3 pointsr/homelab

I bought a 24-bay Supermicro 2U case with an old AMD motherboard in it and gutted it into a JBOD array with the help of a few small adapters, like so:

  • There's these (with nothing to remove thankfully) on ebay right now: case
  • Install this in place of a motherboard: JBOD module
  • you'll need an 8087 to 8088 adapter
  • You might need some 8087-8087 cables

    Topology is: SAS expander backplane top and bottom ports (ignore middle) to the two internal ports of the low profile adapter via two 8087 cables, then a standard e-SAS (8088) cable to the LSI 9207-8e in my server from the external ports.

    This has worked out fabulously for me.

    For added comfort (aka noise and power consumption), I removed the stock dual power supply that the 2u case included and replaced it with the guts of a 230w atx power supply, since I don't have dual sources. That cut the power draw down by ~80w or so. I also replaced the fans with much quieter ones (standard ~50 CFM 80mm units) and then improved airflow by taping over holes with masking tape, and using a thick paperboard to block other areas - the main purpose being to force the airflow through the drive bays.


    Edit:
    If you prefer LFF drives, there are 12-bay 3.5" units already assembled with all the necessary parts on eBay: http://www.ebay.com/itm/222338813833
u/drashna · 1 pointr/DataHoarder

They can be, yeah. But you can find them for pretty cheap sometimes, too. link

As for the 36-bay case, it cost me ~$600: $300 for the case, and $300 for "parts" (upgrading to SAS2, drive trays, rails).

As for the 45-60 bay enclosure, I'll be saving up for that. $700-1000 is a reasonable price for these used. Though, if you're lucky, you can find them for much less.

link

u/meyerjaw · 2 pointsr/PleX

Yeah, it wasn't cheap but I'm hoping this NAS will last for years so I was willing to spend the money. This is the RAID Card that I got along with this SAS Expander

u/seizedengine · 3 pointsr/homelab

You can also buy adapters if you have trouble finding a card like the 9207-4i4e

SFF-8087 and SFF-8088 do the same thing; SFF-8088 (external) connectors are just larger and much more durable. So converting between them is easy and safe.

https://www.amazon.com/CableDeconn-SFF-8088-SFF-8087-Adapter-bracket/dp/B00PRXOQFA/ref=pd_cp_147_1?_encoding=UTF8&pd_rd_i=B00PRXOQFA&pd_rd_r=GE4FNVAZ8Q0AGPSFF88G&pd_rd_w=88BoN&pd_rd_wg=kiVIF&psc=1&refRID=GE4FNVAZ8Q0AGPSFF88G

u/xsnyder · 2 pointsr/homelab

Here is one on Amazon. Be aware that you will be limited to SATA-I 1.5Gb/s.

QNINE 4 Ports PCI SATA Raid Controller Internal Expansion Card with 2 Sata Cables, PCI to SATA Adapter Converter for Desktop PC Support HDD SSD https://www.amazon.com/dp/B01N5LQ7Z3/ref=cm_sw_r_cp_apa_i_OMv5CbP70TQP5

Another question: this sounds like very old hardware; is there any chance of upgrading so you could take advantage of newer revisions of SATA or PCI-E?

u/CookieLinux · 3 pointsr/DataHoarder

You could get a SAS RAID card and cables, all for ~$27 on Amazon. That would give you 8 ports with those cables, and if you wanted, you could use a SAS expander cable to get more.

OR

Get a $20 4-port SATA RAID card and go with that.

FYI, you can plug a SATA hard drive into a SAS RAID card, just not the other way around, like how you can fit a small box inside a big box but not the other way around.

My recommendation would be to get the SAS RAID card so you have a little expansion room if you want more drives.

u/willglynn · 3 pointsr/DataHoarder

You didn't ask me, but you could get a Lenovo SA120, LSI 9200-8e, and the appropriate cable for under $300 – leaving some cash for Lenovo drive trays.

(Note also that none of these parts are necessarily ideal for you; for example, the MSA60 costs less and includes trays but has its own drawbacks. It's hard to say without knowing requirements.)

u/snuffeluffegus · 1 pointr/pine64

I would assume the one Pine64 sells is compatible, but the one I got was defective so I can't speak to that. I do know of an alternative: I picked up the card below instead, which works great with the vanilla Armbian build for the RockPro64, on top of which I installed OpenMediaVault.

SHINESTAR SATA Card 4 Port with... https://www.amazon.com/dp/B07KNXZFRH?ref=ppx_pop_mob_ap_share

The pro of this card is that you get 4 SATA ports vs 2 on the card that Pine64 sells, so in my NAS enclosure I have 2x 3.5" platter disks and 2x 2.5" SSDs, which I power off the RockPro64 with the power splitter that shipped with the SATA card.

u/AshleyUncia · 8 pointsr/DataHoarder

https://www.amazon.ca/gp/product/B00XI4OL82/ref=oh_aui_detailpage_o00_s00?ie=UTF8&th=1

If you want something weird there's THIS thing. I've never USED it but it seems to be FOUR of those chips on one card. I think it might ACTUALLY be four discrete devices, each getting their own PCIE lane. In retrospect it'd have been more space efficient on my system, cause I could have had a lot more on my 16x slot.

u/FoxxMD · 1 pointr/unRAID

So it's a PCI controller like this? ASM1062 is a chipset; it could be onboard or PCI -- you need to specify.

If it's PCI and consumer hardware, it's probably not PCIe 2.0 x4 (or anything higher), so probably x1 or x2. x1 can do 500MB/s; x2 can do twice that. But that's going to be shared between read and write across all drives. If this is your configuration, I'd assume you are bottlenecking the bus. If you have onboard (on the motherboard) SATA ports, I would offload as many drives to those as you can.

u/nameBrandon · 9 pointsr/DataHoarder

I was just in this position. I've got an older i7 box with 24GB of RAM, and had 8x 3TB drives crammed into the tower forming a RAID-6 array that was ~97% full. I'm running OpenMediaVault to manage the storage, simply because I prefer Linux to something like FreeBSD. It also has a Plex plugin, and I run Plex on the storage box locally.

After a lot of research, I purchased the following.

LSI 9200-8e - SAS HBA - ~$40

Lenovo ThinkSever SA120 DAS - ~$200

12 drive caddies / trays for the DAS (optional, but suggested) - check eBay, ~$100 total. You can use the caddies that come with the SA120 but need to dremel them and drill screw holes.

I moved all of my drives to the DAS (Except OS drive) leaving 4 more bays for expansion. I added 2 more 3TB drives and grew the array (actually still waiting on that to finish...).

So for ~$350 I moved to a much more flexible setup (you can actually daisy chain the DAS's, so you can buy another one for 12 more bays when you're ready) and extended the life of the setup by quite a bit.

u/jppowers · 2 pointsr/DataHoarder

This is what I bought building my new rig: https://smile.amazon.com/gp/product/B0085FT2JC/ref=oh_aui_detailpage_o02_s00?ie=UTF8&psc=1

I paid less than the current price listed there. Shop around for the LSI 9207-8i; it's the best bang-for-the-buck HBA that's new in box that I've found.

u/ixidorecu · 1 pointr/freenas

You could go a more ghetto route, depending on the case etc.:
int to ext

like here

But yeah, I second the LSI 9201-16e for 6Gb/s; new off Newegg it was $26.

u/jdrtechnology · 1 pointr/unRAID

I recently put in an LSI card to attach 8 HDDs to my array (I have 5 SSDs attached making up my cache - not ideal, but I had the parts, so... ;-). It worked out of the box: no flashing, no updates. I ordered mine from Amazon.com. It was $75, but I did not want to risk it, as this is my server (worth the $25 to me for simple peace of mind).

https://www.amazon.com/gp/product/B0085FT2JC/ref=ppx_yo_dt_b_asin_title_o04_s00?ie=UTF8&psc=1

I combined that with the splitter cables (I used these: https://www.amazon.com/gp/product/B07CKX6HVV/ref=ppx_yo_dt_b_asin_title_o03_s00?ie=UTF8&psc=1 ) and I have had 0 issues.

It was by far the most highly recommended card, and I did not want to deal with a bunch of random issues to save $25.

u/DuggyMcPhuckerson · 1 pointr/PleX

I tested my server by adding 1080p transcoding streams one at a time while monitoring the system resources. I found that when I reached 4 simultaneous streams, my memory utilization was at 88% of 8GB while the AMD FX-8350 CPU was at 56% utilization. My guess is that you will run out of RAM before you run out of CPU. I would look at increasing your RAM to 8GB or higher before anything else.

As you mentioned, additional storage will always be an issue as your system grows. While most motherboards will only support 4 to 6 SATA ports, if your case size permits, look at adding a SATA card to your system for additional ports. My motherboard currently has 6 ports plus the four additional provided by my SATA card.

u/unfadingpyro · 1 pointr/DataHoarder

I don't think any such thing exists for multi SATA to USB. At least not to my knowledge. Another solution would be to use an internal-SAS-to-external-SAS card (like this) and then use an LSI 9200-8e card in IT mode on your main computer. That would present each hard drive to the computer as an individual hard drive, like you were connecting them over USB.

With each port on the internal-to-external SAS card you can connect 4 SATA drives.

u/hazel-the-rabbit · 3 pointsr/buildapc

I expanded to add extra drives to my 8 bay NAS with this card

It works great. I also have 6 SATA ports, so even if you drop 2 ports you have room on this card.

u/snickers46 · 2 pointsr/pcmasterrace

Are you talking about getting just the SSD for now to install on their existing machines? I'd check to see if their motherboard has SATA connectors to support the hard drive. If not you'll probably need something like this

http://www.amazon.com/gp/aw/d/B00L2X6DE6/ref=mp_s_a_1_2?qid=1419014929&sr=8-2&pi=SL75

Assuming there is no SATA, they would probably have a white PCI slot; again, I would double-check.

u/zry95 · 1 pointr/homelab

Thanks so much. This one would already be in IT mode, wouldn't it? Then I could just update the firmware using the normal firmware update process?

https://www.amazon.com/dp/B0085FT2JC

u/chrisinvt · 1 pointr/CableManagement

OP here, sorry guys I've been busy at work with EU problems :)

It's set up to be a hybrid SAN/NAS depending on how I decide to configure it (currently no OS). You're correct in saying it's a SAN device, and not an entire Storage Area Network, but I didn't think I'd have to be so specific. The original plan was to install this into my rack and piggyback it to my NAS over fiber, but I might use it as an off-site mirror for the NAS.

It didn't really need much horsepower, so I used a mainboard out of an old HP dx2300MT I had kicking around. RAM is 2GB of DDR2 & CPU is a Pentium D 3.4 GHz.

As for the case, you're correct as it's the Rosewill RSV-L4411 and the card is also an Areca ARC-1160D with the RAM upgraded to 1GB.

u/tigershadowclaw · 2 pointsr/homelabsales

In order to use the drive at full link speed (SAS3) you would need something like this: https://www.amazon.com/LSI-Broadcom-9300-8i-PCI-Express-Profile/dp/B00DSURZYS and this cable to go with it (for a desktop anyhow): https://www.amazon.com/CableCreation-Internal-SFF-8643-SFF-8482-connectors/dp/B01F378UF6

if you don't care about getting the full 12Gb/s from it, you can go with the cheaper LSI 9207-8i controller ( https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC ) and this cable https://www.amazon.com/dp/B013G4FEGG/ which would allow you to get 6Gb/s, which is the current max SATA speed anyhow. (SATA1 is 1.5Gb/s, SATA2 is 3Gb/s, and SATA3 is 6Gb/s, while SAS1 is 3Gb/s, SAS2 is 6Gb/s, and SAS3 is 12Gb/s.)

u/SNsilver · 1 pointr/homelab

I followed the link to your NAS build. Instead of a SAS breakout cable, do you think something like this would work just as well? I am doing a very similar build, but with 8TB HDDs.

u/old63 · 1 pointr/zfs

Thanks so much for all this!

I had found the memory and controller card below in the interim.
https://www.amazon.com/Tech-PC3-12800-PowerEdge-A3721494-Snpp9rn2c/dp/B01C7YS08U

https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC

I think these will work. What do you think?

On this build I probably won't try to get a SLOG for the ZIL, but in the future I may, if we test and can hook these up to our VM hosts. Do you have any recommendations for that? I know NFS does sync writes, so I think I'll need a SLOG if I do that.

u/Gumagugu · 1 pointr/homelab

Another (more ghetto) option is to convert the external cabling to internal, using something like this https://www.amazon.co.uk/CableDeconn-SFF-8088-SFF-8087-Adapter-bracket/dp/B00PRXOQFA/ref=sr_1_1?ie=UTF8&qid=1542284971&sr=8-1&keywords=8088+to+8087 and then using a SAS expander.

u/proxydouble · 1 pointr/pcmasterrace

There are PCI SATA expansion cards. Like this: http://www.amazon.com/gp/aw/d/B00L2X6DE6/

u/bobj33 · 2 pointsr/DataHoarder

You can turn internal SAS SFF-8087 ports into external SAS SFF-8088 ports using a bracket like this and some 87 to 87 cables.

https://www.amazon.com/gp/product/B00PRXOQFA

u/HangGlidersRule · 2 pointsr/homelab

I'm using an LSI 9200-8e; the MD1200 is attached to it with two SFF-8088 cables. It supports the full 6G.

https://www.amazon.com/gp/product/B005MQP232/

u/gilahacker · 1 pointr/unRAID

I'm using two of these, myself:

https://www.amazon.com/gp/product/B0085FT2JC

They work great with my 4 and 10 TB HGST NAS drives, but I did have a problem with my Samsung 850 EVO SSD. There is a firmware update available for them that I haven't tried yet (I just moved the EVO to on-board SATA ports and it's fine).

Edit: You'll need cables like these (it doesn't come with them): https://www.amazon.com/gp/product/B013G4EMH8

u/cheese93007 · 1 pointr/techsupportmacgyver

Amazon. Compatible cards are surprisingly plentiful.

u/brossman · 2 pointsr/buildapc

I've got this in my unRAID box. I haven't had a problem yet. You'll also need some cables like this, though.

u/logikgear · 2 pointsr/freenas

Here is the HBA I use with FreeNAS.

LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/

You will also need these to connect drives to that card.
Cable Matters Internal Mini SAS to SATA Cable https://www.amazon.com/dp/B012BPLYJC/

u/AttackTribble · 9 pointsr/talesfromtechsupport

You can get PCIe SCSI cards, and SCSI is designed to be backwards compatible. You would need to find an adapter though.

http://www.amazon.com/Adaptec-2248700-R-Express-1-Channel-Adapter/dp/B000NX3PII

u/goar101reddit · 1 pointr/DataHoarder

I'm looking at a very similar card to do something very similar. But this card seems to be the best/cheapest working method I've found. Does anyone know of a cheaper card (or option)? One that works with either a single slot or only PCIe x1?

u/loki8481 · 1 pointr/DataHoarder

depends on what you need the drive for.

you can either use a USB drive (if it's just for backups), eSATA (your motherboard may have an open slot for one on the back), or buy a SATA card like this

u/Reset_Assured · 0 pointsr/DataHoarder

I think that answers my question. I'll just get another SilverStone DS380 case and a bracket.

u/charonpdx · 1 pointr/VintageApple

You'd need a SCSI card in the PC.

If you have an older PC with a conventional PCI slot (pre-PCI Express,) SCSI cards are cheap and readily available on eBay.

If your only PC is a newer one, you'd need either a PCI Express SCSI card (not cheap) or an old USB SCSI adapter (really not cheap) then adapter cables to convert the "external SCSI" to an internal SCSI, or the high-density internal connector to the bigger/lower-density connector on the drive.

u/_kroy · 1 pointr/homelab

For now, I was just going to use it as-is. I know it's some X3400 CPU under the hood.

Based on my order history, it was just one of these

u/babecafe · 1 pointr/DataHoarder

I didn't get joy from this card under Linux when I briefly tried it, but Windows might be more accommodating. It's a 16-port SATA controller: https://smile.amazon.com/IO-Crest-Controller-Green-SI-PEX40097/dp/B00XI4OL82

u/exodius06 · 2 pointsr/DataHoarder

I think the budget may be what ends up deciding it. I don't want to go with something that is going to be too slow though. I am using the storage area for my Plex media if that matters.

This is the LSI HBA I've found so far, and from what I see they're both 6Gb/s, so I'm not seeing the advantage.

For the motherboard, I didn't really notice any difference in my power bill when I started using this one, so I've never thought too much about it. Can you give me an example of a board you're talking about? The only thing that's ever been a little annoying with this box has been how loud it can be.

This is the case I'm using if that helps.

u/flux103 · 1 pointr/DataHoarder

That would be the most efficient and economical, and done properly with this CableDeconn Dual Mini SAS SFF-8088 To SAS36P SFF-8087 Adapter In PCI bracket https://www.amazon.com/dp/B00PRXOQFA/ref=cm_sw_r_cp_apa_BqNwzbV4SGD8K I personally would just run it through an empty PCIe bracket though, to keep the component count down and decrease the failure rate.

u/General-ColinBile · 1 pointr/unRAID

Check out the UnRaid forum. I'm looking at these:

-Supermicro PCI Express x4 Low Profile SAS RAID Controller (AOC-SASLP-MV8) https://www.amazon.com/dp/B002KGLDXU/ref=cm_sw_r_cp_apa_UPmFxb8KHH59P

  • I'm on mobile and not at home so I can't find the other.
u/7824c5a4 · 4 pointsr/homelab

He mentioned in his last post that it has QSFP ports, and that he would be buying an SFF-8088 to QSFP adapter. No idea how NetApp handles it though.

OP says
> IBM M1015 in IT mode -> SFF-8088 to SFF-8087 adapter card -> NetApp DS4243 via QSFP -> SFF-8087 cables

u/The128thByte · 1 pointr/buildapc

Are you saying PCI or PCIe? PCIe has loads of SATA III controller boards, but PCI is gonna be limited to SATA II. I'm going to assume PCIe because you said 3.0 afterwards, though. I would go with this then: https://www.amazon.com/dp/B07KNXZFRH/ref=cm_sw_r_cp_api_i_rKlTCb1B4ZXDS

u/zirus1701 · 3 pointsr/PleX

You get a PCI Express SAS controller to install. They come in 2- and 4-port varieties, and you get SAS-to-SATA cables (each turns one SAS port into 4 SATA connectors) to plug into it to connect your hard drives. That could be 16 drives per card; with a couple of PCI Express slots you'll have more SATA connectors than you have room for hard drives.

Edit: I'm in no way recommending this specific one, but here is one example of what I'm talking about:

https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC/ref=sr_1_4?keywords=SAS+controller&qid=1567797015&s=gateway&sr=8-4 (it's a 2 port variety).

and for cables:

https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=sr_1_3?keywords=SAS+SATA+cables&qid=1567797027&s=gateway&sr=8-3

u/KaleemG2K · 1 pointr/techsupport

This is the exact one I bought: https://www.amazon.co.uk/gp/aw/d/B00L2X6DE6/ref=yo_ii_img?ie=UTF8&psc=1 I'm not big on computers so I'm not sure about all this stuff; from what you're saying I guess I bought the wrong one?

u/Drak3 · 1 pointr/homelab

these bad boys?

meh, if it works, it works.

u/Covecube-Christopher · 2 pointsr/DataHoarder

Cheap? $100-150 is going to be cheap. Anything cheaper is shitty and you will have issues.

And fuck crossflashing. Get one of these: https://www.amazon.com/LSI-9207-8i-Storage-Controller-LSI00301/dp/B0085FT2JC