7950x/4090. This thing is quick. The 4090 fits into the Silverstone Alta G1M perfectly, slotting in, taking up all four slots, and sitting pretty above the 280mm bottom intake fan. Temps are impressive; the fans don’t run most of the time. Yes, thIs CAse iSn’t Sff.
However, the case accepts SFX/SFX-L PSUs. You know what’s hard to fit into this case? A giant squid of an adapter cable.
Not even hard, impossible.
So, get yourself:
- connector: https://www.mouser.com/ProductDetail/Amphenol-FCI/10161719-124GLF?qs=A6eO%252BMLsxmTOUw0b8ngp1w%3D%3D
- terminal big: https://www.mouser.com/ProductDetail/Amphenol-FCI/10132447-121PLF?qs=KVgMXE4aH4kD2xtDkHxOJw%3D%3D
- terminal small: https://www.mouser.com/ProductDetail/Amphenol-FCI/10161952-2210LF?qs=pBJMDPsKWf28ZtMmoXzPFw%3D%3D
- Some *high quality* 16AWG wire with an OD of <=2mm: https://www.titanrig.com/mod-one-custom-pc-cable-wire-16awg-2mm-od-04-30-md-0116-00-xx.html?color=330
- Some of these Molex terminals for the PSU side: https://www.mouser.com/ProductDetail/Molex/45750-3112?qs=c7kbnNtxOmhyR5kVrGXrUg%3D%3D&countryCode=US&currencyCode=USD (I recommend at least HCS; I’m going to use HCS Plus to be safe. Lesser terminals can only do ~570W across 2x PCIe/EPS lines)
- Crimper, stripper, etc
- Instructions, pinout, etc
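To see why terminal grade matters, here’s a rough sketch of the math behind that ~570W ceiling for lesser terminals. The per-pin current figures are my assumptions based on typical Mini-Fit family numbers, not datasheet gospel, and the six-circuit count assumes two bundles with three 12V circuits each; check the datasheet for your exact terminal and wire gauge.

```python
# Rough power budget by terminal grade: per-pin current (ASSUMED values,
# not datasheet-verified) x number of 12 V circuits x the 12 V rail.
RAIL_V = 12.0
CIRCUITS_12V = 6  # assumption: two bundles, three 12 V circuits each

assumed_amps_per_pin = {
    "stock": 8.0,      # assumed derated rating for standard terminals
    "HCS": 10.0,       # assumed
    "HCS Plus": 12.0,  # assumed
}

for grade, amps in assumed_amps_per_pin.items():
    print(f"{grade}: ~{amps * CIRCUITS_12V * RAIL_V:.0f} W")
```

8A x 6 circuits x 12V lands right around the ~570W figure quoted above for lesser terminals, which is why HCS or better is worth the few extra dollars here.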
The ATX spec lists a max of 150W per PCIe power connector. The 3080 12GB pulls 360W total: 2x 150W PCIe connectors plus the spare 60W from the PCIe slot. These adapter cables are also 1x Molex Mini-Fit Jr (EPS) to 2x PCIe 6+2-pin (or 8-pin). This means that any PSU that comes with two 1x EPS -> 2x PCIe 8-pin cables can do 2x EPS -> 12VHPWR and run the card at 600W (full spec). Well, if it’s decent quality and YOU BUILD YOUR ADAPTER CABLE SAFELY. No warranty; this is educational info only, don’t do this, etc.
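A quick sanity check of those budgets, using only the spec numbers quoted here (150W per PCIe 8-pin, 75W from the slot) plus the 300W-per-bundle figure discussed further down for the EPS side; nothing is measured:

```python
# Connector power budgets quoted in the post (ATX / PCIe CEM spec figures).
PCIE_8PIN_W = 150       # spec limit per PCIe 8-pin connector
SLOT_W = 75             # PCIe motherboard slot limit
CARD_FROM_SLOT_W = 60   # what the 3080 12GB actually pulls from the slot

# 3080 12GB example: two 8-pins at spec plus 60 W of slot power
print(2 * PCIE_8PIN_W + CARD_FROM_SLOT_W)  # 360 W total board power

# The adapter plan: two EPS-side bundles at 300 W each into one 12VHPWR plug
EPS_BUNDLE_W = 300
print(2 * EPS_BUNDLE_W)  # 600 W, full 12VHPWR spec
```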
You’ll need to ground out both sense wires for 600W; grounding only one signals 450W, etc.
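That sense-wire scheme can be sketched as a lookup table. This follows the power limit table commonly cited from the PCIe CEM 5.0 spec for the 12VHPWR sideband pins; the function name is mine, so verify the actual pin assignments against the spec (or your card’s docs) before wiring anything:

```python
# Sideband sense-pin decoding for the 12VHPWR plug, per the power limit
# table commonly cited from PCIe CEM 5.0 (reference sketch, verify yourself).
def power_limit(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Return the advertised sustained power limit in watts."""
    table = {
        (True, True): 600,    # both sense wires grounded -> full 600 W
        (True, False): 450,   # only SENSE0 grounded -> 450 W
        (False, True): 300,
        (False, False): 150,  # both left open -> 150 W fallback
    }
    return table[(sense0_grounded, sense1_grounded)]

print(power_limit(True, True))   # 600
print(power_limit(True, False))  # 450
```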
There are no SFX PSUs with a native 12VHPWR plug. Fuck, there’s only ONE 1000W SFX PSU out right now. ASUS (Loki), Cooler Master, etc. have “released” theirs, but none can actually be found on the market.
Specific to the Silverstone 1000W SFX, from its warranty terms:

> Similarly, a graphics card or expansion card with dual PCIe 8-pin connectors that exceeds 375W total power draw (300W from two PCIe 8-pin connectors + 75W from the PCIe motherboard slot) will also not be covered under warranty.
Two PCIe 8-pin connectors come in one wire bundle with this PSU (a 1x EPS -> 2x PCIe 8-pin cable), and each can support 300W. As long as you use at least HCS terminals and 16AWG wire, you’ll be “rated” for full-fat 600W on this 1000W SFX PSU.
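For a feel of how hard each wire works at full load, a quick check. The six-circuit count is my assumption for the 12V side of the plug; count your actual crimped circuits:

```python
# Per-wire current at full 600 W across six 12 V circuits (assumed count).
TOTAL_W = 600
RAIL_V = 12.0
CIRCUITS = 6

amps_per_wire = TOTAL_W / RAIL_V / CIRCUITS
print(round(amps_per_wire, 2))  # 8.33 A per 16AWG run
```

Quality 16AWG wire typically handles ~10A or more in chassis wiring, so there’s headroom, but only if every crimp actually carries its share; that’s why the crimp quality below matters so much.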
So, get yourself pinouts, the parts, and learn how to safely make your own cables. I’ll update my blog once everything is built and installed (sometime next week).
And just so there isn’t any confusion, your cable should look something like this:
As for the early reports of PCI-SIG finding some Nvidia 12VHPWR power cables melting when bent near the connector, it may be down to how some have reported that properly crimping 16AWG wire to these new connector terminals can be a bit difficult. You’re going to want to make sure all of your wire strands are fully captured in the crimp for best current handling, and use high quality terminals (stock terminals aren’t rated for 600W in this config; use at least HCS).

Edit: Lol, no. It’s mostly due to Nvidia’s shitty double split/seamed terminals and cable pull-out. Continue reading.

I’m going to try to solder mine after crimping for extra margin (turns out that’s not possible, the terminal is too small), as I’ll need to bend my cables pretty tightly (which is fine if you build the bend into your cable lengths). You may also want to leave an extra millimeter of wire past the terminal crimp area to account for any potential pull-out (also not possible, clearances are super tight); just throwing ideas out there and speculating about those initial PCI-SIG reports. Cablemod also has a preorder up for a 90-degree connector.
Edit: the Nvidia-provided adapter cables use a double split/seam terminal, which is likely what’s causing the fires/melting.
My cables with OEM terminals look more copper-colored and are single-seamed:
A single-seam connector wouldn’t split open like that, at least not as easily, as seen in this explainer video:
Nvidia vs OEM. Which one do you think is more secure?