
1 Rookie • 7 Posts • December 13th, 2024 00:37

Dell 5820T Workstation PCIe Bifurcation problem - I’ve tried everything!

I’m trying to use a 4x M.2 NVMe to PCIe add-in card, and I can’t get the dang thing to work! At least not in this computer. It shows only 1 of the 4 drives.

I can't find any BIOS setting to set bifurcation to x4x4x4x4. Is there such a setting? My understanding is turn off VROC, turn off VMD, and it just... works? That surprises me, but after spending way too many hours searching forums, that's how it seems?

I’ve tried all the standard things ;_;

  • From googling, the 5820 should support PCIe bifurcation; many people have said they use it.
  • BIOS version 2.36.0 from May 10, 2024
  • I am using a PCIe x16 slot that is wired x16. This tower has two electrical x16 slots, and I have tried both, including the one nearest the CPU
  • This is a Xeon W-2125 processor. It supports bifurcation as far as I can tell, and I have plenty of PCIe lanes
  • I have VROC turned off, and I have no VROC key installed
  • VMD is off
  • There is no setting in my BIOS that I can find to explicitly turn a PCIe slot into x4x4x4x4 or other configs, but as far as I can tell, I shouldn’t need one for it to work
  • I have tested the card on another system with an ASUS TUF GAMING B650-PLUS WIFI motherboard. After manually configuring that board’s BIOS to RAID mode (clearly it’s not literally RAID, right?), I could see all 4 drives and use them individually in Windows. I could also do x8x8 and see two of the drives. So the card works, at least in that context
  • This is the card in question. It’s a JEYI, but people in the comments say it works. Also, see the previous bullet point ^
  • In the BIOS, under system info, under PCI information, it says the slot is populated by “Mass Storage,” which seems expected

I am stumped. I can boot into TrueNAS Scale (on a different drive), and it can see the first drive just like the BIOS. But the BIOS just cannot see the other drives.
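For reference, this is roughly how I’m checking from the TrueNAS shell. These are generic Linux commands, nothing Dell- or TrueNAS-specific, and the last one assumes the nvme-cli package is present:

  # List block devices and their models; with working bifurcation I'd expect four entries
  lsblk -d -o NAME,MODEL,SIZE

  # List NVMe controllers on the PCIe bus; each working x4 link shows up as its own device
  lspci | grep -i "non-volatile memory"

  # If nvme-cli is installed, this lists every controller/namespace the kernel found
  nvme list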

Does anyone have an idea? I’m down to do any testing or settings changes suggested! There’s gotta be some random thing I’m missing.

1 Rookie • 7 Posts • December 21st, 2024 21:27

Final update, best solution, and testing:

I tried the add-in card with no drive in slot 1, and this sent the computer into a permanent reboot loop. This computer does 3 reboots after every hardware change; that's normal. But I sat through at least 10 reboots before pulling the plug.

I thought maybe bootloaders, partitions, or boot sectors might be the key, so I installed Debian on another M10 drive and put it in slot 2 of the card, with my TrueNAS installation on an M10 in slot 1. Did not work; it only saw the first drive.

My conclusion is this:

The boot problems are likely either due to this very early version of Optane (the M10) being weird, or, possibly more relevant to other people, due to the M10 drives being B+M key and PCIe Gen 3.0 x2. They're the only B+M key M.2 drives I have, and they're also the only 2-lane PCIe drives I have. That's my guess.

So if you come here from Google for a similar reason, try putting a proper 4-lane NVMe drive in the first slot of your adapter card, or try using an M key drive. I'd be fascinated to know the exact cause of this, but ultimately the fix is simple: I have a $10 M.2 PCIe 3.0 x4 M key NVMe drive on the way from eBay as I type this.
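If you want to sanity-check a drive's lane count before blaming it, the negotiated PCIe link width is visible from Linux. In the sketch below, nvme0 is just an example device name (substitute your own), and the lspci line should be run as root so the link capability info actually prints:

  # Negotiated PCIe link width and speed for one NVMe controller
  cat /sys/class/nvme/nvme0/device/current_link_width
  cat /sys/class/nvme/nvme0/device/current_link_speed

  # The same info via lspci: LnkCap is what the drive supports, LnkSta is what it negotiated
  lspci -vv -s "$(basename "$(readlink /sys/class/nvme/nvme0/device)")" | grep -i lnk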

1 Rookie • 5 Posts • December 13th, 2024 11:22

Sounds like you've already done a ton of troubleshooting. A couple of things come to mind. Some Dell systems, even if they "support" bifurcation, require a very specific BIOS setting or firmware update to enable it. Did you happen to check Dell's manuals or forums for anything specific to the 5820T? Sometimes these features are buried under oddly named options in the BIOS. Or maybe reset the BIOS to defaults (after saving your current config).

hope this helps!

9 Legend • 7.8K Posts • December 13th, 2024 17:32

There is no bifurcation setting menu.  BIOS settings at default will work with PCIe bifurcation.  Tested on Asus Hyper V.2 with x4 Samsung NVMe SSD. 

You may have a faulty JEYI adapter. 

1 Rookie • 7 Posts • December 13th, 2024 17:48

@Chino de Oro

You may have a faulty JEYI adapter.

I don't think so! Like I say, the JEYI card worked fine with bifurcation on my other motherboard. I also tried an Asus Hyper card (one with two M.2 NVMe slots at PCIe 3.0), and it did not work here either, even though it too worked in another computer.

There is no bifurcation setting menu.  BIOS settings at default will work with PCIe bifurcation.

Dang, yeah that was my understanding. Could it be a bug that resetting the BIOS would fix?

Or VROC drivers? I've heard people talking about VROC drivers in a way that sounds like they're something installed at the BIOS level... that doesn't make sense to me, but could an old config still be causing problems? I would imagine that's an OS-level issue.
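One generic check I can at least do from Linux: if VMD were still grabbing the slot, the drives would hide behind an Intel Volume Management Device entry instead of showing up as individual NVMe controllers. Roughly:

  # With VMD active, the drives sit behind a "Volume Management Device" controller
  lspci | grep -i "volume management"

  # With VMD off, each drive should appear directly as a Non-Volatile memory controller
  lspci | grep -i "non-volatile memory"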

1 Rookie • 7 Posts • December 13th, 2024 17:53

@dravenlocke87 Yeah, I've definitely looked through the manual and technical guide a couple of times, along with forums for the 5820 and some 7XXX and 3XXX series systems. I know more about this platform than I ever wanted to lol

or maybe reset the BIOS to defaults

This feels like a measure of last resort, but yeah. These Dell systems spook me because it sounds like they can be incredibly finicky, and they have some weird behavior that makes troubleshooting harder (like the triple restart before POSTing after certain kinds of changes). There are also lots of reports from 2018 (iirc) of a BIOS update bricking systems. Spooky prospect without Dell support to cover me ha.

9 Legend • 7.8K Posts • December 13th, 2024 17:56

The BIOS firmware version should not affect features of the PCIe slots that are wired directly to the CPU.

As mentioned in my earlier comment, reset the BIOS settings to their default values.

How did you test and find only one drive? In Windows?

1 Rookie • 7 Posts • December 13th, 2024 18:23

@Chino de Oro

As mentioned in my earlier comment, reset the BIOS settings to their default values.

Oh ha, sorry, I didn't realize that was meant as an instruction.

I gave it a shot just now, and still no dice. Same behavior: I see one drive on the card in the BIOS when I look at "boot sequence" under "general." And the same behavior in TrueNAS (described below) as well.

Resetting the BIOS to default values also set VMD to "auto" rather than "disabled." So after my first check, I changed it to disabled and tried again. Same results.

How did you test and find only one drive? In Windows?

TrueNAS Scale. It's Debian-based. I could see the first drive on the card in both the TrueNAS web GUI and with commands like lsblk in the shell. But only the first drive on the card.

When using it in a different computer, I was able to see all the drives in its BIOS and in Windows.
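One more thing I can check from the TrueNAS shell is whether the missing drives are absent from the PCIe bus entirely or enumerated but not getting block devices. Again, just generic Linux, nothing Dell-specific:

  # Tree view of the PCIe topology; with x4x4x4x4 working, four separate NVMe
  # controllers should hang off the root port behind that x16 slot
  lspci -tv

  # Kernel messages about NVMe probing or PCIe link problems, if any
  dmesg | grep -iE "nvme|pcie"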


1 Rookie • 7 Posts • December 21st, 2024 11:19

A STRANGE SUCCESS!!!

If anyone with a similar issue finds this: there may be hope!

The drives I was using to test all this were four 16GB M.2 NVMe Intel Optane M10 sticks. They're the cheap, slow model you can find on eBay and the like; I wouldn't generalize this situation to all Optane.

Throwing variables at the wall, I swapped the stick in the first slot of my adapter card for an old "normal" M.2 NVMe drive I have: 500GB with, I think, a Windows 10 installation on it.

IT WORKED! The "normal" drive and the remaining three M10 Optane drives all showed up.

Now if I swap the "normal" drive to slot 2 on the card, and leave an Optane drive in slot 1, it goes back to only showing the single Optane drive in slot 1.

Why this is, I have no idea. I may test further, and I may not. If I do, I will report findings. If you see this in the future, feel free to contact me, though there's a good chance I won't see it.

It may be something with the Optane sticks?? They're all the same model, and a very quirky product; maybe they report in a way where, if one of them is the first to report, no more are searched for? It may be because the "normal" drive has a boot partition? It may be that the detection doesn't work well when all the drives are the same model?
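If I do test further, the first thing I'd compare between configurations is how many NVMe controllers the kernel sees at all, and which models they are. A rough way to do that from a Linux shell (generic, not specific to this board):

  # How many NVMe controllers the kernel sees, and which models they are
  for d in /sys/class/nvme/nvme*; do
    echo "$(basename "$d"): $(cat "$d/model")"
  done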

1 Rookie • 4 Posts • May 12th, 2025 10:32

Hi, I'm having exactly the same issues. Did you ever find a solution to getting Optane to be recognised for bifurcation?

1 Rookie • 7 Posts • May 12th, 2025 15:45

Yeah. If you look at your add-in card: as long as the "first" drive on it is a typical NVMe drive, it works for me.

My card (a random AliExpress model) has its M.2 slots marked 1, 2, 3, and 4, and my hunch is that the "first" slot is whichever one electrically connects to the first 4 PCIe lanes of the x16 slot.

So, swap one Optane for a normal drive and just move the normal drive around on your card until it works lol. Good luck! Let me know if you get it working (or if you need any more info!)
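If you want to check which physical M.2 slot ends up "first", you can map each NVMe device back to its controller's PCI address from a Linux shell. The idea that the lowest address corresponds to the first x4 group is my hunch, not anything documented:

  # Map each detected NVMe device to the PCI address of its controller
  for d in /sys/class/nvme/nvme*; do
    echo "$(basename "$d") -> $(basename "$(readlink "$d/device")")"
  done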

1 Rookie • 4 Posts • May 13th, 2025 10:49

Ah, yeah, I managed to get it working with an NVMe drive in one slot and Optane in the other, but really I want two Optane drives. I was hoping maybe you'd managed to find a workaround or something, but it really seems like there aren't a lot of options. I'm using the top slot with a dual NVMe adapter board, so there's only going to be x4x4 bifurcation.
