@airmantharp I have a Nikon D800 and viewing 14bit/Adobe RGB raw files in Lightroom or Photoshop in 10bit colour looks more 'lush' than 8bit colour. If you were standing at my desk and I toggled between 8bpc/10bpc I believe you'd agree. With a good camera it's an obvious upgrade.
Let me preface by saying that I absolutely believe you are seeing what you're describing; the challenge is that what you're seeing may not be what you *should* be seeing.
It sounds like you have a different calibration between the two modes.
What I'm getting at is that 8bit and 10bit cover the same gamut and the same range of colors; what's different is that 10bit affords four times as many steps of gradation per channel (1024 vs. 256), so the same content should show less banding.
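To make the "same range, more steps" point concrete, here's a quick sketch (not tied to any particular display pipeline) that quantizes the same smooth 0..1 gradient at both bit depths and counts the distinct levels that survive:

```python
def quantize(value, bits):
    """Round a 0..1 value to the nearest representable level at `bits` depth."""
    levels = (1 << bits) - 1          # 255 for 8bit, 1023 for 10bit
    return round(value * levels) / levels

# A smooth ramp, densely sampled
gradient = [i / 9999 for i in range(10000)]

steps_8 = len({quantize(v, 8) for v in gradient})
steps_10 = len({quantize(v, 10) for v in gradient})

print(steps_8, steps_10)   # 256 vs. 1024 distinct levels over the same range
```

Both versions still span the full 0..1 range; the 10bit one just slices it four times finer, which is exactly why banding in smooth skies and shadows is the thing 10bit fixes.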
You may also be switching color profiles as well, i.e. sRGB to Adobe RGB, and Adobe may be doing some of that for you; it's not something I've had a chance to work with on my system, as I have no 10bit here (yet). The main thing is that when you take a picture of blue with your D800, it should still be 'blue' on your monitor, whether set to 8bit or 10bit, sRGB or Adobe RGB, etc.
[the hell of getting all of this mess calibrated, and then having to deal with the different color spaces, is why I've honestly avoided 10bit; however, when MS gets their act together with HDR, I'll be looking to jump, as that takes care of a lot of the issues for general usage]