I purchased a Dune HD Vision Pro Solo for this reason. DV is more confusing than most people know. This started when Sony and Dolby Labs introduced LLDV (low-latency Dolby Vision). Previously, standard DV (STD), where the display did the dynamic tone mapping, existed for one reason: at the time, the version of HDMI in use did not support dynamic metadata, so the display had to have a dedicated chip in it to do the heavy lifting. Different displays took different amounts of time to do this depending on how powerful they were. I believe profile 4 is the only version that is STD. Going forward (from around 2018), all DV displays must support LLDV but are not required to support STD. I have read conflicting info on this, though.
My understanding was that streaming services used profile 5, which was the first version of player-led decoding (LLDV). LLDV makes sense because a console can do what the display used to have to do, but way, way faster. With display-led, one display may take longer than another, which means more lag. That's obviously not an issue for movies, but it's a huge issue for gaming. It's software based, so no dedicated hardware is required. I know that with the Xbox, this is how they were able to bring DV to the console via a firmware update, because it was now all software based.
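To make the player-led idea concrete, here is a minimal sketch of what "the source tone maps instead of the TV" means. This is not Dolby's actual algorithm (which is proprietary); it's a generic extended-Reinhard roll-off, and both peak-nit numbers are made-up assumptions. In real LLDV the player learns the display's limits from the Dolby Vision block in the display's EDID rather than from hardcoded constants.

```python
# A minimal sketch of player-led tone mapping -- NOT Dolby's algorithm,
# just a generic extended-Reinhard roll-off with assumed peak values.

MASTERING_PEAK_NITS = 4000.0  # what the content was graded to (assumed)
DISPLAY_PEAK_NITS = 800.0     # what the panel can actually hit (assumed)

def tone_map(nits: float) -> float:
    """Roughly linear in the shadows, compressing highlights so the
    mastering peak lands exactly at the display peak."""
    n = nits / DISPLAY_PEAK_NITS
    w = MASTERING_PEAK_NITS / DISPLAY_PEAK_NITS  # "white point" in display units
    return DISPLAY_PEAK_NITS * n * (1.0 + n / (w * w)) / (1.0 + n)

# Highlights get squeezed hard; midtones move far less:
for nits in (100, 1000, 3000, 4000):
    print(f"{nits:5d} nits in -> {tone_map(nits):6.1f} nits out")
```

The point isn't the specific curve; it's that this is plain math a console CPU/GPU can run in software, which is exactly why a firmware update was enough.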
UHD discs use profile 7 (MEL or FEL). There is no indication of any kind on the packaging that lets you know which is used. For movies, it's better to have the display do the dynamic tone mapping instead of the source. The display knows its "limitations," for lack of a better term.
Dolby likes to complicate matters, just like the way it complicated Dolby Atmos when it first rolled out circa 2013-2014. Dolby provided a white paper on it and required manufacturers of Atmos-enabled speakers to conform to certain requirements, like a high-pass filter crossing over at 180Hz, before they could be certified as true Dolby Atmos enabled speakers. I guess history repeats itself with the Dolby Vision standards. The problem with Dolby Vision is the complexity of the way Dolby rolls out its DV implementation. Believe it or not, there are at least six known profiles. Within a given profile, the maximum level of the base layer (BL) or enhancement layer (EL) is restricted by the profile. If you have used the MakeMKV software to make backup copies of your precious Blu-ray or 4K UHD Blu-ray titles, you should be familiar with terms like H.265 Main10 Profile 4.1/5.1, etc. You can also check which DV profile a backup actually carries; see the sketch below.
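Since there's nothing on the packaging, the practical way to find out what you have is to probe the ripped file. This is a sketch that assumes a reasonably recent ffmpeg build, whose ffprobe exposes the DOVI configuration record in the stream side data; older builds may not report it.

```python
import json
import subprocess
import sys

def dv_info(path: str) -> None:
    """Print the Dolby Vision configuration of a file's first video stream."""
    probe = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(probe.stdout)["streams"][0]
    for sd in stream.get("side_data_list", []):
        if sd.get("side_data_type") == "DOVI configuration record":
            print(f"DV profile {sd['dv_profile']}, level {sd['dv_level']}, "
                  f"BL={sd['bl_present_flag']}, EL={sd['el_present_flag']}, "
                  f"RPU={sd['rpu_present_flag']}")
            return
    print("No Dolby Vision metadata found")

if __name__ == "__main__":
    dv_info(sys.argv[1])
```

A Profile 7 rip shows up with the enhancement-layer flag set, but as I understand it this still won't tell you MEL from FEL; for that you have to dig into the enhancement layer itself with something like dovi_tool.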
Now let us break down some of the profiles that Dolby has dished out over the years.
First are the single-layer profiles (5 & 8) that most LLDV-based sources prefer, e.g. Netflix, Apple TV+, and Disney+. Besides streaming devices, other sources that utilize this type of DV implementation are media players such as the Zidoo Z9X and the Nvidia Shield TV (2019), which come with DV decoding capability (LLDV).
The Xbox Series X/S consoles utilize LLDV (player-led) as the preferred form of DV instead of the unadulterated version. This is why the 4K UHD Blu-ray drive is not able to pass the dual-layer (usually Profile 7) DV through for the TV to process and decode.
With all that said, all Marvel movies are HDR10-only on their UHD discs. Disney waited until the streaming versions before implementing DV. It's nothing more than a way to get more subscribers because of the DV buzzword. Similar to the IMAX versions.
Right now DV has one advantage over HDR10: dynamic metadata. No commercial display comes near 4,000 nits, much less 10,000. No 12-bit commercial display exists. HDR10+ looks great; Bohemian Rhapsody is evidence of that, but HDR10+ content is severely lacking.
The main reason I bought the Dune is that it supports profiles 4, 5, 7 (MEL and FEL), 8, and 9. In fact, you can have it output everything as HDR or DV. It properly re-maps the color tones so the colors are correct. This does not really improve the picture quality when done correctly, but it does cut down on banding and posterization by "up-bitting" to 12 bits.
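To see why more bits cut banding, look at the size of the luminance jump between adjacent code values on the PQ (ST.2084) curve. This is a self-contained sketch; the constants come straight from the ST.2084 spec, and the two code values are full-range picks I chose because they both sit near 100 nits.

```python
# Constants from the SMPTE ST.2084 (PQ) spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """ST.2084 EOTF: nonlinear signal in [0, 1] -> luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def step_nits(code: int, bits: int) -> float:
    """Luminance jump between adjacent code values at a given bit depth."""
    top = 2 ** bits - 1
    return pq_to_nits((code + 1) / top) - pq_to_nits(code / top)

# Codes 520 (10-bit) and 2080 (12-bit) both land near 100 nits (full range):
print(f"10-bit step near 100 nits: {step_nits(520, 10):.3f} nits")
print(f"12-bit step near 100 nits: {step_nits(2080, 12):.3f} nits")
```

Four times as many code values means each step is roughly a quarter the size, so gradients that would visibly band at 10 bits get smoothed out at 12.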
Nobody is doing frame-by-frame dynamic metadata; at most it's scene-based. HDR metadata of any kind takes basically no bandwidth. It's just a 16- or 32-character hex value, and plain text uses practically zero bandwidth. DV does not add any bandwidth or file size that would be noticeable. Even if it were scene by scene, a hex value is nothing compared to sending the data for the millions of pixels (about 8.3 million) of a 4K display at 24fps.
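Back-of-the-envelope math bears this out. The numbers below are illustrative assumptions (uncompressed 10-bit 4:2:0 frames, a generous 1 KB of metadata per scene, one scene change every five seconds), not measured values from any real DV stream.

```python
# Back-of-the-envelope: dynamic metadata is noise next to the pixel data.

WIDTH, HEIGHT, FPS = 3840, 2160, 24
BITS_PER_PIXEL = 10 * 1.5            # 10-bit 4:2:0 chroma subsampling
pixel_bits_per_sec = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS

METADATA_BYTES = 1024                # generous per-scene payload (assumption)
SCENES_PER_SEC = 1 / 5               # one scene change every 5 s (assumption)
metadata_bits_per_sec = METADATA_BYTES * 8 * SCENES_PER_SEC

print(f"raw pixels: {pixel_bits_per_sec / 1e9:6.2f} Gbit/s")
print(f"metadata:   {metadata_bits_per_sec / 1e3:6.2f} kbit/s")
print(f"ratio:      1 : {pixel_bits_per_sec / metadata_bits_per_sec:,.0f}")
```

Even with those deliberately generous metadata assumptions, the pixels outweigh the metadata by well over a million to one, and real streams compress the pixels, not the point.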
At the end of the day, DV is much more complicated than HDR10 or HDR10+ because of Dolby.