

I was trying to wrap my mind around what Vegas Pro is doing when you set the project bit depth to 32-bit and, maybe like some others, I found myself confused and probably misinformed about what is going on under the hood. I do try to figure things out on my own before I post here, and I don't mind researching and testing in response to a thoughtful and intelligent posting that is very much on-topic, so while I am not sure of the answer, I do wish to understand. I found this article, and even though it references a software program I will not mention by name, I think it explains it well, and I assume a lot of the same processes apply to Vegas. The scenario I'm asking about is most closely related to Scenario #5 at the bottom of the article.

Initially, the scenario I was attempting to fully understand is what happens when you import 10-bit source footage, such as Sony's own AVC intra-frame codec, set the project bit depth to 32-bit, and then render out to Magix Intermediate XQ. My question was: is the output in the rendered video file 8-bit or 10-bit? Would it ever be 10-bit, for instance, if you rendered to an MXF wrapper? When I check MediaInfo on a Magix Intermediate XQ render, it doesn't list Bit Depth, whereas the other (non-Magix Intermediate) codecs show Bit Depth as 8. I want to thank those on this forum for sharing their knowledge and taking the time to explain these things.

A couple of points that may help you, though. You've got 90% of the concept, so I'll try not to muddy the waters with my usual overkill. As Marco will tell you, it is a wide area of competing maths and confusing nomenclature, and even old math teachers get it wrong with some regularity. Normal integer math uses locked endpoints, such as 0-255 for 8-bit or 0-1023 for 10-bit values.
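If it helps to see the "locked endpoints" idea in code, here is a minimal Python sketch. It is my own illustration with made-up helper names (to_float, to_int), not how Vegas itself is implemented; it only shows integer pixel math versus 32-bit floating-point math and the quantization step that decides what bit depth ends up in a file:

```python
# Purely illustrative sketch (assumed helper names; not Vegas's actual pipeline)
# of integer pixel math with locked endpoints versus 32-bit float math.

def to_float(code_value, bit_depth):
    """Normalize an integer code value (0 .. 2^n - 1) to a 0.0 .. 1.0 float scale."""
    return code_value / float((1 << bit_depth) - 1)

def to_int(value, bit_depth):
    """Quantize a float value back to an integer code value, clamped to the locked endpoints."""
    max_code = (1 << bit_depth) - 1
    return min(max(int(round(value * max_code)), 0), max_code)

src_10bit = 900                       # a bright 10-bit source pixel (range 0 .. 1023)
as_float = to_float(src_10bit, 10)    # ~0.8798, carried at float precision in a 32-bit project
boosted = as_float * 1.5              # ~1.3196: float happily holds values above 1.0 (headroom)

# An integer pipeline has to clamp at its endpoints, so the overshoot is simply lost:
print(to_int(boosted, 8))             # 255  (8-bit endpoint)
print(to_int(boosted, 10))            # 1023 (10-bit endpoint)

# Round-tripping without the boost shows where the file's bit depth is decided:
# at the encoder's quantization step, not at the project setting.
print(to_int(as_float, 8))            # 224
print(to_int(as_float, 10))           # 900
```

The point of the sketch is only that the 32-bit setting changes the precision and headroom of the intermediate math; whether the rendered file ends up 8-bit or 10-bit depends on what the chosen render codec quantizes back to.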

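On the MediaInfo side, the file can also be queried programmatically. Below is a small sketch using the pymediainfo Python package (assuming it is installed; "render.mov" is a placeholder path). It returns None when the codec simply doesn't expose a Bit Depth field, which matches what you're seeing with Magix Intermediate XQ in the GUI:

```python
# Sketch: read the video bit depth MediaInfo reports for a rendered file.
# Assumes "pip install pymediainfo"; "render.mov" is a placeholder path.
from pymediainfo import MediaInfo

def video_bit_depth(path):
    """Return the bit depth of the first video track, or None if MediaInfo doesn't list one."""
    for track in MediaInfo.parse(path).tracks:
        if track.track_type == "Video":
            return track.bit_depth  # None when the codec/container doesn't report one
    return None

print(video_bit_depth("render.mov"))
```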