I recently had the opportunity to render the same video clip on both a MacBook Pro with an M1 Pro chip and a Windows machine with a 4070Ti. Surprisingly, I found that the rendering quality under the same settings was completely different between the two. The Windows version was highly usable, while the Mac version was essentially unusable.
Settings Used:
- Original Video Specs: 4K 59.94fps (the original video was shot on a mirrorless camera as an HEVC file; the source quality is excellent, with no issues such as color banding)
- Features: Full-Frame Stabilization and Artemis - Denoise/Sharpen (default settings)
- Both Mac and Windows used GPU rendering.
- Output: H.265, Quality Level: Low
The file sizes generated by both were similar, with Windows being slightly larger. The video bitrate on Windows was around 19 Mbps, while on Mac it was around 16 Mbps. The video quality on Windows was excellent, whereas the Mac output showed significant artifacts such as pixelation and color banding.
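In case it helps, this is roughly how I compared the two files' codec, bitrate, and encoder metadata (a quick Python sketch around ffprobe; the file names windows_render.mp4 and mac_render.mp4 are placeholders for my actual renders):

```python
# Quick sketch: compare codec, bitrate, and encoder tag of the two renders.
# Assumes ffprobe (from FFmpeg) is on the PATH; file names are placeholders.
import json
import subprocess

def probe(path):
    """Return codec, bit rate, and encoder tag for the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-show_format", path],
        capture_output=True, text=True, check=True,
    ).stdout
    info = json.loads(out)
    video = next(s for s in info["streams"] if s["codec_type"] == "video")
    return {
        "codec": video.get("codec_name"),
        "bit_rate": video.get("bit_rate") or info["format"].get("bit_rate"),
        "encoder": video.get("tags", {}).get("encoder", "unknown"),
    }

for label, path in [("Windows", "windows_render.mp4"), ("Mac", "mac_render.mp4")]:
    print(label, probe(path))
```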
I’ve tried using H.264 and setting the Quality Level to High on the Mac, and I’ve also experimented with both v3.x and v4.x versions. The result remains the same: the video output on Mac is basically unusable.
I’ve attached two images for reference: Windows on the left and Mac on the right.
Does anyone else have the same issue?
Does anyone know why this is happening?
The images will undergo some compression when uploaded, but what you see is essentially what I see.