The Galaxy S21 Ultra received excellent reviews worldwide, thanks to its improved design, faster and more reliable performance, longer battery life, and better cameras. The new phone brings along two telephoto cameras (3x and 10x optical zoom), massively improving the zoom range compared to the Galaxy S20 Ultra. However, the Galaxy S21 Ultra actually received a lower DxOMark score compared to the Galaxy S20 Ultra.
In DxOMark’s camera review of the Galaxy S21 Ultra, the phone received an overall score of 121 points. In comparison, last year’s Galaxy S20 Ultra received 126 points after being reevaluated with version 4 (the latest version) of the review process. Breaking down the score, the Galaxy S21 Ultra received 128 points in the photo segment, 98 points in the video segment, and 76 points in the zoom segment. The Galaxy S20 Ultra received 128 points in the photo segment, 106 points in the video segment, and 88 points in the zoom segment. According to DxOMark’s review, the Galaxy S21 Ultra is not as good as the Galaxy S20 Ultra in video and zoom quality.
We know that the Galaxy S21 Ultra has more reliable focusing, improved low-light images and videos, and a longer zoom range than its predecessor. But the phone still got a lower score than the Galaxy S20 Ultra. So, what actually went wrong? DxOMark wasn’t impressed with the two zoom cameras, and according to them, the new zoom cameras are not worth much over the Galaxy S20 Ultra’s single 5x periscope zoom camera. DxOMark says that artifacts and noise in images are pulling down the phone’s score.
In terms of video recording, the Galaxy S21 Ultra received a score similar to the Pixel 4a. Apparently, video stabilization is the phone’s primary issue. DxOMark only tested the 4K 60fps mode, though, and not 4K 30fps and 8K 24fps video modes. The company says that it didn’t test the 8K video recording mode because of its lower quality stabilization.
Overall, the Galaxy S21 Ultra received a lower score not only than its predecessor, but also than several of last year’s flagship smartphones. It scored lower than 16 other smartphones, including the Huawei Mate 40 Pro+ (139), Huawei Mate 40 Pro (136), Xiaomi Mi 10 Ultra (133), Huawei P40 Pro (132), Vivo X50 Pro+ (131), iPhone 12 Pro Max (130), iPhone 12 Pro (128), Xiaomi Mi 10 Pro (128), OPPO Find X2 Pro (126), Galaxy S20 Ultra (126), Honor 30 Pro+ (125), iPhone 11 Pro Max (124), Huawei Mate 30 Pro (123), iPhone 12 Mini (122), iPhone 12 (122), and the Honor V30 Pro (122).
I don't do camera testing for a living like DXOMark, but I think it's safe to say the amount of time they spend and their methodologies are far beyond the capabilities (and probably the knowledge in general) of 99% of the people posting in these forums.
I can only say the results are curious if you assume newer is better and that the focus issue present on the S20 Ultra is not present on the S21 Ultra, which by all accounts it is not. There are complaints that the S21 is a bit too bulky and heavy for most people's liking, but that is irrelevant to the camera evaluation.
With that said, anyone can claim bias one way or the other, offering nothing more than an opinion on whether the information is reliable or accurate. But without some hard evidence from in-depth analysis, testing, or comparison with other testing sources, those are just biased opinions with no real substantiation.
I'm not from DXOMark. I suspect bias in most cases with all my information sources. But look where this discussion is taking place, the SAMSUNG forum. Gee, you think there's any inherent bias here also?
The original post doesn't even scratch the surface relative to the depth and breadth of the DXOMark rating which is based on WAY more information and testing than the few examples you gave in the post. So one could easily make a general statement that "the cameras are *better*" on the S21 Ultra but that doesn't necessarily equate to a better score which factors in SO many things.
You'd be better off arguing the score is in error by comparing each individual score component against that same component for ALL those other phones you listed. But that is of course problematic from a time standpoint, so the question has been dumbed down for convenience.
You COULD make a good argument that maybe the weighting of certain score components should be given more or less weight.
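To see why weighting matters, here is a minimal sketch using the published sub-scores from the review above. The weights and the `overall` helper are made up purely for illustration; DxOMark does not publish its actual scoring formula.

```python
# Hypothetical illustration: how different weightings of the published
# photo/video/zoom sub-scores change the gap between the two phones.
# The weights below are invented; DxOMark's real formula is not public.

def overall(sub_scores, weights):
    """Weighted average of sub-scores (weights need not sum to 1)."""
    total_w = sum(weights.values())
    return sum(sub_scores[k] * weights[k] for k in weights) / total_w

# Sub-scores as reported in the DxOMark review quoted above.
s21_ultra = {"photo": 128, "video": 98, "zoom": 76}
s20_ultra = {"photo": 128, "video": 106, "zoom": 88}

# A photo-heavy weighting shrinks the gap (the photo scores are tied)...
photo_heavy = {"photo": 0.7, "video": 0.2, "zoom": 0.1}
# ...while a zoom-heavy weighting widens it.
zoom_heavy = {"photo": 0.4, "video": 0.2, "zoom": 0.4}

for name, w in [("photo-heavy", photo_heavy), ("zoom-heavy", zoom_heavy)]:
    gap = overall(s20_ultra, w) - overall(s21_ultra, w)
    print(f"{name}: S20 Ultra leads by {gap:.1f} points")
```

The point is not that either weighting is right, only that the same raw measurements can produce noticeably different overall gaps depending on how the components are weighted.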
You can also take a step back and question whether the difference between 121 and 126 is even statistically significant. The testing methodology is not 100% perfectly controlled from phone to phone, so, just like political polls, there's a margin of error, and those two scores may well be within it. The fact that this question was posed at all shows how many of us get a little too hung up on scores and specifications.
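The margin-of-error point can be sketched with a toy simulation. The noise level (sigma) below is entirely made up, since DxOMark publishes no error bars; the sketch only shows that if per-run measurement noise exists, a 5-point gap can sometimes reverse on a single test run.

```python
import random

# Hypothetical sketch: if each test run carries random measurement
# noise, how often would two phones with "true" scores of 126 and 121
# swap places on a single run? sigma is invented for illustration only.

random.seed(42)

def one_run(true_score, sigma=3.0):
    """One simulated test run: true score plus Gaussian noise."""
    return true_score + random.gauss(0, sigma)

trials = 10_000
flips = sum(one_run(121) > one_run(126) for _ in range(trials))
print(f"Ordering flipped in {flips / trials:.1%} of simulated runs")
```

With these assumed numbers the ordering reverses in a noticeable fraction of runs, which is the sense in which two scores five points apart might not be a meaningful difference.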