You asked: Should you buy a 70-inch TV, do you really need a Blu-ray player, and more?

This article contains affiliate links; if you click such a link and make a purchase, Digital Trends and Yahoo Inc. may earn a commission.

In this You Asked: Why buy a more expensive Blu-ray player? Blu-ray players vs. game consoles. Are 70-inch TVs any good? Is all eARC HDMI 2.1, and is all HDMI 2.1 eARC? And will we ever get QD-OLED with MLA?

Are all Blu-ray players the same?

A Magnetar UDP 900 Blu-ray player with a selection of Blu-ray discs around it.
Digital Trends

Mike B. writes: I’ve seen some discussions on Twitter/Reddit about varying picture quality among 4K UHD players. Ignoring build quality, HDR formats, extra features, etc., shouldn’t all players be able to read the data from the disc and send the “pure” signal to the AVR/TV without any issue? If the player isn’t doing any sort of processing, why would there be any difference in picture quality?


First of all — and please don’t misunderstand me when I say this, because I’m not trying to minimize the validity of your question — this question mattered a lot more a few years ago than it does now.

The reason? We just don’t have a wide range of brands or models to choose from anymore. Anyone still making a Blu-ray player is making a pretty decent player. We have Sony, Panasonic, and LG as the big brands, and then we have specialty players like Reavon and Magnetar. (RIP, OPPO.)

However, we do still see some range within the few options on the market. What might make one better than another in terms of picture quality? Frankly, not a lot. What difference can exist resides in the decoding process. A Blu-ray disc is full of encoded video that needs to be decoded before it can be sent along an HDMI cable to your TV. Decoding has less impact on video signal quality than processing — I’m getting to that — but it can have an impact. A very tiny, minuscule, difficult-to-perceive impact.

Processing is a different story. The most noticeable processing differences will be in de-interlacing DVDs and 1080i Blu-ray discs, and in upscaling anything below 4K. But that is only if you allow the player to do any processing at all. If your TV is nice enough, its processing will be better than your disc player’s. If you have a lower-end TV, the player’s processing may be better. But it’s doubtful you’d see a difference on a lower-end TV.

Most of the price difference between players is justified elsewhere, outside of video signal quality. It may be in the build quality, including the drive mechanism, the transport tray, and the chassis. Or it may be in the audio section. A premium player may have better digital-to-analog audio converters, which can pay off if you use the player for CD, DVD-Audio, or SACD playback. You may also notice a difference in start-up and disc loading times. Generally, more expensive players are more likely to last longer, operate more quietly, and operate more quickly. They may also just be less buggy because they use a more powerful chipset.

Blu-ray players vs. game consoles

A Magnetar UDP 900 Blu-ray player with a selection of DVDs and audio CDs around it.
Digital Trends

Following up on that question is another one about Blu-ray, coming from Josh Collins, who writes: You mentioned previously how getting a dedicated 4K Blu-ray player is much better than the PS5 and the Series X. I see this sentiment expressed all over the Internet. But besides the lack of Dolby Vision support for the game consoles, what makes them different/better? And would the Sony X700M be that much better than a PS5, or do I need to spend way more for a good player?


Not to be argumentative, but did I really say “much better”? I feel like I didn’t. But if I did, I may have overstated the difference. I do remember saying the PS5 is much better than the Xbox, though. And I stand by that. The PS5 handles 24 frames per second content far better than the Series X, which shows more judder.

If you want to put a Blu-ray or 4K Blu-ray disc in either of these machines and watch that movie, you’ll be able to do that. You’ll even get basic HDR and, to some extent, Dolby Atmos. Put the disc in, watch the movie — job done.

However, dedicated 4K Blu-ray players will offer better decoding — which we just talked about — and, if you want them to, better processing than either of the consoles. Will you notice the difference? I’m not sure, especially between the PS5 and a dedicated player.

The differences start becoming more noticeable with HDR formats: neither console will play back discs in Dolby Vision. And, last I heard, neither plays back HDR10+, not that there are all that many discs with HDR10+ on them.

Also, while getting Dolby Atmos out of a PS5 is pretty straightforward, getting it out of an Xbox is more of a hassle. You have to download an app for it, and I know on the Xbox One it wasn’t real Dolby Atmos.

Speaking of audio, neither of them is a good CD player, let alone an SACD player. So, if you want your disc player to do more than 4K Blu-rays, I’d say get a dedicated player. End of story.

70-inch TVs

A peacock's spectacular tail feathers on display on a Hisense U7K TV.
Zeke Jones / Digital Trends

Jeffrey Hein writes: I’ve always heard you should go for the biggest TV that makes sense in your space. I’ve recently moved, and the new room configuration would work perfectly with a 70-inch TV. Are there any top brands or models available in this size that you recommend, or is it almost always the better option to stick with 65 inches?


This question is deceptively tricky. Initially, I think most folks would tell you that if money is no object, buy the larger version of a TV within a model series. For instance, if you want to get a TCL QM8, get a 75-inch QM8 instead of a 65-inch QM8.

Most of the time, the larger version of a given model will be as good as the smaller version. Most of the time. Sometimes, due to panel supply considerations, a manufacturer may have to use a different kind of LCD panel at a larger size. For instance, a certain TV model may use a VA-type panel in the 55- and 65-inch sizes, but an IPS- or ADS-type panel for the 75-inch version. This is the case with the Hisense U8K, I believe: the 65- and even 85-inch versions use VA-type panels, but the 75-inch uses a high-end ADS-type panel.

I’ve not tested the 75-inch version of the Hisense U8K, for example, but those TVs I have tested with ADS-type panels were extremely close in performance to their VA-type counterparts. Older IPS panels had worse contrast than VA panels, but the newer ADS panels are much better, and with mini-LED backlighting on the rise, they look just as contrasty as VA panels.

Why, exactly, do manufacturers switch panel types depending on the size? The reason actually ties in with another part of your question that I think is really important to address.

You asked about 70-inch TVs, and I don’t know if you were using 70 as in “in the 70-inch-ish neighborhood” or if you actually meant you were looking at 70-inch TVs versus 65-inch TVs. Either way, I think this is important for everyone to know.

Seventy-inch TVs — as opposed to 75-inch TVs — tend to be found in lower-end TV models, whereas more premium TV models tend to come in 75-inch sizes.

The reason — and this is a very simplified explanation — is that TV manufacturers source their panels from a bunch of panel suppliers. The 70-inch panels tend to be lower cost and slightly lower quality, and the 75-inch panels tend to be higher cost and higher quality.

This is why, with some lower-end to mid-tier TVs, we’ll hear people talking about whether the TV they got has a panel from Panel Company A versus Panel Company B. There can be some variances there.

Do you see where I’m going with this? If you’re looking at actual 70-inch TVs, you will probably be looking at lower-end to mid-tier TVs. If you’re looking at a 75-inch TV, on the other hand, you’re most likely looking at a higher-performance TV.

Let’s use Hisense as an example again. You can get a 65-inch U7K or a 75-inch U7K, but you can’t get a 70-inch U7K. It doesn’t exist. If you wanted a 70-inch Hisense TV, you would have to step down in the range, out of the U or ULED series, and into the A6 or R6. So, in that case, you would be taking a fairly significant picture quality hit going down to the series with a 70-inch model versus getting the 65-inch version of a much better TV.

I hope that makes sense. That’s a lot of numbers. Just don’t buy a 70-inch TV simply because it’s bigger than a 65-inch; more than likely, you’ll be sacrificing a fair amount of picture quality to do it. Go up to 75 inches, or stick with 65.

eARC HDMI 2.1

HDMI Ports on the back of a TV.
Digital Trends

Christopher Watt writes: I have a TCL 65S450G and a Sony STR-DH790 connected to the eARC port. Not high-end, but I thought they had the features I was looking for (4K HDR and Atmos). The internet (and my PS5 connected to the TV) is giving me some conflicting information on the eARC HDMI port on the TV, though. Are all eARC ports 2.1? Are all 2.1 ports eARC? This TV has eARC, but the documentation doesn’t state HDMI 2.1.

What connection layout would I reap the most PS5 AV fidelity benefits from with this setup?


I’m going to answer the more specific questions directly in a moment. First, let’s answer a couple of the broader questions.

Are all eARC ports HDMI 2.1? No, they are not. Most of them happen to be, but eARC does not need HDMI 2.1 bandwidth to do its job. eARC can pass its audio formats over the 18 gigabits per second of bandwidth supported by HDMI 2.0b just fine.

Are all 2.1 ports eARC? No, they are not. I’ve never seen a TV with more than one eARC port. But, more importantly, the question of whether your eARC port happens to be one of your HDMI 2.1 ports comes up often because many TVs only have two HDMI 2.1 ports. And when one of your HDMI 2.1 ports is also your eARC port, that can limit the number of HDMI 2.1 devices you can connect directly to your TV. TCL is one of the only brands I can think of that offers two independent HDMI 2.1 ports and keeps the eARC function separate.

You only need to worry about whether your eARC port is HDMI 2.1-capable for two reasons. One is if you have more than one HDMI 2.1 device that you want to connect directly to your TV. The other is if your AV receiver is HDMI 2.1 and you are connecting a PS5, Xbox Series X, or a high-end PC to that receiver; in that case, you’d need full HDMI 2.1 bandwidth from your TV’s eARC port. If that’s not you, then don’t worry about it.

For this specific situation, neither your Sony receiver nor your TCL TV supports HDMI 2.1 — which isn’t the end of the world. You can still get 4K at 60 fps and all the HDR formats through your receiver; you just can’t do 4K at 120 fps. Since your receiver doesn’t support eARC, if you want the best audio quality, you should run your PS5 to your TV through the receiver to get the best Dolby Atmos sound. If keeping lag to the absolute minimum is more important, then connect the PS5 directly to your TV. You’ll still get Atmos, just not lossless Atmos. You’ll also get DTS, but not lossless DTS.
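If you want a rough sense of why those bandwidth limits shake out the way they do, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative: the resolutions, bit depths, chroma subsampling, and the roughly 37 Mbps eARC audio ceiling are my working assumptions, and real HDMI links add encoding overhead on top of these raw numbers.

```python
# Back-of-the-envelope HDMI bandwidth math (illustrative only; real links add
# encoding overhead and blanking, so treat these as rough floors, not exact figures).

def raw_video_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI_20_LIMIT_GBPS = 18.0  # HDMI 2.0/2.0b maximum link rate
HDMI_21_LIMIT_GBPS = 48.0  # HDMI 2.1 maximum link rate
EARC_AUDIO_GBPS = 0.037    # eARC audio tops out around 37 Mbps (assumed figure)

# 4K at 60 fps, 10-bit HDR with 4:2:0 chroma subsampling (a common HDMI 2.0
# fallback) averages about 15 bits per pixel.
k4_60 = raw_video_gbps(3840, 2160, 60, 15)    # ~7.5 Gbps

# 4K at 120 fps, 10-bit, full 4:4:4 color is 30 bits per pixel.
k4_120 = raw_video_gbps(3840, 2160, 120, 30)  # ~29.9 Gbps

print(f"4K60 HDR (4:2:0): {k4_60:.1f} Gbps (fits under the {HDMI_20_LIMIT_GBPS:.0f} Gbps HDMI 2.0 cap)")
print(f"4K120 HDR (4:4:4): {k4_120:.1f} Gbps (over {HDMI_20_LIMIT_GBPS:.0f} Gbps, needs HDMI 2.1's {HDMI_21_LIMIT_GBPS:.0f})")
print(f"eARC audio: about {EARC_AUDIO_GBPS * 1000:.0f} Mbps, a rounding error next to either cap")
```

The takeaway: even lossless audio is tiny compared to video, which is why eARC never needed HDMI 2.1-level bandwidth in the first place, while 4K at 120 fps genuinely does.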

QD-OLED with MLA

A side-by-side color comparison of a butterfly on Samsung QD-OLED and LG MLA OLED TVs.
Digital Trends

Akash writes: We have seen how incredible MLA and QD-OLED panels are. Do you have any insight on when we could possibly see a TV that uses both of these technologies, giving us QD-OLED colors with MLA brightness?


Well, I’m afraid the answer is that we will never see that happen, but perhaps not for the reasons some might think. At first blush, it might seem like we’d never see MLA make it to QD-OLED because LG Display owns MLA and Samsung Display owns QD-OLED, and those two rivals aren’t going to share tech. But the real reason is that QD-OLED, from a technical perspective, would not benefit at all from MLA — or Micro Lens Array — technology.

MLA technology was developed by LG Display to overcome inefficiencies that are inherent to LG’s WRGB OLED panels. In a conventional WRGB OLED panel, there are layers that cause light to scatter — light that is essentially lost. MLA prevents that light from getting scattered by routing it straight out through the panel so that it reaches our eyes.

QD-OLED panels don’t suffer from that light scatter because they don’t have some of the layers that WRGB OLED panels do — for example, a QD-OLED panel has no color filter. So, MLA would be a useless layer in a QD-OLED TV.

For what it’s worth, the brightness differences between MLA OLED and QD-OLED are pretty slim. I get the best-of-both-worlds idea — it’s just that those two worlds can never meet.

Editorially, I’ll add that I think OLED, whether MLA or QD-OLED, has gotten about as bright as it will ever get. That is, unless TV manufacturers decide they aren’t going to worry about burn-in risk anymore and roll back the brightness limitations they have imposed on OLED TVs. Something tells me that’s not going to happen. For a brighter OLED-like picture, I think we need to see micro-LED displays implemented in normal-sized consumer TVs, or emissive quantum dot display technology deployed. And I think we might be close to both, actually.