I've been on many forums recently where people have said that plasmas offer better picture quality than 4K LED TVs. How true is this? Some even compare plasmas with OLEDs, saying that OLEDs are "closely" approaching the picture quality once offered by plasmas. Not sure how true this is either. I've looked at 4K TVs in a store and was wowed by their picture quality. Not sure if this has something to do with how things are presented in the store, but I was amazed by 4K. How can people compare 1080p to 4K? You're getting much more resolution/detail with 4K, so doesn't that always mean it will be better?
I own a 60" 1080p 2012 Panasonic plasma TV (TC-P60U50), and recently had a problem where one of the buffer boards died. I replaced the board(s) and everything is back to normal. For a while I thought about getting rid of my plasma and going to 4K, although my plasma offers a really nice picture, excellent in every sense. Would it make sense to go to 4K at this point, or should I just keep what I have? I want the most immersive, movie-like picture quality, and I'm not sure whether making the jump to 4K will be that big of a difference. I originally chose plasma over LCD/LED primarily for these qualities, not to mention the refresh-rate differences between plasmas and LED/LCDs. I also decided on Panasonic since, according to the majority of people and reviews, they have always made the best plasmas.
Thanks very much.
I'll try to itemize my responses - hopefully this will be easier to work through, but if you have questions or need clarification, feel free to post a follow-up asking for more info.
1) You're comparing a couple of different variables here, some of which can't be directly compared. Specifically, "4K" refers (loosely) to the DCI 4K standard, which specifies a display resolution, while plasma (PDP from here on), OLED, and LCD (regardless of backlight topology) refer to the actual display technology in a given TV (or monitor). Nothing explicitly limits PDP, OLED, LCD, etc. to any specific resolution, although in contemporary equipment you will generally only find 4K displays offered in LCD products; OLED is still in development limbo for the most part, and PDP has largely gone the way of the dinosaur since Panasonic and others left the market.
2) PDP vs OLED vs LCD is a fairly contentious debate on many forums and in many reviews; lots of people have strongly held beliefs, opinions, or theories about which is "best," and in many cases end up perpetuating myths that were, at best, once half-truths. Ultimately, all of that stems from the fact that every display technology has its own set of pros and cons; none is universally "flawless" in any practical sense. The rough breakdown: LCDs historically struggled with reproducing high-contrast images, primarily because of how their illumination is achieved - the LCD panel itself does not generate light, so it relies on something behind it to generate light (CCFL tubes, LEDs, or in the case of projectors, an arc lamp). More modern LCD-based displays have made significant strides in overcoming this through higher quality light sources and front-end video processing. PDP and OLED, along with many other emissive display technologies (that is, the display element itself emits light), generally don't suffer from that problem, but have problems of their own - usually related to size (OLED) or geometry (PDP) - which LCDs won't exhibit. Again, newer and higher quality displays will generally have overcome many of the most glaring shortcomings of a given technology, and at the end of the day will all probably offer an equally "good" picture.
3) "Good" is in quotes there because it's fairly hard (if not impossible) to quantify someone's subjective experience. In other words, what "looks good" to you may not transfer to someone else. There *are* certainly objective measurements that relate to displays; correlating them to your subjective experience is the trickier part. Just because a display has a faster redraw rate, higher resolution, bigger size, better (measurable) contrast ratio, etc. does not necessarily mean you will experience it as universally better, especially once cost enters the discussion.
4) 4K itself largely falls into the trap outlined in point 3. It has been marketed (and I would argue, mismarketed) as some sort of "hallelujah" moment for "closer to what the artist intended" (a lovely throw-away marketing phrase that we've inherited from the portable audio world), which tells you little about what the display is actually doing. There's also the bigger question of "what content are you feeding it?" Strictly speaking, DCI 4K is 4096x2160; what consumer "4K" TVs actually use is the UHD resolution of 3840x2160. With native content (which I'll hit again in another point) you *are* getting a higher resolution display, but how much does that higher resolution actually matter to your eyes? To simplify the underlying math and optics, the easy answer is that the further away you move, the less it ultimately matters (your eyes become progressively less capable of discerning the extra resolution) - so if you're sitting 10-15 ft back and the display is relatively small (let's say under 120" diagonal) there isn't likely to be a huge difference in perceived "sharpness" or "clarity." Specifically I would suggest using Apple's pixels-per-degree metric, which requires the display's resolution and size as well as your viewing distance: PPD = 2dr*tan(0.5 degrees), where d is the viewing distance and r is the display's pixel density (pixels per inch or per centimeter - just keep the units consistent).
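To make that concrete, here's a quick sketch of the pixels-per-degree calculation in Python. The 60" panel and 10 ft viewing distance are just example numbers chosen to roughly match your setup, not anything from a spec:

```python
import math

def pixels_per_degree(resolution_px, size_in, distance_in):
    """Approximate pixels per degree of visual angle: PPD = 2*d*r*tan(0.5 deg).

    resolution_px -- horizontal pixel count (e.g. 3840 for 4K UHD)
    size_in       -- horizontal width of the screen, in inches
    distance_in   -- viewing distance, in inches (same units as size_in)
    """
    r = resolution_px / size_in  # pixel density (pixels per inch)
    return 2 * distance_in * r * math.tan(math.radians(0.5))

# A 60" diagonal 16:9 panel is about 52.3" wide; 10 ft = 120".
width = 60 * 16 / math.sqrt(16**2 + 9**2)
ppd_1080 = pixels_per_degree(1920, width, 120)  # roughly 77 PPD
ppd_4k = pixels_per_degree(3840, width, 120)    # roughly 154 PPD

# 20/20 vision resolves about 1 arcminute, i.e. roughly 60 PPD;
# past that, extra pixels are increasingly hard to distinguish.
print(f"1080p: {ppd_1080:.0f} PPD, 4K: {ppd_4k:.0f} PPD")
```

At that distance even the 1080p panel is already past the ~60 PPD threshold for typical 20/20 acuity, which is exactly why the extra resolution can go unnoticed from the couch. Move closer (or use the panel as a desktop monitor) and the numbers drop, and the 4K advantage starts to become visible.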
4a) That "what are you feeding it" question is also a very important piece to consider. Simply put, there is VERY LITTLE 4K content available on the market. 4K Blu-ray is a brand-new standard with very few releases. All of the available streamed content is *heavily* compressed in order to get the transport stream small enough for most people's internet connections, similar to contemporary streamed (or cable/satellite) "HD" content - while it may be spitting out a 1080p or 4K frame, it's doing it at roughly DVD bitrates, and as a result image quality suffers significantly. Then there's all of your "legacy" content - Blu-ray, HD-DVD, DVD, Laserdisc, VHS, any game console currently in production, etc. - that simply will not output at 4K, and in some cases only sources a few hundred lines of resolution (e.g. DVD, Laserdisc). All of that has to be scaled up for 4K. True, all of the non-HD material is already being scaled for 1080p, but that's a much less significant jump than taking it up to 4K. So if you're dealing with a lot of legacy content, the 4K TV will probably make it look worse rather than better, simply because of how much scaling needs to occur for the image to fill the panel. The overall lack of current 4K content (and the very real limits on what, in terms of back catalog releases, can ever become 4K content) also warrants a very hard look. If you're primarily watching compressed HD content or DVDs - a fairly typical usage scenario - the 4K TV won't do that content any significant favors, and it will not live up to the very tailored and optimized demo content it plays in the store.
Overall, as of this writing, I wouldn't pursue a 4K TV primarily for film/television content, simply because there's not much content available, and quality/compatibility with older sources is less assured than with a capable 1080p display. While there are some theoretical advantages to the higher resolution display, they aren't clear-cut or absolute by any means, and in all likelihood can go entirely unnoticed in practice - especially in a fair apples-to-apples comparison (i.e. not using the highly optimized demo content that the store is provided, and with color and video processing options either defeated or set to comparable levels). Furthermore, the greater the viewing distance between you and the TV, the less the resolution will ultimately matter - if you've ever been to a stadium and seen the gigantic "jumbotron" displays, they look fairly sharp from the seats (often at a distance of a few hundred feet), but each pixel on those displays can be the size of a basketball or larger. The same concept applies to your TV. As a computer monitor, and/or for gaming on a capable PC, 4K can demonstrate a bigger difference. In the future this situation will likely change - when HD resolutions first came out ~15 years ago, they were primarily of use to computer users, and eventually "trickled down" to the multimedia applications we know today. It's reasonable to assume similar proliferation will occur for 4K with respect to multimedia content. Do keep in mind, however, the limits on how much of the back catalog will ever see a 4K release, and the limitations on streamed content, which will only become more exaggerated (than they already are at 1080p) with the transition to 4K.
If you have any further questions, feel free to ask.