|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
I would look at the vintage of the material. Anything made in the last, say, 10 years could conceivably be 4K; certainly anything made in the last 5 years. Anything produced in the early 2000s would probably have been up-rezzed. Professional digital motion picture cameras started at 2K and quickly moved to 4K and 5K, and the shift from film to digital happened in the early 2000s. Anything originally shot on film would have been scanned at 4K for restoration, archiving, and eventual distribution. When the James Bond films were originally released on Blu-ray around 2010, they were restored from 4K transfers made several years earlier.
|
|
|
|
|
|
|
|
|
|
Posted: |
Sep 10, 2018 - 11:57 PM
|
|
|
By: |
Ian J.
(Member)
|
I gather quite a few TV shows in the United States are (or were) shot on film, so there should always be an opportunity for a rescan at a higher resolution. Eventually, when video resolution gets high enough, those will hit the limit of the film's grain, which only got finer in more recent decades.

Shows shot on video can't be upscaled as such, although they can be cleaned up. Many British shows are like that, or a combination of video and film (quite a lot of the original series of Doctor Who was like that, with location shoots on film and video for the studio work). Farscape is an example of a show shot on film but whose effects were done on video. The original film elements may no longer exist, unfortunately, so we're stuck with the PAL-50 final product, which has had a clean-up for 1080p but won't get much, if any, better with current upscaling techniques.

Some shows in more recent years will have been shot in 4K or higher, but probably fewer than we would like, so expect upscales to be the norm for those that were shot in 1080p.
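To illustrate why video-originated material "can't be upscaled as such": upscaling only interpolates between the samples that were captured, so no new detail appears. Here's a minimal sketch using plain linear interpolation (purely illustrative, not any restoration pipeline's actual algorithm):

```python
# Illustrative only: upscaling invents in-between pixels by blending
# neighbors; it cannot recover detail the original video never captured.

def upscale_line(samples, new_len):
    """Linearly interpolate a 1-D list of pixel values to new_len samples."""
    old_len = len(samples)
    out = []
    for i in range(new_len):
        # Map the output position back into the source's coordinate space.
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# A hard edge in the source (0 -> 255) becomes a smooth ramp after
# upscaling: more pixels, but no new information.
line = [0, 0, 0, 255, 255, 255]
print(upscale_line(line, 11))
```

That softened edge is why modern "clean-up" work on video masters focuses on noise reduction and artifact removal rather than genuine resolution gains.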
|
|
|
|
|
|
Here's a little more info I can supply (since I was in attendance for 6 separate shoots). Before it went off the air in 2015, the CBS show Two and a Half Men was shot on 35mm film with Panavision cameras outfitted with 2000-foot magazines. It was scanned for 1080 at CFI labs. It would "hypothetically" be possible to make it 4K, but since they ran 4 cameras simultaneously, it would be a scary amount of 4K scanning and re-conforming of the edit.

In the studio next door, The Big Bang Theory was shot with "Panavised" Sony HDW-F900 cameras recording 1080p. With the whole post-production chain being 1080, well, that's that. Others I know about: the TNT shows The Closer and its continuation spin-off Major Crimes were shot on 35mm film, but scanned and post-produced in 1080. (As an aside, it's kind of funny to look closely at recent shows originated on 35mm film and watch the random specks of dirt, dust and fuzz go by! Dead giveaways that they originated on celluloid.)

Again, with each and every show, you'd have to start from the ground up, making new scans and re-conforming the edit. $$$$$ the studios won't want to spend.
|
|
|
|
|
|
|
|
|
|
|
Posted: |
Sep 11, 2018 - 7:41 AM
|
|
|
By: |
Ian J.
(Member)
|
So some can only be upscaled, which won't be true 4K. Others can only be cleaned up to look as good as possible, and others were shot on film, so they can be re-scanned for 4K output. Do I have that right? Wow, that's so crazy. There should be a "standard" in filming so it can meet the demands of new technology. They knew higher resolution was coming. The fact that the studios don't think ahead to meet new advances in technology is nuts.

In a word: budget. Most shows, especially when starting out, won't spend mega-bucks using the highest resolution available at the time, because it costs too much. Shooting on film is slowly becoming less common (both for TV and cinema), again because of cost, so the path to higher resolution from it goes too. As and when the equipment rental/purchase costs for higher resolution come down, the shows can fit that into their budgets, especially when the cost is practically the same as, or even more than, the older, lower-resolution equipment. Even if a show could afford 4K equipment now, what about the 8K that's coming? And after 8K, what then?
|
|
|
|
|
|
It's funny how people get hung up on 4K, when it's arguably the least important part of the UHD (UltraHD) equation. Resolution visibility is relative to viewing distance. One must sit less than 3X the picture height to even see the difference 4K brings. That's awfully close to a flat panel - closer than most people want to sit! The difference is not in sharpness, but in visibility of fine details - like being able to read the license plate of a car featured in a long shot. And there again, you can only see the detail if you sit close enough.

The other parts of the UHD format - High Dynamic Range (HDR) and Wide Color Gamut (WCG) - offer FAR more visible benefits than 4K. These are improvements in picture contrast and color space - improvements that can be seen from ANY viewing distance.

FUN FACT - the move to 4K was almost completely driven by the consumer space. Flat panel sales started dropping, so flat panel manufacturers started cutting panels with 4X the pixels. Voila, 4K! Now you know why there is so little content in 4K. Hollywood couldn't care less. At least until recently, that is, because they realize there is a market. However, the picture quality improvements are mainly in HDR and WCG, NOT resolution. As Dolby says, we need BETTER pixels, not MORE pixels.

I wrote an article about all this, based on a projector shootout we held with four different projectors - two native 4K, the others with wider color gamut and better contrast. In almost all cases, the projectors with the best contrast won over those with higher resolution: https://www.thescreeningroomav.com/single-post/2017/12/23/The-Results-Are-IN-Read-All-About-Our-JVC-vs-Sony-Projector-Shootout
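The "3X the picture height" rule of thumb above is easy to sanity-check with a little geometry. This sketch (the function name and the 3.0 default are mine, just for illustration) converts a 16:9 panel's diagonal size into the farthest seat at which 4K detail is claimed to be visible:

```python
import math

def max_distance_for_4k_ft(diagonal_in, multiple=3.0):
    """Farthest viewing distance (feet) under the rule of thumb that
    4K detail is only visible within `multiple` x the picture height."""
    # For a 16:9 panel: height = diagonal * 9 / sqrt(16^2 + 9^2)
    height_in = diagonal_in * 9 / math.sqrt(16**2 + 9**2)
    return multiple * height_in / 12  # inches -> feet

print(round(max_distance_for_4k_ft(65), 1))  # ~8.0 ft for a 65" panel
```

So for a typical 65" living-room panel, you'd need to sit within roughly 8 feet, which is indeed closer than many people actually sit.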
|
|
|
|
|
|
John S.^ very interesting. Thank you.
|
|
|
|
|
Figured I'd post the relevant info from my article for those who don't want to read about a projector shootout:

"I'd like to share a little perspective gleaned from my own experience working with several of the major Hollywood studios while UHD / 4K standards were being finalized. About four to five years ago, I was involved with a project that attempted to get the major studios on board with higher resolution, anamorphic Blu-rays. Shawn Kelly of Panamorph had developed a technology where 33% more resolution could be "hidden" behind the black letterbox bars on Scope / 2.35:1 movies, and then reintegrated into the image when decoded by properly equipped projectors or Blu-ray players. At one point we had three of the major Hollywood studios seriously interested in this technology. As a side note, the timing of this corresponded with the rise of 4K flat panels in the consumer electronics industry.

As a result of my involvement with this project, I attended two events that were eye-opening. The first was one of the first tests of high resolution native 4K material at one of the major studios: a comparison of 4K-scanned IMAX footage shown side by side on a Panasonic 1080p plasma and a Samsung 4K display. From anything approaching a normal viewing distance, differences were very hard to make out. In this test, the Panasonic "won out" thanks to its higher contrast and better screen uniformity. We were actually in the room with the XXXXXXX studio techs, who were going back and forth trying to see differences in detail. They were there, but you had to get close to the screen to see them. My take-away: the differences between 1080p and 4K material are in the ability to make out fine details and textures, not in overall sharpness.

As we got deeper into this project and I started playing with native 4K footage myself, it became obvious to me that we needed picture content with ultra-fine textures or patterns to reveal any improvement at all with 4K displays and content. Examples would be fine textures on fabrics, leaves on trees, fine patterns containing lots of geometric shapes, etc. Sometimes differences could be seen in fine details on objects far away from the camera (such as being able to read the license plate numbers of cars during some of the helicopter shots in the film SICARIO). It was also clear to me that you needed to be close enough to the display to actually see these differences. Typically, this means being less than 3 times the picture height from the screen. To put that into perspective, if you own a 65" 4K flat panel, you would need to sit less than 8 ft. from the display to see the improvement that 4K resolution brings. And you had to go looking for it, when it was present in the types of objects described in my previous paragraph.

The second event was a meeting of the Hollywood Post-Production Alliance in Palm Springs, CA. We witnessed Dolby making their argument that "we don't need more pixels, we need BETTER pixels." This is where the move toward HDR and wider color gamut was really kicked into high gear. Their argument was - and is - that differences in brightness, color and contrast can be seen at ANY viewing distance, while differences in resolution can only be seen at relative viewing distances. To be clear, this is not to say that differences in resolution are UN-important, just perhaps less important. That was Dolby's point of view, and it has grown to be my own as well.

I'll give one more view, this time from a filmmaker's perspective. Having high resolution 4K-to-6K cameras like the RED, Arri and others means that filmmakers can be that much looser framing the image during production, because there is so much excess resolution captured by the camera sensor. Filmmakers and editors know they can go in during post-production and zoom and crop the image to their heart's content without losing sharpness (I did this myself when I edited the proof-of-concept trailer for my upcoming feature film, which was shot on a RED).

It's also true that most movies - even if they are filmed and edited at 4K resolution or greater - are finalized on what is known as a 2K Digital Intermediate (2K DI). This means that the actual final product that makes it to digital cinemas and even UltraHD Blu-ray is actually only 2K in resolution. The reason? Right now, it's simply that even the most modest films these days have some kind of digital, CGI-based FX, and it's much cheaper to render these FX at 2K resolution than at 4K. Of course, as with almost everything technology-related, the cost and difficulty of doing 4K renders keeps coming down, so more and more true 4K movies and content will be coming down the pipeline. (For anyone interested in finding out whether their favorite UHD movie is actually sourced from a 4K master, here is an excellent resource: https://realorfake4k.com/)."
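The "33% more resolution hidden behind the letterbox bars" figure quoted above also checks out with simple arithmetic: a 2.35:1 scope image letterboxed inside a 1920x1080 frame only occupies part of the 1080 lines, and reclaiming the bars recovers roughly a third more vertical resolution. A quick sketch (numbers are illustrative, not Panamorph's actual encoding):

```python
# How much vertical resolution do the letterbox bars waste on a
# 2.35:1 "Scope" movie inside a 16:9 HD frame?

FRAME_W, FRAME_H = 1920, 1080
SCOPE_AR = 2.35  # aspect ratio cited in the post

active_lines = round(FRAME_W / SCOPE_AR)  # lines the scope image actually uses
gain = FRAME_H / active_lines - 1         # fraction recoverable from the bars

print(active_lines)   # roughly 817 active picture lines
print(f"{gain:.0%}")  # roughly a 32-33% gain
```

The same logic is why anamorphic lenses and "constant image height" projection setups remain popular among scope-format enthusiasts.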
|
|
|
|
|
|
DP
|
|
|
|
|
|
|
|
|
|
|
|
|
Posted: |
Feb 20, 2019 - 4:00 PM
|
|
|
By: |
Bob DiMucci
(Member)
|
Samsung Quits 4K Blu-ray Player Market
By John Archer (Contributor, Consumer Tech)

Samsung has confirmed to me today that it is no longer going to be making any new models of 4K Blu-ray player. Rumors that this was going to happen have been circulating for months. One or two dealers let slip that they'd been told not to expect any more players, and new 4K Blu-ray player models were conspicuously absent from Samsung's stands at both the IFA show in Berlin last August and the CES in Las Vegas in January.

My own Samsung contacts, though, told me in January that a single new deck - apparently a quite high-end model - was being planned for launch later in 2019. Those same contacts have now confirmed that this deck is no longer going to appear. It looks like the M9500 is going to be Samsung's last 4K Blu-ray player.

I wasn't given any specific reasons for this decision, but presumably Samsung figured that it likely wouldn't be able to take enough market share (in what's already a niche hardware market) from the likes of Panasonic and Sony to justify a full production run. Especially if its new deck was going to follow the same line as Samsung's TVs and not include support for Dolby Vision playback.

Regardless of whether or not the now-canned Samsung player could have been a winner, as someone who loves the premium quality of the 4K Blu-ray format I'm sad to see such a major electronics brand giving up on 4K BD hardware production. Especially as it comes on the back of the news last year that even Oppo was pulling out of making new players, despite their debut models receiving rave reviews from almost everyone who tested them.

The Samsung announcement comes at the end of a week, too, where we've learned that a series of high-profile films are apparently not going to be getting a 4K Blu-ray release: The Favourite, Stan & Ollie, and Holmes And Watson (I said high profile, not necessarily good!). Also, while the film industry still claims to be happy with 4K Blu-ray's rate of uptake, the latest disc sales stats for the US show 4K Blu-rays accounting for just 5.3% of sales, while DVD - yes, DVD - still claims 57.9%.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
UHD (UltraHD) High Dynamic Range (HDR) Wide Color Gamut (WCG) I don't know who decided these should be phrases made into acronyms, but he and I are gonna have a little talk...
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|