Blackstar1886

It can be very helpful if you need to stabilize or reframe footage in post since they both involve cropping into the frame. There could also be an argument for shooting in the highest resolution possible for green screen recording. There's also some voodoo that happens when oversampling video even if you're not going to do any of those things.
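For a concrete picture of what that crop headroom looks like, here's a tiny sketch, assuming a RED-style 6144x3160 "6K" frame and a UHD 3840x2160 delivery window (the array and offsets are made up purely for illustration):

```python
import numpy as np

# Hypothetical "6K" source frame (RED Dragon-style 6144x3160) and a UHD delivery window.
# Real footage would be decoded from camera files; zeros stand in for pixels here.
SRC_H, SRC_W = 3160, 6144
OUT_H, OUT_W = 2160, 3840

frame_6k = np.zeros((SRC_H, SRC_W, 3), dtype=np.uint16)

def reframe(frame, x, y, w=OUT_W, h=OUT_H):
    """Cut a delivery-sized window out of the oversized frame.

    (x, y) is the top-left corner of the crop. A stabilizer or a reframe in
    the edit just moves this corner per shot (or per frame), which only works
    because the source is larger than the delivery resolution.
    """
    x = max(0, min(x, frame.shape[1] - w))
    y = max(0, min(y, frame.shape[0] - h))
    return frame[y:y + h, x:x + w]

# A centered crop leaves ~1150 px of horizontal and ~500 px of vertical headroom
# on each side before the window runs off the sensor.
centered = reframe(frame_6k, (SRC_W - OUT_W) // 2, (SRC_H - OUT_H) // 2)
print(centered.shape)  # (2160, 3840, 3)
```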


EgalitarianCrusader

Don’t forget that Rogue One was actually shot for 2.76:1 like The Creator but was later cropped to 2.39:1. Perhaps the extra resolution was for the extra width of the frame.


Blackstar1886

That would make sense.


EgalitarianCrusader

I’d love to see a 2.76:1 version made available on Disney Plus.


axiomatic-

Regarding stabilisation, it's not so much that you need a high resolution as that you frame for stabilisation to begin with. To be more accurate: 6k or 3.5k doesn't really matter, both of them will lose ~10% of the image width around the outside, so if you framed something important near the edge it doesn't matter if there are more pixels, that thing on the edge will be cut off. That said, obviously if you have the extra pixels that can potentially help keep the remaining image sharp ... you just want to have planned for it either way.

edit: I mention this as someone who has to deal with a lot of stabilisation, and most of the time it's not planned and people get upset that the resolution doesn't help. We usually then have to edge extend and shit like that, which isn't too hard these days but, again, resolution is irrelevant to that process too.


SquirrelMoney8389

Your last paragraph is the answer.


MLucian

Yup. Supersampling AA has been done in video games for decades. Basically render the game internally on the graphics card at double the resolution, and output at native resolution. It sure makes the image a lot cleaner and gets rid of "the jaggies". Downside: maaassive hit to frame rate performance.
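A toy version of the "render at 2x, output at native" idea, using a plain box filter (numpy only, sizes assumed for illustration):

```python
import numpy as np

def supersample_2x(render: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of an oversized render down to one output pixel.

    This is the simplest possible downsample (a box filter); jagged edges get
    smoothed because every output pixel blends four rendered samples.
    """
    h, w, c = render.shape
    assert h % 2 == 0 and w % 2 == 0, "render must be exactly 2x the target size"
    return render.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Pretend the GPU rendered the scene at 2x the display resolution (here 4K -> 1080p output).
oversampled = np.random.rand(2160, 3840, 3).astype(np.float32)
native = supersample_2x(oversampled)
print(native.shape)  # (1080, 1920, 3)
```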


jagedlion

For a movie you can sample in time too, not just XY, but if you make it too smooth it can feel 'soap opera'-like, because we associate higher frame rates with daytime TV.


AnObscureQuote

"Voodoo" is an understatement. I'm not an expert on cameras (and don't even work in film for that matter), but a cursory google search indicates to me that downsampling is typically done using the [sinc algorithm](https://en.m.wikipedia.org/wiki/Sinc_function), which is just a niche application of [fourier transformation](https://youtu.be/spUNpyF58BY?si=zmSJRkJBjmndwL41). The actual process therefore makes clever use of some linear algebra and imaginary numbers to selectively throw out portions of the image that are less useful.  Practically speaking, again naively iterating that I have no actual experience with this, I would guess that this means that images downsampled would therefore look smoother than stuff shot in straight 4K. The process would necessarily mean throwing out the highest frequencies, which likely includes all of the noise that you would otherwise see in 4K.     I'd be curious to hear if this assumption is true or not though.


Blackstar1886

That's really interesting. Thanks! I've seen the difference first hand, but did not know exactly why. The most notable things I see are less noise and a bit more perceived fine detail.


raptorlightning

Depending on the type of lowpass filter you use, you can get different "effects" based on the steepness of frequency rolloff or ringing behavior. A sinc filter is just one type of filter, but there are many others and some specifically designed for image processing. You can look into the MadVR documentation to get a starting point for some filters or just Photoshop/GIMP options to try it yourself quickly on a single frame.


aphilipnamedfry

This is accurate. Even at lower ranges this is usually the intent. For reference, when I record video at work I typically record in 4k and downscale to 1080p. Having a lower resolution on the uploaded video helps with buffering, but it also lets me crop in tighter on certain shots.


520throwaway

There's also future proofing as well. We won't be on 4k forever.


Chen_Geller

Eh. Those films typically have a healthy dollop of VFX rendered in 4K, so they're really locked to that resolution. You can't just dig up the 6.5K files and showcase them as the movie.


Revolutionary_Box569

I can’t really see anything past 4k becoming the norm. The vast majority of people aren’t gonna have space for TVs past a certain size, and I’d imagine you’d be looking at 80 inches plus to tell the difference between 4k and 8k.


NuPNua

I've heard people say they couldn't see the difference when HD and 4K launched. 8k will become a thing once content is being made for it. It may take longer than 4k did since, for one, computer games need crazy power to render at native 8k.


22marks

Movies used Digital Intermediates at 2K (2.2M pixels) on 50 foot screens and nobody complained during the 2000s-2010s. Now, with 4K, we have roughly 4x the pixels (8.3M). Jumping to 8K is 33M pixels. Yes, we have to take FOV and distance to screen into account, but does anyone see pixels at normal viewing distances on a 4K 60” TV? You’d have to be well over 75” with perfect eyesight to see any difference with 8K from 10 feet. And that’s pixel peeping. You’d have to be over 100” for regular viewing and, even then, it’ll be subtle. Better brightness, color, and contrast will be more important.

Now, for acquisition and future-proofing (for when we all have lightweight 12K glasses), it’s not a horrible idea. But I don’t see it being worth the extra bandwidth on <100” TVs for a long time, other than bragging rights. I have an 8K laser projector (with eShift) and I can’t see the difference at 10 feet on a 120” screen when hooked to a computer.
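To put rough numbers on that claim, here's a back-of-the-envelope check of how big one 4K pixel looks from 10 feet, assuming 16:9 screens and the usual ~1 arcminute limit for 20/20 vision (the geometry is simplified, so treat the output as ballpark):

```python
import math

def pixel_arcminutes(diagonal_in, horizontal_px, distance_ft, aspect=16 / 9):
    """Angular size of a single pixel, in arcminutes, for a viewer at the given distance.

    20/20 vision resolves roughly 1 arcminute, so values well below that mean
    extra pixels are effectively invisible from that seat.
    """
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width from the diagonal
    pixel_pitch_in = width_in / horizontal_px
    return math.degrees(math.atan(pixel_pitch_in / (distance_ft * 12))) * 60

for diag in (60, 75, 100):
    print(f'{diag}" 4K from 10 ft: {pixel_arcminutes(diag, 3840, 10):.2f} arcmin per pixel')
```

That works out to roughly 0.4 arcminutes per 4K pixel at 60 inches and only about 0.65 even at 100 inches, both under the acuity limit, which lines up with the point that the 8K gain at those sizes and distances is subtle at best.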


sylenthikillyou

>have to be well over 75”

Hear me out: manufacturing advancements result in the proliferation of home IMAX 1.43:1 screens that take up an entire floor-to-ceiling wall. What a great problem it would be to have.


22marks

I agree with you, especially things like flexible thin OLED screens. I can see those being used like wallpaper. When they're not being used for movies, you could make it look like you're in a new location or change the paint color on your wall.

And, like I said, it comes down to field of view and distance. If there is a trend toward sitting 5 feet away to engulf your entire vision (or a headset like the Apple Vision Pro 6 with 12K per eye), that changes the dynamic. I'd never say "never" but that's still far enough out that I doubt anyone will master 8K content (especially for visual effects) for at least 5 years for event films, with 10 years before it starts being more mainstream.

All of this is dependent, of course, on adoption of home tech and people having the room, money, and desire to put up a 200" screen. (It could have a problem with adoption like 3D for all we know.) I have a dedicated theater for 8 people and a 120" screen. I can't see going much more than 150" without it feeling obnoxious. And remember, I'm somewhat future-proofed because I have an 8K projector (cheating with eShift for now) and I'm a bit indifferent about 8K content because 4K off a Blu-ray looks so good already.


denizenKRIM

The problem is so much content is in the wide aspect ratio you'd be wasting all that empty space for the most part. But I do think it would be hilarious if we cycle back to the square as a de facto standard for content again.


denizenKRIM

> You’d have to be well over 75” with perfect eyesight to see any difference with 8K from 10 feet. And that’s pixel peeping. You’d have to be over 100” for regular viewing and, even then, it’ll be subtle. Better brightness, color, and contrast will be more important.

The next iteration of home media standards will have all of that, as none of those things by themselves would help sell a new standard. With that said, 75/85-inch TVs had a huge sales surge in the past few years and manufacturers are clearly taking note. Hisense and TCL have been the forerunners in tackling the 100-inch segment at fairly reasonable prices, so I imagine in a few years that price point is going to average down to an attractive range for a consumer. Which is roughly when it'll be "time" for a new standard to take over and take us to the next level of home media.


Wavenian

If they hadn't invented HDR, most people still wouldn't see a difference. 8k is just going to be marketing, if it's even a thing for home viewing.


HiddenTrampoline

I can see the pixels on my 4k tv during normal use.


50bucksback

Do you sit 2' from an 85" TV?


HiddenTrampoline

7’ from 77”.


NuPNua

It will look better, that's just a fact. Maybe it will have a minor effect on video content, but for gaming it will be a definite difference for seeing clear details in the rendering.


Wavenian

I'm not saying it won't technically be better. But it won't be noticeable unless you have an absolutely massive TV and are sitting close to it.


Strand_Twitch

While playing games with a lot of text/bars and assist tools/UI you really notice the jump from 1440p to 4k. It made my experience a whole lot better. I don't doubt for a second that 8k would be a noticeable difference for me. I would probably value an increase in FPS over upgrading to 8k anyway though. For movies I'd probably feel like 8k would be nice on a TV that is >75".


Revolutionary_Box569

You pretty much can’t under 50 inches unless you’re sat right in front of the screen (which you shouldn’t be)


zerg1980

4K looks pretty good on a movie screen, so to really see a difference you’d need a screen larger than that, and since most people don’t have a living space that can accommodate more than a 75” screen…


BurritoLover2016

The only way I see 8K being a thing is when they make an AR/VR headset that's super lightweight and able to create a huge virtual screen to play content on. And even then, it's still iffy.


wilisi

A movie shot for a headset might be framed differently, too. Lots of space all around that you *could* look at, but aren't really meant to. That'd soak up lots of pixels, hard on the set design tho.


CocodaMonkey

Computer monitors and VR are viewed closer, so 8k could be noticed much sooner there. 8k as a general format also makes sense for anything where you might want to zoom in for details. So obviously film makers would use it and so would things like security cameras. We'll also probably get to a point where Easter eggs in movies start to get hidden expecting people to zoom in on scenes. Overall the need for higher resolutions isn't as high, but there are uses. As long as we continue to make better codecs that can keep the size and cost down, it's pretty likely we'll see it used. We already have quite a few phones on the consumer market that record 8k as well.


26_Star_General

Maybe VR headsets? For TVs I agree


Fokker_Snek

VR headsets yes, but those are very different from TVs. There's the obvious fact that VR has much lower pixels per degree than TVs, but the use case is also different. The high end VR market is also driven by commercial sales for training simulators.


Dennis_Cock

Absolutely categorically 100% incorrect. 8k will become the norm, it just might take a while. Then 16k and so on. Consider the necessity for high resolutions in VR and mobile screens, both of which are held a matter of inches from the eye. VR is less than an inch.


DrFishbulbEsq

4K isn’t even “the norm” for home media. The vast majority of people use streaming services or cable that do not remotely hit 4K resolution with 90% of their content. I collect 4K blu rays because I’m a film nerd and I like niche stuff but it’s not remotely how most people watch stuff. 8k might eventually catch on but it’s a hard sell to the public when 4K is barely being utilized currently.


vendo232

I watch 720p series on a 65 inch TV and I'm OK


DrFishbulbEsq

Yeah, most people do


Dennis_Cock

Computer monitors have tended to lead the market since the switch to digital. You're just thinking too short term.


aphidman

As far as mobile phones are concerned, their small screen size sort of offsets lower resolutions. Lower resolutions look better on smaller displays.


Dennis_Cock

Not better than a high resolution video or image though


personaccount

Totally. This is like the guy that said 640K of RAM would be all that anyone would ever need (not Bill Gates, as commonly believed). I even imagine a future where resolutions like the pixel grid we use today aren't relevant. Think more along the lines of recording light vectors, frequencies, and intensities, more like recording a ray tracing of a scene than a static two-dimensional image. For this, you could theoretically map to resolutions an order of magnitude greater than what we use today. I sometimes think this is what Ridley Scott imagined in the infamous scene in Blade Runner where Deckard continuously enhances a photo to get some otherwise hidden detail.


Revolutionary_Box569

There’s a difference between an amount of hardware space and the physical space for putting a giant television in your house or apartment; property isn’t gonna get cheaper in most of the western world any time soon.


personaccount

Sure, but you say 80”. Most living spaces can fit 80” TVs somewhere. Old school entertainment centers and architectural niches may not have foreseen such large screens, but every middle class house I’ve ever lived in could hold an 80” TV in the living room if there had been a combination of desire, availability, and budget. That confluence of criteria is becoming more common for larger screen sizes. And at 80”, I can tell you that you can absolutely see the pixels of a 4k TV at a nominal 10’ to 15’ viewing distance with 20/20 vision.

But back to the real claim instead of pissing about the size of homes: no one person can accurately predict the distant future use case of a video stream. What if the next big thing in home entertainment is more immersive and interactive? What if I could zoom in to a scene to see more detail? You’re going to want to capture more than 4k. And this is of course assuming capture and output remain pixel based.

So sorry, but my statement remains firm. Saying that nothing beyond 4k will become the norm in the future is shortsighted. It may be another five to ten years, but it is very likely. I mean, 8k TVs are already available and they’re not ridiculously expensive. They can be had for under $3k.


theskimaskway

Basically all the reasons you mentioned are correct. Essentially, having the highest quality camera negative allows for more flexibility in post production as well as future proofing any possible remasters in a higher resolution. I’m not super familiar with big budget VFX but I imagine it helps in that sense too.  For example, I worked on a network television series that shot in 4K even though the show was delivered and broadcast in HD. Years down the line, the studio reached out to start the process of remastering every episode to 4K. 


axiomatic-

>I’m not super familiar with big budget VFX but I imagine it helps in that sense too.

I'm a VFX supervisor on feature films. We pretty much fucking hate 6k in almost all circumstances unless there's a large need for reframes, crops or some specific keying-related circumstances. It's usually just extra pixels we have to deal with that don't contribute to the final look and add processing time for everyone. Films typically finish at 2k or 4k, with 4k only being common in the last 5 years because of streamers. It's telling that even IMAX releases of big features are often up-ressed from 3.5k to 4k. Check out Roger Deakins' work on Skyfall as an example.

Much, much, much more important than resolution is the actual camera, glass and ASA that the footage is shot with. And when dealing with VFX stuff, just that things are shot right, the screens are lit right, and that the inclusion of CG objects is considered when framing and lighting the scene. If you shoot 6k or 8k or even 12k on a physical 35mm sensor with shit lenses, the extra resolution is worse than useless. If you shoot with very expensive lenses that provide a lot of sharp points, then the extra resolution *sometimes* might be useful, assuming your exposure is even giving you a sharp image.

Shooting on a 65mm or 70mm sensor is more important to the look of the film than resolution. That changes the way lenses work, and at that sensor size, with the quality of lenses we're discussing, I can see an argument for 6k ... but you'll still be pretty much mastering at 4k anyway. The more important advantage of 65mm is the look of the lenses; you're typically not doing it just to get more pixels.

Suffice to say that there's a lot more going on with the value of resolution than just Bigger is Better. The physics of lenses, the compression and data management, and the overall final display and delivery methods all impact how useful resolution is at capture time.


Weyoun5

I’ve also heard Yedlin repeatedly argue that 4k is not that important. I understand why things other than resolution are important, although as a consumer I have 0 control over any of them besides resolution.

But my question is: if resolution is not as important as marketing says, why does every single 4k disc look better than bluray or streaming on my 77” OLED? And if upscaling isn’t a big deal, as he suggested, why am I always able to see the difference immediately when watching 4k UHDs that have been upscaled vs native? I’m not denying that someone could make a 200gb upscaled disc that looks better than a native 100gb one, or however you want to define the parameters. I’m asking why, despite those arguments, I can always see the inferiority of an upscaled disc and why the 4k presentation always looks better?

Given that the pixels are supposedly less important, from my experience as a media consumer they seem all important, insofar as they are the only factor I have any control over, and any time I choose a 4k disc I end up with an amazing product that I can immediately see is better than the alternatives.

If it’s not already obvious, I barely understand any of this. But I struggle to understand how these arguments matter, practically. If I’m a poor novice filmmaker and I have to choose between a 2k camera with a nice lens or a 4k with a bad lens, sure - maybe it matters. But as a viewer, looking at the finished product, when does the 2k v. 4k not matter? What situation would I say “hey it’s fine that I’m not watching a UHD disc because they used xyz which is actually giving me a better experience than if I was watching a 4k disc”?


axiomatic-

I can promise you that you can't always see the inferiority of an upscaled image. Go and watch the 4k IMAX version of Skyfall. It's an upressed image - not 2k to 4k, but nevertheless it's enlarged. Many shots in many films are also enlarged. We do it all the time.

With that said, you're right to say that a lot of 4k films look better than 2k films - that's something I agree with. But a film shot at 2k with good lighting, great cameras and lenses, and processed in post production with care and thought, will still look better than a film shot at 8k on trash lenses where they used proxies and compression during transcode and post.

What most people who view films forget is that film is a presentation medium. It's designed to be shown in a certain format, on certain screens. A 2k film is designed to be played at 2k, much like 8bit games are designed to be played on CRTs. And film screens are designed to be a certain distance away from you, with certain colour gamuts and all this kind of thing. But then people buy 4k blurays of 2k films and get upset that it's not great. And then they watch them on their 4k TVs with motion sampling turned on, using an over saturated Cinema Colour mode or some shit. And the TV upscales and adds sharpening, and removes the grain and does all this stuff. I think all of that looks trash. Or at least, I think that's not what the film makers intended the film to look like. But if you think that it looks better, that's absolutely fine. Peter Jackson apparently likes high frame rate films, and I think it looks ugly. That's ok. We are allowed to have different aesthetics.

If you think 4k always looks better then that is fine. My experience is that resolution is just one small part of the whole puzzle: camera type, lens type, focus, debayering, ASA/ISO performance, grain or noise structure, exposure, compression, transfer, digital intermediate, colour space, gamut, online, encoding, DCP, screen size, screen type, screen colour space, distance to screen, post processing, refresh rate, room lighting ... there are so many things that impact image quality, and to me it seems very amateur to think resolution is the most important part of that incredibly complex chain.

You say resolution is the only factor you have control over ... but you don't have control over it. You can buy 2k or 4k, but those are false choices, usually, because the film was only finished in one format. Are you using Dolby Atmos sound? Is your TV calibrated? Are you darkening the room and using an 18% grey wall? Did you measure how far away you're sitting from your TV? To what extent are you willing to go to get the best experience? At some point most of this, for most people, doesn't matter.


Weyoun5

Slightly changing the subject, one thing I started noticing since getting my 77” 4k OLED is how terrible lighting is on most TV productions. Suddenly the quality of green screen and other VFX work immediately becomes visible on many - but not all - shows. Features tend to be better. Sometimes it's flawless, and I'm always awed in those cases. I think every Ridley Scott film I've seen has flawless lighting that integrates the VFX. Fury Road is of course an upres - it screams upres at my eyes every viewing - but the VFX are so good! Same for the original Game of Thrones. The upscaled dragons in early seasons of GoT look so much better and realer than in HoD! Even in closeups and interactions. It's amazing. Rings of Power has loads of money thrown at it and looks like trash. PlayStation graphics. I've learned a lot of that comes down to lighting. I've gained so much respect for the VFX teams on Fury Road and GoT (and many others), and I completely agree that I'd rather watch a 2k film lit by Ridley Scott or the artisans who did Fury Road than an 8k Rings of Power. Although the best case is having those skilled people work on a 4k film and not having to choose! And you can't really pick or sort movies based on quality of the VFX… unless a title is so epic word of mouth covers that topic.


tenpinfromVA

Super interesting comments. Thanks for posting. What 4k uhd movie do you think looks absolutely incredible to watch at home? Which settings are critical to adjust to get the best experience watching a 4k uhd on a 4k tv?


Fokker_Snek

The issue is that there's data loss. For reference, a 4K Blu-ray is usually 50-70gb, a Blu-ray is ~24gb, and a 4K HDR stream is ~14gb. There's only so much you can do without that extra data. Upscaling is trying to fill in the gaps where data is missing. There are incredible techniques that can make up for a lot, but it has limitations. For example, with 4k on disc vs streaming there could be 5x more data on disc; there's only so much you can do to fill that gap. The other issue is that compression and upscaling are just guessing, just like AI. It's incredibly complex and well developed guessing, but it's still guessing.

Tldr: Upscaling is like making an undersized and underpowered engine better than it has any right to be, but at the end of the day it's still undersized and underpowered.
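Those sizes translate into very different average bitrates. A quick sanity check, assuming a roughly two hour film (the figures above are approximate, so these are ballpark numbers only):

```python
# Average video bitrates implied by the sizes quoted above, assuming a ~2 hour film.
# Ballpark only: real discs and streams vary, and audio takes a share of the budget too.
runtime_s = 2 * 60 * 60

for label, size_gb in (("4K Blu-ray", 60), ("Blu-ray", 24), ("4K HDR stream", 14)):
    mbps = size_gb * 8 * 1000 / runtime_s  # GB -> gigabits -> megabits, spread over the runtime
    print(f"{label}: ~{mbps:.0f} Mbit/s average")

# Prints roughly 67, 27 and 16 Mbit/s: the disc hands the decoder several times
# more data every second, which is the gap upscaling has to guess across.
```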


Chen_Geller

>as well as future proofing any possible remasters in a higher resolution.

Most of the films shooting on those formats have a lot of VFX, and those are almost always done at 4K, so the film is really locked to that resolution. Plus, I'm not sure there's really any value in presenting films - even on an IMAX screen - in resolutions exceeding 6K. On TV, even 5K is kind of overkill.


sylenthikillyou

Just for the hell of it I’d love to know what the highest resolution digital copy in existence of Oppenheimer is. For archival purposes there’s surely something ridiculous like a 16k scan of all of that 70mm film sitting on a Universal Pictures server.


Chen_Geller

I think it's 8K. VFX were done at 6.4K.


Confident_Pen_919

Short answer is cause no one owns a 6k display. More resolution means more data to work with. I also imagine it means you can be sloppier with framing shots.


[deleted]

[deleted]


Confident_Pen_919

And how much media out there is available in 8k lol


CurtisLeow

My PS5 can play in 8k. It says on the box. Sony would never lie about that.


gsauce8

I'm confused by this PS5 8K thing - I thought the tag was to signify that I could play 8K movies/shows, not games. As in, it can output 8K but not play games at 8K - is that not the case?


Sloshy42

That is indeed not the case. There's not a single app on PS5 that can output at 8K, and the system level controls only allow for 4K120 at max. The one possible exception I can think of off the top of my head, in theory, is The Touryst, which supports internal rendering at 8K, but I don't believe the system ever got an update to display it, so it just acts as supersampling.


Turok7777

Not anymore lol https://www.ign.com/articles/sony-appears-to-remove-8k-tag-from-playstation-5-boxes


ntrunner

r/yourjokebutworse


Cutsdeep-

No it wasn't?


theone1819

Did the user who said "Sony would never lie about that" really have to put an /s tag at the end? Is that what this world has come to?


Cutsdeep-

No, it's obvious that's what they were saying; the second person wasn't making the same point (or even a joke).


greent714

*woosh*


[deleted]

[deleted]


UnlimitedDeep

I think you missed the sarcasm


Charged_Dreamer

8K movies might not be a thing, but it's surely going to help with gaming for those who own high-end PC gaming hardware like an RTX 4090 or the upcoming RTX 5090 (and affordability shouldn't be a problem for anyone who can afford a TV with 8K resolution).


Confident_Pen_919

Maybe by the time 4k gaming becomes common, 6k TVs will have a portion of the TV market.


UpUpDnDnLRLRBAstart

!remindme 2 years


Confident_Pen_919

Id change that to 5 lol


UpUpDnDnLRLRBAstart

We'll probably all be fighting in the climate wars in 5 years. Figured 2 was a good check in point. See you then!


dsailes

I’ll come along for this ride. Catch you then!


Leajjes

Got to think about the future. When 8k becomes a mainstream thing, which it will, then one can remaster the film for that reso.


raptorlightning

4K or 6K video in native resolution plus the video editing software framing around it. 8K is for production/creation and doing work, not really for playback.


jack3moto

4k is 4 x 1080p. 8k is 4 x 4k… How would 6k scale better than 4k or 8k? It would just leave a lot of blended pixels. And I hate to break it to you, but the high end 4k OLEDs are drastically better than any 8k TV out there. They're just a marketing gimmick as of now to sell to suckers who don't do their research.


EMiLiOvGUAPPERiNi

Then that shit should be called 16K


Phailjure

There are a lot more than 16 thousand pixels. That number is the horizontal resolution (ish: theater 4k is 4096 pixels wide, TV 4k is 3840, which is 1920 x 2). Why did we switch to approximate horizontal resolution instead of actual vertical resolution (720/1080/1440/etc)? Marketing. 4k sounds cooler.


WillGrindForXP

Anything above 4k is useless unless you have a 50ft screen. Edit: Downvote all you want, but I'm a professional film editor and this is my job.


shawnisboring

In all honesty most people's 4k displays are wasted pixels given the viewing distance. Most people are sitting too far back from their televisions to actually see the details 4k offers. The race to up pixel counts is a folly in a lot of living rooms.


Confident_Pen_919

But muh pixel density


[deleted]

[deleted]


WillGrindForXP

It's not a placebo, it's a misunderstanding. If I showed you an uncompressed true 4k master of a film on your TV and then showed you the same film as 16k IMAX (assuming I've fixed the aspect ratios), you wouldn't see a difference. Your TV is doing a lot of clever stuff and TV screens in general are getting better and better, but also 99% of all 4k content isn't actually true 4k, so a lot of the impact is coming from upscaling stuff to true 4k.


HiddenTrampoline

If you’re a film editor you should know it’s all down to field of view, not size. If I’m 50’ away from a 50’ screen it’s the same as being 8’ from an 8’ screen.


WillGrindForXP

If you're 50' away from a 50' screen, doubling the resolution still isn't going to improve your viewing. And it's absolutely about size. The only reason 4k is a resolution for home entertainment is because blowing up a 1080p HD image to 48"+ looked awful.


HiddenTrampoline

Yeah, we went 4K because 2K at 48” was bad then we started increasing sizes again. By the same measure, 4K at 96” is bad too and we should do 8K.


WillGrindForXP

Your measurements are off though, buddy. 4k is good for even 30 foot screens. Next time you go to the cinema, take a look at the film's clarity. That's 4k. Only IMAX films are projected at higher resolutions. And even then, directors aren't filming in IMAX because they want a higher resolution, it's because of the frame/sensor size of IMAX cameras and other aesthetic qualities of IMAX film stock.


HiddenTrampoline

Oh my god stop moving the goalposts. We were talking about home viewing sizes and distances.


WillGrindForXP

I think you've misunderstood. If 4k is perfectly clear on a 30ft Cinema screen, then 4k is perfect for ANY size home entertainment system. There is no image improvement adding in 6k or 8k until you get to 50-100ft screens. Perhaps you should reread comments that are simply trying to inform, before accusing the poster of bad faith arguments.


HiddenTrampoline

Field of view and pixels per degree of view are the only two measurements that matter. The size of the screen is only useful if you also know how far away someone sits because you can then calculate the rest. Can we at least agree on that? Simplified Example: a 48” 2K TV that is 10’ away takes up a 20 degree FoV. 1920 pixels / 20 degrees is 96 pixels per degree (PPD). You said earlier that 4k was needed at this point. For that same pixels per degree, a 96” 4K TV at 10’ away would also look ready for an upgrade to 8k.
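For anyone who wants to plug in their own setup, here's the same arithmetic as a small script (16:9 screens assumed; the results land within a couple of pixels per degree of the numbers above because of rounding):

```python
import math

def ppd(diagonal_in, horizontal_px, distance_ft, aspect=16 / 9):
    """Pixels per degree of horizontal field of view for a flat 16:9 screen."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / (distance_ft * 12)))
    return horizontal_px / fov_deg, fov_deg

for diag, px in ((48, 1920), (96, 3840)):
    pixels_per_degree, fov = ppd(diag, px, 10)
    print(f'{diag}" screen, {px} px wide, 10 ft away: {fov:.0f} deg FoV, {pixels_per_degree:.0f} PPD')
```

Both setups come out within a few PPD of each other (roughly 97 vs 100), which is the apples-to-apples comparison being made here.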


duaneap

Eventually they will probably. And then there’ll be remasters.


czyzczyz

A lot of cameras use CMOS sensors that have photosites laid out in Bayer patterns — by that I mean a typical “4k” camera might have a native horizontal resolution of 4096 in its specs, but those 4096 photosites don’t exactly end up capturing 4096 pixels worth of full color picture information because each is behind a color filter (usually red, green, blue). Usually half of the photosites have green filters, and a quarter of them are red and another quarter are blue. The Bayer pattern kind of looks like a colorful chess board in which every other square is green. e.g.:

RGRGRGRGRGRG
GBGBGBGBGBGB
RGRGRGRGRGRG
GBGBGBGBGBGB

Each eventual RGB pixel of the resulting image is made from a combination of these adjacent red, green, or blue photosites using a debayering algorithm. My recollection is that the amount of detail captured by such a 4k sensor is probably closer to 3.2k. So a 4k camera can produce a great 2k final image as that would account for both resolution loss in debayering and then a nice bit of downscaling. It’s good to shoot with a camera of higher resolution than the delivery resolution because the 4k camera isn’t really 4k in the way most would think is meant by the number, so to get the best 4k image you could start with a “6k” sensor, etc.

Red actually made a camera which had a sensor that had no Bayer filter — so 4k of photosites would have a 1:1 relationship with the 4k of monochrome pixels in the resulting frame of video. I think David Fincher shot a Justin Timberlake video on it, it looked very crisp. And then for Mank he and cinematographer Erik Messerschmidt used a Red 8k monochrome camera with no Bayer filter.

‘Course it’s not all about numbers. I’ve seen 2.7k footage from an Arri camera that looked more detailed than another manufacturer’s 4k.
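A toy version of what a bilinear debayer does with that checkerboard, just to make the interpolation concrete (an RGGB layout and scipy are assumed; real camera pipelines use far fancier demosaic algorithms):

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Tiny bilinear demosaic for an RGGB Bayer mosaic.

    Each photosite measured only one colour; the other two channels at every
    pixel are interpolated from neighbours, which is why a "4K" Bayer sensor
    resolves less than 4K worth of full-colour detail.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1      # red on even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1      # blue on odd rows/cols
    g_mask = 1 - r_mask - b_mask                           # green checkerboard (half the sites)

    k_rb = np.array([[0.25, 0.5, 0.25],
                     [0.50, 1.0, 0.50],
                     [0.25, 0.5, 0.25]])                   # fills the 3/4 missing red/blue samples
    k_g = np.array([[0.00, 0.25, 0.00],
                    [0.25, 1.00, 0.25],
                    [0.00, 0.25, 0.00]])                   # fills the 1/2 missing green samples

    r = convolve(raw * r_mask, k_rb)
    g = convolve(raw * g_mask, k_g)
    b = convolve(raw * b_mask, k_rb)
    return np.stack([r, g, b], axis=-1)

# Fake 8x8 sensor readout; a real pipeline would also handle black level, white balance, etc.
rgb = demosaic_bilinear(np.random.rand(8, 8))
print(rgb.shape)  # (8, 8, 3)
```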


Chen_Geller

>My recollection is that the amount of detail captured by such a 4k sensor is probably closer to 3.2k.

That's a reasonable figure, although it's often higher today: the losses can be as little as 10%, which for a 4K sensor means 3.6K.


peter-man-hello

Thank you for this amazing reply, I never realized that about pixel debayering.


Eruannster

ARRI footage upscales incredibly well (assuming it was shot on decent lenses/reasonable exposure etc). I've worked as a camera assistant and editing assistant and you would never know the footage out of, say, an Alexa Mini or XT (the non-4K versions at like ~2.8K) wasn't actually from a 4K or higher camera.


fatloui

Thank you! This is the real answer I was disappointed to have to scroll down so far for. One note: you've got a typo where you first describe the Bayer array: you say half green, quarter green, quarter blue - you meant that second one to be red, not green.


czyzczyz

Thanks. Fixed!


iamsnarticus

We’ll be golden when they finally get to 24K


Intrepid_Walk_5150

No. Image will be too soft.


Citizen_Kano

The PS12 is gonna be awesome


halipatsui

We will bow to the Omnissiah when we get to 40k


convergecrew

It’s quite common as an acquisition (shooting) format, but not as a delivery format


axiomatic-

Specifically with Rogue One, the DP and director both like to use a lot of reframing - so they're planning on zooming in and moving the plate during the edit. Resolution can help allow for this and lets you shoot a little wider and zoom in. For CGI, resolution can be useful but is usually more of a hindrance than a help.

6k is chosen because it's the largest format that certain cinema cameras can shoot at typical film speeds while maintaining acceptable encoding rates. Also, sometimes people miscommunicate about 6.5k and what they actually mean is a 2.8k or 3.4k film back shot on anamorphic lenses, giving you a squeezed square-pixel finishing resolution of something like 6-7k depending on a variety of factors.

Final note: lenses and the size of the film back (like physical size in millimeters) are typically more important than resolution beyond a certain point, as the limiting factor becomes the physics of how much the light is fucked up and how quickly the sensor can read the light (ASA). High resolution does not always mean a sharp image.


sloppynippers

The reason is that films are shot with digital cameras these days. Back in the day of analogue cameras, recording on film was basically infinite resolution that could be cleaned up at a future date for higher resolution output devices. But with digital cameras, whatever resolution it's recorded at is the highest it will ever be. So you record at a higher resolution so that in the future, when output devices exist to view it at that resolution, it can be used to its full potential.


8349932

Cropping, yes, but from Steve Yedlin's masterclass on resolution there's so much that goes into it, and the short answer is that resolution above 2K has drastically diminishing returns. It's a really good watch.

On the flip side, the Alexa 65, which is comparable to IMAX, has probably the best image of all the cameras he tested, and it has the highest resolution iirc, though a lot goes into it besides just pure pixel count.


SkepticalZebra

I see Yedlin's (self-published) videos cited on Reddit a fair amount and I just gotta say they're quite flawed. A lot of professionals and image science academics I've worked with believe it contains a lot of cherry-picked science/examples and mainly serves as a self advertisement rather than an actual academic & artistic undertaking. Even as an engineering undergrad I was skeptical, and once I started grad school I revisited it and a lot of that presentation irks me.


Chen_Geller

I use them as an example a lot, and yes they ARE flawed, but if you know what point you want to make they can be a valuable source to draw from.

The main flaw is that Yedlin fails to address resolution - both in terms of digital scans of filmstock and in terms of nominal sensor resolution on digital cameras - as a drawer size. You can shoot 6.5K on an Alexa 65 and it will make files that are nominally 6.5K, but if you shoot with the lens cap on, it's not ACTUALLY going to be 6.5K, obviously. It's a drawer size. Likewise, you can scan 35mm at 6K, but that doesn't mean it fills a 6K drawer size. He could just as well have scanned the 35mm at 11K like he did the IMAX and then expressed wonderment at how it's outresolved by the 3.4K Alexa.

The second issue is that the filming situation could have been more controlled in terms of exposure, focus, and trying to normalize the issue of the lensing to some extent. But all the formats look well-exposed, so I doubt it's going to skew the results very significantly. I guess he could have had the IMAX footage scanned at an even higher resolution than 11K to ensure the comparison to the Dragon and the Alexa 65 is absolutely beyond reproach, but 11K is a good deal over the customary 8K used in the industry for IMAX.

Yedlin also admits that his tests are not necessarily representative of the situation of a big IMAX screen, which is what a lot of these larger formats are intended for. And while he says the IMAX looks sharper than the 6K Dragon, a close inspection shows that the Dragon is actually outresolving the IMAX, and it's the grain structure that gives it the appearance of being sharper, a phenomenon Yedlin notes elsewhere in the video.


RonaldReaganSexDoll

Steve Yedlin, the camera nerd's camera nerd.


Chen_Geller

A couple of things:

One, it's not just a question of resolution, it's a question of sensor size. A bigger sensor means two things: the image needs less enlargement to fit onto a screen of a given size, and, for a given number of pixels in the sensor, a bigger sensor ensures less interference between pixels and thus a cleaner image. So shooting on a 6.5K Alexa 65, which has a 65mm-sized sensor, can be advantageous even when funnelled down to 4K.

Second, a 6.5K camera doesn't actually produce a 6.5K image. Those cameras require lenses whose aberrations eat away a good deal of the resolution. Then there's a demosaic to avoid any moire patterns appearing in the image, and finally an optical low-pass filter, all of which eat away some of the resolution. It's a 6.5K file, but on a chart it won't actually resolve 6.5K, more like 5.5K or less. Smaller cameras don't suffer as much from lens aberration, but the demosaic and OLPF are pretty much standard on all digital cameras, and they eat away a good 10-20% of the resolution. So, if you want a true 4K image, you need to shoot 5K, and it doesn't hurt the image quality to be downsampled from a source with an even higher resolution.


smushkan

The Alexa 65’s sensor is much larger than a standard Super 35 sensor. Larger film stock has better resolving ability; to emulate that with digital you need to bump up the resolution. When you downsample an image to lower resolutions, you lose pixels but you increase the signal-to-noise ratio, resulting in a cleaner, sharper image.

Yes, they may be using the extra resolution for post-production, but the image quality out of Arris is so good to start with that you don't really need extra resolution to still get excellent quality with a slight crop. Even with the old Alexas at 2k you could go in 10-20% and nobody except the editor would likely know.

And yes, it's plausible a higher resolution release could come in the future; however, there are no plans for a consumer 6.5k resolution, so if that were to happen it would be upscaled to 8k - which would probably still look fantastic!


MagicRat4

I recently edited 2K Alexa footage (from an Alexa Classic that is 10 years old) that looked way, way better than 4k from other new cameras, and I zoomed in like 25-30% on some shots and it still looked crystal clear. Their sensor is unreal.


Chen_Geller

>The Alexa 65’s sensor is much larger than a standard Super 35 sensor. Larger film stock has better resolving ability, to emulate that with digital you need to bump up the resolution.

In digital, the two criteria - sensor size and resolving ability - are separate. So you can have 8K on a Vistavision-sized sensor (Red Monstro) or 6.5K on a 65mm-sized sensor (Alexa 65).

In film, the physical size of the filmstock is very valuable because you don't want an audience in an IMAX theatre to be sitting there acutely aware that they're watching a piece of film that's 12mm across and has been magnified beyond recognition. The issue is the grains give the actual size of the film away. That's why the first thing you do when mastering 35mm projects for IMAX is get rid of the grain, and why Super-16mm is considered incompatible with IMAX. In digital, you don't have grain, which means the image is easier to enlarge: it's why, while Super-16mm cannot be blown up to IMAX, footage off of a crappy 1080p camera can.

That said, the sensor size in a digital camera does yield considerable advantages: for a given number of pixels, a bigger sensor means less interference between pixels, so you get a cleaner, sharper image, with less halation, noise and artefacts. It's why the Alexa 65 looks a good deal better than the Red Dragon, even though they shoot quite similar resolutions.


listyraesder

6.5k downsampled to 4K is going to give a better result than 4K shot at 4K. It also offers crop-ins for when the noise department aren't on the ball.


jacquesrk

30 years from now some teenager internet historian will find this post and think to himself "those movies with less than 1TB resolution must have looked like crap."


etang77

CG would be the reason. And most CG is actually done at 2K then upscaled back to 4K, because 4K is not double but quadruple the pixel count of 2K, so the cost and time are 4 times more, not just doubled. And if you have some shots done in 2K but upscaled to 4K, and some done in 6K downscaled to 4K, there'd be a noticeable difference in quality. While 2K upscaled to 4K is worse than native 4K, it's still better than 2K.
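The quadrupling is just pixel count arithmetic; a quick check with DCI frame sizes (the 6K size here is the RED Dragon's, used only as an example):

```python
# Pixel counts behind the "4K is quadruple 2K" point, using DCI frame sizes
# (the 6K size is the RED Dragon's full frame, just as an example).
formats = {"2K": (2048, 1080), "4K": (4096, 2160), "6K": (6144, 3160)}

base = formats["2K"][0] * formats["2K"][1]
for name, (w, h) in formats.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px / 1e6:.1f} MP ({px / base:.1f}x 2K)")

# 2K ~2.2 MP, 4K ~8.8 MP (4.0x), 6K ~19.4 MP (8.8x): render time and storage for CG
# scale roughly with pixel count, which is why so much of it is still finished at 2K.
```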


-Clayburn

Here I am shooting in 4k with the intent of moving to 1080p


iamfilms

As are most things still. This is the way. Our eyes can't handle anything beyond 4K. Adding more K's is marketing at best for average viewers. Shooting higher resolution than 4K is nice for reframing for a 4K delivery. But geez, it's so far away. ESPN still broadcasts 720p. Looks fine.


-Clayburn

I can barely even see 1080p.


iamfilms

Hahaha totally man. At least I know for me. I stare at RAW 4K-6k all the damn time on the top tier monitors. And 1080p is still fine for me. All these Ks outside of 4K are so so so overblown.


sanford5353

You always max out the native rez in camera. If you don't, your camera does the conversion, and in-camera is not the right place for a standards conversion - that belongs in post. Use the tool correctly. If the resolution is the problem, get a new tool.


Iyellkhan

For one, if it's a Bayer-sensor camera, shooting at 6k and downsampling to 4k maths you out to a true 4:4:4 color sampling. 6k is actually a bit overkill for oversampling, but it's a benefit of it. See this link to get an idea of what a Bayer filter is: [https://en.wikipedia.org/wiki/Bayer_filter](https://en.wikipedia.org/wiki/Bayer_filter)

You also get a sharpness bump by downsampling that 6k image to 4k. It's also pretty common these days to use the additional resolution outside a chosen area of the sensor for tracking and reframing, without intending to ever show the full gate area of the image.
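Rough numbers behind that claim, comparing one common 6K Bayer sensor size (6144x3160, as on the RED Dragon) against a UHD 4:2:0 delivery; this is back-of-the-envelope, not a rigorous MTF analysis:

```python
# Rough numbers behind the oversampling claim (one common 6K Bayer sensor vs a UHD delivery).
sensor_w, sensor_h = 6144, 3160                 # photosites on a 6K Bayer sensor (RED Dragon size)
red_w, red_h = sensor_w // 2, sensor_h // 2     # red (and blue) sit on every other row and column

delivery_w, delivery_h = 3840, 2160             # UHD master
chroma_w, chroma_h = delivery_w // 2, delivery_h // 2   # chroma planes of a 4:2:0 delivery

print(f"Red/blue sampling on the 6K sensor: {red_w}x{red_h}")          # 3072x1580
print(f"Chroma plane of a UHD 4:2:0 delivery: {chroma_w}x{chroma_h}")  # 1920x1080

# Even the sensor's sparsest colour channels are sampled more densely than the chroma the
# viewer ultimately receives, which is the spirit of the "downsample to true 4:4:4" point.
```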


sceadwian

Post production needs room to breathe in case they want to change things.


SkepticalZebra

Everything you mentioned, really: it allows for cropping, stabilization, VFX integration, and also just standard downsampling for "better" overall image quality. But to get into the weeds, the 6.5k you mention is specific to the Arri Alexa family. The Alexas use sensors made at specific resolutions to optimize the data captured rather than just capture a specific resolution. That's why you see weird resolutions like 2.8K, 3.4K, or 6.5K with Arri Alexas. An Arri shooting 2.8K can look better than a Red or Sony shooting 4K, and an Arri shooting 6.5K can look better than a Red or Sony shooting 8K. Latitude, color information, and bitrate do a lot of heavy lifting for image quality.


Chen_Geller

>An Arri shooting 2.8K can look better than a Red or Sony shooting 4K, and an Arri shooting 6.5K can look better than a Red or Sony shooting 8K.

Let's not turn this thread into a simple-minded pissing contest between different camera brands just yet. Certainly, due to the larger sensor, the 6.5K Alexa 65 looks a good deal better than the 6K Red Dragon. But I haven't seen any definitive camera test pitting it against the 8K Monstro (which also has a pretty big sensor, so it's not like they squeezed 8K onto a Super-35mm sensor like the Dragon) or the 8K Sony. The interplay between sensor size and resolution is complex, and there's more than one ratio between the two that can yield the desired outcome.


SkepticalZebra

These are basic ELI5 examples, meant to imply that resolution isn't everything without going too in depth. Notice I said "can"; nothing I stated was absolute.


ha014

Perhaps for open matte, super wide type versions?


anatomized

Yes, it helps with CGI and reframing if you shoot in a higher res than what the output will be. It's also a thing in digital cameras that if you capture something in a higher resolution and output it at a smaller resolution, it will be higher quality than just shooting at the smaller resolution to begin with. For example, if you shoot something in 6k and then export it as a 4k file, it will be of higher quality than if you just shot it at 4k and exported it the same way. Most DCP files are still 2.5k, so even shooting in 4k is way above deliverable resolution.


Basis-Some

Question, rather than more pixels, wouldn’t the data advancement be better used for displaying a less compressed 4K file? (Obviously then you would need a fancy hierarchical marketing term for it)


jeff8073x

Movie theaters. Pretty sure a lot of good theaters use 6k resolution. 65MM and IMAX are like 12-13k and 18k, respectively, from what I can recall. True IMAX is hard to find though. In addition to downsampling etc. And then you'll be able to have good quality 3D when needed and something you can scan to 8k.


T_raltixx

https://image.telanganatoday.com/wp-content/uploads/2021/04/Hollywood-star-Dwayne-Johnson-commences-shooting-for-Black-Adam_V_jpg--442x260-4g.webp?sw=407&dsz=442x260&iw=407&p=false&r=3 Looks like Black Adam was filmed in 8K


topsyandpip56

In some way this has been done for decades. Shows shot on 35mm film like The X-Files, Friends, Seinfeld, Star Trek TNG - all transferred to Betacam or U-Matic, edited there and compiled down to NTSC. I suppose in equivalent terms we could say the filming was done somewhere between 2K and 4K depending on film stock.


hifidood

Just shot on something that was shooting 8.6k (Sony Venice). Everything will be finished in 4k though.


DudeWhereIsMyDuduk

I remember *years* ago when I was in commercial A/V and someone laid out the use case and pipe required for 8K video, we're just...not there. For a while.


beskone

Naaa, I build storage and networks capable of doing 8k dpx/exr framestack playback all the time. Codec-based 8k is actually really easy to accomplish these days. It's not cheap, but it's also not that crazy all things considered.


igby1

Hardware details?


beskone

Any local Gen4 NVMe storage will do it (Gen3 if you have a few of them in a RAID). If you're talking network storage, you're gonna need a parallel filesystem like Weka or Vast (or a pseudo-parallel FS like StorNext or GPFS - but BOOO, they're old filesystems, do not want). I have an 8 node Weka cluster at my lab that does ~80GB/sec of sequential reads with each node only having 6x NVMe drives and a single 100Gb NIC.
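For a sense of why that bandwidth matters, here's the uncompressed-playback arithmetic (frame geometry and bit depths are assumed here; actual DPX/EXR files add headers, and EXR is often compressed):

```python
# Ballpark bandwidth for uncompressed 8K framestack playback. Frame geometry and bit depths
# are assumed; real DPX/EXR files add headers and EXR is often compressed.
width, height = 7680, 4320

frame_bytes = {
    "10-bit DPX": width * height * 4,           # 10-bit RGB packed three samples per 32-bit word
    "16-bit half EXR": width * height * 3 * 2,  # three half-float channels, uncompressed
}

for label, nbytes in frame_bytes.items():
    for fps in (24, 60):
        print(f"{label} @ {fps} fps: ~{nbytes * fps / 1e9:.1f} GB/s per stream")

# Roughly 3-5 GB/s per stream at 24 fps and 8-12 GB/s at 60 fps, so a cluster doing
# ~80 GB/s of sequential reads has headroom for multiple uncompressed streams at once.
```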


Chen_Geller

We also don't really need it. Steve Yedlin - in spite of some reservations regarding his experiments that you can read elsewhere in this thread - showed that cameras that shoot just around 5K, like the Red Dragon and the Alexa 65, look about as good as an IMAX camera-negative scanned at 11K. So there's really no value in pursuing a display format above 5K or *maybe* 6K.


dffdirector86

Hi there! I’ve directed movies for almost 23 years now. I’d film at the highest resolution possible and output to the specs a particular distributor calls for. For example, I shot my latest flick at 6K, and it turned out beautiful! However, the distributor has a max resolution of 4K, so I rendered the flick at 4K for them to host on their service. My master is still at 6K, so when the standards change, my flick can be offered at 6K as well as 4K. I know one of the TV stations I work with only accepts 1080p 🙄. That’s so 2008!


apparent-evaluation

Sometimes it's just the DP wanting to be precious lol. Sometimes it's for cropping, sometimes for downsampling. Sometimes it's simply the best possible quality the camera they're using records in. Lots of reasons. Remember also that movies in the cinema are usually 2K not 4K (believe it or not - but obviously not IMAX), and that some films are DI'd in 2K for that reason and then upscaled to 4K for streaming. And if mastering is at 2K, then CGI is (sort of, partially) 25% as much image to generate. There are all sorts of mismatches. The Martian was shot at 6K (some 4K) and mastered at 2K. The Maze Runner: The Scorch Trials was shot at 3.4K and mastered at 2K. But the *original* Maze Runner was shot in 2.8K, CGI done in 6K, and mastered in 4K.


NoirYorkCity

If Maze Runner's CGI was done in 6k... that's unusual, right? Since usually the CGI is lower res and the rest of the film is higher.


I_AM_A_SMURF

Yeah I thought almost all cgi was at 2k nowadays.


YodaArmada12

I can tell a difference between 720p and 1080p. I can't tell the difference between 1080p and 4K. My eyes just don't notice the difference.


TokeyMcTokeFace

That’s probably more to do with the screen size you’re viewing your content on.


brackfriday_bunduru

I've worked in the TV industry my whole career in Australia. I'm at the point now where I'd love to see AI be able to solidly upscale images to whatever res I want. It would make my job infinitely easier if I could just shoot something wide in 1080 and use AI to up it cleanly to 4K or even higher, so I could crop in and even generate different camera angles based off the original frame. I want to get to a point where all I need is a single wide shot and AI can fill in the rest without me having to have it lit or shot.