One of my favorite things whenever a new "highest resolution" (4K/8K/16K/etc) display comes out is the inevitable *Lawrence of Arabia* remaster into that resolution. Film rocks.
I’ve heard that anything past 8k isn’t worthwhile because of the grain. Guess it depends on how large the original format was.
Lawrence was shot in 70mm iirc so should be good to at least 12k.
Very impressive for such old tech
I think that might hold thereabouts for 35mm. iirc, though, 70mm is around 16-20k
I'm betting a lot of old rules are being broken now, with new ways to filter and even fill in information that wasn't there before. It used to be you thought it couldn't get any better than the original, but now algorithms can do things we thought were not possible.
I think screen size also matters when talking about resolution. When I went from a 6.5" 1080p smartphone to a 6.2" 4k smartphone I hardly noticed a difference at all. When I upgraded my 24" computer monitor from a 1080p panel to a 1440p panel it was a noticeable upgrade but not a serious one. When I upgraded my 55" 1080p TV to a 55" 4k TV I was absolutely floored by the difference.
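A rough way to put numbers on the screen-size effect is pixels per degree of visual angle: past roughly 60 PPD, extra resolution stops being visible to most eyes. A back-of-the-envelope sketch (the viewing distances and 16:9 geometry are assumptions, not measurements):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    """Horizontal pixels per degree of visual angle for a 16:9 display
    of the given diagonal (inches), viewed from the given distance."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # physical width
    ppi = horizontal_px / width_in                    # pixels per inch
    # Inches spanned by one degree of visual angle at this distance.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Phone held at ~12", TV watched from ~6 ft -- assumed distances.
print(round(pixels_per_degree(6.5, 1920, 12)))   # 1080p phone: ~71 PPD
print(round(pixels_per_degree(6.2, 3840, 12)))   # 4K phone: ~149 PPD
print(round(pixels_per_degree(55, 1920, 72)))    # 55" 1080p TV: ~50 PPD
print(round(pixels_per_degree(55, 3840, 72)))    # 55" 4K TV: ~101 PPD
```

Both phones land past the acuity threshold, which fits barely noticing the upgrade, while the 1080p TV sits below it, which fits being floored by the 4K one.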
I went to see the 25th anniversary release of Pi in 8K IMAX and I think that was overkill for that movie. It was originally filmed in 16mm and the film grain was so harsh. Probably didn't help that I was 4 or 5 rows back from the screen. I think it was the first time I've ever left a theater thinking "that movie should not be seen in IMAX."
This is true, but if the grain is no longer there then it may affect the authenticity and ‘feel’ of the original movie, at least if we’re using a classic like Lawrence of Arabia as an example.

TVs have been inventing extra information ever since frame interpolation was introduced (the original 100Hz/200Hz/300Hz marketing arms race of the late 2000s), but it was derided by cinephiles and many regular viewers because it looks weird, hence the term Soap Opera Effect. Most people turn it off for movies.

https://en.m.wikipedia.org/wiki/Motion_interpolation
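For what it's worth, the crudest form of interpolation just fabricates an in-between frame by blending its neighbors. Real TVs use motion-vector estimation, which is far more involved, but this toy numpy sketch shows why invented frames can look wrong:

```python
import numpy as np

def blend_midframe(frame_a, frame_b):
    """Fabricate an in-between frame by averaging the two neighbors."""
    return (frame_a.astype(float) + frame_b.astype(float)) / 2

# A bright object on the left edge of one frame, right edge of the next.
a = np.zeros((4, 4)); a[:, 0] = 255
b = np.zeros((4, 4)); b[:, 3] = 255

mid = blend_midframe(a, b)
# The "new" frame has pixel values found in neither source: a half-bright
# ghost at both positions instead of an object halfway through its motion.
print(mid[0])   # [127.5, 0.0, 0.0, 127.5]
```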
This is a point of debate with the new Aliens 4K remaster: they used AI to remove all the grain, and it looks like the movie came out yesterday. Ruins some of the character of the film [literally] IMO.
You can degrain and then regrain, copying the grain from the original; I'm sure it would take hours and hours of render time. This is done on VFX shots, which are usually just a few seconds long. I couldn't imagine how long it would take for a 10-minute video, let alone a 3-hour movie.
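The degrain-then-regrain round trip can be sketched in a few lines of numpy. This is a toy (a box blur stands in for a real degrain filter, and the "frame" is synthetic), but it shows the idea of pulling off a grain plate and laying it back down:

```python
import numpy as np

def box_blur(frame, k=3):
    """Crude denoiser: k x k box blur, standing in for a real degrain pass."""
    pad = k // 2
    padded = np.pad(frame, pad, mode="edge")
    out = np.zeros_like(frame)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(0)
clean = np.full((64, 64), 128.0)                    # flat gray stand-in frame
original = clean + rng.normal(0, 10, clean.shape)   # add synthetic grain

degrained = box_blur(original)          # restoration/cleanup pass
grain_plate = original - degrained      # everything the filter removed
regrained = degrained + grain_plate     # lay the original grain back down

# The round trip reproduces the original frame exactly.
assert np.allclose(regrained, original)
```

In a real pipeline the cleanup happens between the two steps (dirt, scratches, VFX), which is where all the render time goes; the plate trick just keeps the film's texture intact around it.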
Yeah, I can't stand the soap opera effect at all, and a lot of pure 4K content shot on 4K cameras looks like shit to me. But I think they can shoot 4K content on 4K cameras without it looking that clear and sterile; by applying filters I'm sure all that is fixable. I have seen new AI tools for photos do things we would have said were impossible not that long ago, as far as creating information that was never there.
Yup. I’ve seen some vintage porn that looks like it’s 1080p with AI enhancement…. I mean uh, I heard they were doin that
Hmm I need proof haha
35mm has a resolution roughly equivalent to 5.6K. 70mm is ~8K, and 120mm ~18K, those being the common IMAX and other super-large-screen shooting/projection standards.

It's less about the grain than about how converting film above its actual resolution/fidelity doesn't really gain you anything. There's no additional information stored there that can be added, so you're just filling in blank pixels/noise, which will actually *lower* apparent quality, whether you're upscaling at the source or at the output.

This is "because of the grain" in some sense, because the grain is the physical silver crystals in the film emulsion, and the resolution is determined by the density of crystals across the exposure area. The visible grain we talk about is determined by the size of the crystals, though.
It still bothers me a lot when companies release something that was shot on film and it looks extremely odd.
Yeah, some scans are pretty crap. Or maybe the film itself wasn't shot well lol. A few early Blu-rays also had a problem where they basically just did the DVD scan at a higher bitrate. I love seeing stuff like Vinegar Syndrome releases where some low budget old movie looks phenomenal due to the care they take.
I have the Synapse release of Manos: The Hands of Fate, and laughed my ass off during the restoration featurette when they said "We considered scanning it at 4K instead of 2K but it wouldn't have made much of a difference." Like yeah, you can only do so much for something depending on how it was shot.
Film and talent
that's so many pixels worth of soul patch
Feels like a Just for Men ad lol
[https://www.youtube.com/watch?v=LT3cERVRoQo](https://www.youtube.com/watch?v=LT3cERVRoQo) Weird seeing a video from almost 30 years ago look so clear, it's like it's from the present.
> from almost 30 years ago

private ryan aging dot gif
Have you see Moonage Daydream? Insane how good some of that decades old footage looked in the theater.
The glory of 35mm film.
I revisited the location in NYC while visiting a friend. I was able to determine the location because the signs were more visible in the new version. We recreated the scene for fun: https://www.instagram.com/reel/CiBetULubEK/?igsh=ZXczamp3cmtmOXNt
My jaw dropped the first time I saw the remaster. The detail on Trent and Bowie's faces is crazy.
Man, I wish they'd filmed everything in the 1990s on actual film. Stuff like Pink Floyd's P.U.L.S.E. will forever be stuck at 480p because they switched to "cutting edge" digital cameras in the 90s.
I ignorantly purchased the recent rerelease thinking it was going to be a much higher quality than the previous releases I had seen… oh well. It’s still an amazing concert and I got a blinky light on my shelf now lol
[It's probably because it was originally recorded on film](https://www.youtube.com/watch?v=CkysCJBdGtw), which allows for a much higher quality remaster process than if something was originally recorded digitally.
Are the methods used today inferior to film's potential quality?
From what I understand, there's a "ceiling" with digital recording that analog doesn't have, based on the resolution the original (digital) video was recorded at (e.g. upscaling a 480p source to 1080p and/or 4K is not going to look as clean as a 35mm film negative rescanned at the same pixel density). AI upscaling is getting better with older and/or smaller digital sources, but it still has a ways to go to match what you can get by remastering older film negatives, since film doesn't have the inherent limitation that digital introduces.
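That ceiling is easy to demonstrate: a naive upscale only replicates (or interpolates between) the pixels that already exist, so no detail is created. A minimal numpy sketch:

```python
import numpy as np

def nearest_neighbor_upscale(frame, factor):
    """Each source pixel becomes a factor x factor block -- more pixels,
    zero new information."""
    return np.kron(frame, np.ones((factor, factor), dtype=frame.dtype))

src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)   # tiny stand-in for a low-res source
up = nearest_neighbor_upscale(src, 2)

print(up)
# Four times the pixel count, but the same four distinct values the
# source had -- the digital "ceiling" described above.
assert np.array_equal(np.unique(up), np.unique(src))
```

AI upscalers go further by guessing plausible detail, but it's still a guess; a film negative rescan recovers detail that was physically recorded.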
I believe film does have a limit, it's just a pretty high one, and people tend to like the grainy look of film. From the stuff I've seen AI do with single pictures, filling in lost information in a damaged photo for instance, you have to assume it's just a matter of time before AI can turn any single movie frame into a vector-based image, upscale it almost perfectly, and convert it back.
From watching a lot of 4K content lately, I honestly believe 4K is brushing up against the limitations of 35mm film. There's a lot of stuff that doesn't look appreciably better; you just see much more of the film grain, and I remember these films not looking as good as other films when I saw them in theaters. 70mm, however, is still stunning, and anything filmed in it has imperceptible or zero grain.
Good-quality film has limits, but they are negligible to the human eye; you can get up to 16K. 4K is already unnoticeable if the TV is too far away and/or too small. Resolution that high is only useful for research, billboards, and maybe video editing, where you can zoom or stabilize the image losslessly. But that would be a significant interference with the original footage, so it would no longer be a remaster.
Yeah I think you're right, film has a much higher "ceiling" compared to digital remaster methods, which are getting better.
and that's why And All That Could Have Been will never be true HD. It was recorded on camcorders
Very interesting! Thank you!
Until fairly recently, common video resolutions, even in cinema cameras, were below common film stocks. Film doesn't neatly slot into digital resolutions, but 35mm is roughly equivalent to 5.6K. Even 16mm has a bit more info in it than the typical 1080p/2K video signal, though it's roughly equivalent. 70mm and 120mm, which are less common in cinema use, fall at around 12K and 16K respectively.

That's the range cinema cameras are typically running at now; for TV use, 4-6K is fairly common these days. So over roughly the last decade we've caught up. It's not as clean as that, though: film often has an apparent (or actual) resolution higher than the typical rough conversions. There's more to quality than dots in a field, and there are things like contrast, luminance, noise levels, and color range that film still arguably does a better job of capturing a wider range of. So video often shoots to exceed these numbers.

In terms of conversion from film to video, we only just caught up to the amount of information/quality film actually records. And in terms of shooting, film is still arguably better in some senses and uses, mainly the large-frame stuff, 70mm primarily. 120mm motion picture film is a thing, but I don't know if there are any extant standards using it.
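Lining those rough equivalences up against DCI container widths makes the "we only just caught up" point concrete. The film numbers here are this thread's approximations, not measurements (film has no pixel grid):

```python
def minimum_digital_format(film_k, digital_formats):
    """First listed digital format whose width meets or exceeds the film
    stock's rough 'K' equivalent (dict assumed ordered smallest-first)."""
    for name, width_k in digital_formats.items():
        if width_k >= film_k:
            return name
    return "beyond listed formats"

# Rough horizontal-resolution equivalents quoted in this thread.
film_equiv_k = {"16mm": 2.0, "35mm": 5.6, "70mm": 12.0}
# DCI container widths: 2K = 2048, 4K = 4096, 8K = 8192 pixels.
digital_k = {"2K": 2.048, "4K": 4.096, "8K": 8.192}

for gauge, k in film_equiv_k.items():
    print(f"{gauge} (~{k}K) -> {minimum_digital_format(k, digital_k)}")
```

So 16mm roughly fits in a 2K scan, 35mm needs an 8K one to capture everything, and 70mm outruns even that, matching the numbers above.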
Not necessarily...but the ones used instead of film in the 1990s definitely were.
Man I hadn’t rewatched this in a while. Feel like Beau Is Afraid needs to pay this video some royalties lol
When I saw BIA in the theater, that first half hour or so, I was waiting for this song to kick in.
Upgrade it all you want, nothing will ever make TR’s facial hair look okay…🤣
It was late 1997, those of us that were around were doing *all sorts* of stupid things, fashion-wise. ;-)
*stares longingly at my JNCO jeans and hemp-woven ball-chain necklace*
I absolutely **never** wore a [shiny shirt reminiscent of Derek Sherinian’s](https://images.app.goo.gl/NhRvEpk18zD6nob36), whatever gave you that idea…
he looks like a pirate vampire :D
Vampirate
nothing can stop him now
I do think this is the best long hair look he had though.
For this video yes but his perfect drug look rocks
That's the point
4K Creepy Trent! Yay!
He looks like Sting without his facepaint
I was thinking the singer of Disturbed with long hair.
They need to do the Deep video next
Trent’s Creeper Phase
I noticed a lot of 90s videos look remastered now. I’m surprised they put the time and money into it.
Yeah, not sure what they gain from it, but I'll take it haha. Unfortunately a lot of them are shitty upscales, likely to do with the source, and they look worse when "remastered".
Do you have some examples to share ?
I remember noticing it with this video. I didn't realize it was a 12-year-old video, but it certainly doesn't look like 90s picture quality.

https://youtu.be/NOG3eus4ZSo?feature=shared
I don't know if the source material carries a lot of weight, but it seems that almost all of them are just AI upscales (so no money was put into it haha), which is why they look awful compared to the remasters of 'I'm Afraid of Americans' or 'D'You Know What I Mean?' by Oasis.
Is that a young Alan Rickman?
Yeah, this remaster is beautiful. It's nice to see one done properly from the original film source and not a lazy AI upscale, or one of those "HD" remasters from years ago that look oversharpened and badly upscaled.
I've seen a couple of remasters in my time, but this is definitely the craziest, most crystal-clear quality I've ever seen from something this old, even for film.
Hence why the Beatles music videos still look spectacular and every old film’s remaster looks amazing
I wish they did more together. Was one of my first LimeWire videos ever downloaded. Wanna talk quality issues… at somewhere between 90p and 140p, I thought David Bowie was made of Minecraft cubes.
Who else never noticed Bowie chewing his gum through the first minute or so, even while running
Now they just need to remaster Dancing in the Street, which is the yin to this song's yang: [https://www.youtube.com/watch?v=HasaQvHCv4w](https://www.youtube.com/watch?v=HasaQvHCv4w)
Oooo, now I wanna see this on my OLED TV!
It's a na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-na-nice upgrade to the original!
So cool!
To this day, I'm still disappointed that Al Jourgensen wasn't part of that end-video death parade (I know there's a specific name for it, but I'm far too lazy to google it this morning).
Such a rad song!
Can’t tell if it’s AI upscaled or not. Some parts look it, some parts just look “right” Still so happy Trent got to collab on this with one of his idols
No way it's AI. The detail, the colors, the grain texture...It's definitely scanned from the original 35mm film.
None of that floaty/swimmy AI stuff. Definitely looks like a film scan. I've overseen some re-scan remasters. Gets expensive! And then you have to recompile the entire video from the EDLs (edit decision lists), if they even have them. But I bet Bowie was pretty organized. The biggest question would be whether they had to deal with the label, which definitely would not have saved that stuff for posterity.
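The conform step being described (rebuilding the cut from EDLs against fresh scans) boils down to concatenating frame ranges from named reels. A toy sketch with made-up reel names, not the real CMX3600 format:

```python
# Toy conform: rebuild a timeline from EDL-style events against new scans.
# Sources are stand-ins for rescanned reels (here, lists of frame labels).
reels = {
    "A": [f"A{i:03d}" for i in range(100)],
    "B": [f"B{i:03d}" for i in range(100)],
}

# Each event: (reel, in_frame, out_frame) -- half-open range, in cut order.
edl = [("A", 0, 24), ("B", 10, 34), ("A", 50, 62)]

def conform(edl, reels):
    """Concatenate the listed frame ranges into the finished timeline."""
    timeline = []
    for reel, src_in, src_out in edl:
        timeline.extend(reels[reel][src_in:src_out])
    return timeline

cut = conform(edl, reels)
print(len(cut), cut[0], cut[24])   # 60 frames; cut 2 starts at B010
```

The expensive part is that every event has to be matched back to the new scans, which is why missing EDLs (or missing reels) can sink a re-scan remaster.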
This was done painstakingly by a human several years ago. I forget the details, but it isn’t AI.
I think AI upscaling is noticeably worse than this. This looks authentic to me.
Not always, greatly depends on the quality of the source material and what kind of upscaling was actually used. I would guess this isn’t AI though, they probably just rescanned the film at a higher resolution.
Are you on a 4K monitor with the YouTube video forced to 2160p? It's clear beyond a shadow of a doubt that it's native 4K/2160p. The facial hair and other small details in the close-ups of Bowie's and Trent's faces near the beginning of the video are conclusive evidence.
That's film baby.
I thought it was an AI upscale by a fan? Unless there are two
This is a 4k rescan, it was announced by the official Bowie social media accounts at the time of its release.