Antialiasing on HDR in DX9/XP -- Debunked
Phopojijo
August 29th, 2007, 06:30 PM
There are many claims that you can achieve antialiasing on HDR pixels in DX9. I'll try to explain why it's just not possible: deferred shading doesn't give developers enough control in DX9 to shortcut HDR+AA at any reasonable bit depth (32- or 64-bit FP HDR).
Recently the argument picked up some misinformation when people "hacked" antialiasing into BioShock in DX9 mode on XP simply by forcing it in the drivers.
Here is one such "success story" in screenshot form.
http://img.photobucket.com/albums/v342/Phopojijo/bioshockaa.jpg
I've drawn on it to mark examples of what the problem is.
The blue arrows mark portions of the frame where the forced antialiasing worked. Note that those sections have little to no bloom. The drivers were able to blend the pixels using typical multisampling methods, producing the 8x antialiasing we all know and love.
The red arrows mark portions where no antialiasing was applied to the pixels. It was all rendered in the same frame, so what went wrong? Those pixels were non-negligibly influenced by bloom, so the drivers could no longer use their tricks and no antialiasing happened.
The edge with both blue and red arrows pointing at it is the perfect example. Near the bottom there is perfect antialiasing, while just near the upper-middle there is absolutely none. The same edge transitions from antialiased to not antialiased within the same frame.
So, in summary: in DX9 you will not be able to antialias your HDR framebuffer. You might fluke out and smooth some edges... and if that's worth the hit for you, then you're certainly allowed to drop your framerate somewhat to get the pretty smooth edges. BioShock is mostly dark with little bloom; some games use more bloom, and the lack of antialiasing will be more prominent there.
But certain shader outputs still will not be antialiased. That will have to wait for DirectX 10.
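If you want to poke at the driver-side half of this yourself, here's a minimal sketch in C++ (my own illustration, assuming a plain D3D9 setup and an already-created IDirect3D9 interface; not code from any shipping game). It asks the runtime whether multisampling is even exposed for a 64-bit floating-point render-target format; on a lot of DX9-era hardware/driver combinations this check fails, which is part of why forced AA only touches the ordinary low-precision surfaces.

#include <d3d9.h>

// Ask D3D9 whether 4x multisampling is available on a 16-bit-per-channel
// floating-point surface -- the kind of buffer 64-bit FP HDR renders into.
// Error handling trimmed for brevity.
bool SupportsHdrMsaa(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // 64-bit FP HDR render-target format
        TRUE,                      // windowed
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}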
Footnote: Vista is a hard sell... there were so many enhancements done to futureproof it, but people don't notice, since it ultimately runs a bit slower due to its larger memory footprint. That's my logic for why Microsoft focused on the user interface... it's about the only change people will notice when they boot into Vista. But it's not the only change that occurred... the rest are mostly developer/hardware-vendor toys.
Tweek
August 29th, 2007, 06:49 PM
HDR and AA never liked each other.
I don't use AA; I run at huge resolutions, so I don't need it.
Thanks for the info though.
legionaire45
August 29th, 2007, 07:55 PM
With Source-based games you can get away with AA and HDR at the same time, even on Nvidia hardware. If I remember correctly, it's because the Source engine doesn't use the typical FP16 blending mode that most games do.
Although for some reason color correction caused artifacts on my system when I was running XP. Vista seemed to fix it, for whatever reason =/.
Phopojijo
August 29th, 2007, 08:29 PM
Source and Oblivion use a lower bit depth for their HDR lighting.
We're talking real 32/64-bit HDR here.
legionaire45
August 29th, 2007, 08:38 PM
Source and Oblivion use a lower bit depth for their HDR lighting.
We're talking real 32/64-bit HDR here.
Ahh, in which case I completely agree with you there.
Agamemnon
August 29th, 2007, 08:50 PM
And? A game's a game. The last thing I actually noticed in Oblivion was whether there was pixel tearing or not. Considering that if I ever took a screenshot to show off I would just blur that out in Photoshop anyway, what does it really matter?
Amit
August 29th, 2007, 09:06 PM
Come on guys, take a look at the pics... you don't notice it too much. Who cares if you have a few small jaggies? It doesn't make the game unplayable.
Syuusuke
August 29th, 2007, 09:24 PM
The point is that people are saying AA and HDR can exist at the same time with both settings on; this is about saying "no, you can't," not "it's still playable."
Which uses more resources... AA or HDR? =?
Amit
August 29th, 2007, 09:34 PM
The point is that people are saying AA and HDR can exist at the same time with both settings on; this is about saying "no, you can't," not "it's still playable."
Which uses more resources... AA or HDR? =?
You're right, I sort of misinterpreted the point. But who cares? I'd rather have HDR than AA anyway. It's kind of pointless to argue over whether HDR and AA can work at the same time. As I said, you don't really notice it.
legionaire45
August 29th, 2007, 09:56 PM
The point is that people are saying AA and HDR can exist at the same time with both settings on; this is about saying "no, you can't," not "it's still playable."
Which uses more resources... AA or HDR? =?
I don't have any research or anything to base it on (I can't even go higher than 2X without lagging to death in CS:S xD), but I imagine that AA scaled up really high would be more of a resource hog than HDR. I think someone like Phopo or Tweek would have a better idea though.
Phopojijo
August 29th, 2007, 10:24 PM
The point is that people are saying AA and HDR can exist at the same time with both settings on; this is about saying "no, you can't," not "it's still playable."
Which uses more resources... AA or HDR? =?
It's not quite cut and dried. There are multiple levels of both HDR and antialiasing.
Also -- and this is my main reason for posting -- HDR can be done in many ways. Far Cry and Half-Life tend to take the "duct tape" approach: they create a standard rendering pipe and bolt HDR capabilities onto it.
In UnrealEngine3, the pipe is natively built around 64-bit HDR, so they can always rely on HDR and never need to worry about regular color-range limitations... like programming their own gamma control, indoor/outdoor lighting, whatever.
So most games using an engine like UnrealEngine3 will not give you the option of turning off HDR. It just will not be possible; there is no other rendering pipe.
Antialiasing is generally a bit more of a resource hog: basically you're rendering the image multiple times, firing little sub-pixel rays.
http://gbxforums.gearboxsoftware.com/showpost.php?p=877325&postcount=6
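To make the "sub-pixel rays" idea concrete, here's a toy C++ sketch of the resolve step (purely illustrative, my own names; real MSAA resolve happens in dedicated hardware): after the extra samples are rendered, each final pixel is just the average of its samples -- and doing that averaging on 64-bit FP HDR samples is the part a lot of DX9-era hardware wouldn't do for you.

struct Color { float r, g, b; };

// Average a pixel's sub-samples into the final on-screen color.
Color ResolvePixel(const Color* samples, int sampleCount)
{
    Color out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < sampleCount; ++i) {
        out.r += samples[i].r;
        out.g += samples[i].g;
        out.b += samples[i].b;
    }
    out.r /= sampleCount;
    out.g /= sampleCount;
    out.b /= sampleCount;
    return out;
}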
Con
August 29th, 2007, 10:32 PM
It's not quite cut and dried. There are multiple levels of both HDR and antialiasing.
Also -- and this is my main reason for posting -- HDR can be done in many ways. Far Cry and Half-Life tend to take the "duct tape" approach: they create a standard rendering pipe and bolt HDR capabilities onto it.
In UnrealEngine3, the pipe is natively built around 64-bit HDR, so they can always rely on HDR and never need to worry about regular color-range limitations... like programming their own gamma control, indoor/outdoor lighting, whatever.
So most games using an engine like UnrealEngine3 will not give you the option of turning off HDR. It just will not be possible; there is no other rendering pipe.
Antialiasing is generally a bit more of a resource hog: basically you're rendering the image multiple times, firing little sub-pixel rays.
http://gbxforums.gearboxsoftware.com/showpost.php?p=877325&postcount=6
teach me more!
Phopojijo
August 29th, 2007, 10:35 PM
Lol...
Glad I'm helpful
Game designers always have trade-offs... they can cut out legacy support to increase the quality per frame, but then fewer people can play their game.
UnrealEngine3 supports anything with PS3.0 (or 2.0b, I believe) so it can rely on 64-bit HDR natively. That way they can transition from indoor to outdoor to cavern to whatever and have the engine dynamically shift the gamma and bloom or blacken the pixels that don't fit the average brightness. They also cut off the GeForce FX and Radeon 9 series in the process.
Give and take... that's what performance is all about.
I talk most about UE3 basically because that's the engine I know the most about.
jahrain
August 29th, 2007, 10:53 PM
Is this just for BioShock? I've had no problem running both AA and bloom (not HDR) in HL2: Episode One and CS:S.
Phopojijo
August 29th, 2007, 11:08 PM
Nah -- it's for anything that requires high-precision HDR, like 32/64-bit.
And bloom is not necessarily HDR. Bloom is usually associated with HDR because the first implementations of HDR were bullshit o.o
"Lookie all! I have HDR! Let's make the game look like Gordon Freeman and Jack Carver have hangovers!"
jahrain
August 30th, 2007, 05:39 AM
Bloom is the only difference I notice with HDR enabled in most games. Other than that, things just look duller and greyer, or highly contrasted/gamma-tweaked, and the FPS goes way down.
Tweek
August 30th, 2007, 05:51 AM
Well, HDR means a LOT of things.
If anyone wants to know a bit more about it, there's always the internet and Wikipedia/Google.
Generally speaking, AA will cause you more framedrop than HDR, especially at high resolutions. It's still heavily bound to what level you set things at: if you have 2x AA and all possible HDR crap on, HDR will probably be the thing bogging down your PC.
HDR, high dynamic range, simply means you can use higher light values; you're not bound to the 0-100 range your monitor can display.
An example: when you look at the sun in a game, you're not blinded, because your monitor can't really show you that kind of light intensity. HDR puts in separate effects to mimic it though -- bloom, tone mapping, stuff like that. That's just one example of HDR. Like Phopo said, bloom isn't really good HDR on its own; it's more of a feature that uses HDR data (light values above 100). HDR can do way more than this, so I really suggest people do some research on it, because it's cool.
Phopojijo
August 30th, 2007, 03:43 PM
A correction to Tweek:
1) It's actually 0-255 for 8-bit channels.
2) FP HDR uses floating-point values, so it's actually decimals and exponents.
Let's say you're playing a video game that moves between indoors and outdoors. Let's say you're playing... Halo 2 Vista. (Go figure.)
Out under the skybox you have lighting data for both indoor and outdoor environments. The idea behind this is that your iris closes somewhat when you go outdoors and widens when you go indoors to adjust for the wildly different amounts of light.
The game engine switches between two lighting methods depending on whether you're indoors or out. Doom 3 doesn't at all... which is why it looks like ass outdoors. (HDR was not enabled in Doom 3's implementation of its engine.)
Standard Color Range: color needs to be broken down to 0-255. If you go outside, light might be exponentially more intense than indoors. (Our eyes sense brightness logarithmically. Our perceived "twice as bright" might actually be "brightness squared"... it's just a physics/biology thing.) The game engine compensates for this by saying:
Indoors -- this amount of light is 255, this amount of light is 0... outdoors -- this amount of light is 255, this amount of light is 0.
High Dynamic Range Color: at all times every pixel has a MUCH LARGER intensity range... possibly from 1 photon to 10^350 photons per red/green/blue channel. (Not the actual numbers.)
So let's say you're outside with 10^50 photons entering each on-screen pixel. The game engine says "oh, the player's iris should be this closed" and sets a 0-255 color range for that average light intensity at that point in time.
But then you walk into a cave with ~10^3 photons per pixel.
Gee, at that light range you wouldn't be able to see anything! o.o Everything should be black... Ooh, well... let's just "open the iris more." The game engine then decides what 0-255 should be based on the lighting data it receives at that one point in time.
The game engine usually decides this once every frame/couple of frames.
But wait... what happens if you're in a cave with 100 photons on average per pixel... BUT there's a stained-glass window with direct 10^40-photon sunlight?
Simple -- what does your eye do? Well... all red/green/blue channels would be at 255 brightness for the window... so it'd appear white. But it's so intense it actually bleeds white into neighboring pixels.
Bloom anyone? :)
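Here's the same idea as a tiny C++ sketch (my own toy numbers and names, not engine code): pick an exposure from the average scene brightness -- the "how closed is the iris" part -- then squeeze each HDR channel into 0-255. Anything that overflows clamps to white, and that overflow is what a bloom pass feeds on.

#include <algorithm>

// Map one HDR channel value to an 8-bit display value for a given scene.
unsigned char ToneMapChannel(float hdrValue, float sceneAverage)
{
    float exposure = 1.0f / std::max(sceneAverage, 1e-6f); // brighter scene -> iris closes more
    float exposed  = hdrValue * exposure;                  // the stained-glass window is >> 1.0 here
    float clamped  = std::min(exposed, 1.0f);              // overflow clips to white (bloom material)
    return static_cast<unsigned char>(clamped * 255.0f + 0.5f);
}

So with the cave's tiny average, the window pixel blows way past 1.0, comes out 255 on every channel, and the bloom pass bleeds that white outward.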
Some other things are also driven by HDR data, like white balance (color temperature).
Simply put...
HDR lighting is designed mostly to give accurate lighting when the ambient brightness increases and decreases heavily... such as in Oblivion, where you're constantly moving indoors and outdoors.
And then of course you get games which ramp up the bloom to masturbate their technology in your face... completely forgetting why the technology was even there in the first place.
Anything you want clarified?
Tweek
August 30th, 2007, 05:33 PM
you win for taking the trouble to explain things properly <3
Amit
August 30th, 2007, 05:37 PM
Bloom is the only difference I notice with HDR enabled in most games. Other than that, things just look duller and greyer, or highly contrasted/gamma-tweaked, and the FPS goes way down.
Same. I wish there was HDR on de_dust2 though, that would be awesome!
Phopojijo
August 30th, 2007, 09:57 PM
The dull and grey look is not due to the HDR, though. That's various desaturation filters they use to make things seem more gritty... similar to desert-combat movies like Black Hawk Down.
Some games also have a film-grain filter added.
That's artistic... not HDR.
These are HDR
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_04.jpg
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_14.jpg
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_18.jpg
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_03.jpg
Notice the blooms on things that are too bright for the frame's average light intensity.
et_cg
August 30th, 2007, 10:25 PM
http://img483.imageshack.us/img483/6056/goodexamplemw3.jpg
The fourth picture you have up there, I noticed, displays HDR very well.
Notice the bald man's head: the color value there has been raised, showing that the light casting over his head is quite bright.
Also, if you were using a standard lighting method (no HDR), the darker part of his body would be much more visible; in this case, with HDR, it's much darker. Assuming you have your monitor's color values adjusted correctly, you'll notice it much better. With the monitor brightness raised instead, the shadowed areas would seem much more visible but not as saturated -- and the saturated look is what the artist intended.
Phopojijo
August 30th, 2007, 10:42 PM
Yep... and it's all because the game's HDR framebuffer holds every pixel's R/G/B components at their precise intensities: how many red photons, how many blue, how many green. (Sort of -- color science is VERY complex, especially since it's very subjective to the brain. Really it's just energy deposited per photon and how many of them... but let's not worry about the details.)
No data gets lost until the final stage, where the engine decides the gamma level. Other than that, all photon intensities are precise. Nothing gets "averaged out" to look brighter or darker than it should until the engine decides the exposure, gain, and white-balance scales.
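A quick illustration of why that matters even on an 8-bit monitor (toy C++ with made-up values, not anything from a real engine): if you clamp to the displayable range early, re-exposing the frame later can't bring the detail back; if you keep floats until the end, it can.

#include <algorithm>
#include <cstdio>

int main()
{
    float skyLight  = 4.0f;   // 4x brighter than the display can show
    float caveLight = 0.02f;

    // Low-dynamic-range pipe: clamp first, pick exposure later.
    float ldrSky = std::min(skyLight, 1.0f);   // everything above 1.0 is gone forever

    // HDR pipe: keep the float, pick exposure at the very end.
    float exposure = 0.25f;                    // the per-frame "iris" setting

    std::printf("LDR sky after re-exposure: %.2f\n", ldrSky * exposure);    // 0.25 -- too dark
    std::printf("HDR sky after re-exposure: %.2f\n", skyLight * exposure);  // 1.00 -- correct
    std::printf("The cave stays dark either way: %.3f\n", caveLight * exposure);
    return 0;
}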
jahrain
August 31st, 2007, 04:53 AM
Are there any non-HDR samples of those same pictures to compare to? As I said before, the only thing I can notice is the added bloom overlay on the image. That's something I could easily do with Halo CE screenshots in Photoshop with some clever artsy filters as well.
If it's about adding the bloom just to areas where the light intensity is greater than white pixels on the display, such a thing could simply be emulated by setting the color output buffer to output dimmer shaded pixel values as bright colors, and giving any brighter pixels exceeding that threshold a bloom overlay. Once again, that's something that could be done with clever output filtering if the overall color output for the game is set properly.
I did this in Photoshop really quickly to sort of demonstrate the concept, using a single screenshot from an HCE map.
http://i193.photobucket.com/albums/z11/jahrain00/haloce2006-11-0211-46-25-43copy.jpg
Either way, even if the game uses precise calculations for color intensities, it's still watered down by the monitor's color output. The advantages of precise HDR calculations would only be truly useful with monitors that can display color intensities to that precision, such that extremely bright lights actually glare out of the monitor at the real intensity of the light.
et_cg
August 31st, 2007, 04:56 AM
It's more of a technical thing you have to appreciate. I mean, do what they said a few posts back and just use Google, and you'll be astonished.
Oh, and the DX SDK has some *decent* samples. I'm sure Nvidia has some, as ATI may also.
jahrain
August 31st, 2007, 05:11 AM
I have looked it up, but from what I understand, such feats of HDR cannot be truly utilized because they are restricted to the output monitor's limits.
Effects such as the iris opening and closing when moving between outdoors and indoors, and light burn-in, are things that could once again be accomplished just with automatic brightness/gamma adjustments and overlay filtering, without the need to calculate them inefficiently using floating-point pixels. It just seems inefficient and a waste of computation if the floating-point RGB colors are going to get watered back down into 32-bit color when displayed. Like I said before, such things would only be useful with a monitor that can actually display floating-point intensity colors.
Phopojijo
September 1st, 2007, 12:54 AM
Are there any non-HDR samples of those same pictures to compare to? As I said before, the only thing I can notice is the added bloom overlay on the image. That's something I could easily do with Halo CE screenshots in Photoshop with some clever artsy filters as well.
No, there are not. UnrealEngine3 can only do HDR lighting. Its render pipe is 64-bit floating point... period... that's it.
I have looked it up, but from what I understand, such feats of HDR cannot be truly utilized because they are restricted to the output monitor's limits.
Yeah, many people tell me that -- then I have to explain to them what HDR even is.
Effects such as the iris opening and closing when moving between outdoors and indoors, and light burn-in, are things that could once again be accomplished just with automatic brightness/gamma adjustments and overlay filtering, without the need to calculate them inefficiently using floating-point pixels. It just seems inefficient and a waste of computation if the floating-point RGB colors are going to get watered back down into 32-bit color when displayed. Like I said before, such things would only be useful with a monitor that can actually display floating-point intensity colors.
... and have a monitor that can blind you to the point that your irises would need to close? Not very practical to get snowblind from a monitor...
That's not what HDR is all about; HDR is, quite frankly, mostly a developer toy. With HDR you do not need to manually input "okay looking" lighting values in order to get a "decent" lighting profile.
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_04.jpg
and
http://www.beyondunreal.com/staff/hal/ut2007/UT3_E32007_03.jpg
Would typically require basically two separate non-HDR 3D engines to render accurately, unless the artist SERIOUSLY plays some lighting tricks. In HDR? It just works...
In HDR the lights are set to appropriate powers with respect to each other... and the color and white balance should constantly adjust to represent what the scene should actually look like.
You can get *really* accurate lighting with less effort and have the white balance and dynamic gamma be *algorithmically* changed using the laws of physics... versus what an artist believes the conditions will be under predicted circumstances.
It will also be a requirement when Global Illumination and some other visible algorithms come into play in realtime applications.
Tweek
September 1st, 2007, 05:30 AM
It will also be a requirement when Global Illumination and some other visible algorithms come into play in realtime applications.
need.
I imagine full realtime raytracing in the near future as well, especially with multi-core processing becoming more available and more and more the standard.
We had a project at school, a realtime raytracer (http://www.uni-ulm.de/rt07/Program.html), based on an 8-core computer we called octopussy, and it did awesome things.
If somehow a new leap past polygonal rendering can be made, polygon limits will become more and more a thing of the past, since raytracing isn't impacted by polygon count one bit.
If someone were to write an engine purely off that, can you imagine what's going to happen? I definitely can't, but I know it's going to be freaking awesome.
legionaire45
September 1st, 2007, 05:38 AM
need.
I wish they would release the Quake 3/4 raytracing demos D=/. Those have smexy lighting even if they only run at -2 FPS.
Not GI or another form of eyesex though =(.
Phopojijo
September 1st, 2007, 07:18 PM
need.
I imagine full realtime raytracing in the near future as well, especially with multi-core processing becoming more available and more and more the standard.
We had a project at school, a realtime raytracer (http://www.uni-ulm.de/rt07/Program.html), based on an 8-core computer we called octopussy, and it did awesome things.
If somehow a new leap past polygonal rendering can be made, polygon limits will become more and more a thing of the past, since raytracing isn't impacted by polygon count one bit.
If someone were to write an engine purely off that, can you imagine what's going to happen? I definitely can't, but I know it's going to be freaking awesome.
Per-pixel lighting models are already only negligibly impacted by polygon counts.
Static meshes are basically only limited by their RAM footprint.
Deforming meshes still require processing power to distort the vertices. However, PS4.0 will supposedly offload most of the geometry manipulation to the GPU (if the developer can manage it by means of trade-offs).
Phopojijo
September 3rd, 2007, 03:03 AM
Are there any non-HDR samples of those same pictures to compare to? As I said before, the only thing I can notice is the added bloom overlay on the image. That's something I could easily do with Halo CE screenshots in Photoshop with some clever artsy filters as well.
Mind the double-post... I just noticed something from Jahrain's post which requires serious clarification:
HDR is not something you should notice!
Bloom and dynamic gamma have been around since before UnrealEngine2.
HDR only exists to allow for mathematical precision in the lighting and in the application of bloom and dynamic gamma (along with some other future eyesex like GI).
What you SHOULD notice are the side effects of HDR: bloom where it should bloom and not where it shouldn't, correct color representation between indoors and out (if you want to see exactly how hard it is to get good color representation indoors and outdoors at the same time, talk to a photographer), etc., etc.
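To put the "bloom where it should be" point in code terms, here's a toy C++ bright-pass (my own sketch, not any engine's actual implementation): only the energy that overflows the displayable range after exposure gets blurred and added back, which is why bloom driven by real HDR data lands only on genuinely over-bright pixels.

#include <algorithm>

struct Color { float r, g, b; };

// Keep only the light that overflows the displayable 0..1 range;
// this is the part that gets blurred and composited back as bloom.
Color BrightPass(Color exposedHdr)   // input already multiplied by the frame's exposure
{
    Color excess;
    excess.r = std::max(exposedHdr.r - 1.0f, 0.0f);
    excess.g = std::max(exposedHdr.g - 1.0f, 0.0f);
    excess.b = std::max(exposedHdr.b - 1.0f, 0.0f);
    return excess;
}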
Phopojijo
October 25th, 2007, 01:34 AM
A little update -- allegedly HDR and antialiasing can (shortly) be done without Vista and DirectX 10. (You'll apparently need the Radeon 2x00 series until Nvidia does a similar update for the GeForce 8 series and up.)
XP Users! Sorry -- still not for you!
http://www.phoronix.com/scan.php?page=article&item=887&num=1
It's for Linux.
It's completely feasible in an open kernel with some new version of OpenGL... that was never the problem. The problem is that DX9 cannot support 64-bit HDR plus antialiasing, because DX9 cannot properly control deferred rendering.
... DX10 doesn't have that problem...
And with the proper patches, OpenGL doesn't either. Allegedly ATI did it... it may have been a misprint, and it could be the Half-Life 2/Oblivion situation where it doesn't work on 64-bit FP HDR, only lower-bit HDR... but hey, it seems to be true for compatible games.
For once -- Linux is a more viable gaming option than Windows -- at least temporarily. <3 Linux