Invitation

If you would like to contribute to this blog by posting your experiences here, please send me a message!

Saturday, April 28, 2012

No sRGB support in HD Graphics..



Through a problem I encountered, I learned that in the configuration below, HD Graphics cannot display sRGB color mode. To get the same output with every card, it is advisable to disable sRGB mode in your code:

    GLUTDisplay::setUseSRGB(false);

in your main function.
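
For context, here is roughly where the call sits in an OptiX SDK sample such as path_tracer.cpp. The surrounding GLUTDisplay and scene calls are assumptions based on the SDK samples; only the setUseSRGB(false) line is the actual fix:

    // Sketch of a sample's main(); everything except setUseSRGB is an assumption.
    int main( int argc, char** argv )
    {
        GLUTDisplay::init( argc, argv );

        // Disable sRGB output so the Intel iGPU and the GeForces match.
        GLUTDisplay::setUseSRGB( false );

        PathTracerScene scene;  // the sample's scene class (assumed name)
        GLUTDisplay::run( "path_tracer", &scene, GLUTDisplay::CDProgressive );
        return 0;
    }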


(My PC has its monitor connected to an Intel HD Graphics iGPU and uses two GeForces as dedicated CUDA devices.)

Solution-Sponsor: Kyle Hayward




5 comments:

  1. Sounds like it might be an sRGB issue (perhaps the Intel GPU is returning false for texture and framebuffer sRGB support?). Try disabling sRGB in the path tracer sample program.

    In path_tracer.cpp (towards the bottom in main())
    GLUTDisplay::setUseSRGB(false);

  2. The above should test whether sRGB is the issue by giving you a darker scene on both Intel and NVIDIA.

  3. Eventually I found out that the HD Graphics iGPU cannot use sRGB, so it is not an OptiX problem but an Intel problem.

    I set it to false and everything is OK. Thank you again :-)

  4. If the Intel really doesn't support sRGB (that seems unlikely, but maybe there's another extension to test for; I don't work a lot with OpenGL, so I'm not sure), you can do gamma correction manually.

    You need to convert textures/material colors to linear space, do all lighting calculations there, and then convert back to gamma space. For example:

    float3 diffuse = pow( tex2D(...), 2.2f );     // sRGB -> linear (per-component pow; see the sketch below)
    float3 final = NdotL * diffuse * lightColor;  // lighting in linear space
    final = pow( final, 1.0f/2.2f );              // linear -> sRGB for display

    This post describes why linear space lighting is important: http://filmicgames.com/archives/299
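
    A self-contained sketch of the same idea for device code follows. Note that CUDA has no pow() overload for float3, so powf3 here is a hypothetical per-component helper, and the float3 operators are assumed to come from the OptiX math header:

    #include <optixu/optixu_math_namespace.h>
    using namespace optix;

    // Hypothetical helper: per-component pow for float3 (not a CUDA built-in).
    __device__ float3 powf3( float3 v, float e )
    {
        return make_float3( powf( v.x, e ), powf( v.y, e ), powf( v.z, e ) );
    }

    // Diffuse shading done in linear space, with sRGB conversion at both ends.
    __device__ float3 shadeDiffuse( float3 texColor, float3 lightColor, float NdotL )
    {
        float3 diffuse = powf3( texColor, 2.2f );        // sRGB -> linear
        float3 lit     = NdotL * diffuse * lightColor;   // light in linear space
        return powf3( lit, 1.0f / 2.2f );                // linear -> sRGB
    }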

  5. I know how to do it! I have done it with OpenGL/GLSL before. I will implement it, but probably at the end of the project, after doing the things that matter more.
