Tools and Tips

Gioacchino Noris: Hi guys, I took the liberty of creating this page. It is meant as a knowledge bazaar. Feel free to add whatever you think might be useful.

Troubles

RenderToTexture + Z-Buffer Matthias Bühlmann: Hi all. At the moment I have a strange problem. When using SetRenderTarget to render to a texture, everything works fine on my computer (I've got an ATI Mobility FireGL V3200), but on my teammates' computers (they've got a GeForce Go 7300 and a GeForce 6800) the Z-buffer no longer works! So when rendering into a render target other than the backbuffer, everything gets drawn right over whatever was drawn before.

Did any of you experience similar problems, or do you have an idea what could be going wrong? Thanks!
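Gioacchino Noris: One thing worth checking (a sketch of a common cause, not a confirmed diagnosis): a render target does not automatically get a usable depth buffer if its size or multisample type differs from the default one, and some cards then silently disable depth testing. Creating a DepthStencilBuffer that matches the render target avoids this. The names and the 512x512 size below are illustrative:

```csharp
// Assumed setup: 'graphics' is the project's GraphicsDeviceManager.
// Create a render target and a depth buffer with MATCHING dimensions.
RenderTarget2D renderTarget = new RenderTarget2D(
    graphics.GraphicsDevice, 512, 512, 1, SurfaceFormat.Color);
DepthStencilBuffer rtDepthBuffer = new DepthStencilBuffer(
    graphics.GraphicsDevice, 512, 512, DepthFormat.Depth24);

// When rendering to the texture, swap in the matching depth buffer.
DepthStencilBuffer oldDepthBuffer = graphics.GraphicsDevice.DepthStencilBuffer;
graphics.GraphicsDevice.SetRenderTarget(0, renderTarget);
graphics.GraphicsDevice.DepthStencilBuffer = rtDepthBuffer;
// ... draw the scene into the texture ...
graphics.GraphicsDevice.SetRenderTarget(0, null);
graphics.GraphicsDevice.DepthStencilBuffer = oldDepthBuffer;
```

Maybe the FireGL driver tolerates the mismatch while the GeForce drivers don't; this is a guess.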

Useful Links and Resources

  • Kobalt64 Tutorial: Good for seeing how to apply custom effects to Model subparts (meshes).

  • NVIDIA Developer Site: link

Specific Topics

Creating Normal Maps

  • You can use the NVIDIA Photoshop plug-in, which directly transforms an image into a normal map. Take care to merge all the layers of the image before running this filter, or you might encounter problems due to the transparency of the selected layer. You can find the plug-in here.

  • If you are able to model high- and low-poly meshes, you can use the NVIDIA Melody program. It basically does normal map baking (it takes the geometry of the high-poly mesh and projects it into the UV space of the low-poly mesh, using the low-poly mesh geometry for the projection). It produces both tangent-space and object-space normal maps, and has a lot of nice tweaking features. By the way, you can import the created textures into Photoshop, modify them a bit, and use the Photoshop plug-in above to re-normalize them. Melody can be found here.

Melody.png

Particles

Gioacchino Noris: I followed the above tutorial by jsedlak about particles. Here I post some troubles I had.

  • If I create a Windows project, everything works fine. I had to change the blending in the draw() method, setting this: graphics.GraphicsDevice.RenderState.DestinationBlend = Blend.Zero; This way I get a better-looking result, but of course it depends on what kind of effect you are interested in.

  • Performance-wise I'm quite confused. On my desktop PC (P4 2.6 GHz, GeForce 6800 GT) I get 90 FPS up to ~20'000 particles, then performance drops dramatically. On my laptop (Centrino 2 GHz, Radeon Mobility 9700) the FPS cap is 60 (even though the screen refresh is 75 Hz, no v-sync), and with more than 7'000 particles I get performance losses. Moreover, even with few particles, things slow down badly if I move closer to the emitter (this does not happen on the desktop PC). Anyway, I don't really know how much sense these kinds of tests make, since it should be tested with the published version (hoping that the final build is quicker).

  • Xbox... here the real troubles start. First of all, if I simply put my project into an Xbox one, I get an exception that complains about the vertex declaration at this line: graphics.GraphicsDevice.VertexDeclaration = new VertexDeclaration(graphics.GraphicsDevice, VertexElements);. As far as I've understood, this vertex declaration describes the struct you pass to the DrawUserPrimitives methods. Without it, in the Windows project the particles are not blended correctly. On the Xbox I have two things happening:
    • performance is ridiculous (5'000 particles make the image warp... I didn't check the FPS count)
    • I cannot see the particle textures; the particles are monocolored.

Matthias Buehlmann:

Hi! About your troubles you experienced:

1.) Setting the DestBlend to Zero is quite strange: if you do so, you won't get transparent particles. Actually, you should get black borders around every particle; it's quite strange that I can't see this effect in your screenshots... For your kind of effect, the right settings would be SrcBlend = SrcAlpha, DestBlend = InvSrcAlpha. If you don't get black borders, I guess that you change your blending a second time in the technique settings of your shader. Could this be?

Gioacchino Noris: You are right, it is not actually set to Blend.Zero. I thought it was, because the documentation says that is the default value and I wasn't changing it, but when I explicitly set it to Zero, everything got screwed up. So somehow XNA changes it.
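A minimal sketch of the blend setup being discussed, assuming the GraphicsDeviceManager field is named graphics as in the snippets above. Note that an effect file can override these render states inside its technique blocks, which would explain the behaviour described here:

```csharp
// Standard alpha blending for textured particles, as Matthias suggests:
// result = src * srcAlpha + dest * (1 - srcAlpha).
graphics.GraphicsDevice.RenderState.AlphaBlendEnable = true;
graphics.GraphicsDevice.RenderState.SourceBlend = Blend.SourceAlpha;
graphics.GraphicsDevice.RenderState.DestinationBlend = Blend.InverseSourceAlpha;

// Cheaper, order-independent alternative for glowing effects:
// additive "One One" blending, which needs no depth sorting.
// graphics.GraphicsDevice.RenderState.SourceBlend = Blend.One;
// graphics.GraphicsDevice.RenderState.DestinationBlend = Blend.One;
```

If a technique in the .fx file contains its own SrcBlend/DestBlend states, those win over whatever is set on the device before Effect.Begin().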

2.) I'm experiencing the same performance problems with my particle system. On my laptop, everything is fine up to 20'000 particles, then performance drops quite fast. On the xbox i get 7'000-10'000 particles and then the performance drops. So far, I think the bottleneck is the rasterizer and the blending of the particles. So it will be very difficult to optimize this in any way. The best thing will be, if we setup our particle effects carefully so that we don't draw too many particles over each other which don't add something to the visual appearance anyway... Also, you should use additive blending with "One One" whenever possible, because it is the cheapest equation and you don't get artifacts if you don't sort. Another thing we could think about is implementing deferred shading for the particles - however i don't know how well this works for semitransparent objects - but i think one could create interesting effects that way...

3.) About your white particles: if you use point sprites, you have to change the TEXCOORD semantic of the vertex stream in the input of the pixel shader (only there!) to SPRITETEXCOORD. You can do conditional compiling that way:

#ifdef XBOX
    float2 TexCoord : SPRITETEXCOORD;
#else
    float2 TexCoord : TEXCOORD0;
#endif

Also, you should compile for pixel shader level 3 if you compile for the Xbox.

Maybe we should sit together and compare our particle systems. I'm interested in how you implemented your particle management.

Gioacchino Noris: I'll try it tonight, but Thanks in advance wink

-> I've tried, but I'm having a hard time making the shader compile. Here is the current shader. If you, or anybody, could have a look at it and fix it, that would be great.

Matthias Bühlmann: -> The sprite texcoord stream must only occur in the PS_INPUT, so you need different structs for VS_OUTPUT and PS_INPUT! Also, I suggest using the PSIZE stream in the vertex shader to drive your particle size. If you divide the PSIZE by the z coordinate of the transformed position, your particle size will decrease with distance. I changed the shader a little bit (didn't test it though): Here
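A sketch of the struct layout described above (untested, and the parameter names m_WorldViewProj and m_ParticleSize are illustrative, not from the actual shader.fx):

```hlsl
uniform extern float4x4 m_WorldViewProj;  // assumed effect parameter
uniform extern float    m_ParticleSize;   // assumed effect parameter

// Vertex shader output: carries PSIZE, no texcoord.
struct VS_OUTPUT
{
    float4 Position : POSITION0;
    float4 Color    : COLOR0;
    float  Size     : PSIZE;
};

// Pixel shader input: the sprite texcoord appears ONLY here.
struct PS_INPUT
{
    float4 Color    : COLOR0;
#ifdef XBOX
    float2 TexCoord : SPRITETEXCOORD;
#else
    float2 TexCoord : TEXCOORD0;
#endif
};

VS_OUTPUT VS(float4 pos : POSITION0, float4 color : COLOR0)
{
    VS_OUTPUT output;
    output.Position = mul(pos, m_WorldViewProj);
    output.Color    = color;
    // Divide by view-space depth so sprites shrink with distance.
    output.Size     = m_ParticleSize / output.Position.z;
    return output;
}
```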

Matrix Stack

Matthias Bühlmann:

I think the hierarchical drawing of models in XNA is very ugly. So far I have to call myModel.CopyAbsoluteBoneTransformsTo(boneTransforms); every frame before drawing, to let the Model class calculate the absolute transformation matrices of the model-inside hierarchy of the meshes, relative to the model's world space. Now I want to implement some kind of matrix stack in my RenderManager, similar to the one provided directly by DirectX and OpenGL when the fixed-function pipeline is used. However, it's still not really clean, because I have to distinguish between my model-outside hierarchy (the hierarchy of models) and the model-inside hierarchy (the bone hierarchy inside each Model). Did anyone find a good solution to deal with that? Also: is there maybe a way to use hardware to do the matrix calculations and access the fixed-function pipeline's modelview matrix stack directly from the shader? You can add semantics to the external variables that are passed to the shader using effect.Parameters["asdf"].SetValue, like this:

uniform extern float4x4 m_WorldViewProj : MYSEMANTICSFORTHEWORLDVIEWPROJMATRIX

However, I didn't find a way to use these semantics in XNA so far. Does anyone know something about it?
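A minimal sketch of the software matrix stack idea, mimicking the fixed-function glPushMatrix/glPopMatrix behaviour. All names are illustrative, not from the actual RenderManager:

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework;

public class MatrixStack
{
    private Stack<Matrix> stack = new Stack<Matrix>();
    private Matrix current = Matrix.Identity;

    // World transform accumulated so far; pass this to the shader.
    public Matrix Top { get { return current; } }

    // Enter a child node: combine its local transform with the parent's.
    public void Push(Matrix local)
    {
        stack.Push(current);
        current = local * current;
    }

    // Leave the node: restore the parent's transform.
    public void Pop()
    {
        current = stack.Pop();
    }
}
```

Both hierarchies could then share one stack: push the model's world transform for the outside hierarchy, and push each entry of boneTransforms for the inside hierarchy before drawing the corresponding mesh.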

FX files

Matthias Bühlmann: At the moment I have to deal with three shader problems:

1.) Since every shader needs different attributes, and XNA throws runtime errors when you pass variables that are not used by the shader, it's very difficult to implement the rendering so that you can easily add new objects with unique shaders. One possible way is to put ALL variables used by any of the shaders into EVERY shader header and pass all of them to every shader. This has two disadvantages: 1.) data overhead; 2.) if you create a new shader that needs a new attribute, you have to change the headers of all the other effect files and also change your drawing routine. The way I'm using at the moment is to create, for each shader, an additional "InterfaceClass" that provides a PassAttributes() method to pass the right attributes for that shader.

If someone found a better solution, please let me know!
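A sketch of the "InterfaceClass" pattern described above; the class names, the parameter name, and the PassAttributes signature are illustrative guesses, not the actual implementation:

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Each wrapper knows exactly which parameters its effect declares,
// so the renderer never sets a parameter the shader doesn't have.
public abstract class EffectInterface
{
    protected Effect effect;
    protected EffectInterface(Effect effect) { this.effect = effect; }
    public Effect Effect { get { return effect; } }
    public abstract void PassAttributes(Matrix world, Matrix view, Matrix proj);
}

public class DiffuseEffectInterface : EffectInterface
{
    public DiffuseEffectInterface(Effect effect) : base(effect) { }

    public override void PassAttributes(Matrix world, Matrix view, Matrix proj)
    {
        // Set only what this particular .fx file actually declares.
        effect.Parameters["m_WorldViewProj"].SetValue(world * view * proj);
    }
}
```

The render loop then just iterates over EffectInterface references and calls PassAttributes() before drawing, without knowing the concrete shader.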

2.) Multipass rendering and MultiMesh rendering

If your model consists of different meshes, and every mesh needs a different shader, it would be nice to have all these shaders in one .fx file. Also, if you use multipass rendering, it would be nice to have the different shaders for these passes in the same .fx file too. In RenderMonkey, a separate pass is created for each mesh of your "model". A nice way to combine multipass rendering and multi-mesh rendering in one .fx file would be the use of techniques and passes like this:
technique MyFirstRenderPass
{
    pass myModelsHead
    {
        vertexShader = compile vs_1_0 HeadPass1VertexShader();
        pixelShader  = compile ps_1_3 HeadPass1PixelShader();
    }
    pass myModelsSunglasses
    {
        vertexShader = compile vs_1_0 SunglassesPass1VertexShader();
        pixelShader  = compile ps_1_3 SunglassesPass1PixelShader();
    }
}

technique MySecondRenderPass
{
    pass myModelsHead
    {
        vertexShader = compile vs_1_0 HeadPass2VertexShader();
        pixelShader  = compile ps_1_3 HeadPass2PixelShader();
    }
    pass myModelsSunglasses
    {
        vertexShader = compile vs_1_0 SunglassesPass2VertexShader();
        pixelShader  = compile ps_1_3 SunglassesPass2PixelShader();
    }
}


This would be very nice, since you could have all the shaders needed for one model in one single .fx file. The problem is how to access the individual passes of the techniques without searching for the right name string for each mesh to draw (since the order may change, the first mesh is not always associated with the first pass of each technique). Also, sometimes you want to use the same shader for all meshes in certain passes. For example, you will have different shaders for the head and sunglasses in the diffuse pass, but you want to use the same shader for the shadow map pass.

Does anyone have an idea, or is anyone interested in sitting down together with me to find a nice solution?

Thanks!

Gioacchino Noris: I don't have a solution. As far as I know, the only thing you can do is apply different .fx files to different meshes. As you say, if you want to have just one effect for the whole model, then you have to manually select the passes associated with the meshes when you do the drawing. The mesh order should be fixed, as should the passes and techniques, right? I didn't get the "order may change" thing. Maybe I'm completely out of scope, but what about hashing the mesh-pass association? At least you won't have to loop looking for names.

Matthias Bühlmann:
The order may change as soon as you change something in your model (outside of XNA), and you always have to analyze first in what order your meshes are stored and then order your passes accordingly, which is quite annoying! Hashing could be a way: compare the names of the meshes with the names of the passes when the shader is created, and store the association in a hashtable. I'll try this. However, I have another problem I don't know how to solve: I'm using the Mesh.Draw() method to render my models, but I don't know where to set which pass of the .fx file shall be used. Pass.Begin() "render something" Pass.End() seems to work only when rendering in immediate mode... Any ideas?

Gioacchino Noris: As far as I've understood, mesh.Draw() is simply a shortcut for something like "for each mesh, for each technique, render all passes". I'll let you know if I find something to assign specific techniques/passes to meshes.
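A sketch of the name-hashing idea discussed above: build the lookup once when model and effect are loaded, then fetch the right pass per mesh in O(1) instead of searching pass names every frame. The technique name matches the example .fx file above; everything else is illustrative:

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework.Graphics;

// Built once at load time: map pass name -> EffectPass.
Dictionary<string, EffectPass> passByMesh = new Dictionary<string, EffectPass>();
foreach (EffectPass pass in effect.Techniques["MyFirstRenderPass"].Passes)
{
    passByMesh[pass.Name] = pass;
}

// At draw time, assuming mesh names were authored to match pass names.
foreach (ModelMesh mesh in myModel.Meshes)
{
    EffectPass pass = passByMesh[mesh.Name];
    // Bind the mesh's vertex/index buffers and draw between
    // pass.Begin() and pass.End(); ModelMesh.Draw() can't be used
    // here since it offers no per-pass control.
}
```

This only works if the artists keep mesh names and pass names in sync, which is itself a convention worth documenting.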

3.) Something quite confusing about .fx and .x is that .fx files can contain code to load a .x mesh (as well as textures). Vice versa, it is possible to store links to effect files inside the .x format. It would be quite nice to store everything that has to be loaded together with a model inside one file (so that you can define which .fx files will be used directly in the .x file). However, I didn't find a way to use these features in XNA without writing a custom importer. Any ideas?

Accessing Vertex data of XNA Model class

This tutorial explains how to "circumvent" the write-only restriction on vertex buffers of XNA Models. Note that the solution proposed there has several weaknesses:

  • Geometry data is duplicated
  • Even worse: it is assigned to a field of type Object, which means that each access causes boxing and unboxing, which consumes valuable processing time.
  • The custom model processor is executed at compile time (!). It would seem logical that at this point it doesn't make any sense to read data from the model's vertex buffer; however, with .x model files this works well. .fbx files, on the other hand, don't work; all you get is (what we think) random data.

The problem seems to have been solved in the XNA Game Studio Express Update. Further fixes are listed here.

Animation library

This library can be used to work with animated models. There is a good tutorial in the tutorial section.

Specific Shader Question: Ghost!

I would like to ask for some help with a specific shader. Hopefully this will bring out some material which may be useful for other people.

Take the ghosts in the picture here. I want to create a shader to make a particle emitter look like them (concentrating on the color aspects; the shape will mostly be a matter of tuning the emission, or a combination of several emitters).

I've created this texture, and I'm wondering how I should blend it. Consider these settings:

graphics.GraphicsDevice.RenderState.AlphaBlendEnable = true;
graphics.GraphicsDevice.RenderState.AlphaBlendOperation = BlendFunction.Add;
graphics.GraphicsDevice.RenderState.SourceBlend = Blend.SourceAlpha;
graphics.GraphicsDevice.RenderState.DestinationBlend = Blend.InverseSourceAlpha;

As far as I've understood, this produces the following effect: color = texture * alpha + background * (1 - alpha), where background means anything that is behind the current particle (other particles already drawn, objects, or the viewport background), right? And the result looks like this.

As you might notice, there is a lot of green inside the ghost shape; what I would like to achieve is green-yellow only on the border of the shape, and black inside. Is this feasible? Do I need a second shader pass? How?

Matthias Bühlmann: I'd use deferred shading for the particles to get this effect. Maybe you could implement something like 'linear burn' with the blending equations, but that kind of effect would only work against bright backgrounds anyway, so you would have to render the particles into a texture, render an alpha channel too, and composite them over the background in a compositing pass; I don't know whether the borders would look clean. To get this kind of effect, I'd use deferred shading: render your geometry into a texture, then clear the framebuffer to black but leave the depth buffer as it is. Then render gray, blurred particles with additive blending (sourceblend = One and destblend = One) against the black background into a second texture target. Finally, render a compositing pass on a screen-sized quad. In the pixel shader of this pass, you load the two rendered textures as well as a 1D texture with your black-to-green gradient. Then you use renderTexture2 as the texture lookup into the 1D gradient, and you use the same texture, multiplied by something between 2 and 6 and clamped to [0,1], to blend it with renderTexture1 (multiplying it will make the particle borders more opaque when compositing them over the background).
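An untested sketch of the compositing pixel shader described above; the sampler registers and the factor 4.0 are illustrative choices:

```hlsl
sampler sceneSampler    : register(s0); // renderTexture1: the scene
sampler particleSampler : register(s1); // renderTexture2: gray additive particles
sampler gradientSampler : register(s2); // 1D black-to-green gradient

float4 CompositePS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 scene   = tex2D(sceneSampler, uv);
    // Accumulated particle density from the additive pass (gray, so .r is enough).
    float  density = tex2D(particleSampler, uv).r;
    // Use the density as the lookup coordinate into the 1D gradient.
    float4 ghost   = tex1D(gradientSampler, density);
    // Boosting and clamping the density makes the borders more opaque.
    float  opacity = saturate(density * 4.0);
    return lerp(scene, ghost, opacity);
}
```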

If you can render a lot of small particles, you could also create a similar effect in one pass by using a ghost mesh as particle emitter and using the dot product between the eye ray and the mesh normal to calculate the creation color of the particles, but you will most likely have to calculate the dot product on the CPU at the creation of every particle.
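A sketch of that per-particle color computation at emission time (all variable names are illustrative). The idea is that at the silhouette, the normal is roughly perpendicular to the eye ray, so the dot product is near zero there and near one on faces pointing at the camera:

```csharp
using System;
using Microsoft.Xna.Framework;

// Assumed per-spawn inputs: camera position, spawn point on the ghost
// mesh, and the (unit) surface normal at that point.
Vector3 toEye = Vector3.Normalize(cameraPosition - spawnPosition);
float rim = 1.0f - Math.Abs(Vector3.Dot(toEye, surfaceNormal)); // ~1 at the silhouette

// Greenish-yellow at the rim, fading to black inside; the channel
// weights are just one possible tint.
Vector4 particleColor = new Vector4(rim * 0.8f, rim, rim * 0.2f, 1.0f);
```

The color stays fixed after emission, so the effect is only approximate when the camera moves relative to slow particles.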

360 Dashboard like Keyboard

An onscreen keyboard can be found here.

Random Tips

  • In Windows, to capture only the selected window, use ALT+PrtScr.
  • In Photoshop, spend some time looking into the "brushes" panel. It allows crazy shape and color dynamics, and if you have a pen tablet, you can make things depend on pressure and so on. For instance, the ghosts in this image were done in 30 seconds (not joking) by combining the right brush with some layer outer+inner glow.

Additionals


Attachments:
  • Melody.png (167.9 K, 2007-04-27): Example of object normal map generation with Melody
  • PointSprite.fx (1.0 K, 2007-04-19)
  • shader.fx (1.0 K, 2007-04-19)

Page URL: https://twiki.graphics.ethz.ch/bin/view/GameClass/ToolsAndTips
2024-03-29
© 2024 Eidgenössische Technische Hochschule Zürich