Scanned objects material update

While exploring the effect we want to apply to objects as they are being scanned, I have managed to create the effect shown in the image. My goal now is to combine both techniques and settle on a final texture.


*EDIT* Having researched this, I have managed to create an effect that we could potentially use. Combining the toon FX outlines and the surface light materials produces the desired effect. However, the only problem so far is that the glow is only visible when it is cast onto nearby objects. I will look into developing this effect further.


*EDIT 2* I believe I have finally reached a good point at which to stop exploring this particular material.

Scanned objects material

Within the product we are creating, we aim to make objects stand out as they are being scanned by the AR Drone; in this case, that will be the bombs the player has to defuse within a certain period of time.


As you can see in the image, I have managed to achieve an outline for the object, but I am trying to add a glow so it is a bit more pronounced.

Augmented Reality limitations.

I have found a document detailing the limitations of augmented reality. These are guidelines we should follow carefully to achieve a good workflow and produce a game that works and plays well.

D’Fusion is a development kit designed for augmented reality (http://www.t-immersion.com/products/dfusion-suite/dfusion-mobile) and here is a list and a Q&A of what D’Fusion can and can’t do.

D’Fusion Limitations

Overview:

 

·         Use only textures with power-of-two dimensions.

·         Avoid big textures – adapt the texture size to what will finally be rendered. E.g. you don’t need a big texture for a static panel that ends up rendered into a zone of a few dozen pixels.

·         Don’t use shaders.

·         Don’t use shadows.

·         8000 triangles max if your scene is not animated. 4000 triangles max if your scene is animated.

·         Skeletal animations are limited as most of the time they are not hardware accelerated (depending on the devices).

 

Detailed Q&A:

Q: Are bump maps created in Maya rendered in Ogre for AR?

A: Bump and normal maps are not exported; they are only supported in Ogre through the use of shaders applied after export. We use these very rarely, though, as they will not work on all computers, so they should not be used for projects that will be on the web.
Q: How can you export baked textures from Maya (using the Mental Ray baked texture) into Dfusion?

A: You can add them to your diffuse texture or create another slot in your .material.
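For reference, one way this extra slot might look in the exported Ogre .material script (the material and texture names here are hypothetical – check the names your own export generates) is a second texture_unit that modulates the baked lighting over the diffuse map:

```
material bomb_material
{
    technique
    {
        pass
        {
            // Diffuse (Color) texture from the Maya export
            texture_unit
            {
                texture bomb_diffuse.png
            }
            // Extra slot added by hand for the baked texture
            texture_unit
            {
                texture bomb_baked.png
                colour_op modulate   // multiply the baked lighting over the diffuse
            }
        }
    }
}
```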

Q: How can you export animated image sequences on the mapping channels of a Material in Maya into Dfusion?

A: You must add them manually in your .material created by your export.
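Ogre material scripts support an anim_texture directive that cycles through a numbered image sequence, so a hand-edited slot might look like this (file name and timing are illustrative assumptions):

```
// Inside a pass of the exported .material
texture_unit
{
    // Cycles flicker_0.png .. flicker_7.png over 2 seconds, looping
    anim_texture flicker.png 8 2.0
}
```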

Q: Can you export a spherical global environmental map from Maya into Dfusion?

A: No, you can add an environment map in your .material created by your export.
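In an Ogre .material script, an environment map is added with the env_map directive on a texture_unit; a minimal sketch (the texture name is hypothetical):

```
// Inside a pass of the exported .material
texture_unit
{
    texture environment_sphere.png
    env_map spherical   // spherical environment mapping
}
```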

Q: Which Maya Material Channel Maps are supported in dfusion?

A:

·         Color: RGBA/texture file (jpg, png)

·         Transparency: RGBA/texture file (if transparency uses the alpha channel of the Color texture file)

·         Ambient Color: RGB

·         Incandescence : RGB

·         Specular Color: RGB

·         Cosine Power

·         Color and Ambient Color should be the same. If a texture file is used as Color, Ambient Color must be white (RGB = 1, 1, 1).

·         The other components are ignored. In particular, advanced materials such as procedural textures or Mental Ray™ materials are not taken into account by the exporter.

·         Real-time shaders are not exported either.
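Putting the supported channels together, a pass in the exported .material might map onto Maya's channels roughly as follows (a sketch with hypothetical names, not the exporter's exact output):

```
material scanned_bomb
{
    technique
    {
        pass
        {
            ambient 1 1 1              // Ambient Color: white, since Color uses a texture
            diffuse 1 1 1 1            // Color (modulated by the texture below)
            emissive 0.2 0.2 0.2       // Incandescence
            specular 0.9 0.9 0.9 32    // Specular Color + Cosine Power (shininess)
            texture_unit
            {
                texture bomb_diffuse.png   // Color; Transparency via the PNG alpha channel
            }
        }
    }
}
```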

Q: Can the mesh be created by all triangles with 3 vertices or does it always have to be quad faces?

A: Yes, you can model in triangles or quads. Quads are converted to triangles on export, while triangles remain triangles. The triangle count in Maya is what you should watch to judge how complex a scene has become, since D’Fusion counts triangles, not quads.

Q: I have a model with a texture and set the ambient RGB colour to 0.5, but the exporter changes the ambient colour to 1 automatically. How can I export the model with ambient RGB = 0.5 instead of 1?

A: The ambient light value in Maya defaults to 0 0 0, so the ambient light would not interact with the material. This is why the exporter’s default preset locks the ambient at the value of the diffuse (1 1 1); this behaviour can be disabled directly in the exporter through the Advanced tab.

Q: What does the message “Warning: Some submeshes have 32 bits indices”, which appears at the end of the export, mean?

A: This is not an error; it is a warning to prevent you from having a mesh with too many vertices. You exported this scene with the “public” target hardware setting, which prepares the .scene to be compatible with a large range of video cards (such as the generic video card in a laptop). This setting must be used if you are creating a web or @home application.
If your target is one specific machine with a top video card that you can manage (for an event, for example), you should select the “pro” setting.
However, to resolve this issue in a public application, you have to assign two materials (the second can be a copy of the first) to different faces of the mesh. For example, select 50% of the faces of the mesh and assign one material, then select the other half of the faces and assign the other material.

Q: I am having some problems using a .PNG texture in Maya. I get strange transparency issues with polygons that are close together.

A: In general, if you have alpha priority problems, you can manage them in the .scene file or directly in the exporter. Right-click on the mesh whose priority you want to change and select the priority channel, then enter the number you want. Keep in mind that the highest number appears in front (priority = 10 renders in front of priority = 9).

Q: I am wondering how or if it’s even possible to use specular mapping in Dfusion.

A: D’Fusion supports specular maps, but only through shaders, which means you can only use this feature when targeting specific hardware (only certain video cards will support it). It is not compatible with a “public” export for a web application, because the application would become too heavy to download and, of course, many generic video cards do not support shaders.

Q: How do I create a glowing light bulb in Augmented Reality?

A: To make a light bulb effect in D’Fusion, I would not try to use emissive materials, as this will most likely not give you the result you are looking for.
When we need a glowing effect here, we generally fake it with a painted-in glow on a .png plane, or a series of planes. It generally looks better, you have more control over the look, and you don’t have to worry about hardware requirements or writing glow shaders.
So, for a light bulb, I would make an “off” material for the bulb and a separate, brighter “on” material. I would also make a plane surrounding the glowing part of the bulb with a .png of a glow on it. If the object is going to be turning in space, you can try using a series of planes. I would have the materials unaffected by the scene lighting, and have the engineer program a material swap at the same time that you animate the visibility of the glow plane “on”. Alternatively, you could have two light bulbs, one off and one on with glow planes, and animate the visibility of everything in Maya instead of swapping the materials in code.
When using planes with .png transparency, it is best to freeze transformations only on scale (and possibly rotation), but not translation, as this will affect z-sorting.
You can, for example, duplicate the same mesh several times, each copy with a bigger scale and a lower opacity. The outline of the object then becomes fuzzy, like a glow, and this can be animated with fade-in/fade-out effects.
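As a sketch of that last technique, each duplicated “shell” mesh could use an alpha-blended material like the one below (assumed Ogre material script syntax; the name and values are illustrative), with the opacity lowered on each larger copy:

```
material bomb_glow_shell
{
    technique
    {
        pass
        {
            scene_blend alpha_blend   // blend the shell over what is behind it
            depth_write off           // stop the shells from occluding each other
            diffuse 0 1 0 0.3         // green shell at 30% opacity; lower it for bigger shells
        }
    }
}
```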

Q: We are creating a web based AR project, what is the maximum number of polygons that can be used for web AR? What is the recommended?

A: Technically, with a “Public” export (used for web-based projects so most computers can view them), there is a per-mesh limit of 65,000 vertices. That is per mesh, not per scene.
Here, we try never to go above 50,000–60,000 triangles on screen at any given time for a web-based project, and lower if acceptable. For a project using skeletal skinned animation, we try to go even lower, since this adds to the load on the computer. For real-time use, it is always best to use the lowest acceptable poly count to improve performance.
It is also important to look at texture sizes, both for their impact on performance and on download times.

Q: When an object has animated transparency (on the shader) and appears in front of another object, it appears invisible – you can see straight through the model. Why is this?

A: You can resolve this problem by using priority. To set priority, you can either set it in the exporter by right-clicking on the mesh you want to modify, or manually change the generated scenette file.
By default the priority is 0, but it can lie anywhere within [-50, +50]. During rendering, meshes are sorted by priority: meshes with a higher priority are displayed in front of meshes with a lower priority.

- Metin Kolsuk (3D)