
Show Posts


Messages - CrashTheuniversE

1
Anyway, I don't know if WME will allow this sort of overlay technique, but perhaps that will give some ideas on how to do it with existing instructions.
It certainly would. But I believe Crash wants the overlay to actually interact with the image, to deform it etc., hence the pixel shader request.
But I have to admit rendering the entire scene into texture just for post-processing purposes sounds like an overkill to me too..

Well, that's what it's all about nowadays.
I mean shaders.
Take into account, for example, that many products nowadays, even ones without skyrocketing hardware requirements, make 8 to 20 passes over what you end up seeing.
Overkill? On my 3D engine, on a common Radeon 9800 SE with 128 MB of RAM, a full-screen glow with downsampling takes 8 passes (4 downsamplings, horizontal and vertical) plus a full-screen glow pass. That's common. Same thing for toon shading:
a vertex shader and first shader pass, one sharpening pass, 3 passes for Prewitt-Sobel (horizontal, vertical, merging), and finally a full-screen blur.
Wondering about the FPS? It ranges from 120 to 150 at 1024x768, with about 40K non-backface-culled polygons on screen.

The images you've seen of Spherical Harmonics Lighting are much the same: the pixel shader itself is already heavy for the SHL, and then you add all the passes for plain texturing, bump mapping, and shadow sharpening or blurring depending on the lighting situation (low frequency indoors, high frequency outdoors).

In the end, it all comes down to the target user you want to reach.
At least a Radeon 9600, or its NVIDIA counterpart, would be required.

P.S. Obviously all those passes require P-buffers, not just a single render-to-texture... Anyway, we are trying to figure out some solutions ATM, something like Brat proposed, but I haven't figured out how to integrate it yet.

2
Well, actually, this could be a starting point for testing.
In any case, there must be a way to get a per-frame update event, or something similar, on selected entities/actors.

If I have a light object that moves around the scene, and I want to affect pixels weighted by inverse distance, I have to change the shader constant at every frame update.
While I could register a script event for this, it might not be healthy for performance compared to a low-level event. A better solution on your side might be an internal callback that updates the object's state right before rendering, so that I can drive the pixel shader's behaviour through constants.

Of course, the main problem with this approach would be additional texture units: that is, the ability to set another texture unit (be it a shadow, light, noise, bump, or whatever other texture is needed).

Since you currently have no Image entity that simply loads an image from file into video memory and keeps it there, that could mean additional work.

Setting it on a texture stage is not that hard anyway. Simply put, all it involves is a user-defined number for the texture stage and a SetTexture call. There is still the problem of resetting the texture stage, but on the engine side you can set the next texture stage to 0 after each object is rendered, resetting the state that way.

Additional UV sets, on the other hand, are not a concern, since 99% of pixel shaders either reuse the UVs from the existing pass or generate what they need on their own.

This won't cover the full-screen case for the moment, unless you add an option on the Game object that allows two-pass pixel shader rendering, where you can also set a shader for the second pass. That could change things a little; I don't remember, at the moment, the process of rendering to a texture in DX8.

Anyway, I think that for the moment we could start with simpler steps and then move smoothly towards a more complete feature set. If you like the idea of having this feature in WME, let's start with feedback and testing; refinement is easier once the concerns we are not thinking of right now start to arise.





3
I read some posts asking about FX files integration.

For my part, vertex shaders could be nice for some effects of course, but in our case with The Aurora Clock, since we have a good number of 3D artists in various locations using several different 3D packages, we are taking a fully 2D approach.

In this case, pixel operations could be a must. During my software engineering degree I took some image processing courses, and nowadays you can find many papers about porting MATLAB algorithms to run in real time on pixel shaders. It simply means rendering the full screen to a texture, then drawing it again to the render target, this time with a pixel shader applied.

I think that could do the job even for some sprites, this time without having to render to texture, save, and render again:
simply apply the pixel shader to them as a material attribute.

All I would need is a pixel shader entity, loadable from a .psh file (not necessarily HLSL or GLSL; even shader asm would do for the moment), and a function to set a constant. This would be very limited support, but still of real interest. For example, I was wondering
how to implement a Silent Hill 2-style full-screen noise effect; not being able to access the rendered frame makes this impossible, of course.
Without shaders there would be no chance anyway, since there is no way to get a decent frame rate by rendering to a texture, locking the memory, reading it back to system memory, doing the work, and uploading it again... every frame...

With pixel shaders it could be done easily: a noise distribution pass, then any convolution matrix applied per pixel. I'm sure shader asm and constant setting are possible on DX8; if you check the 3D section of my site, the images from the toonish engine were made on DX8 with vs and ps in asm (1.0 and 1.2). (www.crashtheuniverse.com)

I think that would be a very interesting feature if coupled with the ability to pass an additional texture unit or so
(for example, for real-time bump mapping, light maps, glows, sepia toning, and so on).

That would open up a very wide set of opportunities at the design stage, and I think with a relatively small trade-off.

It's not the same on the programming side, though: it would mean a whole new block of features.

If that's a burden (and I know it can be), it would be nice to have some more access to the engine via plug-in.
If we can find a way to let me set some data, like the pixel shader to be used or some constants, without the whole state
being reset by the internal renderer, I can already think of a nice plug-in to handle all of this.

Let me know what you think ;)

4
Well, the problem could be the NEAR and FAR planes having the same value. In any case, when you see -1.00 in both the NEAR and FAR clipping planes, it means the engine will use default values. I don't know which ones Mnemonic uses, but common choices are 1.0 for the near plane and 100.0 for the far plane. When you clear the three values he probably falls back to reasonable defaults, which is why it works.

If you have problems with the camera being too close to the object, or for other reasons (like zoom or focal length, which I don't know whether they are currently taken into account in the camera matrix together with changes to the projection matrix), you may need a near plane of less than 1.0: for example, try shrinking it to 0.1, with 100.0 or 1000.0 for the far plane.

Tell me if it works for you. Anyway, Mnemonic can clarify this better than me... I'm a little WME noob eheh :) but better at 3D coding ^_^

5
Technical forum / Accessing D3D objects from a plugin
« on: April 29, 2006, 05:54:19 PM »
I've seen in the doc that I can access D3DDevice and DDrawInterface ...
Could someone make an example? Maybe in the wiki?

I may sometimes need to change the culling from CW to CCW or vice versa. So instead of requesting it as a new feature,
I was thinking about writing a plug-in that could give a little more access to the 3D renderer.

Thank you all.

6
Game announcements / The Aurora Clock
« on: April 21, 2006, 11:26:19 AM »
My team and I are proud to announce that we are working on a new adventure game based on WME.

The game name is "The Aurora Clock".

The official homepage is www.theauroraclock.com

After a team-gathering period and a concept study, we are now approaching production. More info to come on the game's homepage.

Thank you for your attention,
Regards

