2009-06-21

Pokemon Battle Revolution - h4x

Interesting theoretical question;

Let's say you make a video game with monsters such as Lizardon (Charizard for some);

And you want it to have different textures so you can make different colored versions.

The logical thing to do is to add the textures as references, or make a named texture set so your model can just quickly request a texture set switch, right?

Apparently, someone at Nintendo didn't get the memo, and instead they JUST MADE AN ENTIRE NEW FILE with the additional textures in it, ignoring the fact that the entire model set is identical and only the textures change...

For example,

Mr. Lizardon is ~ 800 kB, compressed.

When you decompress him, you get:

lizardon_0 ~ 1024 kB
lizardon_1 ~ 1024 kB

So, the funny part is that the textures take up maybe 200 kB of each file.
If they had just added the extra textures to one file, it would have saved nearly 800 kB, and who knows what that translates to in the compressed format!

Weird enough. It also explains why some of the colorings I had were dead wrong (or does it?).
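For the curious, here is a rough sketch of the layout I would have expected instead: one geometry block plus named texture sets the model can switch between. The structure and names are my own guess at a sane layout, not how the game files actually work.

```python
# A hypothetical shared-geometry layout: one model, many named texture sets.
# This is what I would have expected, not what the game actually stores.
from dataclasses import dataclass, field

@dataclass
class Model:
    geometry: bytes                                    # ~800 kB, stored once
    texture_sets: dict[str, dict[str, bytes]] = field(default_factory=dict)
    active_set: str = "normal"

    def add_texture_set(self, name: str, textures: dict[str, bytes]):
        self.texture_sets[name] = textures             # ~200 kB per color variant

    def switch(self, name: str):
        self.active_set = name                         # recolor = swap references

# Two variants then cost geometry + 2 * textures, instead of 2 * (geometry + textures).
```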



Oh well. Once I figure out where it tells me how many vertex arrays / vertex objects there are, I'll have it nearly 100% exported.

Peace!

-Z

2009-06-20

Pokemon Battle Revolution - h4x

The first thing you probably said was "Pokemon!?? @#$%%(&!?" with more expletives.

On a side note: it was useful to decode the LZSS variant used in these games. It offers a serial form of compression that is very good at packing redundant data, and it is byte-driven instead of bitwise like Huffman coding. It seems to get pretty decent compression ratios for sparse or repetitive data, and it is general and lossless. Also, learning more block compression schemes for pixel data is useful; PVRTC is the only one I care about at the moment, but S3TC and some others are handy for my own projects. Do they have a normal map compression scheme yet? Hm.
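For reference, here is a minimal sketch of how this family of decoders works. The exact token layout of the variant used in these games isn't spelled out here, so the flag-byte plus 4-bit-length / 12-bit-offset encoding below is just a common LZSS arrangement, shown for illustration.

```python
# A minimal LZSS-style decoder sketch. One flag byte controls the next eight
# tokens; each token is either a literal byte or a 2-byte back-reference.
def lzss_decompress(data: bytes, expected_size: int) -> bytearray:
    out = bytearray()
    pos = 0
    while len(out) < expected_size and pos < len(data):
        flags = data[pos]                  # each bit selects literal vs. back-reference
        pos += 1
        for bit in range(8):
            if len(out) >= expected_size or pos >= len(data):
                break
            if flags & (0x80 >> bit):
                # back-reference: copy `length` bytes starting `offset` bytes back
                hi, lo = data[pos], data[pos + 1]
                pos += 2
                length = (hi >> 4) + 3
                offset = ((hi & 0x0F) << 8 | lo) + 1
                for _ in range(length):
                    out.append(out[-offset])
            else:
                out.append(data[pos])      # literal byte, copied through
                pos += 1
    return out
```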

On the actual note: I'm not sure if finding out how Pokemon are made is useful, but it does show me just how crappy your textures can be if you use shaders properly.

And why would they use vertex keyframe animation? It makes it really, really hard to integrate into any other system, which might explain the lack of Digimon/Monster Rancher type games. Oh well. Only theories.

Here are some random Pokemon in Blender; no credence given since I never played the games:



-Z

2009-06-13

You're fucking right

So I have yet another bad day out of my 27,375 available.

Solution Time.

1. Create my own 3D animation tool that does what I need it to;

2. Open it to the community via Sourceforge;

3. Integrate IGTL::Sys into the mix to improve both components;

4. Create 1 x demo project to showcase how it works in a pipeline.

I've named the project; Once I finish the design specifications it's on to coding it.

Planned:

*Support for full 3D Joint/Matrix based animation (Hardware level matrix joints, with bone display option for seamless transitions from your old tools)

*Support for the newly popular UVS animation (Using bones as 2D billboards that have x,y scale and can change uv coordinates per keyframe; Think Odin Sphere)

*Perfect Data Editor so that all data can be viewed/changed irrespective of the present 3D view/mode, as well as locking and visibility flags for all pieces of data.

*Keyframe based animation on a discrete timeline; So be sure to start with your base FPS set correctly (default is 50)

*Logical flow to data and animation; No 'global timeline' exists. Everything is animated from 'Strips', and 'Sequences' can be made from any combination of strips (see the sketch after this list)

*Mode driven editing; Object mode for scene objects, Mesh Keyframe mode for mesh objects; Pose keyframe mode for Joint/UVS objects.

*Level Of Difficulty integration; Default system is super simple; Advanced users can tweak and expand editors as needed to suit their work mentality

*Entirely abstracted input with multitouch support; Any Human Input readable by SDL can be used and assigned to hotkeys/event keys. This also means you can record some macros within the program. (Automation tools)
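To make the strip/sequence idea above concrete, here is a rough sketch of the data model. The names and fields are placeholders, not a final design.

```python
# Strips own their keyframes on a local, discrete timeline; sequences paste
# strips together at start frames (overlaps allow blending walk + run, etc.).
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    frame: int             # discrete frame index on the strip's local timeline
    value: dict            # e.g. {"joint_3": (rx, ry, rz)} or {"uv": (u, v)}

@dataclass
class Strip:
    name: str              # "walk", "run", "blink", ...
    fps: int = 50          # base FPS, default 50 as noted above
    keys: list[Keyframe] = field(default_factory=list)

    def length(self) -> int:
        return max((k.frame for k in self.keys), default=0)

@dataclass
class Sequence:
    name: str
    entries: list[tuple[int, Strip]] = field(default_factory=list)  # (start_frame, strip)

    def length(self) -> int:
        return max((start + strip.length() for start, strip in self.entries), default=0)
```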


Some obvious problems:

-Exporting will be limited initially to a text format for interoperability; So long as the text format specification is rigid (like OGRE XML), anyone can easily write a converter from text -> custom. (A rough sketch of what I mean follows this list.)

-Speed and User friendliness; If you do not have a graphics card with OpenGL 2.0 or newer, the program will initially deny you the ability to use it. This will be fixed later; it is not a priority to write my own TnL path for OpenGL 1.1 users. I have done so before, but this is the lowest-priority item.

-Operating Systems; Windows, Mac, and Linux don't seem to have multitouch support in the SDL version I am using. Too bad; I'll have to store special mouse states for 'virtual mice', aka joysticks.

-Networking; I want this program to allow collaborative editing. This is always difficult to do and not a priority item because it is outside the scope of the first version; plus, it would be mostly beneficial for scene editing and individual animations. Think cooperative moviemaking.

-Rendering; This program is NOT a rendering tool. It can export frames, but only as good as your graphics card can make them. I do not want to write a rendering pipeline, but I should include exporters so you can dump an animation to a real rendering program like Blender. However, this is also a low-priority item due to it being outside of the scope.

-Complexity; All good tools have reasons for their complexity. Often it is lazy programmers, but that's because some of the basic problems are very difficult and they have timelines to meet, so they go with the 'dumb but it works' solution. I am doing this for no profit, so there will be slowdowns in the development of this tool.

-3D Mesh Modeling; I do not want to make a mesh modeler; that is Blender's job, not this tool's. I want this tool to focus on making game animations with an existing mesh, as well as weighting the vertices of the mesh in the program. This means I will need some mesh tools, and that means the first thing people will demand is a mesh editor/generator. That is a low priority and not in the scope of the program, though I may add some cool tools for it in the future, especially because re-meshing is common in the real industry, which includes re-UV-mapping and adding/removing some vertices. This will have to be supported and is a medium-priority item.
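As a strawman for that rigid text format, here is a sketch of a dumper for a single mesh. The section keywords and layout are made up for illustration; they are not the actual specification.

```python
# Dumps one mesh to a rigid, line-oriented ASCII layout so a text -> custom
# converter can be written against a fixed spec. Format is hypothetical.
def dump_ascii(path, name, vertices, faces, weights):
    """vertices: [(x, y, z)], faces: [(i, j, k)], weights: [{joint_index: weight}]"""
    with open(path, "w") as f:
        f.write(f"mesh {name}\n")
        f.write(f"vertices {len(vertices)}\n")
        for x, y, z in vertices:
            f.write(f"v {x:.6f} {y:.6f} {z:.6f}\n")
        f.write(f"faces {len(faces)}\n")
        for i, j, k in faces:
            f.write(f"f {i} {j} {k}\n")
        f.write(f"weights {len(weights)}\n")
        for per_vertex in weights:
            pairs = " ".join(f"{joint}:{w:.4f}" for joint, w in per_vertex.items())
            f.write(f"w {pairs}\n")
```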


I plan to have it work a lot like Blender. I hope that if I build it, indie developers and hobbyist animators can use it for their purposes and avoid the headaches of the classic paid-program nightmares.

This should take me at least 6 months to get a beta out.

-Z

2009-06-10

Extremely Depressed

As stated, I am very sad.




More than 10 years of programming, multiple jobs, and even an engineering degree later, I'm still not happy.

Here's why:

Blender, being the free open-source 3D wondertool, had intrigued me from when I first found it. However, after years of playing with it, making animations and games using this tool, now that I have entered the realm of 'real' game development, Blender is severely lacking in multiple areas.

1. Armatures

The concept of an armature is invalid. The 3D graphics hardware you have, and have had since the 1970s, has always worked as 'projection matrix' * 'modelview matrix' => output raster position. Now, modern 3D hardware has the ability to be programmed, so people like me can code in fully articulated characters by adding weights per vertex and writing a simple vertex shader that multiplies by each joint's matrix (a sketch of that blend follows this section).
Blender does not conform to this universal standard; it instead tries to 'make it easy' by giving you a 'bone', and here's the serious problem: it has a length. Matrices deform about their own origin, not an arbitrary point. This makes conversion to my game and from my game to Blender impossible; thus, Blender cannot be used for the animation pipeline. Any attempt to 'hack' Blender into making this work is a waste of time; true, you can constrain your game a lot, but if you had 1/100th the experience I do, you would know better. Now for another point: even with armatures, Blender's animation system is designed for movies; that is, everything works on a global timeline via global IPO keys. No game works like this, so combining run + walk animations becomes very difficult, as does keeping track of current animation track data. They have botched and fluffed over this for years; no positive results yet.
In conclusion, thanks to a broken bone system and an incompatible animation keying system, I no longer have an animation tool my artists can use for our pipeline.
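Here is that per-vertex blend as a minimal CPU sketch, using NumPy for the matrix math. In the real thing this lives in a vertex shader; the joint matrices and weights below are placeholders.

```python
# Linear-blend (matrix-palette) skinning: each vertex is transformed by a
# weighted sum of joint matrices. Done on the CPU here purely for clarity.
import numpy as np

def skin_vertex(position, joint_matrices, influences):
    """position: (x, y, z); joint_matrices: list of 4x4 arrays;
    influences: list of (joint_index, weight) pairs, weights summing to 1."""
    p = np.array([*position, 1.0])            # homogeneous coordinate
    blended = np.zeros(4)
    for joint_index, weight in influences:
        blended += weight * (joint_matrices[joint_index] @ p)
    return blended[:3]

# Example: a vertex influenced 75% by joint 0 (identity) and 25% by joint 1
# (translated +1 on x), giving a 0.25 shift along x.
joints = [np.eye(4), np.eye(4)]
joints[1][0, 3] = 1.0
print(skin_vertex((0.0, 0.0, 0.0), joints, [(0, 0.75), (1, 0.25)]))  # [0.25 0. 0.]
```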

2. Space conversions

Blender doesn't use a math-centric +X forward, +Y left, +Z up space consistently. This causes nothing but headaches for everyone. There is no justification for having inconsistent coordinate systems; pick a coordinate system and make your entire program consistent.
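For what it's worth, converting between conventions is just a fixed axis swap. The source convention below (X right, Y up, -Z forward) is a hypothetical example, not a claim about what Blender uses internally.

```python
# A fixed basis swap from a hypothetical (X right, Y up, -Z forward) space into
# my +X forward, +Y left, +Z up convention. Both are right-handed, so this is a
# pure rotation (no mirroring).
def to_my_space(x, y, z):
    forward = -z   # source -Z (forward) becomes my +X
    left = -x      # source +X (right) is the opposite of my +Y (left)
    up = y         # source +Y (up) becomes my +Z
    return (forward, left, up)

print(to_my_space(2.0, 0.0, -1.0))  # one unit ahead, two to the right -> (1.0, -2.0, 0.0)
```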

3. Pipeline

When I make a model in Blender, I use my character sketches and some quick concept coloring. Blender makes mesh modeling quick, which is nice. However, when I finish with my model, I want to take the data out and put it into my game. There are many options for this, but I usually have to write my own converter. And with every single update of Blender, guess what? My converter breaks somehow, thanks to an undocumented Python function or a change in the way things work. Usually the breaks are not too large, but this is a lot of my time wasted on something the program should do automatically. For instance, 'dump ascii' should export a large, concisely documented ASCII file of all the data for the current selection, including its linked data and so forth. If they wrote a game engine in Blender, why can't we dump that data out? And why do I keep having to write more converters to spit out a text file?

4. Data Model

Blender uses an older C data model. This is a good one to use; however, I would like to have more data-model tools. For instance, if everything is reference counted and deleted on zero counts, fine, but let me control that and show it to me in the OOPS view or a special 'data tree' viewer. As a developer, I need control over that data to improve the exporters I have to write for this tool. Also, sometimes Blender files get junked up with bad chunks from older files. And, more importantly, where is the .blend-to-ASCII converter? That would be very nice to have.
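To illustrate the kind of visibility I mean, here is a toy sketch of datablocks with explicit user counts and a plain-text data-tree dump. The class and field names are hypothetical, not Blender's actual internals.

```python
# Datablocks with explicit user counts, plus a dump() that prints the link
# tree so you can see exactly what would survive a zero-count cleanup.
from dataclasses import dataclass, field

@dataclass
class DataBlock:
    name: str
    kind: str                        # "Mesh", "Material", "Texture", ...
    links: list["DataBlock"] = field(default_factory=list)
    users: int = 0

    def link(self, other: "DataBlock"):
        self.links.append(other)
        other.users += 1             # deleted only when users drops to zero

    def dump(self, indent=0):
        print(" " * indent + f"{self.kind} '{self.name}' (users={self.users})")
        for child in self.links:
            child.dump(indent + 2)

mesh = DataBlock("lizardon_0", "Mesh")
mat = DataBlock("skin", "Material")
mesh.link(mat)
mesh.dump()
```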

5. Next Gen Content

Blender is currently pathetic when it comes to this. Let's say I want to make an MGS4 Snake. No problem, you say, and you model out a nice 3300-poly Pliskin and then build an armature for him. Now you can bend and animate him with some ease, though, lookie here, his shoulder bends funny! Well, after about an hour of tweaking the armature, you got it to look better, but not commercial quality. Now you have to generate a mesh keyframe and link it to a Python controller that listens to the armature. Okay, fine. But how do you export that data out of the system? And how do you ever preview your animations if keyframes are global to the application? Hm, looks like you have a severe problem editing keyframes and armature actions. OH NO, you added visemes so Snake could talk; looks like there's no way to make those animations except by manually entering times on the timeline. Oh, and look, while he's talking, the Python-controlled armature actuator is fuzzing the keyframes... Looks like you just wasted 8 hours fighting a system that wouldn't work anyway.
Enough bitching about that example. The point is, if you have an animation system where one special component can have 'local' timelines (armature 'actions'), why can't mesh keyframes and other animation systems have 'time strips' that you can make, so that your main animation system can paste strips together? Oh, what's that? NLA? It only works with armature actions, sorry. Unless you're making a movie with no dynamic content, you're SOL here. And try writing an exporter for that. At least they finally added GLSL to the damn system.



I'm so depressed. What do I do, write my own tool like FrameGL3 (already solved all these problems myself, by the way: SDS, IK, anims, etc...) or do I just give up? This is a lot of work for anyone to undertake, purely because of the gruntwork required. More importantly, there has to be someone else who has this problem, but where is their solution?


Also, being unable to crack Dragon Quest Swords' funky LZSS-type compression really has me down. But not down like Valgirt Nedlog has me down; fucker's hard!


Maybe I should give it all up for a while, like, a year or something...


I'm in the wrong fucking state/country/planet...


-Z


As a side note: I've hacked the graphics out of Primal Rage 2, Killer Instinct, Wario World, Super Smash Bros, Turok, and many other games just to learn how they built their data, as my ONLY FORM OF VALIDATION that what I have been doing is correct.