
ragnarokfileformats's Introduction

This repository will no longer see updates. My work continues here: Ragnarok Research Lab

To see the latest results of my research into Ragnarok Online's file formats, visit the new documentation website.

Note: I'll archive the repo after the migration of all relevant contents has been completed.


Ragnarok Online File Formats Manifest

Gravity Co has used some fairly arcane file formats in creating their first MMORPG client, presumably supported by custom (in-house) or now-obsolete software.

This repository contains my findings from researching how they work and how they can be used or modified, alongside sources and links that at times have been quite difficult to track down after all these years.

Hopefully having it all collected in a git repository will allow the information to remain available for many years to come. Feel free to do with this as you please!

Obvious disclaimer: I don't claim ownership of any findings and results of the hard work that others have put into analysing these file formats. Sometimes it's impossible to say who originally discovered the various details due to the passage of time and obscure references I had to consult, but I'll list my sources as best as possible.

Status

I've analysed and created tools to work with most of the common file formats by now. However, there are many unknowns and smaller details that may still need clearing up in order to understand the formats perfectly.

Often these are likely to be just oddities stemming from the fact that Ragnarok Online uses the old Arcturus engine, so they might not be required to process the files and render their contents correctly.

Newer (post-Renewal) changes may not be reflected in this specification, though I expect to look into them more once the basics are covered.

Contributing

Please open an issue or contact me directly via Discord (RDW#9823) if you can contribute in any way! Hopefully, with some help from the community most of the unknowns can be eliminated eventually.

File Types Overview

All of the file types come in different versions, because of course they do. Usually, newer versions add more features while many of the older versions are virtually unused, being remnants from the early days of RO development and Gravity's previous project, Arcturus: The Curse and Loss of Divinity.

They used a custom, in-house engine that was later extended to create Ragnarok Online. It's also most probably why RO's file formats and architecture seem pretty odd, by modern standards.

However, since even the earliest RO clients don't support all of the legacy formats, they will be of limited interest. As existing tools aren't really compatible, researching them further isn't something I'm currently planning on.

World Data (Map files)

The most complex data stored and processed by the client is arguably the world and environment data. Several files are combined to create a representation of the world (maps) and its parameters, which are, however, not perfectly understood at this time.

There exist several third-party programs that appear to recreate a faithful copy of the original client's rendering output, so the information available should be enough to interpret the original world without visible deficits.

| Extension | Interpretation | Contents |
| --- | --- | --- |
| GND | Ground | Map geometry, texture references, lightmap data |
| GAT | Altitude and Terrain | Terrain altitude and type (ground, cliff, water) |
| RSW | World Resources | 3D objects, water, lights, audio sources, effect emitters |

Sprites (2D Actors)

Most characters, monsters, and NPCs are displayed as 2D sprites with their various animations defined as "animation clips", which consist of multiple images. One clip exists for each of the different view angles, separating a full circle into just eight basic directions.

Which direction should be used is determined by the client based on the camera position, i.e. the angle between the camera and the monster's position, and the assumed direction it is currently facing (I think). The latter is probably derived from its movement pattern, since I didn't find any serverside code that would manage this.
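For illustration, here's a minimal Python sketch of how the relative camera angle could be quantized into the eight direction clips. The angle conventions are hypothetical; the client's actual math is unknown to me:

```python
def facing_direction(camera_angle_deg, unit_angle_deg):
    """Map the angle between camera and unit to one of 8 sprite clips.

    Hypothetical sketch: quantize the relative angle into 45-degree
    sectors, where 0 means the unit faces the camera.
    """
    relative = (unit_angle_deg - camera_angle_deg) % 360
    # Round to the nearest multiple of 45 degrees, wrapping 360 back to 0
    return round(relative / 45) % 8

print(facing_direction(0, 90))   # 2
print(facing_direction(0, 359))  # 0 (wraps around)
```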

The information on how to display a given unit's ingame actions is embedded in special collection files that contain additional information, such as sound effects to be played as part of the animation.

Animations are composed of simple image files that will be played as individual frames, consisting of one or several layers that can be arranged to display multiple sprites together, in a predefined order, and with various visual changes applied to them (such as rotation, scaling, transparency, and color shading).

| Extension | Interpretation | Contents |
| --- | --- | --- |
| SPR | Spritesheet | Sprite images as bitmaps or TGA, metadata and, optionally, an RGB palette |
| ACT | Animation Collection | Keyframes, animation details, and, optionally, a list of sound effects to be played |
| IMF | Interface Management File | Information used to render sprites on the UI (2D) layer (seems obsolete?) |

Models (3D Objects and Actors)

Very few 3D models have been used for actual monsters, and the formats are somewhat unfamiliar to me. Most 3D models are those placed on the map while rendering and represent objects in the environment, like trees or architecture.

Since the format for these appears to be custom or vastly outdated, I haven't found any standard tools to edit them, but people have written software to at least display the models and render them as part of a given map.

| Extension | Interpretation | Contents |
| --- | --- | --- |
| RSM | Model | 3D models used for objects (and sometimes units) in the game world |
| GR2 | GRANNY2 | 3D models stored in an ancient format once used by the Granny3D framework |
| RSX | ? | Arcturus 3D model format? (doesn't appear to be used in RO) |

Effects

Effects come in several varieties, all of which are processed separately. Some effects simply use the existing ACT/SPR formats, some use a separate compiled (binary) format, some are hardcoded into the client executable and in later versions particle effects can also be generated from settings stored in specific Lua files.

The hardcoded effects appear to be limited to effects involving 3D components, such as particles and geometric primitive shapes, though they will often use sprites and other textures as part of their animation. Purely sprite-based animations are provided as separate files, which are mostly understood by now.

Particles are configured using the "effect tool" and only seem to be used in recent years, with individual configuration files defining effect emitters for a given map.

| Extension | Interpretation | Contents |
| --- | --- | --- |
| STR | ? | Compiled (binary) effects format |
| EZV | ? | Raw (text-based) effect format? |

Script and configuration files

Client-side data and the Mercenary/Homunculus AI are written in the Lua programming language and embedded in the client itself, or sometimes stored as text files with a roughly CSV-like structure (where fields are separated by hash signs).
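As a sketch of what reading one of those hash-separated lines could look like (Python; the record shown is made up, real tables differ per file):

```python
def parse_hash_line(line):
    """Split one roughly-CSV-like client data line on '#' separators.

    Hypothetical example record; real files vary per table. A terminating
    '#' leaves one empty trailing field, which is dropped.
    """
    fields = line.rstrip("\r\n").split("#")
    if fields and fields[-1] == "":
        fields.pop()
    return fields

print(parse_hash_line("1002#PORING#Poring#"))  # ['1002', 'PORING', 'Poring']
```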

Lua files are human-readable text files and thus easily understood, even if the creators clearly didn't have readability in mind. LUB files are precompiled Lua files and can be somewhat reversed. This allows one to read much of them, though some information is still lost in the process.

Thankfully, LUB files are only used in Post-Renewal clients (or so it seems), which means that all the classic files are perfectly readable. Converters exist to restore the original table structure from LUB files, though I cannot vouch for their correctness.

| Extension | Interpretation | Contents |
| --- | --- | --- |
| LUA | Lua source code | Lua script file |
| LUB | Lua bytecode | Precompiled Lua script |

Audio files (Music and Sound Effects)

Finally, something easy! These are just regular WAV or MP3 files, respectively, and can be played with any standard program.

Interface files

They are regular images (bitmaps). Nothing to be said about them, other than that the Korean/Unicode filenames don't normally translate properly on non-Korean Windows systems and therefore can end up as gibberish.

Only 256 colours are supported and the first palette index is used as the transparent background colour. Usually this is the oldschool "pink" with RGB code #FF00FF, though similar colours can also occur:

It's safe to clear anything with red and blue > 0xf0 and green < 0x10
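That rule of thumb could be expressed as follows (Python sketch; the thresholds are taken directly from the quote above):

```python
def is_background(r, g, b):
    """Heuristic from the text: treat near-magenta palette entries
    ("red and blue > 0xF0 and green < 0x10") as transparent."""
    return r > 0xF0 and b > 0xF0 and g < 0x10

print(is_background(0xFF, 0x00, 0xFF))  # True  (classic #FF00FF)
print(is_background(0xF8, 0x08, 0xFA))  # True  (near-magenta variant)
print(is_background(0xFF, 0xFF, 0xFF))  # False (white stays opaque)
```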

Guild emblems are stored as compressed bitmaps, in the EBM format.

Community Resources

See here for a list of editing tools and client reimplementation projects that may help with understanding the file formats described in this documentation.

Endianness

All numbers found in RO's binary formats are stored as little-endian, which means that the individual bytes have to be "reversed" before interpreting the binary data as a number.
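For example, in Python the standard struct module handles the byte order for you:

```python
import struct

# Decode a little-endian uint32 from raw bytes, as found in RO's binary
# formats. The byte sequence 0x2C 0x01 0x00 0x00 represents 300.
raw = bytes([0x2C, 0x01, 0x00, 0x00])

value = struct.unpack("<I", raw)[0]  # "<" selects little-endian byte order
print(value)  # 300

# Equivalently, reverse the bytes by hand and accumulate:
manual = 0
for b in reversed(raw):
    manual = (manual << 8) | b
print(manual)  # 300
```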

ragnarokfileformats's People

Contributors

drgomesp, duckwhale, zhad3


ragnarokfileformats's Issues

Research the changes made in GND version 1.8

GAT 1.3, GND 1.6 and RSW 2.6, the waterinfo is still not correct for those new, they removed waterinfo in RSW 2.6 and moved it to GND 1.6 (they changed that somewhere between 2021-11-03 and 2021-11-16)

Source: Discord

Add links to the reference implementation (of the decoders)

Since they're part of an as yet undisclosed project, I can't do it just now. But once I'm done with the STR decoder and have resolved the remaining RSM issues it should be useful enough, even if some newer versions aren't currently supported.

Research how palettes are applied exactly by the client

From my notes (source unknown):

Palettes are applying on the pixel shader to avoid having to re-build each sprite image (faster and easier).

This issue should remind me to consider the following questions:

  • What is the source?
  • Is this actually true?
  • If yes, can it be verified (in the client)?
  • Is it relevant and should be noted somewhere in this repository?

Add layout tables to the RSM specification

Some parts may be missing or need verification.

For example, it is currently unclear why some triangle faces have more than three entries for the smoothing group array (I've seen up to five).

Research the changes made in RSW version 2.5

Example: gw_fild01.rsw (latest kRO patch)
Interestingly enough, GRF editor doesn't have the same issue as with version 2.2, meaning it doesn't crash (but still reads the wrong values for anything but the header).

Probably should find out what they changed in 2.2 first, though (see #22).

Research bounding box parameters used by the GR2 models

I think they're somehow used to calculate the final position during (or possibly after) binding each mesh to its assigned bones. I haven't done this and the Archer Guardian model looks off, with its shoulder plates "flying" above where they should be located.


The bones of the shoulder similarly aren't quite right (too far up), and there's some sort of "placeholder" mesh (rigid, not skinned) that appears to be omitted, but would technically be attached to the shoulders if it were to be rendered.

More info/time for research is needed. As of yet, the spec can't be complete before I've found out what I'm missing.

Unfortunately this might be very difficult to resolve, which is why I'm putting it here as a reminder so that I can continue with #40 in the meantime.

Open questions: SPR

Some collection files contain additional sprites, such as shadows (?) or overlay effects, but I haven't been able to find out more about how they are used exactly.

What's up with that?

Add a specification of the GRF format

There are multiple versions, and IIRC not all of them are supported by the current standard tools (GRF Editor mainly). I remember there was some ancient tool for the old Arcturus/RO alpha version, which even had some textual documentation on how it worked. Maybe it's still archived somewhere? However I don't think the source is available (this always makes me very sad).

I know some people have worked on implementations that should be functional, though I'm unsure as to what versions are supported. Some examples:

Looking at how many working implementations there are, the format can't be very difficult to understand. At least it should be much simpler to research than GR2 or even RSM/2.

To be determined:

  • How do GRFs relate to RGZ and GPF? Are they the same or at least use similar mechanisms? I doubt they'd create multiple distinct archive formats, but it wouldn't be the first time they've done something silly...

Research ACT version 2.4

I found an older file on my computer that uses the version 2.4 format. In my editor it loads all bytes, so I guess it should be correct. It's different from version 2.5.

Research non-zero alpha values in the SPR color palettes

In the parser I found the following comment:

alpha: 255, // Shouldn't it always be ignored? // TODO only for <2.0?

This refers to the palette colors; various sources claim that alpha values for the indexed-color bitmaps are always ignored by the client. It remains to be seen whether this is actually true, though I'd need to see if there are any non-zero alpha values and check if/how the client interprets them.

I'd also be curious to see how they look if the alpha values aren't ignored. But that would require finding some where it isn't zero, or there'd be nothing to see after all.

Add any missing sources for third-party info as footnotes where possible

Now supported on GitHub: https://github.blog/changelog/2021-09-30-footnotes-now-supported-in-markdown-fields/

I've had to track down some sources after forgetting where the info is from after all this time, so linking them directly seems like it would avoid this problem. It would make it easier to double-check old info to disprove or verify it once more knowledge is available, which has been required most recently with the GND lightmap slices, for example.

Create a specification for the IMF format

I was under the impression that it's unused, but apparently that's not quite right. This I find rather confusing, since I didn't notice any visible glitches in my rendition of the animations.

Of course I didn't check every conceivable combination of equipment, so I might've missed something.

Research how 3D effects are rendered by the client

Just dumping here my notes so I can delete them and have it archived for later:

Are they created as mesh objects, unwrapped, textured, and animated all in code?

That is about right. All effects (CRagEffect) simply configure instances of effect primitives (CEffectPrim), with the only exception of effects based on the STR format (what Gravity calls the "EZ effects" :D). Effect primitives are animated in code, and buffers for effect primitives are constructed on a per-frame basis. Making these effects is basically like working with a 3D canvas from the code, where everything is dumped in the renderer's queue. In many cases an effect primitive is something non-specific like a half sphere, where properties like radius and opacity is changed over time. There are also many primitives specific to individual skills. A skill can have multiple effects, which in turn can use multiple effect primitives and STR effects. For instance, Acolyte's Heal uses four different effects based on the amount healed.

A rough description of how the skills you picked out are implemented:

  • Magnum Break - Two expanding spheres and a 3D circle on the ground
  • Cold/Fire Bolt - The bolts are just textures moving towards the target, combined with an expanding 3D circle on target
  • Frost Diver/Grimtooth/Frost Nova - just multiple spikes growing in random directions, spawned one by one for the duration of the skill
  • Safety Wall/Heal - 3D cylinder changing height with the texture fading out in the top, plus some floating particles here and there
  • Cart Revolution - Combination of STR effect, an expanding 3D sphere and an expanding 3D circle

Attack skills will often have one part for the skill itself, and one for when the target is hit. Damage effect is also varied, often based on the element of the attack. Most skills will additionally have a special effect on the player, like the glowing cylinder around the magician when a spell begins. There are lots of different ones. Skills using only a simple STR are relatively uncommon.

Review GAT terrain types and their interpretation (incl. serverside/gameplay implications)

The original GAT specification included a more detailed interpretation of the terrain types (tile flags) than I am currently able to confirm myself. Since the client clearly doesn't interpret them as such, I'm removing those parts while adding this issue as a reminder that they're to be reviewed once I can more comfortably test the assumptions and verify the research others have done on a more detailed interpretation of those flags.

Specifically, the whole "underwater vs. ground-level" differentiation is something that seems a little all over the place right now, and I haven't seen anything in the client that would care even one bit about water tiles.

I'd assume that the zone server cares much more about the different types than the client's basic "can I move there?" checks I've seen. For the time being, it's not really all that relevant to the client itself, though.

Research the shadow sprite to determine how it's used exactly

Obviously there is some information, but a lot isn't quite clear:

  • It appears units use differently-sized sprites, the exact dimensions for which are taken from... somewhere?
  • Is the transparency always the same?
  • Do all units (monsters) actually have shadows?
  • Are there conditions where the shadow sprite won't be rendered? (I know players don't have one when sitting, for example)

These are just some of the questions that should be answered in the ACT/SPR articles, or possibly in a separate document that can be referred to. Also TBD: is that all there is to know about them? Is it even correct?

Research indexing of true color images in the ACT files

Example: goldporing.act

There are 40 indexed-color images, and one true color image. It seems index zero is used twice, but I haven't checked the ACT data since there are some unrelated issues with true color images.

Presumably, they would simply be appended, and indices shifted accordingly. Will have to revisit later.

Original note (SPR.MD):

I still need to investigate the few sprites that use transparency, but there's so very few of them that I haven't gotten around to it yet. Otherwise, rendering sprites (and their animations) seems to work correctly.

Open questions: GND

Unclear:

  • What is the lightmap (lumel) scale? It's probably a 1:1 mapping similar to most textures...
  • According to some sources, there exists a flag to use texture rotation. Where is it?

Research the changes made in GAT version 1.3

GAT 1.3, GND 1.6 and RSW 2.6, the waterinfo is still not correct for those new, they removed waterinfo in RSW 2.6 and moved it to GND 1.6 (they changed that somewhere between 2021-11-03 and 2021-11-16)

Source: Discord

Research how shader parameters are used for rendering GR2 models

There's a metallic sheen that must be stored in the material data. Without it, the models look too colorful and paste-like (see screenshot in #41). I don't know if it can be exactly replicated without the proper modelling tools (3DS Max), but some info about this could certainly be added.

AFAIK 3DS uses a shader graph-like structure with the top-level material data carrying the shader properties, and the child nodes being used for diffuse color/texturing (at least in this case). I think a similar approach is used in Unity, Godot, UE and BJS (NME), presumably with some differences?

Add more details about the features of the RSM format

Lots of stuff is missing:

  • Animations
  • Transparency (needs more research)
  • Transformations (model to world space, bounding box alignment, instance placement)
  • Double-sided triangle faces (needs more research)
  • Shading modes
  • Smoothing groups (RSM2 needs more research)

Add some information about bink videos (as used by the new RSM2 models)

Not that they're widely used, but some info probably couldn't hurt.

So far I've got only this:

  • There's an official tool to convert the videos here
  • libavcodec appears to have reverse-engineered the format

Used in only one model (screenshot omitted).

There's a "trailer" (01.bik) and two other videos (screenshots omitted).

The videos themselves look completely out of place, are in low quality, and appear to have no audio. I wonder what they're used for ingame, or why they even added them in the first place.

Research the changes made in RSW version 2.2

Example: gefenia01.rsw (kRO latest)
GRF editor also fails to load it. Clearly something other than the as yet unknown water property must have changed.

Update: A more recent version of GRF Editor exists that does display them AFAIK, but I don't think it actually uses them.

Research the changes made in RSW version 2.4

Looks like there is in fact such a version. As discovered by someone in the BrowEdit Discord, paramk appears to use 2.4 and is therefore incompatible. I expect the changes will be minor, but I haven't looked into it at all.

GRF Editor is able to render it, so clearly it's already been researched. Now if I only had the source code...

Edit: The above makes no sense... paramk.rsw is using version 2.2? (tra_fild.rsw uses 2.4 though)

Research the changes made in RSW version 2.6

GAT 1.3, GND 1.6 and RSW 2.6, the waterinfo is still not correct for those new, they removed waterinfo in RSW 2.6 and moved it to GND 1.6 (they changed that somewhere between 2021-11-03 and 2021-11-16)

Source: Discord

Research the various weather effects that are supported by the client

Not really a file type, but it might be of interest regardless. The idea is to put it into the Research folder, alongside all the other concepts that are somewhat related.


There are some working weather related effects like snow and fog in-game, but a much more comprehensive system was planned at some point. This would include dynamic sun location and a true day/night cycle. One problem was probably that model shadows in Ragnarok are precomputed. A rain effect was implemented, but it was probably never finished as it looks pretty bad. The jRO team used to have some pictures related to this on their news page, and I believe they planned to release the new weather system with the Yuno update. Private server developers took matters into their own hands. If you played on an eAthena server around 2004 you may remember the horrible night effect that was enabled by default: Enabling the blind effect on all players until it was "morning" again!

Looking into the snow effect seems handy, as it's used in Lutie. And the fog effect is required to add hue to the scenes anyway.

There's also clouds (that use the mislabeled "fog" texture), which can be seen on at least the following maps:

  • yuno
  • gonryun
  • thana_boss
  • 5@tower

Additionally, there's some "lens flare" effect for sunlight, but I'm not sure if that's actually used.

Add a specification for the EBM "format"

It can hardly be called a real format, but some info probably should be added for completeness' sake.

This sort-of goes along with the GR2 format since EBM textures are mapped to the guild flag model.

Update RSM specification to account for the RSM2 changes

Source: https://rathena.org/board/topic/127587-rsm2-file-format/ (I wish there were more awesome posts like that ❤️ )


Heya,

This post is meant to explain the file format of RSM2 for those who are interested and want to play with them. I haven't seen many projects exploring the topic and I've finished digging through the file for GRF Editor. I shared some of the structure publicly in BrowEdit's Discord almost a year ago, but the fields were still unknown at that point. Also before anyone asks, no I am not making a public converter for RSM2 > RSM1. That's not fully possible anyway.

General
The structure of a RSM file is quite simple. It's a list of mesh data with transformations applied to them. Each mesh has a transformation matrix, a position, a parent, etc. Then you have the transformation components on the mesh:

    Offset/Translation
    RotationAngle
    RotationAxis
    Scale

And at last, you have the animation components on the mesh:

    RotationKeyFrame
    ScaleKeyFrame

All the code presented below comes from GRF Editor. Also, the structure varies quite a bit even between the 2.2 and 2.3 versions. I was unable to find any model using versions 2.0 or 2.1. I'd guess they were only used internally...? Who knows.

Animation duration changes
In previous versions, below 2.2, the AnimationLength field and the frame animation field represented time in milliseconds. So a model such as ps_h_01.rsm has 48000 as a value for AnimationLength, which means the animation lasts for a whole 48 seconds before it resets. The key frames for the transformations work in the same manner.

In version 2.2 and above, the AnimationLength field represents the total amount of frames in the model. So a model such as reserch_j_01.rsm2 has a value of 300. The keyframes would therefore range between 0 and 300. The duration is given by the new FramesPerSecond field, which is 30 for almost all 2.0 models currently existing. The delay between frames would then be 1000 / FramesPerSecond = 33.33 ms. The duration would be 1000 / FramesPerSecond * AnimationLength = 1000 / 30 * 300 = 10000 ms in our example.
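The version split described above could look like this in code (Python sketch; the version threshold and the default of 30 FPS are taken from the text):

```python
def animation_duration_ms(version, animation_length, frames_per_second=30):
    """Interpret AnimationLength per the RSM version split.

    Below 2.2 the field is already milliseconds; from 2.2 on it is a
    frame count paced by the FramesPerSecond field.
    """
    if version < 2.2:
        return animation_length
    return 1000 / frames_per_second * animation_length

print(animation_duration_ms(1.4, 48000))    # 48000 ms (e.g. ps_h_01.rsm)
print(animation_duration_ms(2.2, 300, 30))  # 10000.0 ms (reserch_j_01.rsm2)
```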

Shading
Nothing new there, but I thought I'd go over the topic quickly. The ShadeType property is used to calculate the normals. There are three types that have been found in models to this day:

    0: none; the normals are all set to (-1, -1, -1).
    1: flat; normals are calculated per triangle, with a typical cross product of the 3 vertices.
    2: smooth; each face of a mesh belongs to a smooth group, and a vertex's normal is calculated by adding the normals of the connected faces within that group.

In the real world, most models end up using the smooth shading type. The smooth group is a bit confusing at first if you've never heard of it, but some reading on the topic will help you. These are common techniques.
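To make the shade types concrete, here's a small Python sketch of the flat case (type 1); the smooth case (type 2) just sums such face normals per vertex within a smooth group before normalizing:

```python
def flat_normal(a, b, c):
    """Unnormalized face normal of triangle (a, b, c) via the cross
    product of two edge vectors. Plain tuples, no libraries."""
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

# A triangle lying in the XZ plane points straight up or down,
# depending on its winding order:
print(flat_normal((0, 0, 0), (1, 0, 0), (0, 0, 1)))  # (0, -1, 0)
```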

Textures
In previous versions, below 2.3, the textures were defined at the start of the file. Each mesh then defines a list of indices. So for example, a mesh could define these indices: "2, 5, 0", which means the mesh has 3 textures. Each face of the mesh then has a TextureId property from 0 to 2 in our example. If the face TextureId is 1, it would refer to the second index previously defined, which is 5. This means that the texture used for this face would be the 5th texture defined at the start of the model.
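A worked version of that pre-2.3 indirection (Python; the texture names are made up for illustration):

```python
def resolve_texture(model_textures, mesh_texture_indices, face_texture_id):
    """Pre-2.3 lookup: a face's TextureId indexes the mesh's own index
    list, which in turn indexes the model-level texture table."""
    return model_textures[mesh_texture_indices[face_texture_id]]

# The example from the text: mesh indices "2, 5, 0", face TextureId 1
# resolves through index 5 into the model-level table.
textures = ["tex0", "tex1", "tex2", "tex3", "tex4", "tex5"]
print(resolve_texture(textures, [2, 5, 0], 1))  # tex5
```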

In version 2.3 and above, the textures are defined per mesh instead. Texture indices are no longer used: the TextureId defined for each face refers directly to the texture list of that particular mesh. So say the TextureId for a face is 1, then the second texture defined on the mesh is the corresponding one.

Transformation order
In version 2.2 and above, the Scale/Offset/RotationAngle/RotationAxis properties were removed. Instead, it relies on animation frames or the TransformationMatrix. 

The order looks as such:

/// <summary>
/// Calculates the MeshMatrix and MeshMatrixSelf for the specified animation frame.
/// </summary>
/// <param name="animationFrame">The animation frame.</param>
public void Calc(int animationFrame) {
	MeshMatrixSelf = Matrix4.Identity;
	MeshMatrix = Matrix4.Identity;

	// Calculate Matrix applied on the mesh itself
	if (ScaleKeyFrames.Count > 0) {
		MeshMatrix = Matrix4.Scale(MeshMatrix, GetScale(animationFrame));
	}

	if (RotationKeyFrames.Count > 0) {
		MeshMatrix = Matrix4.Rotate(MeshMatrix, GetRotationQuaternion(animationFrame));
	}
	else {
		MeshMatrix = Matrix4.Multiply2(MeshMatrix, new Matrix4(TransformationMatrix));

		if (Parent != null) {
			MeshMatrix = Matrix4.Multiply2(MeshMatrix, new Matrix4(Parent.TransformationMatrix).Invert());
		}
	}

	MeshMatrixSelf = new Matrix4(MeshMatrix);
	
	Vertex position;

	// Calculate the position of the mesh from its parent
	if (PosKeyFrames.Count > 0) {
		position = GetPosition(animationFrame);
	}
	else {
		if (Parent != null) {
			position = Position - Parent.Position;
			position = Matrix4.Multiply2(new Matrix4(Parent.TransformationMatrix).Invert(), position);
		}
		else {
			position = Position;
		}
	}

	MeshMatrixSelf.Offset = position;

	// Apply parent transformations
	Mesh mesh = this;

	while (mesh.Parent != null) {
		mesh = mesh.Parent;
		MeshMatrixSelf = Matrix4.Multiply2(MeshMatrixSelf, mesh.MeshMatrix);
	}

	// Set the final position relative to the parent's position
	if (Parent != null) {
		MeshMatrixSelf.Offset += Parent.MeshMatrixSelf.Offset;
	}

	// Calculate children
	foreach (var child in Children) {
		child.Calc(animationFrame);
	}
}

The original vertices are then multiplied by MeshMatrixSelf for their final positions. MeshMatrix is the resulting transformation matrix of a particular mesh only, without taking into account its parents' matrices or the mesh position. MeshMatrixSelf is the final transformation matrix that will be applied to the vertices. Contrary to previous versions, the TransformationMatrix is applied all the way down to the children. The matrix invert function may not be available in all common libraries, so here is the implementation used:

public Matrix4 Invert() {
	if (this.IsDistinguishedIdentity)
		return this;
	if (this.IsAffine)
		return this.NormalizedAffineInvert();
	float num1 = this[2] * this[7] - this[6] * this[3];
	float num2 = this[2] * this[11] - this[10] * this[3];
	float num3 = this[2] * this[15] - this[14] * this[3];
	float num4 = this[6] * this[11] - this[10] * this[7];
	float num5 = this[6] * this[15] - this[14] * this[7];
	float num6 = this[10] * this[15] - this[14] * this[11];
	float num7 = this[5] * num2 - this[9] * num1 - this[1] * num4;
	float num8 = this[1] * num5 - this[5] * num3 + this[13] * num1;
	float num9 = this[9] * num3 - this[13] * num2 - this[1] * num6;
	float num10 = this[5] * num6 - this[9] * num5 + this[13] * num4;
	float num11 = this[12] * num7 + this[8] * num8 + this[4] * num9 + this[0] * num10;
	if (IsZero(num11))
		throw new InvalidOperationException("Matrix is not invertible."); // singular matrix
	float num12 = this[0] * num4 - this[4] * num2 + this[8] * num1;
	float num13 = this[4] * num3 - this[12] * num1 - this[0] * num5;
	float num14 = this[0] * num6 - this[8] * num3 + this[12] * num2;
	float num15 = this[8] * num5 - this[12] * num4 - this[4] * num6;
	float num16 = this[0] * this[5] - this[4] * this[1];
	float num17 = this[0] * this[9] - this[8] * this[1];
	float num18 = this[0] * this[13] - this[12] * this[1];
	float num19 = this[4] * this[9] - this[8] * this[5];
	float num20 = this[4] * this[13] - this[12] * this[5];
	float num21 = this[8] * this[13] - this[12] * this[9];
	float num22 = this[2] * num19 - this[6] * num17 + this[10] * num16;
	float num23 = this[6] * num18 - this[14] * num16 - this[2] * num20;
	float num24 = this[2] * num21 - this[10] * num18 + this[14] * num17;
	float num25 = this[10] * num20 - this[14] * num19 - this[6] * num21;
	float num26 = this[7] * num17 - this[11] * num16 - this[3] * num19;
	float num27 = this[3] * num20 - this[7] * num18 + this[15] * num16;
	float num28 = this[11] * num18 - this[15] * num17 - this[3] * num21;
	float num29 = this[7] * num21 - this[11] * num20 + this[15] * num19;
	float num30 = 1.0f / num11;
	this[0] = num10 * num30;
	this[1] = num9 * num30;
	this[2] = num8 * num30;
	this[3] = num7 * num30;
	this[4] = num15 * num30;
	this[5] = num14 * num30;
	this[6] = num13 * num30;
	this[7] = num12 * num30;
	this[8] = num29 * num30;
	this[9] = num28 * num30;
	this[10] = num27 * num30;
	this[11] = num26 * num30;
	this[12] = num25 * num30;
	this[13] = num24 * num30;
	this[14] = num23 * num30;
	this[15] = num22 * num30;
	return this;
}

New transformation animations

    TranslationKeyFrames
        In version 2.2 and above, PosKeyFrames are added. If you've seen the previous formats, you may be confused by this: many implementations call them PosKeyFrames, but version 1.6 adds ScaleKeyFrames, not TranslationKeyFrames. The name is self-explanatory: these key frames translate the mesh.
    TextureKeyFrames
        In version 2.3 and above, TextureKeyFrames are added. Similar to other transformations, they are defined as:

        struct TextureKeyFrame {
        	public int Frame;
        	public float Offset;
        }


The TextureKeyFrames target a specific texture ID from the mesh and come with different animation types. The Offset value modifies the texture's UV coordinates. The animation types are:

    0: Texture translation on the X axis. The texture is tiled.
    1: Texture translation on the Y axis. The texture is tiled.
    2: Texture multiplication on the X axis. The texture is tiled.
    3: Texture multiplication on the Y axis. The texture is tiled.
    4: Texture rotation around (0, 0). The texture is not tiled.
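As an illustration, here is a minimal Python sketch of how these five animation types could be applied to a UV coordinate. This is not client code: the function name is my own, and treating Offset as a directly applicable (interpolated) value, with radians for type 4, is an assumption.

```python
import math

def apply_texture_animation(u, v, anim_type, offset):
    """Apply one RSM2 texture animation type to a UV coordinate.

    `offset` is assumed to be the interpolated TextureKeyFrame.Offset
    for the current frame; for type 4 it is assumed to be in radians.
    """
    if anim_type == 0:    # translation on the X axis; texture tiles
        return u + offset, v
    if anim_type == 1:    # translation on the Y axis; texture tiles
        return u, v + offset
    if anim_type == 2:    # multiplication on the X axis; texture tiles
        return u * offset, v
    if anim_type == 3:    # multiplication on the Y axis; texture tiles
        return u, v * offset
    if anim_type == 4:    # rotation around (0, 0); texture does not tile
        c, s = math.cos(offset), math.sin(offset)
        return u * c - v * s, u * s + v * c
    raise ValueError(f"unknown texture animation type: {anim_type}")
```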

Main mesh
In previous versions, below 2.2, there could only be one root mesh. This is no longer the case with newer versions.
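For illustration, here is a hedged Python sketch of how the root meshes could be resolved under both schemes. The helper name and tuple layout are my own, and the fallback heuristic (treat a mesh with an empty or unresolvable parent as a root) is an assumption, not confirmed client behavior.

```python
def find_root_meshes(meshes, main_mesh_names):
    """Identify root meshes in an RSM/RSM2 model.

    `meshes` is a list of (name, parent_name) tuples. In 2.2+ the file
    lists the roots explicitly in MainMeshNames; for older files (or as
    a fallback) any mesh whose parent name is empty or doesn't match
    another mesh is treated as a root.
    """
    names = {name for name, _ in meshes}
    if main_mesh_names:
        return [name for name, _ in meshes if name in main_mesh_names]
    return [name for name, parent in meshes if not parent or parent not in names]
```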

Code
And those were all the changes! Here is a full description of the structure (which is again based on GRF Editor).

#
#  RSM structure
#
private Rsm(IBinaryReader reader) {
	int count;
	
	// The magic of RSM files is always GRSM
	Magic = reader.StringANSI(4);
	
	MajorVersion = reader.Byte();
	MinorVersion = reader.Byte();
	
	// Simply converting the version to a more readable format
	Version = FormatConverters.DoubleConverter(MajorVersion + "." + MinorVersion);
	
	// See "Animation duration changes" above for more information.
	AnimationLength = reader.Int32();
	ShadeType = reader.Int32();
	Alpha = 0xFF;
	
	// Apparently this is the alpha value of the mesh... but it has no impact in-game, so it can safely be ignored.
	if (Version >= 1.4) {
		Alpha = reader.Byte();
	}
	
	if (Version >= 2.3) {
		FrameRatePerSecond = reader.Float();
		count = reader.Int32();
		
		// In the new format, strings are written with their length as an integer, followed by the characters. In previous versions, strings were fixed 40-byte fields with a null terminator.
		// The syntax below may be a bit confusing at first.
		// reader.Int32() reads the length of the string.
		// reader.String(int) reads a string with the specific length.
		for (int i = 0; i < count; i++) {
			MainMeshNames.Add(reader.String(reader.Int32()));
		}
		
		count = reader.Int32();
	}
	else if (Version >= 2.2) {
		FrameRatePerSecond = reader.Float();
		int numberOfTextures = reader.Int32();
		
		for (int i = 0; i < numberOfTextures; i++) {
			_textures.Add(reader.String(reader.Int32()));
		}
		
		count = reader.Int32();
		
		for (int i = 0; i < count; i++) {
			MainMeshNames.Add(reader.String(reader.Int32()));
		}
		
		count = reader.Int32();
	}
	else {
		// Still unknown, always appears to be 0 though.
		Reserved = reader.Bytes(16);
		count = reader.Int32();
		
		for (int i = 0; i < count; i++) {
			_textures.Add(reader.String(40, '\0'));
		}
		
		MainMeshNames.Add(reader.String(40, '\0'));
		count = reader.Int32();
	}
	
	// The Mesh structure is defined below
	for (int i = 0; i < count; i++) {
		_meshes.Add(new Mesh(reader, Version));
	}
	
	// The rest of the structure is a bit sketchy. While this is apparently what it should be (some models do indeed have those), they have absolutely no impact in-game and can be safely ignored when rendering the model.
	if (Version < 1.6) {
		count = reader.Int32();
		
		for (int i = 0; i < count; i++) {
			_scaleKeyFrames.Add(new ScaleKeyFrame {
				Frame = reader.Int32(),
				Sx = reader.Float(),
				Sy = reader.Float(),
				Sz = reader.Float(),
				Data = reader.Float()
			});
		}
	}
	
	count = reader.Int32();
	
	for (int i = 0; i < count; i++) {
		VolumeBoxes.Add(new VolumeBox() {
			Size = new Vertex(reader.Float(), reader.Float(), reader.Float()),
			Position = new Vertex(reader.Float(), reader.Float(), reader.Float()),
			Rotation = new Vertex(reader.Float(), reader.Float(), reader.Float()),
			Flag = Version >= 1.3 ? reader.Int32() : 0,
		});
	}
}
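The string-encoding change noted in the comments above can be sketched in Python as follows. `read_rsm_string` is a hypothetical helper, not GRF Editor code; the single-byte ANSI code page and whether the 2.2+ length includes a null terminator are assumptions, so trailing nulls are stripped defensively.

```python
import struct

def read_rsm_string(f, version):
    """Read a string from an RSM stream.

    Versions >= 2.2 store an int32 length followed by the raw bytes;
    older versions store a fixed 40-byte, null-padded field.
    """
    if version >= 2.2:
        (length,) = struct.unpack("<i", f.read(4))
        raw = f.read(length)
    else:
        raw = f.read(40)
    # Cut at the first null byte, if any, and decode as a
    # single-byte code page (assumed).
    return raw.split(b"\0", 1)[0].decode("latin-1")
```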

#
#  Mesh structure
#
public Mesh(IBinaryReader reader, double version) {
	int count;

	if (version >= 2.2) {
		Name = reader.String(reader.Int32());
		ParentName = reader.String(reader.Int32());
	}
	else {
		Name = reader.String(40, '\0');
		ParentName = reader.String(40, '\0');
	}

	if (version >= 2.3) {
		count = reader.Int32();

		for (int i = 0; i < count; i++) {
			Textures.Add(reader.String(reader.Int32()));
		}

		// This is more so for backward compatibility than anything. The texture indices now refer to the texture list of the mesh directly.
		for (int i = 0; i < count; i++) {
			_textureIndexes.Add(i);
		}
	}
	else {
		count = reader.Int32();

		for (int i = 0; i < count; i++) {
			_textureIndexes.Add(reader.Int32());
		}
	}

	// The TransformationMatrix is 3x3 instead of 4x4 like everything else in the universe.
	TransformationMatrix = new Matrix3(
		reader.Float(), reader.Float(), reader.Float(), 
		reader.Float(), reader.Float(), reader.Float(), 
		reader.Float(), reader.Float(), reader.Float());


	if (version >= 2.2) {
		// In 2.2, the transformations are already applied to the mesh, or calculated from the animation key frames. None of these properties are used anymore.
		Offset = new Vertex(0, 0, 0);
		Position = new Vertex(reader);
		RotationAngle = 0;
		RotationAxis = new Vertex(0, 0, 0);
		Scale = new Vertex(1, 1, 1);
	}
	else {
		// The Offset is the translation vector for the mesh. The transformations are applied in this order: translation > scale > rotation > TransformationMatrix.
		Offset = new Vertex(reader.Float(), reader.Float(), reader.Float());
		
		// Position is the distance between the mesh and its parent.
		Position = new Vertex(reader.Float(), reader.Float(), reader.Float());
		RotationAngle = reader.Float();
		RotationAxis = new Vertex(reader.Float(), reader.Float(), reader.Float());
		Scale = new Vertex(reader.Float(), reader.Float(), reader.Float());
	}

	count = reader.Int32();

	for (int i = 0; i < count; i++) {
		_vertices.Add(new Vertex(reader.Float(), reader.Float(), reader.Float()));
	}

	count = reader.Int32();

	for (int i = 0; i < count; i++) {
		_tvertices.Add(new TextureVertex {
			Color = version >= 1.2 ? reader.UInt32() : 0xFFFFFFFF,
			U = reader.Float(),
			V = reader.Float()
		});
	}

	count = reader.Int32();

	// The face structure has changed a little in the new version. The SmoothGroup is no longer bound only to the face itself; it can be bound to individual vertices instead.
	for (int i = 0; i < count; i++) {
		Face face = new Face();
		int len = -1;

		if (version >= 2.2) {
			len = reader.Int32();
		}

		face.VertexIds = reader.ArrayUInt16(3);
		face.TextureVertexIds = reader.ArrayUInt16(3);
		face.TextureId = reader.UInt16();
		face.Padding = reader.UInt16();
		face.TwoSide = reader.Int32();

		if (version >= 1.2) {
			face.SmoothGroup[0] = face.SmoothGroup[1] = face.SmoothGroup[2] = reader.Int32();

			if (len > 24) {
				// It is unclear whether this smooth group is applied to [2] or not when the length is 28. Hard to confirm.
				face.SmoothGroup[1] = reader.Int32();
			}

			if (len > 28) {
				face.SmoothGroup[2] = reader.Int32();
			}
		}

		_faces.Add(face);
	}

	// This was oddly attributed to model version 1.6, which apparently never existed. Either way, it is safe to treat this block as version >= 1.6.
	if (version >= 1.6) {
		count = reader.Int32();

		for (int i = 0; i < count; i++) {
			_scaleKeyFrames.Add(new ScaleKeyFrame {
				Frame = reader.Int32(),
				Sx = reader.Float(),
				Sy = reader.Float(),
				Sz = reader.Float(),
				Data = reader.Float()	// Useless, has no impact in-game
			});
		}
	}

	count = reader.Int32();

	for (int i = 0; i < count; i++) {
		_rotFrames.Add(new RotKeyFrame {
			Frame = reader.Int32(),
			// Qx, Qy, Qz, Qw
			Quaternion = new TkQuaternion(reader.Float(), reader.Float(), reader.Float(), reader.Float())
		});
	}

	if (version >= 2.2) {
		count = reader.Int32();

		for (int i = 0; i < count; i++) {
			_posKeyFrames.Add(new PosKeyFrame {
				Frame = reader.Int32(),
				X = reader.Float(),
				Y = reader.Float(),
				Z = reader.Float(),
				Data = reader.Int32()	// Useless, has no impact in-game
			});
		}
	}
	
	// Texture animations, look at "Textures" above for more information
	if (version >= 2.3) {
		count = reader.Int32();

		for (int i = 0; i < count; i++) {
			int textureId = reader.Int32();
			int amountTextureAnimations = reader.Int32();

			for (int j = 0; j < amountTextureAnimations; j++) {
				int type = reader.Int32();
				int amountFrames = reader.Int32();

				for (int k = 0; k < amountFrames; k++) {
					_textureKeyFrameGroup.AddTextureKeyFrame(textureId, type, new TextureKeyFrame {
						Frame = reader.Int32(),
						Offset = reader.Float()
					});
				}
			}
		}
	}
}
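The face-length checks in the reader above imply a simple relationship between the 2.2+ record length and the number of smooth groups stored. A small sketch (the helper name is my own, and the 28-byte ambiguity mentioned in the comments applies here as well):

```python
def smooth_group_count(record_len):
    """Number of smooth groups stored in a face record.

    The base 2.2+ face record is 24 bytes: 3 uint16 vertex ids,
    3 uint16 texture vertex ids, uint16 texture id, uint16 padding,
    int32 two-side flag and one int32 smooth group. Each additional
    4-byte int adds another smooth group, capped at 3. A negative
    length means a pre-2.2 record without a length prefix, which
    stores exactly one smooth group (for version >= 1.2).
    """
    if record_len < 0:
        return 1
    return min(3, 1 + max(0, (record_len - 24) // 4))
```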

I'm also sharing the program I used to test the RSM2 files. It's a bit messy, but it does the job and might help someone. This testing program no longer serves any purpose for me, as it has already been merged into GRF Editor.

    https://github.com/Tokeiburu/RSM2/tree/master/Rsm2

The provided model is the following (it contains all the new features of RSM2):

(Animated preview of the test model: test.gif)

The chain on the right as well as the lights use these new texture animations. The red ball uses the translation key frames.

This test project can read and save any RSM or RSM2 file (you can edit RSM/RSM2 models via source). Simply changing the header version to convert a file will cause issues, depending on which version you convert from and to. With that said, have fun...! One day I'll make GRF Editor sources public again, one day.

Research how rendered sprites are scaled by the original client

Original note (SPR.MD):

As for sprite scaling, there is conflicting information (neither claim is corroborated by any evidence):

The dimensions of a sprite follow a defined pixel ratio (0.1428571492433548f, to be exact), which is multiplied by the dimensions of the sprite. Since everything else is based on unified sizes, it's pretty easy to support high-resolution sprites (just one line to change). However, you would still have to save the information about the additional pixel ratio somewhere. The best solution would be to implement a lookup table where custom pixel ratios are saved for specific sprites. For example: let's say you have a sprite with the dimensions 100x100 pixels. If you recreate it at 200x300, you would set the pixel ratio to 2.0f for width and 3.0f for height.

and

SpriteActor.CSpriteScale = 0.54857;

Personally, I don't think it's necessary to scale sprites in order to achieve a proper look, but it's entirely possible I'm missing something.
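For reference, the arithmetic behind the first claim can be sketched as follows. The constant is taken from the note above; the per-sprite ratio overrides are the purely hypothetical lookup-table idea, and none of this is confirmed client behavior.

```python
PIXEL_RATIO = 0.1428571492433548  # roughly 1/7, as claimed above

def sprite_world_size(width_px, height_px, ratio_w=1.0, ratio_h=1.0):
    """Convert sprite pixel dimensions to world-space dimensions.

    ratio_w/ratio_h are the hypothetical per-sprite overrides (2.0/3.0
    for a 200x300 remake of an original 100x100 sprite).
    """
    return width_px * PIXEL_RATIO * ratio_w, height_px * PIXEL_RATIO * ratio_h
```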


This is something to research further (RE/ASB?). I think it might only matter if the original rendering process is to be reproduced, as my rendition seemed proportionate despite using a different coordinate axis/scaling, but at this time I can't be sure.

This is primarily a reminder to look into it more, if it seems prudent. Also I wanted to get it out of the (uncommitted) notes so it doesn't get lost :D

Research how the "system palette" is used to render sprites for ancient SPR versions

Since I've never actually seen a SPR file that is this ancient, I can't verify the assumptions outlined in the specification so far. It might be they aren't publicly available at all, or that they are used in Arcturus, or possibly early alpha clients?

I would expect them to either use a predefined set of colors, or to simply rely on using whatever the user's computer (i.e., the Windows OS) provides and simply hope it works out. The latter seems prone to errors, but also like something they'd do :P

It's probably not really relevant to my efforts, but for completion's sake it'd be useful to investigate this at some point in the future.

Research the height of "wall" surfaces

From the GND spec:

(Screenshot of the relevant GND spec excerpt)

The TODO is what I'm referring to. Clearly, the general assumption is safe, since all the regular ground surfaces can be rendered correctly. The one remaining question is: does this also apply to "large" wall surfaces? I noticed they can look a bit stretched in some areas (e.g., schg_dun01), which is a problem that could conceivably be worked around by making the wall surface larger. I also noticed the "clock" textures on the walls in c_tower2 look a bit squished, which may be my perception being off a little, but warrants a second look.

As far as I can tell, this isn't a problem with most maps. For example, the walls in xmas_dun01 look just fine when using the "2 GAT tiles (world units) per GND surface" heuristic. It shouldn't be too difficult to check whether the "large" walls are also distorted in Gravity's implementation, which would indicate that walls are indeed always two GAT tiles high.

If this wasn't the case things might get more interesting, of course... Otherwise, amend the article and remove the TODO?

Research the changes made in RSW version 2.3

Example: amicit01.rsw or amicit02.rsw ... in fact, those appear to be the only maps using this version?

What I've determined so far:

  • No size change in the format
  • The only value that looks off is the bounding box (does it use floats or ints?)
  • The byte after the version was already present in 2.2, though I don't know what it's for (Update: It's the "build number" apparently)

Research the changes made in GND version 1.9

Example: icecastle.gnd

Seems to be almost identical to 1.8? The main differences I can spot so far:

  • What was presumed to be a wind vector (added in 1.8) now seems to have values other than 1/1, but isn't a normalized 2D vector - meaning the initial guess was probably wrong
  • There's more data at the end, 644 bytes after the 1.8 data in this instance
  • Before that, lots of repetitive data, a bunch of floats and ints from the looks of it
  • Almost certainly related to the water plane, as well

Research the RSW ambient light source's properties

My rendition didn't look quite right. I think the angles were calculated incorrectly, so let's find out what they're actually doing. It also appears they used the colors differently than I would expect.
