I just dug up the calculateMR() function.
Code:
.method public hidebysig int32 calculateMR()
{
.locals init (int32 V0,
int32 V1,
int32 V2,
int32 V3,
int32 V4,
class System.Object[] V5)
ldarg.0
ldstr "MU"
call int32 DSA.Model.Character::attr(class System.String key)
ldarg.0
ldstr "KL"
call int32 DSA.Model.Character::attr(class System.String key)
add
ldarg.0
ldstr "IN"
call int32 DSA.Model.Character::attr(class System.String key)
add
stloc.0
ldloc.0
conv.r4
ldc.r4 5.0
div
call int32 [UnityEngine]UnityEngine.Mathf::CeilToInt(float32)
stloc.1
ldarg.0
ldstr "AG"
call int32 DSA.Model.Character::attr(class System.String key)
stloc.2
ldarg.0
ldfld class DSA.Model.CharacterClass DSA.Model.PlayerCharacter::klasse
callvirt int32 DSA.Model.CharacterClass::get_mrmod()
stloc.3
ldloc.1
ldloc.2
sub
ldloc.3
add
stloc.s 4
ldstr "CalcMR: baseval {0} / 5 = {1} - AG {2} + mrmod {3} = {4}"
ldc.i4.5
newarr [mscorlib]System.Object
stloc.s 5
ldloc.s 5
ldc.i4.0
ldloc.0
box [mscorlib]System.Int32
stelem.ref
ldloc.s 5
ldc.i4.1
ldloc.1
box [mscorlib]System.Int32
stelem.ref
ldloc.s 5
ldc.i4.2
ldloc.2
box [mscorlib]System.Int32
stelem.ref
ldloc.s 5
ldc.i4.3
ldloc.3
box [mscorlib]System.Int32
stelem.ref
ldloc.s 5
ldc.i4.4
ldloc.s 4
box [mscorlib]System.Int32
stelem.ref
ldloc.s 5
call class System.String [mscorlib]System.String::Format(class System.String, class System.Object[])
call void MCPIF::LogDiceroll(class System.String text)
ldloc.s 4
ret
}
So it looks like the calculation works a bit differently:

MR = ceil((MU + KL + IN) / 5) - AG + mrmod

The mrmod value presumably corresponds to the MR value in classdefinition.xml.
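The IL above can be reconstructed as the following sketch (Python, since the original is decompiled CIL; parameter names are assumptions — the IL pulls MU/KL/IN/AG via Character::attr and mrmod via CharacterClass::get_mrmod):

```python
import math

def calculate_mr(mu: int, kl: int, in_: int, ag: int, mrmod: int) -> int:
    """Reconstruction of calculateMR() from the disassembly:
    MR = ceil((MU + KL + IN) / 5) - AG + mrmod.
    The IL converts the attribute sum to float32 (conv.r4), divides by 5.0,
    and rounds up via UnityEngine.Mathf.CeilToInt; math.ceil matches that here."""
    base = mu + kl + in_
    return math.ceil(base / 5) - ag + mrmod

# Example: MU 13, KL 12, IN 14 -> ceil(39 / 5) = 8; minus AG 2, plus mrmod 1 -> MR 7
print(calculate_mr(13, 12, 14, 2, 1))
```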