I was talking out loud to myself while playing a videogame, and an interesting question came up, one which could only be solved by using the worst of all demons: Math. So let me phrase the situation for you guys...You are playing an RPG. At the beginning of the game (or whenever, really, but my numbers use a level 0 assumption), you can take a perk that grants you an extra 10% on every single experience-granting transaction. Is it worthwhile to take this perk?
All supporting data here
Tool tips can't lie, right?
Now, my gut feeling on this changes day to day. Clearly, the earlier you take the perk, the more effect it has, meaning you should take it as early as possible. Second, that’s an extra ten percent, baby! One tenth of each level is free! However, what exactly does that imply? It means I will not have a significant experience advantage over anyone who didn’t take the perk until level 11 - they will still be at ten. This continues every ten levels. At level 20, I have only managed to eke out an extra 2 levels over my theoretical friend. So would I have been better off taking another skill in its place?
Both of those arguments are easy enough to conceptualize, but the benefit you are getting from either decision is actually pretty hard to quantify without some math. The reason? We all know that RPGs are drip fed - the difference between levels 1 and 2 is nowhere close to the difference between 20 and 22. This is largely due to the way the experience curve is shaped. Because of this, I couldn’t come up with a satisfactory answer and had to do some math. To start, I had to design an RPG exp curve, along with average monster exp. Having no background in game design, this is something that has always fascinated me...how do developers know how much gold and exp to hand out in any given area while still maintaining balance? In my mind, there was some master spreadsheet kept in the back that told them, similar to an actuary table. In reality, though, calculating this is quite easy! So I wanted to dedicate a lot of space to how I set up this example…
Or I could just...you know...plagiarize one. But where's the fun in that?
I started messing around with each level being exponentially farther from the previous one, but wow, that gets out of control quickly! I have never seen an exp bar go into scientific notation, so I clearly had to use a different approach.
First, I decided that since an RPG is a drip feed, the time it takes to reach each level should increase. After messing around with some models for a bit, I discovered that time-to-level is probably the first factor you want to decide on. I figured that adding an extra 15 minutes per level would probably be enough to keep things balanced - 15 minutes to reach level 1, 30 more to reach level 2, etc.
Next, you have to know the average amount of experience you are giving out per level. I assumed that the exp per monster would also increase linearly - 50 at the first level, 100 at level 2, and 150 at level 3. These numbers could be anything and it should all still work out. I then made yet another assumption that each fight takes exactly one minute. I suppose you could replace minutes with encounters and it would all be the same, but for some reason I was hellbent on using fractions of hours in Excel, so...it stays as minutes.
So what does that let you do? Well, now you can take the experience per encounter and multiply it by the encounters needed to level up! This gives the experience gap between levels! In our case, it takes 750 exp to reach level 1, an additional 3,000 to reach level 2, then 6,750, and so on (spoilers if you figure out what ‘and so on’ means here, I suppose!). Taking the cumulative values of all of these gave me an exp curve that starts at 750 for level one and ends at just over 7 million for level 30, after a hefty 116.25-hour journey!
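The whole toy model above fits in a few lines of Python. This is just a sketch of the assumptions already stated (15 more minutes per level, 50 more exp per monster per level, one fight per minute); the constant names are mine:

```python
# Toy RPG curve from the assumptions above (all numbers are the article's).
MINUTES_PER_LEVEL = 15   # reaching level n takes 15*n minutes of fighting
EXP_PER_MONSTER = 50     # a monster fought at level n gives 50*n exp
MAX_LEVEL = 30

def exp_gap(level: int) -> int:
    """Exp needed to go from level-1 to level: fights x exp per fight."""
    fights = MINUTES_PER_LEVEL * level       # one fight per minute
    return fights * EXP_PER_MONSTER * level  # works out to 750 * level**2

gaps = [exp_gap(n) for n in range(1, MAX_LEVEL + 1)]
print(gaps[:3])    # [750, 3000, 6750]
print(sum(gaps))   # 7091250 -- just over 7 million exp for level 30

total_minutes = sum(MINUTES_PER_LEVEL * n for n in range(1, MAX_LEVEL + 1))
print(total_minutes / 60)  # 116.25 hours
```

Multiplying the two linear assumptions together is exactly why each gap comes out quadratic: 15n fights times 50n exp is 750n².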
Now here is where something happened that blew my mind...I had created an RPG curve that I was satisfied with, and wanted to know how it was modeled. I took the data for cumulative experience, put it on a plot, tried some regressions, and nothing quite matched. It looked close on one or two, but the fit wasn’t perfect. Maybe it just wasn’t that mathematical. Then, because I was bored/wanted to be thorough, I ran a regression on the experience between levels, and discovered that it fit a second-order polynomial perfectly. Not ‘really really well’, not ‘pretty close’, but 100% perfection. Then I thought about what that meant. If the experience between levels is represented as ax^2 + bx + c...then the cumulative experience curve is essentially its integral! And sure enough, the cumulative curve comes up as a third-order polynomial, also with a 100% fit, whose derivative gives back the per-level fit (+ c, naturally)! Holy crap! That’s awesome! That also means I could take the derivative of the experience between levels to get...something...related to the experience...I didn’t actually figure out what that would give me, as it has been a while since I have performed a calculus, much less on a real-world example.
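You can reproduce that regression discovery without a plotting tool. Here's one way with NumPy (my choice of library, not the author's spreadsheet), fitting the per-level gaps and the cumulative curve and differentiating the latter:

```python
import numpy as np

levels = np.arange(1, 31)
per_level = 750.0 * levels**2      # exp gap between consecutive levels
cumulative = np.cumsum(per_level)  # total exp required for each level

# The gap data fits a degree-2 polynomial, the cumulative curve a degree-3 one.
quad = np.polyfit(levels, per_level, 2)    # ~[750, 0, 0]
cubic = np.polyfit(levels, cumulative, 3)  # ~[250, 375, 125, 0]

# Differentiating the cumulative fit hands back a quadratic, which is the
# integral/derivative relationship noticed above. (The extra lower-order
# terms appear because we are summing discretely, not integrating.)
print(np.polyder(cubic))
```

Both fits are exact to floating-point precision, which is the "100% perfection" the regression tool reports.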
Pretty much 'mfw'
After this, it becomes pretty simple to figure out how the scaling works. Just slap an extra 10% onto the exp per fight and do the division. And what do we get? Well, the initial assumption actually holds true...the player with the extra 10 percent hits level 22 at 57.5 hours of play, at which time the player without the perk is just barely level 20. The comparison for levels 10 and 11 is 15 hours versus 13.75, so it isn’t quite a clean cutoff, but the marginal gains are clearly there. So it really doesn’t seem like much, does it? Maybe you’d be better off taking a perk that gives +20% damage, which pays off at every single level, instead of getting an extra level every ten levels…
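The head-to-head comparison is easy to sketch: under the curve above, a level-n player earns 50n exp per minute, so a +10% perk just divides every level's grind time by 1.1. The function name here is mine:

```python
def hours_to_reach(level: int, exp_bonus: float = 0.0) -> float:
    """Hours of constant fighting needed to reach `level` under the toy curve."""
    minutes = sum(750 * n**2 / (50 * n * (1 + exp_bonus))
                  for n in range(1, level + 1))
    return minutes / 60

print(round(hours_to_reach(22, 0.10), 2))  # 57.5 -- perked player hits 22
print(round(hours_to_reach(20), 2))        # 52.5 -- baseline player hit 20 earlier
print(round(hours_to_reach(21), 2))        # 57.75 -- ...but won't hit 21 yet
print(round(hours_to_reach(30), 2))        # 116.25 hours to max, unperked
print(round(hours_to_reach(30, 0.10), 2))  # ~105.7 hours to max with the perk
```

So at the 57.5-hour mark the perked player is two levels ahead, exactly as described.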
...Except the way the levels are curved out showed me something different. In order to reach max level, it takes the player with the perk 105 hours, and the player without 116. In fact, past level 20 the perked player takes an average of half an hour less per level. So it isn’t about the levels you are gaining over another player, but perhaps the time you are getting back from the game. If a perk said ‘saves you ten hours of leveling’, would that be worth it?
But then this is where things get extraordinarily more interesting (or more boring, but if you think that and you made it this far...uh...thanks?). Because of the way the experience scales, you save less time in the early levels than in the later ones. From levels 1 to 7, you only save about half an hour by taking that perk. From 23 to 30, you would save around 5 hours, which is to say roughly ten times more! So this is where the final wrinkle of the puzzle is introduced: you do not, in fact, get more value by taking the perk early, as logic might dictate. While taking it at early levels wouldn’t be a bad move, its effects are most pronounced in the late game. This is again because of how the experience curve is calculated - at low levels, the difference between multiplying x^3 by 1.0 and by 1.1 is small, but at higher levels it suddenly becomes a much larger gap.
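A quick sketch of that bracket-by-bracket comparison, using the same toy formulas (the exact figures depend on how you bucket the levels and round, but they land near the half hour and five hours quoted):

```python
def hours_for_range(first: int, last: int, exp_bonus: float = 0.0) -> float:
    """Hours spent earning levels `first` through `last` inclusive."""
    minutes = sum(750 * n**2 / (50 * n * (1 + exp_bonus))
                  for n in range(first, last + 1))
    return minutes / 60

early_saved = hours_for_range(1, 7) - hours_for_range(1, 7, 0.10)
late_saved = hours_for_range(23, 30) - hours_for_range(23, 30, 0.10)
print(round(early_saved, 2), round(late_saved, 2))  # ~0.64 vs ~4.82 hours
```

The perk always shaves off the same fraction of grind time (1/11 of it), but the late-game levels are so much longer that the absolute savings balloon.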
While discussing this with one of my coworkers, he brought up an excellent point, which I had not initially considered when writing this, but which was actually the very thing that brought it to mind: in some RPGs, traditionally western RPGs, enemies scale with the player! This means you are actually inflating the difficulty at a rate at which it should not be inflated, and losing a perk in the process! At level 10, you only have 9 perks working for you if you took the extra exp, putting you 1 perk behind the slower player! So all of a sudden you are trading a decrease in time for an increase in difficulty! Oh no! The example that brought this to mind was actually Dead Island: as far as I could tell, the zombies 'leveled' with you by gaining new abilities, such as increased mobility, better climbing, larger numbers, things like that. So I constantly felt that, while I was better off in total exp, I was actually playing a much harder game than I should have been. And we call this the Ayn Rand paradox of game design, wherein the player who made the 'smart' decision to take more exp is punished by the unflinching machine of poor game design.
Now obviously, this is just an example of what an experience curve could look like, and most of the numbers were chosen arbitrarily, but the principles still apply. So what can we draw as a conclusion? Basically, taking the perk early isn’t actually that important...in fact, you get more value from it the later you take it, on a per-level basis. As long as the early grind isn’t that bad, it is totally worth holding off. And as for whether it is worth it in general, that comes down to the value of your time. If the game is something like Final Fantasy, where you are expected to grind out levels, it is probably going to be worth it. But if the game is something like Fallout, where having levels is nice but not required? Maybe just enjoy the sights. Certainly, don’t waste a perk making the game harder - most games have a slider for that! In any event, you should now be more prepared the next time you face this decision!