a multiplayer game of parenting and civilization building
I stupidly alt-tabbed while the game was starting up and doing the framerate measuring so I could restart the Spotify album (I use it to help keep track of my lifespan, lol). When I started playing, she was running maybe 2x or 3x as fast as normal and having a lot of trouble picking things up (kept doing that latency thing where she bounces off an object for a bit until the server acknowledges). It was kind of useful for scouting since I could haul ass, but my hunger speed might have been sped up as well. The screen was having trouble keeping up with the map data (kept getting the filler scribbly doodles), and eventually I got eaten by a wolf somehow. My poor homestead :'( :'(
Was playing on Windows 8.1 with an NVIDIA GeForce GTX 850M graphics card.
Let me know if you need any more info!
Last edited by jcwilk (2017-12-20 14:40:54)
Yeah... this is pretty normal!
The problem is that modern graphics cards and monitors run at various refresh rates. They're not all at 60fps anymore (well, actually, back in the CRT days, none of them were at 60fps anyway). We've now got 120Hz and 144Hz to deal with.
So, the game tries to figure out what kind of situation it's in at startup. If you mess with this process, it gets a bad measurement. I think the game was probably running at 30fps or even slower when minimized.
This isn't a bug, exactly.
If you have any thoughts about how to work around this, or how the game should behave when this happens (maybe some kind of warning page if it sees you running at under 60fps)...
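For reference, the startup measurement described above might be sketched like this; an illustrative Python sketch, not the game's actual C++ code, and `SUPPORTED_RATES`, `measure_fps`, and `snap_to_supported` are all hypothetical names:

```python
# The rates the game ships with (per the discussion below).
SUPPORTED_RATES = [30, 60, 120, 144]

def measure_fps(frame_times):
    # frame_times: seconds elapsed for each of N vsync-limited frames.
    # Average them and invert to get frames per second.
    avg = sum(frame_times) / len(frame_times)
    return 1.0 / avg

def snap_to_supported(measured_fps):
    # Pick the supported rate closest to the measurement.
    return min(SUPPORTED_RATES, key=lambda r: abs(r - measured_fps))

# A minimized/background window is often throttled to ~30 swaps per
# second, so the measurement snaps to 30 even on a 60Hz or 120Hz monitor.
```

The fragility the author mentions falls out of the snapping step: any throttling during the measurement window (alt-tab, a busy disk, a compositor) shifts the average frame time and can land on the wrong rate.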
Well, I'm relatively new to game development and haven't done really low-level stuff like this, so I can't really offer suggestions on the best way to handle it code-wise... But just taking a step back and looking at the behavior: there should never be a time when the player does a reasonably normal activity (switching in and out of a program is pretty normal, although I did do it at a bad time) and ends up with unplayable game behavior.
Is there no elapsed time passed to each update loop that you can work off of, rather than assuming a framerate? Or is that just inferred from the refresh rate the game is assuming? Surely there must be a solution to this, since every other game out there handles it fine. Could it be re-measured every time a graphics change is detected? I'm not sure what kind of events you can listen for or what the standard approach to solving this is, just spitballing. Could you measure it once at the beginning, set a graphics option in the menu, and then force the graphics to stay at that rate until the player manually changes it or restarts the app?
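The "elapsed time" approach asked about here is the standard variable-timestep (delta-time) loop. A minimal sketch, not anyone's actual code:

```python
def advance(position, velocity, dt):
    # Scale movement by the measured frame time (dt, in seconds), so
    # displacement depends on real elapsed time, not on frame count.
    return position + velocity * dt

# The same real interval produces the same result however it is split
# into frames: two frames at 30fps equal four frames at 60fps.
pos_30 = advance(advance(0.0, 10.0, 1 / 30), 10.0, 1 / 30)
pos_60 = 0.0
for _ in range(4):
    pos_60 = advance(pos_60, 10.0, 1 / 60)
```

With this scheme no framerate assumption is needed at all, which is why the measurement problem doesn't arise; the trade-off against fixed timesteps comes up later in the thread.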
I've only played about 6 or 7 lives so far (aside from the slew of failures when I was learning the game), so I think this is probably not rare and likely even falls under the umbrella of "things that might make new players ditch" that we were talking about before. But yeah, maybe it's too early to start doing hardware-fiddling polish when there's still so much content and gameplay stuff to work on; just tossing in my feedback.
Fixed frame rate helps solve tons of problems, and it also produces a more consistent play experience on slower computers (computers that lag when lots of trees are on the screen, etc.... it feels better if those parts just slow down instead of jumping to a lower framerate and back up again later).
But the question is, what to set that fixed framerate at?
Also, vsync is REALLY important for visual quality, especially in a game that pans as much as this one. If vsync is off, people are going to see ugly tearing across the middle of their screen. This is less important in an FPS; most people run with vsync off, in general.
So, I need some way to measure this, to fix the frame rate, and also to warn people when vsync seems to be off (which can only be fixed by them turning it on in their graphics card control panel).
That said, I do think that I can detect alt-tab, and I should probably abort the measurement if they do that, and resume once they tab back in.
But the problem is, that's only one of many things that might result in a bad measurement.
As far as what other games do... many of them just run at max FPS, even if the monitor can't draw frames that fast, which is a total waste of computational resources. Or they fix the framerate at 60fps all the time, which sucks on modern 144Hz displays AND sucks on older, slower computers.
Anyway, I'm trying to do something a bit smarter here.... but the result is pretty fragile.
The other thing I could do is measure it once at first startup and save it into a file. I could also put a one-time message up saying, "I just measured your fps to be 60.04, and vsync seems to be off. Is this correct? Should I save this for future sessions?" Or maybe let people type in their own desired FPS one time.
But I think a lot of people don't even know their own FPS.
Like, what is your normal FPS?
The game ships with these possible options:
30
60
120
144
So, during your alt-tab thing, it must have picked 30.
But if you're running at 60 normally, that would only make the game twice as fast. Not ridiculously fast.
So maybe your normal is 120? Which would make it 4x fast, which would be unplayable.
Yeah those are all understandable points. The alt-tab abort during measuring sounds like a good idea if it's not much work. Doesn't totally solve the problem but it at least fixes what seems like the most obvious way to screw up the measurement.
Having an optional manual override might be nice as well, but as you said, most people probably aren't aware of this and it would be nice to find a solution that didn't require manual user intervention. OTOH, if for some reason someone's setup triggers this issue frequently, it would be nice to be able to point them to where to set it manually (or if they're savvy enough, be able to find it themselves).
Might it be possible to measure the FPS, save it (maybe even just for the current session as you are now), and then force the graphics to use that refresh rate? As you said, it would risk wasting resources or tearing or underutilizing the graphics card, but IMO all those options are better than the game running at the wrong speed.
Looks like I'm at 60fps, or at least that's what my refresh rate is set to currently. So yeah, I suppose it picked up 30 and maybe it was only running at 2x speed. I could play... kind of... it was just awkwardly fast, and there were networking issues since (presumably) everything is optimized for different timings, so the network data couldn't keep up. The biggest issue was probably that I had a ton of trouble interacting with things; when I tried to pick berries off a bush, it would "bounce" off the bush maybe 10-20 times before the action finally resolved. At first I thought it was an anti-cheat mechanism waiting until I "reached" that location since I was moving too fast, but it's probably more likely that some sort of timeout was only waiting half as long as it should have and kept timing out. I'm not really sure, but eventually I got eaten by a wolf without even seeing it on my screen because the map data couldn't be retrieved fast enough to keep up with my running. Even if that hadn't happened, I doubt I would have been able to survive, let alone get anything done, due to the weird bouncing timeouts.
There's also a big opportunity for exploitation there since with a little practice I'm sure someone could figure out how to take advantage of moving way faster than other players.
Might there be some way to periodically sample the current time and compare it to the elapsed "game time" and make sure it's not wildly off? I suppose someone could speed up their clock in that case, lol, but maybe something like that would be a bandaid for now without requiring measuring up front and assumptions.
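That sanity check could be sketched roughly like this (hypothetical names and threshold; the real game would count its own fixed-length steps):

```python
def drift_ratio(steps_run, assumed_fps, wall_seconds):
    # Ratio of game time advanced to real time elapsed. 1.0 means the
    # assumed frame rate matches reality; 2.0 means the game is
    # effectively running twice as fast as it should.
    game_seconds = steps_run / assumed_fps
    return game_seconds / wall_seconds

def needs_remeasure(steps_run, assumed_fps, wall_seconds, tolerance=0.25):
    # Example: the game assumed 30fps, but vsync actually delivered 60
    # steps in one wall-clock second -> ratio 2.0 -> flag for re-measure.
    return abs(drift_ratio(steps_run, assumed_fps, wall_seconds) - 1.0) > tolerance
```

Sampled every few seconds, a check like this would catch the 2x case described in this thread without any up-front interactive measurement, at the cost of a brief period of wrong-speed play before it triggers.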
Maybe if I have some free time I'll derp around with the game code and see if I have any ideas. It's so cool that you have your games as open source, you're kind of a hero in my book for that haha. I hope to get my shit trained up well enough to figure out how to make something like that work someday. Game development is such a different beast from web development :'( xD
Yeah, if you're running 2x as fast, you will arrive places way early, and the server won't let you do anything there until it thinks you've actually reached your destination.
Anyway, I think this is pretty exploit-proof, because everything is happening server-side, including move speed and timing. The client just displays it to you (in this case, displaying the moves faster and shorter than they actually are).
As far as enforcing the frame rate via sleeping goes, if vblank is really on, we DO NOT want to do that, as it introduces tearing. But here it is way off from what we think it is, so vblank at 30fps is clearly not working.
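For the vsync-off case, enforcing a frame rate by sleeping typically looks like this; a sketch assuming a monotonic clock, with hypothetical names, not the game's actual code:

```python
import time

def limit_frame(frame_start, target_fps):
    # Sleep away whatever remains of this frame's time budget. Only
    # appropriate when vsync is NOT already pacing the buffer swaps;
    # sleeping on top of a working vblank just fights with it.
    budget = 1.0 / target_fps
    elapsed = time.monotonic() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)
```

The caller records `frame_start = time.monotonic()` at the top of each frame and calls `limit_frame(frame_start, 60)` after rendering. Note that `time.sleep` granularity varies by OS, so real frame limiters usually sleep slightly short and spin for the remainder.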
The other option is to measure again, on the menu screen.... or wait until then to measure in the first place. But it does require a few seconds for a proper measurement.
Maybe first-startup needs to be more interactive about measurement. It should say, "Are you ready for me to measure your framerate?" and then wait for you. After that, it should say, "I measured 60fps and vsync seems to be on, should I save this for future runs?"
So then later, it won't measure anymore, so minimizing and such won't matter.
But sheesh.... people will be annoyed by any of this stuff, because no other games are doing it.
I really think that other games are pretty much phoning it in, though...
I think that part of the problem stems from my game recording system, which depends on fixed-length timesteps.
Probably most other games use variable timesteps, and are just constantly measuring the latest frame time every frame. But when frame times slow down (like during busy scenes), I find this to be aesthetically jarring, because it results in big jumps in movement. I think slowdown is better than bigger, stuttery jumps.
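The trade-off described above can be made concrete with a toy comparison (illustrative only) of how much game time each approach advances over the same rendered frames:

```python
STEP = 1 / 60  # fixed-length timestep, in seconds

def fixed_step_game_time(frame_times):
    # One fixed step per rendered frame: a slow frame means the game
    # world slows down (no catch-up, no skipped motion).
    return len(frame_times) * STEP

def variable_step_game_time(frame_times):
    # Delta time: game time always tracks wall time, but a single slow
    # frame shows up as one big movement jump on screen.
    return sum(frame_times)

# Three smooth frames followed by one 100ms hitch:
frames = [1 / 60, 1 / 60, 1 / 60, 0.1]
```

Over these four frames the fixed-step world advances only 4/60 of a second of game time while roughly 0.15 wall seconds pass (the slowdown the author prefers), whereas the variable-step world advances the full 0.15 seconds, with the hitch rendered as one large jump.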
> Yeah, if you're running 2x as fast, you will arrive places way early, and the server won't let you do anything there until it thinks you've actually reached your destination.
> Anyway, I think this is pretty exploit-proof, because everything is happening server-side, including move speed and timing. The client just displays it to you (in this case, displaying the moves faster and shorter than they actually are).
Ok, yeah that explains the trouble I was having then.
> Maybe first-startup needs to be more interactive about measurement. It should say, "Are you ready for me to measure your framerate?" and then wait for you. After that, it should say, "I measured 60fps and vsync seems to be on, should I save this for future runs?"
> So then later, it won't measure anymore, so minimizing and such won't matter.
> But sheesh.... people will be annoyed by any of this stuff, because no other games are doing it.
Yeah I agree that this would not be a pleasant/impressive first few seconds of the game. Might make sense to have an option in the menu but mandatory user action on startup does seem like too much.
> I think that part of the problem stems from my game recording system, which depends on fixed-length timesteps.
> Probably most other games use variable timesteps, and are just constantly measuring the latest frame time every frame. But when frame times slow down (like during busy scenes), I find this to be aesthetically jarring, because it results in big jumps in movement. I think slowdown is better than bigger, stuttery jumps.
Yeah, this seems like the root of the problem. Speaking as a gamer, I agree that, at least for single-player games, I would prefer it to slow down a bit rather than skip steps... But for a multiplayer game, especially combined with what you mentioned about everything happening on the server (which, of course, is as it should be), it doesn't make a lot of sense to me for the client to ever run at a different time-rate than the server. Lag jumpiness sucks for sure, but time moving at a different speed than the server seems worse. I was playing alone when this happened, but the thought of how this might be experienced with one or more other players on your screen seems really painful and strange.
I'm not sure how much work it would be to adjust the game recording system, but it seems like making the game able to handle an inconsistent/incorrect framerate in ways other than changing the time-rate (i.e., by doing the standard delta-time measurement and basing physics and timing on that) would lead to a more consistent and less confusing experience. That's just my own feedback as a gamer who has experienced it, though; maybe others would say otherwise, and ultimately it's your own game and I'm sure you know what's best in the grand scheme of things.
Anyways, I've eaten up enough of your time on this but thank you for the discussion :) Looking forward to updates and once it gets to that paid state when you'll rope in a bunch of new folks! Can't wait.
That's a good point about multiplayer vs single player.... hmmm....
I do really hope that it will run at a solid 60fps on almost everyone's machines. It does so on these very old laptops that I have here at home....