Mirror's a great tool for networking in Unity. Here are two things we did to speed it up a little. I wouldn't want to push them to the official repo, but we made these changes in our project with no repercussions.
We just straight up commented out this section in NetworkIdentity.ServerUpdate().
It was costing us a lot of performance with many GameObjects carrying SyncVars. I took a look at what it was doing, but it didn't seem necessary: from what I can see, the moment an object gains an observer it gets updated anyway. Maybe the intention was for this code to run once, instead of every frame.
I didn't have much time to wrap my head around UNET/Mirror, so I just commented it out. We gained 40 FPS.
It’s been several months and nothing bad has happened so far. Maybe our other modifications prevented this change from wrecking our game, but I doubt it.
Edit: JamesDev | Mirage pointed out that this code is used to clear pending changes on SyncLists when no observers are around. We use SyncLists extensively to sync our inventories, but haven't noticed any problems. I could see a potential memory leak if we modified an inventory with no observers around, so we'll keep an eye out for that and fix it if needed.
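If that leak ever shows up, a safer middle ground than deleting the block outright might be to keep just the part that flushes accumulated SyncList change history when nobody is observing. A rough sketch of the idea; `ClearAllDirtyBits` exists on NetworkBehaviour in the Mirror versions I've seen, but check your fork before relying on it:

```csharp
// Hypothetical trimmed-down replacement for the commented-out block in
// NetworkIdentity.ServerUpdate(). Skips the per-frame work, but still flushes
// SyncList/SyncDictionary change history when there are no observers, so the
// pending-change lists can't grow without bound.
if (observers.Count == 0)
{
    foreach (NetworkBehaviour comp in NetworkBehaviours)
    {
        // Assumed to clear SyncObject change lists as well as dirty bits;
        // verify against your Mirror version.
        comp.ClearAllDirtyBits();
    }
}
```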
Another optimization would be to run SendUpdateVarsMessage at a different frequency than ServerUpdate, like Unreal does. I'll probably do that at some point.
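The frequency split could be as simple as a send-interval timer. This is only a sketch: `sendInterval` and `lastSendTime` are hypothetical fields I'm adding, not existing Mirror members, and the SendUpdateVarsMessage call site may look different in your fork:

```csharp
// Sketch of decoupling the network send rate from ServerUpdate's tick rate.
[SerializeField] float sendInterval = 0.1f; // 10 sends per second (assumed value)

float lastSendTime;

void ServerUpdate()
{
    // ...per-frame bookkeeping stays here...

    // Only push SyncVar updates out every sendInterval seconds.
    if (Time.time - lastSendTime < sendInterval)
        return;

    lastSendTime = Time.time;
    SendUpdateVarsMessage(); // assumed name of the send path in your fork
}
```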
We noticed that Mirror was calling FindObjectsOfTypeAll, which caused massive load times on large levels, since it iterated across the entire scene for each object.
Evan created a List of NetworkIdentities in the NetworkManager, cached at editor time.
This optional networkIdentities list is passed into the NetworkManager's StartServer, FinishStartHost, OnSceneLoaded, FinishLoadSceneHost, and FinishLoadSceneServerOnly via their existing SpawnObjects calls. You just have to edit the payload.
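The shape of the cache might look something like this. The `networkIdentities` field and the `CacheSceneIdentities` editor hook are our additions, not stock Mirror API, and how you wire the list into SpawnObjects depends on how you edit those call sites:

```csharp
// Sketch of a NetworkManager subclass that carries a pre-built identity list
// so the spawn path never has to scan the whole scene at load time.
public class CachedNetworkManager : NetworkManager
{
    // Populated in the editor, serialized into the scene.
    [SerializeField] List<NetworkIdentity> networkIdentities = new List<NetworkIdentity>();

#if UNITY_EDITOR
    // Hypothetical editor-time cache step (run from OnValidate, a menu item,
    // or a build hook). Paying the FindObjectsOfTypeAll cost once here is fine;
    // the point is to avoid paying it per object at runtime.
    void CacheSceneIdentities()
    {
        networkIdentities.Clear();
        networkIdentities.AddRange(Resources.FindObjectsOfTypeAll<NetworkIdentity>());
    }
#endif

    // At runtime, hand networkIdentities to the modified SpawnObjects overload
    // instead of letting Mirror discover the objects itself.
}
```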
We used a lot of FirstGearGames plugins, which have been very good. You should modify the ProximityChecker to use a cache instead of GetComponents, though.
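For the ProximityChecker tweak, the idea is just to resolve the components once and reuse them instead of calling GetComponents on every visibility pass. A minimal sketch, assuming a checker that iterates NetworkIdentities; the field and method names here are hypothetical, not FirstGearGames API:

```csharp
// Hypothetical caching inside a proximity checker: look the components up
// once in Awake rather than on every visibility check.
NetworkIdentity[] cachedIdentities;

void Awake()
{
    cachedIdentities = GetComponentsInChildren<NetworkIdentity>();
}

void CheckVisibility()
{
    foreach (NetworkIdentity identity in cachedIdentities)
    {
        // ...existing distance/visibility logic, unchanged...
    }
}
```

If objects are added or removed at runtime, you'd need to invalidate the cache at those points; for mostly static hierarchies the one-time lookup is enough.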
Thanks for reading!