I still think MTU is irrelevant and best left at the defaults your ISP sets.
Typical MTUs on modern connections are around a kilobyte
or two (usually 1500 bytes), whilst bandwidth is usually multiple megabytes per second.
That means that making wild changes to your MTU has a near-zero effect on your latency. Let's spitball a 30Mbit/s connection with 20ms of latency in-game - that's 3.75MB/s, call it 4MiB/s for easy maths - and a 1536 byte MTU. Let's cut your MTU to 512 bytes. That's a pretty drastic change, but let's see what happens to your in-game latency:
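The bandwidth conversion above works out like this (a quick sketch; the 30Mbit/s figure is rounded up to 4MiB/s to keep the maths easy):

```python
# Convert a link speed in megabits per second to bytes per second.
mbit_per_s = 30
bytes_per_s = mbit_per_s * 1_000_000 / 8  # 3,750,000 bytes per second
mib_per_s = bytes_per_s / 2**20           # ~3.58 MiB/s, rounded up to 4 in the text

print(bytes_per_s)           # 3750000.0
print(round(mib_per_s, 2))   # 3.58
```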
At an MTU of 1536 bytes, the maximum delay the transmission time of the current packet can impose on the next packet is 1536 bytes divided by 4,194,304 bytes per second, multiplied by 1000ms (in a second). That's about 0.37ms of latency caused by your MTU, roughly equivalent to 1 frame at 3000 frames per second. It would change your in-game latency from 20ms to 20ms, AT WORST. Remember, the average delay is half the maximum delay, too.
At the new, radical MTU of 512 bytes, the maximum delay the transmission time of the current packet can impose on the next packet is 512 bytes divided by 4,194,304 bytes per second, multiplied by 1000ms (in a second). That's about 0.12ms of latency caused by your MTU, equivalent to 1 frame at 8000 frames per second. It would change your in-game latency from 20ms to 20ms.
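Both delay figures are just the same one-line calculation - here's a minimal sketch of it, using the 4MiB/s bandwidth figure from above:

```python
# Serialization delay: the time to put one full-MTU packet on the wire,
# which is the maximum extra wait it imposes on the packet behind it.
BANDWIDTH = 4 * 2**20  # 4 MiB/s, in bytes per second

def max_delay_ms(mtu_bytes: int) -> float:
    """Worst-case delay (ms) one packet of mtu_bytes adds to the next."""
    return mtu_bytes / BANDWIDTH * 1000

print(round(max_delay_ms(1536), 2))  # 0.37
print(round(max_delay_ms(512), 2))   # 0.12
```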
If you game at 8000 frames per second, and the game you're playing has network code that also runs at that rate, you ABSOLUTELY SHOULD change your MTU. Meanwhile, in 2018 typical game servers run at tick rates of 63Hz, 64Hz or even 21Hz - and the twitchiest, fastest game to date runs at 128Hz. I don't think we're in danger of getting servers running at more than 3000Hz any time soon.
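To put the MTU delay in perspective against those tick rates (a sketch using the 2018 figures quoted above - even the worst-case delay at the old, bigger MTU fits inside a single tick with room to spare):

```python
# Worst-case serialization delay at the original 1536-byte MTU on a 4 MiB/s link.
BANDWIDTH = 4 * 2**20  # bytes per second
mtu_delay_ms = 1536 / BANDWIDTH * 1000  # ~0.37 ms

for tick_hz in (21, 64, 128):
    tick_ms = 1000 / tick_hz
    # The MTU delay is a tiny fraction of one server tick in every case.
    print(f"{tick_hz}Hz tick = {tick_ms:.2f}ms, "
          f"MTU delay is {mtu_delay_ms / tick_ms:.1%} of a tick")
```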
TL;DR - even if your internet connection is a couple of orders of magnitude slower than a mediocre connection in 2018, you won't save even a single frame of latency by messing with your MTU.