When trying to estimate CPU wear rate, I'd go with [life halved for every 10C increase] times [some very steep voltage-based formula I don't know] times the duty cycle. It does depend on quite a few implementation details I'm not familiar with, but if a core is in one of the deeper sleep states, I expect you could throw quite a lot of voltage at it with no ill effects.
When sorting this out for my own CPUs, I treat each 0.03V increase as halving the expected life (which is probably pretty conservative, but I'd rather be conservative with this). Say I expect my CPU manufacturer had a design life in mind of 10 years 24/7 at their top all-core voltage bin, I'm willing to cut that to 2.5 years 24/7, and I expect my cooler to keep it to 70C in normal use (30 below spec). Then I'll go as far as 0.15V over that top manufacturer bin: the 30C of thermal headroom is three 10C halvings, which cancels out three 0.03V steps (0.09V), and the remaining 0.06V is two halvings, taking the 10-year design life down to the 2.5 years I'm willing to accept. Ryzen (XFR, really) does a bunch of weird stuff that makes this a bit tougher, but maybe I'll be able to glean more about that voltage/wear curve by examining its operation.
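The arithmetic above can be sketched as a tiny function. This is just the back-of-envelope heuristic from the text (life halved per 10C and per 0.03V, scaled by duty cycle), not anything from a datasheet; the function name and baseline numbers are my own for illustration.

```python
def expected_life_years(base_life_years, delta_temp_c, delta_volts, duty_cycle=1.0):
    """Rough lifetime estimate relative to an assumed design rating.

    base_life_years: assumed design life at rated temp/voltage, running 24/7.
    delta_temp_c:    degrees C above (+) or below (-) the rated temperature.
    delta_volts:     volts above (+) the rated voltage.
    duty_cycle:      fraction of time actually spent at this operating point.
    """
    temp_factor = 2 ** (delta_temp_c / 10)   # life halves per +10C (rule of thumb)
    volt_factor = 2 ** (delta_volts / 0.03)  # life halves per +0.03V (conservative guess)
    wear_rate = temp_factor * volt_factor * duty_cycle
    return base_life_years / wear_rate

# Worked example from the text: 10-year design life, running 30C below
# the rated temperature, +0.15V over the top manufacturer bin, 24/7 load.
print(expected_life_years(10, -30, 0.15))  # -> 2.5
```

Running 30C cooler is 2^3 = 8x longer life, +0.15V is 2^5 = 32x shorter, so the net wear rate is 4x and 10 years becomes 2.5.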
In short, yes, overclocking wears things out faster, but everything wears out anyway. It just happens slowly enough that you're not going to notice, and overclocking isn't going to change that unless you're either abusing the chip with an extremely high-temperature workload 24/7 (like prime95) or overclocking to unreasonable levels. At 1.40V, don't worry about it.