Ideally, the enterprise you work for can afford a competent team that enforces source control and code reviews/audits of anything that gets added for merging to production.
I work in industrial automation with equipment that can kill people, and the closer you get to the systems that directly control the fatality potential, the fewer controls seem to exist.
I am not joking, there are ladder-logic programmers who come in on contract from vendors/project management companies, who do god knows what with zero documentation, zero control, zero oversight, and this happens all the time.
I directly worked with someone who, futzing around with a level 1 system, completely violated LOTOTO (lockout/tagout/tryout) and could easily have killed someone if all the factors had coincidentally lined up.
He was promoted. This is a Fortune 250 company.
If you can't trust your employees to do their job correctly and with integrity, I'd say you have bigger problems.
If someone is offered half a million dollars to "make a mistake," a "mistake" that will likely never lead to any actual losses because they're in cahoots with someone who will take another half-million-dollar payday to privately disclose it before anyone else discovers it, what's the "bigger problem"?
Because that's... as big as it realistically gets for anyone, right? What other kind of ethical concern about code maintainership has anywhere near that kind of temptation behind it?
Code reviews should be mandatory. Pair programming can be a useful tool too: early on it helps junior devs learn the ropes, and switching pairs around keeps the style of the code consistent and the quality high and honest.
OK. So your Node.js or whatever crap is great. Perfect, fact. Best ever.
Who cares? What about libraries?
Layered software? Environment control?
I mean, in this particular case, it's BlueZ, right? An open-source Bluetooth stack?
What does a company do about that?
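One partial answer, sketched below with a made-up stand-in file and no real BlueZ artifact, is to vendor third-party components and pin them by hash, so any change to the dependency has to pass through the same review gate as first-party code:

```shell
#!/bin/sh
# Hypothetical sketch: pin a vendored third-party archive by hash so it can
# only change through an explicit, reviewed update. File contents are made up.
set -eu
mkdir -p vendor
printf 'stand-in for a real bluez tarball\n' > vendor/bluez.tar.xz

# Recorded once, at audit time, and committed alongside the archive:
sha256sum vendor/bluez.tar.xz > vendor/bluez.tar.xz.sha256

# Re-checked before every build; a dependency that drifted fails loudly:
sha256sum -c vendor/bluez.tar.xz.sha256 || {
  echo "vendored dependency changed since last audit" >&2
  exit 1
}
```

This doesn't audit the upstream code itself, but it does turn "what version of BlueZ are we even running?" into a question with a checkable answer.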
Any source control software is going to have a way to track everything anyone has ever done.
In actuality, you tend to find that you can't always build what's in it. Or, when you can, it doesn't exactly match what's currently in production. Who knows why? The guy with the commits doesn't, or she doesn't even work there anymore. Or it turns out he didn't write it; someone else did. It was a code snippet from Stack Exchange, cargo-cult code copied from somewhere else internally. They couldn't explain what it did when they wrote it the first time, let alone now. Worst case, you fire them. What a great place to work!
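As a toy illustration (throwaway repo, made-up authors and file names), the history is all there, but it only answers "who committed what", not "does this build" or "does this match production":

```shell
#!/bin/sh
# Toy repo showing that a complete VCS history is not the same as understanding.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=alice -c user.email=alice@example.com \
    commit -q --allow-empty -m "initial import, no notes"
printf 'do_the_thing();\n' > control.c
git add control.c
git -c user.name=bob -c user.email=bob@example.com \
    commit -q -m "fix"   # a message like this explains nothing

# Every line is attributable to an author...
git blame -s control.c
# ...and every change is logged:
git log --format='%h %an %s'
# But nothing here proves the tree builds, or that it matches production.
```

The audit trail is necessary, just not sufficient: attribution without reproducible builds and meaningful commit messages is exactly the "who knows?" situation described above.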
This is not a simple problem, and the moral hazard that JBI is talking about is precisely why bug bounties are almost exclusively 1) offered by best-in-class makers of 2) ubiquitous, broad-facing software, and 3) claimed virtually only by an informal community of well-known participants.
Making the practice universal confounds all three.