just brew it! wrote:whm1974 wrote:Anyway looking at the code it sure does look like a nightmare language to program in.
Some people say that about C/C++.
"What do you mean I need to do my own memory management? That's crazy talk!"
Buub wrote:just brew it! wrote:whm1974 wrote:Anyway looking at the code it sure does look like a nightmare language to program in.
Some people say that about C/C++.
"What do you mean I need to do my own memory management? That's crazy talk!"
C maybe... C programmers seem to be in love with cleverness and terseness.
There's really no excuse for it in C++. C++ code should be readable. The compiler optimizers are better than humans at optimizing code these days anyway, so getting clever to save a few cycles can backfire as frequently as it helps.
just brew it! wrote:whm1974 wrote:Even so, shouldn't programmers still need to know how to do memory management?
Part of the point of languages with background GC (Python and Java, to name two) is that no, the programmer does not need to worry about memory management. It is all supposed to work magically behind the scenes. And in general, it does. Until it doesn't.
DancinJack wrote:whm1974 wrote:Even so, shouldn't programmers still need to know how to do memory management?
If you get this far in C/C++, let me know if you feel the same way after you learn another language that does things for you.
POINTERS!!! POINTERS EVERYWHERE!!!!!!!!!!!!!!!!!
whm1974 wrote:I didn't say that programmers should do memory management manually, but they should know how to do it.
whm1974 wrote:Buub wrote:just brew it! wrote:Some people say that about C/C++.
"What do you mean I need to do my own memory management? That's crazy talk!"
C maybe... C programmers seem to be in love with cleverness and terseness.
There's really no excuse for it in C++. C++ code should be readable. The compiler optimizers are better than humans at optimizing code these days anyway, so getting clever to save a few cycles can backfire as frequently as it helps.
Even so, shouldn't programmers still need to know how to do memory management?
Redocbew wrote:It's best if you do, just to help avoid doing something boneheaded, but I think Buub was referring more to the kind of "micro-optimizations" that may shave some CPU time off the program, but do so at the expense of readability.
Pancake wrote:whm1974 wrote:I didn't say that programmers should do memory management manually, but they should know how to do it.
You've got some strange ideas of what programmers should or shouldn't be doing - for a complete noob! My advice: don't take on any prejudices or preconceptions. Have an open mind about everything when you're learning, otherwise you'll create all sorts of artificial barriers for yourself.
Pancake wrote:In actuality, having a heap you can rely on to allocate/deallocate chunks of memory is a cop-out. A crutch. If you're really down to the bare metal you will deal with one large chunk of RAM and have at it - as we would have to in the old days. A heap is - of course - far better. But it is also an abstraction. The CPU doesn't know there's such a thing.
Pancake wrote:In actuality, even in a high level language like Java with garbage collection you STILL have to be quite careful how you manage your memory and other resources. But it's nicer not having to think about the minutiae and focus on the design/implementation. Garbage collector > C-style heap management. Garbage collection can even be a performance win as with the JVM which can do it in separate thread(s) helping take advantage of multi-core processors. But whatever heap model you're using a GOOD programmer will be mindful they're not thrashing it by unnecessary use.
Pancake wrote:In actuality, 99% of software developers have no need to know about free()/malloc(). We've been freed from the tyranny of explicit memory management for the vast majority of our work. And that's a good thing.
Buub wrote:Redocbew wrote:It's best if you do, just to help avoid doing something boneheaded, but I think Buub was referring more to the kind of "micro-optimizations" that may shave some CPU time off the program, but do so at the expense of readability.
Yes, this too. Make the code correct first, then optimize later. Making the code well-architected and readable is frequently more valuable than clever optimizations for two reasons: 1) You may be optimizing something that isn't even a bottleneck, making the code harder to maintain for no real benefit, and 2) the compiler might already be doing as good a job as you could.
Buub wrote:Totally agree with the n-squared cost loops UNLESS you can guarantee that this is only ever used for small values of n - and by guarantee I mean a fundamental limit of what you are dealing with, not just what the dataset is today, because sometime in the future someone will increase it!
Of course, this is not to say that you shouldn't be performance aware. You absolutely should. Don't make a bunch of n-squared cost loops. Don't do stupid things. Make "free" optimization habits (pre-increment/pre-decrement rather than post-increment/post-decrement, except where the latter is explicitly needed, for example).
But make the code correct, without doing dumb things. Worry about the clever optimizations later.
liquidsquid wrote:Personally I try to avoid malloc() and its ilk. For one, RAM is usually limited on small MCUs. For another, if it isn't VERY carefully managed, your heap can get fragmented on a smaller processor over time, to the point you can no longer get a new chunk of contiguous memory even if there is plenty available. It is a long-term operation issue on embedded devices, especially ones that can (hopefully) run for years at a time.
just brew it! wrote:Not sure I'd call it a cop-out. The way you'd likely deal with "one large chunk of RAM" would be to write a module with some utility functions to help you manage it. That module might even end up looking a lot like malloc() and free().
Pancake wrote:I plant lots of native plants as part of environmental rehabilitation and build beautiful gardens using native plant species.
But there's a joy in computation. In exploring algorithms and the unknown. Mathematics is the frontier of the mind. This is my country: http://fluffysoft.com.au/war/Com_fluffysoft_web.html
DancinJack wrote:Can we please stop it with the real man ****? It's dumb and just, no.