Personal computing discussed
Moderators: renee, SecretSquirrel, just brew it!
just brew it! wrote:@chuckula -
As they say, "Those who can, do; those who can't, teach!"
And yes, there's a mindset you need to get used to if you want to be an effective Python coder. Having used C/C++ for most of my career, learning Python really messed with my head. Also, if I do a project in Python then switch back to C (which is still the language I use the most at work) I keep leaving the damn semicolons off of everything; the first compilation of the first new module I write after switching back comes up with dozens of syntax errors due to missing semicolons!
chuckula wrote:* I've done plenty of C too, and I've done C code that interfaces with Python in the past.
just brew it! wrote:chuckula wrote:* I've done plenty of C too, and I've done C code that interfaces with Python in the past.
Yeah, the C/Python interface is surprisingly straightforward. I've done a fair bit of hybrid Python/C too -- in some cases for talking to I/O devices, and in other cases for doing number crunching where there was a need to efficiently leverage multiple cores.
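One lightweight way to cross that C/Python boundary, short of writing a full extension module, is the standard `ctypes` module. A minimal sketch, assuming a Unix-like system where the C math library (libm) can be located:

```python
import ctypes
import ctypes.util

# Locate and load the system C math library; resolution is platform-dependent.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# ctypes assumes int arguments/returns by default, so declare the real C signature.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(9.0))  # 3.0
```

For number crunching across multiple cores, a compiled extension (or Cython) is usually the better fit, since it can release the GIL; `ctypes` is mainly handy for quick calls into existing shared libraries.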
chuckula wrote:Think of the semicolons!!
P.S. --> You actually can put semicolons in python, they just don't do anything for you. If it makes it easier to stay consistent with C and if you don't mind snobby comments from us Python types* then you could go that way.
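For anyone curious, the semicolon in Python is a statement separator, so both trailing semicolons and C-style multi-statement lines parse fine (even if PEP 8 frowns on them):

```python
# A trailing semicolon is a legal no-op, just as in C.
x = 1;

# Semicolons can also separate multiple statements on one line.
y = 2; z = x + y;

print(z)  # 3
```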
* I've done plenty of C too, and I've done C code that interfaces with Python in the past.
just brew it! wrote:2. I was once assigned to debug a sizable body of C++ code that had been written by others. It was full of memory leaks and wild pointer dereferences, and crashed/misbehaved seemingly at random. I discovered that the original coders had used C++ new/delete and C malloc/free more or less interchangeably (e.g. sometimes an object would be allocated with new, but deallocated with free... or vice-versa). After about 6 months of beating my head against that wall I quit that job.
just brew it! wrote:As they say, "Those who can, do; those who can't, teach!"
anotherengineer wrote:My question is: what is the most common mainstream programming language today?? C#??
Deanjo wrote:anotherengineer wrote:My question is: what is the most common mainstream programming language today?? C#??
Java, followed by C, then C++, then Python, then C#.
http://spectrum.ieee.org/static/interac ... -languages
just brew it! wrote:1. In college I spent a couple of semesters as a grader for a sophomore-level CS data structures course. By midterm, there were still quite a few students in the class who couldn't get their assignments to compile. In one case, it was because the person insisted on coding in a different programming language than the compiler being used for the course (in spite of being informed repeatedly that he really needed to code in Pascal, not PL/1, if he wanted to pass).
#include <stdio.h>

int main()
{
    printf("Sorry I couldn't finish this assignment\n");
    return 0;
}
just brew it! wrote:I'm surprised PHP isn't in the top 3, given its ubiquity in the web app space. I'm also surprised that C is still that close to the top.
Edit: Found a non-paywalled link: http://spectrum.ieee.org/computing/soft ... -languages
Including VHDL in a list of programming languages seems a bit odd to me?
UnfriendlyFire wrote:At my work, we have a handful of 1970's industrial controls systems, with 4 to 16 kilobytes of system memory/storage and no way of expanding the storage capacity. Because the memory and storage are on the same volatile memory, turning off the systems will wipe the entire memory. Replacing those control systems cost several million dollars and a few months of no operation, so upgrades are out of the question.
Redocbew wrote:just brew it! wrote:Including VHDL in a list of programming languages seems a bit odd to me?
Yeah that's weird. I always thought VHDL was just for modeling.
Redocbew wrote:I wonder if PHP is in the process of getting edged out by Ruby. I see a lot more postings including Ruby than I do PHP despite how much PHP code is around these days.
Chuckaluphagus wrote:UnfriendlyFire wrote:At my work, we have a handful of 1970's industrial controls systems, with 4 to 16 kilobytes of system memory/storage and no way of expanding the storage capacity. Because the memory and storage are on the same volatile memory, turning off the systems will wipe the entire memory. Replacing those control systems cost several million dollars and a few months of no operation, so upgrades are out of the question.
This strikes me as a company-obliterating disaster waiting to happen. I'd be a freaking nervous wreck if I had to interact with or be responsible for those.
[Edited because prepositions are important.]
just brew it! wrote:Deanjo wrote:anotherengineer wrote:My question is: what is the most common mainstream programming language today?? C#??
Java, followed by C, then C++, then Python, then C#.
http://spectrum.ieee.org/static/interac ... -languages
Link appears to be paywalled.
I'm surprised PHP isn't in the top 3, given its ubiquity in the web app space. I'm also surprised that C is still that close to the top.
Edit:
Found a non-paywalled link: http://spectrum.ieee.org/computing/soft ... -languages
Including VHDL in a list of programming languages seems a bit odd to me?
dmjifn wrote:Some of the worst code I've seen recently is in interviews where the candidate and I whiteboard a couple of softball code problems. I ask a question that's pretty much Project Euler #1. This is a tough exercise for most just because you're on the spot and out of your element. But we take pains to ease candidates into it. And I'm super liberal in accepting reasonable pseudocode, even letting them assume operators and API functions that don't exist but really probably could. Everyone we interview has a good resume listing very-likely-sounding accomplishments in the area of C# and .NET. Shockingly, less than half can even work through the logic of the problem, much less write the code.
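For reference, Project Euler #1 asks for the sum of all multiples of 3 or 5 below 1000. The straightforward iterative answer is only a few lines in any language; in Python, for example:

```python
def sum_multiples(limit):
    """Sum of all numbers below `limit` divisible by 3 or 5."""
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

print(sum_multiples(1000))  # 233168
```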
BIF wrote:Chuckaluphagus wrote:UnfriendlyFire wrote:At my work, we have a handful of 1970's industrial controls systems, with 4 to 16 kilobytes of system memory/storage and no way of expanding the storage capacity. Because the memory and storage are on the same volatile memory, turning off the systems will wipe the entire memory. Replacing those control systems cost several million dollars and a few months of no operation, so upgrades are out of the question.
This strikes me as a company-obliterating disaster waiting to happen. I'd be a freaking nervous wreck if I had to interact with or be responsible for those.
[Edited because prepositions are important.]
I agree. This is like having no emergency plan. If a disaster ever struck them, they'd be out of business and maybe could find themselves sued for breach of ... everything.
dmjifn wrote:I'm assuming that the preferred solution is to solve this mathematically rather than through iteration?
JustAnEngineer wrote:dmjifn wrote:I'm assuming that the preferred solution is to solve this mathematically rather than through iteration?
The count of multiples of 3 below max is int((max-1)/3). Let's call that C3.
The average of those multiples is 3 x (C3+1)/2. Let's call that A3.
The sum is C3 x A3.
Programming it, you'd have to be careful about data types, but when I do it in Excel, it treats everything as a real.
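That formula translates directly into code. One way around the data-type worry: keep everything in integers by multiplying before dividing, since C × (C+1) is always even and the division by 2 is then exact. A quick sketch, checked against a brute-force loop:

```python
def closed_form_sum(k, max_n):
    """Sum of the multiples of k strictly below max_n, via the arithmetic series."""
    c = (max_n - 1) // k       # count of multiples (C3 in the post, for k = 3)
    # c * (c + 1) is always even, so integer division by 2 is exact here.
    return k * c * (c + 1) // 2

print(closed_form_sum(3, 1000))   # 166833
print(sum(range(3, 1000, 3)))     # 166833 -- brute force agrees
```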
UnfriendlyFire wrote:So a VM or DOSBox?
The compiler that converts the code that we type on our computers and writes it to the control systems isn't guaranteed to work with Windows 8/10. The compiler uses Windows 7's compatibility mode.
JustAnEngineer wrote:You could get rid of duplicates by plugging in the product (3x5) into the formula, then subtracting it from the sum of the result for 3 and the result for 5.
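Putting that together with the series formula gives the full non-iterative solution: add the sums for 3 and 5, then subtract the sum for 15 to remove the double-counted multiples. A sketch:

```python
def series_sum(k, max_n):
    """Sum of multiples of k strictly below max_n (arithmetic series)."""
    c = (max_n - 1) // k
    return k * c * (c + 1) // 2

# Inclusion-exclusion: multiples of both 3 and 5 are exactly the multiples of 15.
total = series_sum(3, 1000) + series_sum(5, 1000) - series_sum(15, 1000)
print(total)  # 233168
```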
just brew it! wrote:FWIW the 1-line Python solution is:
print sum(set(range(3,1000,3)+range(5,1000,5)))
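That one-liner is Python 2 syntax (print statement, list-returning range). A Python 3 equivalent needs print as a function and a set union instead of list concatenation, since range() is lazy there:

```python
# Python 3: range() doesn't return a list, so build sets and take their union.
print(sum(set(range(3, 1000, 3)) | set(range(5, 1000, 5))))  # 233168
```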
Myself in the interview wrote:Say we're looking at the numbers 1 through 10. (Write them on the board.) If we consider the numbers that are multiples of 3 or 5, we would have 3, 5, 6, 9, and 10. (Draw short arrows from those numbers.) The sum of those is 33. (Write "3 + 5 + 6 + 9 + 10 = 33" at the end of those arrows.) Would you write a function to do this? So if I give you "10" you give me "33". (Circle 10 and 33.)
public int SumIt(int n) {
    int sum = 0;
    // Your code here
    return sum;
}
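For comparison with the C# stub, here's a hypothetical Python version of the function the interviewer describes (names are illustrative): sum the multiples of 3 or 5 from 1 through n inclusive, so passing 10 returns 33.

```python
def sum_it(n):
    """Sum of the multiples of 3 or 5 between 1 and n, inclusive."""
    return sum(i for i in range(1, n + 1) if i % 3 == 0 or i % 5 == 0)

print(sum_it(10))  # 33
```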