You take some fuck, and shit shit, some fuck, and some shit you've got
a fuck shit stack. A fuck shit stack.
Mr. Fukushita wishes to have a word with you, sir.
yea, java and see-carpet suck as well.
JavaScript is more C-ish than Python if you do it right (use jslint and so on); meanwhile I like both of them.
What's f*'n me up is that all of them (Python/JS/C++) have key/value pair lists, flat lists and so on, but they're named differently, or even worse, with conflicting senses. It's a bit like cats 'n' dogs tail language.
C/C++ will also continue to be used for some time on software that has performance needs (gaming, in particular), and that has to work closer to the operating system for some reason (embedded, and weird stuff).
C++ isn't going to die that easily, though, since the gaming community uses it pretty heavily. It offers much of the performance of C with significantly fewer of its security problems.
I think the trend is growing toward Java and C#, unfortunately, with Python and JavaScript being used for scripting and, if I have it right, client-side stuff. I've worked with both Java and C# on some relatively sophisticated projects, in Linux and Windows respectively, particularly server-related programming. They make things seem easy, right up until you have to get close to the operating system; then both of them are really annoying to use. At least for me.
I do like working with a decent IDE, though, I have to confess. It seriously cuts down the time it takes to write code.
Mon Apr 27 2015 09:15:19 AM EDT from dothebart @ Uncensored...
javascript is more c-ish if you do it right (use jslint and so on) then python; meanwhile I lik both of them. whats f*'n me up is that all of them (python/js/c++) have key/value pair lists, flat lists and so on, but they're named different or even worse - with offending sense. Its a bit like Cats n Dogs tail language.
I agree. The ability to write C and wrap it up for speed, as a library or library extension, makes Python a bit of a security blanket for me. But I must admit the key/value pairs have perplexed me as well. I love slicing and all that, but list / dictionary / tuple etc. is a bit much to remember at my advanced age :-) And they seem to be adding / changing language syntax at an ever-increasing rate these days. I would recommend a bit of Raymond Hettinger to figure out why they do what they do in the Python world - smart nerd:
https://www.youtube.com/watch?v=OSGv2VnC0go
(skip to about 5:22 to get past the chit chat stuff.)
Watch the whole thing to get the why behind the differences between the languages. I "believe" Python has it right, but would love to hear how other languages have it more right (i.e. I don't have to twist my brain in knots to figure out what to type)...
Anyway, I like Raymond. He makes programming fun again :-)
I'm going to open a big and stupid can of worms here. But I am actually interested.
What screen width is *actually* used for programming these days?
I know the traditionalists will always say 80 columns, but really, it's hard to stay inside an 80-column screen. I have mine set to 132 columns, simply because I wanted more columns but wanted a non-arbitrary value (at least 132 is "something common").
Have most developers already thrown off the shackles of a width determined half a century ago by IBM's punch cards?
hah, yes that's funny.
The 80 columns are actually referred to as "what one can overview quickly."
I guess they had neither stuff like indentation nor class hierarchies in mind with that.
imho 120 columns is OK, where there shouldn't be more than 80 characters actually filled.
The C++11 auto feature comes to the rescue here; at least in many cases you can better adhere to the DRY rule using it.
If the type isn't present on that line, I'd discourage using auto, since the source becomes harder to read if you don't know what that specific type pulled from some obscure getter actually is, or what to do with it.
I guess they had neither stuff like indentation nor class hierarchies in mind with that.
Nope. I'm pretty sure nobody was writing Python code on IBM punch cards in the late 20th century.
What screen width is *actually* used for programming these days?
I actually use 80, because on a typical 1080p monitor I can throw two windows side by side and compare test driver with implementation, or client with library, etc...
imho 120 columns is ok, where there shouldn't be more than 80
characters filled.
yeah, we use 120 as a standard, especially our UI guys, but my intellij autoformat seems to want to stay on 80 anyway, and I have not seen fit to fix my broken settings ;) (see previous post)
Huh... I haven't really given that question as much thought as I should.
I just sorta use whatever width I feel I need, and try not to get too ridiculous.
If I see it getting too ridiculous, I assume I am probably making some kind of mistake, and try to find ways to refactor the code a little to make it more modular.
But, sometimes, I don't feel I have a lot of choice... maybe I'm using code from somewhere else, or whatever. Then I just learn to deal with the width problem.
I'm trigger-happy with auto-format, and IntelliJ will happily re-wrap most lines in .java files to 80 (with exceptions for things like long comments).
This is **not** reliable for .groovy: IntelliJ will alter your code's semantics by adding newlines in places that change the code's meaning (due to semicolon inference...)
How to disrupt your contributing community:
https://github.com/orientechnologies/orientdb/issues/4354
*popcorn*
http://whatthecommit.com/
Google some of these and you'll see them appearing in real-world open-source projects...