
Types, types, types! February 27, 2017

Posted by PythonGuy in Uncategorized.

Of all the arguments against Python, the one that still seems to stick nowadays is typing. Critics think Python’s type system is all wrong, and that we should instead embrace an archaic type system designed back when computers had very limited CPU cycles and memory.

I think I have an argument that should completely obliterate their point of view.

They like their type system because it gives them:

  • Speed
  • Reliability and predictability
  • Control

OK, I’ll give you that. When the compiler knows in advance what type a variable is, you can write simple checks that ensure you never have a type mismatch (really, you just push that class of error out of the program itself), and you feel like you are somehow managing complexity. You win that part of the debate.

However, what do you do with generic types?

If they don’t understand why generic types are important, bring up templates: generic functions or classes that can be applied to any type that provides a certain basic level of functionality.

See, when you bring up generics in these type systems, it is a whole ball of wax that they know better than to tangle with. They roll their eyes, they say things like, “When would I ever use generics? They’re so error-prone and a source of so many problems.”

The “Aha!” moment is when they realize that every type system must support generic types, and their type system does it poorly.

Something like Python, with its dynamic-strong paradigm, handles them effortlessly.
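As an illustration (the function and names here are my own, not from any particular library), here is a sketch of a “generic” function in Python. It works on any type that supports the operations it actually uses — ordering and `+` — with no template machinery at all:

```python
def largest_pair_sum(items):
    """Return the sum of the two largest elements.

    Works on any sequence whose elements can be ordered and added:
    ints, floats, strings, or your own classes that define
    __lt__ and __add__. This is duck typing doing the job of templates.
    """
    a, b = sorted(items)[-2:]
    return a + b

print(largest_pair_sum([3, 1, 4, 1, 5]))           # → 9
print(largest_pair_sum([2.5, 0.5, 1.0]))            # → 3.5
print(largest_pair_sum(["apple", "pear", "fig"]))   # → "figpear"
```

In C++ you would reach for a template to express the same thing; in Python the function is generic by default, and a type that lacks the needed operations simply raises a `TypeError` at the point of use.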

No, you can’t predict how your program will behave, but could you ever, really? All you can do is break it down into little bits that you can test, test those bits (AKA unit tests), and then see how the parts fit together (AKA integration tests). I mean, you’re going to write your tests anyway, right? What fool would think that just because their program compiles, it must be correct?
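A minimal sketch of that workflow (the function and test names are my own illustration, not from the post): isolate a small bit of logic, then pin its behavior down with tests.

```python
def mean(values):
    """The small 'bit' under test."""
    return sum(values) / len(values)

def test_mean_of_ints():
    assert mean([1, 2, 3]) == 2.0

def test_mean_of_floats():
    # Floating-point sums aren't exact, so compare within a tolerance.
    assert abs(mean([0.1, 0.2]) - 0.15) < 1e-9

# In a real project a runner like pytest would discover and run these;
# here we just call them directly.
test_mean_of_ints()
test_mean_of_floats()
```

The point is that these tests catch exactly the kinds of mistakes a static type checker cannot: the types can all be right while the arithmetic is wrong.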

In short: You’re going to have to deal with generic types eventually. It’s better to use a language that can handle them very well, because once you’ve handled generic types, you’ve handled ALL types.


Someone asked me about programming languages February 27, 2017

Posted by PythonGuy in Uncategorized.
I won’t post the question or the context. I don’t think it’s too important. The response I wrote follows.
It’s hard to describe what a programming language is supposed to be.

Imagine you were an engineer in charge of designing a bridge. You want to build a bridge that goes from point A to point B and can carry a certain amount of weight and such.

Now, of the umpteen billion possible bridges you could build, you’re going to settle on one design. When you decide which design is “best”, you’re going to factor in things like the cost of materials, the technology the local construction companies have available, what kind of maintenance will be performed, etc.

Programming languages are like bridges. They go from point A (ideas in your head) to point B (actual running code). When you consider which programming language to use, you are not so interested in whether the language can do the job, because they all can as long as they are Turing complete. You’re going to be interested, instead, in things like how much work it takes to write the program, who is available to maintain or extend the code base, and what you will have to do to find and fix bugs that may arise in the future.

Of all the attributes of a programming language, the most important to me, after programming my whole life and spending 17 years being paid to program, is complexity. If your programming language is complex, then I know it’s going to be really hard to write a correct program, very hard to maintain, and it will cost a lot of money to find people smart enough to keep it working.

But a programming language can be too “simple” as well, meaning that it lacks features I would like to have available. For instance, Lua is a great, simple language, but it is too simple: it leaves out things other languages provide, and I find that very annoying. And the simplest language of all? Machine code. It’s so simple even CPUs can understand it.

It’s all about balance.

When I talk about “Vanilla C” vs. “C/C++”, I am describing a problem I saw arise in the C/C++ community. C itself is a rather simple language, pretty well-defined, with only a few weird cases that arise rarely. The subset of C I call “Vanilla C” is plenty to get any job done right the first time. It takes quite a bit of work, and you have to be a bit verbose, but it gets the job done. C++ introduced a number of new concepts designed to make my life better, but they just ended up making my job harder. I saw teams write down rules like “please don’t use feature X.” It seemed every new thing brought in by C++ was disfavored, except namespaces, and even those could be confusing.

The reason people use C at all nowadays is for speed and simplicity. Well, now that CPUs are not the bottleneck for the vast majority of problems we are challenged with (even video games!), C and its cousins are not the solution we are looking for. Besides, we learn in college that oftentimes the issue is we’re using an O(n^2) solution when a pretty good O(n) solution exists, and no amount of O(n^2) C code will beat an O(n) Python solution once the input gets large enough.
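A hypothetical illustration of that point (problem and function names are mine): the same question — “does this list contain a duplicate?” — answered in O(n^2) and in O(n). The algorithmic win dwarfs any constant-factor speedup a lower-level language could offer.

```python
def has_duplicate_quadratic(items):
    # O(n^2): compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n): one pass, remembering what we've already seen in a set.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicate_quadratic([1, 2, 3, 2]))  # → True
print(has_duplicate_linear([1, 2, 3, 2]))     # → True
print(has_duplicate_linear([1, 2, 3]))        # → False
```

For a million elements, the quadratic version does on the order of half a trillion comparisons while the linear one does a million set lookups — no compiler can close that gap.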

And one more thing: when you write Python, you can have the computer compile it down to something very close to C (tools like Cython do exactly this) and run at nearly C speed. And with a JIT such as PyPy, some workloads run even faster.

The world has changed a lot since the 90s. It’s time to embrace the new paradigm shift where the programming language can do a lot more work for you, all without creating a very complex system.