The Prototype's Language
The evolution of programming languages in the payments technology sector is discussed, highlighting the shift from COBOL to Java and now to Python, chosen for its speed and adaptability. Language choice impacts developers and the quality of their work.
The article discusses the evolution of programming languages in the payments technology sector, highlighting the shift from COBOL to Java and now to more modern languages like Python. It emphasizes that the choice of programming language is not just a technical decision but also a social one, impacting the community of developers and the quality of work produced. The author argues that Java, once dominant in payments technology, is no longer suitable given the need for nonstop availability, quick response times, and cost-effectiveness. Python is presented as the new preferred language for building payment systems because of its speed in prototyping and its adaptability in a fast-paced technological landscape. The article draws parallels between Python's rise and Clayton Christensen's theory of disruptive innovation, positioning Python as a tool embraced by innovative engineers over traditional corporate-backed languages like Java. The importance of selecting the right programming language for payment applications is underscored as a critical and impactful decision in software development.
Related
Even taking it at face value that the choice is between Python and Java, the article leaves so much unaddressed.
> a good plan executed now trumps a perfect plan done next week
In what universe does this apply to a payments system?
Hatred for Java is burned deep in my soul, and I've spent an unreasonable amount of time on this very web site carefully articulating reasons why I think Java is a crufty relic whose design reflects a bunch of discredited beliefs from a quarter century ago. But gosh, if I had to choose between Java and Python for a payments system, I think I might choose Java.
Got it. So why was Python chosen again? Call me crazy but I want things that touch my money to be rock solid.
An interesting exercise many programmers don't do but should is to program purely in paradigms other than imperative OOP. The solutions you come up with are very different when you don't have objects, for loops, and if statements in your language's vocabulary.
Try writing a real program in Prolog or SQL. Not just database calls to retrieve data, but an actual full app with a UI, user interaction, network calls, and all the rest. You will likely feel like a novice programmer if you haven't trained your brain to think in set operations like union, join, and intersect. When everything in your program is a set instead of a stateful object, you do set operations over them instead of method calls. That's uncomfortable and disorienting at first.
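As a concrete sketch of that shift in thinking (a made-up Python example with hypothetical data, not anything from the article or the comment), here is the same question answered twice: once imperatively with loops and if statements, once as a single set operation, the way a SQL `EXCEPT` would express it.

```python
# Hypothetical data: which customers placed an order but have no saved payment method?
# Both functions answer the same question; only the paradigm differs.

def uncovered_imperative(orders, payment_methods):
    """Imperative style: loops, branches, and mutable state."""
    result = []
    for order in orders:
        has_method = False
        for method in payment_methods:
            if method["customer_id"] == order["customer_id"]:
                has_method = True
                break
        if not has_method and order["customer_id"] not in result:
            result.append(order["customer_id"])
    return result

def uncovered_set_based(orders, payment_methods):
    """Set style: the whole question is one difference of two sets."""
    ordered = {o["customer_id"] for o in orders}
    covered = {m["customer_id"] for m in payment_methods}
    return ordered - covered  # set difference; no loops or branches in sight

orders = [{"customer_id": 1}, {"customer_id": 2}, {"customer_id": 2}]
payment_methods = [{"customer_id": 1}]

assert set(uncovered_imperative(orders, payment_methods)) == uncovered_set_based(orders, payment_methods) == {2}
```

The second version is how SQL or Prolog would frame it: describe the sets and the relation between them, and leave the iteration to the engine.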
As for the article, I didn't get much from it past the bit I quoted.
FWIW that PG quote is fully out of date; I do not associate Python with programmer competency. The first thing I ask when a programmer says they know Python is "What other languages do you program in?", because I've come across far too many Python programmers who can't tell a stack from a heap. YMMV
But whatever, we've still got lua 5.1ish.
The rest of the article is coherent, including the properties you want in the language; I just think python in particular has catastrophically lost its way.
But what _is_ the author talking about? It just isn't said. The article merely states that the argument 'python is slow' is no longer relevant. It doesn't even state a reverse argument (why python would be better than java), and thus doesn't even give readers a starting point for countering _those_.
As a trivial example, perhaps the author means: a nominally, strongly, explicitly typed language is obsolete. In which case - how does that explain how, for example, typescript has happened, which actively _re-introduces_ these aspects to a language?
Clearly then, 'python is superior' cannot be taken as a truth universally held and requiring no logical argument.
The rest, sure, whatever - "Programming language used has a significant impact on what you can write and how one would write it" is... kinda obvious, no? Turing machine logic says that it is _literally_ incorrect (anything you can compute in one, you can compute in the other), but the practical version _is_ something anyone with a modicum of programming experience probably takes for granted. You _can_ write a compiler in brainfuck, but that'd be insane.
The tricky argument isn't stated; the obvious arguments are.
I surely must be missing the point of this post.