Algorithms are now commodities
When I first started writing software, developers had to implement most of the algorithms they used; yes, hardware vendors provided libraries, but the culture was one of self-reliance (except for maths functions, which were technical and complicated).
Developers read Donald Knuth’s The Art of Computer Programming; it was the reliable source for step-by-step algorithms. I vividly remember seeing a library copy of one volume, where somebody had carefully hand-written, in very tiny letters, an update to one algorithm, and glued it to the page over the previous text.
Algorithms were important because computers were not yet fast enough to solve common problems at an acceptable rate; developers knew the time taken to execute common instructions, and instruction timings were a topic of social chit-chat amongst developers (along with the number of registers available on a given cpu). Memory capacity was often measured in kilobytes; every byte counted.
This was the age of the algorithm.
Open source commoditized algorithms, and computers got a lot faster, with memory measured in megabytes and then gigabytes.
When it comes to algorithm implementation, developers are now spoilt for choice; why waste time implementing the ‘low-level’ stuff when there are plenty of other problems waiting to be solved?
Algorithms are now like the bolts in a bridge: very important, but nobody talks about them. Today developers talk about story points, features, business logic, etc. Given a well-defined problem, many are now likely to search for an existing package, rather than write code from scratch (I certainly work this way).
New algorithms are still being invented, and researchers continue to look for improvements to existing algorithms. This is a niche activity.
There are companies where algorithms are not commodities. Google operates on a scale where what appear to others as small improvements can save the company millions (purely because a small percentage of a huge amount can be a lot). A company’s core competency may include an algorithmic component (whose non-commodity nature gives the company its edge over the competition), while algorithms outside this core are treated as commodities.
Knuth’s The Art of Computer Programming played an important role in making viable algorithms generally available; while the volumes are frequently cited, I suspect they are rarely read (I have not taken any of my three volumes off the shelf to read for years).
A few years ago, I suddenly realised that I was working on a book about software engineering that did not contain an algorithms chapter, and in which all 103 uses of the word algorithm refer to it as a concept.
Today, we are in the age of the ecosystem.
Algorithms have not yet completed their journey to obscurity, which has to wait until people can tell computers what they want and not be concerned about the implementation details (or genetic algorithm programming gets a lot better).
heh, yes. You know, it’s even sad for me. Today IT is divided into many fields, and some devs do not understand others. Math and physics have already gone down this path. Algos are now just a niche.
Out of good fortune, I have always worked at companies with embedded s/w (sometimes down to the level of RTL) running in very constrained environments, so algorithm improvement was always at the forefront. This new age is very alien to me.
>Open source commoditized algorithms
I disagree. The concept of libraries commoditized algorithms. What is a library but a collection of algorithms you can call? This can range from simple algorithms, like computing an average, to complicated algorithms, like decoding video. There have been plenty of companies built around producing a library for other companies to use, and similar efforts have been released as open source software.
>Algorithms are now like the bolts in a bridge
Your view of what an algorithm is, is too narrow. Larger algorithms can be created by combining other algorithms. The algorithm for finding an average relies on an algorithm to traverse a list, an algorithm to add two numbers, an algorithm to divide two numbers, and many more.
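A minimal sketch of what I mean, in Python (purely illustrative):

    def average(xs):
        # Assumes a non-empty list.
        total = 0
        count = 0
        for x in xs:           # algorithm: traverse the list
            total = total + x  # algorithm: add two numbers
            count = count + 1
        return total / count   # algorithm: divide two numbers

    print(average([1, 2, 3, 4]))  # prints 2.5

Even the ‘simple’ average is several smaller algorithms working together.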
>New algorithms are still being invented. This is a niche activity.
Whenever you write code, you are implementing an algorithm. You are working on an algorithm that constitutes your entire program’s behaviour.
>Algorithms have not yet completed their journey to obscurity
Algorithms are nowhere near obscure. Different algorithms make different tradeoffs and make different assumptions about the input data. In some cases computers are powerful enough for the differences not to matter, but in other cases they do.
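For example, a rough Python sketch (illustrative only): linear search makes no assumptions about its input, while binary search assumes sorted data in exchange for far fewer comparisons.

    import bisect

    def linear_search(xs, target):
        # No assumptions about ordering; may look at every element.
        for i, x in enumerate(xs):
            if x == target:
                return i
        return -1

    def binary_search(sorted_xs, target):
        # Assumes sorted input; needs only O(log n) comparisons.
        i = bisect.bisect_left(sorted_xs, target)
        if i < len(sorted_xs) and sorted_xs[i] == target:
            return i
        return -1

Hand the second one unsorted data and it quietly returns wrong answers; whether the speed difference matters depends on how much data you have.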
>has to wait until people can tell computers what they want and not be concerned about the implementation details
This is the opposite. In that reality computers are able to compile algorithms from speech to code. A programmer will be doing nothing but reciting algorithms for the computer to implement.
Truly what a parochial, ill-understood, poorly expressed understanding of algorithms, and an even poorer understanding of commodities. Surely in some narrow sense all intellectual work is commoditised; is that the essence of technical discovery? You might as well spell the end of mathematics while you’re at it.
@Anon
I think you didn’t get the point. Your arguments do not take into account all the facts the author stated.
This is as misguided as a chef claiming recipes are now commodities, and that the common chef need not be familiar with any. As with cooking, any organized programming of a machine necessarily involves algorithms, although lesser programmers won’t notice them. Being generous, it’s surprising to me that someone who appears to have such breadth of knowledge would make such an ignorant claim, so I’d prefer to think this merely may have been communicated extremely poorly.
“Algorithms were important because computers were not yet fast enough to solve common problems at an acceptable rate”
Look no further than Urbit for an example of what can happen when someone does away with progress purely for its own sake. There’s no end of problems that computers still need good algorithms to process efficiently, although there can be fun in using the most basic and naive solutions to problems which modern computing now makes feasible.
“Today developers talk about story points, features, business logic, etc.”
These people are called “tools”, because that’s how they’re treated by employers. A computer is supposed to be a lever for the mind, and using humans as unintelligent labor, when it could be automated, is an obscenity.
“Given a well-defined problem, many are now likely to search for an existing package, rather than write code from scratch (I certainly work this way).”
I don’t understand this mindset. One should at least review the code in its entirety, beforehand. I read specifications, pursue an ideal design, and write my own libraries. With my most recent program, I didn’t have a well-defined problem, and I devised many algorithms to suit it, with the time and space characteristics desired, because I was actually creating, not merely gluing something together. It’s not enough for a program to work, it should also be elegant.
“New algorithms are still being invented, and researchers continue to look for improvements to existing algorithms. This is a niche activity.”
Programmers are certainly a commodity, by design. Giving programmers powerful languages makes them too powerful, so they’re given lesser languages such as Java or Go, so that they may be easily replaced. It’s good for business to have ignorant employees.
I’m glad to know not to buy this book.
@Verisimilitude
I was agreeing with your well-reasoned rebuttal until I came to this odd conspiracy theory.
“Giving programmers powerful languages makes them too powerful, so they’re given lesser languages such as Java or Go, so that they may be easily replaced. It’s good for business to have ignorant employees.”
Before I make the obvious Mother Night reference, I should ask, what are these secret languages with which powerful wizards are equipped?
@Verisimilitude
The ‘chefs’ in most restaurants heat precooked components of a meal and combine them on the plate.
Progress requires being able to treat what used to be important as commonplace.
Progress is path dependent, e.g., the characteristics of the ASCII encoding will continue to be with us for a very long time. Urbit? Yet another clean slate OS; is it progress or a vanity project?
@Unpretentious
It’s no conspiracy theory. Observe this quote from Rob Pike:
“The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt.”
So, it doesn’t even require inference to make this conclusion, and I’m not surprised Rob Pike wasn’t smart enough to avoid being so blatant with his terrible ideas.
“Before I make the obvious Mother Night reference, I should ask, what are these secret languages with which powerful wizards are equipped?”
I enjoy Common Lisp, Ada, and APL, to name three. Far from being some man with his head in the high-level languages, I’ve just finished a reimplementation of an interactive machine code development tool of my own design, mentioned in my prior comment, which is far superior to the assembler model, in my eyes.
@Derek Jones
“The ‘chefs’ in most restaurants heat precooked components of a meal and combine them on the plate.”
That’s a fair point, and I could argue about the meaning of “chef”, but I’d rather let the analogy break down here and merely point out that there’s no need for such programmers. They can and should be automated away, instead of used as unintelligent labor. This is what automatic computers are for.
“Progress requires being able to treat what used to be important as commonplace.”
I agree; that which we can do without thinking about it is the hallmark of civilization, and all of that. Rather than continue on with the organized degeneration of programming as employment by claiming programmers should also become commodities, however, I again state the world has no real need for such programmers. A business prefers to have ignorant employees, so they can’t fight back against corporate fiat.
I’m not concerned about programming as a profession, in any case, and pursue it for my own purposes. Algorithms will never be a commodity to me. I actually do read the books I own, including my copies of “The Art of Computer Programming”, although I can’t currently claim to have finished reading them. It’s also important to point out that, even if libraries were a commodity, licensing still matters; I use the AGPLv3 for most of my work, ensuring I won’t be working gratis for some corporation, and this is, in fact, a method to avoid becoming a commodity, as, legally, it’s not possible to switch out a library in this manner and then relicense. I could go on about how I’ll be hacking, likely until my death, but I’ve communicated my point well enough, I think.
“Progress is path dependent, e.g., the characteristics of the ASCII encoding will continue to be with us for a very long time. Urbit? Yet another clean slate OS; is it progress or a vanity project?”
Urbit is clearly nonsense. I referred in particular to its asinine qualities, such as having increment but no decrement (a rough sketch below shows why that’s so costly), yet also being intellectually barren and so having no reason for being so inefficient. I do take issue with ASCII and Unicode, more so the latter, and so Urbit doesn’t even go far enough in discarding the past, for my tastes. This article details the ideas behind some of my machine text work, which I’ve been free to pursue more in earnest recently:
http://verisimilitudes.net/2018-06-06
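As for that increment-but-no-decrement sketch, here’s the idea written as Python for clarity, rather than actual Nock: with only increment and an equality test available, finding the predecessor of n means counting up from zero.

    def dec(n):
        # Assumes n > 0. Only increment and equality are used as
        # primitives, so the predecessor is found by counting up
        # from zero: O(n) work for a single decrement.
        candidate = 0
        while candidate + 1 != n:
            candidate = candidate + 1
        return candidate

    assert dec(5) == 4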
@Unpretentious
Even the conspiracy theory has some grain of truth: employers do care about being resilient, and one important aspect is the ability to replace employees if they leave or (hopefully rarely) need to get fired.
That means finding ways to make programmers replaceable. That means using languages accessible to many programmers. That means a tendency to avoid niche languages, even if they happen to be the right tool for the job. In big enough companies, there’s even a top-down policy that limits which programming languages programmers can use.
In the end, having a larger pool of available programmers can also give you leverage when the time comes to negotiate the salary. So it’s not just resiliency that improves, it’s also the costs that can decrease. (Sure, quality may go down as well, but it’s harder to notice from the top.)
The end result? Companies turning employees into replaceable cogs so they can happily maximize profits. That’s not a conspiracy, that’s just rich people looking out for their own interests.