R is now important enough to have a paid-for PR make-over
With the creation of the R consortium, R has moved up a rung on the ladder of commercial importance.
R has captured the early adopters and has picked up a fair few of the early majority (I’m following the technology adoption life-cycle model made popular by the book Crossing the Chasm), i.e., it is starting to become mainstream. Being mainstream means that jobsworths are starting to encounter the language in situations of importance to them. How are the jobsworths likely to perceive R? From my own experience I would say it will be perceived as being an academic thing, which in the commercial world is not good, not good at all.
To really become mainstream R needs to shake off its academic image, and as I see it, the R consortium has been set up to make that happen. I imagine it will try to become the go-to point for journalists wanting information or a quote about things related to R. Yes, they will hold conferences with grandiose-sounding titles and lots of business people will spend surprising amounts of money to attend, but the real purpose is to solidify the image of R as a commercial winner (the purpose of a very high conference fee is to keep the academics out and convince those attending that it must be important because it is so expensive).
This kind of consortium gets set up when a technology with an academic image is used by large companies that need to sell this usage to potential customers (if the technology is only used internally, its wider image is unimportant).
Unix used to have an academic image, one of the things that X/Open was set up to ‘solve’. The academic image is now a thing of the past.
For the first half of the 1980s it looked like Pascal would become a mainstream language; it was widely taught in universities and perceived as being academic. Pascal did not get its own consortium, and C came along and took its market (I was selling Pascal tools at the time and had lots of conversations with companies that were switching from Pascal to C and essentially put the change down to perception; it did not help that Pascal implementations did their best to hide/ignore the 8086 memory model, something of interest when memory is scarce).
How will we know when R reaches the top rung (if it does)? Well, there are two kinds of languages: those that nobody uses and those that everybody complains about.
R will be truly mainstream once people feel socially comfortable complaining about it to any developer they are meeting for the first time.
Yeah … I’m an early R adopter – 15 years and counting. IMHO this is mostly about Microsoft using its cash to protect its investment in Revolution Analytics and as a closing tool to compete with IBM’s SPSS. I wasn’t serious about moving over to Julia but now I don’t see how I can avoid it.
@M. Edward Borasky (@znmeb)
I doubt the Microsoft/Revolution Analytics deal had anything much to do with R. Intel was a major investor in Revolution Analytics and I would not be surprised if the sale was all about Microsoft/Intel back scratching.
Microsoft is involved with the R consortium so it can plug Azure along with the trendy subject of the day, data science.
Reading the tea leaves of this move requires knowing what sort of tea is involved. Let’s start with the perceived major fault of R: its memory-resident data model.
Next, what sort of agenda have some (or all?) of the Consortium members followed in previous, similar, ventures? Oracle moated Java. IBM moated COBOL (you have to be kinda olde to remember that).
Next, yet again, what of SAS and SPSS vis-a-vis the Consortium members? SAS still appears determined to remain independent. SPSS may, or may not, be dying on the vine; some data says yes, some says maybe.
In sum, we should expect the Consortium to promote RA/R as the disc-based alternative for Enterprise SPSS/SAS. Yes, that puts the final nail in IBM/SPSS’s coffin, but IBM already has in-database R with Netezza (a PG-based db; true, PG has had such for years). As does Oracle. As does SAP/HANA. As M$ has promised for SQL Server, soon. If we see in-database R for DB2, then we’ll know fur shur. “The Community” will be relegated to the Kid’s Table for dinner.
I don’t think there is anything to worry about. The development of R stays in the hands of R-core, and, if you haven’t noticed, the R foundation is also in the R consortium.
I expect the R consortium to improve the tools around R and CRAN, e.g. help with the distribution of binary CRAN packages, set up community web sites, etc. They will initiate projects like these, and they will also accept proposals for projects. Importantly, they will be able to pay people to make these developments happen!
The consortium will probably not support developments that are beneficial to only one (or a few) members, e.g. they won’t sponsor in-database R for Microsoft SQL Server, or something like that. That does not mean that Microsoft will not develop it, in fact they want to, but that is independent of the consortium.
Again, R-core and the R foundation have full control over the *official* GNU R source code, and there is nothing to worry about.
@Gabor
I would be surprised if the R consortium did any non-trivial technical work. The only thing I can imagine them changing is the online help, getting rid of the academic citations and introducing color configuration options (jobsworths do love configuring the color of the tools they use; it makes them feel like they have contributed).
Of course they will be very involved in talking about whatever the hot technical topic of the day is, but only because they want to be in the spotlight and guide the conversation.
“the purpose of a very high conference fee is to keep the academics out and convince those attending that it must be important because it is so expensive” – this is slightly tongue-in-cheek, right?
@Barry R
Yes, slightly. This kind of consortium is packed with marketing people, a species known for spending money like water. Conferences are a source of income ripe for milking.
Next year’s useR! at Stanford might well be the last big R event of the year held on a university campus.
Another take on the lifecycle aspect of going mainstream is this. It could be that in 25 years, the “old dude” R developers get paid $150k/yr (in today’s money) to maintain the then 25-year-old analytics workflows, because these workflows will be way too complicated for anyone else to understand well enough to re-code. Yet these workflows will be mission critical, much like today there are still Burroughs, Unisys, MCP-based financial applications that have to be kept intact and running at any cost.
Base R has to get a 64-bit integer data type. If not, maybe this will be the thing that everyone complains about in a room full of developers after R has gone mainstream.
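To make the complaint concrete, here is a minimal sketch of the limitation (the bit64 package is a third-party CRAN workaround, not part of base R, and is used here only as an illustration):

# Base R integers are 32-bit signed, so the largest representable value is 2^31 - 1.
.Machine$integer.max          # 2147483647
.Machine$integer.max + 1L     # NA, with an integer overflow warning

# Doubles can hold larger whole numbers, but only exactly up to 2^53.
2^53 == 2^53 + 1              # TRUE: the + 1 is silently lost

# One non-base workaround: the bit64 package provides an integer64 class.
# install.packages("bit64")
library(bit64)
as.integer64("9007199254740992") + 1L   # 9007199254740993, exact this time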