Oracle is, I think, a company in trouble. There will always be a need for high-end databases, but not enough to sustain Oracle at its current size.
I'm not sure there always will. There are many, many open ecosystem or cloud alternatives right now that are highly performant (or at least very horizontally scalable) that can handle Tons Of Fucking Data. Their SQL dialects might not be as rich, but it turns out that that doesn't matter so much.
Unless they get clever and think beyond databases, providing a service that folks actually need.
They do more than databases. Oracle has all sorts of business software ... CRM, supply chain management, ERP ... expensive stuff.
They're still the second largest software company in the world, less than half the size of Micro$oft but still significantly ahead of companies like SAP. They're not going away anytime soon, even if the remains of Sun are ground into dust and the relational database revenue dries up in the face of commodity alternatives.
It doesn't help that Oracle is a company that a lot of people in the industry love to hate. Sun was a company run by engineers. Oracle isn't.
Oracle databases aren't going away. Big companies love to buy stuff. Why use CentOS when you can buy a Red Hat license? Things that come for free are seen as having no value, unless the company at least sells support.
On the other hand, on my project they'd rather let me fool around for a month than pay an Oracle expert for a day... but the big blue companies I work for are totally out of their minds.
I've seen that plenty of times too. Big money spent on RHEL when there's already plenty of Linux talent in-house. I don't mind so much because at least some of the money paid to Red Hat is funding more Linux and open source development. Or you could buy Oracle Linux, which is basically the same thing as CentOS and instead of your money going back into Linux development it goes into Larry Ellison's pocket.
I suppose you could pay Oracle for MySQL as well. :) I think it's hilarious that Monty forked MySQL and pretty much the entire world went to MariaDB with him.
I've been more of a PostgreSQL fanboy than a MySQL/MariaDB one, I suppose.
Although the differences between these matter less and less.
In the not too distant future, Oracle will probably settle in as a vendor of middleware and business intelligence software, offered both as packaged software and as a service. Solaris and SPARC are already the walking dead, and high end Oracle DB is like IBM mainframes in that it'll be around forever but it will take a century to fully decline.
Amazon is just that damned good.
Unfortunately.
You pay only for what you use... but you pay for everything. CPU time, RAM, disk space, bandwidth... all of it. But only what you actually use.
And as far as I've been able to tell, whatever hardware they're using, it works damned well. I moved our source repository to an AWS instance from our own hardware in a VMware ESXi environment, and enjoyed a significant improvement in performance in cloning and general use. I also moved our development ticketing tool (Redmine) out there, and it works better, too. All the production servers I've moved to AWS work pretty well when I don't do anything stupid (like consuming all the RAM because I didn't allocate any swap space, or the like).
Run out of hard-drive space for your machine? Not a problem. Just allocate more, twiddle the OS to recognize it, and without having to take down the machine, you can have more hard-drive space for it.
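For what it's worth, the "twiddle the OS" step usually looks something like this on a stock Linux instance. The device names below are assumptions (newer instance types expose NVMe names like /dev/nvme0n1 instead), and growpart comes from the cloud-utils package:

```shell
# After growing the EBS volume in the AWS console or via the CLI:
lsblk                       # confirm the block device now shows the new size
sudo growpart /dev/xvda 1   # expand partition 1 to fill the volume
sudo resize2fs /dev/xvda1   # grow an ext4 filesystem online, no downtime
```

XFS root filesystems would use xfs_growfs instead of resize2fs, but the shape of the dance is the same: grow the volume, grow the partition, grow the filesystem, all with the machine still up.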
Are you a corporate user trying to deal with layers of security for these machines? Their IAM system lets you provide fairly granular control over who gets to do what. My builds go to an S3 bucket (ridiculously cheap storage), and I provide logins for people to access it for installing updates. Way easier than setting up FTP or SCP crap and directing people with logins for it.
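The "way easier than FTP" claim is easy to illustrate. A sketch with the AWS CLI, where the bucket name and artifact paths are made up for the example:

```shell
# Upload a build artifact to the bucket (bucket and paths are hypothetical):
aws s3 cp build/myapp-1.2.tar.gz s3://example-build-drops/releases/

# Instead of handing out FTP/SCP logins, mint a time-limited download URL
# (here, valid for one hour) for anyone who needs the file:
aws s3 presign s3://example-build-drops/releases/myapp-1.2.tar.gz --expires-in 3600
```

Who can run that upload in the first place is then just an IAM policy question, not a matter of managing shell accounts on a box somewhere.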
The only thing that really sucks about it is that nagging feeling that, somehow, by using Amazon's services, you're killing your own children.
Not to change the subject, but...
I performed an update that broke the web application I use. It's my fault: I altered the database to use the utf8_general_ci collation instead of latin1_general_ci because, well, I prefer a sane character encoding scheme, and I don't really understand why these idiots used latin1 (especially when something else they used conflicted with that encoding, which is what called attention to the problem in the first place).
This meant I needed to hunt down all the .php files using latin1_general_ci and change them to utf8_general_ci.
In Windows, I likely would have had to iterate over every PHP file, pull it up in my favorite text editor, global search-and-replace it, then move to the next one. It might have taken me thirty minutes to an hour to fix.
In Unix, I just had to figure out the right string of commands, which took me all of about 3 minutes. Used 'grep' to recursively search for the files in the subfolder that had the offending text, 'cut' to get just the filename, 'sort' to put them in alphabetical order, and ensure everything was grouped together, 'uniq' to ensure I didn't have any duplicate lines, and finally 'xargs' combined with 'sed' to edit the filenames and make the global change.
Fucking xargs, man... what a beautiful command.
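A sketch of that pipeline, run here against a throwaway demo tree so it's self-contained (the directory layout and file contents are made up; point the grep at your own webapp directory instead):

```shell
# Demo setup: a couple of PHP files carrying the offending collation string.
mkdir -p demo/sub
printf 'collation = "latin1_general_ci";\n' > demo/config.php
printf 'setCollation("latin1_general_ci");\n' > demo/sub/db.php

# The pipeline: find matches recursively, keep just the filenames,
# sort and dedupe them, then hand the list to sed for in-place edits.
grep -rn 'latin1_general_ci' demo/ \
  | cut -d: -f1 \
  | sort \
  | uniq \
  | xargs sed -i 's/latin1_general_ci/utf8_general_ci/g'
```

Note that `sed -i` with no argument is GNU sed; on BSD/macOS you'd write `sed -i ''`. With GNU grep you could also shorten the front half to `grep -rl`, which prints filenames directly and skips the cut/sort/uniq steps.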
What I did find, however, is that once you port applications to OpenSSL 1.1, they still work with OpenSSL 1.0, because the "new" APIs have been there for a while.
The funny thing is, when you install libssl-dev, you get 1.1, and then when you build any significantly complex program, you get compiler warnings about other libraries using libssl-1.0. Then when you look at your final binary you find it's linked to OpenSSL *and* GnuTLS because some other library brought the latter along with it.
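You can see both halves of this for yourself. A sketch assuming a glibc-based Linux with pkg-config installed; /bin/sh stands in here for whatever binary you just built:

```shell
# Which OpenSSL will the headers and -lssl resolve to at build time?
pkg-config --modversion openssl || echo "pkg-config has no openssl entry"

# After the build, list the TLS libraries the binary actually linked;
# a transitive dependency can drag GnuTLS in alongside OpenSSL.
ldd /bin/sh | grep -Ei 'ssl|tls' || echo "no TLS libraries linked here"
```

Running `ldd` on something like curl or wget on a typical distro is a good way to watch the dependency sprawl firsthand.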
This is the real reason we end up with containers :)
Yeah, right? I'm not a big fan of containers, but Linux probably does need them, in a sense, because some of the shared library decisions are so careless.
Plus you can install and remove them in one step, and they can be sandboxed, etc. etc.
I think what bothers me the most is there are signs that sandboxing is starting to become a userland crutch for DLL hell. That just leads to bloat.
As suggested above, I think that in the long run memory deduplication is probably the bigger win, and if operating systems can get it right, it's more viable long-term than shared libraries.