Opinion: Writing generic code to allow switching databases isn't a good plan

I work with a lot of different software houses. Microsoft calls these ISVs (Independent Software Vendors). They develop applications that we all use day to day.
When I look at their code, I see that many developers have tried to write database-agnostic code. They might be working with SQL Server today, but they want to be able to use Oracle, or Cassandra, etc. without changing their code.
As a concept, that sounds great. Why would you want to tie yourself into a specific vendor?
It’s interesting that they’ve often tied themselves to a specific development-tools vendor. They don’t plan to switch from C# to Java any time soon, but somehow databases are considered different.
Why are databases different?
The SQL standards are great, but they’re also why people think this will work out OK. One camp of developers (let’s call them group #1) think that if they keep their SQL code completely standard, they should be able to run on any database.
Another group of developers (group #2) will try to do all their work with generic classes. For example, in C#, they won’t use a SqlConnection; they’ll use a DbConnection.
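As a rough sketch of that style (assuming ADO.NET; the method, table, and column names here are just made up for illustration), the data access code is written only against the abstract base classes:

```csharp
using System.Data.Common;

public static class CustomerData
{
    // Written only against the generic ADO.NET base classes, so the caller can
    // pass in a SqlConnection, an OracleConnection, or anything else derived
    // from DbConnection.
    public static string? GetCustomerName(DbConnection conn, int id)
    {
        using DbCommand cmd = conn.CreateCommand();
        // Even here the abstraction leaks: parameter markers differ between
        // providers (@id for SQL Server, :id for Oracle, ? for ODBC/OLE DB).
        cmd.CommandText = "SELECT Name FROM Customers WHERE Id = @id";

        DbParameter p = cmd.CreateParameter();
        p.ParameterName = "@id";
        p.Value = id;
        cmd.Parameters.Add(p);

        return cmd.ExecuteScalar() as string;
    }
}
```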
Yet another group of developers (group #3) will decide that if they add another layer of software like an ORM (Object-Relational Mapper), they can let it do the translation and just deal with the objects. At least then, only one layer of code (that they didn’t need to write) has to deal with the differences.
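A minimal sketch of the group #3 approach, assuming something like Entity Framework Core (the Customer entity and AppDbContext here are hypothetical):

```csharp
using Microsoft.EntityFrameworkCore;

// The application only sees objects and LINQ; the database provider is chosen
// in one place.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public class AppDbContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();

    // In theory, swapping databases means changing this one line, e.g. to
    // options.UseNpgsql(...) or options.UseSqlite(...).
    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer("Server=.;Database=App;Trusted_Connection=True;");
}
```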
So how does that work out?
Group #1 quickly find that the range of SQL that works unchanged across all the target databases is so limited that it’s unusable. Even basics like paginating results, auto-incrementing keys, and date arithmetic are written differently on SQL Server, Oracle, and PostgreSQL.
Group #2 eventually find that they’re often casting their generic DB-related classes back to the specific ones, to get access to pesky things like parameters and other provider-specific properties and methods.
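Here’s a sketch of what that ends up looking like, assuming the SQL Server provider; the table type name (dbo.IdList) and the helper method are hypothetical:

```csharp
using System;
using System.Data;
using System.Data.Common;
using Microsoft.Data.SqlClient;

public static class ParameterHelpers
{
    // The "generic" code casts back to the provider-specific command type to
    // reach a feature the base classes don't expose: a table-valued parameter.
    public static void AddIdListParameter(DbCommand cmd, DataTable idTable)
    {
        if (cmd is SqlCommand sqlCmd)
        {
            sqlCmd.Parameters.Add(new SqlParameter("@ids", SqlDbType.Structured)
            {
                TypeName = "dbo.IdList",
                Value = idTable
            });
        }
        else
        {
            // ...and now every other provider needs its own code path anyway.
            throw new NotSupportedException("Only SQL Server is handled here.");
        }
    }
}
```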
Group #3 find that they have very limited options around data types and the like, and that the performance of their applications can be lousy.
And none of the groups are able to use the rich options available in a specific database.
But they do switch databases, right?
Well, no. It’s admittedly anecdotal, but what I see as I go from site to site is that they NEVER change to a different database anyway. And worse, they invariably end up using database-specific features.
So what’s the end result?
What do I end up seeing? Almost always, poorly-performing code with huge compromises. I see people unable to take advantage of advances at the database layer (like richer data types).
Worse, I see database-specific features used anyway, so they couldn’t easily move even if they wanted to.
Even when they do manage to avoid database-specific code, they usually still end up with code that runs across a number of databases, but runs poorly on all of them.
And they do all this to themselves, for a scenario (i.e. changing databases) that almost NEVER happens.
I’d love to hear your thoughts.
2020-02-25