I'm always surprised by how non-deterministic SQL Server can be when it comes to performance.
At work we use Kendo grids in our apps, which let users create any report by dynamically choosing the filters and columns they want. On the server side a query is composed from the parameters the user chooses, including only the columns that are necessary. The data source for the query is an inline table-valued function (TVF).
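Roughly this shape, as a sketch (the schema and names here are invented for illustration, not our actual code):

    -- An inline TVF behaves like a parameterised view: the optimiser
    -- inlines its body into the calling query.
    CREATE FUNCTION dbo.ReportSource (@CustomerId int)
    RETURNS TABLE
    AS
    RETURN
        SELECT o.OrderId, o.OrderDate, o.Total, c.Name AS CustomerName
        FROM dbo.Orders AS o
        JOIN dbo.Customers AS c ON c.CustomerId = o.CustomerId
        WHERE o.CustomerId = @CustomerId;
    GO

    -- The app then composes a SELECT over it with only the columns
    -- and filters the user picked in the grid:
    SELECT OrderDate, Total
    FROM dbo.ReportSource(42)
    WHERE Total > 100;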
At one point we had performance trouble after going live, and it could not be explained by changes to the TVF. Even when we reverted the released TVF changes, the issues kept happening.
A day or so later I spent a full day with a database expert optimising the query (may I plug the excellent free Sentry One tool here?). Eventually we had the query optimised. However, when we put it back into the TVF, performance dropped again. The slightest changes affected performance.
This is something that affects every RDBMS that exposes SQL. The SQL language describes the what, not the how; how specifically to retrieve your data is left to the implementation.
The most common cause of this in MySQL (and I expect in SQL Server too) is that a query can often be resolved using any of several indexes, and picking one is a matter of heuristics. In MySQL, changing table statistics can change which index is used, which can drastically affect query performance.
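A contrived MySQL sketch of what I mean (the table is made up for illustration):

    -- Two plausible indexes for the same query:
    CREATE TABLE orders (
      id INT PRIMARY KEY,
      customer_id INT,
      created_at DATETIME,
      KEY idx_customer (customer_id),
      KEY idx_created (created_at)
    );

    -- The optimiser can satisfy this via idx_customer or idx_created.
    -- Which one it picks depends on table statistics, and statistics
    -- drift as the data changes, so the plan can flip under you.
    SELECT * FROM orders
    WHERE customer_id = 42
      AND created_at > '2023-01-01';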
So a common power-user optimization strategy (which should be used very carefully) is FORCE INDEX, also known as "I know better than the optimizer which index to use". More often than not, I've found that uses of FORCE INDEX did more harm than good.
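Using the same made-up table from above:

    -- Pin the plan to one index; if the data distribution later changes,
    -- this hint is exactly what stops the optimizer from adapting.
    SELECT * FROM orders FORCE INDEX (idx_created)
    WHERE customer_id = 42
      AND created_at > '2023-01-01';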
But an even more insidious version of this is queries that retrieve cached data being faster. This is something you can generally only observe in production, because what's in the cache is a matter of the other traffic on the server.
So you can easily get cases where something looks much worse under EXPLAIN but is actually faster. E.g. it can be faster to execute a "worse" query with no index than one with an index, if the un-indexed query happens to need data that's already in memory vs. sitting on disk.
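With the toy table from above, one way to see the two plans side by side:

    -- Plan A: index lookup. Looks great in EXPLAIN (type: ref), but
    -- every row it touches may be a cold random read from disk.
    EXPLAIN SELECT * FROM orders WHERE customer_id = 42;

    -- Plan B: full scan. Looks awful in EXPLAIN (type: ALL), but if
    -- the table is hot in the buffer pool it's all memory reads.
    EXPLAIN SELECT * FROM orders IGNORE INDEX (idx_customer)
    WHERE customer_id = 42;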