I'm processing a lot of stateless time-based data in an SQL database (edit: Microsoft SQL Server) - for the sake of this question, it's basically IoT sensor data. We've been graphing this data for clients, and now they want us to do some analysis of it, particularly pro-rating/interpolating values to get an estimated value at 15-minute boundaries, etc.
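The 15-minute pro-rating described above is essentially linear interpolation between the two readings on either side of each boundary. A minimal sketch in Python (hypothetical function name and sample timestamps, just to pin down the arithmetic - the real work would be in T-SQL or .NET):

```python
from datetime import datetime

def interpolate_at(t, t0, v0, t1, v1):
    """Linearly interpolate a sensor value at time t between two readings."""
    span = (t1 - t0).total_seconds()
    if span == 0:
        return v0  # readings coincide; nothing to interpolate
    frac = (t - t0).total_seconds() / span
    return v0 + frac * (v1 - v0)

# Hypothetical readings either side of a 15-minute boundary.
t0, v0 = datetime(2023, 1, 1, 10, 12), 20.0
t1, v1 = datetime(2023, 1, 1, 10, 22), 30.0
boundary = datetime(2023, 1, 1, 10, 15)
print(interpolate_at(boundary, t0, v0, t1, v1))  # 23.0
```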

We're currently doing most of this in SQL stored procedures, mostly because the initial requirements were (and still are) query-heavy rather than processor-heavy.

We're now at the stage where our stored procedures start with a SELECT TOP 1 into variables, followed by tons of conditional processing with floating-point division, and finish with a single UPDATE or INSERT.
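For what it's worth, that read-branch-divide-write shape translates directly into application code. A hypothetical Python sketch (names and branching invented here, standing in for the real T-SQL) of the same pattern:

```python
def process_latest(fetch_top1, write_back):
    """Mirror the stored-procedure shape: read one row, branch on it,
    do floating-point division, then a single write at the end.
    fetch_top1 and write_back stand in for the real data access."""
    row = fetch_top1()                        # SELECT TOP 1 ... into variables
    if row["count"] == 0:                     # conditional processing
        result = 0.0
    else:
        result = row["total"] / row["count"]  # floating-point division
    write_back(result)                        # single UPDATE or INSERT at the end
    return result

# Stubbed data access, for illustration only.
writes = []
print(process_latest(lambda: {"total": 10.0, "count": 4}, writes.append))  # 2.5
```

Factoring the arithmetic into a pure function like this also makes it unit-testable outside the database, whichever host (stored procedure, CLR, or console app) ends up calling it.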

This is all scheduled, non-real-time processing, so we can call a script every minute or so if required.

Has anyone else had a similar problem, or done an analysis comparing stored procedures vs. a .NET CLR assembly vs. a .NET console app?

Maybe the more important question is: am I likely to see a noticeable performance improvement by moving CPU-heavy work out of stored procedures?