Function approximation with regression analysis
This online calculator uses several regression models to approximate an unknown function given by a set of data points.
This content is licensed under Creative Commons Attribution/Share-Alike License 3.0 (Unported). That means you may freely redistribute or modify this content under the same license conditions and must attribute the original author by placing a hyperlink from your site to this work https://planetcalc.com/5992/. Also, please do not modify any references to the original work (if any) contained in this content.
The function approximation problem is to select, from a well-defined class, a function that closely matches ("approximates") a target unknown function.
This calculator takes the target function's table data in the form of points {x, f(x)} and builds several regression models, namely: linear regression, quadratic regression, cubic regression, power regression, logarithmic regression, hyperbolic regression, ab-exponential regression and exponential regression. The results can be compared using the correlation coefficient, the coefficient of determination, the average relative error (standard error of the regression) and visually, on a chart. Theory and formulas are given below the calculator, as usual.
Linear regression
Equation:
$$\hat{y} = ax + b$$
a coefficient:
$$a = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}$$
b coefficient:
$$b = \frac{1}{n}\left(\sum y_i - a\sum x_i\right)$$
Linear correlation coefficient:
$$r = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{\sqrt{n\sum x_i^2 - \left(\sum x_i\right)^2}\,\sqrt{n\sum y_i^2 - \left(\sum y_i\right)^2}}$$
Coefficient of determination:
$$R^2 = r^2$$
Standard error of the regression:
$$S = \sqrt{\frac{\sum \left(y_i - \hat{y}_i\right)^2}{n - 2}}$$
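As an illustration (a minimal pure-Python sketch of the formulas above, not the calculator's actual source code), linear regression can be computed directly from the sums:

```python
import math

def linear_regression(xs, ys):
    """Least-squares fit of y = a*x + b.
    Returns (a, b, r, r_squared, standard_error)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - a * sx) / n
    r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))
    sse = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    s = math.sqrt(sse / (n - 2))  # standard error of the regression
    return a, b, r, r * r, s
```

For the perfectly linear data `xs = [1, 2, 3, 4]`, `ys = [3, 5, 7, 9]` this yields a = 2, b = 1, r = 1 and zero standard error.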
Quadratic regression
Equation:
$$\hat{y} = ax^2 + bx + c$$
System of equations to find a, b and c:
$$\begin{cases}a\sum x_i^4 + b\sum x_i^3 + c\sum x_i^2 = \sum x_i^2 y_i\\ a\sum x_i^3 + b\sum x_i^2 + c\sum x_i = \sum x_i y_i\\ a\sum x_i^2 + b\sum x_i + cn = \sum y_i\end{cases}$$
Correlation coefficient:
$$R = \sqrt{1 - \frac{\sum\left(y_i - \hat{y}_i\right)^2}{\sum\left(y_i - \bar{y}\right)^2}},$$
where
$$\bar{y} = \frac{1}{n}\sum y_i$$
Coefficient of determination:
$$R^2$$
Standard error of the regression:
$$S = \sqrt{\frac{\sum\left(y_i - \hat{y}_i\right)^2}{n - 3}}$$
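A sketch of solving the normal equations with plain Gaussian elimination (generalized to any polynomial degree, so the same helper also covers the cubic case below; this is an illustration, not the calculator's implementation):

```python
def polyfit_normal(xs, ys, degree):
    """Least-squares polynomial fit by solving the normal equations
    with Gaussian elimination (partial pivoting).
    Returns coefficients ordered from the highest power to the constant."""
    n = degree + 1
    # Row i, column j of the normal-equation matrix holds sum(x^(2*degree-i-j)).
    A = [[sum(x ** (2 * degree - i - j) for x in xs) for j in range(n)]
         for i in range(n)]
    rhs = [sum((x ** (degree - i)) * y for x, y in zip(xs, ys)) for i in range(n)]
    # Forward elimination.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        rhs[col], rhs[pivot] = rhs[pivot], rhs[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for k in range(col, n):
                A[row][k] -= f * A[col][k]
            rhs[row] -= f * rhs[col]
    # Back substitution.
    coeffs = [0.0] * n
    for i in range(n - 1, -1, -1):
        coeffs[i] = (rhs[i] - sum(A[i][k] * coeffs[k]
                                  for k in range(i + 1, n))) / A[i][i]
    return coeffs
```

For example, fitting the points (0, 1), (1, 2), (2, 5), (3, 10), which lie exactly on y = x² + 1, with `degree=2` recovers the coefficients [1, 0, 1].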
Cubic regression
Equation:
$$\hat{y} = ax^3 + bx^2 + cx + d$$
System of equations to find a, b, c and d:
$$\begin{cases}a\sum x_i^6 + b\sum x_i^5 + c\sum x_i^4 + d\sum x_i^3 = \sum x_i^3 y_i\\ a\sum x_i^5 + b\sum x_i^4 + c\sum x_i^3 + d\sum x_i^2 = \sum x_i^2 y_i\\ a\sum x_i^4 + b\sum x_i^3 + c\sum x_i^2 + d\sum x_i = \sum x_i y_i\\ a\sum x_i^3 + b\sum x_i^2 + c\sum x_i + dn = \sum y_i\end{cases}$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same formulas as in the case of quadratic regression.
Power regression
Equation:
$$\hat{y} = a x^b$$
b coefficient:
$$b = \frac{n\sum\left(\ln x_i \ln y_i\right) - \sum\ln x_i \sum\ln y_i}{n\sum\ln^2 x_i - \left(\sum\ln x_i\right)^2}$$
a coefficient:
$$a = \exp\left(\frac{1}{n}\left(\sum\ln y_i - b\sum\ln x_i\right)\right)$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same formulas as above.
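A sketch of power regression via the log-log linearization above (an illustration assuming all x and y values are positive, as the logarithms require):

```python
import math

def power_regression(xs, ys):
    """Fit y = a * x**b by linear regression on (ln x, ln y).
    All x and y values must be positive."""
    n = len(xs)
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    slx, sly = sum(lx), sum(ly)
    b = (n * sum(u * v for u, v in zip(lx, ly)) - slx * sly) / \
        (n * sum(u * u for u in lx) - slx ** 2)
    a = math.exp((sly - b * slx) / n)
    return a, b
```

For points generated by y = 3x², e.g. (1, 3), (2, 12), (3, 27), it recovers a = 3, b = 2.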
ab-Exponential regression
Equation:
$$\hat{y} = a b^x$$
b coefficient:
$$b = \exp\left(\frac{n\sum x_i\ln y_i - \sum x_i \sum\ln y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}\right)$$
a coefficient:
$$a = \exp\left(\frac{1}{n}\left(\sum\ln y_i - \ln b\sum x_i\right)\right)$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same as above.
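A sketch of the ab-exponential fit: ln y is regressed on x, then both the intercept and the slope are exponentiated (illustration only; y values must be positive):

```python
import math

def ab_exponential_regression(xs, ys):
    """Fit y = a * b**x via linear regression of ln y on x.
    All y values must be positive."""
    n = len(xs)
    ly = [math.log(y) for y in ys]
    sx, sly = sum(xs), sum(ly)
    ln_b = (n * sum(x * v for x, v in zip(xs, ly)) - sx * sly) / \
           (n * sum(x * x for x in xs) - sx ** 2)
    a = math.exp((sly - ln_b * sx) / n)
    return a, math.exp(ln_b)
```

For points on y = 2·3ˣ, e.g. (0, 2), (1, 6), (2, 18), it recovers a = 2, b = 3.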
Hyperbolic regression
Equation:
$$\hat{y} = a + \frac{b}{x}$$
b coefficient:
$$b = \frac{n\sum\frac{y_i}{x_i} - \sum\frac{1}{x_i}\sum y_i}{n\sum\frac{1}{x_i^2} - \left(\sum\frac{1}{x_i}\right)^2}$$
a coefficient:
$$a = \frac{1}{n}\left(\sum y_i - b\sum\frac{1}{x_i}\right)$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same as above.
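A sketch of the hyperbolic fit, which is just linear regression on the transformed variable 1/x (illustration only; x values must be non-zero):

```python
def hyperbolic_regression(xs, ys):
    """Fit y = a + b/x by least squares on the transformed variable 1/x.
    All x values must be non-zero."""
    n = len(xs)
    inv = [1.0 / x for x in xs]
    si, sy = sum(inv), sum(ys)
    b = (n * sum(u * y for u, y in zip(inv, ys)) - si * sy) / \
        (n * sum(u * u for u in inv) - si ** 2)
    a = (sy - b * si) / n
    return a, b
```

For points on y = 1 + 2/x, e.g. (1, 3), (2, 2), (4, 1.5), it recovers a = 1, b = 2.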
Logarithmic regression
Equation:
$$\hat{y} = a + b\ln x$$
b coefficient:
$$b = \frac{n\sum\left(y_i\ln x_i\right) - \sum\ln x_i \sum y_i}{n\sum\ln^2 x_i - \left(\sum\ln x_i\right)^2}$$
a coefficient:
$$a = \frac{1}{n}\left(\sum y_i - b\sum\ln x_i\right)$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same as above.
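A sketch of the logarithmic fit, i.e. linear regression of y on ln x (illustration only; x values must be positive):

```python
import math

def logarithmic_regression(xs, ys):
    """Fit y = a + b*ln(x) by least squares on ln x.
    All x values must be positive."""
    n = len(xs)
    lx = [math.log(x) for x in xs]
    slx, sy = sum(lx), sum(ys)
    b = (n * sum(u * y for u, y in zip(lx, ys)) - slx * sy) / \
        (n * sum(u * u for u in lx) - slx ** 2)
    a = (sy - b * slx) / n
    return a, b
```

For points on y = 1 + 2·ln x sampled at x = 1, e, e², it recovers a = 1, b = 2.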
Exponential regression
Equation:
$$\hat{y} = a e^{bx}$$
b coefficient:
$$b = \frac{n\sum x_i\ln y_i - \sum x_i \sum\ln y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}$$
a coefficient:
$$a = \exp\left(\frac{1}{n}\left(\sum\ln y_i - b\sum x_i\right)\right)$$
Correlation coefficient, coefficient of determination, standard error of the regression – the same as above.
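A sketch of the exponential fit; it differs from the ab-exponential case only in that the slope of the linearized regression is kept as b rather than exponentiated (illustration only; y values must be positive):

```python
import math

def exponential_regression(xs, ys):
    """Fit y = a * exp(b*x) via linear regression of ln y on x.
    All y values must be positive."""
    n = len(xs)
    ly = [math.log(y) for y in ys]
    sx, sly = sum(xs), sum(ly)
    b = (n * sum(x * v for x, v in zip(xs, ly)) - sx * sly) / \
        (n * sum(x * x for x in xs) - sx ** 2)
    a = math.exp((sly - b * sx) / n)
    return a, b
```

For points on y = 2·e^(0.5x) sampled at x = 0, 2, 4, it recovers a = 2, b = 0.5.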
Derivation of formulas
Let's start from the problem:
We have an unknown function y=f(x), given in the form of table data (for example, data obtained from experiments).
We need to find a function of a known type (linear, quadratic, etc.) y=F(x) whose values are as close as possible to the table values at the same points. In practice, the type of function is chosen by visually comparing the table points to graphs of known functions.
As a result we get a formula y=F(x), called the empirical formula (regression equation, function approximation), which allows us to calculate y for x's not present in the table. Thus, the empirical formula "smooths" the y values.
We use the Least Squares Method to obtain parameters of F for the best fit. The best fit in the least-squares sense minimizes the sum of squared residuals, a residual being the difference between an observed value and the fitted value provided by a model.
Thus, we need to find a function F such that the sum of squared residuals
$$S = \sum_{i=1}^{n}\left(y_i - F(x_i)\right)^2$$
is minimal.
Let's describe the solution for this problem using linear regression F=ax+b as an example.
We need to find the best fit for a and b coefficients, thus S is a function of a and b. To find the minimum we will find extremum points, where partial derivatives are equal to zero.
Using the chain rule, we obtain the conditions for the extremum:
$$\frac{\partial S}{\partial a} = 0, \qquad \frac{\partial S}{\partial b} = 0$$
For $F(x) = ax + b$ the partial derivatives are
$$\frac{\partial S}{\partial a} = -2\sum x_i\left(y_i - (a x_i + b)\right), \qquad \frac{\partial S}{\partial b} = -2\sum\left(y_i - (a x_i + b)\right)$$
Setting the partial derivatives to zero, we get the following equations:
$$\sum x_i\left(y_i - a x_i - b\right) = 0, \qquad \sum\left(y_i - a x_i - b\right) = 0$$
After removing the brackets we get:
$$a\sum x_i^2 + b\sum x_i = \sum x_i y_i, \qquad a\sum x_i + bn = \sum y_i$$
From these equations we can get formulas for a and b, which will be the same as the formulas listed above.
Using the same technique, we can get formulas for all remaining regressions.
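The derivation above can be checked numerically: solve the normal equations in closed form and verify that any small perturbation of (a, b) does not decrease the sum of squared residuals. The data values below are hypothetical example points, chosen to be roughly linear:

```python
def sum_sq_residuals(a, b, xs, ys):
    """Sum of squared residuals S for the line y = a*x + b."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

# Hypothetical noisy data, roughly following y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Closed-form solution of the normal equations derived above.
a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
b = (sy - a * sx) / n

# At the least-squares minimum, every perturbation of (a, b)
# must yield a sum of squared residuals at least as large.
s_min = sum_sq_residuals(a, b, xs, ys)
for da in (-0.01, 0.01):
    for db in (-0.01, 0.01):
        assert sum_sq_residuals(a + da, b + db, xs, ys) >= s_min
```

Because S is a convex quadratic in (a, b), the point where both partial derivatives vanish is the unique global minimum, which is what the assertions confirm.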