
Loss Trends #2 (Fitting a Single Variable Exponential) | Loss Trending (CAS Exam 5 / CAS Exam MAS1) 

Mancinelli's Math Lab
9K subscribers
2.3K views

Published: 23 Oct 2024

Comments: 2
@katt157 · 4 years ago
A little clarification for this problem, which will make it a lot easier to do by hand (I've seen that required in some longer exams). You already linearized the equation to the form ln(y) = ln(c_1) + c_2*x, or equivalently ln(y) = k_1 + k_2*x with k_1 = ln(c_1) and k_2 = c_2, and then you perform a least-squares (LSE) solution of the linear equation.

That last step is simply an orthogonal projection in a function space: for the class of functions f: A -> R, the space can be seen as R^A, a Cartesian product of copies of the reals whose "dimension" equals the cardinality of A, indexed so that for a in A the a-th coordinate equals f(a).

Concretely, create a matrix A whose rows are [1, x] (the arguments multiplying the coefficients in the equation) taken from the data, and a vector b containing the corresponding target ln(y) values. For a two-element vector xhat = [k_1, k_2], A*xhat is a vector whose every entry has the form k_1 + k_2*x; both A*xhat and b have length n, our sample size. What is generally harder to see is that the map xhat -> A*xhat takes R^2 into R^n and its image is a plane in R^n; every point of that plane corresponds to a function k_1 + k_2*x, so it also has a representation in R^R.

What we now seek is a solution where A*xhat - b is orthogonal to the image of A. Luckily the image is just the column space of A, so we solve A^T(A*xhat - b) = 0, that is, the normal equations A^T*A*xhat = A^T*b: two linear equations in the two unknowns k_1 and k_2, which can be solved by hand. Finally, remember that c_1 = e^{k_1} and c_2 = k_2.

Edit: a quick note about the "dimension" of the function space: we use it quite informally, just as the number of entries in the vectors; we do not care about linear dependence.
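A minimal numerical sketch of the fit described in the comment above, assuming NumPy and hypothetical x (time index) and y (loss) values; the sample data and variable names are illustrative, not taken from the video. It linearizes y = c_1*exp(c_2*x) and solves the normal equations A^T*A*xhat = A^T*b exactly as in the comment (np.linalg.lstsq would give the same answer, but the normal equations mirror the by-hand derivation).

import numpy as np

# Hypothetical data: x = time index (e.g. accident year offset), y = observed losses.
# Made-up values for illustration; substitute the data from the actual problem.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([100.0, 107.0, 116.0, 124.0, 134.0])

# Linearize y = c_1 * exp(c_2 * x)  ->  ln(y) = k_1 + k_2 * x, with k_1 = ln(c_1), k_2 = c_2.
b = np.log(y)
A = np.column_stack([np.ones_like(x), x])   # rows are [1, x_i]

# Normal equations A^T A xhat = A^T b give the least-squares solution xhat = [k_1, k_2].
k1, k2 = np.linalg.solve(A.T @ A, A.T @ b)

c1 = np.exp(k1)   # back-transform the intercept
c2 = k2           # exponential trend rate
print("c_1 =", c1, ", c_2 =", c2)
print("annual trend factor:", np.exp(c2))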
@jacobm7026 · 3 years ago
Great work, man. Appreciate you for this.