How to get optimal linear scaling factor for a set of vectors

9 views (last 30 days)
newbie9 on 17 Sep 2021
Edited: newbie9 on 18 Sep 2021
I am trying to determine the optimal linear scaling factor for a set of vectors, such that the mean of the scaled set is close to a reference or target vector. An example dataset is below.
All data use the same x ordinate. I have a reference vector y_ref. I have three raw data vectors: y_raw1, y_raw2, and y_raw3.
I want to determine a linear scale factor for each y_raw? vector, such that the mean squared error (MSE) between the average of all three y_raw? vectors and y_ref is minimized at each x value.
Is this a common enough need that Matlab has a built-in function to do this? If not, are there any tips on getting started on my own function? I have Googled a bit, but I must not be using the right buzz words, because I can't find any equations or guidance to get started. Thank you
Using Matlab R2018a (v 9.4). I have the Aerospace Toolbox (v 2.21) and Statistics and Machine Learning Toolbox (v 11.3).
%%% DEFINE INPUT DATA
x = [0, 0.7, 1.1, 1.61, 2.02, 2.31, 2.71, 3, 3.22, 3.41, ...
3.69, 3.92, 4.32, 4.61, 5.02, 5.3, 5.71, 6, 6.22, ...
6.62, 6.91];
y_ref = [4.08, 4.14, 4.03, 4.62, 4.47, 4.26, 4.12, 4.11, ...
4.16, 4.24, 4.39, 4.71, 5.74, 6.15, 5.72, 5.15, ...
4.59, 4.65, 4.4, 4.28, 4.28];
y_raw1 = [4.66, 6.67, 5.36, 5.91, 3.12, 4.46, 4.3, 5.57, ...
3.52, 5.22, 6.02, 5.05, 6.86, 6.64, 4.99, 4.06, ...
3.16, 6.9, 5.13, 6.17, 5.47];
y_raw2 = [5.55, 3.24, 3.05, 6.89, 5.32, 4.97, 3.54, 5.78, ...
4.92, 5.46, 6.35, 6.88, 5.17, 7.04, 6.92, 4.16, ...
6.07, 4.55, 6.46, 4.91, 5.65];
y_raw3 = [3.8, 5.51, 4.76, 6.09, 4.39, 5.14, 6.92, 4.36, ...
3.83, 6.13, 6.4, 4.17, 6.57, 8.73, 5.35, 5.48, ...
4.59, 5.02, 4.99, 4.62, 5.46];
%%% VISUALIZE INPUT DATA
plot(x, y_ref, 'k', 'LineWidth', 2, 'DisplayName', 'Reference')
hold on
plot(x, y_raw1, 'b', 'DisplayName', '#1')
plot(x, y_raw2, 'g', 'DisplayName', '#2')
plot(x, y_raw3, 'm', 'DisplayName', '#3')
xlabel('x')
ylabel('y')
legend('Location', 'best')
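To state the objective more concretely, here is the quantity I want to minimize, written as an anonymous function (my own notation; s(1), s(2), s(3) are the three scale factors):
% Objective: MSE between the mean of the scaled vectors and the reference
mseObjective = @(s) mean( ( (s(1)*y_raw1 + s(2)*y_raw2 + s(3)*y_raw3)/3 ...
                            - y_ref ).^2 );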
  2 comments
Image Analyst on 17 Sep 2021
It goes by different names. In spectroscopy it's called "Multivariate Curve Resolution". See
Like you have a spectrum (curve) and some component reference curves for different chemical species, and you want to find out how much of each potential component species went into making up the compound (chemical) being tested. It's commonly used to reverse engineer materials to see what they're actually made up of (like "What's really in that ketchup?").
Or in general:
newbie9 on 18 Sep 2021
Edited: newbie9 on 18 Sep 2021
@Image Analyst thank you for the helpful documentation links


Accepted Answer

Matt J on 17 Sep 2021
Edited: Matt J on 17 Sep 2021
I want to determine a linear scale factor for each y_raw? vector, such that the mean squared error (MSE) between the average of all three y_raw? vectors and y_ref is minimized at each x value.
If there is only one scale factor for each vector, you cannot minimize the MSE at each x-value independently, but the minimum L2-norm difference is obtained with:
scale_factors = ( [y_raw1;y_raw2;y_raw3].' \ y_ref(:) )*3; %EDITED
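To see where the factor of 3 comes from (a sketch using the variable names from the question): the backslash operator returns coefficients c that minimize the L2-norm of c(1)*y_raw1 + c(2)*y_raw2 + c(3)*y_raw3 - y_ref; because the question averages the three scaled vectors (i.e. divides by 3), each per-vector factor is 3*c. A quick numerical cross-check with fminsearch should land on essentially the same values:
A = [y_raw1; y_raw2; y_raw3].';                 % 21-by-3 matrix, one column per raw vector
c = A \ y_ref(:);                               % least-squares combination coefficients
scale_factors = 3*c;                            % per-vector factors (the mean divides by 3)
obj = @(s) mean( (A*s(:)/3 - y_ref(:)).^2 );    % MSE of the averaged, scaled vectors
s_check = fminsearch(obj, ones(3,1))            % should match scale_factors closely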
  2 comments
Image Analyst on 17 Sep 2021
Is this right?
%%% DEFINE INPUT DATA
x = [0, 0.7, 1.1, 1.61, 2.02, 2.31, 2.71, 3, 3.22, 3.41, ...
3.69, 3.92, 4.32, 4.61, 5.02, 5.3, 5.71, 6, 6.22, ...
6.62, 6.91];
y_ref = [4.08, 4.14, 4.03, 4.62, 4.47, 4.26, 4.12, 4.11, ...
4.16, 4.24, 4.39, 4.71, 5.74, 6.15, 5.72, 5.15, ...
4.59, 4.65, 4.4, 4.28, 4.28];
y_raw1 = [4.66, 6.67, 5.36, 5.91, 3.12, 4.46, 4.3, 5.57, ...
3.52, 5.22, 6.02, 5.05, 6.86, 6.64, 4.99, 4.06, ...
3.16, 6.9, 5.13, 6.17, 5.47];
y_raw2 = [5.55, 3.24, 3.05, 6.89, 5.32, 4.97, 3.54, 5.78, ...
4.92, 5.46, 6.35, 6.88, 5.17, 7.04, 6.92, 4.16, ...
6.07, 4.55, 6.46, 4.91, 5.65];
y_raw3 = [3.8, 5.51, 4.76, 6.09, 4.39, 5.14, 6.92, 4.36, ...
3.83, 6.13, 6.4, 4.17, 6.57, 8.73, 5.35, 5.48, ...
4.59, 5.02, 4.99, 4.62, 5.46];
%%% VISUALIZE INPUT DATA
plot(x, y_ref, 'k', 'LineWidth', 2, 'DisplayName', 'Reference')
hold on
plot(x, y_raw1, 'b', 'DisplayName', '#1')
plot(x, y_raw2, 'g', 'DisplayName', '#2')
plot(x, y_raw3, 'm', 'DisplayName', '#3')
xlabel('x')
ylabel('y')
scaleFactors = ( [y_raw1; y_raw2; y_raw3].' \ y_ref(:) ) / 3
yEstimated = scaleFactors(1) * y_raw1 + scaleFactors(2) * y_raw2 + scaleFactors(3) * y_raw3;
plot(x, yEstimated, 'r-', 'LineWidth', 3, 'DisplayName', 'yEstimated');
grid on;
legend('Location', 'best')
Why are you dividing the scale factors by 3?
Matt J on 17 Sep 2021
Edited: Matt J on 17 Sep 2021
@Image Analyst You're right. It should be
scaleFactors = ( [y_raw1; y_raw2; y_raw3].' \ y_ref(:) ) * 3
scaleFactors = 3×1
    0.4329
    1.0392
    1.0817
yEstimated = mean( [scaleFactors(1) * y_raw1 ; ...
scaleFactors(2) * y_raw2 ; ...
scaleFactors(3) * y_raw3]);
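As a quick sanity check (a sketch using the variables above), the resulting fit can be quantified and overlaid on the reference:
mseValue = mean( (yEstimated - y_ref).^2 )   % MSE between the scaled mean and y_ref
plot(x, y_ref, 'k', 'LineWidth', 2, 'DisplayName', 'Reference')
hold on
plot(x, yEstimated, 'r--', 'LineWidth', 2, 'DisplayName', 'Scaled mean')
legend('Location', 'best')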


More Answers (1)

Image Analyst on 17 Sep 2021
Edited: Image Analyst on 17 Sep 2021
You can use the Regression Learner App. It's a useful tool to know. You basically give it all your inputs and your desired output, and you can try all kinds of different models, like multiple linear regression, neural networks, decision trees, etc., to find the one with the best fit (lowest residuals). Here are the steps:
  1. Turn your inputs into a table and your desired values into a column vector.
  2. Go to the Apps tab of the tool ribbon and select Regression Learner and start a new session.
  3. Select the input table from the workspace as your "Data Set Variable" and the y_reference as your desired output.
  4. Click the little down arrow to select models to try. For example select "All Linear" models.
  5. Click the green Train triangle.
  6. When it's done in a few seconds, click the Predicted vs. Actual plot to see how well it did.
  7. Export your model to the workspace. Save it to a .mat file if you want. You can call it with yFit = trainedModel.predictFcn(T) where T is a table of the input curves.
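For example, a call might look like this (a sketch; trainedModel is whatever name you gave the exported model, and the table's variable names have to match the ones used during training):
% Sketch of using the exported model (assumes it was exported as trainedModel
% and trained on a table whose columns are the three raw vectors)
T = table(y_raw1(:), y_raw2(:), y_raw3(:));   % one row per x sample
yFit = trainedModel.predictFcn(T);            % predicted reference curve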
I did that and you can see the results below.
It works pretty well, though for your data set, with very few training points and very noisy data, there is not a great way to get yRef from the 3 noisy curves no matter how they're combined.
% Demo by Image Analyst
clc; % Clear the command window.
close all; % Close all figures (except those of imtool.)
clear; % Erase all existing variables. Or clearvars if you want.
workspace; % Make sure the workspace panel is showing.
format short g;
format compact;
fontSize = 20;
%%% DEFINE INPUT DATA
x = [0, 0.7, 1.1, 1.61, 2.02, 2.31, 2.71, 3, 3.22, 3.41, ...
3.69, 3.92, 4.32, 4.61, 5.02, 5.3, 5.71, 6, 6.22, ...
6.62, 6.91];
y_ref = [4.08, 4.14, 4.03, 4.62, 4.47, 4.26, 4.12, 4.11, ...
4.16, 4.24, 4.39, 4.71, 5.74, 6.15, 5.72, 5.15, ...
4.59, 4.65, 4.4, 4.28, 4.28];
y_raw1 = [4.66, 6.67, 5.36, 5.91, 3.12, 4.46, 4.3, 5.57, ...
3.52, 5.22, 6.02, 5.05, 6.86, 6.64, 4.99, 4.06, ...
3.16, 6.9, 5.13, 6.17, 5.47];
y_raw2 = [5.55, 3.24, 3.05, 6.89, 5.32, 4.97, 3.54, 5.78, ...
4.92, 5.46, 6.35, 6.88, 5.17, 7.04, 6.92, 4.16, ...
6.07, 4.55, 6.46, 4.91, 5.65];
y_raw3 = [3.8, 5.51, 4.76, 6.09, 4.39, 5.14, 6.92, 4.36, ...
3.83, 6.13, 6.4, 4.17, 6.57, 8.73, 5.35, 5.48, ...
4.59, 5.02, 4.99, 4.62, 5.46];
%%% VISUALIZE INPUT DATA
plot(x, y_ref, 'k', 'LineWidth', 2, 'DisplayName', 'Reference')
hold on
plot(x, y_raw1, 'b', 'DisplayName', '#1')
plot(x, y_raw2, 'g', 'DisplayName', '#2')
plot(x, y_raw3, 'm', 'DisplayName', '#3')
xlabel('x')
ylabel('y')
scaleFactors = [y_raw1; y_raw2; y_raw3].' \ y_ref(:)
yEstimated = scaleFactors(1) * y_raw1 + scaleFactors(2) * y_raw2 + scaleFactors(3) * y_raw3;
plot(x, yEstimated, 'r-', 'LineWidth', 3, 'DisplayName', 'yEstimated');
grid on;
legend('Location', 'best')
% Create table for the regression learner
yRef = y_ref'; % Turn into column vector.
tPredictions = table(y_raw1(:), y_raw2(:), y_raw3(:))
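If you'd rather stay in code than use the app, a roughly equivalent multiple linear regression can be sketched with fitlm from the Statistics and Machine Learning Toolbox (a sketch; the variable names here are just for illustration):
% Programmatic alternative to the app: ordinary multiple linear regression
X = [y_raw1(:), y_raw2(:), y_raw3(:)];        % predictors, one column per raw vector
mdl = fitlm(X, yRef)                          % fits yRef ~ 1 + x1 + x2 + x3 (intercept included)
yFitLm = predict(mdl, X);                     % fitted reference curve
plot(x, yFitLm, 'c--', 'LineWidth', 2, 'DisplayName', 'fitlm');
legend('Location', 'best')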
