Changing a string in a matrix to a variable with a value given in another matrix.
Cam on 6 Oct 2016
Edited: James Tursa on 6 Oct 2016
What I want to do is pretty simple, but it seems complex in MATLAB. For example, we have two different types of arrays:
Names = {'AB','GH','BC';'BG','FG','CF';'DE','CD','EF'}
Values = [1 2 5;1 4 5;1 5 6]
All I need to do is take each string from the Names array, create a variable with that name, and then assign it the value from the corresponding position in the Values matrix.
Accepted Answer
James Tursa on 6 Oct 2016
Edited: James Tursa on 6 Oct 2016
Well, you can do this using the eval() function, but it is generally not a good idea to do so. You are popping these variables into the workspace dynamically. How do you expect to deal with them downstream in your code? With more eval() statements? Ugly and difficult to maintain. See this link for alternatives:
E.g.,
eval([Names{2,3} '=' num2str(Values(2,3))])
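Here Names{2,3} is 'CF' and Values(2,3) is 5, so this line executes CF = 5 and creates a variable named CF in the workspace.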
Do you really want to do stuff like this everywhere in your code when you want to use the variable? Probably not.
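For comparison, here is a minimal sketch of one such alternative: store the name/value pairs in a struct with dynamic field names instead of creating separate workspace variables. The struct name S and the loop are illustrative, not from the original post.
Names  = {'AB','GH','BC';'BG','FG','CF';'DE','CD','EF'};
Values = [1 2 5;1 4 5;1 5 6];
S = struct();                    % one container instead of nine loose variables
for k = 1:numel(Names)
    S.(Names{k}) = Values(k);    % dynamic field name, e.g. S.AB = 1
end
S.CF                             % returns 5, the same value the eval() line above would assign to CF
Downstream code can then read or loop over the values with fieldnames(S) instead of building more eval() strings.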
More Answers (0)