How to deploy an ML model on ThingSpeak
6 views (last 30 days)
I have an ML model trained locally on my system, and I want to know how I can deploy it on ThingSpeak. I need to work with the model online: I would feed it data and collect its output. How can I deploy it? If deploying it on ThingSpeak isn't possible, I could deploy it on EC2 on AWS instead. Is there a way to access EC2 from ThingSpeak (feed it data and collect the output)?
I'm new to this, so please bear with me.
0 comments
Answers (1)
Raghava S N
on 11 Mar 2024
Hi Arnav,
There is a "Developing an IoT Analytics System with MATLAB, Machine Learning, and ThingSpeak" paper linked from the ThingSpeak website that may be of help -
In this example, the author takes the model and generates MATLAB code and includes it as a function. The paper also outlines a general workflow, which can be utilised for this use-case.
The paper demonstrates the preferred approach to train the model on a desktop and embed the trained model parameters in the MATLAB code that is operationalized on ThingSpeak.
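To make that concrete, here is a minimal sketch of what such a ThingSpeak MATLAB Analysis script could look like, assuming a simple linear regression model trained offline. The channel IDs, API keys, field numbers, and coefficient values are hypothetical placeholders, not anything taken from the paper.
% Minimal sketch: trained model parameters embedded directly in the code
readChannelID  = 123456;            % channel receiving the raw sensor data
readAPIKey     = 'YOUR_READ_KEY';
writeChannelID = 654321;            % channel storing the model output
writeAPIKey    = 'YOUR_WRITE_KEY';
% Parameters exported from the locally trained model (hypothetical values)
intercept = 1.25;
weights   = [0.42; -0.07];
% Read the most recent sample (fields 1 and 2) from the input channel
x = thingSpeakRead(readChannelID, 'Fields', [1 2], ...
    'NumPoints', 1, 'ReadKey', readAPIKey);
% Apply the embedded linear model and write the prediction back out
yhat = intercept + x * weights;
thingSpeakWrite(writeChannelID, yhat, 'WriteKey', writeAPIKey);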
Another approach is to upload the trained model to Dropbox - https://www.mathworks.com/matlabcentral/fileexchange/59673-upload-files-to-your-dropbox-folder-from-matlab - and have the code operationalized on ThingSpeak pull the model parameters down from Dropbox - https://www.mathworks.com/matlabcentral/fileexchange/67833-download-files-from-your-dropbox-api-folder-using-matlab - on demand and apply them to the ingested data.
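As a rough illustration of this second approach (not the exact workflow in those File Exchange submissions), the sketch below assumes the trained model was saved as a .mat file and shared through a direct-download Dropbox link, and that websave, load, and the relevant predict method are available in the environment where the code runs. All URLs, variable names, and channel details are placeholders.
% Sketch: download a saved model on demand and apply it to new data
modelURL  = 'https://www.dropbox.com/s/xxxxxxx/trainedModel.mat?dl=1';  % dl=1 returns the raw file
localFile = websave([tempname '.mat'], modelURL);   % download to a temporary file
S   = load(localFile);     % assumes the .mat file contains a model variable named mdl
mdl = S.mdl;
% Read the latest data, apply the downloaded model, and store the result
x    = thingSpeakRead(123456, 'Fields', [1 2], 'NumPoints', 1, 'ReadKey', 'YOUR_READ_KEY');
yhat = predict(mdl, x);
thingSpeakWrite(654321, yhat, 'WriteKey', 'YOUR_WRITE_KEY');
The first approach avoids this download step entirely, which is why the paper treats embedding the parameters as the preferred pattern.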
Please refer to this MATLAB Answers post for a more detailed discussion on the same topic - https://www.mathworks.com/matlabcentral/answers/516865-upload-trained-machine-learning-model-to-thingspeak
0 comments