Reducing running time in image processing

Ronaldo on 30 Aug 2013
I wrote code that finds circles in an image using imfindcircles and performs some further calculations on the detected circles. I plan to apply the code to 250,000 images. It currently takes 0.8 seconds per image, and the processing of each image is completely independent of the others. I am aware of the parfor command, but I would rather avoid it because my code is complex enough already and I do not want to make it more so. Is there any way to run the script in parallel to reduce the total time (rather than the 0.8-second running time per image)? It should be noted that some parts of the code also take advantage of the GPU.

Accepted Answer

Walter Roberson on 31 Aug 2013
parfor() and related commands such as spmd() are the main approach. Otherwise, especially if you are on Linux or OS X, run a script that hives off a number of separate MATLAB processes, each with slightly different parameters. Note, though, that if you are already keeping a GPU busy, it is not certain that running multiple such processes would be any faster.
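A common way to split work across several independently launched MATLAB processes is to give each one a task index, for example through environment variables set by the launching script. A minimal sketch, assuming the launcher sets TASK_ID and NUM_TASKS (both names are placeholders) and that the images live in an `images` folder:

```matlab
% Each MATLAB process handles only its interleaved slice of the images.
% TASK_ID (1..NUM_TASKS) and NUM_TASKS are assumed to be set by the
% shell script that starts the processes.
taskId   = str2double(getenv('TASK_ID'));
numTasks = str2double(getenv('NUM_TASKS'));

allFiles = dir(fullfile('images', '*.png'));    % assumed image location
myFiles  = allFiles(taskId:numTasks:end);        % this process's slice

for k = 1:numel(myFiles)
    img = imread(fullfile('images', myFiles(k).name));
    % ... imfindcircles and the further calculations go here ...
end
```

Because the slices are disjoint, the processes never touch the same image, and no coordination between them is needed beyond the two environment variables.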
The usual methods are to (A) optimize the algorithm and (B) vectorize the code.
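Since each iteration is fully independent, a parfor version can be a near drop-in change despite the complexity concern. A minimal sketch, where `processOneImage` is a hypothetical stand-in for the existing imfindcircles-based processing:

```matlab
% parfor sketch: independent iterations are distributed across workers.
files   = dir(fullfile('images', '*.png'));   % assumed image location
results = cell(numel(files), 1);

parpool;                                      % start the local pool once
parfor k = 1:numel(files)
    img = imread(fullfile('images', files(k).name));
    results{k} = processOneImage(img);        % hypothetical helper
end
```

Note that if `processOneImage` uses the GPU, multiple workers will contend for the same device, which is the caveat mentioned above.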

More Answers (0)


