I have a large amount of training data (about 10,000 samples). Each sample has a large array (~500,000 elements) of small integer inputs (mostly 0 or 1, with a tail that falls off exponentially toward 10) and a relatively small array (~30 elements) of double values between 0 and 1 as outputs.
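For concreteness, here is roughly how I would lay the data out in MATLAB (the variable names are just placeholders); since the vast majority of inputs are 0, a sparse matrix seems natural:

    % Placeholder dimensions matching the description above
    nSamples = 10000;
    nInputs  = 500000;   % small integers, mostly 0 or 1
    nOutputs = 30;       % doubles in [0, 1]

    % Inputs stored sparsely, since most entries are 0
    X = sparse(nSamples, nInputs);   % would be filled in from my data
    Y = zeros(nSamples, nOutputs);   % outputs are small enough to keep dense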
In general the elements of the input array are uncorrelated with each other (i.e., they affect the output independently). Technically there are some correlations, but I am willing to ignore them since they are not terribly important.
I can also add a few categorical and numeric engineered features to the input that would have a significant effect on the output, along the lines of the sketch below.
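For example, I imagine appending them to the input matrix like this (dummyvar is from the Statistics and Machine Learning Toolbox; catLabels and numFeat stand in for my engineered features):

    catLabels = randi(5, nSamples, 1);   % placeholder: a 5-level categorical feature
    numFeat   = rand(nSamples, 2);       % placeholder: two numeric features

    Xcat = dummyvar(catLabels);                    % one-hot encode the categories
    Xaug = [X, sparse(Xcat), sparse(numFeat)];     % keep the combined matrix sparse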
I would like to try a machine learning approach to this problem within MATLAB, but I am worried about the large number of inputs. Is there a viable approach, or would some dimensionality reduction be absolutely necessary before proceeding? Any dimensionality reduction would destroy the correlations between inputs, but I can live with that.
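To make the question concrete, here is the kind of thing I imagine trying (a rough sketch, assuming the Statistics and Machine Learning Toolbox; fitrlinear is designed for high-dimensional, possibly sparse predictors, and svds gives a PCA-like reduction without densifying X; the regularization and k values are placeholders):

    % Option 1: one regularized linear model per output column,
    % fit directly on the full sparse input matrix
    mdls = cell(1, nOutputs);
    for j = 1:nOutputs
        mdls{j} = fitrlinear(Xaug, Y(:, j), ...
            'Learner', 'leastsquares', 'Regularization', 'ridge');
    end
    Yhat1 = predict(mdls{1}, Xaug);   % predictions for the first output

    % Option 2: reduce dimensionality first with a truncated SVD.
    % svds works directly on sparse matrices; skipping mean-centering
    % keeps X sparse, at the cost of not being exact PCA.
    k = 200;                 % placeholder for the reduced dimension
    [~, ~, V] = svds(Xaug, k);
    Z = Xaug * V;            % nSamples-by-k dense matrix for any regressor

Would something like Option 1 actually cope with ~500,000 columns, or is Option 2 (or something like it) effectively mandatory?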