
How to run Random Forest Classification code for my attached input file?

19 views (last 30 days)
I have a file that contains normalized values between 0 and 1 for nine meteorological variables as predictors, with the 10th variable as GPP. Could somebody use the attached file in a Random Forest classification code and provide me with plots of the following:
out-of-bag mean error
out-of-bag mean squared error
out-of-bag classification error
out-of-bag feature importance
I would appreciate your kind cooperation.
Devendra

Accepted Answer

Diwakar Diwakar on 8 Jul 2023
Maybe this code will help you:
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import mean_absolute_error, mean_squared_error, accuracy_score
import matplotlib.pyplot as plt
# Read the data from the provided file
data = pd.read_csv('your_file_path.csv')
# Separate the predictor variables (X) and the target variable (y)
X = data.iloc[:, :-1] # Select all columns except the last one
y = data.iloc[:, -1] # Select the last column
# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Create a Random Forest classifier
rf_classifier = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=42)
# Fit the classifier to the training data
rf_classifier.fit(X_train, y_train)
# Make predictions on the training set
y_train_pred = rf_classifier.predict(X_train)
# Calculate the out-of-bag mean error
oob_error = 1 - rf_classifier.oob_score_
# Make predictions on the testing set
y_test_pred = rf_classifier.predict(X_test)
# Out-of-bag class predictions (each sample predicted only by trees that did not train on it)
oob_pred = rf_classifier.classes_[rf_classifier.oob_decision_function_.argmax(axis=1)]
# Calculate the out-of-bag mean squared error
oob_mse = mean_squared_error(y_train, oob_pred)
# Calculate the out-of-bag classification error
oob_classification_error = 1 - accuracy_score(y_train, oob_pred)
# Calculate the accuracy on the training set
train_accuracy = accuracy_score(y_train, y_train_pred)
# Calculate the accuracy on the testing set
test_accuracy = accuracy_score(y_test, y_test_pred)
# Get the impurity-based feature importance from the trained model (not OOB permutation importance)
feature_importance = rf_classifier.feature_importances_
# Plot the feature importance
plt.figure(figsize=(10, 6))
plt.bar(range(len(feature_importance)), feature_importance, tick_label=X.columns)
plt.xticks(rotation=90)
plt.xlabel('Predictor Variables')
plt.ylabel('Feature Importance')
plt.title('Random Forest Feature Importance')
plt.tight_layout()
plt.show()
# Print the calculated metrics
print("Out-of-Bag Mean Error:", oob_error)
print("Out-of-Bag Mean Squared Error:", oob_mse)
print("Out-of-Bag Classification Error:", oob_classification_error)
print("Training Accuracy:", train_accuracy)
print("Testing Accuracy:", test_accuracy)
5 comments
Image Analyst on 8 Jul 2023
@Devendra evidently @diwakar diwakar is not actually running the code. That may be a function he wrote. You can split your data into a training set and a test set by using randsample. Something like this (untested):
% Get the total number of ground truth values.
numGroundTruth = numel(y)
% Get 80% of the data for training.
numTrainingSamples = round(0.8 * numel(y))
trainingIndexes = randsample(numel(y), numTrainingSamples)
% Leaving 20% for testing:
testIndexes = setdiff((1 : numGroundTruth)', trainingIndexes)
% Extract training and test sets from the complete set
% into their respective individual variables:
X_train = X(trainingIndexes, :)
X_test = X(testIndexes, :)
y_train = y(trainingIndexes, :)
y_test = y(testIndexes, :)
Sanchit on 8 Jul 2023
Error using TreeBagger/get.OOBPermutedVarDeltaError
Out-of-bag permutations were not saved. Run with 'OOBPredictorImportance' set to 'on'.
Error in indexing (line 22)
[varargout{1:nargout}] = builtin('subsref',this,s);
Error in RF_classifier (line 24)
feature_importance = rf_classifier.OOBPermutedVarDeltaError;
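As the error message says, TreeBagger only stores the out-of-bag permutation importance when the ensemble is grown with 'OOBPredictorImportance' set to 'on'. A minimal sketch of the corrected call, reusing the X_train and y_train names from the snippet above (the older property name OOBPermutedVarDeltaError also works):
% Grow the ensemble so that OOB permutation importance is saved
Mdl = TreeBagger(100, X_train, y_train, 'Method', 'classification', ...
    'OOBPrediction', 'on', 'OOBPredictorImportance', 'on');
% OOBPermutedPredictorDeltaError now holds the OOB permuted feature importance
feature_importance = Mdl.OOBPermutedPredictorDeltaError;
bar(feature_importance)
xlabel('Predictor variable')
ylabel('Out-of-bag permuted importance')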


More Answers (0)

Version

R2023a
