Predict Battery State of Charge Using Deep Learning
This example shows how to train a neural network to predict the state of charge of a battery by using deep learning.
Battery state of charge (SOC) is the level of charge of an electric battery relative to its capacity, measured as a percentage. SOC is critical information for the vehicle energy management system and must be accurately estimated to ensure reliable and affordable electrified vehicles (xEV). However, due to the nonlinear temperature, health, and SOC-dependent behavior of Li-ion batteries, SOC estimation remains a significant automotive engineering challenge. Traditional approaches to this problem, such as electrochemical models, usually require precise parameters and knowledge of the battery composition as well as its physical response. In contrast, using neural networks is a data-driven approach that requires minimal knowledge of the battery or its nonlinear behavior [1].
This example is based on the MATLAB® script from [1]. The example trains a neural network to predict the state of charge of a Li-ion battery, given time series data representing various features of the battery such as voltage, current, temperature, and the average voltage and current (over the last 500 seconds).
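For reference, averaged predictors of this kind can be produced with a trailing moving average. The following is a minimal sketch, assuming a 1 Hz sample rate and hypothetical raw signals V, I, and T (voltage, current, and temperature vectors); it is not part of the dataset preparation in [1].

% Sketch: trailing 500-sample moving averages (about 500 s at 1 Hz).
% V, I, and T are hypothetical 1-by-N voltage, current, and temperature vectors.
window = 500;
Vavg = movmean(V,[window-1 0]);   % mean over the current sample and the 499 before it
Iavg = movmean(I,[window-1 0]);
X = [V; I; T; Vavg; Iavg];        % five predictor channels, as used in this example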
The training data contains a single sequence of experimental data collected while the battery powered an electric vehicle during a driving cycle, with an external temperature of 25 °C. The test data contains four sequences of experimental data collected during driving cycles at four different temperatures. This example uses the preprocessed dataset LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 from [1]. For an example showing how to use a trained neural network inside a Simulink® model to predict the SOC of a battery, see Battery State of Charge Estimation in Simulink Using a Feedforward Neural Network.
Download Data
Each file in the LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 dataset contains a time series X of five predictors (voltage, current, temperature, average voltage, and average current) and a time series Y of one target (SOC). Each file represents data collected at a different ambient temperature.
Specify the URL from which to download the dataset. Alternatively, you can download this dataset manually from https://data.mendeley.com/datasets/cp3473x7xv/3.
url = "https://data.mendeley.com/public-files/datasets/cp3473x7xv/files/ad7ac5c9-2b9e-458a-a91f-6f3da449bdfb/file_downloaded";
Set downloadFolder to where you want to download the ZIP file and outputFolder to where you want to extract the ZIP file.
downloadFolder = tempdir;
outputFolder = fullfile(downloadFolder,"LGHG2@n10C_to_25degC");
Download and extract the LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 dataset.
if ~exist(outputFolder,"dir")
    fprintf("Downloading LGHG2@n10C_to_25degC.zip (56 MB) ... ")
    filename = fullfile(downloadFolder,"LGHG2@n10C_to_25degC.zip");
    websave(filename,url);
    unzip(filename,outputFolder)
end
Prepare Training Data
For the training data, create a file datastore and specify the read function as the load function. The load function loads the data from the MAT file into a structure array.
folderTrain = fullfile(outputFolder,"Train");
fdsTrain = fileDatastore(folderTrain,ReadFcn=@load);
Each file in this datastore contains both the predictors X and the targets Y. To create a transformed datastore dsTrain that returns the predictor data X and the target data Y from each file, transform the file datastore fdsTrain.
dsTrain = transform(fdsTrain,@(data) {data.X,data.Y});
Preview the transformed datastore. The output corresponds to a sequence of predictors X and a sequence of targets Y from the first file.
preview(dsTrain)
ans = 1×2 cell array
{5×669956 double} {1×669956 double}
Note that to input the sequence data from datastores to a deep learning network, the mini-batches of the sequences must be the same length, which usually requires padding the sequences in the datastore. In this example, no padding is necessary because the training data consists of a single sequence. For more information, see Train Network Using Out-of-Memory Sequence Data.
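For reference, this is how several variable-length sequences could be left-padded to a common length with the padsequences function; the three random sequences below are hypothetical stand-ins for real data.

% Sketch: left-pad three hypothetical C-by-T sequences to a common length.
sequences = {rand(5,100); rand(5,80); rand(5,120)};
padded = padsequences(sequences,2,Direction="left");   % pad along dimension 2 (time)
size(padded)   % 5x120x3: channels-by-time-by-batch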
Prepare Test and Validation Data
For the test data, create a file datastore and specify the read function as the load function. The load function loads the data from the MAT file into a structure array.
folderTest = fullfile(outputFolder,"Test");
fdsTest = fileDatastore(folderTest,ReadFcn=@load);
Each file in this datastore contains both the predictors X and the targets Y. To create a transformed datastore tdsPredictorsTest that returns only the predictor data X from each file, transform the file datastore fdsTest.
tdsPredictorsTest = transform(fdsTest,@(data) {data.X});
Preview the transformed datastore. The output corresponds to a single sequence of predictors X from the first file.
preview(tdsPredictorsTest)
ans = 1×1 cell array
{5×39293 double}
To create a transformed datastore tdsTargetsTest that returns only the target data Y from each file, transform the file datastore fdsTest.
tdsTargetsTest = transform(fdsTest,@(data) {data.Y});
Preview the transformed datastore. The output corresponds to a single sequence of targets Y from the first file.
preview(tdsTargetsTest)
ans = 1×1 cell array
{[1 1.0000 1.0000 1.0000 1.0000 1.0000 0.9999 0.9999 0.9999 … ] (1×39293 double)}
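As an optional sanity check (not part of the original script), you can confirm that each predictor sequence and its target sequence span the same number of time steps.

% Optional check: each predictor sequence and its target must align in time.
XAll = readall(tdsPredictorsTest);
YAll = readall(tdsTargetsTest);
assert(all(cellfun(@(x,y) size(x,2) == size(y,2),XAll,YAll)), ...
    "Predictor and target sequence lengths do not match.")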
Specify the validation data as a subset of the test data containing only the first file, and transform the datastore fdsVal to return the predictor data X and the target data Y.
indices = 1;
fdsVal = subset(fdsTest,indices);
dsVal = transform(fdsVal,@(data) {data.X, data.Y});
Define Network Architecture
Define the network architecture. Set the number of input features to five (voltage, current, temperature, average voltage, and average current).
numFeatures = 5;
Set the number of output features to one (SOC).
numResponses = 1;
Specify the number of hidden neurons.
numHiddenNeurons = 55;
Define the neural network architecture.
layers = [
sequenceInputLayer(numFeatures,Normalization="zerocenter")
fullyConnectedLayer(numHiddenNeurons)
tanhLayer
fullyConnectedLayer(numHiddenNeurons)
leakyReluLayer(0.3)
fullyConnectedLayer(numResponses)
clippedReluLayer(1)];
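Before training, you can optionally confirm that the layer sizes are consistent by initializing a dlnetwork object and inspecting it with analyzeNetwork. This check is an addition to the original example.

% Optional: initialize the network and inspect activation sizes.
netCheck = dlnetwork(layers);   % errors here would indicate inconsistent layer sizes
analyzeNetwork(netCheck)        % opens the network analyzer app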
Specify the training options. Choosing among the options requires empirical analysis. To explore different training option configurations by running experiments, you can use the Experiment Manager app.
- Train for 1200 epochs with mini-batches of size 1 using the "adam" solver.
- To prevent the gradients from exploding, set the gradient threshold to 1.
- Because the training data has sequences with rows and columns corresponding to channels and time steps, respectively, specify the input data format "CTB" (channel, time, batch).
- Specify an initial learning rate of 0.01.
- Specify a learning rate drop period of 400.
- Specify a learning rate drop factor of 0.1.
- Specify a validation frequency of 30.
- Display the training progress in a plot and monitor the root mean squared error.
- Disable the verbose output.
Experiments with Experiment Manager showed that an initial learning rate of 0.01 together with a learning rate drop factor of 0.1 minimizes the validation error. For more information on optimizing hyperparameters with Experiment Manager, see Choose Training Configurations for LSTM Using Bayesian Optimization.
epochs = 1200;
miniBatchSize = 1;
LRDropPeriod = 400;
InitialLR = 0.01;
LRDropFactor = 0.1;
valFrequency = 30;

options = trainingOptions("adam", ...
    InputDataFormats="CTB", ...
    MaxEpochs=epochs, ...
    SequencePaddingDirection="left", ...
    Shuffle="every-epoch", ...
    GradientThreshold=1, ...
    InitialLearnRate=InitialLR, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=LRDropPeriod, ...
    LearnRateDropFactor=LRDropFactor, ...
    ValidationData=dsVal, ...
    ValidationFrequency=valFrequency, ...
    MiniBatchSize=miniBatchSize, ...
    Plots="training-progress", ...
    Metrics="rmse", ...
    Verbose=0);
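With these values, the piecewise schedule multiplies the learning rate by 0.1 every 400 epochs. As a quick illustration (not needed for training), the effective rate per epoch is:

% Effective learning rate per epoch under the piecewise schedule above.
epochIdx = 1:epochs;
effectiveLR = InitialLR * LRDropFactor.^floor((epochIdx-1)/LRDropPeriod);
% Epochs 1-400 use 0.01, epochs 401-800 use 0.001, epochs 801-1200 use 0.0001.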
Train Network
Train the neural network using the trainnet function. For regression, use mean squared error loss. By default, the trainnet function uses a GPU if one is available. Using a GPU requires a Parallel Computing Toolbox™ license and a supported GPU device. For information on supported devices, see GPU Computing Requirements (Parallel Computing Toolbox). Otherwise, the trainnet function uses the CPU. To specify the execution environment, use the ExecutionEnvironment training option.
net = trainnet(dsTrain,layers,"mse",options);
Test Network
Make predictions using the minibatchpredict function. By default, the minibatchpredict function uses a GPU if one is available. Using a GPU requires a Parallel Computing Toolbox license and a supported GPU device. For information on supported devices, see GPU Computing Requirements (Parallel Computing Toolbox). Otherwise, the function uses the CPU. To specify the execution environment, use the ExecutionEnvironment option. Because the data has sequences with rows and columns corresponding to channels and time steps, respectively, specify the input data format "CTB" (channel, time, batch).
YPred = minibatchpredict(net,tdsPredictorsTest,InputDataFormats="CTB",MiniBatchSize=1,UniformOutput=false);
Compare the SOC predicted by the network with the target SOC from the test data at the different temperatures.
YTarget = readall(tdsTargetsTest);
Plot the predicted SOC and the target SOC for the different ambient temperatures.
figure
nexttile
plot(YPred{1})
hold on
plot(YTarget{1})
legend(["Predicted" "Target"],Location="Best")
ylabel("SOC")
xlabel("Time(s)")
title("n10degC")

nexttile
plot(YPred{2})
hold on
plot(YTarget{2})
legend(["Predicted" "Target"],Location="Best")
ylabel("SOC")
xlabel("Time(s)")
title("0degC")

nexttile
plot(YPred{3})
hold on
plot(YTarget{3})
legend(["Predicted" "Target"],Location="Best")
ylabel("SOC")
xlabel("Time(s)")
title("10degC")

nexttile
plot(YPred{4})
hold on
plot(YTarget{4})
legend(["Predicted" "Target"],Location="Best")
ylabel("SOC")
xlabel("Time(s)")
title("25degC")
Calculate the error between the predicted SOC and the target SOC for each ambient temperature.
Err_n10degC = YPred{1} - YTarget{1};
Err_0degC = YPred{2} - YTarget{2};
Err_10degC = YPred{3} - YTarget{3};
Err_25degC = YPred{4} - YTarget{4};
Calculate the root mean squared error (RMSE) as a percentage.
RMSE_n10degC = sqrt(mean(Err_n10degC.^2))*100;
RMSE_0degC = sqrt(mean(Err_0degC.^2))*100;
RMSE_10degC = sqrt(mean(Err_10degC.^2))*100;
RMSE_25degC = sqrt(mean(Err_25degC.^2))*100;
Calculate the maximum error as a percentage.
MAX_n10degC = max(abs(Err_n10degC))*100;
MAX_0degC = max(abs(Err_0degC))*100;
MAX_10degC = max(abs(Err_10degC))*100;
MAX_25degC = max(abs(Err_25degC))*100;
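Equivalently, both error metrics can be computed in a single loop over the four test sequences. This compact alternative (not in the original script) produces the same numbers as the per-temperature variables above.

% Compact alternative: RMSE and maximum absolute error (%) per test file.
numTests = numel(YPred);
rmsePct = zeros(1,numTests);
maxPct = zeros(1,numTests);
for k = 1:numTests
    err = YPred{k} - YTarget{k};
    rmsePct(k) = sqrt(mean(err.^2))*100;
    maxPct(k) = max(abs(err))*100;
end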
Plot the RMSE for the different ambient temperatures.
temp = [-10,0,10,25];

figure
nexttile
bar(temp,[RMSE_n10degC,RMSE_0degC,RMSE_10degC,RMSE_25degC])
ylabel("RMSE (%)")
xlabel("Temperature (C)")
Plot the maximum absolute error for the different ambient temperatures.
nexttile
bar(temp,[MAX_n10degC,MAX_0degC,MAX_10degC,MAX_25degC])
ylabel("MAX (%)")
xlabel("Temperature (C)")
The lower the RMSE and MAX values in these plots, the more accurate the network's predictions at the corresponding temperature.
References
[1] Kollmeyer, Phillip, Carlos Vidal, Mina Naguib, and Michael Skells. "LG 18650HG2 Li-Ion Battery Data and Example Deep Neural Network XEV SOC Estimator Script." Mendeley, March 5, 2020. https://doi.org/10.17632/CP3473X7XV.3.
See Also
trainnet | trainingOptions | dlnetwork | sequenceInputLayer
Topics
- Evaluate Code Generation Inference Time of Compressed Deep Neural Network
- Battery State of Charge Estimation in Simulink Using a Feedforward Neural Network
- Sequence-to-Sequence Regression Using Deep Learning
- Sequence-to-One Regression Using Deep Learning
- Time Series Forecasting Using Deep Learning
- Sequence-to-Sequence Classification Using Deep Learning
- Deep Learning in MATLAB