Energy is becoming a global issue for every human activity. On-wheel transportation is at the center of the discussion for companies and governments because fossil-fuel consumption, pollution, and greenhouse-gas emissions are negatively affecting life on our planet.
Even the most efficient engines cannot reach 40% efficiency, which means that more than 60% of the available energy is wasted as heat.
The goal is to develop more sophisticated systems that can collect and distribute energy from different sources to different users, as shown in the diagram, deciding how to manage the flows according to the current user's needs and evaluating as much data as possible to reach the best optimization.
Our approach aims to identify all the inputs that are likely to affect the energy demand, both at the current instant and in the future; this information depends on both the vehicle's condition and the external environment.
Our idea is to balance the options that offer the best time to market against the level of optimization the controller can reach.
In this sense, we propose a hybrid MPC-ML control system: a model predictive controller runs a simplified vehicle model with reasonable computing resources, while a deep neural network trained with reinforcement learning provides an extra push toward optimization and adapts over time.
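To make the idea concrete, the following is a minimal sketch of how such a hybrid controller might be wired together. All names, models, and numbers here are illustrative assumptions, not the actual implementation: a short-horizon MPC searches over candidate engine power settings for a toy series-hybrid battery model, while a stub function stands in for the trained RL network that corrects the predicted power demand.

```python
def predicted_demand(step):
    """Nominal power-demand forecast [kW] from the simplified vehicle model
    (a toy periodic drive-cycle, purely for illustration)."""
    return 20.0 + 5.0 * (step % 3)

def rl_correction(step):
    """Stand-in for the RL network: a residual demand correction [kW].
    A real network would map vehicle and environment state to this value."""
    return 1.5

def mpc_step(soc, horizon=5, candidates=(0.0, 10.0, 20.0, 30.0, 40.0)):
    """Pick the engine power [kW] that minimizes fuel use plus battery
    state-of-charge deviation over the horizon, by exhaustive search
    over a constant engine setting (cheap enough for embedded hardware)."""
    best_u, best_cost = None, float("inf")
    for u in candidates:
        s, cost = soc, 0.0
        for k in range(horizon):
            demand = predicted_demand(k) + rl_correction(k)
            s += 0.01 * (u - demand)                 # toy SoC dynamics
            s = min(max(s, 0.0), 1.0)                # physical SoC limits
            cost += 0.1 * u + 50.0 * (s - 0.6) ** 2  # fuel + SoC tracking
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

u = mpc_step(soc=0.6)  # engine power applied for the current control step
```

In a receding-horizon scheme, only the first control action is applied and the optimization is repeated at the next step; the RL term lets the simplified model stay cheap while the network absorbs what the model misses.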