Di-Plast @ conference on state-of-the-art artificial intelligence

Leonid Schwenke and Martin Atzmueller from the University of Osnabrück presented their paper "Show Me What You’re Looking For: Visualizing Abstracted Transformer Attention for Enhancing Their Local Interpretability on Time Series Data" online at the International FLAIRS Conference. The paper, which introduces the application of a new neural network architecture to time series data, was nominated for the best paper award.

Below you will find a short summary of their work:

Neural network models still lack interpretability and remain hard to apply to real-world data. This is especially the case for time series data, which is less intuitive than images or text. A newer neural network architecture, the Transformer, introduces a so-called attention mechanism. We take advantage of this attention mechanism inside our proposed human-in-the-loop approach to reduce the complexity of the input data, helping experts to better understand the crucial decision elements in the data and in the trained neural network model. An example can be seen in the figure on the right. The approach is applicable to numeric supervised sequences (including real-world and prediction tasks) as well as to supervised categorical data (e.g. symbolic approaches like subgroups), and can therefore help our experts in the Di-Plast project to improve the Data Analytics Tool and the Data Validation Tool.
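To make the general idea more concrete, here is a minimal sketch of attention-based abstraction on a time series, not the authors' actual implementation: a toy Transformer-style encoder produces self-attention weights, and only the most attended time steps are kept as an abstracted view of the input. All names (TinyAttentionEncoder, abstract_series), the model size, and the quantile threshold are illustrative assumptions.

```python
# Illustrative sketch only (PyTorch); not the code used in the paper.
import torch
import torch.nn as nn

class TinyAttentionEncoder(nn.Module):
    """A tiny self-attention encoder over a univariate time series."""
    def __init__(self, d_model: int = 16, n_heads: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # embed each scalar time step
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, 1)
        h = self.embed(x)
        out, weights = self.attn(h, h, h)   # weights: (batch, seq_len, seq_len)
        return out, weights

def abstract_series(weights: torch.Tensor, quantile: float = 0.75) -> torch.Tensor:
    """Keep only the time steps whose average received attention is above a quantile."""
    received = weights.mean(dim=1)          # attention each time step receives
    threshold = received.quantile(quantile, dim=-1, keepdim=True)
    return received >= threshold            # boolean mask: (batch, seq_len)

# Usage with random data, just to show the shapes involved.
series = torch.randn(1, 50, 1)
model = TinyAttentionEncoder()
_, attn_weights = model(series)
mask = abstract_series(attn_weights)
print(int(mask.sum()), "of 50 time steps kept as the abstracted input")
```

In the human-in-the-loop setting described above, such a mask would then be shown to and refined by a domain expert rather than applied fully automatically.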

If you are looking for digital solutions for the uptake of recycled plastic that apply the latest research in data science, please contact us!

 
