AI4PublicPolicy Components pt.3
In our previous blog post on the main components of the AI4PublicPolicy platform, we described the Cross-Country Interoperability and the Datasets and Policies Catalogue components – you can find the second blog post on the platform components here.
In this third part of the analysis of the AI4PublicPolicy platform components, two more components are analysed.
XAI (eXplainable AI) Techniques
This component provides the means to analyse and explain the predictions made by machine learning models. The XAI component takes a model (classifier, regressor, or decision tree) and generates a dashboard for inspecting the model's performance and its SHAP values, which rely on input perturbations to explain the model's output. The dashboard presents a set of interactive views the user can control. Examples of the plots shown are: model performance, the SHAP dependence plot, the impact of a feature on a predicted value, SHAP interactions, feature importance, etc. A set of default dashboards is developed for each model type, as well as customised dashboards for the project pilots. The software initially considered for implementing this component is ExplainerDashboard, which can be integrated with the Jupyter notebooks used by the front end of the AI4PublicPolicy platform, the VPME. An example of the dashboard is given in the following figure:
The predefined dashboards receive the type of model (classifier, regressor, or decision tree), the model itself, and a dataset.
The XAI component produces an interactive dashboard integrated with Jupyter notebooks.
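As a rough sketch of the perturbation idea behind SHAP values, the snippet below ablates one feature at a time and measures the change in a stand-in model's output. Note that real SHAP averages over all feature coalitions, and the model and numbers here are invented purely for illustration – this is not the ExplainerDashboard API:

```python
# Toy illustration of perturbation-based explanation: replace one feature
# at a time with its background (reference) value and record how the model
# output changes. Real SHAP values average over all feature coalitions;
# this single-feature ablation is only a simplified sketch.
import numpy as np

def model(x):
    # Hypothetical stand-in for a trained regressor: a fixed linear scorer.
    weights = np.array([2.0, -1.0, 0.5])
    return float(x @ weights)

def ablation_importance(x, background):
    """Per-feature effect: model(x) minus model(x with feature i ablated)."""
    base = model(x)
    effects = []
    for i in range(len(x)):
        perturbed = x.copy()
        perturbed[i] = background[i]        # ablate feature i
        effects.append(base - model(perturbed))
    return effects

x = np.array([1.0, 2.0, 4.0])
background = np.zeros(3)
print(ablation_importance(x, background))   # [2.0, -2.0, 2.0]
```

A feature-importance bar plot in the dashboard conveys the same information: how much each input drives the prediction away from a baseline.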
The following figure shows the eXplainable AI flow:
Policy Explainability and Interpretation
This component provides the tools to build the policy models that analyse and interpret the policy datasets supplied by the project's pilots (Athens, Lisbon, Nicosia, Genova, and Burgas). In particular, the policy models produce insights drawn from the datasets, as well as forecasts for each of the scenarios presented below. The policymaker can thus understand the patterns in the data and the models' forecasts in order to fine-tune or shape future policies.
The different scenarios described for each pilot are created by inspecting the available datasets; they may be transformed once the use cases are finalised from the provided user stories.
Among the technologies to be used for implementing the tools are Machine Learning and Deep Learning algorithms (e.g., clustering algorithms, QARMA (INTRASOFT's ML algorithm), Artificial Neural Networks, etc.), implemented through the Keras and TensorFlow libraries. For optimisation, widely known solvers such as Gurobi, SCIP, and OR-Tools will be used.
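To illustrate the kind of problem such solvers address, the toy below brute-forces a p-median choice of polling sites, as in the polling-location scenario described later. The instance is invented for illustration; a real pilot would formulate this as a mixed-integer program for Gurobi, SCIP, or OR-Tools rather than enumerate combinations:

```python
# Toy p-median problem: choose k polling sites from a candidate set so that
# the total citizen-to-nearest-site distance is minimised. Positions are 1-D
# and invented; brute force only works for tiny instances like this one.
from itertools import combinations

def total_distance(citizens, sites):
    # Each citizen is assigned to their nearest open site.
    return sum(min(abs(c - s) for s in sites) for c in citizens)

def best_polling_sites(citizens, candidates, k):
    # Enumerate all k-subsets of candidate sites and keep the cheapest.
    return min(combinations(candidates, k),
               key=lambda sites: total_distance(citizens, sites))

citizens = [0, 1, 2, 10, 11, 12]     # two clusters of residents
candidates = [0, 5, 11]              # candidate polling buildings
print(best_polling_sites(citizens, candidates, 2))   # (0, 11)
```

A solver would express the same objective with assignment variables and solve it at city scale, but the objective being optimised is the same.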
The component takes as input datasets from the pilots, related both to the pilots' problem scope/use cases and to the project execution activities.
The component outputs the policy models built from the input datasets. These models analyse the data and produce a variety of visualisations and forecasts that the policymaker can use to reshape existing policies or create new ones.
Indicative examples of policies that can be drawn from each scenario are the following:
Based on the forecasts of available parking spaces, a policy can be made to raise or lower parking prices, either to increase the municipality's profit or to improve parking-space availability.
Based on the optimal-polling-locations policy model, a policymaker can relocate polling stations for the citizens' convenience.
By creating and visualising a sustainability index, a policymaker gains a clear view of the index across the city's areas and can produce policies to improve the scores of areas with low values.
The policy model can forecast a pipe's remaining useful life (RUL), or the points at which a pipe may leak. From these forecasts, a policy can be drawn up to proactively repair or replace the affected pipes.
By analysing bus routes, parking-space revenues and occupancy, and environmental performance, it is possible to build visualisations that surface hidden insights in the data, and to build ML models and optimisation algorithms that predict the best way to travel from a point A to a point B – the most environmentally friendly and cost-effective one. Furthermore, the aim is to develop better city services for disabled people and enhance their quality of life.
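The parking scenario above can be sketched as a simple pricing rule driven by a forecast. The thresholds and rates below are invented for illustration only; a real pilot would calibrate such a rule from the policy model's forecasts and the municipality's objectives:

```python
# Hedged sketch: turn a forecast parking-occupancy rate into a tariff
# adjustment. All thresholds and multipliers are hypothetical.

def parking_price(base_price, forecast_occupancy):
    """Raise the tariff when forecast occupancy is high, lower it when low."""
    if forecast_occupancy > 0.85:       # spaces scarce: discourage demand
        return round(base_price * 1.25, 2)
    if forecast_occupancy < 0.40:       # spaces abundant: attract drivers
        return round(base_price * 0.80, 2)
    return base_price                   # otherwise keep the current tariff

print(parking_price(2.00, 0.90))   # 2.5
print(parking_price(2.00, 0.30))   # 1.6
print(parking_price(2.00, 0.60))   # 2.0
```

Whether the rule targets profit or availability is exactly the policy choice the component's forecasts are meant to inform.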