

Process and knowledge modelling


Duration: 21 hours

ECTS Credits: 2

Semester: S7

Person(s) in charge:

Bart LAMIROY, Associate Professor

Keywords: big data, formal learning, NoSQL, Map-Reduce, ontologies, formal concept analysis

Prerequisites: algorithmics, programming, SQL, the transactional model, relational DBMS (RDBMS)


Objectives: Understanding the models and acquiring the knowledge necessary for handling massively distributed data.



Program and Contents:

Acquiring a general knowledge of the main approaches used to deal with big data.

First part (Complex data)

1. Knowledge characterisation

2. Logical reasoning

3. Formal concept analysis
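As a rough illustration of what formal concept analysis computes, the following sketch enumerates the formal concepts (extent/intent pairs) of a toy object–attribute context. The context and all names are invented for the example; this is not course material, just a minimal demonstration of the Galois connection between object sets and attribute sets:

```python
from itertools import combinations

# Toy formal context: objects mapped to the attributes they possess.
context = {
    "sqlite":  {"relational", "embedded"},
    "mysql":   {"relational", "server"},
    "mongodb": {"document", "server"},
}

def common_attributes(objects):
    """Intent of a set of objects: attributes shared by all of them."""
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else set()

def objects_with(attributes):
    """Extent of a set of attributes: objects possessing all of them."""
    return {o for o, attrs in context.items() if attributes <= attrs}

# A pair (extent, intent) is a formal concept when each set
# determines the other through the two derivation operators above.
concepts = set()
for r in range(len(context) + 1):
    for objs in combinations(sorted(context), r):
        intent = common_attributes(objs)
        extent = objects_with(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

For instance, the pair ({sqlite, mysql}, {relational}) is a concept: the two objects share exactly that attribute, and it picks out exactly those two objects. Real FCA libraries build the full concept lattice more efficiently, but the closure idea is the same.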

Second part (Big Data)

1. NoSQL: introduction to BASE vs. ACID, the CAP theorem

2. Technical solutions for scaling, Map-Reduce

3. Case study 1: key-value stores, document databases

4. Case study 2: column-oriented databases, graph databases
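As a minimal sketch of the Map-Reduce model mentioned in item 2 (a single-process illustration, not a distributed implementation), the classic word-count example separates a map phase emitting (key, 1) pairs from a reduce phase aggregating counts per key:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: group pairs by key and sum the counts per word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data", "big graphs", "data stores"]
print(reduce_phase(map_phase(docs)))
# → {'big': 2, 'data': 2, 'graphs': 1, 'stores': 1}
```

In a real framework such as Hadoop, the map and reduce functions keep this shape, but the shuffle/grouping step between them is performed across machines, which is what makes the model scale.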




Description and operational verbs


  • Distributed data access optimization mechanisms
  • Main approaches for extracting non-static data from raw data
  • The technological fundamentals of exploiting massively distributed data
  • The relationship between technological solutions and the social and economic issues of networks and clouds
  • Usage scenarios on concrete cases with real-world constraints
  • Knowledge schemas and modalities for extracting newly created knowledge
  • Adequate solutions: their quality, limits and performance, as well as their relevance compared with alternative modelling approaches

Evaluations:

  • Written test
  • Continuous Control
  • Oral Report
  • Project
  • Written Report