
Data Science Seminar

September 7, 2017. The seminar was held entirely in English.
The LTCI Data Science Seminar is a joint research seminar between the DIG and the S2A teams. It focuses on machine learning and data science topics.

The seminar took place from 2PM to 4PM in Amphi Saphir, and featured two talks:

Talk 1: Albert Bifet (Télécom Paris): Massive Online Analytics for the Internet of Things (IoT)

You can download the slides of this talk.

Abstract: Big Data and the Internet of Things (IoT) have the potential to fundamentally shift the way we interact with our surroundings. The challenge of deriving insights from the IoT has been recognized as one of the most exciting opportunities for both academia and industry. Advanced analysis of big data streams from sensors and devices is bound to become a key area of data mining research as the number of applications requiring such processing increases. Dealing with the evolution over time of such data streams, i.e., with concepts that drift or change completely, is one of the core issues in stream mining. In this talk, I will present an overview of data stream mining and introduce some popular open source tools for it.
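To give a concrete flavour of the concept-drift issue mentioned in the abstract, here is a minimal, self-contained Python sketch. It is an illustration only, not code from the talk or from any particular open source tool: it monitors the error rate of a streaming classifier and signals a change, in the spirit of error-rate-based drift detectors such as DDM. The toy stream, the majority-class learner, and the detector parameters are all assumptions made for the example.

```python
import random

class SimpleDriftDetector:
    """Minimal error-rate drift detector, in the spirit of DDM (Gama et al., 2004).

    It tracks the running error rate of a streaming classifier and signals a
    drift when the rate rises well above its historical minimum.
    """

    def __init__(self, warmup=30, drift_factor=3.0):
        self.warmup = warmup            # examples to observe before monitoring
        self.drift_factor = drift_factor
        self.reset()

    def reset(self):
        self.n = 0                      # examples seen since the last reset
        self.errors = 0                 # misclassifications since the last reset
        self.p_min = float("inf")       # lowest error rate seen so far
        self.s_min = float("inf")       # its standard deviation

    def update(self, error):
        """Feed one 0/1 error indicator; return True when a drift is signalled."""
        self.n += 1
        self.errors += error
        if self.n < self.warmup:
            return False
        p = self.errors / self.n
        s = (p * (1.0 - p) / self.n) ** 0.5
        if p + s < self.p_min + self.s_min:
            self.p_min, self.s_min = p, s
        if p + s > self.p_min + self.drift_factor * self.s_min:
            self.reset()                # the concept changed: forget old statistics
            return True
        return False

# Prequential loop on a toy stream whose label distribution flips at t = 500.
random.seed(42)
detector = SimpleDriftDetector()
counts = [0, 0]                          # trivial majority-class "learner"
for t in range(1000):
    y = int(random.random() < (0.9 if t < 500 else 0.1))   # concept drift at t = 500
    y_hat = int(counts[1] > counts[0])   # predict the majority class seen so far
    if detector.update(int(y_hat != y)):
        print(f"drift detected around t = {t}")
        counts = [0, 0]                  # drop the outdated model and relearn
    counts[y] += 1
```

The loop follows the prequential (test-then-train) pattern common in stream mining: each example is first used to evaluate the current model and only then to update it, so the error indicator fed to the detector reflects genuine out-of-sample performance.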

Talk 2: François Roueff (Télécom Paris): Prediction of weakly locally stationary processes by auto-regression

You can download the slides of this talk.

Abstract: We introduce locally stationary time series through the local approximation of the non-stationary covariance structure by a stationary one. This allows us to define autoregression coefficients in a non-stationary context, which, in the particular case of a locally stationary Time Varying Autoregressive (TVAR) process, coincide with the generating coefficients. We provide and study an estimator of the time-varying autoregression coefficients in a general setting. The proposed estimator of these coefficients enjoys an optimal minimax convergence rate under limited smoothness conditions. In a second step, using a bias reduction technique, we derive a minimax-rate estimator for arbitrarily smooth time-evolving coefficients, which outperforms the previous one for large data sets. In turn, for TVAR processes, the predictor derived from the estimator exhibits an optimal minimax prediction rate.
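To make the setting concrete, here is a small NumPy sketch. It is an illustration of the general idea only, not the estimator analysed in the talk: it simulates a TVAR(1) process with a slowly varying coefficient, fits the autoregression coefficient locally by least squares over a sliding window, and uses the local fit for one-step prediction. The coefficient function, window length, and sample size are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a TVAR(1) process X_t = theta(t / T) * X_{t-1} + eps_t, where the
# coefficient theta varies smoothly on the rescaled time u = t / T.
T = 5000

def theta(u):
    """Slowly varying AR(1) coefficient, kept inside (-1, 1) for local stationarity."""
    return 0.8 * np.cos(2 * np.pi * u)

X = np.zeros(T)
for t in range(1, T):
    X[t] = theta(t / T) * X[t - 1] + rng.standard_normal()

# Local least-squares estimate of the AR(1) coefficient at each time t:
# regress X_s on X_{s-1} using only the last `window` observations before t.
window = 200
theta_hat = np.full(T, np.nan)
for t in range(window, T):
    past = X[t - window:t - 1]        # X_{t-window}, ..., X_{t-2}
    future = X[t - window + 1:t]      # X_{t-window+1}, ..., X_{t-1}
    theta_hat[t] = (past @ future) / (past @ past)

# One-step-ahead prediction of X_t from X_{t-1} with the locally fitted coefficient.
pred = theta_hat[window:] * X[window - 1:-1]
mse = np.mean((X[window:] - pred) ** 2)
print(f"one-step prediction MSE: {mse:.3f} (innovation variance is 1.0)")
```

Shrinking the window reduces the bias caused by the time variation of the coefficient but increases the variance of the local fit; the minimax results described in the abstract quantify how this trade-off can be balanced optimally under smoothness conditions on the coefficient functions.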