● Development and management of an application dedicated to analysing the real-estate market and
the socio-demographic environment of a given location (WISE).
● Construction of interactive maps combining indicators and automated reports to enable full analysis
of a territory: housing stock, listings, transactions, businesses, amenities, open data...
● Support for business teams in data analysis and data valorisation: projects, use cases,
data exploration, training and best practices around TIBCO Spotfire and Tableau.
Training on Kaggle challenges: recommendations and predictions on user behavior, testing formulas
for service-fee calculation (clustering, random forests, agglomerative hierarchical clustering...)
using R and Spark.
Geolocation and use of mapping packages in R (cartography, maptools, ggmap, plotGoogleMaps …).
Installation, deployment and testing of Drill and Jethro.
Installation of Zeppelin and creation of notebooks (Spark, SparkR, R, Python, Hive, SQL and shell).
Upgrade of the Cloudera platform from 5.3 to 5.7.
Self-study of Tableau Software.
Implementation of H2O deep learning algorithms.
Tests on AtScale virtual machines.
Installation and setup of a Hortonworks platform (4 machines) to retrieve information and raise
employee awareness of Internet risks (category, country, source, vulnerability and website
localization).
Import of data (PostgreSQL databases, files...) with Sqoop and storage in HBase tables.
Data ingestion with a Pig script dedicated to loading data from HBase, calling Web Services (ALEXA,
Xforce) to enrich it, and inserting the results into HDFS and HBase.
Export with Sqoop to load restitution tables (PostgreSQL).
Building of indicators (website categorization, country, sources, vulnerability...) and data
visualization with Tableau Software.
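As an illustration of the Sqoop steps above, the import and export invocations can be assembled as command lines; the connection strings, table names and helper functions below are hypothetical placeholders, not the project's actual scripts:

```python
def sqoop_import_cmd(jdbc_url, table, hbase_table, column_family):
    """Build a Sqoop command importing an RDBMS table into an HBase table."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,          # e.g. a PostgreSQL JDBC URL
        "--table", table,
        "--hbase-table", hbase_table,
        "--column-family", column_family,
        "--hbase-create-table",         # create the HBase table if missing
    ]

def sqoop_export_cmd(jdbc_url, table, export_dir):
    """Build a Sqoop command exporting an HDFS directory into an RDBMS table."""
    return [
        "sqoop", "export",
        "--connect", jdbc_url,
        "--table", table,
        "--export-dir", export_dir,
    ]

# The resulting command lists would be run (e.g. with subprocess.run)
# on a node where the Sqoop client is installed.
```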
Consulting to identify the best tool for data visualization and data exploration/mining.
Expertise, use cases, data mining, development, training (more than 50 people) and best practices
around TIBCO Spotfire.
Connection to Big Data environments (data lake, Netezza databases...).
Implementation of analyses with TIBCO Spotfire for the « Statistical Services » team responsible for
claims prediction: data integration, data enrichment calling R and SAS scripts, and mapping visualizations
(including district segmentation).
Integration of data, production of various analyses, export of results, distribution via Web browsers.
Management of produced analyses (maps, reports, trend curves, heat maps, tree maps, dashboards...).
Clustering, scripting and integration of statistical models.
Twitter analysis: streaming of tweets with Spark Streaming and Flume, clustering using Spark
Machine Learning scripts.
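The tweet-clustering step relied on Spark's machine-learning library; the k-means loop it applies can be sketched in plain Python as a minimal illustration (the feature vectors and k below are invented, and Spark MLlib performs the same rounds distributed over RDDs):

```python
import random

def kmeans(points, k, iterations=20, seed=0):
    """Plain-Python k-means: assign each point to its nearest centroid,
    then recompute each centroid as its cluster mean, for a fixed
    number of rounds."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # A cluster left empty keeps its previous centroid.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters
```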
Retrieval of website content (Nutch) for text mining with Mahout, analysis of the information with
Spark, and indexing in a search engine (Elasticsearch, Solr) for faster access.
Interaction with traditional databases: Sqoop imports and exports.
Building of a Predictive Analysis Impact model on a Big Data platform based on Cloudera and
Hadoop to compute correlations between aircraft parts (AIRBUS refactoring needs).
Implementations using Hadoop ecosystem: data management (HDFS, Yarn) and data access
(MapReduce, Hive, Pig, Impala, HBase, Mahout and Spark).
Running of Mahout algorithms (frequent pattern mining, k-means, collaborative filtering) for data
mining, clustering, and classification.
Alternative solution scripted in Python for AIRBUS modification impacts.
Deployment on the Amazon Web Services platform.
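The correlation computation at the core of the impact model can be sketched in plain Python; the part names and modification counts below are invented for illustration, not AIRBUS data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical modification counts for two parts across five change requests:
part_a = [3, 5, 2, 8, 7]
part_b = [2, 6, 1, 9, 6]
correlation = pearson(part_a, part_b)  # close to 1.0: modifying one part
                                       # tends to impact the other
```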
• Definition of KPIs (Key Performance Indicators) using TIBCO Spotfire.
• Development of « Information Links », building of visualizations and analyses.
• Trainings on TIBCO Spotfire.
• Management of the services platform built on TIBCO ActiveMatrix.
• Definition of SOA platform standards, installation and configuration of production environments (information bus based on TIBCO EMS).
• Operating procedures, scripts, monitoring, performance testing, deployment management.
• Transfer of skills, training on the platform’s architecture and tools.
• Development and maintenance, using TIBCO Business Works, of a Front-to-Back Office EAI within BNP Paribas (information bus based on « Rendez-vous »).
• Maintenance of productivity tools (monitoring, audit).
• Interaction with Oracle, SQL Server and Sybase databases.
• Training on TIBCO suite.