Workshop

Workshop 1

  • COMING SOON

    09h30 - 10h00
    View in details
  • COMING SOON

    10h00 - 10h30
    View in details
  • COMING SOON

    10h30 - 11h00
    View in details
  • The latest Google Cloud solutions to host your legacy workloads

    11h00 - 11h30

    Would you like to migrate your Big Data infrastructure to the Cloud without being blocked by certain technologies (especially licensing, versions, or administration)? Let's look at how the latest Google Cloud solutions and partnerships allow you to migrate or rehost your legacy workloads and databases seamlessly.

    During this session, the following topics will be covered:
    ● Possible migrations
    ● Certified hardware
    ● Deployment
    ● Licensing
    ● Infrastructure: Bare Metal Solution
    ● Managed Partner Services
    ● Latency
    ● Installation, configuration, management and support

    Speaker: Maxime Lanciaux, Sales Engineer, Google Cloud

    View in details
  • Ab Initio: Develop once, run anywhere

    11h30 - 12h00
    View in details
  • COMING SOON

    12h00 - 12h30
    • Franck NGANIET Team Leader WaaS CA-GIP
    • José ALFARO Technical director FINAXYS
    View in details
  • How to build your agile, scalable data platform in the cloud

    12h30 - 13h00

    At a time when data is the main driver of the digital age, agility and the ability to deploy new technologies have become essential. The convergence of PaaS Cloud platforms provides an ideal environment to meet these challenges, offering a broad set of managed services, mature data architectures and agile methods.

    Micropole helps you create highly scalable and secure enterprise data platforms in the Cloud, combining traditional Business Intelligence and Big Data processes with advanced analytical capabilities, starting with AI.

    In our workshop, learn how to leverage your data and implement an agile data platform to handle any type of use case, whether analytical or predictive, in batch or real-time mode.

    Speaker: Thomas Dallemagne, Director, AWS Excellence Center, Micropole

    View in details
  • Self Service Data Analytics at Digital Virgo with Looker, BigQuery and Google Analytics 360

    13h00 - 13h30

    During this conference, Digital Virgo will share their experience with the Looker data platform. You'll discover how Looker helps democratize access to data thanks to the LookML modelling layer and the Explore interface. The main use case will highlight how Looker pairs well with BigQuery for analyzing multi-terabyte Google Analytics data.
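
    As a hedged, generic illustration of the BigQuery side of such an analysis (not material from the session; the project, dataset and query below are hypothetical placeholders), a GA360 export can be queried with the google-cloud-bigquery Python client:

        # Minimal sketch: aggregate sessions and transactions from a GA360
        # export in BigQuery. Project and dataset names are hypothetical.
        from google.cloud import bigquery

        client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

        sql = """
        SELECT
          device.deviceCategory     AS device_category,
          SUM(totals.visits)        AS sessions,
          SUM(totals.transactions)  AS transactions
        FROM `my-gcp-project.analytics_export.ga_sessions_*`  -- GA360 daily export tables
        WHERE _TABLE_SUFFIX BETWEEN '20200101' AND '20200131'
        GROUP BY device_category
        ORDER BY sessions DESC
        """

        for row in client.query(sql).result():
            print(row.device_category, row.sessions, row.transactions)

    In Looker, the equivalent aggregation would typically live in the LookML model rather than in hand-written SQL.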

    Speaker: Anthony GIANASTASIO

    View in details
  • COMING SOON

    13h30 - 14h00
    View in details
  • COMING SOON

    14h00 - 14h30
    View in details
  • Quantum Supremacy and Beyond

    14h30 - 15h00

    Often seen as a key milestone for quantum computing, quantum supremacy was achieved in 2019. It’s an opportunity to:
    - Understand how quantum physics gave birth to quantum computing
    - See how it will impact Big Data and what kinds of use cases will be unlocked
    - Understand why quantum supremacy is meaningful
    - Learn what is possible today, and what is coming

    Speakers: Pierre Frouge, Customer Engineer, Google Cloud, and Damien Roux, Customer Engineer, Google Cloud Platform

    View in details
  • COMING SOON

    15h00 - 15h30
    View in details
  • COMING SOON

    15h30 - 16h00
    View in details
  • Would you trust your model with your life?

    16h00 - 16h30

    The development of artificial intelligence (AI) models is becoming increasingly popular in our industries, with a real impact on our daily lives. The "explainability" and interpretability of AI models are thus being questioned today. While in many applications the ramifications of failure are quite low (for example, a movie recommender fails to provide an accurate prediction), it is imperative that safety-critical systems, such as automated driving, aerospace, or medical devices, not only get the right answer, but also explain how a final decision was made.
    Engineers and scientists must be able to thoroughly understand and investigate a model before putting it into production. In many industries, there are regulatory requirements that must be met before a model is allowed into production. The higher the ramifications of failure, the greater the need to fully explain the model's behavior, and a "black-box" model will not suffice.
    Several techniques to improve the explainability of models will be discussed during this session, from simple models to more advanced deep learning models, with proven visualizations as well as models tested and approved by experts. Emphasis will be placed on the ability to communicate the data properties, decisions and results of the model, detailing not only the inner workings of the model, but also how it was trained, different validation techniques, comparisons with other models, how the data were collected, and the importance of training, validation and test sets.
    An explainable model is only one part of a complete AI system: a production-ready model must be integrated into a much larger system and may need to be run on specific hardware such as GPUs, FPGAs or the cloud. In this session, using MATLAB, we will lay the foundation for the steps required to implement such a system: from model testing and ease of explanation to system deployment and design.
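
    As a generic illustration of one of the simplest explainability techniques in this space (the session itself uses MATLAB; this is a plain scikit-learn sketch, not the speaker's workflow), permutation feature importance shows which inputs a trained model actually relies on:

        # Generic sketch: permutation feature importance with scikit-learn.
        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split

        X, y = load_breast_cancer(return_X_y=True, as_frame=True)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

        # Shuffle each feature on the held-out set and measure the accuracy drop:
        # large drops indicate features the model genuinely depends on.
        result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
        ranked = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])
        for name, score in ranked[:5]:
            print(f"{name}: {score:.3f}")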

    Speaker: Pierre Harouimi, Application Engineer specializing in data analytics and finance

    View in details
  • COMING SOON

    16h30 - 17h00
    View in details
  • COMING SOON

    17h00 - 17h30
    View in details
  • Powering Data Experiences To Drive Growth

    17h30 - 18h00
    View in details
  • Get an agile data stack in weeks with no Data Engineer

    18h00 - 18h30

    Speakers: Nolwenn Belliard, Data Ops at Tyller Systems, and Bruno Emsellem, CTO at Tyller Systems.

    View in details

Workshop 2

  • COMING SOON

    09h30 - 10h00
    View in details
  • Digital Twins in the strategy of Mercedes-AMG Petronas Motorsport, Formula 1 Constructors' World Champion

    10h00 - 10h30
    Many well-known factors contribute to the performance of a Formula 1 car - design, aerodynamics, configuration, strategy - to which a decisive new element has been added: advanced simulation.
    Mercedes-AMG Petronas Motorsport invests in digital simulators that reproduce the track experience. Find out how TIBCO’s analytical and artificial intelligence technologies help shape digital twins and forge a winning racing strategy.
    • Franck Léonard, Senior Consultant, TIBCO
    View in details
  • COMING SOON

    10h30 - 11h00
    View in details
  • COMING SOON

    11h00 - 11h30
    View in details
  • How to create a fraud-free environment with automated anti-fraud processes

    11h30 - 12h00

    Discover how, with the Alteryx platform, you will be able to quickly decrease the cost of fraud.
    In this session we will present and demonstrate two use cases (a simple sketch of the first follows this list):
    • Geospatial analysis, with distance calculations between customers and branches, to identify abnormal situations in credit acceptance
    • Predictive analysis with a Machine Learning algorithm to identify bank transactions that are at risk
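
    A minimal plain-Python sketch of the geospatial idea (Alteryx itself is a visual, no-code platform; the records and the 100 km threshold below are hypothetical):

        # Flag credit applications where the customer lives suspiciously far
        # from the branch that accepted the application.
        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two (lat, lon) points, in kilometres."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        # Hypothetical records: (customer_id, customer lat/lon, branch lat/lon)
        applications = [
            ("C001", 48.8566, 2.3522, 48.8600, 2.3400),  # Paris customer, Paris branch
            ("C002", 43.2965, 5.3698, 48.8600, 2.3400),  # Marseille customer, Paris branch
        ]

        THRESHOLD_KM = 100  # assumption: beyond this, the application is flagged for review
        for cust_id, clat, clon, blat, blon in applications:
            d = haversine_km(clat, clon, blat, blon)
            if d > THRESHOLD_KM:
                print(f"{cust_id}: {d:.0f} km from branch -> flag as abnormal")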

    • Stéphane Portier Sales engineer ALTERYX
    View in details
  • COMING SOON

    12h00 - 12h30
    View in details
  • Analyzing video content using neural networks with web data

    12h30 - 13h00

    We will introduce the basics of video analysis using neural networks and explain an innovative approach to learning with minimal human intervention, using Web data and synergies between text metadata and visual video data.

    Speaker: Nicolas Chesneau, Technical Lead Machine Learning, Ekimetrics

    View in details
  • COMING SOON

    13h00 - 13h30
    View in details
  • COMING SOON

    13h30 - 14h00
    View in details
  • COMING SOON

    14h00 - 14h30
    View in details
  • COMING SOON

    14h30 - 15h00
    View in details
  • COMING SOON

    15h00 - 15h30
    View in details
  • Integration of Machine learning algorithms in a Big Data environment

    15h30 - 16h00

    What are the future challenges? What are the limits of distributing and parallelizing these computations and data?

    Speaker: Fahd ESSID, CTO, LANSROD

    View in details
  • COMING SOON

    16h00 - 16h30
    View in details
  • Digital transformation: a real challenge

    16h30 - 17h00

    An illustration, through a client case study, of the deployment of a Big Data platform specialized in personal data - GDPR as code - which led the customer to think outside the box and address its digital transformation challenge in a centralized way. This case study demonstrates that, with such an approach, it is possible to go beyond the analytical exploitation of personal data.

    • Hugues Levental NC BLACK TIGER
    View in details
  • Accelerate big data initiatives with the virtual data lake, or the role of data virtualization in data access

    17h00 - 17h30
    View in details
  • AB testing & CRO with Advanced Analytics

    17h30 - 18h00

    CRO (Conversion Rate Optimization) is a key lever for improving website performance, by boosting the number of buyers relative to the number of visitors.

    Within large website organizations, CRO is often led by product teams, with strong UI/UX approaches and plenty of great tools (Google Analytics, ContentSquare, Kameleoon, AB Tasty, …).

    Everyone knows that data is key to making CRO effective, but the required analyses are often quite complex, as they call for both a strong statistical/technical background and a good sense of business priorities. As a result, A/B test analysis is often the prerogative of costly and busy resources within analytics or consultancy teams.

    With DataMa, the Pierre et Vacances Center Parcs Analytics team has managed to get a step ahead in providing Product teams with actionable analytics, wherever the data comes from.

    This workshop will give you a glimpse of how, in a few days, you could:

    • Attribute value to each page and prioritize A/B tests based on this value
    • Build an automated A/B test readout tool for both server-side and client-side tests (a minimal statistical sketch of such a readout follows this list)
    • Understand the drivers behind those readouts to inform the next version
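
    A hedged sketch of the statistical core of an automated readout (a generic two-proportion z-test, not DataMa's actual methodology; the traffic and conversion figures are hypothetical):

        # Minimal A/B test readout: compare conversion rates of control (A)
        # and treatment (B) with a pooled two-proportion z-test.
        from math import sqrt
        from scipy.stats import norm

        def ab_readout(conv_a, n_a, conv_b, n_b, alpha=0.05):
            p_a, p_b = conv_a / n_a, conv_b / n_b
            p_pool = (conv_a + conv_b) / (n_a + n_b)
            se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
            z = (p_b - p_a) / se
            p_value = 2 * norm.sf(abs(z))  # two-sided test
            lift = (p_b - p_a) / p_a
            verdict = "significant" if p_value < alpha else "not significant"
            return f"lift {lift:+.1%}, p={p_value:.4f} ({verdict} at alpha={alpha})"

        # Hypothetical readout: 1.20% vs 1.38% conversion over 50,000 visitors each
        print(ab_readout(conv_a=600, n_a=50_000, conv_b=690, n_b=50_000))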

    Speakers: Fanjuan Shi (Pierre et Vacances Center Parcs), Matthieu Barrue (Pierre et Vacances Center Parcs), Guillaume de Bénazé (DataMa)

    View in details
  • KILI TECHNOLOGY

    18h00 - 18h30
    View in details

Workshop 3

  • CAP DIGITAL

    09h30 - 10h00
    View in details
  • COMING SOON

    10h00 - 10h30
    • Mael Ropars Principal Solution engineer Cloudera
    View in details
  • COMING SOON

    10h30 - 11h00
    • Faiza BOUFROURA, Head of Data Valorization, AAA DATA
    • Nicolas ROTY, Data Scientist, AAA DATA
    • Nathalie COSTA, Director of Development, Groupe Estia
    View in details
  • Operational Excellence with Data at Scale

    11h00 - 11h30

    Data plays a key role in growth and innovation. Yet, most companies make it exceptionally difficult to work with data for everyone in the organization. Data democratization can help alleviate the pain felt by the "data-poor" and your overtaxed Data Team. In this talk you'll learn actionable methods to help your organization excel at using data to drive growth and innovation while simultaneously increasing efficiency.

    Speaker: Christy Haragan, Solutions Engineer EMEA

    View in details
  • COMING SOON

    11h30 - 12h00
    • Julien Sigonney, Regional Director France, DATAROBOT
    • Stéphan André Data Scientist DATAROBOT
    View in details
  • How to fail your migration in the Cloud in 5 steps

    12h00 - 12h30

    More and more companies want to migrate their data to the cloud in order to optimize their costs.
    In an era where data is generated at an exponential rate, it is increasingly difficult to migrate successfully. JEMS was one of the first companies present in the Big Data market and therefore has very strong experience, with more than 40 projects migrated to the cloud so far.
    This conference was built from that experience. We will explain how to fail your Big Data migration in 5 steps, because we learn much more from our failures than from our successes.

    • AYMEN GHADGHADI VP Cloud Computing JEMS GROUP
    View in details
  • Towards AI Standards: Why Context Is Critical for Artificial Intelligence

    12h30 - 13h00

    COMING SOON

    • Nicolas Rouyer Senior Pre-Sales NEO4J
    View in details
  • COMING SOON

    13h00 - 13h30
    View in details
  • Text data analysis: qualitative approach or machine learning? A false dilemma

    13h30 - 14h00
    View in details
  • COMING SOON

    14h00 - 14h30
    View in details
  • OneTrust

    14h30 - 15h00
    View in details
  • 3 ways to use your data in a dashboard: drive, explore and communicate.

    15h00 - 15h30

    DigDash is a French software vendor offering an agile dashboard, data visualization, exploration and analytics solution. After a presentation of DigDash, we will address several points through concrete examples.

    ● Driving with dashboards
    ● Data exploration and manipulation
    ● Communicating data to a large audience

    Discover the tool that is revolutionizing the way our customers work. Across all sectors, it gives companies clear visibility into their data.

    • Antoine Buat, Co-founder and CEO, DIGDASH
    View in details
  • COMING SOON

    15h30 - 16h00
    • Alain Biancardi VP Sales EXPERT SYSTEM
    View in details
  • COMING SOON

    16h00 - 16h30
    View in details
  • COMING SOON

    16h30 - 17h00
    View in details
  • Why Data Virtualization has become essential in all data strategies

    17h00 - 17h30

    Denodo, leader and pioneer of Data Virtualization, talks to CxOs, CDOs, architects, data scientists and developers.

    In this workshop, discover how Data Virtualization (DV) allows you to advance your data initiatives and the evolution of your technical infrastructure at the same time. Unlock access to your corporate data, optimize your big data strategies and your data governance while tracking data usage, and easily manage cloud migration projects without impacting the business.

    With the testimony of Vincent Boucheron, independent Data Influencer and former Head of IS Data Services at a large bank, discover how he reduced time to data thanks to data virtualization, and all the gains achieved in terms of self-service, governance and security.

    Speakers:

    • Denodo
    • Vincent Boucheron - Data Influencer & Managing Partner, D.I.A.M.S
    View in details
  • Databricks & WANdisco - Building Reliable Data Lakes at Scale with Delta Lake

    17h30 - 18h00

    Hadoop data lakes face significant data reliability challenges, with very few projects delivered to production (10 to 20% on average) due to the inability to meet the required SLAs. Failure to address these challenges effectively can adversely impact analytics and Machine Learning initiatives.

    Delta Lake is an open-source storage layer that brings reliability to data lakes by providing ACID transactions, scalable metadata handling, and unifying streaming and batch data processing. Delta Lake runs on top of your existing data lake (AWS S3, Azure Blob, GCP DataProc) and is fully compatible with Apache Spark APIs.

    WANdisco & Databricks will present the best existing way to achieve a smooth transition of your Hadoop analytics workloads to Delta Lake (with no disruption to existing applications in production), and then significantly accelerate your ML & AI use cases with better performance, scalability, and reliability.
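
    A minimal sketch of the open-source Delta Lake API on Spark described above (assumes a Spark session launched with the Delta package, e.g. pyspark --packages io.delta:delta-core_2.12:0.8.0; the storage path is a hypothetical placeholder):

        # Delta Lake tables are read and written through the regular Spark APIs.
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("delta-demo").getOrCreate()
        path = "/tmp/delta/events"  # could equally be s3a:// or abfss:// storage

        # Batch write: an ACID transaction on top of plain object storage.
        df = spark.createDataFrame([(1, "click"), (2, "view")], ["id", "event"])
        df.write.format("delta").mode("overwrite").save(path)

        # Batch read with the standard Spark reader.
        spark.read.format("delta").load(path).show()

        # Time travel: read the table as of an earlier version.
        spark.read.format("delta").option("versionAsOf", 0).load(path).show()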

    Speakers

    • Seifeddine SAAFI - Solutions Architect - Databricks
    • Pierre TROUFFLARD - Territory Manager - WANdisco
    View in details
  • SOYHUCE

    18h00 - 18h30
    View in details

Workshop 4

  • COMING SOON

    09h30 - 10h00
    View in details
  • COMING SOON

    10h00 - 10h30
    View in details
  • Leveraging DataOps to Deliver an Omnichannel Customer Experience

    10h30 - 11h00

    Wynd is the one-stop solution for global retailers’ omnichannel transformation, enabling them to handle their cash-in and fulfilment. Thanks to Wynd’s software, retailers can connect their stores with their website or mobile apps and offer innovative services to their customers, such as home delivery or click and collect. Meanwhile, they save money on stock levels, smooth out order management and turn their store associates into assets.

    Having real time, unified data on stocks, orders, customers and capacity is crucial to building an omnichannel customer experience. Integration and stability can prove challenging in complex networks of stores and logistics centers across multiple industries.

    Wynd has partnered with StreamSets to solve this challenge. Wynd uses PHP workers to retrieve the various kinds of customer data and then integrates them with the StreamSets platform. Together, Wynd and StreamSets deliver relevant and easily maintainable solutions without slowing down the completion times of data integration flows, and that is how StreamSets is helping Wynd build a DataOps practice.

    View in details
  • COMING SOON

    11h00 - 11h30
    View in details
  • COMING SOON

    11h30 - 12h00
    View in details
  • COMING SOON

    12h00 - 12h30
    View in details
  • 6 Reasons Not All Search and AI-Driven Analytics are Created Equal

    12h30 - 13h00

    It’s easy to add a search box. Making search work for analytics? That’s a whole lot harder, especially when you factor in the complex needs of the enterprise. Learn how ThoughtSpot’s unique approach to search-driven analytics delivers lightning-fast answers on billions of rows of enterprise data.

    Speaker: Alexandre PICARD, Sales Engineer at ThoughtSpot

    View in details
  • Discover the testimonial of a leader in the banking sector, Société Générale: feedback on a Client Data Hub in the Azure Cloud

    13h00 - 13h30

    Faced with ever-increasing data volumes and the emergence of new competitors, Société Générale is taking advantage of technological innovations to gain comprehensive knowledge of its customers, products and points of contact.

    Micropole supports Société Générale in accelerating the use of its data and breaking down data silos. This data integration and governance platform has enabled the Finance Department to acquire a 360° view of all financial and non-financial information.

    Discover, through a cross-discussion, how the Cloud enables Société Générale to regain control and exploit its customer data to its full potential.

    Speakers:

    • Ludovic Moullé, Head of IS, Société Générale
    • Jean-Michel Franco, Sr Director, Product Marketing, Talend
    • Bertrand Rolain, Director Microsoft Cloud Excellence Center, Micropole
    View in details
  • COMING SOON

    13h30 - 14h00
    View in details
  • COMING SOON

    14h00 - 14h30
    View in details
  • COMING SOON

    14h30 - 15h00
    View in details
  • COMING SOON

    15h00 - 15h30
    View in details
  • COMING SOON

    15h30 - 16h00
    View in details
  • COMING SOON

    16h00 - 16h30
    View in details
  • COMING SOON

    16h30 - 17h00
    View in details
  • COMING SOON

    17h00 - 17h30
    View in details
  • Oppscience

    17h30 - 18h00
    View in details
  • KS CONSULTING

    18h00 - 18h30
    View in details
  • KS CONSULTING

    18h30 - 18h30
    View in details

Workshop 5

  • COMING SOON

    09h30 - 10h00
    View in details
  • Open Source in Big Data: pick the right tech for the right job

    10h00 - 10h30

    Choosing the right technology to build and power your data platform can be a daunting challenge. For many years, closed-source solutions were the first choice: vendors would sell you a licensed product backed by support services and maybe some R&D. For IT decision-makers, this convenient model comes at the cost of vendor lock-in, costly licenses and a loss of flexibility.

    With the advancement of many open source projects (think Cassandra, MongoDB, Elasticsearch, Kafka, Spark, Kubernetes, Ignite, etc.) tackling complex big and distributed data problems, the process of choosing and selecting the right tech has changed. As an IT leader in your organization, you will need to understand each technology's benefits, maturity, and fit against your use case, without necessarily speaking to vendors. You will need to gather information from multiple sources and ask the right questions within your team before making an informed decision.

    Instaclustr has many years of experience working with open source products and offering services around them. In this talk, we will share what we believe are some key considerations to be mindful of when picking a particular technology to solve your use case. We will evaluate this approach against some of the key technologies in which we have extensive expertise - namely Cassandra, Kafka, Spark, and Elasticsearch. At the end of this talk, you will be armed with the right approach to adopting open source technology, and you will have a high-level understanding of the problems some of them solve.

    • Christophe Schmitz VP Consulting INSTACLUSTR
    View in details
  • How Amadeus and Couchbase partner to offer best-in-class Merchandising solutions to airlines

    10h30 - 11h00

    Amadeus is a leading travel technology company powering the operations of all the actors in the industry and assisting them in improving and redefining the travel experience. Among our objectives, we provide airlines with solutions to optimize their merchandising strategy by adapting to the offer context and to their customers in real time.

    During this session we will showcase how Couchbase helps us deliver on this promise and get ready to go to the next level:
    • Covering a large range of Big Data needs, from key/value stores to complex queries and indexing
    • Enabling multi-cloud global deployments of our services

    View in details
  • COMING SOON

    11h00 - 11h30
    View in details
  • Infor Birst Smart Analytics

    11h30 - 12h00
    View in details
  • COMING SOON

    12h00 - 12h30
    View in details
  • COMING SOON

    12h30 - 13h00
    View in details
  • COMING SOON

    13h00 - 13h30
    View in details
  • Six steps to build a strong foundation for real Data Value

    13h30 - 14h00

    Want to realize real value from your data? ASG Technologies proposes a six-step approach that will guide you to your objective. Realizing value from data is a journey that needs to start with an inventory and understanding of the data you have, if you want to govern it and share it correctly with the right stakeholders. These logical steps translate into a sound Data Intelligence (DI) process, in six steps, that will enable you to achieve your business objectives effectively and faster. Learn how to simply manage, for example, a migration to the cloud or to a data lake, with a clear path in mind and reliable technologies at hand.


    Speaker:
    Alain BUENO - Data Intelligence Sales Specialist, ASG Technologies

    View in details
  • COMING SOON

    14h00 - 14h30
    View in details
  • COMING SOON

    14h30 - 15h00
    View in details
  • Graph: from Database to Analytics

    15h00 - 15h30

    While the concept of the graph database is not exactly new, its use for graph analytics is far from ubiquitous. In our workshop we will demonstrate how a machine learning scenario with real-time model scoring can be implemented in TigerGraph, a non-trivial or even impossible task for legacy graph databases.

    Earlier implementations of the technology limited the depth to which connections could be understood; quickly degrading response times for queries beyond three hops of graph traversal made them impractical, especially in real-time environments. Yet the real value of a graph database is finding those non-superficial connections that are several hops deep.

    TigerGraph’s native parallel graph implementation is the next generation of graph databases for real-time deep link analytics. A combination of native, compressed and distributed graph storage, MPP scaling and parallelism, efficient processing engine and powerful yet easy to learn graph query language (GSQL) enables unrivalled high-performance graph analytics.

    This session will demonstrate the simplicity and speed of creating and populating a graph, the ease of developing analytical queries in GSQL, and how these queries allow immediate operationalisation of the results – in real-time.
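
    As a generic illustration of why multi-hop traversal matters (plain Python with networkx, not TigerGraph or GSQL; the account graph below is hypothetical):

        # Find every account reachable within 4 hops of a flagged account --
        # the kind of "deep link" question discussed in this session.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("acct_A", "acct_B"), ("acct_B", "acct_C"),
            ("acct_C", "acct_D"), ("acct_D", "acct_E"),  # a chain four hops deep
            ("acct_A", "acct_F"),
        ])

        # Distance (in hops) from the flagged account to everything within 4 hops.
        reachable = nx.single_source_shortest_path_length(G, "acct_A", cutoff=4)
        for node, hops in sorted(reachable.items(), key=lambda kv: kv[1]):
            if hops > 0:
                print(f"{node}: {hops} hop(s) from acct_A")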

    View in details
  • COMING SOON

    15h30 - 16h00
    View in details
  • COMING SOON

    16h00 - 16h30
    • François-Régis Chaumartin, CEO of Proxem and book author, PROXEM
    View in details
  • COMING SOON

    16h30 - 17h00
    View in details
  • COMING SOON

    17h00 - 17h30
    View in details
  • Logs, Metrics, APM, Uptime: 360° observability of your IS with Elastic

    17h30 - 18h00
    View in details
  • Turing Club

    18h00 - 18h30
    View in details
