BIGQUERY CVs: the latest registered consultants


The latest BIGQUERY profiles online

CV TALEND Data Analyst
Azzedine
  • POISSY
TALEND BODS BIGQUERY SQL BUSINESS OBJECTS AGILE Microsoft Power BI PYTHON

CV Senior Data Engineer
Marouane
  • PARIS
PYTHON PYSPARK Kafka APACHE KAFKA TALEND DI BIG DATA
Available

CV REACT.JS Web Developer
Paul
  • PARIS
REACT.JS NODE.JS TypeScript DOCKER REST AMAZON AWS Vue.js NestJS SQL
Available

CV Interim CIO, Google Cloud Platform
Filipe
  • PARIS
Google Cloud Platform Multi-project management GDPR DATA DATAVIZ BIG DATA SQL PROJECT MANAGEMENT OFFICE AGILE Cloud AWS
Available

CV Proxy PO / Scrum Master
Erhard
  • VÉLIZY-VILLACOUBLAY
JIRA SQL JUnit SONARQUBE SQUASH TM Xray KATALON
Available

CV Big Data Engineer
Tresor
  • RUEIL-MALMAISON
APACHE HADOOP PYTHON SQL APACHE SPARK AZURE Microsoft Power BI Google Cloud Platform Cloud AWS SCALA DEVOPS
Available

CV Data Scientist
Khalid
  • TOULOUSE
PYTHON DATA SQL BIG DATA DATAVIZ BI APACHE HADOOP Cloud AWS Microsoft Power BI APACHE SPARK
Available

CV TALEND DI / Cloud Developer
Youssef
  • PARIS
TALEND SQL SAP BIGQUERY SALESFORCE MONGODB AZURE APACHE KAFKA TALEND DI OpCon
Available

CV Data Manager, Technical Architect
Mériem
  • CHAMPS-SUR-MARNE
JIRA AGILE TALEND PYTHON
Available

CV Data Analyst - Data Scientist
Chihab
  • LILLE
PYTHON SQL Microsoft Power BI SAS API RESTful BIG DATA Databricks AZURE GIT BIGQUERY
Available

Experience highlights from Berramou,
a BIGQUERY freelancer based in Isère (38)

  • Full-stack Data, DPD FRANCE
    2022 - present

    • On-prem PostgreSQL data warehouse: wrote comprehensive specifications for the IT department, enabling the construction of a PostgreSQL data warehouse that consolidates data from local servers across the agencies.
    • Datamarts for delivery-activity reporting: built robust datamarts with advanced SQL queries and dbt to monitor and analyze delivery activity within the organization.
    • Data migration to GCP: built a secure, GDPR-compliant Google Cloud infrastructure (Terraform, CI/CD); transferred data to Cloud Storage in Parquet format; transformed and ingested it into BigQuery with dbt, orchestrated by Airflow/Cloud Composer; ensured GDPR compliance through purge and anonymization procedures.
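The purge and anonymization step of such a pipeline can be sketched in plain Python. This is a minimal illustration under assumed rules (a hypothetical 3-year retention window and an `email` field); the actual procedures run inside the BigQuery/dbt pipeline and are not described in the source.

```python
import hashlib
from datetime import date, timedelta

# Assumed retention policy for the sketch; the real window is not specified.
RETENTION = timedelta(days=3 * 365)

def pseudonymize(value: str, salt: str = "static-salt") -> str:
    """Replace a direct identifier with a stable, irreversible hash."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def apply_gdpr(rows: list[dict], today: date) -> list[dict]:
    """Purge rows past retention; pseudonymize PII on the rest."""
    kept = []
    for row in rows:
        if today - row["last_activity"] > RETENTION:
            continue  # purge: expired rows are dropped outright
        kept.append({**row, "email": pseudonymize(row["email"])})
    return kept
```

Hashing rather than deleting the identifier keeps rows joinable across tables while removing the direct PII, which is one common way to reconcile analytics needs with GDPR.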

  • Full-stack Data, Madiaperformance
    2021 - 2023

    • Category analysis (Carrefour retail data): automated category-level analysis to provide insights on a brand or product category from Carrefour France transactional (big) data, using Python and BigQuery (advanced SQL queries).
    • Data migration (on-prem to GCP): migrated data from PostgreSQL to Google Cloud Storage, then ingested it into BigQuery at a defined batch frequency using Python and FastAPI, deployed on Cloud Run and managed with Cloud Scheduler (all compute resources provisioned with Terraform).
    • Data retrieval and structuring: automated Mixpanel data retrieval via the Mixpanel API into Google Cloud Storage, then ingested it into BigQuery using Python and FastAPI, deployed on Cloud Run and scheduled with Cloud Scheduler.
    • Data quality/analysis: performed data analysis and data-quality work using Python (pandas, data prep, the FuzzyWuzzy matching algorithm), BigQuery, and Dataiku.
    • GCP: implemented and managed GCP, created an organization within it using Cloud Identity, and configured Azure Active Directory single sign-on (SSO) with the Google Cloud connector.
    • Cloud provider: benchmarked multiple cloud providers with TPC-DS to find the most suitable solution for the company (all compute resources provisioned with Terraform).
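The PostgreSQL → Cloud Storage → BigQuery batch described above can be sketched with injected clients. This is a hedged sketch, not the actual service: the dict and list stand in for the google-cloud-storage and BigQuery clients, and all names (`export_batch`, `run_scheduled_batch`, the object path) are illustrative. In production the last function would be the body of a FastAPI endpoint on Cloud Run, invoked by Cloud Scheduler.

```python
import csv
import io

def export_batch(rows, bucket: dict, object_name: str) -> str:
    """Serialize one batch of source rows (e.g. a PostgreSQL extract)
    and upload it to a Cloud Storage bucket (faked here as a dict)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    bucket[object_name] = buf.getvalue()
    return object_name

def load_to_bigquery(bucket: dict, object_name: str, table: list) -> int:
    """Ingest the staged object into a BigQuery table (faked as a list).
    In production this is a load job pointing at gs://bucket/object."""
    reader = csv.DictReader(io.StringIO(bucket[object_name]))
    new_rows = list(reader)
    table.extend(new_rows)
    return len(new_rows)

def run_scheduled_batch(rows, bucket, table, batch_id: str) -> int:
    """Endpoint body invoked once per Cloud Scheduler tick."""
    name = export_batch(rows, bucket, f"exports/{batch_id}.csv")
    return load_to_bigquery(bucket, name, table)
```

Staging through Cloud Storage before loading, rather than streaming rows directly, is the usual pattern here: load jobs are free, retryable, and leave an auditable artifact per batch.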

  • Data Scientist / Data Analyst

    Prisma Media (PMS)
    2019 - 2021

    • Scoring: predicted the gender and age of visitors to Prisma Media's websites from their browsing behaviour and the CRM database; developed in Python with a scikit-learn decision-tree model.
    • Segment manager: an interactive segment catalogue, developed with R Shiny and Python, deployed on Google Compute Engine (GCE) with Cloud Build for CI/CD.
    • Revenue dashboard: detailed reporting of revenue generated from advertising and segment (cookie) data, developed with R Shiny, Python, and the Google Ad Manager API, deployed on GCE with Cloud Build for CI/CD.
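The scoring bullet above can be illustrated with a toy scikit-learn decision tree. This is a minimal sketch with fabricated data: the browsing features and labels are invented for illustration and are not Prisma Media's actual schema or model.

```python
from sklearn.tree import DecisionTreeClassifier

# Toy browsing features: [visits to sports pages, visits to fashion pages]
X = [[9, 1], [8, 0], [7, 2], [1, 9], [0, 8], [2, 7]]
y = ["M", "M", "M", "F", "F", "F"]  # gender labels from the CRM database

# Shallow tree: interpretable splits on browsing behaviour
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

# Score an unlabelled visitor from their browsing behaviour
prediction = model.predict([[6, 1]])[0]
```

A decision tree is a natural fit for this use case because its splits ("visited sports pages more than N times") are directly explainable to the ad-sales teams consuming the segments.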

  • Data Scientist / Data Analyst

    Internship (5 months), John Deere
    2019
