BIGQUERY CVs: the latest registered consultants


The latest BIGQUERY profiles online

CV Big Data Architect / Cloud Expert AWS/GCP/Azure
Mohamed
  • PALAISEAU
JAVA APACHE HADOOP DATA SCALA APACHE SPARK APACHE KAFKA BIG DATA AMAZON WEB SERVICES PYTHON AZURE
Available

CV IT Director / Director & Project Manager / Manager
Christian
  • PARIS
AGILE SQL PROJECT MANAGEMENT OFFICE PYTHON SCRUM JIRA DEVOPS JAVA .NET ITIL
Available

CV DATA Architect
Christ
  • NANTERRE
BUSINESS OBJECTS TALEND DATAWAREHOUSE BIG DATA PYTHON ORACLE Cloud AWS TABLEAU SOFTWARE APACHE SPARK DATASTAGE
Available

CV Data Analyst TIBCO SPOTFIRE
Adnane
  • CHÂTENAY-MALABRY
TIBCO SPOTFIRE SQL DATAVIZ AGILE Microsoft Power BI BIGQUERY TABLEAU SOFTWARE SAP BO LEAD MANAGEMENT QLIKVIEW
Available

CV Project Manager
Aicha
  • RAMBOUILLET
AGILE LEAN MANAGEMENT PROJECT MANAGEMENT OFFICE
Available

CV Data Engineer / Data Modeler / Data Architect
Ahmed
  • MALAKOFF
APACHE HADOOP PYTHON SQL APACHE SPARK Data science BIG DATA JAVA Google Cloud Platform DATA HASHICORP TERRAFORM
Available

CV Data Engineer - BI
Nathan
  • TOULOUSE
Google Cloud Platform SQL Looker PYTHON HASHICORP TERRAFORM BI
Available

CV Generative AI Engineer
Omar
  • NANTERRE
CHATBOT Cloud AWS AZURE Kubernetes
Available

CV E-commerce Data Scientist
Mustapha
  • ANNECY
SQL SAS SAP Microsoft Power BI Tableau R EXCEL JAVA TALEND
Available

CV Machine Learning Engineer
Nelly
  • PARIS
PYTHON DOCKER SQL SAS CSS Google Cloud Platform HASHICORP TERRAFORM APACHE KAFKA APACHE SPARK
Available

Overview of the assignments of Berramou,
a BIGQUERY freelancer based in Isère (38)

  • Full stack Data, DPD FRANCE
    2022 - present

    • On-prem PostgreSQL data warehouse: develop comprehensive specifications for the IT department, enabling the construction of a PostgreSQL data warehouse that effectively uses data from local servers across the agencies.
    • Datamarts for delivery activity reporting: leverage advanced SQL queries and dbt to build robust datamarts for monitoring and analyzing delivery activity within the organization.
    • Data migration to GCP: create a secure, GDPR-compliant Google Cloud infrastructure (Terraform, CI/CD); transfer data to Cloud Storage in Parquet format; transform and ingest it into BigQuery with dbt, orchestrated by Airflow/Cloud Composer (a minimal sketch follows this list); ensure GDPR compliance through purge and anonymization procedures.
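
A minimal Airflow DAG sketch of the migration pattern described in the "Data migration to GCP" bullet above: Parquet files landed in Cloud Storage are loaded into a BigQuery staging table, then dbt builds the datamarts. The bucket, project, dataset and dbt paths are hypothetical placeholders, not details taken from this CV.

```python
# Sketch only: assumes Airflow 2.x with the Google provider installed.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="deliveries_gcs_to_bigquery",   # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",            # illustrative batch frequency
    catchup=False,
) as dag:
    # Load the Parquet exports from Cloud Storage into a BigQuery staging table.
    load_parquet = GCSToBigQueryOperator(
        task_id="load_parquet_to_staging",
        bucket="example-deliveries-raw",                                   # placeholder bucket
        source_objects=["deliveries/*.parquet"],
        destination_project_dataset_table="example-project.staging.deliveries",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    # Build the delivery-activity datamarts with dbt once staging is refreshed.
    run_dbt = BashOperator(
        task_id="dbt_run_datamarts",
        bash_command="cd /home/airflow/gcs/dags/dbt_project && dbt run --select datamarts",
    )

    load_parquet >> run_dbt
```

On Cloud Composer this DAG would simply live in the environment's dags/ bucket; GDPR purge and anonymization steps could be added as further BigQuery tasks downstream.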

  • Full stack Data, Madiaperformance
    2021 - 2023

    • Categorical analysis (Carrefour retail data): automate categorical analyses to provide insights on a brand or a product category from Carrefour France transactional data (big data), using Python and BigQuery (advanced SQL queries).
    • Data migration (on-prem to GCP): migrate data from PostgreSQL to Google Cloud Storage, then ingest it into BigQuery at a defined batch frequency using Python and FastAPI, deployed on Cloud Run and managed with Cloud Scheduler (all compute resources are terraformed); a minimal sketch follows this list.
    • Data retrieval and structuring: automate Mixpanel data retrieval via the Mixpanel API to Google Cloud Storage, then ingest the data into BigQuery using Python and FastAPI, deployed on Cloud Run and scheduled with Cloud Scheduler.
    • Data quality/analysis: data analysis and data quality using Python (pandas, data prep, the FuzzyWuzzy algorithm), BigQuery and Dataiku; deployed on Cloud Run and managed with Cloud Scheduler (all compute resources are terraformed); a fuzzy-matching sketch also follows this list.
    • GCP: implement and manage GCP, create an organization within it using Cloud Identity, and configure Azure Active Directory single sign-on (SSO) with the Google Cloud connector.
    • Cloud provider: benchmark multiple cloud providers with TPC-DS to find the most suitable solution for the company (all compute resources are terraformed).
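
A minimal sketch of the batch ingestion service described in the "Data migration (on-prem to GCP)" bullet above: a FastAPI endpoint, deployable on Cloud Run and triggered by Cloud Scheduler, that loads a file already exported to Cloud Storage into BigQuery. The route, bucket and table names are hypothetical placeholders.

```python
# Sketch only: assumes fastapi and google-cloud-bigquery are installed,
# and that Cloud Run provides default credentials.
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
bq_client = bigquery.Client()

GCS_URI = "gs://example-export-bucket/batch/*.parquet"   # placeholder export location
TABLE_ID = "example-project.analytics.transactions"      # placeholder destination table


@app.post("/ingest")
def ingest_batch():
    """Load the latest batch from Cloud Storage into BigQuery (full refresh)."""
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = bq_client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # block until done so Cloud Scheduler records the real outcome
    return {"rows_loaded": bq_client.get_table(TABLE_ID).num_rows}
```

Cloud Scheduler would then call POST /ingest on the Cloud Run URL at the desired batch frequency; the Mixpanel retrieval service follows the same pattern with an extra API-pull step.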
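A minimal fuzzy-matching sketch in the spirit of the "Data quality/analysis" bullet above, using pandas and FuzzyWuzzy to reconcile free-text labels against a reference list. The labels, columns and threshold are hypothetical.

```python
# Sketch only: assumes pandas and fuzzywuzzy are installed.
import pandas as pd
from fuzzywuzzy import fuzz, process

# Hypothetical noisy labels and the clean reference list they should map to.
products = pd.DataFrame({"raw_label": ["Coca cola 1.5L", "cocacola 1,5 l", "Pepsi Max 2L"]})
REFERENCE = ["COCA-COLA 1.5L", "PEPSI MAX 2L"]


def best_match(label: str, threshold: int = 80):
    """Return the closest reference label, or None when the match score is too low."""
    match, score = process.extractOne(label, REFERENCE, scorer=fuzz.token_sort_ratio)
    return match if score >= threshold else None


products["clean_label"] = products["raw_label"].apply(best_match)
print(products)
```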

  • Data Scientist / Data Analyst

    Prisma Media (PMS)
    2019 - 2021

    • Scoring: predict the gender and age of visitors to Prisma Media's websites from their browsing behaviour and the CRM database; developed in Python with a scikit-learn decision tree model (a minimal sketch follows this list).
    • Segment manager: an interactive segment catalogue, developed with R Shiny and Python, deployed on Google Compute Engine (GCE) with Cloud Build for CI/CD.
    • Revenue dashboard: detailed reporting of the revenues generated from advertising and segment (cookie) data, developed with R Shiny, Python and the Google Ad Manager API, deployed on GCE with Cloud Build for CI/CD.
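
A minimal scikit-learn sketch of the scoring approach described in the "Scoring" bullet above: a decision tree predicting a visitor attribute (gender, say) from aggregated browsing features joined with CRM labels. The feature names, input file and hyperparameters are hypothetical.

```python
# Sketch only: assumes a flat training extract with one row per visitor.
import pandas as pd
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("visitor_features.csv")  # placeholder training extract
FEATURES = ["pageviews_news", "pageviews_cooking", "visits_per_week", "avg_session_sec"]

X_train, X_test, y_train, y_test = train_test_split(
    df[FEATURES], df["gender"], test_size=0.2, random_state=42, stratify=df["gender"]
)

model = DecisionTreeClassifier(max_depth=6, min_samples_leaf=50, random_state=42)
model.fit(X_train, y_train)

# Hold-out evaluation before the scores are pushed back to the CRM/segments.
print(classification_report(y_test, model.predict(X_test)))
```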

  • Data Scientist / Data Analyst

    Internship (5 months), John Deere
    2019 - present
