
MOURAD HAMMALE
Python and OpenEdge ABL developer at CentraleSupélec · pinned: knowledge-distillation-dit-teaches-layoutlmv3 · active — 617 contributions in the last year.
GitHub · @hammale2003
Pinned
This project implements a knowledge distillation pipeline to train a lightweight and high-performing document classification model. The approach uses a large vision-expert model, DiT (Document Image Transformer), as the "teacher" to instruct a multimodal model, LayoutLMv3, which acts as the "student".
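A minimal sketch of the standard logit-distillation objective such a teacher–student pipeline typically minimizes: a temperature-softened KL term against the DiT teacher's logits blended with ordinary cross-entropy on the labels. The function name, temperature `T`, and weight `alpha` are illustrative assumptions, not taken from the repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KD loss blended with hard-label cross-entropy (names illustrative)."""
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth classes.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random logits for a 16-class document classifier.
student = torch.randn(8, 16, requires_grad=True)
teacher = torch.randn(8, 16)
labels = torch.randint(0, 16, (8,))
distillation_loss(student, teacher, labels).backward()
```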
Research internship project on knowledge distillation and continual learning. Developed at the Laboratory of Complex Systems, Ecole Centrale Casablanca.
This project provides an interactive web interface using Gradio to experiment with training Andrej Karpathy's nanoGPT model. It allows users to configure the model architecture and training hyperparameters, and to select optimization techniques such as gradient accumulation, activation recomputation, and data parallelism.
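Of the techniques listed, gradient accumulation is the simplest to show in isolation: losses from several micro-batches are back-propagated before a single optimizer step, simulating a larger effective batch. The sketch below uses a toy linear model and random data as stand-ins; `accum_steps` and the model are illustrative, not the interface's actual defaults.

```python
import torch
import torch.nn as nn

# Toy stand-ins for the model and dataloader; the accumulation loop is the point.
model = nn.Linear(16, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
data = [(torch.randn(8, 16), torch.randint(0, 4, (8,))) for _ in range(8)]

accum_steps = 4  # effective batch size = accum_steps * micro-batch size

optimizer.zero_grad(set_to_none=True)
for step, (x, y) in enumerate(data):
    loss = loss_fn(model(x), y)
    (loss / accum_steps).backward()   # scale so accumulated gradients average out
    if (step + 1) % accum_steps == 0:
        optimizer.step()              # one weight update per accumulation window
        optimizer.zero_grad(set_to_none=True)
```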
This project provides an interactive solution to the split delivery vehicle routing problem (SD-VRP), implementing both exact methods via Gurobi and metaheuristic approaches.
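The defining feature of the SD-VRP is that a customer's demand may be split across several vehicles. The gurobipy fragment below models only that splitting aspect, with continuous delivery quantities and vehicle-capacity constraints; the routing (tour) variables and all data are illustrative assumptions and are omitted for brevity, so this is not the repository's actual formulation. Running it requires a Gurobi license.

```python
import gurobipy as gp
from gurobipy import GRB

# Toy data: 3 customers, 2 vehicles of capacity 10 (all values illustrative).
demand = {1: 7, 2: 9, 3: 4}
vehicles = [0, 1]
capacity = 10

m = gp.Model("sdvrp_sketch")
# q[i, k] = quantity delivered to customer i by vehicle k; allowing a
# demand to be split across vehicles is what distinguishes the SD-VRP.
q = m.addVars(demand.keys(), vehicles, lb=0.0, name="q")
# y[k] = 1 if vehicle k is used at all.
y = m.addVars(vehicles, vtype=GRB.BINARY, name="use")

# Each customer's demand must be fully served, possibly in fractions.
m.addConstrs((q.sum(i, "*") == demand[i] for i in demand), name="serve")
# A vehicle delivers nothing if unused, and at most its capacity otherwise.
m.addConstrs((q.sum("*", k) <= capacity * y[k] for k in vehicles), name="cap")

# Placeholder objective: minimize the number of vehicles used
# (routing costs would replace this in the full model).
m.setObjective(y.sum(), GRB.MINIMIZE)
m.optimize()
```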
Code release for the paper "Style Vectors for Steering Generative Large Language Models", accepted to Findings of EACL 2024.
This project applies machine learning methods to improve the evaluation of welding quality by predicting critical mechanical properties.
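The description names the task (regressing mechanical properties from process data) but not the method, so the following is only a generic tabular-regression sketch on synthetic data; the features, target, and model choice are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in: welding process parameters -> tensile strength (MPa).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))  # e.g. current, voltage, travel speed, gas flow
y = 400 + 50 * X[:, 0] - 30 * X[:, 2] + rng.normal(scale=5.0, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("MAE (MPa):", mean_absolute_error(y_te, model.predict(X_te)))
```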