Sophilex's Blog

A total of 44 articles


2025

08-18  BOND: Aligning LLMs with Best-of-N distillation
08-11  Evaluating Position Bias in Large Language Model Recommendations
08-11  DATASET DISTILLATION VIA KNOWLEDGE DISTILLATION: TOWARDS EFFICIENT SELF-SUPERVISED PRETRAINING OF DEEP NETWORKS
08-10  Distilling the Knowledge in Data Pruning
08-04  DA-KD: Difficulty-Aware Knowledge Distillation for Efficient Large Language Models
08-04  Boosting Parameter Efficiency in LLM-Based Recommendation through Sophisticated Pruning
08-04  C2KD: Cross-layer and Cross-head Knowledge Distillation for Small Language Model-based Recommendations
07-15  SVD Decomposition in LLM Compression
07-07  DipSVD: Dual-importance Protected SVD for Efficient LLM Compression
07-07  SVD-LLM: TRUNCATION-AWARE SINGULAR VALUE DECOMPOSITION FOR LARGE LANGUAGE MODEL COMPRESSION

