A survey on LoRA of large language models
Y Mao, Y Ge, Y Fan, W Xu, Y Mi, Z Hu… - Frontiers of Computer …, 2025 - Springer
Abstract Low-Rank Adaptation (LoRA), which updates the dense neural network layers with
pluggable low-rank matrices, is one of the best-performing parameter-efficient fine-tuning …
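The snippet above describes LoRA's core mechanism: a frozen dense weight augmented with a pluggable low-rank update. A minimal sketch of that idea (illustrative names and shapes, not taken from the surveyed papers):

```python
import numpy as np

# Sketch of the LoRA idea: the frozen pretrained weight W is augmented with a
# pluggable low-rank update B @ A of rank r << min(d_out, d_in). Only A and B
# would be trained; W stays fixed. All names here are illustrative.

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 32, 4

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

def forward(x):
    # Adapted layer: base output plus the low-rank correction
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# With B zero-initialized the adapter is a no-op before training begins
assert np.allclose(forward(x), W @ x)
```

The zero initialization of B is the common convention: the adapted model starts out identical to the pretrained one, and the low-rank correction grows only as A and B are updated.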
LDAdam: Adaptive optimization from low-dimensional gradient statistics
We introduce LDAdam, a memory-efficient optimizer for training large models that performs
adaptive optimization steps within lower-dimensional subspaces, while consistently …
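The snippet describes keeping adaptive-optimizer statistics in a lower-dimensional subspace. A hedged sketch of that general idea, using a fixed random projection and Adam-style moments (this is an illustration of the concept, not LDAdam's actual algorithm, which the abstract does not fully specify):

```python
import numpy as np

# Illustrative sketch: maintain Adam's moment statistics in an r-dimensional
# subspace obtained via a fixed random projection P, so optimizer memory
# scales with r rather than the full parameter count d. Not LDAdam itself.

rng = np.random.default_rng(0)
d, r = 1000, 20
P = rng.normal(size=(r, d)) / np.sqrt(r)  # projection onto the subspace

m = np.zeros(r)  # first moment, stored low-dimensionally
v = np.zeros(r)  # second moment, stored low-dimensionally
beta1, beta2, lr, eps = 0.9, 0.999, 1e-2, 1e-8

def step(params, grad, t):
    global m, v
    g = P @ grad                          # project the gradient
    m = beta1 * m + (1 - beta1) * g       # Adam moments in the subspace
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    update = P.T @ (m_hat / (np.sqrt(v_hat) + eps))  # lift back to full space
    return params - lr * update

# Toy quadratic objective 0.5 * ||theta||^2, whose gradient is theta itself
theta = rng.normal(size=d)
c0 = np.linalg.norm(P @ theta)            # initial energy in the subspace
for t in range(1, 201):
    theta = step(theta, theta, t)
# The optimizer drives down the component of theta inside the subspace
assert np.linalg.norm(P @ theta) < c0
```

Only the subspace component of the parameters is optimized here; a practical method would also rotate or refresh the subspace over training, which is where the "consistently …" clause of the abstract is truncated.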
Towards a science exocortex
KG Yager - Digital Discovery, 2024 - pubs.rsc.org
Artificial intelligence (AI) methods are poised to revolutionize intellectual work, with
generative AI enabling automation of text analysis, text generation, and simple decision …
On fairness of low-rank adaptation of large models
Low-rank adaptation of large models, particularly LoRA, has gained traction due to its
computational efficiency. This efficiency, contrasted with the prohibitive costs of full-model …
Pre-trained Audio Transformer as a Foundational AI Tool for Gravitational Waves
As gravitational wave detectors become more advanced and sensitive, the number of
signals recorded by Advanced LIGO and Virgo from merging compact objects is expected to …
Learning Parameter Sharing with Tensor Decompositions and Sparsity
Large neural networks achieve remarkable performance, but their size hinders deployment
on resource-constrained devices. While various compression techniques exist, parameter …
MiniMedGPT: Efficient Large Vision–Language Model for medical Visual Question Answering
AR Alsabbagh, T Mansour, M Al-Kharabsheh… - Pattern Recognition …, 2025 - Elsevier
Abstract While Large Vision–Language Models (LVLMs) like GPT-4 and Gemini
demonstrate significant potential, their utilization in the medical domain remains largely …
Sparse Gradient Compression for Fine-Tuning Large Language Models
Fine-tuning large language models (LLMs) for downstream tasks has become increasingly
crucial due to their widespread use and the growing availability of open-source models …
Shifting Attention to You: Personalized Brain-Inspired AI Models
SC Zhao, Y Hu, J Lee, A Bender, T Mazumdar… - arXiv preprint arXiv …, 2025 - arxiv.org
The integration of human and artificial intelligence represents a scientific opportunity to
advance our understanding of information processing, as each system offers unique …
RepLoRA: Reparameterizing Low-Rank Adaptation via the Perspective of Mixture of Experts
Low-rank adaptation (LoRA) has emerged as a powerful method for fine-tuning large-scale
foundation models. Despite its popularity, the theoretical understanding of LoRA has …