Shortened LLaMA: A simple depth pruning for large language models

BK Kim, G Kim, TH Kim, T Castells, S Choi… - arXiv preprint arXiv …, 2024 - openreview.net
Structured pruning of modern large language models (LLMs) has emerged as a way of
decreasing their high computational needs. Width pruning reduces the size of projection …
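The snippet contrasts width pruning, which shrinks the projection matrices inside each block, with depth pruning, which removes whole Transformer blocks. Below is a minimal sketch of the depth-pruning idea on a Hugging Face causal LM; the checkpoint name and the kept-layer indices are illustrative assumptions, not the paper's block-selection criterion.

```python
# Illustrative sketch: depth pruning deletes whole decoder blocks, in contrast
# to width pruning, which shrinks matrices inside each block. The checkpoint
# and the kept-layer indices are assumptions for demonstration only.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

keep = list(range(24))  # hypothetical: keep the first 24 of 32 decoder blocks
model.model.layers = torch.nn.ModuleList(model.model.layers[i] for i in keep)
model.config.num_hidden_layers = len(keep)  # keep the config consistent

# The pruned model is smaller and faster; lost quality is typically recovered
# with a brief retraining step.
```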

Opportunities and Challenges in Transformer Neural Networks for Battery State Estimation: Charge, Health, Lifetime, and Safety

J Zhao, X Han, Y Wu, Z Wang, AF Burke - Journal of Energy Chemistry, 2024 - Elsevier
Battery technology plays a crucial role across various sectors, powering devices from
smartphones to electric vehicles and supporting grid-scale energy storage. To ensure their …

ELMS: Elasticized large language models on mobile devices

W Yin, R Yi, D Xu, G Huang, M Xu, X Liu - arXiv preprint arXiv:2409.09071, 2024 - arxiv.org
On-device Large Language Models (LLMs) are revolutionizing mobile AI, enabling
applications such as UI automation while addressing privacy concerns. Currently, the …

Towards More Trustworthy and Interpretable LLMs for Code through Syntax-Grounded Explanations

DN Palacio, D Rodriguez-Cardenas, A Velasco… - arXiv preprint arXiv …, 2024 - arxiv.org
Trustworthiness and interpretability are inextricably linked concepts for LLMs. The more
interpretable an LLM is, the more trustworthy it becomes. However, current techniques for …

The OpenELM library: Leveraging progress in language models for novel evolutionary algorithms

H Bradley, H Fan, T Galanos, R Zhou, D Scott… - … Theory and Practice XX, 2024 - Springer
Abstract In recent years, Large Language Models (LLMs) have rapidly progressed in their
capabilities in natural language processing (NLP) tasks, which have interestingly grown in …

Evolutions of semantic consistency in research topic via contextualized word embedding

S Huang, W Lu, Q Cheng, Z Luo, Y Huang - Information Processing & …, 2024 - Elsevier
Topic evolution has been studied extensively in the field of the science of science. This study
first analyzes topic evolution patterns from topics' semantic consistency in the semantic vector …

LoRA²: Multi-Scale Low-Rank Approximations for Fine-Tuning Large Language Models

JC Zhang, YJ Xiong, HX Qiu, DH Zhu… - arXiv preprint arXiv …, 2024 - arxiv.org
Fine-tuning large language models (LLMs) with high parameter efficiency for downstream
tasks has become a new paradigm. Low-Rank Adaptation (LoRA) significantly reduces the …
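For context, the standard LoRA update that this multi-scale variant builds on adds a trainable low-rank product BA to a frozen weight. The sketch below shows only that baseline form (class name and hyperparameters are illustrative); the multi-scale approximation proposed in the paper is not reproduced here.

```python
# Minimal sketch of the baseline LoRA update: y = W x + (alpha / r) * B A x.
# Only A and B are trained; the pretrained weight W stays frozen.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                      # freeze the base layer
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)
```

Wrapping, say, the attention projections of each block with such a module trains only r·(d_in + d_out) extra parameters per layer, which is where the parameter efficiency comes from.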

Parameter-Efficient Fine-Tuning of Large Language Models via Deconvolution in Subspace

JC Zhang, YJ Xiong, CM Xia, DH Zhu… - Proceedings of the 31st …, 2025 - aclanthology.org
This paper proposes a novel parameter-efficient fine-tuning method that combines the
knowledge completion capability of deconvolution with the subspace learning ability …
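The snippet only hints at the mechanism, so the following is purely an assumed reading: a small trainable tensor living in a low-dimensional subspace is expanded into a full-size weight increment by a transposed convolution (deconvolution). The module below illustrates that reading and is not the paper's implementation; all names and shapes are hypothetical.

```python
# Hedged sketch (assumption, not the paper's code): expand a small trainable
# "subspace" tensor into a full-size weight update via a transposed convolution.
import torch
import torch.nn as nn

class DeconvDelta(nn.Module):
    def __init__(self, out_features: int, in_features: int, k: int = 8):
        super().__init__()
        assert out_features % k == 0 and in_features % k == 0
        # Small seed matrix in an (out/k) x (in/k) subspace.
        self.seed = nn.Parameter(torch.zeros(1, 1, out_features // k, in_features // k))
        # Transposed convolution upsamples the seed by a factor of k per dimension.
        self.expand = nn.ConvTranspose2d(1, 1, kernel_size=k, stride=k, bias=False)

    def delta_weight(self) -> torch.Tensor:
        return self.expand(self.seed).squeeze(0).squeeze(0)  # (out_features, in_features)

base = nn.Linear(64, 64)                      # frozen pretrained layer
delta = DeconvDelta(64, 64, k=8)              # few trainable parameters
x = torch.randn(4, 64)
y = base(x) + x @ delta.delta_weight().T      # base output plus learned increment
```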

Bilateral Cross-Modal Fusion Network for Multimodal Whole-Body Tumor Segmentation

T Yu, X Zhan, L Xiang, X Gao… - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
3D whole-body tumor segmentation plays a critical role in cancer treatment. In relation to
this, multimodal fluorodeoxyglucose (FDG) positron emission tomography/computed …

Presidential Communication and Its Impact on the Mexican Stock Market: Evidence Using a Sentiment Analysis Approach

JA Cazares Aguilar… - Latin American Business …, 2024 - Taylor & Francis
This research paper addresses the impact of the positive or negative polarity of the
presidential messages contained in the stenographic version of the daily conferences …