An Efficient Transformer-Based Model for Automated Code Generation: Leveraging Large Language Models for Software Engineering

Authors

  • Dr. Leema Rose, Department of IT, Ethiraj College for Women, Chennai, India.

DOI:

https://doi.org/10.63282/3050-922X.IJERET-V1I3P101

Keywords:

Code Generation, Transformer Model, Deep Learning, Software Engineering, Code Automation, Neural Networks, Self-Attention, Context-Aware Learning, Machine Learning, Code Refactoring

Abstract

Automated code generation (ACG) is a critical component of modern software engineering, enabling developers to write code more efficiently and with fewer errors. This paper presents CodeGen-Transformer, a novel transformer-based model designed to enhance the ability of large language models (LLMs) to generate high-quality, contextually relevant code. We describe the model's architecture and training methodology, report its performance, and compare it against existing state-of-the-art models. CodeGen-Transformer leverages the strengths of the transformer architecture, in particular its self-attention mechanism, to generate code that is both syntactically correct and semantically meaningful. We also discuss how the model can be integrated into software development workflows and its potential impact on productivity and code quality. Experimental results show that CodeGen-Transformer outperforms existing models in code accuracy, context understanding, and adaptability across diverse programming languages.
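
The CodeGen-Transformer model described above is not publicly released, so the sketch below is purely illustrative: it shows the kind of transformer-based, left-to-right code generation pipeline the abstract discusses, using the publicly available Salesforce CodeGen checkpoint (Salesforce/codegen-350M-mono) via the Hugging Face transformers library. The checkpoint choice, prompt, and decoding parameters are assumptions for demonstration, not the paper's implementation.

    # Minimal sketch: transformer-based code generation with a public
    # CodeGen checkpoint. This is NOT the paper's CodeGen-Transformer
    # (which is not publicly available); Salesforce/codegen-350M-mono
    # is used here only to illustrate the general approach.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_NAME = "Salesforce/codegen-350M-mono"  # illustrative checkpoint

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

    # A docstring-plus-signature prompt, a typical input format for
    # autoregressive code generation models.
    prompt = '"""Return the nth Fibonacci number."""\ndef fibonacci(n):'

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,                    # cap the completion length
        do_sample=False,                      # greedy decoding, reproducible
        pad_token_id=tokenizer.eos_token_id,  # silence the pad-token warning
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The model completes the function body token by token, each step conditioning on the full prompt and the tokens generated so far through self-attention; swapping in a larger checkpoint or sampling-based decoding changes quality and diversity but not the overall pipeline.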


Published

2020-08-03

Issue

Vol. 1 No. 3 (2020)

Section

Articles

How to Cite

Rose L. An Efficient Transformer-Based Model for Automated Code Generation: Leveraging Large Language Models for Software Engineering. IJERET [Internet]. 2020 Aug. 3 [cited 2025 Sep. 12];1(3):1-9. Available from: https://ijeret.org/index.php/ijeret/article/view/21