Review of Deep Learning Models from Convolutional Neural Networks to Transformers
Author(s):
Sandeep Maan, Gian Devi
Keywords:
Artificial Intelligence, Machine Learning, Deep Learning, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Attention, Transformers, Large Language Models (LLM)
Abstract
Generative Artificial Intelligence has become synonymous with Artificial Intelligence. In particular, the success of Large Language Models (LLMs) is expected to have a long-lasting, disruptive effect. Applications such as ChatGPT [1] by OpenAI and BARD by Google are believed to change the human-machine relationship in the coming years. These advances can be attributed to developments in the field of deep learning over the last decade. Progress began with Convolutional Neural Networks (CNN) and the advancement of GPUs, which together had a lasting effect on the field of image processing. In this paper, the authors review the features and limitations of the three most popular deep learning models, viz. Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and the Transformer model. A systematic study of the factors leading to the development of large language models is also presented.
Article Details
Unique Paper ID: 160893

Publication Volume & Issue: Volume 10, Issue 1

Page(s): 1331 - 1335