Review of Deep Learning Models from Convolutional Neural Networks to Transformers
Author(s):
Sandeep Maan, Gian Devi
Keywords:
Artificial Intelligence, Machine Learning, Deep Learning, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Attention, Transformers, Large Language Models (LLM)
Abstract
Generative Artificial Intelligence has become synonymous with Artificial Intelligence. In particular, the success of Large Language Models (LLMs) is expected to have a long-lasting, disruptive effect. Applications such as ChatGPT [1] by OpenAI and BARD by Google are believed to change the human-machine relationship in the coming years. All of this can be attributed to developments in the field of deep learning over the last decade. Things started with Convolutional Neural Networks (CNNs) and the advancement of GPUs, which had a lasting effect on the field of image processing. In this paper, the authors review the features and limitations of the three most popular deep learning models, viz. Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and the Transformer. A systematic study of the factors leading to the development of large language models is also presented.
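The attention mechanism named in the keywords is the core building block of the Transformer models this review covers. The following is a minimal NumPy sketch of scaled dot-product attention; the function name, array shapes, and toy data are illustrative assumptions, not drawn from the paper itself.

```python
# Minimal sketch of scaled dot-product attention, the core operation of the
# Transformer. All names, shapes, and data below are illustrative only.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]                     # dimension of queries/keys
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of each query to each key
    # Numerically stable row-wise softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                    # attention-weighted sum of values

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```

Each output position is a convex combination of the value vectors, with weights determined by query-key similarity; this is what lets Transformers relate all positions in a sequence in a single step, unlike the sequential processing of RNNs.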
Article Details
Unique Paper ID: 160893
Publication Volume & Issue: Volume 10, Issue 1
Page(s): 1331–1335