चर्चाGPT (Charcha GPT)

  • Unique Paper ID: 163414
  • Volume: 10
  • Issue: 11
  • PageNo: 1644-1650
  • Abstract:
  • Large language models (LLMs) have recently become a popular topic in the field of Artificial Intelligence (AI) research, with companies such as Google, Amazon, Facebook, Tesla, and Apple (GAFA) investing heavily in their development. These models are trained on massive amounts of data and can be used for a wide range of tasks, including language translation, text generation, and question answering. However, the computational resources required to train and run these models are substantial, and the cost of hardware and electricity can be prohibitive for research labs that do not have the funding and resources of the GAFA. In this paper, we examine the impact of LLMs on AI research. The pace at which such models are released, as well as the range of domains they cover, is an indication of the trend that not only the public but also the scientific community is currently experiencing. We give some examples of how to use such models in research, focusing on GPT-3.5/ChatGPT-3.5 and ChatGPT-4 in their current state, and show that such a range of capabilities in a single system is a strong sign of approaching general intelligence. Innovations integrating such models will also expand as these AI systems mature and will exhibit unforeseen applications with important impacts on several aspects of our societies. ChatGPT is a large language model that uses deep learning techniques to generate human-like text. It is based on the GPT (Generative Pre-Trained Transformer) architecture, which uses a transformer neural network to process and generate text. The model is pre-trained on a massive dataset of text, such as books, articles, and websites, so it can learn the patterns and structure of natural language. When given a prompt or a starting point, the model uses this pre-trained knowledge to generate text that continues the given input in a coherent and natural way.
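
To illustrate the prompt-continuation behaviour described in the abstract, the following is a minimal sketch using the Hugging Face transformers library with the openly available GPT-2 model; GPT-2 here is only a stand-in for the much larger GPT-3.5/GPT-4 systems discussed in the paper, which are accessed through OpenAI's hosted API rather than run locally. Given a prompt, the pre-trained model generates a continuation that follows the patterns it learned during pre-training.

    from transformers import pipeline

    # Load a small, openly available GPT-style model (GPT-2) as a stand-in
    # for the much larger GPT-3.5/GPT-4 models discussed in the paper.
    generator = pipeline("text-generation", model="gpt2")

    prompt = "Large language models are"
    # The model continues the prompt token by token, using patterns
    # learned during pre-training on books, articles, and websites.
    result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
    print(result[0]["generated_text"])

Running this prints the prompt followed by a model-generated continuation; a hosted GPT-3.5/GPT-4 endpoint follows the same prompt-in, continuation-out pattern.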

Copyright & License

Copyright © 2025. The authors retain the copyright of this article. This article is an open-access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{163414,
        author = {Prakash Hongal and Shrinkanth Jogar and Nandita Tarikeri},
        title = {चर्चाGPT (Charcha GPT)},
        journal = {International Journal of Innovative Research in Technology},
        year = {},
        volume = {10},
        number = {11},
        pages = {1644-1650},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=163414},
        abstract = {Large language models (LLMs) have recently become a popular topic in the field of Artificial Intelligence (AI) research, with companies such as Google, Amazon, Facebook, Tesla, and Apple (GAFA) investing heavily in their development. These models are trained on massive amounts of data and can be used for a wide range of tasks, including language translation, text generation, and question answering. However, the computational resources required to train and run these models are substantial, and the cost of hardware and electricity can be prohibitive for research labs that do not have the funding and resources of the GAFA. In this paper, we examine the impact of LLMs on AI research. The pace at which such models are released, as well as the range of domains they cover, is an indication of the trend that not only the public but also the scientific community is currently experiencing. We give some examples of how to use such models in research, focusing on GPT-3.5/ChatGPT-3.5 and ChatGPT-4 in their current state, and show that such a range of capabilities in a single system is a strong sign of approaching general intelligence. Innovations integrating such models will also expand as these AI systems mature and will exhibit unforeseen applications with important impacts on several aspects of our societies. ChatGPT is a large language model that uses deep learning techniques to generate human-like text. It is based on the GPT (Generative Pre-Trained Transformer) architecture, which uses a transformer neural network to process and generate text. The model is pre-trained on a massive dataset of text, such as books, articles, and websites, so it can learn the patterns and structure of natural language. When given a prompt or a starting point, the model uses this pre-trained knowledge to generate text that continues the given input in a coherent and natural way.},
        keywords = {Large Language Models, GPT, ChatGPT, General AI Knowledge Manipulation, Reasoning, Applications in AI},
        month = {},
        }

Cite This Article

  • ISSN: 2349-6002
  • Volume: 10
  • Issue: 11
  • PageNo: 1644-1650
