INTRODUCTION
As financial markets grow increasingly complex and volatile, stock market trading and prediction have
become arduous tasks for investors, consultants, and academicians. Traditional methods of market analysis,
which rely on established statistical formulas and human judgment, struggle to capture the nonlinear and
noisy structure of financial data. In this context, AI and deep learning have emerged as game-changing
technologies, employing sophisticated algorithms that mine enormous datasets for patterns and generate
signal-based price forecasts. Despite its promise, the adoption of AI for stock trading remains challenging:
models are difficult to explain, markets are volatile, and heterogeneous data sources, ranging from social
media sentiment to satellite imagery, must be reconciled.
This research paper addresses the critical need for a systematic review of AI techniques in stock trading. The
study maps the intellectual landscape of the field by analyzing 9,088 documents from 1971 to 2025, revealing
rapid growth of 10.6% per year and a strong academic impact averaging 15.62 citations per document. It
identifies dominant research clusters, including market applications, stock-specific predictive modeling, and
AI/ML methodologies, illuminating the interplay between theoretical advancements and real-world
applications. Key methodologies, including neural networks (NN), long short-term memory (LSTM), and
hybrid models, are surveyed together with emerging trends such as transformer architectures and Explainable
AI (XAI).
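As a back-of-the-envelope illustration of how such corpus-level statistics are typically derived, the sketch below computes a compound annual growth rate of publication output and a mean citation count. Only the 9,088-document total and the 15.62 average come from the study; the yearly publication counts and the citation total are invented for illustration.

```python
# Illustrative sketch: deriving bibliometric summary statistics.
# All input numbers except the 9,088-document corpus size are hypothetical.

def annual_growth_rate(first_year_count, last_year_count, n_years):
    """Compound annual growth rate of publication output, in percent."""
    return ((last_year_count / first_year_count) ** (1 / (n_years - 1)) - 1) * 100

def mean_citations(total_citations, n_documents):
    """Average citations per document across the corpus."""
    return total_citations / n_documents

# Hypothetical series: 5 papers in the first year, 1,200 in the last, 48 years.
growth = annual_growth_rate(5, 1200, 48)

# Hypothetical citation total chosen to be consistent with the reported average.
impact = mean_citations(141_955, 9_088)

print(f"annual growth: {growth:.1f}%")
print(f"mean citations/doc: {impact:.2f}")
```

The same two formulas underlie the headline figures reported by bibliometric software, applied to the actual yearly document and citation counts of the corpus.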
The study also underscores the global nature of research in this domain, with strong contributions from East
Asian institutions like Beihang University and influential authors such as WANG J and LI Y. Challenges such
as "unexplainable" AI and market volatility are discussed, with proposed solutions including hybrid approaches,
real-time adaptive models, and ethical AI frameworks aligned with ESG (Environmental, Social, Governance)
goals. By bridging gaps between theory and practice, this paper not only consolidates decades of research but
also provides a roadmap for future innovation, emphasizing the need for transparency, robustness, and
interdisciplinary collaboration in AI-driven financial decision-making.
Through bibliometric analysis and thematic mapping, the research demonstrates how AI is reshaping stock
trading, from foundational algorithms to cutting-edge applications, while calling for a balanced approach that
prioritizes both technological innovation and ethical accountability in the financial sector.
LITERATURE REVIEW
Bibliometrics has emerged as a powerful quantitative methodology for analyzing scholarly publications, closely
related to fields like "informetrics" (Egghe & Rousseau, 1990; Wolfram, 2003) and "scientometrics" [4]. This
approach systematically examines various forms of academic output, including journal articles, books, patents,
dissertations, and grey literature, while its counterpart "webometrics" extends this analysis to digital content.
Originally focused on basic bibliographic metrics like author productivity and citation counts, bibliometric
studies have evolved to encompass geographical distributions, institutional contributions, and discipline-specific
developments (Lin, 2012; Zhuang et al., 2013; Huffman et al., 2013; Liu et al., 2012). [3] [16]
Modern bibliometric research leverages sophisticated tools such as Scopus, Gephi [5], and VOSviewer, enabling
comprehensive analyses of citation networks, co-authorship patterns, and thematic trends. These methods now
incorporate alternative metrics like download statistics and social media engagement alongside traditional
citation analysis [17]. However, researchers must exercise caution in data normalisation (Pellegrino, 2011) [15]
to ensure valid cross-disciplinary comparisons, given the methodology's reliance on large datasets [5].
While bibliometrics has proven valuable for assessing research impact, institutional performance, and academic
productivity through citation analysis [15], it has faced criticism for potential over-reliance on quantitative
metrics. The Leiden Manifesto [10] notably cautions against allowing numerical data to overshadow qualitative
scholarly judgment. Despite these concerns, bibliometric techniques have gained prominence in business
research, facilitated by analytical tools like VOSviewer, Leximancer, and SciVal that can process extensive
publication databases [10].