
Technology Acceptance and Self-Efficacy in Digital Examinations: Insights from Accounting Students

Nooraslinda Abdul Aris1*, Asyaari Elmiza Ahmad1, Siti Syaqilah Hambali1, Afaf Ahmad Jalaludin2

1Faculty of Accountancy, Universiti Teknologi MARA, Shah Alam, Selangor, Malaysia

2Academy Sector, Matriculation Division, Ministry of Education Malaysia

*Corresponding author

DOI: https://dx.doi.org/10.47772/IJRISS.2025.909000415

Received: 10 September 2025; Accepted: 15 September 2025; Published: 14 October 2025

ABSTRACT

The digitalisation of professional assessments has accelerated across higher education and professional bodies, including ACCA, CPA, and MICPA, making students’ readiness for computer-based assessment (CBA) a pressing priority. This study examines students’ perceptions of readiness for digital examinations through the combined lenses of the Technology Acceptance Model (TAM) and self-efficacy. Using survey data from 126 accounting students, the analysis explored student, provider, and technology readiness. The findings reveal that students demonstrate limited self-efficacy, reflected in weak computer know-how and minimal prior exposure, while moderate perceptions of usefulness emerged in relation to broader coverage and efficiency. Provider readiness was undermined by concerns about insecure assessment tools and limited institutional support. Technology readiness was most constrained by expectations of technical difficulties, with training and demonstrations viewed as insufficient. The results highlight the need to embed digital literacy training, strengthen institutional reliability, and align university assessment systems with the digital standards of professional bodies. This study contributes to the literature by extending TAM with self-efficacy in the context of professional accounting education and offers practical guidance for faculties seeking to prepare students for digital professional examinations.

Keywords: Computer-Based Assessment, Technology Acceptance Model, Self-Efficacy, Accounting Education, Digital Examinations

INTRODUCTION

The accounting profession is undergoing rapid digital transformation, shaped by automation, data analytics, and artificial intelligence. This shift challenges higher education to prepare students for a workplace where digital fluency is essential (ref). One critical aspect of this transformation is assessment. Traditional paper-based exams (PBE) are being replaced by computer-based assessments (CBA), which offer efficiency, scalability, and stronger alignment with the competencies expected in professional practice (Yeboah, 2023).

The transition to CBA, however, is not without challenges. Research shows that educational technologies often fail when institutions focus heavily on infrastructure while overlooking human factors (Bearman et al., 2023). Much of the existing literature addresses technical barriers such as cost, software, and academic integrity protection; far less attention has been given to students’ perceptions of these changes (Dangi et al., 2023). Yet students are the primary stakeholders, and their readiness, confidence, and trust in the system are decisive for adoption.

Readiness is both a technical and a psychological matter. It encompasses students’ belief in their ability to manage CBA (self-efficacy), their perceptions of the reliability of the technology, and their confidence in their institution’s capacity to deliver assessments fairly and securely (Bandura, 1997; Davis, 1989). If any of these elements is deficient, students are more likely to experience resistance or anxiety, which undermines the purpose of digital assessment.

The gap is particularly serious in accounting education, where professional bodies such as ACCA, CPA Australia, and MICPA have already implemented digital examinations. Most prior studies examine only one aspect of digital assessment, such as teachers’ readiness, institutional preparedness, or technology acceptance; few have explored how students weigh all three dimensions (self, institution, and technology) together. This study therefore aims to close some of these gaps by assessing students’ readiness through the combined frameworks of the Technology Acceptance Model (TAM) and self-efficacy, which is expected to offer a more holistic view than existing research that draws on either TAM (ref) or self-efficacy (ref) alone in evaluating readiness.

The study, in essence, examines how accounting students (1) view their ability to manage digital examinations, (2) evaluate the practicality and reliability of the technology, and (3) perceive faculty and institutional readiness. Students’ experiences (positive or negative) in learning have a significant impact on their attitude, acceptance, and tendency to use CBA, and are thus likely to shape future use of technology within an academic context (Alwi & Khan, 2024). By looking beyond technical barriers, the study provides a clearer picture of the perceptions that can make or break the integration of CBA in accounting curricula. The findings are envisioned to support educational institutions moving towards a borderless learning environment. The digital transformation set by professional accounting bodies (ACCA, n.d.) is forward-looking: it will empower students, strengthen institutional standing, and align with market demand.

LITERATURE REVIEW

Digital Transition in Higher Education

Education has evolved in step with the fourth industrial revolution. The rapid uptake of technology has spurred and moulded the education landscape, shifting it from traditional face-to-face instruction to online digital learning (Skhephe, 2022). This switch became most apparent as a sudden response to the Covid-19 pandemic (Ally, 2024; Dangi et al., 2023), when online learning became the dominant mode of delivering teaching and learning, which in turn affected the mode of assessment. Furthermore, rising enrolments driven by population growth and free-education policies amplify pressure on institutions to adopt assessment methods that are efficient, cost-effective, reliable, and secure (Ally, 2024).

The move towards CBA is part of national education reform, which emphasises online and e-learning. CBA, also termed online examination or e-assessment, is seen as a strategic solution to these demands. CBAs can streamline exam administration, conserve resources, and extend coverage of learning outcomes. Unlike traditional PBE, CBAs support various competencies, from knowledge recall to critical thinking (Ally, 2024). Education institutions increasingly see CBA as part of their digital transformation agenda, as research highlights its efficiency, scalability, and capacity to provide faster feedback (Bearman et al., 2023). For professional disciplines such as accounting, however, the stakes are higher: assessment credibility and fairness directly affect certification and employability (Scott & Unsworth, 2018).

Professional bodies like ACCA, CPA Australia, and MICPA have embraced this shift, and their progressive transition to CBA marks a global move toward digital examinations. ACCA, for instance, introduced on-demand CBAs at the foundation level in 2016 and has since expanded them across multiple levels, completing the transition in 2019 (ACCA, n.d.). CPA Australia and MICPA have followed suit, embedding digital platforms to reflect the realities of professional practice. These reforms improve efficiency and accessibility; they also signal the emergence of the digitally competent accountant, who not only possesses strong accounting knowledge but also commands technological capability and upholds the ethical values set by the profession. More importantly, the education system must be ready to equip students with technical knowledge as well as the readiness to succeed in digital assessment environments.

Technology Acceptance Model (TAM) in Higher Education

The Technology Acceptance Model (TAM) has dominated studies predicting the adoption of new information systems. Developed by Davis (1989), it centres on two perceptual beliefs: perceived usefulness (PU) and perceived ease of use (PEOU). PU refers to ‘the extent to which a person trusts a system will boost job performance’, while PEOU denotes ‘the belief that the system can be used with nominal effort’ (Davis, 1989).

In higher education, these constructs have been widely applied to understand student and faculty acceptance of varied technologies, ranging from learning management systems (LMS) to digital assessment tools (Dangi et al., 2023; Mukred et al., 2024). In the case of CBA, TAM captures whether students see the systems as performance-enhancing and manageable. Research shows that students are more willing to engage with e-learning platforms, digital libraries, and CBAs when they find them useful and easy to navigate (Mukred et al., 2024; Tahir & Jabar, 2024). PEOU is a solid predictor of PU, suggesting that systems which are easy to use tend to be judged useful (King & He, 2006). Both constructs help reduce negative attitudes such as anxiety, thereby fostering more positive engagement with technology (Yavuz Temel et al., 2025). In accounting education specifically, PU is often linked to alignment with professional exam requirements and employability outcomes (Alozie, 2022; Scott & Unsworth, 2018).

Still, TAM has limitations. Arguably, it often overlooks the broader and more complex human and contextual factors influencing adoption (Alwi & Khan, 2024; Strzelecki, 2024). Its focus on voluntary use can be problematic in mandatory settings such as compulsory higher education assessments, where usefulness may be conflated with the motivation to earn higher grades. Moreover, TAM does not fully account for self-belief (e.g., students’ confidence in their ability to use technology) or organisational readiness (e.g., institutional support and credibility). As such, while TAM provides a strong starting point, a fuller view of readiness requires complementary perspectives that capture psychological and contextual dimensions.

Extensions of TAM have begun to address these gaps by incorporating factors such as self-efficacy, trust, and institutional support (Mukred et al., 2024). These additions improve explanatory power by recognising that even reliable technologies may fail when student confidence is low or institutional backing is weak (Ain et al., 2016). This makes the extended TAM particularly relevant for high-stakes contexts like professional accounting assessments, where success depends as much on confidence and support as on the technology itself.

Self-Efficacy in Professional Accounting Education

Bandura (1997) defines self-efficacy as an individual’s belief in their capability to perform the actions needed to succeed in a given situation. It shapes motivation and behaviour: students with strong self-efficacy tend to treat challenges as opportunities to master, while those with low self-efficacy often view them as threats to avoid. In digital education, self-efficacy is particularly important because it influences whether students feel capable of managing online platforms, troubleshooting technical problems, and applying their knowledge under exam pressure (Sherafati & Mahmoudi Largani, 2023).

Research shows that high computer self-efficacy reduces exam anxiety and increases acceptance of CBA, while low self-efficacy can lead to resistance and poor performance despite adequate subject knowledge (Yeşilyurt et al., 2016). In professional accounting, where examinations carry significant career implications, self-efficacy plays a dual role: it not only lowers anxiety but also builds resilience when technical disruptions occur. Exposure, training, and institutional support are therefore critical for ensuring that accounting students feel confident about transitioning to digital assessment formats such as those mandated by ACCA, CPA, and MICPA (Ain et al., 2016).

Integrating TAM and Self-Efficacy in Accounting Assessment

TAM and self-efficacy work together to explain the readiness state for digital assessment. TAM highlights cognitive appraisals of usefulness (PU) and ease of use (PEOU), while self-efficacy adds the psychological dimension by focusing on students’ confidence in their ability to use the system. Students with strong computer self-efficacy are more likely to perceive CBAs as manageable and beneficial, reinforcing both TAM constructs (King & He, 2006). Conversely, weak self-efficacy can diminish adoption, even when systems are technically sound (Mukred et al., 2024; Tahir & Jabar, 2024).

In professional accounting education, this integration is especially important. Students preparing for professional examinations face high stakes, where both competence and confidence determine performance. By combining TAM with self-efficacy, this study provides a multidimensional framework for assessing readiness that captures both perceptions of technology and the psychological resilience needed to succeed in digital examinations.

METHODOLOGY

A quantitative survey design was used to examine accounting students’ perceptions of readiness for CBA. The framework draws on TAM and self-efficacy theory to capture three dimensions of readiness, namely student, provider, and technology.

The target population comprised undergraduate accounting students enrolled in a faculty offering professional pathways. A total of 150 A6-sized questionnaires were printed on coloured paper, targeting returning students. The questionnaires were distributed in the advisors’ room, through which students typically passed before submitting their registration forms, ensuring a high response rate: this approach allowed students to complete the questionnaire while waiting their turn. The questionnaire required less than five minutes to complete, and students could either return it to their advisor or place it in a return box at the office. Participation was voluntary, and respondents were assured of confidentiality. A total of 126 complete and valid responses were collected. While convenience sampling limits generalisability, it provides valuable exploratory insights for this initial investigation into perceived readiness.

The survey instrument was developed by adapting items from prior studies of TAM and self-efficacy to fit the CBA context. The questionnaire was divided into two sections: Section A captured the demographic profile (three questions), and Section B measured readiness across (1) students (six questions), (2) technology (four questions), and (3) tuition provider (five questions). A minimal layout was chosen to reduce respondent confusion and keep the instrument simple. Responses were captured on short Likert-type scales, either ‘Yes’ (1) and ‘No’ (2) or ‘Agree’ (1), ‘Unsure’ (2), and ‘Disagree’ (3), to measure the intensity of perceptions. One optional open-ended question allowed respondents to share their views on CBA. The readiness dimensions and their measurement are tabulated in Table 1 below.

Table 1. The three dimensions and measurements used for the study

Dimension Measurement description
Self-Readiness Measured students’ perceptions of their own capability and preparedness for CBA. This construct aligns with the concept of change efficacy.
Faculty Readiness Gauged students’ perceptions of the instructors’ and institution’s preparedness to implement and support CBA effectively. This construct probes perceived management support.
Technology Readiness Captured students’ perceptions of the reliability and usability of the CBA systems and infrastructure.

Survey data were coded and analysed using descriptive statistics to identify key trends in student perceptions. The analysis compared readiness levels across the three dimensions (student, provider, and technology), highlighting strengths, weaknesses, and areas requiring intervention.
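For illustration, the sketch below shows how this coding and descriptive analysis might be carried out in Python with pandas. The file name, column names, and level labels are hypothetical placeholders rather than the authors’ actual data or scripts; only the response coding mirrors the scheme described above.

    # Minimal sketch of the descriptive analysis, assuming a hypothetical
    # CSV file ("cba_survey.csv") with one column per questionnaire item.
    import pandas as pd

    df = pd.read_csv("cba_survey.csv")

    # Code categorical responses numerically, as described in the text:
    # 'Yes' = 1, 'No' = 2; 'Agree' = 1, 'Unsure' = 2, 'Disagree' = 3.
    coding = {"Yes": 1, "No": 2, "Agree": 1, "Unsure": 2, "Disagree": 3}
    items = ["early_cba_exposure", "computer_know_how", "examination_skill",
             "time_management", "systematic_tool", "wider_coverage"]
    df[items] = df[items].replace(coding)

    # Per-item descriptive statistics (min, max, mean, std),
    # in the format reported in Tables 3-5.
    summary = df[items].agg(["min", "max", "mean", "std"]).T.round(3)
    print(summary)

    # Mean readiness by level of study (beginner/intermediate/professional),
    # comparable to the breakdown shown in Figure 1.
    print(df.groupby("level")[items].mean().round(2))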

FINDINGS AND DISCUSSION

This study assessed students’ perceived readiness for CBA through the Technology Acceptance Model (TAM) and self-efficacy constructs. The results reflect three dimensions: student readiness, provider readiness, and technology readiness.

Table 2. Demographic profile

Characteristics Frequency %
Gender Female 91 72.2
Male 35 27.8
Level Beginner 25 19.8
Intermediate 94 74.6
Professional 7 5.6
Age < 20 years 92 73.0
> 20 years 34 27.0

Table 2 presents the respondents’ demographic characteristics. A total of 126 valid responses were collected, representing 84% of the questionnaires distributed. Most respondents were female (91, or 72.2%), with 35 male respondents (27.8%). Respondents were mostly aged below 20 years (73.0%), while 34 (27.0%) were older than 20. In terms of academic progression, the majority were at the intermediate level (74.6%), with beginners and professionals constituting 19.8% and 5.6% of the sample, respectively. This demographic profile, characterised by a younger, predominantly female, and somewhat experienced but not yet professional group, is important for interpreting perceptions of readiness, particularly in the context of digital technology. Younger individuals are often considered more digitally native, which could influence their comfort and familiarity with technology-driven assessments.

Student Readiness

Six items were used to represent students’ self-efficacy and perceived ease of CBA usage.

Table 3. Student readiness

 Items Min Max Mean Std. Deviation
Early CBA exposure 0 2 1.020 0.389
Computer know-how 1 3 1.100 0.376
Examination skill 0 3 1.360 0.572
Time management 1 3 1.300 0.610
Systematic tool 0 3 1.150 0.421
Wider coverage 1 3 1.540 0.561

The findings reveal that students demonstrate low levels of early exposure (M = 1.02) and computer know-how (M = 1.10). These results suggest weak self-efficacy, with many students lacking confidence in their ability to operate CBA systems. Although examination skills (M = 1.36) and time management (M = 1.30) scored slightly higher, they remain modest, reflecting partial but not sufficient preparedness to engage with the digital assessment format. Interestingly, wider coverage (M = 1.54) was the most positively perceived element, indicating recognition of the usefulness of CBA in expanding assessment scope, in line with TAM’s “perceived usefulness.”

The student readiness constructs were further analysed by respondents’ level of study, as illustrated in Figure 1. The results indicate that students displayed limited self-efficacy in handling CBAs: many reported weak computer know-how and minimal prior exposure to digital examinations. While they recognised potential benefits such as broader coverage and improved efficiency, their confidence in using technology under exam conditions remained low. These findings align with Bandura’s (1997) view that self-efficacy is not a measure of actual skill but of belief in one’s ability to use skills effectively. When self-efficacy is weak, students are more likely to experience anxiety and resistance, even if they are otherwise academically capable. This highlights the importance of embedding structured digital literacy training within accounting curricula to strengthen student readiness.

The implementation of CBA has forced teaching and learning to shift from the whiteboard classroom to the computer lab. Computer usage is a necessary skill, as the structure of the exam requires students to answer in several formats: multiple choice, written answers in MS Word, and numerical workings in MS Excel. Computer skills are essential and need to be taught and developed before the exam date, which also means that lecturers must be trained to teach their students. For these reasons, the items asked focused on developing the skills needed by future digital accountants. Computer know-how ensures students understand the peripherals being used to assess their knowledge of accounting-related subjects. The majority of respondents (92%) agreed that knowing how to handle a computer is a must before sitting the CBA.

Figure 1. Students’ readiness by respondents’ level of study

Provider Readiness

The second dimension captures students’ perceptions of institutional support and the perceived usefulness of migrating from traditional PBE to CBA.

Table 4. Provider readiness

 Items Min Max Mean Std. Deviation
Enhanced security 0 3 1.350 0.584
Reduces cost 1 3 1.300 0.583
Minimises errors 0 3 1.560 0.676
Better assessment tool 1 3 1.340 0.493
Insecure assessment tool 0 3 1.790 0.765

Provider-related factors highlight areas of concern. Students expressed the highest negative perception toward insecure assessment tools (M = 1.79), reflecting doubt about the system’s safety and reliability. While minimisation of errors (M = 1.56) was rated more positively, other aspects such as enhanced security (M = 1.35), reduced cost (M = 1.30), and better assessment tools (M = 1.34) were perceived as weak. These results imply that institutions have not yet established strong credibility in supporting CBA implementation, thereby weakening students’ perceptions of usefulness and institutional readiness.

Provider-related factors emerged as a major area of concern. Students consistently rated institutional support as inadequate, particularly in relation to secure platforms and technical assistance. The perception that assessment tools lacked sufficient safeguards undermined institutional credibility, which is critical for building trust. This finding reinforces prior studies showing that institutional readiness is a key determinant of adoption (Ain et al., 2016). Skhephe (2022) reported that poor knowledge and incompetent skills among accounting teachers caused the failure and non-acceptance of online learning in some South African high schools. Thus, extending mock CBA sessions to both students and lecturers could address this gap by increasing familiarity and reducing uncertainty, thereby enhancing provider credibility.

Technology Readiness

The third dimension concerns perceived ease of use (PEOU) and barriers to using CBA, which together capture the technology readiness state.

Table 5. Technology readiness

 Items Min Max Mean Std. Deviation
Technical difficulties 0 3 2.240 0.824
CBA demo 0 2 1.170 0.475
Demo usefulness 0 3 1.510 0.666
Shorter assessment time 1 3 1.440 0.614

Technology-related factors show significant barriers to adoption. The highest mean was reported for technical difficulties (M = 2.24), underscoring the strong perception that technological disruptions are inevitable. CBA demo (M = 1.17) and demo usefulness (M = 1.51) were also underwhelming, suggesting inadequate training and orientation. These findings indicate that the current technological environment reduces both self-efficacy and perceived ease of use. Conversely, shorter assessment time (M = 1.44) suggests some recognition of efficiency, but this benefit is overshadowed by technical concerns.

Technology readiness was most constrained by concerns over reliability. Students expressed low confidence that systems would operate smoothly, frequently citing fears of technical difficulties. Training sessions and demonstrations were viewed as insufficient to prepare them for high-stakes conditions. In TAM terms, these perceptions weakened both determinants, PEOU and PU, undermining students’ willingness to accept CBA. This echoes Yavuz Temel et al. (2025), who found that usability issues strongly predict resistance to digital assessment. Similarly, Bearman et al. (2023) note that technology readiness implies an individual commitment to leverage new technologies and applications in performing tasks. Hence, building more robust digital infrastructure and providing hands-on exposure can directly address these concerns.

The technology element is crucial, as it may provide insight for examination bodies in ensuring that both the infrastructure and infostructure are in good shape before offering online-based assessment. ACCA, for instance, announced its transition plan to CBA one year ahead and made a CBA demo available via its website. Such a transition plan gives both teachers and institutions time to explore the system beforehand and to pass that information on to students.

The findings underscore a critical reality: the path to successful CBA implementation is fraught with perceptual barriers that extend beyond technical infrastructure. Students perceive significant shortfalls in their own readiness, the technology’s reliability, and the institution’s capacity to manage the change from traditional PBE to CBA seamlessly. TAM helps explain how students weigh usefulness and ease of use, but self-efficacy adds the crucial psychological layer. Professional accounting students with strong self-efficacy tend to see CBAs as valuable and manageable; conversely, those with less confidence are more likely to view CBA as difficult, unreliable, or not worth the effort. The message for faculties offering professional pathways is to be ready with strategies that strengthen both the practical and psychological sides of digital assessment, which connect to employability (Scott & Unsworth, 2018).

CONCLUSION

This study assessed accounting students’ perceived readiness for CBA across three dimensions, namely self, institution, and technology. The findings reveal that readiness depends less on access to technology and more on perceptions of confidence, institutional credibility, and system reliability. Students consistently identified technological difficulties as a universal barrier, signifying a lack of trust in system reliability. Low self-efficacy, particularly among foundational students, suggested uncertainty about their own ability to adapt and succeed in a digital exam environment. Perceptions of faculty readiness were mixed: although students acknowledged potential pedagogical benefits, they expressed scepticism about administrative efficiency and exam security. These findings underline that readiness is a multi-faceted construct, and that a shortfall in any one dimension can undermine the entire CBA initiative.

The findings carry both theoretical and practical implications. Theoretically, they show that readiness must be studied through a multidimensional lens. Integrating TAM and self-efficacy positions perceived readiness as a critical precursor to successful implementation and offers a model that can be adapted to other educational technology contexts.

This study offers a practical blueprint for higher education institutions to strengthen student readiness for computer-based or digital assessment. The findings highlight the need to move beyond technology-driven rollouts toward strategies that actively address student perceptions (Yeşilyurt et al., 2016). Universities could implement mandatory digital literacy modules for first-year accounting students to build foundational computer self-efficacy. Regular low-stakes mock CBAs, designed to mirror professional exam conditions, should be offered to both students and lecturers to reduce anxiety and familiarise users with digital platforms. Institutions should also establish transparent communication protocols including clear guidance on training, security measures, and contingency plans to reinforce trust in the system. Responsive and visible technical support would further strengthen institutional credibility.

Collaboration with professional bodies such as ACCA, CPA, and MICPA in developing training and mock assessment frameworks would ensure alignment between academic preparation and industry expectations. By embedding these practices, faculties not only enhance digital readiness but also directly support employability, as graduates will enter the workforce with both the technical and psychological competencies required to succeed in professional digital examinations. Aligning university practices with the digital standards already adopted by professional bodies will therefore improve readiness and reinforce graduate employability (Scott & Unsworth, 2018).

This study has several limitations. First, it was confined to a single faculty and relied solely on self-reported perceptions, which may not fully reflect actual readiness. Future research should triangulate these findings by assessing faculty members’ own preparedness and comparing the two perspectives to identify perception-reality gaps. Second, the cross-sectional design restricts insights into how readiness evolves over time. Longitudinal studies that track the rollout of structured CBA implementation programs would help identify the most effective interventions for building readiness. Third, expanding the sample across multiple faculties, institutions, or countries would strengthen the generalisability of findings.

Methodological extensions would also add value. Mixed-methods approaches that integrate surveys with qualitative methods such as focus groups (Sherafati & Mahmoudi Largani, 2023), case studies (Skhephe, 2022), or observation (Yavuz Temel et al., 2025) could provide richer insights into students’ anxieties, coping strategies, and expectations around digital examinations. Finally, future studies should consider the influence of professional bodies, as their policies and assessment reforms directly shape student expectations, perceptions, and confidence in adopting digital assessments (Alozie, 2022).

ACKNOWLEDGEMENT

We wish to thank the Faculty of Accountancy, Universiti Teknologi MARA Cawangan Selangor, for their support and funding.

REFERENCES

  1. ACCA. (n.d.). Computer-based exams. Students Technical Articles. https://www.accaglobal.com/my/en/student/exam-support-resources/fundamentals-exams-study-resources/f1/technical-articles/computer-based-exams.html
  2. Ain, N. U., Kaur, K., & Waheed, M. (2016). The influence of learning value on learning management system use. Information Development, 32(5), 1306–1321. https://doi.org/10.1177/0266666915597546
  3. Ally, S. (2024). A National E-assessment Implementation Framework: Assessing Readiness in Secondary Schools and Teacher Education in Tanzania. Journal of Issues and Practice in Education, 16.
  4. Alozie, C. (2022). Future of Accounting Education, Comparative Review of Divergent Issues in Accounting Education: Evidence from Five Focal Countries. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4044941
  5. Alwi, N. H., & Khan, B. N. A. (2024). Technology Readiness and Adoption of Artificial Intelligence Among Accounting Students in Malaysia. International Journal of Religion, 5(10), 4029–4038. https://doi.org/10.61707/e30gnv95
  6. Bandura, A. (1997). Self-efficacy: The exercise of control. (11th ed.). Freeman.
  7. Bearman, M., Nieminen, J. H., & Ajjawi, R. (2023). Designing assessment in a digital world: an organising framework. Assessment & Evaluation in Higher Education, 48(3), 291–304. https://doi.org/10.1080/02602938.2022.2069674
  8. Dangi, M. R. M., Saat, M. M., & Saad, S. (2023). Teaching and learning using 21st century educational technology in accounting education: Evidence and conceptualisation of usage behaviour. Australasian Journal of Educational Technology, 2023(1), 19–34. https://doi.org/10.14742/ajet.6630
  9. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly: Management Information Systems, 13(3), 319–339. https://doi.org/10.2307/249008
  10. King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43(6), 740–755. https://doi.org/10.1016/j.im.2006.05.003
  11. Mukred, M., Mokhtar, U. A., Hawash, B., AlSalman, H., & Zohaib, M. (2024). The adoption and use of learning analytics tools to improve decision making in higher learning institutions: An extension of technology acceptance model. Heliyon, 10(4), e26315. https://doi.org/10.1016/j.heliyon.2024.e26315
  12. Scott, M., & Unsworth, J. (2018). Matching final assessment to employability: developing a digital viva as an end of programme assessment. Higher Education Pedagogies, 3(1), 373–384. https://doi.org/10.1080/23752696.2018.1510294
  13. Sherafati, N., & Mahmoudi Largani, F. (2023). The potentiality of computer-based feedback in fostering EFL learners’ writing performance, self-regulation ability, and self-efficacy beliefs. Journal of Computers in Education, 10(1), 27–55. https://doi.org/10.1007/S40692-022-00221-3/METRICS
  14. Skhephe, M. (2022). Accounting Teachers’ Readiness to Continue With E-Learning and Teaching after the Covid-19 Pandemic in South Africa from 2020. Journal of Educational Studies, 21(3), 165–178. https://journals.co.za/doi/abs/10.10520/ejc-jeds_v21_n3_a10
  15. Strzelecki, A. (2024). Students’ Acceptance of ChatGPT in Higher Education: An Extended Unified Theory of Acceptance and Use of Technology. Innovative Higher Education, 49(2), 223–245. https://doi.org/10.1007/S10755-023-09686-1/TABLES/4
  16. Tahir, W. M. M. W., & Jabar, J. A. (2024). Analysis of Students’ Acceptance of Online Assessment in an Accounting Course Towards Academic Integrity Using Technology Acceptance Model. Journal of Academia, 12(1), 93–101.
  17. Yavuz Temel, G., Barenthien, J., & Padubrin, T. (2025). Using Jupyter Notebooks as digital assessment tools: An empirical examination of student teachers’ attitudes and skills towards digital assessment. Education and Information Technologies, 30(13), 18621–18650. https://doi.org/10.1007/S10639-025-13507-7/TABLES/5
  18. Yeboah, D. (2023). Undergraduate students’ preference between online test and paper-based test in Sub-Saharan Africa. Cogent Education, 10(2). https://doi.org/10.1080/2331186X.2023.2281190
  19. Yeşilyurt, E., Ulaş, A. H., & Akan, D. (2016). Teacher self-efficacy, academic self-efficacy, and computer self-efficacy as predictors of attitude toward applying computer-supported education. Computers in Human Behavior, 64, 591–601. https://doi.org/10.1016/J.CHB.2016.07.038
