Evaluating Student Satisfaction with REthica: A Digital Tool for Research Ethics Approval at the Faculty of Law, Universiti Teknologi MARA
- Dr. Ira Rozana Mohd Asri
- Dr. Ibtisam Ilyana Ilias
- Dr. Muhammad Fikri Othman
- Khairul Aiman Samsudin
- Norsyazwani Mohamad Nazri
- Pages 2782-2790
Faculty of Law, Universiti Teknologi MARA
DOI: https://dx.doi.org/10.47772/IJRISS.2025.907000233
Received: 03 July 2025; Accepted: 10 July 2025; Published: 11 August 2025
ABSTRACT
In recent years, the need for a structured and transparent process for research ethics approval has become increasingly important in higher education institutions. To address administrative delays and improve students’ understanding of ethical compliance, the Faculty of Law, Universiti Teknologi MARA (UiTMLaw) has introduced the REthica application, a digital platform designed to facilitate and monitor ethics approval for undergraduate and postgraduate research. As UiTMLaw places strong emphasis on academic integrity and ethical research practices, evaluating students’ experience with this tool is essential. This study investigates the level of satisfaction among Part 5 and Master of Enforcement UiTMLaw students who have used the REthica application for their research ethics approval applications. Employing a quantitative approach, data were collected via a structured questionnaire distributed to Part 5 and Master of Enforcement UiTMLaw students during the 2024/2025 academic session. Participants were selected through purposive sampling, as they are required to complete a final-year research project and a dissertation respectively. The data were analysed using descriptive statistical techniques to assess students’ perceptions of the app’s usability, effectiveness, and areas for improvement. The findings reveal that while students generally appreciated the app’s user-friendly interface and the transparency it offers in tracking the ethics approval process, several technical challenges were noted, indicating room for refinement. The study underscores the importance of ongoing system enhancements to ensure that the REthica app effectively supports students in navigating the ethics approval process, upholds academic integrity, and promotes a robust research culture within the faculty.
Keywords – Student Satisfaction, Ethics Approval, Digital Platform, Higher Education, Law Students, Research Management
INTRODUCTION
In today’s digital landscape, technology has transformed the way tasks are completed, providing users with unparalleled convenience and accessibility. From remote work to instant information access and global collaboration, digital platforms have become essential. This digital shift is evident in academic environments, where students increasingly rely on technology to facilitate their studies. Recognizing this trend, Universiti Teknologi MARA (UiTM) Shah Alam has introduced REthica, an application to streamline the ethics review process for research conducted by undergraduate and postgraduate law students.
Figure 1: Home screen of the REthica application
Prior to this, the Coordinator of the Research Ethics Committee would notify students by sharing four different QR codes, representing four different types of applications, through a WhatsApp group. The four applications were: 1) application by undergraduates for exemption; 2) application by undergraduates for full approval; 3) application by postgraduates for exemption; and 4) application by postgraduates for full approval. Students had to download the relevant forms, fill them in and submit them via the prescribed Google Form link. Students could not track the status of their applications, and the contact number of the person in charge was not specified. Thus, some of them even contacted the Deputy Dean, Research and Industry Networking, to check the status of their applications.
This shows that the traditional method for ethics approval at the Faculty of Law, UiTM was disorganised, unclear and involved too much paperwork, which caused delays, confusion and dissatisfaction among students. To solve these problems, the REthica application was created. REthica was developed to improve the research ethics application process and enhance the students’ experience through a centralised online platform. Nevertheless, the product is still new and was used for the first time this semester. While the system aims to provide greater transparency, efficiency and user-friendliness, students may encounter challenges related to technical issues, communication clarity and usability. Understanding the satisfaction levels and experiences of students using the system is essential to identify its strengths and areas for improvement. This study seeks to address this gap by evaluating students’ satisfaction and proposing enhancements to optimise the application’s functionality and support student research endeavours.
Therefore, the research questions for this paper are: 1) To what extent are UiTMLaw students satisfied with the REthica application? 2) What are the common challenges faced by students while using the application? 3) What suggestions do students have for improving the REthica application?
Meanwhile, the research objectives for this paper are: 1) to evaluate the level of satisfaction of UiTMLaw students with the REthica application; 2) to identify challenges encountered by students using the application; and 3) to gather students’ suggestions for improving the REthica application.
LITERATURE REVIEW
Technology has significantly transformed higher education, providing flexible and accessible platforms for learning and administrative processes. Online systems, including learning management platforms and administrative tools, enhance the academic experience by streamlining tasks and promoting collaboration (Anderson & Dron, 2017). Similar technological advancements have also been applied to research management systems, simplifying application and approval processes (Johnson et al., 2020).
The importance of ethics approval in academic research is well-established. Ethics committees play a vital role in ensuring that research adheres to established ethical standards, safeguarding the rights and welfare of participants (Resnik, 2018). Efficient ethics approval processes contribute to smoother research progress and enhance the overall research culture in institutions (Smith, 2019).
Several institutions have adopted digital ethics management systems to enhance the transparency and efficiency of the approval process. Research indicates that digital platforms reduce administrative burden, improve tracking capabilities and provide clear communication channels for applicants and reviewers (Brown & Green, 2021). However, challenges such as inadequate technical support, delayed feedback and lack of user training have also been reported (Taylor et al., 2022).
Student satisfaction is a key metric in evaluating the effectiveness of academic support systems. Factors influencing satisfaction include ease of use, responsiveness, transparency and communication quality (Davis, 1989). Ensuring that users have access to appropriate resources and support mechanisms further enhances satisfaction levels (Chen et al., 2021).
RESEARCH METHODOLOGY
This study adopted a quantitative survey approach to assess the level of satisfaction with the REthica application among university students. The target population consisted of students from the Faculty of Law, Universiti Teknologi MARA (UiTM) Shah Alam. A total of 67 students took part in the survey, selected through purposive sampling to ensure that participants were Part 5 undergraduate students and Master of Enforcement postgraduate students at UiTMLaw, all of whom had experience using the REthica system for their research ethics approval application.
Quantitative research is a systematic approach used to numerically assess and analyse variables, allowing researchers to identify patterns and relationships within a target population (Creswell & Creswell, 2018). This method is particularly useful when the goal is to test hypotheses, measure attitudes or predict outcomes based on statistical evidence (Neuman, 2014). One of the main strengths of quantitative research lies in its ability to produce precise, replicable and objective results, as it relies on standardised tools and structured procedures (Bryman, 2016). It also allows for broader generalisation of findings due to its reliance on larger, representative samples. However, quantitative methods are not without limitations. Among the common criticisms are the lack of contextual richness, limited flexibility in exploring deeper meanings and the risk of misrepresenting complex social phenomena due to rigid structures (Punch, 2013; Silverman, 2020).
In this study, the primary tool for data collection was a structured survey questionnaire comprising 10 (ten) closed-ended questions and 1 (one) open-ended question. Survey questionnaires are widely recognised as a practical and effective tool in educational and social research for gathering consistent data across multiple respondents (Check & Schutt, 2012). The use of closed-ended questions enables researchers to quantify responses efficiently, while the inclusion of an open-ended question allows for limited qualitative insight. The questionnaire was designed to align closely with the research objectives, ensuring that each item contributes directly to measuring the constructs of interest (Fowler, 2014). According to Groves et al. (2009), well-constructed survey instruments enhance the reliability and validity of findings by providing clear and understandable questions that minimize ambiguity and bias. Furthermore, surveys are particularly valuable in educational contexts due to their cost-effectiveness and scalability, making them suitable for reaching a large number of participants within a short time frame (Mertens, 2020).
The questions were formulated in English, as the respondents are more familiar with this language. The closed-ended questions were designed with multiple-choice or “Yes”/“No” response options, enabling straightforward analysis of the students’ level of satisfaction.
Data Collection
A pilot test was carried out to evaluate the design and instrumentation of the study for potential weaknesses (Blumberg, Cooper, & Schindler, 2014). The main purpose of this preliminary test was to identify any ambiguous or misleading items in the questionnaire that could undermine alignment with the research objectives. The pilot questionnaire was distributed to 16 respondents, all of whom were postgraduate law students at the Faculty of Law, UiTM Shah Alam. Using a quantitative research design, the survey questionnaires were distributed online through WhatsApp Messenger.
The nature of the case study required the collection of quantitative data from undergraduate law students taking the final-year project course at the Faculty of Law, UiTM Shah Alam. Eventually, the study collected responses from 67 respondents. The survey questionnaires were distributed to the respondents through WhatsApp Messenger from 14 May 2025 until 20 June 2025.
In the survey, the respondents were asked about the practicality of the application, as well as their satisfaction with it and their opinions on how to improve it. Most questions used multiple-choice answers, while some used dichotomous yes/no answers. The survey was created using Google Forms to facilitate online distribution.
Data Analysis
Responses were coded and analysed manually using descriptive statistics. Responses to each question were tallied, and the results were interpreted as percentages to identify trends in satisfaction levels. This method allows for a clear interpretation of the overall level of satisfaction in each area of the product surveyed. The data were then tabulated and compared across different sections to identify strengths and weaknesses in the students’ satisfaction.
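The tallying described above can be sketched as a short script. This is an illustrative sketch only — the `percentage_breakdown` helper is hypothetical, not part of the study’s actual analysis, though the sample counts below are the satisfaction figures reported in the Findings (20, 43 and 4 out of 67 respondents):

```python
from collections import Counter

def percentage_breakdown(responses):
    """Tally coded survey responses and express each category as a
    percentage of the total, rounded to the nearest whole number."""
    counts = Counter(responses)
    total = len(responses)
    return {category: round(100 * n / total) for category, n in counts.items()}

# Illustrative data only: the 20/43/4 satisfaction counts from the Findings.
satisfaction = (["Highly satisfied"] * 20
                + ["Satisfied"] * 43
                + ["Occasionally satisfied"] * 4)
print(percentage_breakdown(satisfaction))
# → {'Highly satisfied': 30, 'Satisfied': 64, 'Occasionally satisfied': 6}
```

Rounding each share to the nearest whole number reproduces the 30%, 64% and 6% figures reported for satisfaction levels.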
FINDINGS
Figure 2: Mobile operating system used
Figure 2 shows that a total of 56 respondents (approximately 84%) reported using an iPhone (iOS), while 11 respondents (approximately 16%) indicated that they use an Android device. This shows that the majority of users are on iOS.
Figure 3: User-friendliness
Based on Figure 3, a total of 33 respondents (49%) indicated that the application is user-friendly all the time, while 26 respondents (39%) said it is user-friendly most of the time. Another 8 respondents (12%) felt it is user-friendly only sometimes. This suggests that the majority of users generally find the application to be user-friendly.
Figure 4: Ease of use
According to Figure 4, a total of 34 respondents (approximately 49%) stated that the application is very easy to use, while 29 respondents (42%) described it as easy. An additional 6 respondents (9%) indicated that it is only sometimes easy to use. This indicates that most users find the application generally easy or very easy to navigate.
Figure 5: Satisfaction level of users
Out of the total respondents, 20 (30%) indicated that they are highly satisfied with the application, while 43 (64%) reported being satisfied. Only 4 respondents (6%) described themselves as occasionally satisfied. This demonstrates that overall user satisfaction with the application is very high.
Figure 6: Common user issues reported
Figure 6 clearly shows that visual design (34%) and bugs or crashes (34%) are the most frequently reported issues. Meanwhile, other issues such as software stability, confusing UI and storage problems were not the users’ main concerns.
Figure 7: Perceived success of the application
Out of 67 respondents, 66 (approximately 99%) answered “Yes,” indicating that they believe the application is successful. Only one respondent (1%) answered “No.” This demonstrates overwhelming user confidence in the success of the application.
Figure 8: Effectiveness of the application
All 67 respondents (100%) answered “Yes” to this question, indicating that users unanimously find the application useful. This reflects a strong endorsement of the application’s value and functionality.
Figure-9: User’s favourite of the application
When asked what they liked most about the application, the majority of users (23 respondents or approximately 31% highlighted its functionality. 14 respondents (19%) appreciated the interface and appearance, while 11 (15%) noted the clarity of content. Stability and performance were mentioned 10 times (14%) and 9 respondents (12%) praised the application’s ease of use and navigation. A smaller number of respondents appreciated its accessibility or convenience (5 respondents or approximately 7%) and 2 respondents (3%) liked the notification feature. This feedback reflects a strong appreciation for the application’s core functions and usability.
Figure 10: Clarity of information and instructions in the application
A total of 27 respondents (40%) stated that the information was “very clear,” while 35 (52%) said it was “clear.” Only 5 users (7%) felt that the information was only “sometimes clear.” This indicates that most users perceive the instructions and content in the application as clear and easy to understand.
Figure 11: Application rating
Among the respondents, 32 (approximately 48%) rated the application as “Very Good,” while 30 (45%) considered it “Good.” Only 5 respondents (7%) rated it as “Average.” These results indicate that the majority of users hold a positive view of the application’s overall quality.
Figure 12: Suggestions for improvement of the application
Users provided a range of suggestions for improving the application. The most common request (14 respondents, or approximately 27%) was to improve the visual design and aesthetics. Fixing bugs and crashes was also frequently suggested (12 respondents, or approximately 23%). Nine respondents (17%) asked for simpler navigation and more intuitive language. Other recommendations included improving performance speed (7 respondents, or 13%), adding useful features like notifications (5 respondents, or 10%), optimising storage use (3 respondents, or 6%) and adding support for dual-screen features on Android (2 respondents, or approximately 4%). These insights highlight key areas for potential enhancement based on user feedback.
DISCUSSION
The findings reveal a generally positive user experience, marked by high satisfaction, ease of use, and clarity of content. Several key patterns and insights emerge from the data:
Platform Preference and Implications
The overwhelming use of iOS devices (84%) suggests that iOS optimization should remain a development priority. However, with 16% of users on Android, ensuring feature parity and stability on that platform is still important, especially as some suggestions related specifically to Android needs (e.g., dual-screen support).
Usability and Accessibility
The majority of users found the application user-friendly and easy to navigate. This is further supported by users highlighting ease of use and interface as favourite features. The low incidence of complaints about navigation supports the overall positive sentiment.
High Satisfaction and Perceived Value
The satisfaction and success metrics (Figures 5 and 7) demonstrate strong user endorsement. That 100% of respondents found the application effective (Figure 8) is a significant validation of its core purpose and usability.
Common Challenges and Areas for Improvement
Although generally positive, the survey did identify recurring technical issues, particularly with visual presentation and bugs. These technical challenges align with user suggestions to enhance visual design and fix crashes. Although less commonly reported, stability and storage issues must also be addressed to maintain user trust.
Strengths of REthica
Functionality stood out as the most appreciated feature, indicating that the application delivers well on its intended functions. The high praise for its interface and clarity of content points to successful design and communication strategies.
Communication and Content Clarity
With over 90% of users finding the information either “clear” or “very clear,” it can be concluded that the instructional design is effective. Still, a small group (5 users) expressed occasional confusion, which suggests room for improving microcopy and user guides.
Overall Application Perception
The fact that over 90% of users rated the application as “Very Good” or “Good” highlights a strong brand perception and user trust. Maintaining this perception will require proactive attention to user feedback and ongoing updates.
User Suggestions as a Roadmap
The suggestions provided offer a clear direction for future development. Improving aesthetics and addressing technical bugs should be prioritized. Simplifying language, enhancing speed, and adding features like improved notifications will further elevate the user experience.
CONCLUSION AND RECOMMENDATIONS
The REthica application represents a meaningful step forward in digitalising and streamlining the ethics approval process for law students at UiTM Shah Alam. The study reveals a generally high level of student satisfaction, particularly regarding the system’s functionality, user-friendliness and clarity of information. However, several technical and usability-related challenges remain, such as visual design concerns, bugs and performance inefficiencies.
By addressing these issues and implementing user-centered improvements, the university can further enhance the effectiveness and reliability of the application. Ultimately, a well-functioning digital ethics approval system not only benefits students in completing their research requirements efficiently but also reinforces the institution’s commitment to ethical research practices and academic excellence.
Based on the findings of the study, several recommendations are proposed to improve the REthica application. First, it is essential to improve the visual design and user interface to make the application more modern, attractive and easier to navigate. A more consistent and intuitive layout will reduce user confusion and improve accessibility. In addition, the development team should prioritise fixing technical issues such as bugs and system crashes, which were frequently reported by users. Ensuring the application runs smoothly on both iOS and Android platforms is also critical, especially addressing specific problems faced by Android users, such as limited dual-screen support.
Furthermore, simplifying the language and navigation within the application can help students complete tasks more efficiently. Instructions should be written in clear, user-friendly language and the menu options should be logically structured to reduce confusion. The addition of notification features and real-time status tracking would also greatly benefit users by keeping them informed about the progress of their ethics application without needing to follow up with administrators. Providing a dashboard that clearly displays application stages and contact information for relevant personnel would further improve communication and transparency.
To support users, a help section or chat box could be integrated to answer common questions, along with brief tutorial videos or guides to assist new users in navigating the application. It is also important to ensure that the application is optimized for performance, particularly in terms of speed and storage efficiency to ensure it runs smoothly on all devices. Lastly, to ensure that the application continues to meet students’ needs, regular feedback should be gathered through in-app surveys or suggestion forms. This will allow the development team to respond to user concerns promptly and make continuous improvements to the system.
REFERENCES
- Anderson, T., & Dron, J. (2017). Teaching and learning in digital environments: The theory and practice of online education. Routledge.
- Blumberg, B., Cooper, D. R., & Schindler, P. S. (2014). Business research methods (4th ed.). McGraw-Hill Education.
- Brown, P., & Green, L. (2021). Digital ethics management in academic institutions. Academic Press.
- Bryman, A. (2016). Social research methods. Oxford University Press.
- Check, J., & Schutt, R. K. (2012). Research methods in education. SAGE Publications.
- Chen, Y., Smith, R., & Jones, M. (2021). Evaluating user satisfaction in digital academic platforms. Springer.
- Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches. SAGE Publications.
- Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
- Fowler, F. J. (2014). Survey research methods. SAGE Publications.
- Groves, R. M., et al. (2009). Survey methodology. Wiley.
- Johnson, R., & Taylor, L. (2020). Technology and research management in higher education. Wiley.
- Mertens, D. M. (2020). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods. SAGE.
- Neuman, W. L. (2014). Social research methods: Qualitative and quantitative approaches. Pearson.
- Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative approaches. SAGE.
- Resnik, D. B. (2018). Ethics in research: Protecting human participants. Springer.
- Silverman, D. (2020). Interpreting qualitative data (6th ed.). SAGE.
- Smith. (2019). Institutional ethics committees and research oversight. Oxford University Press.
- Taylor, K., et al. (2022). Digital challenges in ethics management: A comparative analysis. Cambridge University Press.