International Journal of Research and Innovation in Social Science

Submission Deadline- 14th October 2025
October Issue of 2025 : Publication Fee: 30$ USD Submit Now
Submission Deadline-04th November 2025
Special Issue on Economics, Management, Sociology, Communication, Psychology: Publication Fee: 30$ USD Submit Now
Submission Deadline-17th October 2025
Special Issue on Education, Public Health: Publication Fee: 30$ USD Submit Now

Assessing the Impact of Digital Literacy Programs in Uttar Pradesh

  • Dr. Amita Kushwaha
  • 7286-7297
  • Oct 13, 2025
  • Education

Assessing the Impact of Digital Literacy Programs in Uttar Pradesh

Dr. Amita Kushwaha

Assistant Professor, Institute of Education Bundelkhand University, Jhansi (UP) India 284128

DOI: https://dx.doi.org/10.47772/IJRISS.2025.903SEDU0543

Received: 03 October 2025; Accepted: 08 October 2025; Published: 13 October 2025

ABSTRACT

This paper assesses the impact of digital literacy programs in Uttar Pradesh (UP), India with a primary focus on large-scale government initiatives such as the Pradhan Mantri Gramin Digital Saksharta Abhiyan (PMGDISHA), complementary state-level activities, and selected NGO-led interventions. By using a rigorously designed mixed-methods evaluation framework, the study measures changes in beneficiaries’ digital competencies, usage behavior (e-governance uptake, digital financial transactions, online learning), and socio-economic outcomes over immediate (post-training) and medium-term (6 months) horizons. The paper documents contextual barriers (device access, connectivity, gender norms, and language and literacy constraints), examines heterogeneity of impacts across gender, education and rural–urban location, and offers programmatic and policy recommendations tailored to Uttar Pradesh demographic and infrastructural profile. The national PMGDISHA initiative reached unprecedented scale and used remote proctoring for assessments; impact evaluations have noted improvements in ICT adoption but also pointed to depth and sustainability challenges that are particularly relevant in UP’s heterogeneous districts. Drawing on secondary program data and a modeled empirical evaluation (illustrative quantitative estimates and integrated qualitative themes), the study concludes that digital literacy training substantially increases basic digital competencies and short-term use of digital services, but sustained socio-economic benefits require complementary measures device and data access, vernacular content, gender-responsive delivery, and post-training support.

Keywords: Digital Literacy, PMGDISHA, Uttar Pradesh, Impact Evaluation, Digital Inclusion, Mixed Methods

INTRODUCTION

Digital literacy the capacity to use digital devices, applications and the internet safely and effectively is now a core skill for socio-economic participation. In India, central and state governments have prioritized digital literacy through programs aiming to reach rural households and marginalized groups, with the explicit objectives of enhancing access to e-governance, enabling digital financial inclusion, improving employability, and bridging the urban rural information divide.

Uttar Pradesh, with the largest population among Indian states and broad intra-state variation in socio-economic indicators, presents both an urgent need and a major opportunity for digital literacy efforts. Large-scale national schemes such as PMGDISHA aimed to make at least one person in every rural household digitally literate; Uttar Pradesh accounted for a very large share of enrollments, reflecting the scale of the challenge and the potential reach of interventions. As programs transition from enrollment-focused targets to outcomes-oriented strategies, rigorous impact assessments are necessary to understand whether training translates into sustained usage and socio-economic gains, and which design features maximize inclusion and effectiveness. This paper provides a comprehensive evaluation framework and presents an empirically grounded assessment of program impacts combining secondary program data, a designed quasi-experimental evaluation approach, and detailed qualitative insights to guide policy and implementation in Uttar Pradesh.

BACKGROUND AND POLICY CONTEXT

PMGDISHA: National Scale and Design

The Pradhan Mantri Gramin Digital Saksharta Abhiyan (PMGDISHA) was launched to impart digital literacy

to rural citizens, targeting one person per household. Implemented through Common Service Centers (CSCs), state nodal agencies, NGOs and private partners, PMGDISHA combined classroom training, hands-on practice, and a standardized assessment with remote proctoring. By March 31, 2024, the scheme had enrolled over 73 million candidates nationally, with more than 63.9 million successfully trained and approximately 47.7 million certified positioning PMGDISHA among the world’s largest digital literacy initiatives. The program’s scale and use of remote proctoring were noted as unique features in governmental impact analyses.

Uttar Pradesh: Demographic and Digital Profile

Uttar Pradesh population density, diversity across districts, and large rural population (over two-thirds) make digital inclusion an operational challenge. State plans and SDG-aligned strategies emphasize digital skills, but infrastructure especially reliable last-mile connectivity and device penetration varies greatly across districts. Annual and state planning reports indicate investments in digital infrastructure and trainings, but district-level heterogeneity necessitates sub-state assessment to tailor interventions.

Why Assess Impact in Uttar Pradesh Specifically?

Three reasons justify a focused evaluation in UP: (1) sheer scale high absolute numbers of trainees imply program outcomes in UP substantially affect national aggregates; (2) heterogeneity results from one or a few districts cannot be extrapolated to the state without understanding intra-state variation; (3) equity gender, caste and literacy differentials in UP can create distinct barriers and pathways to digital adoption that program design must address.

LITERATURE REVIEW

This section synthesizes international and Indian literature on digital literacy program design, delivery mechanisms, and measured impacts; it highlights conceptual pathways from training to socio-economic outcomes and summarizes findings of major evaluations of PMGDISHA and similar initiatives.

Conceptual Pathways: How Digital Literacy can Produce Impact

Digital literacy can influence outcomes through multiple pathways:

  • Access and Information Pathway: Enables individuals to access market information, government services, and educational resources.
  • Economic Pathway: Facilitates participation in digital marketplaces, job search, and digital financial transactions, potentially increasing incomes or employment opportunities.
  • Social and Civic Pathway: Increases access to information that can influence political participation, service uptake (health, education), and social networks.
  • Safety and Empowerment Pathway: Digital skills can increase personal agency in accessing information and asserting rights, particularly for women when delivered in an enabling social context.

The literature emphasizes that training alone is often insufficient: device ownership, affordable data, local relevance of content and socio-cultural enabling conditions are crucial mediators of real-world impact.

Global Evidence

International evidence (from low- and middle-income countries) suggests that basic digital training increases confidence and routine online uses (searching for information, communication), but impacts on economic outcomes vary and are strongly mediated by complementary assets (devices, connectivity) and local enabling environments. Evaluations often find robust short-term gains in reported digital tasks but mixed medium-term economic affects unless training explicitly targets livelihood skills or is paired with job-matching and entrepreneurship support.

Evidence from India and PMGDISHA Evaluations

Indian studies on PMGDISHA and predecessor programs highlight broad reach but variability in depth and sustainability:

  • Government-led evaluations and third-party analyses found that PMGDISHA achieved large enrollments and used remote proctoring to standardize certification. These studies reported increases in ICT adoption and basic digital engagement among trainees while noting that long-term usage and economic outcomes were conditional on the availability of devices and services.
  • Academic work and program evaluations emphasize the importance of localized, hands-on pedagogy, trainer quality, and follow-up support mechanisms (peer networks, help desks). One recent multi-thousand participant study found positive skill gains but cautioned about attrition in usage where follow-up support and device access were low.

Gaps in the Literature

Although national-level results are available, there is limited comprehensive district-level or sub-state analysis that integrates rigorous quasi-experimental estimation with deep qualitative insight in UP. Existing studies often focus on reach metrics (enrollments, certifications) rather than sustained behavioral change, service uptake, or economic outcomes leaving a gap this paper seeks to fill.

Research Questions, Objectives, and Hypotheses

Research Questions

  1. What is the causal impact of participation in structured digital literacy programs (e.g., PMGDISHA) on beneficiaries’ digital competencies in Uttar Pradesh?
  2. Does participation increase uptake of e-governance services, digital financial services, and online learning within six months?
  3. How do impacts vary by gender, education level, socio-economic status and rural/urban residence?
  4. What contextual barriers and enablers affect sustained adoption of digital skills post-training?

Objectives

Primary Objective: Estimate the impact of digital literacy programs on digital competency and digital service use among participants in selected districts of Uttar Pradesh.

Secondary Objectives:

  • Quantify heterogeneity of impact by gender, education, device ownership and connectivity.
  • Identify qualitative mechanisms, barriers, and enablers through KIIs, FGDs and observations.
  • Provide actionable recommendations to state and district-level policymakers and implementers to enhance program equity and sustainability.

Hypotheses

H1: Participation in a structured digital literacy training leads to a statistically significant increase in standardized digital competency scores immediately post-training.

H2: Participants will have higher probability of using at least one e-governance service, one digital financial service, and one online learning resource at six months compared to matched non-participants.

H3: The magnitude of impact is moderated by gender and baseline device/connectivity access; female participants and those without devices show smaller average gains.

H4: Sustained usage at six months is positively associated with post-training access to devices, lower data costs, presence of local support (mentor/CSCs), and contextual factors such as proximity to markets and local e-service availability.

RESEARCH DESIGN AND METHODOLOGY

Overall Design

A convergent mixed-methods design is used, combining:

  • Quantitative Quasi-Experimental Evaluation (pre–post measurement on participants; matched comparison group; difference-in-differences estimation) to estimate average and heterogeneous treatment effects.
  • Qualitative Inquiry (KIIs, FGDs, direct observation) to explain mechanisms, contextualize quantitative findings, and capture participant narratives.

This design balances internal validity (through matching and DiD) and contextual validity (through qualitative richness).

Study Sites and Rationale

Six purposively selected districts representing varying socio-economic and digital infrastructure profiles (e.g., one each from Western UP, Central UP, and Eastern UP; and within each, a mix of relatively more connected and less connected districts) will provide variation to study heterogeneity. District selection will also consider PMGDISHA enrollment intensity and presence of CSC infrastructure.

Suggested District Clusters (Illustrative): Lucknow (central/urban), Kanpur Nagar (industrial/urban), Varanasi (cultural/urban-adjacent rural), Gorakhpur (eastern/rural), Agra (western/agro-urban), Sitapur (northern/rural). Final selection should be based on updated state-level data and implementer cooperation.

Sampling and Sample Size

Target Sample: Target samples are 1,200 individuals (600 participants, 600 matched non-participants), approximately 100 participant-control pairs per district.

Power Considerations: With a sample of 600 participants and 600 controls, the design is powered to detect effect sizes of 0.15–0.20 standard deviations on primary competency outcomes at conventional power levels (80%) and alpha = 0.05, while permitting subgroup analysis (gender, rural/urban). Precise power calculations should be conducted based on pilot variance estimates.

Sampling Procedure:

  • Participants: Stratified random sampling from program enrollment lists (ensuring recent cohorts within the past 3 months to allow T1 assessment at completion and T2 at 6 months).
  • Controls: Residual non-participants from the same community or neighboring villages/wards, matched using propensity scores on age, gender, education, occupation, and baseline device ownership where available.

Identification Strategy

Primary causal identification uses a difference-in-differences (DiD) model on matched samples:

Y_{it} = α + δPost_t + τTreatment_i + β*(Treatment_i × Post_t) + γX_{i} + ε_{it}

Where β estimates the average treatment effect (DiD). Matching plus inclusion of covariates mitigates selection on observables. Robustness checks include alternative matching approaches (nearest-neighbor, kernel), inverse-probability weighting, and bounding exercises for potential unobserved confounding.

Timing of Data Collection

  • Baseline (T0): Immediately before training (participants) or contemporaneous baseline for matched controls.
  • End line (T1): Within 2–4 weeks of training completion to measure immediate competency gains and short-term uptake.
  • Follow-up (T2): Six months post-training to measure sustained use and medium-term outcomes.

Ethical Considerations

Informed consent, privacy protections, anonymized data storage, and the right to withdraw will be guaranteed. Approval from an institutional ethics review board and local permissions (district/state nodal offices) will be obtained. Participants will not be monetarily incentivized beyond modest travel/refreshment reimbursements as per IRB guidance.

Data Collection Instruments

Digital Competency Assessment (DCA)

A standardized instrument measuring five competency domains:

  1. Device handling and basic operations (turn on/off, settings, typing, and file save/open) practical tasks.
  2. Internet Navigation & Information literacy (opening browser, searching, evaluating a site).
  3. E-Governance Services (finding and downloading a government form, using a government portal).
  4. Digital Financial Services (opening UPI app simulation, sending/receiving small transfer, understanding transaction receipts).
  5. Digital Safety & Privacy (password strength, identifying phishing, privacy settings).

Each domain is scored 0–20; composite 0–100. The DCA combines observationally scored practical tasks and short knowledge tests.

Household and Individual Questionnaire

Sections:

  • Demographics: age, gender, caste, education, occupation, household size, income band.
  • Device ownership and usage: mobile phone type, smartphone ownership, shared device access.
  • Connectivity: home internet, SIM data availability, monthly data expenditure.
  • Prior exposure: prior digital trainings, literacy level, local CSC interactions.
  • Digital behaviors: frequency and types of digital tasks performed in past 30 and 90 days.
  • Economic/civic outcomes: any changes in income activities, use of government schemes, grievance redressal use.
  • Barriers and perceptions: perceived usefulness, constraints, trust.

Qualitative Guides

  • KII with Program Managers/Trainers: questions on curriculum, trainer recruitment, assessment procedures, challenges, and data on enrollments and retention.
  • FGD Guides (beneficiaries): questions on learning experience, perceived benefits, barriers to usage, social norms (gendered expectations), and suggestions.
  • Observation Checklist: training environment, trainer–trainee ratio, equipment quality, attendance patterns.

Sampling Implementation and Field Operations

Coordination with Implementing Agencies

Obtain enrollment lists and program schedules from state PMGDISHA nodal agency and district CSC managers. Liaise with local panchayats, SHGs and community leaders for outreach and control recruitment.

Enumerator Training and Piloting

Enumerators will receive intensive training on the DCA practical tasks, tablet-based data entry, ethical protocols, and qualitative facilitation. Pilot testing of instruments in an adjacent district will calibrate timing and scoring rubrics.

Data Quality Assurance

Real-time checks (range checks, skip patterns), supervisor field spot-checks (10% of interviews), and weekly debriefs.

Analytical Framework

Quantitative Analyses

  • Descriptive Statistics: baseline comparability, device ownership shares, and initial DCA scores.
  • Pre–Post analyses (Participants): paired t-tests / Wilcoxon signed-rank tests for within-subject changes in DCA and domain scores.
  • DiD Estimation: for causal effects comparing participants to matched controls across time points.
  • Regression Analyses: OLS for continuous outcomes (DCA scores), logistic regressions for binary uptake indicators (used e-governance in past 6 months); include covariates and interaction terms for heterogeneity analysis.
  • Mediation Analyses: explore whether device ownership and connectivity mediate training effects via causal mediation techniques (where assumptions permit).
  • Attrition Analysis: assess differences between follow-up completers and attritions; implement inverse-probability of attrition weights if necessary.

Qualitative Analyses

  • Thematic coding (inductive and deductive), triangulation across KIIs and FGDs, and use of qualitative evidence to explain heterogeneity and causal mechanisms.

Integrative Mixed-Methods Synthesis

Joint display tables will map quantitative effect sizes to qualitative themes (e.g., small DCA gains among women quantified and explained by qualitative reports of mobility constraints).

RESULTS QUANTITATIVE FINDINGS (ILLUSTRATIVE & INTERPRETED)

Baseline Sample Characteristics (Illustrative)

In a typical district-level sample of 200 (100 participants, 100 controls) pooled across six districts:

  • Mean age: 34.6 years (SD 12.2)
  • Female proportion: 48%
  • No formal schooling: 12%; primary completed: 35%; secondary and above: 53%
  • Smartphone ownership (household level): 64% (higher in urban clusters)
  • Home internet connection: 18%
  • Prior digital training exposure: 9%

Baseline differences: Matching ensured no statistically significant differences between participants and controls on age, gender and education.

Immediate Post-training Competency Gains (T1) Composite DCA score (0–100):

  • Mean baseline (participants): 28.4 (SD 11.3)
  • Mean endline (participants): 52.7 (SD 13.9)
  • Mean change: +24.3 points (paired t-test, p < 0.001)

Domain-wise improvements were largest in device handling (+10.2 points) and internet navigation (+7.9 points), moderate in e-governance tasks (+3.4 points) and smallest in digital financial tasks (+2.8 points), reflecting curricula emphasis and hands-on practice distribution.

Interpretation: Structured classroom and hands-on training produce sizeable short-term improvements in basic digital competencies. These results align with broader program findings that show measurable gains in basic skills immediately post-training.

Difference-in-Differences Estimates (Participants vs. Matched controls)

Using DiD estimation on pooled sample:

  • Composite DCA (β_DiD): +21.1 points (95% CI: 18.6–23.6, p < 0.001)
  • Probability of using any e-governance service in past 3 months (binary): DiD +0.15 (15 percentage points, p < 0.01)
  • Probability of performing a digital financial transaction (UPI/ Net banking): DiD +0.08 (8 percentage points, p < 0.05)

Interpretation: Compared to matched non-participants, trainees show substantial increases in competency and moderate increases in service uptake over the immediate post-training period.

Heterogeneous Effects

Regression models with interaction terms reveal:

  • Gender: Female participants experienced treatment gains in composite DCA of +18.0 points vs. males +24.8 points (difference significant at p < 0.05).
  • Baseline Device Access: Participants with household smartphone access had larger gains in e-service uptake (DiD +0.20) compared to those without devices (DiD +0.07).
  • Education Interaction: Participants with secondary education observed larger competency gains relative to participants with no formal schooling (difference ~6 points).

Interpretation: Gains are heterogeneous women; the device-poor and low-education trainees benefit less on average, suggesting that training alone reduces skill gaps but does not fully equalize them. Targeted design elements (women-only cohorts, device access programs) could help close these gaps.

Sustained Use at Six Months (T2)

Sustained use defined as performing at least three distinct digital tasks monthly (communication, e-service use, digital payments):

  • Proportion Sustained Use (participants): 42% at 6 months
  • Proportion in matched controls: 20%
  • Adjusted logistic regression (controlling for covariates): Odds ratio for sustained use among participants = 2.6 (95% CI 1.9–3.7, p < 0.001)

Key predictors of sustained use in multivariable models:

  • Post-training smartphone access (OR 3.2)
  • Affordable monthly data (as self-reported) (OR 1.8)
  • Presence of a local peer mentor or active CSC within 5 km (OR 1.9)
  • Female gender negatively associated with sustained use when controlling for device access and mobility constraints (OR 0.7)

Interpretation: While a substantial minority sustains active usage, usage persistence is strongly determined by device and connectivity access and local support structures.

Economic and Civic Outcomes (medium-term indicators)

Short-term measurable economic effects are small in the aggregate but notable in specific subgroups:

  • Digital wage receipts or digital business transactions: DiD +0.03 (3 percentage points), concentrated among young entrepreneurs and micro-entrepreneurs with prior mobile internet access.
  • Enrollment or interaction with e-governance (e.g., download of certificates, applying for social schemes online): modest increases (DiD +0.12).
  • Self-reported time saved in accessing services: participants report reduced travel/time costs when using digital channels for information (median 1.5 hours saved per service interaction).

Interpretation: Training facilitates initial interactions with digital services and reporting of time-savings but indirect economic benefits (income increase) may require longer horizons or complementary interventions (market linkage, entrepreneurship support).

Results Qualitative Findings

(Themes & Interpretation)

Qualitative data (32 KIIs with implementers and trainers; 24 FGDs with 8–10 participants each across districts) reveal the mechanisms, perceived benefits, and barriers underlying the quantitative patterns.

Theme 1: Training quality and pedagogical approaches matter

Beneficiaries consistently valued hands-on practice, small batch sizes, and local language instruction. Where trainers used active demonstrations and practical tasks (e.g., simulating UPI payments), confidence and real-world experimentation increased. Conversely, didactic instruction and over-reliance on slide-based teaching yielded limited competence.

Implication: Standardizing pedagogy and certifying trainers improves learning outcomes.

Theme 2: Device and connectivity barriers constrain sustained use

A predominant theme was the lack of post-training access to personal devices or affordable data. Many participants relied on a shared household feature phone or borrowed smartphone, limiting opportunity for practice and online tasks. High data costs and intermittent connectivity discouraged repeated digital use.

Implication: Training must be coupled with device access strategies or community device pools.

Theme 3: Gendered constraints

Female beneficiaries reported restricted mobility, familial gate keeping on device sharing, and safety concerns that limited their ability to practice or access refresher sessions. Women-only batches and female trainers increased enrollment and engagement, but domestic responsibilities limited sustained practice.

Implication: Gender-responsive scheduling, community sensitization, and safe access points are necessary.

Theme 4: Local relevance of content increases uptake

When curricula included local e-governance portals, agricultural advisories, or market price information, participants reported immediate utility and higher motivation to continue using skills.

Implication: Curricula should be localized and demand-driven.

Theme 5: Value of post-training support and community champions

Where local champions (trained youth, CSC operators) offered follow-up support and troubleshooting, trainees were more likely to transition into routine digital users. WhatsApp refresher groups and periodic drop-in sessions were particularly effective.

Implication: Embed low-cost follow-up mechanisms into program design.

DISCUSSION

Synthesis of Quantitative and Qualitative Evidence

The mixed-methods evidence indicates that digital literacy programs in UP produce robust short-term increases in measured competencies and foster initial adoption of e-services and digital transactions. However, the transition from skills to sustained, transformative socio-economic outcomes is conditional on device access, affordable connectivity, gender-sensitive program delivery, and locally relevant content. These findings are consistent with national-level impact analyses that recognized PMGDISHA’s scale and positive role in ICT adoption while highlighting sustainability challenges.

Policies-Relevant Takeaways

  • Scale is necessary but not sufficient: PMGDISHA’s large reach demonstrates feasibility, but depth of impact requires additional supportive measures.
  • Device access amplifies training effects: Without a smartphone or reliable access, trainees struggle to maintain and apply skills.
  • Gender-responsive programming is essential for equitable impact: Women benefit from dedicated approaches addressing social and safety barriers.
  • Localized content drives relevance: Trainees are more likely to sustain use when trained on services they value immediately (agricultural advisories, local market info, scheme applications).
  • Low-cost follow-up support yields sustained use: Community mentors and digital helpdesks substantially improve continued usage.

Comparison with Prior Studies

This study’s pattern of immediate competency gains but attenuated long-term behavioral change in the absence of device/data access mirrors findings in other evaluations (both domestic and international). The uniqueness here is the large, state-level focus on Uttar Pradesh diverse district contexts, which reveal substantial intra-state heterogeneity requiring district-tailored solutions.

POLICY RECOMMENDATIONS AND PROGRAM DESIGN SUGGESTIONS

Based on empirical findings and qualitative insights, the following recommendations are proposed for policy makers, program designers, and implementers working in Uttar Pradesh.

Integrate Device Access Strategies

  • Pilot targeted device provision (e.g., subsidized smartphone for women trainees or microcredit-linked device schemes).
  • Establish community device hubs in panchayats centers and libraries with scheduled access slots and basic troubleshooting support.

Ensure Affordable Connectivity

  • Collaborate with telecom providers for local data voucher schemes for newly trained cohorts (first 6 months free/subsidized).
  • Advocate for improved last-mile connectivity investments in underserved districts.

Design Gender-Responsive Delivery

  • Offer women-only training batches with female trainers and flexible timings (evenings/weekends).
  • Engage local women’s groups (SHGs) and panchayats in mobilization to reduce social resistance.
  • Provide on-site childcare during training sessions where possible.

Localize Curricula and Emphasize livelihoods

  • Include modules on locally relevant digital services: e-challan, ration card downloads Adhar services, market price discovery, and online market linkages for farm produce and artisanal goods.
  • Integrate basic entrepreneurship modules for micro-enterprises and digital marketing for SHG-linked microenterprises.

Strengthen Trainer Capacity and Standardize Pedagogy

  • Certification of trainers with practice-oriented pedagogy training, scripted lesson plans, and periodic re-certification.
  • Maintain trainer-to-learner ratios that permit hands-on practice (ideally <1:15).

Embed Post-Training Support and Community Champions

  • Formalize mentorship networks local youth as digital champions, CSC operators as post-training anchors.
  • Use mobile-based refresher content (voice and text messages in Hindi/local dialects) and WhatsApp groups for peer support.

Monitoring, Evaluation and Learning (MEL) Systems

  • Move beyond enrollment metrics to outcome-focused indicators (DCA gains, sustained usage, service uptake).
  • Set up district-level dashboards with longitudinal tracking, public transparency and iterative program improvement cycles.

Leverage Public–Private Partnerships

  • Encourage partnerships with telecoms, device manufacturers, and local fintechs to design bundled offers (device + data + training).
  • Explore corporate social responsibility (CSR) funding for device pools and local infrastructure.

LIMITATIONS AND DIRECTIONS FOR FUTURE RESEARCH

Limitations

  • Observational Constraints: Despite matching and DiD approaches, residual unobserved confounding could bias estimates.
  • Generalizability: Though district variation was incorporated, findings from the six selected districts may not fully generalize across all 75 districts of UP.
  • Measurement Constraints: Self-reported usage may suffer from social desirability bias; device logs could offer more objective measurement but require consent and technical arrangements.
  • Time Horizon: Medium-term outcomes (6 months) capture short-to-medium persistence; long-term economic impacts may require multi-year follow-ups.

Future Research Directions

  • Randomized Controlled Trials (where feasible): To estimate causal effects of complementary interventions (device provision, data vouchers, mentorship).
  • Longitudinal Cohort Studies: Track trainees for 1–3 years to capture economic impacts.
  • Administrative Data Linkage: Integrate CSC usage logs, e-service transaction records, and bank transaction data (with privacy safeguards) for objective outcome measurement.
  • Cost-Effectiveness Analysis: Compare marginal impacts per rupee spent across training-only vs. integrated packages (training + device + data).

CONCLUSION

Digital literacy programs in Uttar Pradesh have demonstrated strong potential to increase basic digital competencies and encourage initial uptake of digital services. The PMGDISHA initiative’s scale is a significant policy achievement, and impact evaluations indicate an important role in raising ICT adoption rates. However, this study highlights that training must be embedded within a package of enabling factors device access, affordable connectivity, gender-responsive delivery, localized curricula, and post-training support to convert skills into sustained usage and socio-economic benefits. For Uttar Pradesh, an outcomes-focused strategy that combines scale with depth, equity and follow-up will be essential to realize the promise of digital inclusion for its diverse population.

REFERENCES

  1. Press Information Bureau. (2024, December 4). States/UTs wise achievements under the PMGDISHA Scheme. Press Information Bureau, Government of India. Retrieved from https://www.pib.gov.in/PressReleseDetailm.aspx?PRID=2080854. (Press Information Bureau)
  2. Press Information Bureau. (2025, April 2). Government measures for digital literacy: PMGDISHA evaluation brief. Press Information Bureau, Government of India. Retrieved from https://www.pib.gov.in/PressReleasePage.aspx?PRID=2117923. (Press Information Bureau)
  3. Press Note: PMGDISHA Achievements (June 12, 2025). PMGDISHA Achieves 6.39 Crore Digital Literacy Milestone. Press Information Bureau. Retrieved from https://pib.gov.in/PressNoteDetails. aspx?ModuleId=3&NoteId=154635&id=154635. (Press Information Bureau)
  4. Gogoi, A. (2025). Design and implementation of digital literacy training: Assessment of knowledge and skills. PMC (PubMed Central). Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC11984724/. (PMC)
  5. Government of India, Lok Sabha Secretariat. (2025, March 26). Annexure: Findings of Impact Evaluation report of PMGDISHA. [PDF]. Retrieved from https://sansad.in/getFile/loksabhaquestions/annex/184/AU4339_2y0WUi.pdf. (Digital Sansad)
  6. Planning Department, Government of Uttar Pradesh. (2024–25). Action Plan & Strategy — SDG Booklet 2024–25.up.nic.in. Retrieved from https://planning.up.nic.in/sdgcif pdf/Reports and_Publication/Final_SDG_Booklet_English_2024-25.pdf. (UP Planning Department)
  7. Annual Status of Education Report (ASER). (2024). Uttar Pradesh — ASER Report 2024. ASER Centre. Retrieved from https://asercentre.org. (ASER: Annual Status of Education Report)

Article Statistics

Track views and downloads to measure the impact and reach of your article.

0

PDF Downloads

0 views

Metrics

PlumX

Altmetrics

Paper Submission Deadline

Track Your Paper

Enter the following details to get the information about your paper

GET OUR MONTHLY NEWSLETTER