Digital Maturity Assessment Model (DMAM): assimilation of Design Science Research (DSR) and Capability Maturity Model Integration (CMMI)

Purpose – This study introduces the Digital Maturity Assessment Model (DMAM), a model tailored to assess the digital maturity of SMEs, tracing its development from addressing business challenges to establishing a comparative analysis framework grounded in Resource Dependence Theory (RDT).
Design/methodology/approach – DMAM is based on positivist philosophy and objectivist epistemology, supported by Design Science Research (DSR) and Capability Maturity Model Integration (CMMI). The methodology involves iterative development, from problem identification to creating a practical solution for assessing SMEs' digital maturity and guiding digitalization efforts.
Findings – DMAM offers a clear and specific methodology, distinguishing itself by addressing the unique needs of SMEs, particularly resource-dependent ones. The model's development fills critical gaps in existing literature and provides a practical artifact for SMEs' digitalization.
Originality/value – DMAM is original in its focus on the specific needs of resource-dependent SMEs, offering actionable recommendations and addressing shortcomings in existing models. It serves as a foundational framework for SMEs' digital transformation, making a significant contribution to the digital maturity assessment literature.

The absence of a universally recognized model for evaluating digital maturity in resource-dependent SMEs could exacerbate the digital divide between SMEs and larger organizations, threatening the competitiveness of smaller firms (Kljajić Borštnar & Pucihar, 2021). Therefore, developing a standardized model tailored for resource-dependent SMEs is crucial for advancing digital maturity in this sector (Zaoui & Souissi, 2020; Gimpel et al., 2018).

Related digital maturity models
2.1 Overview of existing models
Maturity models are essential for SMEs to assess and enhance their maturity levels in specific domains, offering a structured approach to understand their current state, identify gaps, set goals, and pursue continuous improvement (Makupi & Karume, 2021). According to Omol et al. (2023), these models guide organizations through stages of development to improve processes and practices. Commonly known as Maturity Models (MMs) or Maturity Assessment Models, they are vital for organizational development and transformation (Aguiar, Gomes, da Cunha, & da Silva, 2019; Makupi & Karume, 2021; Junior, Marczak, Santos, Rodrigues, & Moura, 2022).
Key elements of maturity models include: (1) Maturity Levels, representing stages of capability and effectiveness; (2) Key Process Areas, essential for achieving maturity and enhancing performance; (3) Indicators and Criteria, metrics for assessing maturity levels; (4) Continuous Improvement, focusing on ongoing enhancement and regular reassessment; and (5) Tailoring and Customization, adapting the model to specific needs and contexts (De Bruin, Rosemann, Freeze, & Kaulkarni, 2005).
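The five key elements above can be captured in a simple data structure. The sketch below is purely illustrative (the class, field names, and the 12-month reassessment cadence are assumptions, not part of any cited model):

```python
from dataclasses import dataclass, field

@dataclass
class MaturityModel:
    """Illustrative container for the five key elements of a maturity model."""
    maturity_levels: list            # (1) stages of capability and effectiveness
    key_process_areas: list          # (2) areas essential for achieving maturity
    indicators: dict                 # (3) metrics/criteria for assessing levels
    reassessment_interval_months: int = 12   # (4) continuous-improvement cadence (assumed)
    customizations: dict = field(default_factory=dict)  # (5) tailoring per context

# A CMMI-like instance, using level names that appear later in this paper.
cmmi_like = MaturityModel(
    maturity_levels=["Initial", "Managed", "Defined",
                     "Quantitatively Managed", "Optimizing"],
    key_process_areas=["Product", "Technology", "Organization",
                       "People", "Strategy", "Operations"],
    indicators={"Technology": "average 1-5 score on technology questions"},
)
```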

Gaps in existing models
One significant research gap is the lack of tailoring to the unique characteristics and requirements of SMEs. Several models, such as the Multi-Attributed Assessment Digital Model (MAADM) (Kljajić Borštnar & Pucihar, 2021), Digital Transformation Model (DTM) (Viloria-Núñez, Vázquez, & Fernández-Márquez, 2022), and Digital Transformation Capability Maturity Model Framework (DTCMMF) (Aguiar et al., 2019), are not specifically designed to address the distinctive needs of small and medium-sized enterprises. This generalization limits the applicability and effectiveness of these models in SME contexts.
Another notable gap is the unclear and poorly articulated methodologies employed in many models. For instance, the methodologies of the Digital Maturity Model (DMM) by Anderson & Ellerby (2018) and the Digital Capability Model (DCM) by Ramantoko, Fatimah, Pratiwi, and Kinasih (2018) are not clearly specified or described, making it difficult to understand the research processes and validate the findings. Additionally, the process for selecting dimensions or criteria in several models is not well-defined. This issue is observed in the DMM by Schallmo, Lang, Hasler, Ehmig-Klassen, and Williams (2021), the DTM by Viloria-Núñez et al. (2022), and the Digital Readiness Assessment (DRA) by Pirola, Cimini, and Pinto (2020). Without a clear approach to dimension selection, the reliability and validity of these models are compromised.
Furthermore, some models fail to translate their concepts into tangible or measurable artifacts, making practical implementation challenging. This limitation is evident in the DMM by Williams, Schallmo, Lang, and Boardman (2019) and the DMM by Schallmo et al. (2021), where the absence of tangible artifacts hinders operationalization. The lack of practical implementation tools is also a common gap in models like the Digital Maturity Readiness Model (DMRM) by Yezhebay, Sengirova, Igali, Abdallah, and Shehab (2021) and the Maturity Model for Evaluating Readiness (MMER) by Chonsawat and Sopadang (2021), reducing their usability in real-world settings.
Lastly, several models do not provide detailed methodologies for empirical validation or fail to explain how their methodologies were implemented. For example, the Digital Assessment Model (DAM) by Trotta and Garengo (2019) and the DTM by Jeansson and Bredmar (2019) lack comprehensive methodological clarity, which is crucial for assessing the robustness and applicability of their findings.

Comparative analysis
As shown in Table 1, numerous models with varying dimensions are available for SMEs pursuing digitization, creating a challenge in identifying the most critical dimensions for resource-dependent SMEs during digital transformation. To address this, the study introduces an 18 by 20 matrix to pinpoint dimensions commonly recognized in digital maturity assessments for SMEs and understand their environmental characteristics from the perspective of RDT, as presented in Table 1. The study sets a minimum requirement that dimensions must appear in at least 40% of the models, meaning they should score at least 8 out of 20 (Bumann & Peter, 2019).
To address this, the study adopts RDT to explore how external factors influence digital maturity. The proposed conceptual framework, depicted in Figure 2, integrates internal dimensions (Product, Technology, Organization, People, Strategy, Operations) with external factors (Government Support, Partnerships, Loans and Grants). This comprehensive approach aims to better assess digital maturity by considering both internal and external influences on SMEs.
The interplay between internal action fields and the SMEs' environment highlights the complex relationship between internal dynamics and external factors. SMEs' digital maturity is influenced not only by internal strategies and operations but also by external support, collaborations, and financial resources. The dependent variable, SME Digital Maturity, encapsulates the overall effectiveness of SMEs in utilizing digital technologies, strategies, and human capital. The conceptual framework underscores the importance of a holistic approach, integrating both internal and external dimensions, to fully assess and improve SMEs' digital maturity.

DMAM's unique contributions
The DMAM has evolved through a comprehensive development process, starting with a coherent methodology to identify business problems, as detailed in the introduction and "Gaps in existing models" sections. A comparative analysis using the matrix in Table 1 and RDT facets (Figure 1) was conducted to identify relevant variables. In comparison to various models, such as the Digital Maturity Model (DMM) by Williams et al. (2019), the Multi-Attributed Assessment Digital Model (MAADM) by Kljajić Borštnar and Pucihar (2021), the Digital Transformation Capability Maturity Model Framework (DTCMMF) by Aguiar et al. (2019), the Digital Maturity Readiness Model (DMRM) by Yezhebay et al. (2021), and the Digital Transformation Model (DTM) by Viloria-Núñez et al. (2022), which either lack SME focus or methodological clarity, DMAM stands out for its methodological rigor and alignment with SME requirements. Models like the DMM (Williams et al., 2019) and DMRM (Yezhebay et al., 2021) rely on systematic literature reviews and surveys but lack tangible artifacts or SME-specific focus. In contrast, the DMAM addresses these gaps by providing a specific, practical tool for assessing digital maturity in resource-dependent SMEs, significantly advancing the field.

Design Science Research (DSR)
The study is based on the Design Science Research (DSR) framework (Hevner, March, Park, & Ram, 2008; Kljajić Borštnar & Pucihar, 2021), which focuses on rigor, relevance, and development cycles, as shown in Figure 3. DSR is chosen for its suitability in addressing complex real-world problems by creating and evaluating innovative artifacts. The research aims to develop the Digital Maturity Assessment Model (DMAM) as an IT artifact to assess digital maturity levels for resource-dependent SMEs. Guided by the Goal Question Metric (GQM) approach, the study mapped variable weights to the model, resulting in a web-based prototype.

Research process
As depicted in Figure 4, this study applied synthesized and composite coherence to existing digital maturity and transformation literature to identify a research problem. This necessitated the adoption of a positivist philosophy and Design Science Research (DSR) integrated with the Capability Maturity Model Integration (CMMI). A Systematic Literature Review (SLR) was conducted, utilizing comparative analysis (Omol, Abeka, & Wauyo, 2017) as shown in Table 1, to identify gaps in the literature and the study's action fields, as depicted in Figure 2.
A research questionnaire, informed by the literature review and expert consultations, was then developed and validated by three information systems professors. It consisted of three sections: general business inquiries, CMMI five-level scale questions on various business aspects, and digital maturity outcomes. The questionnaire's validity was confirmed with an average score of 7.7/10, and its reliability was verified using Cronbach's alpha, with coefficients ranging from 0.756 to 0.987, indicating strong internal consistency. A pilot study in Kisumu's CBD, with a sample of 39 respondents (10% of the total), achieved an 87% response rate. The final DMAM questionnaire was deemed a reliable and valid tool for assessing the digital maturity of licensed SMEs in Nairobi's Central Business District, targeting managers and owners across sectors such as Retail, Transport, Hospitality, and Technology.
From 53,600 licensed SMEs, a stratified sampling method selected 382 respondents, calculated using Fisher et al.'s formula for a 95% confidence level. The online questionnaire, administered via Kobo Toolbox, achieved a 99% response rate (378/382). Data collection occurred between November and December 2023, adhering to ethical guidelines and approvals from KCA University, Kenyatta University, and NACOSTI.
Using the Beta values shown in Table 3, the study derived a linear equation (Equation 1) and mapped it to the CMMI five stages: Initial (1), Managed (2), Defined (3), Quantitatively Managed (4), and Optimizing (5), fulfilling Equation (2). DMAM design boundaries were calculated and summarized in Figure 6, providing a scale for indexing SMEs using the principles of the Goal Question Metric, as illustrated in Figure 7. The DMAM logic was then created, as shown in Figure 11, implementing the DMAM prototype framework depicted in Figure 8. This was guided by the architectural pattern shown in Figure 9 for the Proof-of-Concept (POC), resulting in the Minimum Viable Product (MVP) for DMAM implementation. The prototype was later evaluated using cognitive walkthroughs, Nielsen's heuristic evaluation, and IT-Systems goal-based evaluation, demonstrating its capability to index SMEs as illustrated in Figure 14.
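The sample-size calculation above can be reproduced. The sketch below assumes the common form of Fisher et al.'s formula with a finite-population correction; the parameter choices (z = 1.96, p = 0.5, d = 0.05) are assumptions, since the text only states a 95% confidence level, a population of 53,600 and a resulting sample of 382:

```python
import math

# Fisher et al.'s sample-size formula with a finite-population correction.
z, p, d, N = 1.96, 0.5, 0.05, 53_600   # assumed parameters; N from the text

n0 = (z ** 2 * p * (1 - p)) / d ** 2   # ~384.16 for an infinite population
n = n0 / (1 + (n0 - 1) / N)            # finite-population correction
sample = math.ceil(n)
print(sample)
```

Under these assumptions the corrected sample rounds up to 382, matching the study's reported figure.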

DMAM hierarchical regression modeling
The hierarchical regression model, after confirming data assumptions, analyzed the relationships between the independent variables and the DMAM. As shown by the beta values in Table 3, in Model 1 technology (β = 0.428, p < 0.001) emerges as a significant positive predictor of digital maturity, indicating that increased technology adoption enhances digital maturity in resource-dependent SMEs. Strategy (β = 0.184, p < 0.001) and operations (β = 0.365, p < 0.001) also positively affect digital maturity, underscoring the importance of strategic planning and efficient operations. Conversely, the people variable (β = −0.081, p = 0.002) shows a negative impact, suggesting that the current workforce may not be fully aligned with digital transformation efforts. Product (β = −0.039, p = 0.261) does not significantly impact digital maturity, indicating that product-related factors are less critical compared to other variables.
Model 2 introduces a mediator variable, which alters the influence of the other variables. Technology remains a significant positive predictor with an increased beta value (β = 0.472, p < 0.001), reinforcing its crucial role. The negative effect of the product variable becomes more pronounced (β = −0.211, p < 0.001), highlighting the significance of product-related challenges when accounting for the mediator's influence. The people variable's negative impact decreases (β = −0.048, p = 0.037), though it still suggests ongoing issues with human capital in digital transformation. The mediator variable itself (β = 0.343, p < 0.001) has a significant positive effect, emphasizing its role in clarifying the relationships between the independent variables and digital maturity.
These findings underscore the importance of technology, strategy, and operations in driving digital maturity, while also identifying potential concerns such as workforce readiness. The pronounced negative impact of the product variable in Model 2 points to external challenges that may hinder digital progress, highlighting the need for SMEs to proactively address these issues. The mediator's significant role emphasizes the complexity of digital maturity and the need to consider intermediary factors in assessments.
Comparing these results with existing literature, such as studies by Williams et al. (2019) and Yezhebay et al. (2021), reveals common themes around the critical dimensions of people/culture, technology, and processes. However, this study uniquely introduces the SMEs' Dependence Level on environmental factors as a mediating variable. This novel approach acknowledges the role of external dependencies, offering a fresh perspective on SME digital maturity not extensively covered in previous research. While other models, like those by Goumeh and Barforoush (2021), have explored new elements, the focus on external dependencies provides a more comprehensive understanding of the factors influencing digital maturity in resource-dependent SMEs. This contribution highlights the need for further exploration of complex mechanisms affecting digital maturity.
The regression modeling in this study was guided by the following equation taking note of all the beta values of model 2 in Table 3:
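Equation (1) itself did not survive extraction. From the surrounding description (seven weights W1 to W7 applied to weighted indicators, plus a constant and an error term, per the mathematical conceptualization and prototype-logic sections), its general form can be reconstructed as:

```latex
DM_{SME} = \beta_0 + \sum_{i=1}^{7} W_i L_i + \varepsilon
```

where the $W_i$ are the reported coefficients (0.612, −0.308, 0.196, −0.058, 0.134, 0.293, 0.406), the $L_i$ are the weighted indicator scores, $\beta_0$ is the constant and $\varepsilon$ the error term. The assignment of each $W_i$ to a specific dimension is not stated explicitly in this excerpt and would follow Table 3.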

DMAM modeling
5.1 Mathematical conceptualization
To assess SMEs' digital maturity, a predefined set of questions is used as indicators, presented via a web-based interface. Respondents rate their maturity level on a scale of 1 to 5, where 1 indicates initial stages and 5 reflects optimized and continuously improving processes, a hallmark of digital maturity. These scores serve as indicators, aligning with the CMMI five levels as depicted in Figure 5. This assessment, conducted for both the respondent and the associated SME, was termed the Digital Maturity Level of the SME. The operationalization of this concept involved applying the hierarchical regression modeling equation derived from the proposed model in Equation (1). The weighted coefficients obtained from the regression model played a crucial role in determining the Digital Maturity of the SME, as outlined in the following equation.

DMAM = W1L1 + W2L2 + … + WnLn + constant + error term, with the resulting index mapped onto the five CMMI levels: Initial, Managed, Defined, Quantitatively Managed, and Optimizing.
In the context of this study, the weights, denoted as W1, W2, W3, …, Wn, are derived from field data specifically collected for the research's purposes. These weights are extracted from the regression equation outlined earlier (Equation 1), where W1 corresponds to 0.612, W2 to −0.308, W3 to 0.196, W4 to −0.058, W5 to 0.134, W6 to 0.293, and W7 to 0.406. Simultaneously, L1, L2, …, Ln represent the weighted indicators determining the status of a given maturity factor. The assigned weight is contingent on the score entered by the individual SME, reflecting their level of agreement with the respective items. This scoring mechanism operates within the permissible range of 1 to 5 for each maturity concern presented on the web-based model interface.

Design boundaries
The best-case scenario occurs when a user scores the highest possible average (5) on all 72 assessment questions (9 questions per category), yielding the maximum operational score of 56.76574 (100%). The minimum operational rate can therefore be calculated as 10.86574 / 56.76574 × 100 = 19.1414%. Thus, the DMAM score ranges between 19.14% and 100%.
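The boundary arithmetic above can be checked directly (the two operational-score values are taken from the text as given):

```python
# Reproducing the DMAM design-boundary arithmetic stated in the text.
MAX_SCORE = 56.76574   # best case: average of 5 on all 72 questions (100%)
MIN_SCORE = 10.86574   # worst case: average of 1 on all 72 questions

min_rate = MIN_SCORE / MAX_SCORE * 100   # minimum operational rate, in percent
print(f"{min_rate:.4f}%")
```

This confirms the lower design boundary of roughly 19.14%.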
This study posits that SME digital maturity progresses through a series of defined stages. The first stage, Initial, is characterized by unpredictable, poorly controlled, and reactive processes. The subsequent stage, Managed, sees processes characterized at the project level, though still often reactive. The third stage, Defined, involves processes being standardized across the organization, with a shift towards proactive management. In the Quantitatively Managed stage, processes become measured and controlled through quantitative means. Finally, in the Optimizing stage, the focus shifts towards continuous process improvement, ensuring the organization operates at peak efficiency.
The CMMI model provides a structured approach to process improvement, helping organizations increase productivity, reduce costs, and improve quality (Team, 2002). It is widely used in software development and in industries such as aerospace, defense, and healthcare. According to Team (2002), organizations that implemented the CMMI model improved their process performance and achieved their business goals; the model also provided a common language and framework for process improvement, facilitating communication and collaboration within organizations. The metrics indicate the maturity level from the Initial to the Optimizing stage (Spremić, Zentner, & Zentner, 2022). This index evaluates the SME's maturity level against the CMMI five levels.

CMMI maturity levels.
1. Optimizing Level: full maturity with a score of 100%. SMEs at this level focus on continuous improvement and innovation.
2. Quantitatively Managed Level: scores from 80% to 99%, with processes measured and controlled quantitatively.
3. Defined Level: scores from 60% to 79%, with processes standardized across the organization.
4. Managed Level: scores from 39% to 59%, with processes characterized at the project level.
5. Initial Level: scores from 19% to 38%, with unpredictable, poorly controlled, and reactive processes.
These levels illustrate the SME's maturity journey from initial, ad-hoc processes to optimizing, continually improving processes, with specific percentage ranges indicating their current status within the CMMI framework.

Assessment scale and threshold scores
The functionality of the DMAM assessment scale operates with threshold scores, set on a scale of 1 to 5, where a score of 5 signifies the SME's acknowledgment of digital maturity aligned with the CMMI Optimizing level. An assigned score of 5 indicates that the SME's processes are in an optimal state, consistently enhancing in accordance with the requirements of the CMMI. This implies that the SME's average score per assessment question attains a mature 5, reflecting the desired level of maturity. Conversely, average scores of 1, 2, 3, and 4, falling below the threshold score of 5, indicate that the SME's maturity index is retrogressively approaching 0%, signifying a low level of maturity. In such instances, proactive measures are recommended for the SME to leverage its resources and ascend toward the Optimizing level of the CMMI. Best practice recommendations are aligned with these threshold scores as shown in Figure 6.
As outlined in the equations, the Initial level corresponds to a maturity score ranging from 19% to 38%, the Managed level spans 39% to 59%, the Defined level encompasses 60% to 79%, the Quantitatively Managed level spans 80% to 99%, and 100% represents the Optimizing maturity level. The 19% and 100% maturity levels are considered boundary values on a scale of 1 to 5, serving as theoretical benchmarks that are unattainable in practical digital maturity scenarios. The computation of a maturity index within the range of 0% to 18% is formally infeasible within the established model, primarily for two reasons. Firstly, the presence of an error term impedes the accurate derivation of the index. Secondly, the model's computation for these factors is constrained to a scale ranging from 1 to 5, thereby producing results that align with that range. This scenario is formally denoted as an "impossible state". The comprehensive overview of digital maturity levels aims to illustrate the extent of digitization achievable by a specific SME in alignment with its digital maturity process. The subsequent section on model implementation provides computational logic and a guiding recommendation report, emphasizing efficiency and innovation associated with the maturity index, for the SME to consider in enhancing its overall digitization posture.
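The stated boundaries can be expressed as a small lookup function. This is an illustrative sketch, not the authors' implementation; the handling of fractional scores exactly at range boundaries is an assumption:

```python
def cmmi_level(score_pct: float) -> str:
    """Map a DMAM maturity index (percentage) to its CMMI level,
    using the boundary ranges stated in the text."""
    if score_pct < 19:
        return "Impossible state"          # below the model's feasible range
    if score_pct <= 38:
        return "Initial"
    if score_pct <= 59:
        return "Managed"
    if score_pct <= 79:
        return "Defined"
    if score_pct <= 99:
        return "Quantitatively Managed"
    return "Optimizing"
```

For example, the SME index of 72.26% reported later in the paper falls in the Defined band under this mapping.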

Goal Question Metric (GQM)
The Goal Question Metric (GQM) methodology emphasizes SMEs' need to define goals and objectives for assessments, linking them with operational data. Results from the regression equation (Equation 1) are structured into three levels: conceptual (GOAL), operational (QUESTION), and metric levels. The GQM approach ensures alignment with objectives and the model, facilitating a top-down measurement process. See Figure 7 for a visual representation.
The anticipated weightings, which play a pivotal role in determining the digital maturity of SMEs, are depicted in the diagram presented in Figure 7. These weightings are derived from the regression equation (Equation 1) as discussed above. The questions aimed at eliciting digital maturity levels are explicitly outlined in the designated question section of the diagram. Ultimately, the overarching objective is ascertained by carefully considering each itemized question within the digital maturity determinant factor. This factor introduces operationalized data items that contribute to the computation of weights for each independent variable.
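The GQM breakdown can be sketched as a three-level structure. The question wording follows the diagram reproduced later in the paper; the dimension-to-question mapping shown here is an assumption for illustration:

```python
# Illustrative GQM structure (goal -> questions -> metrics), not the authors' artifact.
gqm = {
    "goal": "Determine the digital maturity level of a resource-dependent SME",
    "questions": {  # one operational question per dimension (mapping assumed)
        "Product": "What varieties of products do we produce?",
        "Organization": "How supportive is the organization towards digitization initiatives?",
        "People": "What are the people's perceptions of digitalization processes?",
        "Strategy": "What is the place of digitization in the SME's business strategies?",
        "Operations": "Are the SME's operations efficient and effective?",
        "Environment": "Are regulatory frameworks and financial institutions encouraging SME existence?",
    },
    "metrics": "weighted 1-5 scores per question, combined via the regression weights of Equation (1)",
}
print(len(gqm["questions"]))
```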

DMAM prototype implementation
The prototype utilized several technologies: PHP for server-side scripting, HTML for page structure, and CSS for visual enhancements. JavaScript enabled interactive features like modal popups and Ajax functionality. MySQL managed system data efficiently, while Bootstrap ensured responsive design and consistent visuals. TCPDF was used for generating standardized reports. This combination of technologies was carefully selected to enhance the prototype's functionality, aesthetics, and effectiveness.

Prototype architecture
By considering the various dimensions (Figure 2) within the independent variables, the maturity level of digital maturity for resource-dependent SMEs was calculated based on the assigned weights for each dimension (Figure 7). This calculation provided valuable insights into the level of maturity attained. Figure 8 depicts stage two of the conceptual framework, which provided guidance for the implementation of the artifact.

Figure 7 (Goal, Question, Metrics diagram; Source(s): Authors) poses one question per dimension, for example: What varieties of products do we produce? How supportive is the organization towards digitization initiatives? What are the people's perceptions of digitalization processes? What is the place of digitization in the SME's business strategies? Are the SME's operations efficient and effective? Are regulatory frameworks and financial institutions encouraging SME existence?
The prototype consists of several modules that fulfill specific functions. These include the SME Profile module (Registration and Login), which enables users to register for the system. The SME Login and Authentication module ensures that only authorized users can access the system. The Maturity Valuation module prompts users to express their concerns related to digital maturity. The Database module stores user information and assessment scores. The Information Processing module computes the Digital Maturity (DM) based on the stored weights and the digital positional information provided by the user. Additionally, the prototype features a module for displaying DMAM information and provides a mechanism for downloading the DMAM index report and recommendations. This artifact framework is further translated into the corresponding architectural pattern (Figure 9).
For the development of the artifact in this study, the Model-View-Controller (MVC) and Event-bus architectural patterns were utilized (Figure 9). The MVC pattern divides an application into three components: the Model, the View, and the Controller. In the context of the prototype, these map to its three main components as follows: (1) Model: the "Model" component represents the underlying data and business logic of the prototype. It encapsulates the core functionalities, algorithms, and calculations related to assessing and measuring the digital maturity of SMEs. (2) View: the "View" component is responsible for presenting the visual representation of the digital maturity assessment results to the users of the prototype. It encompasses the user interface elements and other visualizations that help SMEs understand their digital maturity levels and areas for improvement. (3) Controller: the "Controller" component acts as the intermediary between the Model and View components. It receives user inputs, such as data related to the SME's digital practices and processes, and coordinates the communication and interaction between the Model and View. The Controller processes the inputs, triggers the appropriate actions in the Model, and updates the View to reflect changes in the digital maturity assessment.
By employing the MVC and Event-bus architecture patterns, the prototype separates the concerns of data manipulation and presentation, enhancing maintainability, reusability, and scalability.The Model component handles the underlying logic, the View component focuses on visual representation, and the Controller component manages the flow of information between the two, ensuring a structured and organized approach to assessing and improving the digital maturity of SMEs.
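The separation of concerns described above can be sketched in a few lines. This is an illustrative Python sketch (the actual prototype is implemented in PHP; all class and method names here are hypothetical):

```python
class MaturityModel:
    """Model: data and business logic (weights, maturity computation)."""
    def __init__(self, weights, constant):
        self.weights, self.constant = weights, constant
    def maturity(self, scores):
        # weighted linear combination of per-dimension scores
        return self.constant + sum(w * s for w, s in zip(self.weights, scores))

class MaturityView:
    """View: presentation of assessment results to the user."""
    def render(self, value):
        return f"Digital maturity index: {value:.2f}"

class MaturityController:
    """Controller: mediates user input -> Model -> View."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def assess(self, scores):
        return self.view.render(self.model.maturity(scores))

# Usage with hypothetical weights and two dimension scores:
controller = MaturityController(MaturityModel([0.5, 0.3], 1.0), MaturityView())
print(controller.assess([4, 5]))
```

The Controller never formats output and the View never computes, which is the maintainability property the paragraph above claims for the prototype.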

Prototype design
Following the requirement gathering phase, a rapid conceptual design of the DMAM system and its modules was conducted. This involved creating a flow diagram, as illustrated in Figure 10.

Prototype logic
The implementation logic for the maturity regression equation is embodied in a prototype to achieve the overall goal. This model interprets the output value based on various input data sets representing SME digital maturity. Linear regression modeling, a specific type of regression analysis, assumes that the output can be explained through a linear combination of input values (Lin, Wang, & Sheng, 2020). Moreover, it aligns with simulation modeling, manifested in computer programs where logical arithmetic operations occur in a predetermined sequence. This approach offers increased flexibility in model formulation and allows a high level of realism to be attained, particularly beneficial when uncertainties are crucial in decision-making processes. The coding logic implemented in the web-based model is presented in Figure 11.
After an assessment is completed, the user's scores (1–5) are posted to the assessments table, and the DMAM score for the session user is computed using the DMAM computation equation. Automation of this equation is presented in the code snippet in Figure 11. Here, DMAM is a function of the category weights, the SME's scores, a constant, and the error term.
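The computation in Figure 11 can be sketched as follows. The weight values are those listed for Equation (1); their assignment to specific categories, and the zero defaults for the constant and error term, are assumptions for illustration:

```python
# Hedged sketch of the DMAM computation: a weighted linear combination of
# per-category 1-5 scores plus a constant and an error term.
WEIGHTS = [0.612, -0.308, 0.196, -0.058, 0.134, 0.293, 0.406]  # W1..W7 from Equation (1)

def dmam_score(scores, weights=WEIGHTS, constant=0.0, error=0.0):
    """Compute the raw DMAM value for one SME's category scores (each 1-5)."""
    if len(scores) != len(weights):
        raise ValueError("one score per category is required")
    if not all(1 <= s <= 5 for s in scores):
        raise ValueError("scores must lie in the 1-5 range")
    return constant + sum(w * s for w, s in zip(weights, scores)) + error
```

An SME scoring the maximum of 5 in every category would obtain 5 times the sum of the weights; in practice the raw value is then normalized against the design boundaries to yield the percentage index.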

Proof-of-concept
As highlighted by BenMahmoud-Jouini and Midler (2020), prototypes are integral to designers as they facilitate the physical manifestation of ideas for problem-solving purposes. In the realm of software development, the term "proof of concept" is frequently employed to denote various distinct processes, each with unique objectives and participant roles (Sanabria et al., 2022). The implementation of a proof of concept involved the engagement of selected SMEs, who utilized the web-based model to assess their digital maturity status and determine if the system fulfills certain requirements (Neylon, Luximon, Ritter, & Lamb, 2023). This approach facilitated the verification of whether the model aligns with the anticipated solution, validating its conformity to the expected outcomes.
The POC approach for the DMAM adhered to the MVC and Event-bus architecture patterns, as depicted in Figure 9. This approach served as a guide for progressing from the initial week-zero stage to the development of a fully functional and market-ready product (Aune et al., 2023). Similarly, the proof of concept operation yielded a Minimum Viable Product (MVP) once the artifact was successfully implemented (Makupi & Karume, 2021), as shown in Figure 12.
6.4.1 Case scenario assessments. This section examines practical applications of the DMAM through case scenarios, showcasing its effectiveness and adaptability in real-world contexts. By evaluating both worst-case and best-case scenarios, we assess how the model performs at various digital maturity levels within SMEs. This analysis validates the DMAM's capabilities and highlights its potential to guide strategic decisions and drive digital transformation. The DMAM prototype's operational boundaries align with the assessment scale (Figure 6), ranging from a minimum score of 19.14% to a maximum of 100%.
6.4.1.1 Worst-case scenario. As shown in Figure 13, the worst-case scenario occurs when a user consistently scores an average of 1 for every assessment question.
6.4.1.2 Best-case scenario. As shown in Figure 14, the best-case scenario is attained when a user achieves an average score of 5 for every assessment question.
6.4.2 SME maturity index. In Figure 15, the digital maturity score of the SME is 72.26%, placing it on average in the Defined category. Additionally, the distribution of the SME's assessment scores is visualized in a pie chart. Scores below the threshold are also visualized, and the corresponding recommendations can be displayed as a printable report. The background information of the SME provided during registration is displayed on the SME dashboard.

Evaluation and validation of DMAM
The DMAM prototype was rigorously evaluated using three distinct methods: cognitive walkthroughs, Nielsen's heuristic evaluation, and IT-Systems goal-based evaluation (Alomari, Ramasamy, Kiper, & Potvin, 2020; Armstrong, Brewer, & Steinberg, 2019; Osita-Ejikeme, 2021). Cognitive walkthroughs involved inviting users to register and conduct assessments, focusing on learnability and gathering feedback to address issues. Nielsen's heuristic evaluation, performed by an expert, assessed usability based on predefined guidelines, emphasizing system status visibility and error prevention (Alomari et al., 2020; Armstrong et al., 2019). The IT-Systems goal-based evaluation assessed whether the system met predefined objectives, including user registration, login, readiness assessment, digital maturity index computation, and the provision of scores and recommendations (Osita-Ejikeme, 2021; Rüegg et al., 2018). This evaluation ensured alignment with user needs and system functionality.
Following these evaluations, the DMAM code underwent fine-tuning based on the feedback and issues identified during the testing phase. The final prototype was deployed online, allowing SMEs to register, log in, and assess the system's functionalities. User feedback led to additional refinements to enhance the system further. After resolving the concerns raised during testing, the system was fully deployed on the internet. Access to the system is facilitated via the URL: https://matricuda.com/dmam/. This evaluation and validation process confirmed the DMAM's suitability for practical use.

Conclusion
DMAM is the result of rigorous research designed to meet the digital maturity assessment needs of SMEs. By identifying key business challenges and developing a comparative analysis framework aligned with Resource Dependence Theory (RDT), DMAM provides a practical tool for assessing SMEs' digital maturity. Grounded in positivism and supported by DSR and CMMI frameworks, DMAM stands out for its clarity, specificity, and alignment with SMEs' unique needs, offering a tangible solution for digitalization. Unlike other models, DMAM addresses critical gaps, laying a foundation for future research and becoming a vital tool in helping SMEs navigate their digital transformation.

Limitations of the study
One limitation of this study is the small sample size of 382 SMEs, limited to the Central Business District of a single city due to time and budget constraints. This narrow geographic focus may affect the representativeness of the findings, as the sample may not reflect the diversity of SMEs across different sectors or regions. Consequently, the study's generalizability and robustness are restricted, and the results should be interpreted with this limitation in mind.
Figure 1. Digital maturity action fields (comparative matrix of digital maturity models: DMM, MAADM, DMRM, DTM, DTCMMF, SDT, MMD, MMER, MMR, MMP, DRA, MM, MDMD, DBMM, DCM, DAM)

Figure 11. DMAM code logic
Figure 12. POC approach
Figure 14. Best-case scenario graphical user interface

Table 2.
The analysis of SME distribution across categories highlighted the dominant presence of the Technology sector, constituting 26.72% of the sample, followed by Health Services and Catering. Operational duration revealed a majority falling within the years range, while the assessment of workforce size indicated a substantial presence of businesses with 21–30 employees. Regarding yearly sales turnover, a considerable proportion fell within the 21–30 million range. Interestingly, the adoption of DMAM tools favored web applications over mobile applications, emphasizing the pivotal role of digital tools in SME operational strategies.

Table 3. Beta values of the variables