1 Introduction

Health information exchange enables the electronic transfer of clinical data between different health systems while preserving its meaning. To facilitate access to and retrieval of clinical data for safe, timely, effective, and equitable patient-centred care, the World Health Organization (WHO) promotes digitally enabled health systems that place people at the centre of digital health and that adopt and use digital health technologies [33]. This scope therefore requires countries to adopt an integrated, patient-centred approach. Innovative technologies integrated via the IoT for virtual care and remote monitoring, such as smartphones connected to healthcare platforms, enable data acquisition, storage, and exchange. The objective is to share information throughout the health ecosystem, creating continuity of care that improves health outcomes. Of paramount importance for innovation in current digitalization is unfolding the full potential of data exchange between countries, industries, companies, healthcare systems, and other actors. To incentivize data exchange, we must ensure data sovereignty by facilitating the secure exchange of data between trusted parties.

The Fast Healthcare Interoperability Resources (FHIR) standard, created by Health Level Seven International (HL7), is currently the most widely used set of protocols for integrating different healthcare systems. It also defines an Application Programming Interface (API) for interoperability and health information exchange [20]. Substitutable Medical Applications, Reusable Technologies (SMART) integrates with FHIR to enable healthcare applications to connect to Electronic Health Record (EHR) systems with the appropriate safety guarantees and with support for authorization, authentication, and User Interface (UI) integration [26]. FHIR profiles assure SMART applications that the resource coding of medications, procedures, conditions, laboratory results, or allergies meets data-quality compliance requirements [31]. SMART's integration with FHIR also seeks to restrain significant fragmentation by implementing widely applicable data constraints based on the terms introduced by the US Meaningful Use program [11].

The security and reliability of a data-sharing infrastructure require a trusted community; therefore, mutually agreed components, such as the encryption system, are mandatory. Fragmentation is a fundamental cryptographic technique for a data exchange strategy, and its importance has only grown as the COVID-19 situation deteriorated [6]. This paper proposes distributing fragments to Cloud Repositories (CRs), from where the server module further spreads the datasets into separate data repositories. The shares can be retrieved and reconstructed only through the trusted DBMS deployed in the cloud. This approach improves data privacy and confidentiality during the data exchange process without requiring further complex encryption techniques. The primary requirement for healthcare innovation is to ensure the confidentiality of patient data, whether stored in the system or shared with trusted external parties. The demand for secure image sharing in the healthcare sector has prompted us to include a visual scheme for sharing secret binary images.

The objectives that guide the development of the architectural model in Fig. 1 are the following:

  • Personal data must be processed lawfully, leaving jurisdiction over the data with its owner. Restrictions, such as terms of usage, should therefore be set by the individuals whose personal data are processed;

  • Data acquisition should be secure along the entire flow, from the initial phase of development onwards: for instance, from sensors and IoT devices to the backend repository;

  • Interlinked data become valuable through semantic interrogation. Linking data between trusted sites should therefore be simple and effective;

  • Delivering quality digital healthcare services requires the system to be compatible and able to serve multiple operating environments. We aim to implement a system that requires low computational complexity to maintain the advantage of limited calculation time and memory usage;

  • Government legislation obliges healthcare ecosystems to use secure data management and privacy techniques. Cross-border healthcare data exchange must align with the bioethics legislation of the country providing the data;

  • Since the COVID-19 pandemic, it has become a prerequisite for healthcare systems to exchange and share medical records over the Internet. This paper therefore aims to provide a novel method for the safe exchange of confidential datasets and medical images.

Fig. 1

Architecture of the proposed model

First, we review the literature in scientific articles and books to survey relevant work in this domain and position our contribution. Next, the paper analyses the proposed model's architecture, including the VPN data-flow process. In the same section, we analyse the fragmentation process through an example of separating healthcare data into fragments. Using Newton-Gregory divided-difference interpolation, we implement reconstruction and retrieve a specific health record from the primary dataset. Medical image transfer is a prerequisite for medical data exchange, so we dedicate a subsection to an image secret sharing framework based on the same theory of k out of n secret sharing, including an example. The last section comprises the conclusions and a discussion of future work in this domain.

2 Literature review

Integrating smart services by connecting heterogeneous platform devices through the IoT is limited because such devices are prone to hardware/software and network attacks and, if not properly secured, can lead to privacy issues. To address this problem, S. Sridhar and S. Smys proposed an Intelligent Security Framework for IoT Devices [30]. Snezana S. et al. introduced a novel concept of personal health records based on an e-health strategy in which patients own their data, obtained through different channels [29]. The authors of [3] developed a hybrid measurement technique for digital image watermarking using medical images (X-ray, MRA, and CT), an extremely robust method for protecting clinical information. An innovative watermarking scheme based on the biorthogonal wavelet family (biorthogonal 2.2, biorthogonal 3.5, and biorthogonal 5.5) was proposed in [4]; it simultaneously used convolution with further wavelet transforms to exchange images in the IoT frame. In [2], the authors proposed a watermarking scheme in the structure of Daubechies wavelets, using the Daubechies-5 and Daubechies-7 transforms. This wavelet approach is highly robust against various attacks, preventing piracy and preserving the authentication of digital data.

Aggarwal et al. initially proposed the fragmentation cryptographic model, dividing the dataset between two data repositories. Although the idea was innovative, the required collaboration between the two servers and the restriction to two repositories led to a lack of security and required further encryption to ensure data privacy [21]. Subsequently, Ciriani et al. proposed a model without limitations on the number of dataset partitions, based on an improved security frame with encryption derived from fragments [15, 18]. In 2009, Ciriani et al. proposed a cryptographic fragmentation model in which the data owner manages a trusted DBMS [16, 17]. Following Shamir's [28] proposal for a secret sharing scheme, Agrawal et al. and Emekci et al. extended the model by dividing and storing data in CRs, from which the data could be reconstructed only with knowledge of any k of the n shares and the secret value [1]. The model can also answer queries without deciphering the essential attributes of the subsets [19]. Sareen et al. contributed a new model to protect the confidentiality of outsourced data [27].

Naor and Shamir introduced the idea of image secret sharing by distributing an image into several different images; reconstruction is done only by aligning the shares [25]. Based on Naor and Shamir's secret sharing scheme, which addressed black-and-white images, Verheul and van Tilborg extended the framework to coloured images [32]. The same approach of secret image sharing without the use of cryptography was followed by Chin-Chen Chang et al. as well as Ching-Nung Yang et al. [13, 14, 36]. Bisio, Fedeli, Lavagetto et al. conducted a numerical study dedicated to evaluating a microwave imaging method for stroke detection [9]. Bisio, Garibotto, Grattarola, Lavagetto, and Sciarrone introduced the IoT as the key to Industry 4.0 production optimization [10]. Bisio, Lavagetto, Marchese, and Sciarrone presented a performance assessment of AR approaches based on the accelerometer signal recorded by patients' smartphones [8].

In the healthcare industry, only a limited number of caregivers actively promote innovative technologies. Building on the research projects above, this paper aims to integrate confidentiality into the exchange of healthcare data provided either as text or as images. The novel idea generates a state-of-the-art model based on a fundamental mathematical approach that could be key to ensuring the digitization of an ecosystem framework for virtual medical therapy and remote treatment. The goal is health data exchange architectures and application interfaces that allow data to be accessed and shared securely and appropriately across the spectrum of care, in all applicable settings, and with relevant stakeholders.

3 Proposed model

For prosperous and interoperable data sharing, we propose the development of data spaces in which everyone is accepted. Still, entry must be secure, the management system must identify who uses the system, and all trusted sites must align with the regulations. Figure 1 is a visual presentation of the proposed approach. The raw data generated by sensors applied to users, or the data provided by doctors, hospitals, laboratories, etc., are distributed in fragments based on the owners' requirements and the restriction regulations concerning the level of confidentiality. Depending on the data holder's requirements, the distribution unit increases or decreases the privacy level, distributing sections to multiple servers to maintain security. A Service Level Agreement (SLA) therefore establishes the appropriate service and confidentiality levels with the cloud storage service providers. The dataset fragments are then distributed to separate cloud data repositories. The original dataset can be reconstructed only through the DBMS, which provides data strictly to certified users.

3.1 Interpretation

Figure 1 shows the end user's gateway server connected to a Virtual Private Gateway (VPG) in a Virtual Private Cloud (VPC) to establish a Virtual Private Network (VPN) connection [24]. This scheme provides a connection via a private IP address and allows multiple VPCs in different areas of a public cloud to be connected for communication without traversing the Internet [23]. We propose confidentiality constraints that will govern the distribution of attributes, as agreed in the SLA according to the data owner's requirements. If A is the set of a user's attributes and c is a confidentiality constraint, then c is a subset of A, c ⊆ A, and no constraint may be a subset of another constraint [22]. A constraint is defined as a restriction on combining sensitive attributes within the same fragment [5]. Deploying the Singleton pattern ensures that a dataset has only one instance of a critical attribute and provides a global access point [7]. The distribution module manages the sharing of datasets into separate CRs according to the user requirements on which attributes, as agreed in the SLA, may appear together. Each CR server module then further distributes the encrypted datasets into cloud repositories. The reconstruction phase follows the VPN path through the trusted DBMS, which computes each share, recovers the data, and presents it to the end users.
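As a minimal illustration of the constraint rules above, the subset condition c ⊆ A and the non-redundancy of constraints can be checked, and attributes can then be fragmented greedily so that no fragment gathers all attributes of any constraint. The helper names below are ours (hypothetical), not part of the proposed system:

```python
# Minimal sketch of the confidentiality-constraint rules described above.
# The helper names are ours (hypothetical), not part of the proposed system.

def validate_constraints(attributes, constraints):
    """Check c <= A for every constraint, and that no constraint
    is a subset of another constraint."""
    if not all(c <= attributes for c in constraints):
        return False
    return not any(c1 is not c2 and c1 <= c2
                   for c1 in constraints for c2 in constraints)

def fragment(attributes, constraints):
    """Greedily place each attribute in the first fragment that would not
    gather all attributes of any constraint; otherwise open a new fragment."""
    fragments = []
    for a in sorted(attributes):
        target = next((f for f in fragments
                       if not any(c <= (f | {a}) for c in constraints)), None)
        if target is None:
            fragments.append({a})
        else:
            target.add(a)
    return fragments
```

Any fragmentation produced this way keeps the sensitive attribute combinations split across repositories, matching the role of the distribution module described above.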

3.2 Mathematical approach

Let A be the set of attributes a1, a2, …, an which the provider requires to distribute among the CRs, and C the set of confidentiality constraints c1, c2, …, cn, where ci ⊂ A. Constraints are separated into singleton constraints, where a unique sensitive attribute stands alone in a set, and subset constraints, whose attributes cannot be merged with others. Attribute fragmentation is applied in the distribution unit by an algorithm based on a decision-tree approach, which calculates the minimal fragmentation that satisfies all confidentiality constraints [18]. Singleton constraints are distributed from the same unit by a (k, n) threshold scheme, such that knowledge of any k ≤ n sensitive attribute values together with knowledge of the secrets xi, i ∈ {1, 2, …, n} stored in the DBMS can retrieve the information, but no group of k − 1 or fewer can do so, even with knowledge of the xi. By the k − 1 coefficients we mean the set of constants {a0, a1, …, ak − 1}, which derives from a0, the National Identity Number (NIDN), and the respective divided differences \( {a}_0={\Delta }_{P\left({x}_{k-1}\right)},{a}_1={\Delta }_{\Delta P\left({x}_{k-1}\right)}^2,\dots, {a}_{k-1}={\Delta }_{\Delta ^{n-k-2}P\left({x}_{k-1}\right)}^{n-k-1} \). Therefore, to distribute the information into k fragments, we choose (k − 1) random coefficients and let the constant a0 be the sensitive NIDN value, thus creating a (k − 1)-degree polynomial as follows:

$$ P(x)={a}_0+{a}_1x+{a}_2{x}^2+\dots +{a}_{k-1}{x}^{k-1} $$

According to the above, the DBMS stores the secret information x = (x1, x2, …, xn), whereby knowledge of the polynomial coefficients and substitution of xi (the index i corresponds to the share number) allow the encrypted National Identity Number (NIDN) values to be computed from P(xi), i = 1, 2, …, k, simply by substituting k of the n values from the vector x. Through Newton-Gregory divided-difference interpolation and knowledge of the k ordered pairs (xi, P(xi)), i = 1, 2, …, k, we can determine the (k − 1) coefficients of the polynomial as well as the original NIDN value corresponding to the constant a0, as follows:

$$ P(x)=P\left({x}_{k-1}\right)+{\Delta }_{P\left({x}_{k-1}\right)}\left(x-{x}_{k-1}\right)+{\Delta }_{\Delta P\left({x}_{k-1}\right)}^2\left(x-{x}_{k-1}\right)\left(x-{x}_k\right)+\dots +{\Delta }_{\Delta ^{n-k-2}P\left({x}_{k-1}\right)}^{n-k-1}\prod \limits_{i=k-1}^n\left(x-{x}_i\right) $$

where \( {\Delta }_{P\left({x}_{k-1}\right)},{\Delta }_{\Delta P\left({x}_{k-1}\right)}^2,\dots, {\Delta }_{\Delta ^{n-k-2}P\left({x}_{k-1}\right)}^{n-k-1} \) are the 1st, 2nd, and (n − k − 1)th divided differences, respectively.

3.3 Paradigm

Let A be the set of patients’ attributes that includes the following information to be distributed among CR:

A = {National Identity Number (NIDN), Name, Date of Birth (DoB), Mobile Number (MN), Postal Code (PC), Chronic Disease (CD)}


By fragmentation, we mean the distribution of attributes so that their associated values are separated and linked only through the encryption key. An example of fragmenting the attributes involved in the constraints so that they are not visible together is f1 = {Name}, f2 = {DoB, MN}, and f3 = {PC, CD}. The fragments are stored in three separate CRs: CR1, CR2, and CR3, respectively. We develop a second-degree polynomial to share the data among the CRs as follows:

$$ P(x)={a}_2{x}^2+{a}_1x+{a}_0 $$

where a0 represents the NIDN and the coefficients a1 = (1, 2, 5, 6, 4) and a2 = (7, 3, 2, 1, 9) are randomly selected. The secret values xi, i = 1, 2, 3 are also randomly selected, one per CR; let x1 = 1, x2 = 2, x3 = 4. Tables 1, 2, 3, 4 and 5 present the computational results of substituting the coefficients and the secret values into each polynomial.
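The substitution step can be sketched as follows. The coefficient vectors a1, a2 and the secrets x1 = 1, x2 = 2, x3 = 4 follow the text; since Table 1 is not reproduced here, the NIDN values below are hypothetical placeholders:

```python
# Sketch of the substitution step. The coefficient vectors a1, a2 and the
# secrets x1=1, x2=2, x3=4 follow the text; the NIDN values are hypothetical
# placeholders, since Table 1 is not reproduced here.

def make_shares(nidn, a1, a2, xs):
    """Evaluate P(x) = a2*x^2 + a1*x + a0 (with a0 = NIDN) at each secret x."""
    return [a2 * x * x + a1 * x + nidn for x in xs]

xs = (1, 2, 4)
a1_list = (1, 2, 5, 6, 4)
a2_list = (7, 3, 2, 1, 9)
nidns = [880618, 734291, 605112, 918273, 450987]  # hypothetical records
shares = [make_shares(n, a1, a2, xs)
          for n, a1, a2 in zip(nidns, a1_list, a2_list)]
```

Each record thus yields three share values P(x1), P(x2), P(x3), one for each CR.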

Table 1 Registered data
Table 2 Substitution results
Table 3 Data of CR1
Table 4 Data of CR2
Table 5 Data of CR3

The fragments are distributed as shown in the following tables, each presenting an encrypted (no longer meaningful) NIDN value for each record from Table 2:

Reconstruction can be performed only with knowledge of the three ordered pairs {(xi, P(xi)), i = 1, 2, 3} corresponding to the three CRs, which are kept stored in the DBMS. Decryption is implemented using Newton-Gregory divided-difference interpolation, as shown in Table 6, which reconstructs the polynomial and reveals the original NIDN value as its constant term a0 [27].

Table 6 Newton-Gregory’s divided difference interpolation

The calculations resulting from Table 6 are shown below:

$$ {\displaystyle \begin{array}{l}P(x)=P\left({x}_2\right)+{\Delta }_{P\left({x}_2\right)}\left(x-{x}_2\right)+{\Delta }_{\Delta P\left({x}_2\right)}^2\left(x-{x}_2\right)\left(x-{x}_3\right)\\ {}P(x)=880636+13\left(x-2\right)+1\left(x-2\right)\left(x-4\right)\\ {}\begin{array}{l}P(x)=880636+13x-26+{x}^2-4x-2x+8\\ {}P(x)={x}^2+7x+880618\end{array}\end{array}} $$

After computing the constant a0, the initial NIDN is revealed. As shown in Table 7, we can then retrieve any information from the patient's record.
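The reconstruction just performed can be checked with a short divided-difference routine. The share values below are the ones implied by the worked example (P(2) = 880636 and the recovered polynomial x² + 7x + 880618):

```python
# Check of the reconstruction step with Newton's divided differences, using
# the values implied by the worked example: shares P(1)=880626, P(2)=880636,
# P(4)=880662 recover the polynomial x^2 + 7x + 880618, so a0 = 880618.

def divided_differences(xs, ys):
    """Newton coefficients f[x0], f[x0,x1], ..., computed in place."""
    coeffs = list(ys)
    for level in range(1, len(xs)):
        for i in range(len(xs) - 1, level - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - level])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate the Newton-form polynomial at x (Horner-like scheme)."""
    result = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        result = result * (x - xs[i]) + coeffs[i]
    return result

xs = [1, 2, 4]
shares = [880626, 880636, 880662]
coeffs = divided_differences(xs, shares)
nidn = newton_eval(xs, coeffs, 0)  # constant term a0 = P(0)
```

Evaluating the Newton form at x = 0 directly yields the constant term a0, i.e. the original NIDN.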

Table 7 Reconstructed table

4 Image exchange

A visual secret sharing scheme is a method of sharing secret images among a group of stakeholders. Each participant receives a piece of the secret image, called a share. An allowed coalition of participants can reveal the original image by accumulating their shares, but no smaller subset of the coalition can retrieve the secret image by amassing its shares. For instance, if we call each share a transparency, the secret is visible in a (k, n) visual cryptography scheme if ≥ k transparencies are stacked together.

Nevertheless, no one can see the original image if < k transparencies are stacked together. Transmitting and sharing information in a healthcare system requires medical image sharing (e.g., MRI images). Thus, we propose a secure fragmentation scheme for image exchange following the same concept of k out of n secret sharing. We incorporate a visual secret sharing scheme to encode an image requiring protection into "shadow" embodiments called shares. The secret can be visually reconstructed only when k or more shares are available. Each pixel of the secret image is "expanded" into m sub-pixels in each share, and in the reconstruction process the stacking of the sub-pixels is a Boolean 'OR' operation.

4.1 Implementation

The method requires distributing the image's pixels into n modified versions and sharing them among n cloud repositories through the VPN. Each fragment is a collection of m subpixels. It is essential in logic disciplines, including cryptography, to use a fundamental measure named after Richard Hamming, the Hamming weight: the count of '1's within a binary number. For instance, the Hamming weight of 101001 is 3, and that of 1110011 is 5. Hence the architecture is represented by an n × m Boolean matrix A = [aij], where aij = 1 if the jth subpixel in the ith share is black; otherwise it is 0, representing white. The grey scale of the revealed image is determined using the Hamming weight, defined as the number of '1's resulting from the 'OR' operation on the rows of matrix A [34]. More specifically, B = OR(i1, i2, …, ir), where i1, i2, …, ir are rows of matrix A and H(B) is the Hamming weight. C0 and C1 are defined as the collections of n × m Boolean matrices that realize the k out of n secret sharing. They correspond to white and black pixels, respectively, and specify the colours of the m subpixels among the n shares in the n repositories. The requirements for a construction to be considered valid are the following:

  1. For any A ∈ C0, the 'OR' B0 of any k of the n rows satisfies H(B0) ≤ l, l ∈ ℤ+;

  2. For any A ∈ C1, the 'OR' B1 of any k of the n rows satisfies H(B1) ≥ h, where l < h ≤ m;

  3. For any {i1, i2, …, iq} ⊆ {i1, i2, …, in}, q < k, the q × m matrices Dt, t ∈ {0, 1}, obtained by restricting each Ct to rows i1, i2, …, iq, cannot be distinguished.
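The Hamming-weight machinery used in these conditions can be sketched minimally as follows (the function names are ours):

```python
# Minimal sketch of the Hamming-weight machinery used in the conditions
# above (function names are ours): H(.) counts the '1's in a binary vector,
# and B is the Boolean 'OR' (stacking) of selected rows of a share matrix.

def hamming_weight(bits):
    """Number of '1's in a sequence of 0/1 values."""
    return sum(bits)

def or_rows(matrix, rows):
    """Boolean OR of the selected rows of a 0/1 matrix."""
    b = [0] * len(matrix[0])
    for r in rows:
        b = [x | y for x, y in zip(b, matrix[r])]
    return b
```

For example, the weights of 101001 and 1110011 quoted in the text are 3 and 5, respectively.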

The contrast, defined as the normalized Hamming-weight difference between the black and white pixels of a share, is calculated as follows:

$$ a=\frac{H\left({B}^1\right)-H\left({B}^0\right)}{m}=\frac{h-l}{m} $$

Let P0 and P1 be the probabilities of a white pixel appearing in a white and a black area, respectively, and let Pth ∈ [0, 1] be a threshold probability. If P0 ≥ Pth and P1 ≤ Pth − a, where a ≥ 0 is the contrast defined above, then the frequency of white pixels in a white area of the recovered image will be higher than in a black area. E0 and E1 are the white and black sets of n × 1 column vectors, with λ and γ the corresponding sets of their Hamming weights. The reconstruction probability is valid if the following conditions are met [35]:

  1. The 'OR' operation of any n × 1 matrix yields the Hamming weight H(B);

  2. If P0 and P1 are the probabilities of white (white = 0) appearing in the sets λ and γ respectively, then P0 ≥ Pth and P1 ≤ Pth − a are satisfied;

  3. For any {i1, i2, …, iq} ⊆ {i1, i2, …, in}, q < k, P0 = P1.

The probabilities P0 and P1 are calculated as follows:

$$ {P}_0=\frac{m-l}{m},{P}_1=\frac{m-h}{m} $$

Let Gi = A0 ∘ … ∘ A0 ∘ A1 ∘ … ∘ A1 (with g − 1 − i copies of A0 and i copies of A1), i = 0, …, g − 1, where collections of Gi matrices with g ≥ 2 develop a secret sharing scheme for g grey levels with pixel expansion mg. Reconstruction applies with a(1, 0), …, a(g − 1, g − 2) representing the contrasts and {di}, i = 0, …, g − 2 the threshold sets for the n × mg matrices Gi, if the following two conditions are met:

  1. The Hamming weight of the 'OR' of any k of the n rows of Gi satisfies H(Bi) ≤ di − a(i + 1, i), while for Gi + 1 the Hamming weight of the 'OR' of any k of the n rows satisfies H(Bi + 1) ≥ di;

  2. For any {r1, r2, …, rj} ⊆ {1, …, k}, 1 ≤ j < k, the matrices obtained by restricting \( {G}_{j\times {m}_g}^i \) to rows r1, r2, …, rj are equal up to a column permutation.

The following equation calculates contrast:

$$ {a}^{\left(i+1,i\right)}=\frac{H\left({B}^{i+1}\right)-H\left({B}^i\right)}{m_g}=\frac{a_{i+1}-{a}_i}{m_g}=\frac{h-l}{\left(g-1\right)\times m}=\frac{a}{g-1},i=0,\dots, g-2 $$

If ai is the number of '1's in Gi and bi the number of '0's, then ai = H(Bi) = l × (g − i − 1) + h × i and bi = mg − ai = (m − l) × (g − i − 1) + (m − h) × i ⟹ ai + bi = m × (g − 1) = mg [12].

After that, we select s = 1, …, mg random columns from the Gi matrices and obtain the \( \left(\begin{array}{c}{m}_g\\ {}s\end{array}\right) \) n × s matrices \( {T}_s^{(i)}=\left\{{\left.{G}^i\right|}_{s,p}\right\},p=1,\dots, \left(\begin{array}{c}{m}_g\\ {}s\end{array}\right) \). The average Hamming weight of the ith grey-level reconstructed pixel is \( \overline{H_s^i}=\sum \limits_{j=0}^sj.{p}_{s,j}^{(i)} \), where \( {p}_{s,j}^{(i)} \) is defined as the probability that the Hamming weight of the 'OR' operation of any k rows equals j, j = 0, …, s. The average grey level and average contrast, respectively, are calculated as follows:

$$ {\displaystyle \begin{array}{c}\overline{e_s^{(i)}}=\frac{H_s^i}{S}\\ {}\overline{a_s^{\left(i+1,i\right)}}=\overline{e_s^{\left(i+1\right)}}-\overline{e_s^{(i)}},i=0,\dots, g-2\end{array}} $$

Thus, the grey level i = 0, 1, …, g − 1 is constructed by the \( \left(\begin{array}{c}{m}_g\\ {}s\end{array}\right),n\times s, \) Gi|s, p matrices respectively, where \( p=1,\dots, \left(\begin{array}{c}{m}_g\\ {}s\end{array}\right) \) and the set \( {T}_s^{(i)}=\left\{{\left.{G}^i\right|}_{s,p}\right\} \) can be used to construct a grey-scale probabilistic visual secret sharing scheme.

4.2 Paradigm

Let \( {A}^0=\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\right] \) and \( {A}^1=\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right] \) be the two fundamental Boolean matrices for a white and a black pixel, respectively. The 'OR' operation (the operator 'OR' gives 1 as a result if at least one of the two elements is 1) of any two rows of A0 yields a vector with a single '1', thus l = 1, and for the same reason any two rows of A1 yield h = 2. Therefore, the contrast, also known as the relative difference between the black and white pixel reconstructions, is \( a=\frac{h-l}{m}=\frac{1}{3} \).

From the definition of E0 and E1 we have \( {E}_0=\left\{\left[\begin{array}{c}0\\ {}0\\ {}0\end{array}\right],\left[\begin{array}{c}0\\ {}0\\ {}0\end{array}\right],\left[\begin{array}{c}1\\ {}1\\ {}1\end{array}\right]\right\} \) and \( {E}_1=\left\{\left[\begin{array}{c}1\\ {}0\\ {}0\end{array}\right],\left[\begin{array}{c}0\\ {}1\\ {}0\end{array}\right],\left[\begin{array}{c}0\\ {}0\\ {}1\end{array}\right]\right\} \) and, as mentioned, λ and γ are calculated by the Hamming weights of the column vectors, so \( \lambda =\left\{H\left(\left[\begin{array}{c}0\\ {}0\\ {}0\end{array}\right]\right),H\left(\left[\begin{array}{c}0\\ {}0\\ {}0\end{array}\right]\right),H\left(\left[\begin{array}{c}1\\ {}1\\ {}1\end{array}\right]\right)\right\}=\left\{0,\kern0.5em 0,\kern0.5em 1\right\} \) and \( \gamma =\left\{H\left(\left[\begin{array}{c}1\\ {}0\\ {}0\end{array}\right]\right),H\left(\left[\begin{array}{c}0\\ {}1\\ {}0\end{array}\right]\right),H\left(\left[\begin{array}{c}0\\ {}0\\ {}1\end{array}\right]\right)\right\}=\left\{1,\kern0.5em 1,\kern0.5em 1\right\} \). Thus, the appearance probabilities of white are \( {P}_0=\frac{m-l}{m}=\frac{3-1}{3}=\frac{2}{3} \), \( {P}_1=\frac{m-h}{m}=\frac{3-2}{3}=\frac{1}{3} \), and the threshold probability is \( {P}_{th}=\frac{2}{3} \); since the contrast is \( \alpha =\frac{1}{3} \), the second condition is met. The last requirement, the equality of the two probabilities, is fulfilled for all the shadows, as λ = {H([0]), H([0]), H([1])} = {0,  0,  1} and γ = {H([1]), H([0]), H([0])} = {1,  0,  0}, γ = {H([0]), H([1]), H([0])} = {0,  1,  0}, γ = {H([0]), H([0]), H([1])} = {0,  0,  1} for shadows 1, 2, and 3, respectively, so \( {P}_0={P}_1=\frac{2}{3} \).
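The binary 2-out-of-3 example can be verified numerically; this sketch (helper names ours) stacks any two rows of A0 and A1 and recomputes l, h, and the appearance probabilities:

```python
# Numerical check of the binary 2-out-of-3 example: stacking any two rows
# of A0 (white) and A1 (black) reproduces l = 1, h = 2, P0 = 2/3, P1 = 1/3,
# and contrast 1/3.

A0 = [[0, 0, 1], [0, 0, 1], [0, 0, 1]]  # white-pixel basis matrix
A1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # black-pixel basis matrix
m = 3

def stack(rows):
    """Boolean OR of a list of 0/1 rows (transparency stacking)."""
    out = [0] * len(rows[0])
    for r in rows:
        out = [x | y for x, y in zip(out, r)]
    return out

l = sum(stack([A0[0], A0[1]]))      # Hamming weight for white
h = sum(stack([A1[0], A1[1]]))      # Hamming weight for black
P0, P1 = (m - l) / m, (m - h) / m   # appearance probabilities of white
contrast = (h - l) / m
```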

$$ {\displaystyle \begin{array}{c}{G}^0={A}^0\circ {A}^0=\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\right]\circ \left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\right]=\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern0.75em \begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\right]\\ {}{G}^1={A}^0\circ {A}^1=\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\right]\circ \left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right]=\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern0.75em \begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right]\\ {}{G}^2={A}^1\circ {A}^1=\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right]\circ \left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right]=\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\kern0.75em \begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\right]\end{array}} $$

The pixel expansion is mg = (g − 1) × m = 6 and the contrasts computed as \( {a}^{\left(1,0\right)}=\frac{H\left({B}^1\right)-H\left({B}^0\right)}{m_g}=\frac{3-2}{6}=\frac{1}{6} \), \( {a}^{\left(2,1\right)}=\frac{H\left({B}^2\right)-H\left({B}^1\right)}{m_g}=\frac{4-3}{6}=\frac{1}{6}\Longrightarrow {a}^{\left(1,0\right)}={a}^{\left(2,1\right)} \)
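A short sketch (helper names ours) reproduces these grey-level numbers for g = 3 by concatenating the basis matrices into G0, G1, G2 and stacking two shares:

```python
# Sketch reproducing the grey-level numbers above for g = 3 (helper names
# ours): concatenating the basis matrices yields G0, G1, G2, and stacking
# two shares gives Hamming weights 2, 3, 4 and contrasts 1/6.

A0 = [[0, 0, 1], [0, 0, 1], [0, 0, 1]]  # white basis matrix
A1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # black basis matrix

def concat(M, N):
    """Column-wise concatenation M . N of two share matrices."""
    return [mr + nr for mr, nr in zip(M, N)]

def or_weight(M, rows):
    """Hamming weight of the Boolean OR of the selected rows of M."""
    out = [0] * len(M[0])
    for r in rows:
        out = [x | y for x, y in zip(out, M[r])]
    return sum(out)

G = [concat(A0, A0), concat(A0, A1), concat(A1, A1)]  # G0, G1, G2
weights = [or_weight(Gi, [0, 1]) for Gi in G]         # stack shares 1 and 2
mg = 6                                                # (g - 1) * m
contrasts = [(weights[i + 1] - weights[i]) / mg for i in range(2)]
```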

$$ {\displaystyle \begin{array}{c}{T}_5^{(0)}=\left\{\left[\begin{array}{ccc}0& 1& 0\\ {}0& 1& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 1\\ {}0& 1\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 1& 0\\ {}0& 1& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 1\\ {}0& 1\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 0\\ {}0& 0& 0\\ {}0& 0& 0\end{array}\kern1em \begin{array}{cc}0& 1\\ {}0& 1\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}0& 1\\ {}0& 1\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}0& 1\\ {}0& 1\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}0& 0\\ {}0& 0\\ {}0& 0\end{array}\right]\right\}\\ {}{T}_5^{(1)}=\left\{\left[\begin{array}{ccc}0& 1& 1\\ {}0& 1& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 1& 1\\ {}0& 1& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 0\\ {}0& 0& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}1& 0\\ {}0& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}0& 0& 1\\ {}0& 0& 1\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}1& 0\\ {}0& 1\\ {}0& 0\end{array}\right]\right\}\\ {}{T}_5^{(2)}=\left\{\left[\begin{array}{ccc}0& 0& 1\\ {}1& 0& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}1& 0& 1\\ {}0& 0& 0\\ {}0& 1& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 
1\end{array}\right],\left[\begin{array}{ccc}1& 0& 1\\ {}0& 1& 0\\ {}0& 0& 0\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}0& 0\\ {}1& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}1& 0\\ {}0& 0\\ {}0& 1\end{array}\right],\left[\begin{array}{ccc}1& 0& 0\\ {}0& 1& 0\\ {}0& 0& 1\end{array}\kern1em \begin{array}{cc}1& 0\\ {}0& 1\\ {}0& 0\end{array}\right]\right\}\end{array}} $$

The Hamming weight of the 'OR' operation of the q1th and q2th rows of any matrix in the set \( {T}_5^{(0)} \) takes values j ∈ {0, 1, 2, 3, 4, 5}. The data included in the first row of Table 8 are calculated as follows:

$$ {\displaystyle \begin{array}{c}{p}_{5,0}^{(0)}={p}_{5,3}^{(0)}={p}_{5,4}^{(0)}={p}_{5,5}^{(0)}=0/6=0,{p}_{5,1}^{(0)}=2/6=1/3,{p}_{5,2}^{(0)}=4/6=2/3\\ {}\overline{e_s^{(i)}}=\frac{H_s^i}{S}=\frac{1}{s}\sum \limits_{j=0}^sj.{p}_{s,j}^{(i)}\Longrightarrow \overline{e_5^{(0)}}=\frac{1}{5}\sum \limits_{j=0}^5j.{p}_{5,j}^{(0)}=\frac{1}{5}\left(1.\frac{1}{3}+2.\frac{2}{3}\right)=\frac{1}{3}\end{array}} $$
Table 8 Values of average grey-levels of probabilistic scheme

Table 9 contains the data of \( {p}_{s,j}^{(i)} \) and \( \overline{e_s^{(i)}} \) for s = 1, 2, 3, 4, 5, 6 and i = 1, 2, 3, from which it follows that \( {\overline{a}}^{\left(1,0\right)}={\overline{a}}^{\left(2,1\right)}=1/6 \).

Table 9 Data for ps,j and es

5 Conclusions & future work

Sharing data over the cloud requires confidentiality, privacy, control, and compliance with laws and regulations. Our approach therefore suggests a framework for the secure encryption of data stored in cloud repositories. The encryption is based on an innovative dataset fragmentation technique that uses CRs accessed through a VPN to distribute sensitive data securely. We introduced a mathematical approach based on Newton-Gregory interpolation to retrieve and reconstruct the original data, and a detailed example using random data explains the fragmentation and distribution of data, followed by the reconstruction used to retrieve a specific record. Fragmentation further improves current data encryption approaches by reducing the heavy computational burden on the server. As shown, it can be applied effectively to images, avoiding cryptographic calculations, and used to transfer confidential images through the cloud. We have therefore proposed a binary secret sharing solution for grayscale images to be used in healthcare within a completely secure data exchange framework.

Integrating edge computing with cloud computing requires efficient and secure data exchange during data flow, as well as secure cloud repositories. Big data growth also increases the obligation to protect information against data breaches and leakage. In future work, we therefore aim to continue contributing to data privacy mechanisms for big data, protecting the confidentiality of healthcare data exchange and storage. The data-rich environments resulting from cloud computing's radical innovation, in collaboration with Machine Learning (ML) and Artificial Intelligence (AI), require advanced encryption and security techniques with a low computational load. A cipher approach that satisfies the fundamental security properties of images could be used in conjunction with image fragments for further data exchange safety.