Understanding Development of Malnutrition in Hemodialysis Patients: A Narrative Review

Hemodialysis (HD) is the predominant global treatment option for patients with chronic kidney disease stage 5, and, despite advances in dialysis technology, these patients face a high risk of morbidity and mortality from malnutrition. We aimed to provide a novel view that malnutrition susceptibility in the global HD community is of iatrogenic origin, non-iatrogenic origin, or both. Categorizing malnutrition by its origin clarifies the contribution of each factor. Low dialysis adequacy, which results in uremia and metabolic acidosis, and dialysis membranes and techniques that incur greater amino-acid losses are identified as modifiable iatrogenic factors of malnutrition. Dietary inadequacy, i.e., suboptimal energy and protein intakes due to poor appetite, low diet quality, a high diet monotony index, and/or psychosocial and financial barriers, comprises the modifiable non-iatrogenic factors implicated in malnutrition in these patients. These factors should be included in a comprehensive nutritional assessment of malnutrition risk. Identifying the point of origin of malnutrition in dialysis patients is crucial for healthcare practitioners to enable personalized patient care, as well as to determine country-specific malnutrition treatment strategies.


Introduction
The last three decades have witnessed considerable growth in the global burden of chronic kidney disease (CKD); kidney replacement therapy (KRT) accounts for 77.5% of treatment among end-stage kidney disease (ESKD) patients, with dialysis alone providing 43.1% [1]. Hemodialysis (HD) forms 89% of the global treatment for ESKD patients [2]. The technological delivery of HD treatment to patients today is considered optimal as per medical practice guidelines with regard to biocompatibility of dialyzer membranes, dialysis dose, frequency of dialyzer reuse, and duration of dialysis [3].

Development of Malnutrition at the Time of HD Initiation and Indicators of Poor Nutritional Status
The decision to start dialysis for an ESKD patient varies across countries and is influenced by local nephrology practice, healthcare policies, and the cost of dialysis treatment [18]. The Dialysis Outcomes and Practice Patterns Study (DOPPS) Phase 2, with 12 participating countries, indicated a greater mortality rate in patients new to dialysis compared to prevalent dialysis patients [19]. The risk of early mortality is increased by up to 80% within the first two months of HD initiation [20]. Apart from catheter vascular access [20] and pre-dialysis care [21], nutritional status is considered a potentially modifiable risk factor for early mortality [22]. Clearly, pre-existing malnutrition originates during progression through CKD stages 3 to 5, with patient vulnerability beginning with the metabolic derangements that accompany a falling glomerular filtration rate, late nephrology referral, and insufficient pre-dialysis dietetic care during this period [22,23].
Earlier opinion on dialysis initiation did consider poor nutritional status as a factor for initiating dialysis. However, this was not based on markers of malnutrition but rather on its signs and symptoms, such as anorexia, nausea, and fatigue [24]. Van de Luijtgaarden et al. (2012) reported that 53% of nephrologists agreed to initiate dialysis in patients with poor nutritional status [25]. However, a review on dialysis initiation observed a lack of data on the benefits of early dialysis initiation in patients with low serum albumin levels or in improving nutritional status [26]. The Canadian Society of Nephrology Guidelines (2014) [27] ceased to recommend dialysis initiation on the basis of a decline in nutritional status as indicated by serum albumin, lean body mass, or subjective global assessment (SGA), whereas the Caring for Australians with Renal Impairment Guidelines [28] recommend dialysis at a glomerular filtration rate (GFR) < 10 mL/min per 1.73 m2 to reduce uremic symptoms or signs of malnutrition.
Dialysis treatment is expected to improve nutritional status, as patients receive a more liberal protein prescription compared to the pre-dialysis stage [29,30,31]. However, dialysis treatment is also cited as contributing to the malnutrition burden [8], and newly dialyzing patients are at risk of early mortality attributable to malnutrition, as evidenced by nutrition risk screening using SGA [32], low body mass index (BMI), low mid-arm muscle circumference (MAMC) [22,33,34], low albumin [20], low cholesterol levels [32], and reduced food intake [22,33,34,35], as shown in Table 1.

Iatrogenic Factors of Malnutrition
ESKD patients with pre-existing malnutrition on maintenance dialysis become additionally vulnerable over time to the catabolic effects of the dialysis treatment, which predispose the patient to greater mortality and morbidity in long-term dialysis. The concern is that the presence of poor nutritional status in dialysis patients predicts increased mortality risk. Kwon et al. (2016), prospectively monitoring patients using SGA, observed that those in SGA B and C categories at baseline had almost triple the risk of mortality by 12 months [32]. HD patients experiencing declines in BMI and serum albumin levels over a 6 month follow-up also had increased mortality risk [36]. Table 2 summarizes studies reporting various indicators of poor nutritional status as strong predictors of mortality in maintenance HD patients. Iatrogenic malnutrition or "physician-induced malnutrition" is the development of malnutrition arising from medical procedures, pharmacological treatment, prolonged hospitalization, nosocomial infections, or delayed wound healing [40]. Similarly, aspects of the dialysis procedure contribute to malnutrition that is unavoidable, as it occurs as part of the treatment [8]. These iatrogenic aspects of the dialysis procedure are detailed in the sections below.

Dialysis-Induced Nutrient Losses
The dialysis process is instrumental in chronic nutrient losses, particularly of protein and amino acids. Losses of about 6-12 g of amino acids and 7-8 g of protein during each dialysis session [41,42,43,44,45] may contribute to hypoalbuminemia, a strong predictor of malnutrition and mortality [11,33,36]. Optimal dietary protein intake (DPI) may replenish low plasma amino acids. However, DPI inadequacy is a common issue in HD patients, affecting 32-81% of HD populations globally [31]. Suboptimal DPI combined with dialysis-induced amino-acid losses [46] promotes protein catabolism through increased whole-body and muscle proteolysis [46,47].
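As a rough illustration of how these per-session figures compound over a conventional thrice-weekly schedule, the weekly loss can be tallied with simple arithmetic (a sketch using only the ranges cited above; the schedule is the conventional one, not a value from the cited studies):

```python
# Back-of-envelope weekly dialytic nutrient losses, scaling the
# per-session ranges cited in the text (6-12 g amino acids, 7-8 g
# protein) to a conventional thrice-weekly schedule.
SESSIONS_PER_WEEK = 3

def weekly_loss(per_session_g, sessions=SESSIONS_PER_WEEK):
    """Scale a (low, high) per-session loss range (grams) to a weekly range."""
    low, high = per_session_g
    return (low * sessions, high * sessions)

amino_acid_weekly = weekly_loss((6, 12))  # 18-36 g of amino acids per week
protein_weekly = weekly_loss((7, 8))      # 21-24 g of protein per week
```

Even at the lower bounds, roughly 39 g of amino acids plus protein is lost each week, underscoring why dietary replacement matters.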
Nutrient losses via dialysis depend on the mechanism of solute removal and the pore size of the dialyzer membrane [48]. However, increasing the pore size of dialyzer membranes to enable greater removal of middle molecules also increases involuntary albumin losses, estimated at between 2 and 14 g depending on the degree of membrane permeability [49]. As such, bioincompatible membranes [45], high-flux membranes, hemofiltration (HF) and hemodiafiltration (HDF) techniques [46,50], and multiple dialyzer reuse [43] increase membrane permeability and facilitate greater losses of amino acids into the dialysate [50]. Table 3 shows the degree of protein and amino-acid losses associated with membrane characteristics. Dialysis performed during the 1960s used low-flux membranes [55], which efficiently removed uremic solutes of low molecular weight (<0.5 kDa) but not the middle molecules of 0.5-60 kDa [56,57]. With advanced technology, greater removal of larger uremic solutes became possible using high-flux membranes and, now, membranes with medium (MCO) and high cutoffs (HCO) [53], which combine larger pore sizes with improved HF and HDF techniques [41,58].
With highly permeable membranes or HF and HDF techniques, patients are reported to achieve better intradialytic hemodynamic stability [59], alongside improvements in nutritional status as evidenced by gains in BMI, dry weight, and appetite [60]. These improvements are attributed to the greater removal of middle molecules using HDF. However, these membranes and/or techniques, in addition to increasing the cost burden [57,59,60,61,62], induce greater albumin losses of 3.5 to 9.0 g per HD session [63,64], along with involuntary removal of vitamins, larger protein molecules, and lipids [62].
Conversely, two studies reported a significant reduction in serum albumin levels in patients dialyzing with HCO membranes [63,65]. The risk-benefit balance of using dialyzer membranes with greater permeability, weighing improved uremic solute removal against greater albumin losses, remains unknown [41]. Similarly, advanced techniques using either HF or HDF may also pose a long-term risk of malnutrition development in HD patients. Given that the permissible threshold for albumin losses with highly permeable membranes remains unclear, long-term use of MCO membranes [66] or HCO membranes with HDF [63] may pose a malnutrition risk.

Multiple Dialyzer Reuse
In low-to-middle-income countries, the practice of dialyzer reuse is common [67,68]. However, multiple dialyzer reuse may contribute to negative outcomes [69] such as infection risks, biochemical and immunologic reactions, improper sterilization, increased membrane permeability [49], and loss of performance leading to inadequate dialysis. These issues are believed to arise from the reprocessing procedure involving sanitizing agents [67,70]. However, two studies indicated that single-use, minimal (>6 times), or multiple dialyzer reuse had no impact on dialysis adequacy, body weight, or serum albumin level [71,72].
Whether dialysis access directly contributes to malnutrition has not been shown. Arteriovenous fistula (AVF) failure was not influenced by markers of poor nutritional status except for high cholesterol (p = 0.034) and low normalized protein catabolic rate (nPCR) levels (p = 0.029) [77]. HD patients with catheter access had significantly higher malnutrition-inflammation scores (MIS) and lower serum albumin levels than those with fistula or graft access [78]. In fact, patients with an AVF have a 52% greater survival rate compared to those on a central venous catheter (CVC) irrespective of nutritional status, although malnutrition was found to lower the survival rate by 2% [79]. Catheter access, rather than graft or fistula access, appears to be a significant predictor of a greater inflammatory response and is associated with the highest all-cause mortality rate [78], mediated by infection [20]. Therefore, the route of dialysis access is associated with inflammation and mortality risk [78], whereby the presence of malnutrition may influence the survival rate [79].
Direct effects of the membrane, the extent of complement stimulation induced by the membrane, and the degree of eosinophilia associated with the clearance of cytokines determine the magnitude of the inflammatory response during dialysis [74]. Inflammatory marker levels may be modulated by different types of dialyzer membranes (Table 4). Generally, the high-flux dialyzer membrane and the HDF technique are associated with a lower grade of inflammation in HD patients compared to the low-flux dialyzer membrane. These differences are attributed to the processing technology for the structuring and composition of the membrane, which confers attributes to the dialyzer in terms of biocompatibility, water permeability, clearance, and appropriate sieving coefficients for myoglobin or albumin [59].
Inflammation also occurs with dialysate contamination by microorganisms, which produce endotoxins that pass through the dialyzer membrane and enter the blood circulation [83], amplifying the production of proinflammatory cytokines [84] such as interleukin (IL)-1, IL-6 [85], and tumor necrosis factor (TNF)-α [86]. Infected or old clotted grafts may also contribute to inflammation [87]. Of note, middle molecules such as ILs and TNF-α are not effectively removed by dialysis treatment with a low-flux membrane [88] and, therefore, accumulate.
Overall, dialysis patients are vulnerable to oxidative stress, with a marked increase in reactive oxygen species (ROS) production and antioxidant depletion. ROS induce activation of nuclear factor kappa B (NF-κB), which is translocated to the cell nucleus, stimulating cytokine production and, in turn, causing inflammation [89]. Indeed, HD patients have pronounced NF-κB gene expression compared to a healthy population [90].
Another impact of HD treatment is the activation of polymorphonuclear white blood cells, which trigger the production of ROS and other pro-oxidants [91]. Indeed, increased indices of oxidative damage along with decreased indices of antioxidant defense have been observed in HD patients post-dialysis [92]. Low antioxidant levels in HD patients may also result from vegetable and fruit intakes limited to prevent hyperkalemia [93]. The resultant low intakes of vitamins A, C, and E and selenium would affect antioxidant defense mechanisms [93,94]. Additionally, involuntary removal of vitamins also occurs with every HD session, as mentioned in Section 3.1.
However, Silva et al. (2019) found no difference in markers of oxidative stress and antioxidant defenses between malnourished HD patients, identified using global objective assessment, and mildly malnourished or well-nourished patients [95]. In contrast, accumulation of advanced glycation end products, a biomarker of oxidative stress measured using skin autofluorescence, was significantly associated with markers of malnutrition in HD patients such as lower serum albumin, lower handgrip strength, and lower protein intake [96]. HD patients with protein-energy wasting were 5.2 times more likely to experience oxidative stress, as demonstrated by high protein carbonyl levels (95% confidence interval (CI): 1.1-24.0, p = 0.039) [97].
When malnutrition coexists with inflammation in dialysis patients, the combination of the two conditions is known as malnutrition-inflammation complex syndrome [98]. Inflammation results in a reduction in albumin production in the liver [99] and fosters poor appetite, a non-iatrogenic factor implicated in malnutrition [100]. The strong relationship between malnutrition and inflammation in dialysis patients is evident from the significant association (r = 0.65, p = 0.040) between malnutrition categorized by SGA B and C and high C-reactive protein (CRP) levels [101].

Efficacy of Uremia Correction
A major aim of dialysis therapy is to remove uremic waste products [57,102]. However, dialysis only reduces uremic burden through partial removal of uremic solutes [103]. Incomplete clearance of uremic solutes and the generation of urea from both dialysis-induced tissue degradation and dietary proteins contribute to uremic solute accumulation [104]. Excessive uremic solute burden influences amino-acid and protein metabolism by inhibiting transamination activities of enzymes such as threonine dehydratase and alanine and aspartate transferases [104], impairing membrane transport [105], inhibiting protein binding [106], and promoting muscle wasting [104].
The removal of uremic solutes depends on dialyzer membrane permeability. As mentioned earlier (see Section 3.1), uremic solutes with low molecular weight such as urea and creatinine are efficiently dialyzed via a low-flux dialyzer membrane [88], whereas membranes with higher permeability allow for greater clearance of small and large middle molecules. However, the threshold for clearance depends on uremic gains on non-dialysis days and total clearance achieved from the previous HD session [104,107]. Both these factors would determine the severity of uremia in HD patients. Aside from the permeability of the membrane, total clearance of uremic solutes is also determined by dialysis adequacy, dialysis frequency, and duration of dialysis session [108].

Dialysis Adequacy
Uremic solute clearance depends on dialysis adequacy, which refers to the frequency and duration of dialysis [102]. Expert guidelines for optimal uremic solute removal favor a three- to five-hour dialysis session provided three times weekly [3] in order to meet dialysis adequacy by achieving a Kt/V urea of 1.2, where K designates the dialyzer urea clearance, t the time on dialysis, and V the total body water [57,102].
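In practice, single-pool Kt/V is usually estimated from pre- and post-dialysis blood urea measurements rather than measured directly. The sketch below uses the Daugirdas second-generation formula, a standard clinical approximation not detailed in this review; the patient values are hypothetical:

```python
import math

def sp_ktv(pre_bun, post_bun, t_hours, uf_litres, post_weight_kg):
    """Single-pool Kt/V via the Daugirdas second-generation formula:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * (UF / W),
    where R is the post-/pre-dialysis BUN ratio, t the session length (h),
    UF the ultrafiltration volume (L), and W the post-dialysis weight (kg)."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * t_hours) + (4 - 3.5 * r) * (uf_litres / post_weight_kg)

# Hypothetical session: BUN falls from 70 to 25 mg/dL over 4 h,
# with 2.5 L of ultrafiltration and a 70 kg post-dialysis weight.
ktv = sp_ktv(pre_bun=70, post_bun=25, t_hours=4, uf_litres=2.5, post_weight_kg=70)
adequate = ktv >= 1.2  # minimum target cited in the guidelines above
```

For this hypothetical patient, the estimate comes out just above the 1.2 threshold, so the session would meet adequacy.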
However, the Hemodialysis (HEMO) study, in a 3-year follow-up of 1846 dialyzing patients, established that neither a high (Kt/V = 1.65) nor a low (Kt/V = 1.25) dialysis dose significantly affected markers of nutritional status such as serum albumin, post-dialysis weight, dietary energy and protein intakes, calf and upper-arm circumference, and appetite status (all p > 0.05) [109]. The only effect was on normalized protein catabolic rate (nPCR), a surrogate marker for DPI, which showed a greater decline in the low compared to the high dialysis dose group (p = 0.007).
Of note, the calculation of Kt/V urea is based on urea, a surrogate marker for clearance of small solutes. It does not represent removal of the more detrimental larger uremic solutes [107,110]. Inefficient removal of uremic toxins via dialysis is suggested to induce taste alteration in HD patients, which contributes to malnutrition [111,112].

Dialysis Frequency
Increasing the frequency of dialysis sessions from thrice weekly to >4 times weekly may support better management of fluid removal [113] and lower systolic blood pressure, as well as improve quality of life (QoL) [23]. Alternatively, shorter but more frequent dialysis, provided as six 2 h HD sessions per week, may yield greater removal of uremic solutes [56], as shown in patients with lower trends in pre-dialysis serum levels of creatinine, urea, uric acid, and protein-bound solutes such as indole-3-acetic acid and indoxyl sulfate. However, serum albumin levels and post-dialysis weight did not improve for these patients [56]. In contrast, Rashidi et al. (2011) observed improved weight, BMI, and serum albumin status, along with decreased serum CRP, in patients converting from the standard dialysis regime to four 4 h HD sessions per week by six weeks [114]. Dietary intake also improved, albeit non-significantly. This effect may perhaps be explained by the improved appetite that occurs with greater removal of uremic compounds through more frequent dialysis [115].
In some countries, the weekly frequency of dialysis may depend on the patient's access to financial support. For example, dialysis in low-income countries such as India and Pakistan may be offered as two 4 h sessions per week [116,117], compared to the standard three 4 h sessions per week in developing countries [116,118]. Treatment unaffordability, poor access to nephrology care and dialysis centers [116], and inadequately equipped dialysis facilities [117] are reasons for the lower dialysis frequency. Interestingly, Chauhan and Mendonca (2015) showed that, among 50 Indian HD patients undergoing dialysis twice a week, those achieving a dialysis adequacy of Kt/V ≥ 2.0 significantly improved their serum albumin and hemoglobin levels [118].

Dialysis Duration
Increasing the dialysis duration results in greater removal of small and large uremic solutes compared to the standard HD regime [102,119]. Patients dialyzing for 8 h using a high-flux dialyzer membrane have shown greater total solute removal, dialyzer extraction ratios, and total cleared volumes for urea, creatinine, phosphorus, and β2-microglobulin compared to patients on standard dialysis, without affecting dialysis adequacy [102]. Lower post-dialysis levels of the uremic toxin indoxyl sulfate have been observed in patients dialyzing for 8 h compared to the standard 4 h regime (17.2 ± 3.6 vs. 27.5 ± 3.2 µg/mL, p = 0.049), despite both patient groups having similar pre-dialysis levels [120].
Improvements in nutritional markers, through higher serum albumin and hemoglobin levels and a lower white blood cell count, appear to be associated with longer dialysis duration, as indicated by combined data from the three phases of the DOPPS [121]. These improvements may be explained by the greater removal of both small and large solutes with longer hours of dialysis [102].

Efficacy of Metabolic Acidosis Correction
Metabolic acidosis develops in the early stages of CKD from the kidney's inability to excrete nonvolatile acids and synthesize bicarbonate to maintain acid-base balance [122]. HD treatment aims to correct metabolic acidosis by maintaining pre-dialysis serum bicarbonate levels between 24 and 26 mmol/L, as recommended by current opinion [126], via the bicarbonate concentration of the dialysate [123], the ultrafiltration rate [124], dialyzer membrane surface area and permeability [125], blood and dialysate flow rates [124], the transmembrane concentration gradient set by the patient's serum bicarbonate level and the bicarbonate available from the dialysate [125], and dialysis adequacy [122,123]. However, metabolic acidosis correction also depends on patient-related determinants such as interdialytic weight gain [123], acid generation from high protein intake [122,127], and gastrointestinal losses of bicarbonate [122,125]. Individual fluctuation in patients' bicarbonate levels challenges optimal management [122].
Metabolic acidosis contributes to malnutrition by reducing protein synthesis and increasing muscle degradation [123]. The malnutrition pathway in HD patients involves protein catabolism, secondary insulin resistance, inflammation, and increased serum leptin levels [122]. Evidence from animal and human studies indicates that increased muscle breakdown during metabolic acidosis occurs via two mechanisms: increased activation of branched-chain ketoacid dehydrogenase (BCKAD) and the ATP-dependent ubiquitin-proteasome system (UPS) pathway [128]. Acidosis stimulates increased gene transcription and activity of the BCKAD enzyme, which degrades the branched-chain amino acids (BCAAs), namely, leucine, isoleucine, and valine. BCAAs are important precursors for protein synthesis and are mainly metabolized in the muscle [50]. Increased BCAA oxidation, therefore, is the basis for the higher protein requirement of HD patients [129]. Moreover, metabolic acidosis concomitant with dietary insufficiency and uremia further exacerbates protein catabolism in dialysis patients [130]. Metabolic acidosis also activates the UPS by increasing gene transcription of the proteasome and ATP-dependent ubiquitin, components of the muscle protein degradation pathway [128]. This chain leads to increased caspase-3 activity, which promotes cleaving of muscle fibers, resulting in poor muscle mass [128].
Additionally, the acidic environment affects insulin binding to receptors, thus reducing tissue sensitivity to insulin and affecting glucose uptake [131]. Separately, metabolic acidosis also inhibits the anabolic effect of insulin, causing muscle depletion in dialysis patients [122]. Moreover, cell culture studies have shown that TNF-α and interleukins are generated in an acidic environment, triggering an inflammatory response [132,133].
The impact of metabolic acidosis on the nutritional status of HD patients, as assessed by serum bicarbonate levels, may present anomalies in interpretation. In one study, patients with serum bicarbonate levels ≤22 mmol/L had lower serum albumin levels than those with levels >22 mmol/L (p = 0.046) [134]. In these patients, serum bicarbonate levels correlated negatively with nPCR (r = −0.492, p = 0.045) but positively with serum albumin (r = 0.432, p = 0.019); acidosis-led catabolism triggers the breakdown of endogenous proteins, which drives higher nPCR levels. In different malnourished HD populations, serum bicarbonate levels of >23 mmol/L [135,136] or >27 mmol/L [127,136] have been associated with greater mortality risk. As malnutrition is a confounding factor for serum bicarbonate level, there is no ideal serum bicarbonate level that fits all dialysis patients [137].

Non-Iatrogenic Causes of Malnutrition
Comorbid non-iatrogenic factors may also contribute to malnutrition development in dialysis patients. These non-iatrogenic factors are elaborated on in the sections below.

Suboptimal Dietary Intake
Suboptimal dietary intake is a primary contributing factor to malnutrition [12] and is associated with increased mortality in HD patients [31,138]. Adult recommendations for dietary energy intake (DEI) and DPI to achieve nutrient adequacy have been proposed for HD patients by several expert groups, and these generally fall within 25-35 kcal/kg ideal body weight (IBW)/day for DEI and 1.0-1.2 g protein/kg IBW/day for DPI [126,139,140,141]. Requirements factor in criteria to maintain physiological balance, prevent deficiencies from dialysis-induced nutrient losses, and reduce the risk of malnutrition and mortality [126].
However, achieving DEI and DPI adequacy remains a challenge for HD patients, with intakes falling below recommendations in many studies (Table 5). This is evidenced by 70-90% of global HD populations reported with DEI inadequacy, whereas DPI inadequacy ranges between 30% and 80%. Suboptimal DEI is of greater concern than DPI inadequacy, as an energy deficit drives gluconeogenesis from amino acids. Three studies reported HD patients achieving DPI adequacy (>1.2 g/kg BW/day) but failing to meet DEI adequacy [146,151,155]. Insufficient DEI, despite DPI adequacy, predisposes patients to negative nitrogen balance, as both dietary protein and muscle protein are diverted to fuel body energy requirements [154]. Additionally, amino-acid losses occurring through the dialysis procedure (see Section 3.1) affect protein synthesis, triggering muscle proteolysis to generate amino acids when DPI is low [158]. Of concern, suboptimal dietary intake bears a negative impact on the survival rate of HD patients, as indicated by studies of patients with poor DEI and DPI (Table 6). Kang et al. (2017) found that HD patients with DEI < 25 kcal/kg BW/day and DPI < 0.8 g/kg BW/day had 86% and 35% increased risks of mortality, respectively [138]. Similarly, those with DPI < 1.2 g/kg BW/day had a 4.98-fold greater risk of mortality [159]. A recent metabolomics study reported higher concentrations of 3-hydroxybutyrate and tartrate, along with low creatinine, in patients with protein-energy wasting [160]. These metabolites are linked to gluconeogenesis and may be conditional on suboptimal DEI and DPI [161].
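The guideline thresholds above translate directly into a simple per-patient adequacy check; a minimal sketch, where the function and the example patient are illustrative rather than drawn from any cited study:

```python
# Check reported intakes against the guideline ranges quoted in the text:
# DEI 25-35 kcal/kg IBW/day and DPI 1.0-1.2 g/kg IBW/day.
DEI_MIN_KCAL_PER_KG = 25.0
DPI_MIN_G_PER_KG = 1.0

def intake_adequacy(ibw_kg, energy_kcal_per_day, protein_g_per_day):
    """Return per-kg intakes and whether each meets its minimum target."""
    dei = energy_kcal_per_day / ibw_kg
    dpi = protein_g_per_day / ibw_kg
    return {
        "dei_kcal_per_kg": round(dei, 1),
        "dpi_g_per_kg": round(dpi, 2),
        "dei_adequate": dei >= DEI_MIN_KCAL_PER_KG,
        "dpi_adequate": dpi >= DPI_MIN_G_PER_KG,
    }

# Hypothetical patient: 60 kg IBW, eating 1300 kcal and 55 g protein per day.
result = intake_adequacy(60, 1300, 55)  # both intakes fall short of target
```

For this hypothetical patient, DEI works out to about 21.7 kcal/kg and DPI to about 0.92 g/kg, so both fall below the lower bounds of the recommended ranges.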
Suboptimal DEI from reduced food intake also affects adequacy for other essential nutrients, as overall diet quality falls. Kim et al. (2015) reported that an insufficient DEI of 21.90 ± 6.70 kcal/kg BW affected adequacy for micronutrients such as vitamins A and C, thiamin, riboflavin, niacin, folate, calcium, phosphorus, and zinc, as well as dietary fiber [154].

Taste Alterations
Low palatability of diets is underscored by the taste alterations experienced by HD patients, a factor reportedly affecting 31-44% of HD populations [165]. Lynch et al. (2013) found that 34.6% of 1745 HD patients in the HEMO study self-reported taste alteration [165]. Compared to those reporting no taste alterations, such patients clearly showed poorer nutritional status, as indicated by lower dry weight, serum albumin, serum creatinine, nPCR, and DPI. Reported DEI values for both groups were similarly low (22.8 ± 9.8 vs. 23.1 ± 9.4 kcal/kg/day, p = 0.260), highlighting that energy inadequacy prevailed for all patients. This study also reported a 71% increased risk of mortality (odds ratio (OR) = 1.71, 95% CI: 1.01-1.37) in patients with altered taste perception.
Taste alterations experienced by HD patients may be explained by food aversion learning [166]. Aversions toward protein-rich foods such as meat have been significantly associated with an enhanced metallic taste in patients also reporting poor appetite [167]. Clearly, taste alterations in HD patients foster food aversion learning, which impacts appetite and reduces overall diet quality, thus contributing to malnutrition. Reduced taste perception may also be related to zinc deficiency [168].

Poor Appetite
HD patients reporting poor appetite experience a significantly higher frequency of hospital admissions, longer duration of hospitalization, poorer QoL, and worse nutritional outcomes, such as lower normalized protein nitrogen appearance levels and higher inflammatory marker levels, than those reporting good appetite [163,165]. The immediate impact of poor appetite is reduced dietary adequacy and increased risk of malnutrition. This was shown in Malaysian HD patients, in whom poor appetite compared to good appetite was significantly linked to lower DEI (14.34 vs. 23.12 kcal/kg IBW/day, p = 0.049) and DPI (0.45 vs. 0.94 g/kg IBW/day, p = 0.010) but higher MIS scores (9.5 vs. 6.6, p = 0.039), indicative of malnutrition [148]. Of concern, patients with diminished appetite faced a 4.74 times greater risk of mortality [163].
The mechanism of poor appetite may be explained by changes in appetite hormones in HD patients. Ghrelin, an orexigenic hormone mainly secreted by the stomach, regulates appetite by stimulating spontaneous food intake [29,169]. Ghrelin circulates in two forms: acyl ghrelin, the active form and the main orexigenic molecule, and des-acyl ghrelin, which has an anorexigenic effect [169]. Evidence suggests that des-acyl ghrelin may have a negative effect on appetite, whereas high acyl ghrelin levels, associated with adiposity, indicate better nutritional status [29,169]. Moreover, an association among ghrelin, inflammation, and nutritional status has been reported [170].
Leptin, an adipokine, has an inhibitory effect on appetite in normal metabolism [171]. However, leptin's role in regulating appetite in CKD is controversial. Hypoleptinemia has been associated with malnutrition in HD populations although its mechanistic involvement in causing poor nutritional status is unknown [172,173]. Low serum leptin levels were independently associated with high MIS status observed in 100 Taiwanese HD patients [172] and 65 Turkish HD patients [174]. However, these studies could not show any association of low leptin levels with inflammatory markers. It may be implied that low leptin levels may serve as a marker of poor nutritional status, whereas higher levels may indicate leptin resistance [175], as there is an attenuation of appetite suppression [172]. Iikuni et al. (2008) proposed leptin as the link between energy homeostasis and inflammation in normal metabolism, where greater leptin secretion occurs with greater adiposity, which in turn promotes production of inflammatory cytokines [176]. Applying this hypothesis to the CKD population, it may be inferred that malnourished HD patients with low BMI will have less leptin secreted by adipocytes, whereas the reverse may occur with better nutritional status and higher BMI.
Studies reporting the impact of poor appetite on malnutrition as indicated by various nutritional outcomes in HD patients are summarized in Table 7.

Insulin Resistance
Insulin resistance is implicated in the etiology of malnutrition in HD patients. Insulin at physiological levels bears both catabolic and anabolic effects on skeletal muscle. Insulin's anabolic role is to promote BCAA transport and regulate protein synthesis in the muscle [180]; another anabolic role is facilitating glucose transport and uptake by muscle tissues [181]. Reduced insulin secretion by the pancreatic β-cells and impaired tissue sensitivity to insulin at the receptor and post-receptor levels in the heart, liver, or muscle are the two pathways of insulin insufficiency [182,183]. More commonly, "uremic insulin resistance" may occur through inflammatory pathways arising from insufficient removal of dialyzable uremic solutes [180]. Of relevance, insulin resistance at the receptor level is traced to defects in the insulin receptor signaling pathway arising from the metabolic derangements accompanying kidney disease, such as uremia, metabolic acidosis, anemia, and inflammation [8,184].
Insulin resistance is associated with peripheral resistance to glucose uptake at the skeletal muscle site and manifests as impaired insulin signaling through phosphorylation of insulin receptor substrate-1, which inhibits tyrosine kinase activity at the insulin receptor [181,183]. Another pathway of insulin resistance and impaired glucose metabolism may be suggested by high circulating retinol binding protein 4 (RBP4) [185]. Animal studies have demonstrated an inverse relationship between RBP4 and glucose transporter type 4 (GLUT4), the insulin-dependent transporter that mediates glucose uptake in fat and muscle [186,187]. High RBP4 affects glucose metabolism by inducing gluconeogenesis and inhibiting glucose uptake in muscle via feedback suppression of its expression in adipose tissue, with consequent reduction in GLUT4 expression and glucose uptake [185].
Reduced insulin sensitivity affects BCAA transport, blunting the anabolic effect of insulin in limiting skeletal muscle breakdown [180]. Depletion of BCAAs due to amino-acid losses via dialysis, along with suboptimal dietary intake, leads to increased proteolysis to supply amino acids needed for protein synthesis [188]. Therefore, insulin resistance promotes muscle proteolysis, and this association is evident in HD patients, with studies reporting a positive correlation between insulin resistance and muscle loss [189,190].
It is, thus, clear that, in ESKD patients, apart from chronic suboptimal food intake, gluconeogenesis may also be driven by insulin resistance associated with inflammation, uremia, and metabolic acidosis. Gluconeogenesis is a normal adaptive catabolic process to produce energy. Under conditions of energy sufficiency, protein is spared for its primary function of tissue synthesis and repair. With dietary energy insufficiency, amino acids derived from dietary protein or from breakdown of skeletal muscle during starvation become substrates for energy [158]. As HD patients are known to have suboptimal food intake, increased gluconeogenesis in these patients stimulates muscle proteolysis, leading to greater risk of malnutrition.

Psychosocial Factors
Psychosocial factors may negatively impact physical and emotional status, QoL, and nutritional status in HD patients [149,191,192], as shown in Table 8.

Depression
Depression is reported to be prevalent in 6% to 84% of HD patients [204] and arises from loss of the provider role within a family, unemployment, lack of social support, and reductions in mobility, physical strength, cognitive ability, and sexual function [204]. Additional factors are anxiety and stress from the burden of kidney failure together with fluid and dietary restrictions, which are significantly associated with poor QoL in these patients [204,205].
Several studies have reported associations between depression and markers of malnutrition such as poor anthropometry measures, as well as low serum albumin, creatinine, hemoglobin, and nPCR levels with increased inflammation [191-193]. Separately, malnourished HD patients, as indicated by MIS ≥ 6, faced a 52% increased mortality risk (hazard ratio (HR) = 1.52, 95% CI: 1.13-2.05), along with more depression symptoms and poorer QoL [194]. These findings suggest that depression should be considered an independent risk factor for malnutrition. Indeed, malnutrition reversal in HD patients by antidepressant treatment has been observed, with significant improvements in nPCR, serum albumin, and pre-dialysis blood urea nitrogen levels, along with a significant decrease in depression score compared to healthy controls [206]. Thus, assessment and treatment of depression should be considered as part of overcoming malnutrition in HD patients.

Lack of Social Support
ESKD patients exist in a complex matrix of relationships with family, friends, healthcare professionals, and financial support [207]. The quality of emotional and financial support provided by their social network influences stress management, QoL, health-promoting behaviors, malnutrition, and mortality in HD patients [196,201,207].
HD patients lacking social support have a higher prevalence of diminished appetite, reduced physical functioning, and poor adherence to HD treatment [197,200,201]. Greater non-adherence to nutritional recommendations among HD patients without family support was attributed to the absence of personal engagement and encouragement from family members to tackle these issues [197,198]. DOPPS Phases 1-3 found that HD patients with poor social support were more likely to have serum albumin <3.5 g/dL (OR = 1.18, 95% CI: 1.02-1.37) [201]. In contrast, HD patients with social support achieved better social interactions and coping mechanisms toward kidney disease [202], as well as fewer depression symptoms [208]. Ultimately, the presence of social support enables patient self-efficacy toward better health status.

Financial Constraints
Financial constraints commonly faced by HD patients may be attributed to treatment-imposed physical limitations on performing work tasks and the time commitments of dialysis [200]. With unemployment, HD patients become dependent on financial support from caregivers or welfare agencies. Financial dependence triggers loss of self-esteem and depression, leading to poor self-efficacy toward health management [195]. The consequence of limited financial resources is suboptimal dietary intake [200].
The employment rate in 231 Chinese working-age HD patients was 51% prior to HD initiation, which fell to 11% once they began treatment [209]. These patients reported that the dialysis schedule and post-dialysis fatigue were major reasons for unemployment. HD patients need to adhere strictly to their dialysis schedule of three sessions per week, which consumes up to 18 working hours a week [200]. Additionally, post-dialysis fatigue affects their ability to work [210,211]. Data from the Finnish Registry for Kidney Diseases (n = 2637) found that HD patients had a lower employment rate than peritoneal dialysis patients (19% vs. 31%, p < 0.001), attributable to the latter's greater flexibility in treatment schedule and mobility [212].
Lack of income affects approximately 50% of the HD population [202,203]. Financial constraints were blamed for poor adherence to dietary recommendations, as patients limited their access to healthy food choices on the basis of cost or faced the dilemma of having to choose between spending on medicine or food [202]. Conversely, HD patients with employment achieved greater DEI (+281 kcal/day, p < 0.01) despite perceiving their income to be insufficient to meet their needs [213].
The influence of financial status on dietary intake is not fully understood, but it remains a contributory factor to malnutrition [142,213]. Freitas et al. (2014) indicated that low income was associated with a 13% increase in malnutrition prevalence in 344 Brazilian HD patients [203]. Higher SGA and MIS scores indicative of malnutrition were associated with socioeconomic nutritional barriers such as difficulty in purchasing food (OR = 1.89, 95% CI: 1.27-2.88, p = 0.002) and requiring assistance in meal preparation (OR = 1.15, 95% CI: 1.06-2.06, p = 0.001) [166]. Both factors highlight the impact of financial constraints on the nutritional status of HD patients.

Decreased Physical Functioning
Comorbidities associated with CKD, such as sarcopenia, vascular dysfunction, inflammation, and malnutrition [214], negatively impact the three components of physical functioning [215], which relate to body functions and structure, the ability to perform physical activity, and participation in physical activity.
Fatigue is central to the reduced physical capacity of HD patients to perform activities of daily living and contributes to malnutrition [207,216,217]. The dialysis process itself may cause fatigue, stiffening of joints, and muscle cramping, thus affecting work task performance [218]. Indeed, fatigue was reported to affect the ability to prepare meals, with 59% of HD patients reporting "being too tired to prepare meal" as a barrier to dietary adherence, and this barrier was associated with lower DEI (r = −0.125, p = 0.002) [213]. Malnourished HD patients, identified using a Mini Nutritional Assessment score < 19 and SGA score > 8, had poor activities of daily living scores [219], suggesting that the ability to perform simple daily tasks was affected in these patients.

Conclusions
We presented a global view that malnutrition susceptibility in the HD community may be of iatrogenic origin, of non-iatrogenic origin, or both. Keeping in view disparities in dialysis provision between upper- and lower-middle-income countries, leveraging the point of origin of malnutrition in dialysis patients would enable healthcare practitioners to deliver personalized patient care, as well as country-specific malnutrition treatment strategies. Nutrition assessment is a critical first step to identify the factors of iatrogenic and/or non-iatrogenic origin implicated in malnutrition etiology. This is important because the next step, prevention or treatment of malnutrition, depends on the identified factors and on aligning effective strategies for nutritional intervention.
The iatrogenic factors that may be implicated in malnutrition are (i) low dialysis adequacy resulting in poor correction of uremia and metabolic acidosis, and/or (ii) low serum albumin levels in patients dialyzing on highly permeable membranes or with dialysis techniques that incur greater amino-acid losses. The non-iatrogenic assessment should identify (i) dietary inadequacy as per suboptimal DEIs and DPIs, (ii) poor appetite status, inflammation markers, low diet quality, and a high diet monotony index, which indicate barriers to achieving dietary adequacy, and/or (iii) psychosocial and financial barriers to nutritional optimization. These factors are modifiable and should be incorporated as part of a comprehensive nutritional assessment. Identification of the patient-specific factors causing malnutrition is crucial for healthcare practitioners to provide more personalized patient care to treat malnutrition.