24th International Symposium on Infections in the Critically Ill Patient

This 24th International Symposium on Infections in the Critically Ill Patient aims to review current concepts, technology, and recent advances in infections in the critically ill patient [...].


Introduction
This 24th International Symposium on Infections in the Critically Ill Patient aims to review current concepts, technology, and recent advances in infections in the critically ill patient.
Sepsis, pulmonary infections, and their therapeutic and preventive strategies will be the topics presented by international experts, who will review and update sepsis as a global problem.
New guidelines of the Surviving Sepsis Campaign, fluid therapy and vasopressors, a balanced view between personalized and protocolized treatment, and new recommendations for the design of future randomized controlled trials are provided. The immune response and emerging methods to personalize sepsis care, including new biomarkers and immunomonitoring of patients with sepsis, represent a complementary view on treating patients with severe infections and organ failure, in addition to early antibiotics and control of the source of infection.
New ways to treat pulmonary infections, including the new global guidelines, international actions against multiresistant microorganisms, and the development of new antibiotics, represent key factors to improve the outcome of severe infections.

José Garnacho, MD
Intensive Care Clinical Unit, Virgen Macarena University Hospital, Seville, Spain


Negative Clinical Trials: a Revisited Design
James A. Russell
Professor of Medicine, University of British Columbia; Principal Investigator, Centre for Heart Lung Innovation, Vancouver, BC, Canada
Sepsis is a very heterogeneous condition, and no drugs have been approved for sepsis save for the transient use of activated protein C. Nearly all randomized controlled trials (RCTs) in sepsis and septic shock are negative, so we MUST change our paradigm to improve patient outcomes and mitigate increasing health care costs.
I suggest a BETTER design: Biomarker-guided Early anTibiotic adjuvanTs in ER (BETTER). Table 1 summarizes key features of BETTER RCTs in sepsis and septic shock compared to oncology trials that have led to better outcomes.
Biomarker-guided means using predictive biomarkers that identify patients who have an improved response to therapy, such as genomic markers for improved response to vasopressin [1], norepinephrine [2], angiotensin-II [3], corticosteroids [4,5], PCSK9 inhibition [6,7], and CETP inhibition [8,9], and for weaning from mechanical ventilation [10]. More specifically, the half-life of vasopressin in human plasma is 4-24 min and is primarily determined by leucyl/cystinyl aminopeptidase (LNPEP; also known as vasopressinase), a physiologically essential enzyme that cleaves peptide bonds of vasopressin. Genetic variation in LNPEP (vasopressinase) is associated with 28-day mortality in septic shock and with a biological effect on vasopressin clearance and serum sodium regulation. Regarding norepinephrine, the β2-adrenergic receptor gene (ADRB2) plays a key role in outcome and response to adrenergic agonists in cardiovascular diseases. We found that the AA genotype of the β2-adrenergic receptor gene rs1042717 G/A polymorphism, marking the known functional ADRB2 CysGlyGln haplotype, was significantly associated with increased mortality and more organ dysfunction in two cohorts of septic shock patients. These results are consistent with the observation that the AA genotype is associated with decreased responsiveness to the anti-inflammatory effects of adrenergic agonists. Regarding weaning, an RCT compared two methods of weaning from ventilation: (1) weaning guided by fluid management to progressively decrease B-type natriuretic peptide (BNP, also known as NPPB) using diuretics (i.e., a biomarker-guided protocol) versus (2) weaning and fluid management by usual clinical guidelines. The BNP group weaned more quickly (time to extubation was decreased from about 59 to about 42 h) and had more ventilator-free days [10].
This biomarker-guided strategy is the standard of care and has improved outcomes in oncology (e.g., in breast cancer, HER2/neu expression defines who gets treatment with Herceptin).
Early means early in time and early in disease evolution. Time: Vasopressin infusion is more effective when used within 12 h; disease evolution: vasopressin is also more effective in patients who have less severe septic shock (norepinephrine dose < 15 µg/min or lactate < 2 mmol/L) [11,12] perhaps in part because vasopressin prevents progression of inflammation [13] and organ failure. Early goal-directed therapy (EGDT) was not effective overall in three recent RCTs [14] but may have been effective in patients who had lower levels of inflammatory and coagulation biomarkers [15]. We and others have noted that several therapies are more effective when given to patients who are less severely ill, perhaps because their condition is more reversible and the inflammatory, coagulation, and apoptosis cascades are not as briskly stimulated then [16].
anTibiotic adjuvanTs means supplementing the bacterial killing effects of antibiotics by augmenting the host's natural lipopolysaccharide (LPS) and lipoteichoic acid (LTA) clearance mechanisms. Early antibiotics improve mortality of septic shock [17][18][19] but do not remove LPS or LTA directly. Microbial cell walls contain pathogenic lipids, including LPS in Gram-negative bacteria, lipoteichoic acid in Gram-positive bacteria, and phospholipomannan in fungi. These pathogen lipids are major ligands for innate immune receptors that trigger the septic inflammatory response. Alternatively, the host has a defense mechanism: pathogen lipids can be cleared and neutralized, thereby mitigating the inflammatory response. Pathogen lipids released into the circulation are initially bound by transfer proteins, notably LPS binding protein (LBP) and phospholipid transfer protein (PLTP), and incorporated into high-density lipoprotein (HDL) particles. LBP, PLTP, and other transfer proteins transfer these lipids to ApoB-containing lipoproteins, including low-density (LDL) and very-low-density (VLDL) lipoproteins and chylomicrons. LDL and VLDL LPS and LTA are cleared by the LDL and VLDL receptors in liver and adipose tissue, respectively. LDL receptor density is decreased by proprotein convertase subtilisin/kexin type 9 (PCSK9), and PCSK9 inhibition increases LDL receptor density and decreases LPS, LTA, and cytokine levels as well as mortality in human genomic (in patients who carry PCSK9 loss-of-function variants) and animal model studies [7,20]. Low HDL levels are associated with increased mortality [21], and subjects who have cholesteryl ester transfer protein (CETP) loss-of-function variants have higher HDL levels than wild-type subjects. Human genomic (CETP loss-of-function variants) and animal model studies show that increasing HDL by CETP inhibition improves outcomes [8,9] of septic shock.
In future studies, the integration of different genomic methods and the application of multiple 'omics approaches (e.g., genomics, proteomics, metabolomics, transcriptomics, and epigenomics) could lead to the discovery and validation of accurate biomarkers that predict risk, response to therapy, and mortality, and to novel drugs for septic shock. We have utilized a novel inverted drug discovery strategy that incorporates (1) a focus on the early infectious stage of sepsis, (2) multiple 'omics (multi-'omics), and (3) starting with human 'omics for drug-candidate discovery (instead of animal models), confirming mechanisms in human cell and clinically relevant sepsis models, and only then making go/no-go decisions on potential validated targets for clinical development; all of this is designed to increase the chances of discovering effective sepsis drugs.
In ER means that RCTs must recruit septic patients in the ER and not wait until patients are in the ICU, so that biomarkers can be measured early and personalized care can be initiated early. Three RCTs of EGDT were coordinated across continents and countries [22] so that results could be compared and a patient-level meta-analysis [14] could be done; all of this was planned prospectively and emphasized ER recruitment of patients into these complex RCTs. These RCTs highlighted the successful partnerships of ER and ICU in RCTs that recruited patients with septic shock within 1-2 h of presentation [14,[23][24][25]. Another ER study showed that early antibiotics in the ER are the most effective component of early ER interventions [19].

The Surviving Sepsis Guidelines [1] recommend norepinephrine (NE) as the first vasopressor agent, followed by epinephrine (EPI) or vasopressin (AVP) in patients who do not respond adequately to norepinephrine. Dobutamine is added in patients who have evidence of depressed ventricular function due to underlying disease or septic cardiac dysfunction.
I review several new studies regarding these agents, as well as several new agents that were not successful in pivotal RCTs (levosimendan), were recently approved (angiotensin-II (AT-II)), or are in development (selepressin).

Norepinephrine
The ARISE [2] Early Goal-Directed Therapy (EGDT) RCT included a sub-study of vasopressor use; the median time from ED presentation to commencing a vasopressor was 4.4 (2.7, 7.1) h (38% prior to central venous access), after receiving 3.1 (2.3, 4.3) L of intravenous fluid [3]. Interestingly, earlier initiation of vasopressor(s) was associated with higher crude 90-day mortality.
In patients who are on both NE and AVP, tapering NE before AVP may be associated with a higher incidence of hypotension [4]. Clinically, NE is dosed either by weight or not by weight. A retrospective study found that morbidly obese patients had lower in-hospital mortality but higher 1-year mortality compared to normal- and under-weight patients. Cumulative norepinephrine exposure was highest in morbidly obese patients [5]. Total norepinephrine exposure was an independent mortality predictor in septic shock.

Vasopressin
The Vasopressin and Septic Shock Trial (VASST) found no difference in overall mortality, but decreased mortality in AVP-treated patients who had less severe septic shock [6]. It is not clear whether AVP versus NE changed mortality in practice in the VASST coordinating center hospital after VASST was published. We used propensity matching of AVP- to NE-treated patients in the VASST coordinating center before and after VASST was published [7]. Before VASST, AVP was associated with increased mortality compared to NE, while after VASST, there was no difference in mortality between AVP- and NE-treated patients.
Acute Kidney Injury (AKI) sub-phenotypes could be used to identify responsiveness to AVP in septic shock. Latent class analysis methodology was applied independently in two critically ill populations (Discovery: n = 794 and Replication: n = 425) with AKI. In VASST, AVP compared to NE was associated with improved 90-day mortality in AKI-SP1 (27% vs. 46%), but no significant difference in AKI-SP2 (45% vs. 49%) [8]. This analysis identified two molecularly distinct AKI sub-phenotypes with different response to AVP.
Some patients with septic shock are on chronic renin-angiotensin-aldosterone system inhibitor but it is unknown whether the hemodynamic response to AVP differs between patients who are on vs. not on chronic renin-angiotensin-aldosterone system inhibitor(s) [9]. There was no significant difference in 6-hour mean arterial pressure in septic shock patients receiving vasopressin who were on versus those not on chronic renin-angiotensin-aldosterone system inhibitor therapy.
Renin-angiotensin-aldosterone system inhibitor patients had lower total vasopressor requirements at 24 h compared with non-renin-angiotensin-aldosterone system inhibitor patients.
The Septic Shock 3.0 definition could alter treatment comparisons in RCTs. We wondered whether the AVP versus NE comparison of 28-day mortality of patients who met the Septic Shock 3.0 definition (lactate > 2 mmol/L) differed from AVP versus NE in the (pre-Sepsis 3.0) septic shock definition used in VASST [10]. In VASST, the Septic Shock 3.0 definition decreased sample size by half and increased 28-day mortality rate by 10%. AVP lowered mortality versus NE if lactate was less than or equal to 2 mmol/L. Patients had higher plasma cytokines in lactate greater than 2 vs. less than or equal to 2 mmol/L, indicating a brisker cytokine response to infection. The Septic Shock 3.0 definition and our findings have important implications for RCT design in septic shock.

Levosimendan
Levosimendan was a possible alternative to dobutamine in septic shock with depressed ventricular function because levosimendan is a calcium-sensitising drug with inotropic and other properties that could be beneficial. In a large pivotal RCT there was no difference in SOFA score between the levosimendan group (6.7, SD 4.0) and placebo, or in 28-day mortality rates (34.5% and 30.9%) [11]. Levosimendan was associated with a lower likelihood of successful extubation and an increased risk of supraventricular tachyarrhythmias. This RCT was critiqued because a baseline echocardiogram demonstrating decreased ventricular function was not required for inclusion in the RCT. Nonetheless, levosimendan is not recommended for septic shock with depressed ventricular function.

Angiotensin-II
AT-II is a potent vasoconstrictor that could be beneficial in refractory septic shock and a proof-of-principle RCT showed some benefit on short term markers (marked reduction in NE dose requirements) [12]. A large RCT found that angiotensin II increased MAP more rapidly (within 3 h) than did NE [13] but there were no results regarding effects on non-cardiovascular organ dysfunction. Mortality was nominally but not significantly lower with angiotensin II than NE (46% vs. 54%, p = 0.12). Angiotensin II is available now clinically and may be effective for profound vasodilatory shock.

Selepressin
Selepressin is a highly selective V1a agonist that could have advantages over AVP because selepressin would avoid vasopressin-induced increases in von Willebrand multimers (procoagulant proteins) and could mitigate increased permeability and positive fluid balance in septic shock [14]. Selepressin increased MAP and quite dramatically minimized the increase in fluid balance in an ovine model of septic shock [15]. A proof-of-principle RCT showed that a greater fraction of patients on 2.5 ng/kg/min selepressin maintained MAP > 60 mmHg without norepinephrine (about 50% and 70% at 12 and 24 h, respectively) versus 1.25 ng/kg/min selepressin and placebo, and NE was weaned more rapidly with selepressin 2.5 ng/kg/min versus placebo (0.04 vs. 0.18 µg/kg/min at 24 h) [16].
Furthermore, fluid balance was lower from day 5 on with selepressin 2.5 ng/kg/min vs. placebo possibly due to protection of endothelial permeability. The results of a large novel pivotal RCT that uses (1) response adaptive design and (2) organ dysfunction as the primary endpoint of selepressin in septic shock will be reported shortly [17].

Administration of intravenous fluids is a ubiquitous therapy in medicine. Selection of intravenous fluids is usually based on clinicians' preferences, with marked regional variations. "Normal" saline (0.9% sodium chloride), the most commonly prescribed crystalloid solution worldwide, contains a supraphysiologic concentration of chloride (154 mmol/L), with a strong ion difference (SID) of zero. Balanced or "buffered" solutions, on the other hand, have a lower sodium and chloride content and a positive SID, with an electrochemical composition that more closely approximates extracellular fluid.
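The SID arithmetic behind this distinction can be made concrete. Below is a minimal sketch (the function name is illustrative; the concentrations are standard label compositions of the two fluids, in mEq/L). The effective SID counts only chloride as the strong anion, since organic anions such as lactate or acetate are metabolized in vivo:

```python
def effective_sid(na, cl, k=0.0, ca=0.0, mg=0.0):
    """Effective strong ion difference (mEq/L): strong cations minus chloride.

    Metabolizable anions (lactate, acetate, gluconate) are consumed in vivo,
    so they are excluded from the effective SID of an infused fluid.
    """
    return (na + k + ca + mg) - cl

# 0.9% saline: Na+ 154, Cl- 154 -> SID = 0 (all anion load is chloride)
saline_sid = effective_sid(na=154, cl=154)

# Lactated Ringer's: Na+ 130, K+ 4, Ca2+ 2.7, Cl- 109 -> SID ~ 28,
# closer to the SID of plasma once the lactate is metabolized
ringer_lactate_sid = effective_sid(na=130, cl=109, k=4, ca=2.7)
```

The zero SID of saline is what drives the hyperchloremic acidosis discussed next: the chloride load dilutes the plasma SID toward zero.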
In experimental studies, resuscitation with 0.9% saline, but not with low-chloride solutions, led to hyperchloremic metabolic acidosis, an increased inflammatory response, coagulopathy, derangements in renal perfusion, and acute kidney injury (AKI). In the clinical setting, randomized controlled trials have demonstrated that saline leads to hyperchloremic acidosis. However, evidence that low- versus high-chloride solutions affect clinical outcomes has been limited until recently. Most prior evidence has been derived from before-and-after and observational studies. These studies suggest that low-chloride solutions reduce the risk of AKI, use of renal replacement therapy, coagulopathy, need for blood transfusion, and mortality. Conversely, individual patient-level RCTs have failed to show benefit of the low-chloride solutions. However, two fundamental limitations exist with these RCTs. First, outcomes such as mortality and use of RRT were uncommon (<10% and <5%, respectively), and thus lack of power is an alternative explanation for the negative results observed. Second, in these trials the expected effect sizes may be rather small due to the low total volume of study solution infused. For example, in the largest trial to date, the median volume of study fluid was only 2 L over the entire ICU stay. Such volumes are unlikely to produce changes in plasma sodium or chloride concentration or in acid-base balance and are thus unlikely to have any effect on hard clinical outcomes.
In 2018, two very large (collectively enrolling nearly 30,000 patients) pragmatic trials were reported from a single US medical center. These trials utilized a cluster-randomized design and included patients going to the ICU (in one trial) or to the general medical/surgical floor (in the other). The trials were remarkable, in part, for the consistency of their results. Both studies showed a 1% absolute risk reduction in the occurrence of a composite endpoint including death, dialysis, and persistent renal dysfunction. Importantly, the trials detected harm even with an exposure as low as 1 L of fluid and found the largest effect to be in patients with sepsis. Two large RCTs are currently underway in Brazil and in Australia and New Zealand. These trials will hopefully shed additional light on this issue. However, 0.9% saline has a very limited role in the management of critically ill patients, especially those with sepsis, and should no longer be seen as the default solution for fluid therapy. Instead, intravenous fluid therapy should be individualized, and patients should receive the type of fluid, route of administration, volume, and rate that their clinical status dictates.

Introduction
Prediction of fluid responsiveness plays a major role in guiding resuscitation of critically ill patients. Fluid responsiveness is defined as the ability of the heart to increase its stroke volume in response to an increase in preload induced by fluid administration. This implies that both ventricles work on the ascending part of the Frank-Starling relationship. Fluid responsiveness is present in about 50% of critically ill patients [1]. Since fluid loading might cause harm to critically ill patients in particular to fluid non-responders, it is important to assess fluid responsiveness before infusing fluid. A variety of dynamic tests, which challenge the Frank-Starling relationship without the need for any fluid infusion, have been developed to predict fluid responsiveness. Among them the respiratory variation of hemodynamic signals such as stroke volume variation (SVV) or pulse pressure variation (PPV) have gained a lot of popularity over the past years [2]. PPV was demonstrated to be reliable to predict fluid responsiveness in patients ventilated with a tidal volume of at least 8 mL/kg [3].
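As a point of reference, PPV is conventionally derived from the maximal and minimal pulse pressures observed over a single respiratory cycle. A minimal sketch (the function name and the example values are illustrative; the ~12% threshold is drawn from the values cited later in this section):

```python
def pulse_pressure_variation(pp_max, pp_min):
    """Pulse pressure variation (%) over one respiratory cycle.

    PPV = 100 * (PPmax - PPmin) / ((PPmax + PPmin) / 2),
    where pulse pressure is systolic minus diastolic arterial pressure,
    and PPmax/PPmin are its extremes within the cycle.
    """
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Illustrative values: PPmax = 50 mmHg, PPmin = 40 mmHg gives PPV ~ 22%,
# well above the ~12% cut-off classically suggesting fluid responsiveness.
ppv = pulse_pressure_variation(50, 40)
```

In practice the value is averaged over several respiratory cycles by the monitoring device; the limitations discussed below determine whether the number can be trusted at all.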

Limitations of pulse pressure variation and stroke volume variation
Several conditions-detailed below-limit the interpretation of PPV (and SVV) in critically ill patients [4].
Patients with spontaneous breathing activity (even during mechanical ventilation). In such cases, PPV cannot predict fluid responsiveness, as the respiratory changes in intrathoracic pressure are irregular in rate and in amplitude.
Patients with cardiac arrhythmias. The variability of stroke volume (and thus of pulse pressure) is obviously due to the irregular heart rhythm and not to ventilation. False positive PPV values are thus expected.
Patients with acute respiratory distress syndrome (ARDS). Under low tidal volume ventilation, respiratory changes in intrathoracic pressure might not be large enough to produce significant changes in preload. Accordingly, De Backer et al. showed that the prediction of fluid responsiveness using PPV was weaker when the tidal volume was <8 mL/kg than when the tidal volume was ≥8 mL/kg [5]. Nevertheless, during low tidal volume ventilation, a high value of PPV (e.g., >12%) still suggests fluid responsiveness, whereas a low PPV cannot exclude the presence of fluid responsiveness [2]. To improve the interpretation of a low PPV, it has been recommended to quantify the change in PPV in response to a transient (<60 s) increase in tidal volume from 6 to 8 mL/kg (tidal volume challenge) [6]. Myatra et al. showed that an increase in the absolute value of PPV of ≥3.5% during a tidal volume challenge predicted fluid responsiveness with excellent accuracy [6]. Further confirmation is obviously necessary. Others proposed to overcome the limitation of using PPV in the case of low tidal volume ventilation by dividing PPV by the respiratory changes in esophageal pressure [7]. The disadvantage of this test is the need for an esophageal probe.
Low lung compliance, by reducing the transmission of airway pressure to the intrathoracic structures, should result in false-negative PPV values [2]. Accordingly, a clinical study showed that when the compliance of the respiratory system (Crs) was >30 mL/cmH2O, PPV predicted fluid responsiveness accurately, whereas it did not when Crs was ≤30 mL/cmH2O [8].
In cases of a high respiratory rate, a low PPV can be observed even in the presence of fluid responsiveness. A study showed that PPV cannot be accurately interpreted when the heart rate/respiratory rate ratio is <3.6 [9].
Patients with intra-abdominal hypertension (IAH). In such patients, the respiratory variations of stroke volume are not exclusively related to volemia and threshold values separating responders and non-responders should be higher than under normal intra-abdominal pressure [10]. For instance, a PPV value of 15% can be associated with fluid unresponsiveness.
Patients with right heart failure. High PPV values (>12%) despite fluid unresponsiveness were reported in cases of right ventricular dysfunction [11]. A predominant effect of mechanical insufflation on the right heart afterload would result in a decrease in right ventricular stroke volume during insufflation due to right ventricular afterload-dependence rather than to right ventricular preload-dependence [2]. However, in the above-mentioned study [11], the tidal volume was >8 mL/kg, and attenuation of the degree of RV afterload-dependence during low tidal volume ventilation cannot be excluded.
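Taken together, the limitations above read like a bedside checklist. Below is a minimal sketch encoding the thresholds cited in this section (the function and its structure are illustrative teaching code, not a validated clinical tool):

```python
def ppv_caveats(spontaneous_breathing, arrhythmia,
                tidal_volume_ml_per_kg, crs_ml_per_cmh2o,
                hr_rr_ratio, intra_abdominal_hypertension,
                right_heart_failure):
    """Return the conditions (from the section above) that invalidate or
    confound a PPV reading; an empty list means PPV is interpretable."""
    caveats = []
    if spontaneous_breathing:
        caveats.append("spontaneous breathing activity")
    if arrhythmia:
        caveats.append("cardiac arrhythmia (false positives)")
    if tidal_volume_ml_per_kg < 8:
        caveats.append("tidal volume < 8 mL/kg (consider tidal volume challenge)")
    if crs_ml_per_cmh2o <= 30:
        caveats.append("respiratory system compliance <= 30 mL/cmH2O")
    if hr_rr_ratio < 3.6:
        caveats.append("heart rate/respiratory rate ratio < 3.6")
    if intra_abdominal_hypertension:
        caveats.append("intra-abdominal hypertension (use higher thresholds)")
    if right_heart_failure:
        caveats.append("right heart failure (false positives)")
    return caveats
```

For example, a passively ventilated patient in sinus rhythm at 8 mL/kg with Crs of 35 mL/cmH2O and an HR/RR ratio of 4 triggers no caveats, whereas an ARDS patient triggering the ventilator at 6 mL/kg triggers several, pointing toward the alternative tests discussed next.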

Limitations of other dynamic tests
Other tests can be used in the (frequent) presence of limitations of PPV (and SVV) [12]. In mechanically ventilated patients, tests such as end-expiratory occlusion, alone or in combination with end-inspiratory occlusion, can be used as alternatives. The passive leg raising (PLR) test is an excellent alternative to the heart-lung interaction tests and can be used in patients receiving mechanical ventilation as well as in non-intubated patients [12]. Its appropriate use requires adherence to strict rules [13]. Among them are the necessity to use real-time cardiac output measurement and the necessity to start from a semi-recumbent position [13]. The main contraindications of PLR are intracranial hypertension and surgical conditions in the operating room. Finally, IAH seems to limit the appropriate interpretation of PLR as a test to predict fluid responsiveness [14].

Conclusions
There are many limitations to the use of PPV and SVV in critically ill patients. Note that in the operating room setting, these limitations are far more frequent. However, there are recent advances aimed at overcoming some of the limitations, such as the tidal volume challenge. In critically ill patients (except in those with IAH), PLR can be a good alternative to predict fluid responsiveness in situations where PPV cannot be used.
It is important to keep in mind that the presence of fluid responsiveness does not mean that fluid is necessary. The decision to infuse fluid should also be based on the presence of signs of shock and of low risks of fluid overload.

Acute kidney injury (AKI) and sepsis carry global consensus definitions. The simultaneous presence of both defines sepsis-associated AKI (S-AKI). S-AKI is the most common AKI syndrome in the ICU and accounts for approximately half of all AKI in critically ill patients. The pathophysiology of S-AKI remains poorly understood, but animal models and the lack of major histological changes suggest that, at least initially, septic AKI may be a largely functional phenomenon with combined microvascular shunting and tubular cell stress. The diagnosis remains based on clinical assessment and measurement of urinary output and serum creatinine. However, multiple biomarkers, and especially cell cycle arrest biomarkers, are rapidly gaining acceptance. Prevention of septic AKI remains based on the treatment of sepsis and on early resuscitation. Such resuscitation relies on the judicious use of both fluids and vasoactive drugs. In particular, starch-containing and chloride-rich fluids are nephrotoxic in patients with high susceptibility (e.g., patients with sepsis). Vasoactive drugs have variable effects on renal function in septic AKI. At this time, norepinephrine is the dominant agent, but vasopressin may also be used.
Patients with severe AKI receiving RRT appear fairly homogeneous (at the clinical and molecular level). However, this may belie important phenotypic differences that are already present early on. In sepsis, patients who recover from AKI appear to have good outcomes (at least over 1-3 years). Certain molecular signatures are more prominent in sepsis (e.g., early TIMP-2 activation, early ATP depletion, late ferroptotic cell death, and high BMPR1a without concomitantly increased BMP7), and these signatures may also be indicative of non-recovery. Both TIMP-2 and IGFBP7 are modulated by p53, but the downstream effects of the two molecules vary. Understanding the transition from early dysfunction to late irreversible injury (vs. recovery) will be key to developing treatments.

Biomarkers hold considerable promise in current translational medicine and obviously in intensive care medicine. We previously discussed the role of biomarkers for early diagnosis of acute kidney injury (AKI) in critically ill patients [1]. Meanwhile, other biomarkers have found their way from bench to bedside. Clinical experience has shown that some of these markers may become powerful tools in the management of critically ill patients. However, the complexity of acute disease conditions exhibits some pitfalls for biomarker use that need further investigation. Renal biomarkers easily outperform "traditional" variables such as creatinine and estimated glomerular filtration rate to predict the occurrence and outcome of AKI [1][2][3]. However, many questions remain. Whether biomarkers can improve the approach to the sick or compromised kidney, in particular regarding timely initiation of continuous renal replacement therapy (CRRT), has been insufficiently clarified. Moreover, biomarkers lack specificity in case of sepsis-induced AKI or in AKI associated with the acute respiratory distress syndrome (ARDS).
Also, a potential role of biomarkers to assist in decision processes for admission or discharge in emergency departments and ICUs has not been established. AKI is a major determinant of morbidity and mortality in critically ill patients. Crucial issues are early detection of the at-risk population and timely start of therapy. In this context, biomarkers may have a decisive preemptive role. AKI is a common condition in hospitalized patients and is associated with worse short-term and long-term outcomes. A delay in the diagnosis of AKI has been proven to be associated with morbidity and mortality. One of the challenges clinicians face is the early detection of AKI. Another difficulty is the lack of a consensual definition of AKI. Expert consensus seems to be emerging [4], but a recent literature review shows that the definitions used in published articles remain multiple and heterogeneous [5]. While many urinary and serum proteins have been investigated as potential biomarkers for the early diagnosis or prediction of AKI or to identify patients at high risk, the ideal diagnostic tool for AKI remains an unmet medical need. Patients with predominant cardio-renal syndrome and high serum concentrations of neutrophil gelatinase-associated lipocalin (NGAL) were found to be at risk of developing AKI, even in the absence of oliguria and increased creatinine levels. NGAL-positive patients also more frequently require renal replacement therapy and have higher mortality rates [2,6]. This new entity, revealed by the use of a biomarker and termed "subclinical AKI" [7], suggested that current concepts and definitions of AKI might need reassessment [2]. The SAPPHIRE study [8,9] showed that two novel cell cycle arrest biomarkers, urinary tissue inhibitor of metalloproteinase-2 (TIMP-2) and insulin-like growth factor binding protein 7 (IGFBP7), largely outperformed existing markers. Whether these markers can predict the need for or duration of renal replacement therapy is currently under investigation.
Studies in ICU patients will help to identify the potential of tissue inhibitor of metalloproteinase-2/insulin-like growth factor binding protein 7 for early detection of postoperative or antimicrobial-related toxic AKI. High serum cystatin-C (Cys-C) levels at admission were found to predict subclinical AKI in patients admitted to the ICU [10]. These biomarkers have been the subject of very recent investigations. Vitamin-D-binding protein (VDBP) is a low-molecular-weight protein that is filtered through the glomerulus as a 25-(OH) vitamin D3-VDBP complex. In the normal kidney, VDBP is reabsorbed and catabolized by proximal tubular epithelial cells, reducing urinary excretion to trace amounts. Acute tubular injury is expected to result in urinary VDBP loss [11]. VDBP and kidney injury molecule-1 were identified as early and very sensitive markers of contrast-induced nephropathy [12], whereas interleukin-18 and kidney injury molecule-1 predicted early-stage AKI in burn patients [13]. An upcoming marker could be hemoglobin subunit beta (HBβ), a component of hemoglobin [13]. Hemoglobin is the main carrier of oxygen and carbon dioxide in cells of the erythroid lineage and ensures tissue oxygen delivery throughout the body. Blood HBβ levels in septic patients are significantly higher than those in healthy volunteers, and values in septic shock exceed those found in severe sepsis. HBβ levels apparently correlate with the degree of sepsis-induced endothelial cell dysfunction and thus may closely reflect sepsis severity [13]. An area of particular clinical interest is the behavior of biomarkers dedicated to detecting a specific pathology (such as NGAL for AKI) in sepsis. Indeed, increased NGAL levels are also observed in many septic patients. Moreover, as up to half of septic shock patients may develop AKI, NGAL measurement loses all specificity for detecting AKI, as it detects sepsis at the same time [14].
Further research is needed to obtain biomarkers with sufficient diagnostic and prognostic specificity to discriminate between an ongoing septic process and associated AKI [15]. Research on interfering with this fascinating universe of interacting pathways is on the brink of a breakthrough [16]. Hall et al. performed a systematic review with meta-analysis to evaluate the potential of AKI in-vitro diagnostic tests to improve the management of critically ill patients [17,18]. They evaluated sensitivity and specificity, cost-effectiveness, analytical validity, care pathway analysis, and value of information. Three tests (TIMP-2·IGFBP7, NGAL, and cystatin C) were subjected to detailed review. Their main finding was the heterogeneity in the reporting of these studies and their poor analytical validity. The meta-analysis of the reported results showed a pooled sensitivity of the biomarkers ranging from 0.54 to 0.92 and a pooled specificity ranging from 0.49 to 0.95. In addition, the authors noted that cost-effectiveness has been either not explored or only poorly explored. The authors call for a homogenisation of practices, both in the design of studies and in the analysis and reporting of their results. It also seems necessary to insist on the need for impact studies: only biomarkers associated with management strategies that have been shown to improve patient outcomes are of real interest. Finally, it should be borne in mind that assays for these new biomarkers are often costly; medico-economic studies should be conducted to address this issue [17,18]. The odyssey for the ideal biomarkers for predicting AKI will continue.
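To make the pooled-estimate arithmetic concrete, the sketch below pools sensitivity and specificity across studies by summing the cells of each study's 2x2 table; the study counts used here are hypothetical illustrations, not data from the cited meta-analysis.

```python
# Minimal sketch of pooling sensitivity/specificity across diagnostic studies.
# The 2x2 counts below are HYPOTHETICAL, not taken from Hall et al.
studies = [
    # (true positives, false negatives, true negatives, false positives)
    (40, 10, 80, 20),
    (25, 15, 60, 30),
    (55, 5, 90, 10),
]

tp = sum(s[0] for s in studies)
fn = sum(s[1] for s in studies)
tn = sum(s[2] for s in studies)
fp = sum(s[3] for s in studies)

pooled_sensitivity = tp / (tp + fn)  # proportion of true AKI cases detected
pooled_specificity = tn / (tn + fp)  # proportion of non-AKI correctly ruled out

print(f"pooled sensitivity: {pooled_sensitivity:.2f}")  # 0.80
print(f"pooled specificity: {pooled_specificity:.2f}")  # 0.79
```

This simple cell-summing pool ignores between-study heterogeneity; formal meta-analyses such as the one cited use bivariate random-effects models for exactly that reason.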

Introduction
Shock is caused by an inadequate supply or inappropriate use of metabolic substrate (mainly oxygen), resulting in tissue damage and cellular death. Several factors contribute to organ dysfunction in septic patients. Hemodynamic factors such as volume depletion, low cardiac output or inappropriate vasodilation resulting in systemic hypotension may directly produce organ hypoperfusion through a reduction in organ perfusion pressure. Organ autoregulation, the tendency of organ blood flow to remain constant over a range of organ perfusion pressure values, defines the perfusion range over which resistance can compensate for the decrease in pressure. The autoregulatory threshold is the lowest pressure at which autoregulation is maintained and may be defined by the intercept of the autoregulated zone with the slope of the subautoregulatory zone. Below their autoregulatory thresholds, organ blood flows become linearly dependent on perfusion pressure. Therefore, it can be speculated that therapy should be aimed at providing an adequate organ perfusion pressure, and that this is higher than commonly targeted or achieved in the treatment of septic patients. While this may require the use of vasopressor catecholamines, with their attendant risk of organ vasoconstriction and reduction in organ blood flow, clinical studies suggest that such reactions rarely occur and that organ function usually improves when tissue perfusion pressure is augmented during sepsis and septic shock.

Epinephrine
In patients unresponsive to volume expansion or other catecholamine infusions, epinephrine can increase mean arterial pressure, primarily by increasing cardiac index and stroke volume with more modest increases in systemic vascular resistance and heart rate. Epinephrine can increase oxygen delivery, but oxygen consumption may be increased as well. Epinephrine administration has been associated with increases in systemic and regional lactate concentrations. The monitoring periods were short, so it is unclear whether these increases are transient. Other adverse effects of epinephrine include increases in heart rate, but electrocardiographic changes indicating ischemia or arrhythmias have not been reported in septic patients. Epinephrine has had minimal effects on pulmonary artery pressures and pulmonary vascular resistance in sepsis.

Norepinephrine
Norepinephrine is a potent alpha-adrenergic agonist with less pronounced beta-adrenergic agonist effects. In open-label trials, norepinephrine has been shown to increase mean arterial pressure in patients who remained hypotensive after fluid resuscitation and dopamine. Because of concerns about potential adverse vasoconstrictive effects on regional vascular beds such as the liver and the kidney, norepinephrine traditionally had either not been used or had been reserved as a last resort in moribund patients, with predictably poor results. The experience with norepinephrine in septic shock strongly suggests that this drug can successfully increase blood pressure without causing deterioration in organ function. In most studies, septic patients were given fluid to correct hypovolemia before dopamine, with or without dobutamine, was titrated to achieve the target blood pressure. When dopamine failed, norepinephrine was added to the dopamine regimen. In most studies of septic patients, norepinephrine was used at a mean dose of 0.2 to 1.3 µg/kg/min. The initial dose can be as low as 0.01 µg/kg/min, and the highest reported norepinephrine dose was 3.3 µg/kg/min. Thus, large doses of the drug may be required in some patients with septic shock, possibly because of alpha-receptor down-regulation in sepsis. Norepinephrine therapy usually causes a clinically significant increase in mean arterial pressure attributable to its vasoconstrictive effects, with little change in heart rate or cardiac output (CO), leading to increased systemic vascular resistance. CO is increased by 10 to 20% and stroke volume index (SVI) by 10 to 15%. Clinical studies have reported either no change or modest increases in pulmonary capillary wedge pressure. Mean pulmonary arterial pressure is either unchanged or increased slightly.
In patients with hypotension and hypovolemia, e.g., during hemorrhagic or hypovolemic shock, the vasoconstrictive effects of norepinephrine can have severe detrimental effects on renal hemodynamics, with increased renal vascular resistance and renal ischemia. Indeed, norepinephrine has been demonstrated to cause ischemia-induced acute renal failure in rats. The situation is different in hyperdynamic septic shock, in which it is believed that urine flow decreases mainly as a result of lowered renal perfusion pressure. Since norepinephrine has a greater effect on efferent arteriolar resistance and increases the filtration fraction, normalization of renal vascular resistance could effectively reestablish urine flow.
In the high-output, low-resistance state of septic shock, norepinephrine can markedly improve mean arterial pressure and glomerular filtration. In several studies, norepinephrine (0.5 to 1.5 µg/kg/min) significantly increased urine output, creatinine clearance, and osmolar clearance. This supports the hypothesis that the renal ischemia observed during hyperdynamic septic shock is not worsened by norepinephrine infusion, and even suggests that this drug may effectively optimize renal blood flow and renal vascular resistance.
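As a practical illustration of the weight-based dosing quoted above, the sketch below converts a norepinephrine dose in µg/kg/min into a pump rate in mL/h. The function name and the 16 µg/mL dilution (e.g., 4 mg in 250 mL) are illustrative assumptions; any real infusion must follow local pharmacy protocols.

```python
def norepi_rate_ml_per_h(dose_ug_kg_min, weight_kg, conc_ug_per_ml):
    """Convert a weight-based norepinephrine dose (ug/kg/min) into a pump
    rate (mL/h) for a given drug concentration (ug/mL). Illustrative only."""
    ug_per_hour = dose_ug_kg_min * weight_kg * 60  # total micrograms per hour
    return ug_per_hour / conc_ug_per_ml

# Example: 0.2 ug/kg/min in a 70 kg patient, assuming a 16 ug/mL dilution
# (4 mg norepinephrine in 250 mL -- an assumed, commonly described mix).
rate = norepi_rate_ml_per_h(dose_ug_kg_min=0.2, weight_kg=70, conc_ug_per_ml=16)
print(f"{rate:.1f} mL/h")  # 52.5 mL/h
```

The same arithmetic shows why the highest reported dose (3.3 µg/kg/min) demands a far more concentrated dilution to keep the infusion volume manageable.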

Use of Vasopressor Therapy
When fluid administration fails to restore adequate arterial pressure and organ perfusion, therapy with vasopressor agents should be initiated. Potential agents include dopamine, norepinephrine, epinephrine, and phenylephrine. Vasopressor therapy may be required transiently, even while cardiac filling pressures are not yet adequate, in order to maintain perfusion in the face of life-threatening hypotension. Although these drugs have the potential to reduce organ blood flow through arterial vasoconstriction, their final effects will depend upon the sum of the direct effects and any increase in organ perfusion pressure. In settings where organ autoregulation is lost, organ flow becomes linearly dependent on pressure, and organ perfusion pressure should be preserved if flow is to be optimized. Whether or not a potent vasopressor also has positive inotropic effects is of clinical importance in patients with low cardiac output. From a practical point of view, when a vasopressor infusion is started, doses should be carefully titrated to restore mean arterial pressure without impairing stroke volume. The precise mean blood pressure targeted depends upon the premorbid blood pressure and can be as high as 75 mmHg. However, the target should be individualized and kept at the minimum level required to reestablish urine flow; in some patients this can be achieved with a mean arterial pressure of 60 to 65 mmHg.

Conclusions
Optimization of organ perfusion pressure is an important goal during the treatment of sepsis and septic shock. A rational argument can be made for augmenting perfusion pressure based on the linear pressure-flow relationship secondary to organ insult during sepsis. In addition, substantial clinical evidence shows the absence of harm with vasopressor infusion and improvement in organ function when these agents are used to restore adequate blood pressure.

CRRT in Sepsis
Manuel E. Herrera Gutiérrez
Sepsis is the main cause of AKI in the ICU. Following the interaction between bacteria and host, a response is triggered that encompasses the synthesis of pro-inflammatory (TNF, IL-1, IL-8, PAF) and anti-inflammatory (IL-10, IL-4, etc.) mediators. If an excessive amount of these molecules is produced, the host response can be compromised (immunoparalysis) [1]. Different strategies aiming to block the synthesis of some of these mediators have shown no benefit or even a worsening of patients' condition, but if we could count on a therapy that normalizes the concentration of mediators while keeping a normal balance between them, we could hypothetically control the immunoparalysis and improve the defensive capability of the immune system (immunomodulation) [2].
In recent decades, an important effort has been made by investigators and industry to improve the extracorporeal therapy devices at our disposal, in order to gain efficacy in this theoretical immunomodulatory capability by means of a higher clearance of immune mediators.
A first approach was to employ convection, because a high convective dose (high-volume hemofiltration, HVHF) was shown to improve hemodynamics and was followed by a rapid decrease in vasopressor need. In fact, a reduction in mediator levels has been demonstrated with convection, but it is clear that a high volume of fluid exchange is required to achieve this effect [3]; thus, if we decide to treat a septic patient with a convective therapy, we must use HVHF. However, even though clinical improvement is to be expected, a decrease in mortality has not been proven for this treatment, and its use can carry serious problems because of the high volume of fluid exchanged and the need for equipment capable of controlling exact fluid balances.
Another approach, less demanding in workload, is the use of membranes with a wider pore size (high cut-off membranes, HCO) [4], which allow mediators to be effectively cleared using a lower amount of fluid. The main drawback is the high loss of albumin that accompanies this therapy, but this side effect can be minimized if diffusion is employed as the clearance modality. Results with these devices are close to those described for HVHF, and once more there is no evidence of improvement in outcome.
If we continue to widen the pore size, the range of molecules cleared increases, and with it the possibility of eliminating more mediators; thus the use of PMF membranes has also been tested for this indication. Once more the clinical effect is evident but the improvement in outcome is not [5], and PMF also carries a serious risk because of the non-selective elimination of important plasma proteins and the need for high volumes of fresh frozen plasma (FFP).
Finally, adsorption has also been proposed as a means of eliminating inflammatory mediators and in fact seems the most promising of these techniques, and the one in which most advances are being made, in three different directions: first, we are witnessing an improvement in the adsorptive capability of older membranes, such as AN-69 (Oxiris®) or PMMA; second, some membranes have been developed for highly selective adsorption, such as the LPS affinity of LPS Adsorber® or Toraymixin®; and finally, aiming for exactly the opposite, membranes with a very high non-selective adsorption capability have been developed, such as Cytosorb®. At the moment, experience with all these systems is scarce, consisting mostly of small observational series, with randomized controlled trials available only for Toraymixin®. Its usefulness has not been definitively proven, and the most recent RCT published likewise shows no effect on survival for this device [6]. A meta-analysis published in 2013 did show a positive effect on survival for extracorporeal devices (all the modalities already mentioned were included in the analysis) and suggested a stronger effect for the adsorptive devices [7], but this analysis did not include the most recent trial mentioned above.
Some authors have also tried combining different modalities, but this strategy has not proven useful. CPFA, a hybrid modality combining plasmafiltration plus adsorption, has not shown the expected results.
So, at this point we cannot rely on any conclusive evidence for or against the use of extracorporeal therapies in the management of sepsis in terms of survival, even though, as already mentioned, all the modalities tried have shown a positive effect on hemodynamics and vasopressor use. This similarity in results is not surprising, because whatever technique we use, our aim is in fact the same, the elimination of immune mediators, and any differences in effect will really reflect differences in the clearance capability of these techniques (except for the highly selective adsorption provided by Toraymixin®).
One last aspect that merits attention when approaching the septic patient with an extracorporeal therapy is the non-selective elimination that these treatments provide and the fact that the many molecules cleared include most of the antibiotics, the main treatment of sepsis. If we consider that our aim is to clear as many mediators as possible, it is easy to understand that the risk of underdosing antibiotics is high. If careful dosing of these drugs is not prescribed, we will ultimately endanger the recovery of our patients and may even worsen their prognosis.
In summary, and following the recommendations of the SSC, at this point we cannot make a sound recommendation for or against the use of extracorporeal therapies in sepsis [8].

Assessment of the inflammatory response can help the decision-making process when diagnosing community-acquired pneumonia (CAP) [1][2][3][4][5][6], but there is a lack of information about the influence of time since onset of symptoms.
We studied the impact of the number of days since onset of symptoms on inflammatory cytokines and biomarker concentrations at CAP diagnosis in hospitalized patients [1].
We performed a secondary analysis of two prospective cohorts, including 541 patients in the derivation cohort and 422 in the validation cohort. The time since onset of symptoms was self-reported, and patients were classified as early presenters (<3 days) and non-early presenters. Biomarkers (C-reactive protein [CRP] and procalcitonin [PCT] in both cohorts) and cytokines in the derivation cohort (IL-1, IL-6, IL-8, IL-10, and tumor necrosis factor-α) were measured within 24 h of hospital admission.
In early presenters, CRP was significantly lower, whereas PCT, IL-6, and IL-8 were higher; non-early presenters showed the opposite pattern. In the validation cohort, CRP and PCT exhibited identical patterns: CRP levels were 36.4% greater in patients with 3 or more days since symptom onset than in those with fewer than 3 days in the derivation cohort, and 38.2% greater in the validation cohort. PCT levels were 40% lower in patients with 3 or more days since symptom onset in the derivation cohort and 56% lower in the validation cohort.
Time since symptom onset modifies the systemic inflammatory profile at CAP diagnosis. This information has relevant clinical implications for management, and it should be taken into account in the design of future clinical trials [1].

The last two to three years have brought several big steps in the understanding and management of sepsis. The increasing insight into the pathomechanisms of post-infectious defense has not only led to new models of the host response but has also opened a new era of personalized sepsis therapy, using the individual pattern of information at different biological levels. These include genomics, proteomics, metabolomics, etc., often grouped together as the "omics" approach to personalized medicine, e.g., in sepsis [1]. Several projects have concentrated on fingerprinting patients by describing their genetic information at the DNA and/or RNA level, described as genotyping, often with the goal of assessing their individual risk [2].

Phenotyping of Patients with Sepsis
In contrast, downstream information at the level of proteins, other biomarkers, or metabolic changes, but also clinical symptoms or even demographic data such as a patient's age, weight, or height, is often grouped together as phenotyping, the natural counterpart of the genotype. To identify reproducible patterns in septic patients, modern IT technologies are increasingly being tested. In a recent landmark paper, a group of international investigators developed a computational model using reinforcement learning, which is able to dynamically suggest optimal treatments for adult patients with sepsis in the intensive care unit (ICU); reinforcement learning is a category of Artificial Intelligence (AI) tools in which a virtual agent learns from trial-and-error processes to create an optimized set of rules [3]. Forty-eight variables were extracted from the patients' files, including demographics, premorbid status, vital signs, laboratory values, and administered fluids and vasopressors. It was demonstrated that the value of the treatment selected by the AI tool was reliably higher than that of ICU physicians. Interestingly, mortality was lowest in patients for whom the clinicians' actual treatment matched the AI decisions. Similar sophisticated methods were applied by another group of researchers using a large database, although with data from patients immediately after cardiac arrest and hospital transfer, i.e., without any further information on the patients' history [4]. A total of 39,566 patients from 186 ICUs were analyzed. The investigators found that machine learning approaches (gradient boosting machine, support vector classifier, random forest, artificial neural network, and an ensemble) significantly enhance predictive discrimination for mortality following cardiac arrest compared with existing illness severity scores and classical logistic regression, without the use of pre-hospital data.
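Predictive discrimination of the kind reported for these mortality models is commonly summarized as the area under the ROC curve (AUC): the probability that a randomly chosen non-survivor receives a higher risk score than a randomly chosen survivor. A minimal, library-free sketch follows; the scores and labels are toy values, not data from the cited studies.

```python
def auc(scores, labels):
    """Concordance-based AUC: P(score of a positive > score of a negative),
    counting ties as one half. labels: 1 = died, 0 = survived."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pairs = len(pos) * len(neg)
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / pairs

# Toy example: risk scores from a hypothetical model vs. observed mortality
scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]
labels = [1,   1,   0,    1,   0,   0]
print(auc(scores, labels))  # → 1.0 (perfect discrimination in this toy set)
```

An AUC of 0.5 corresponds to a coin flip; the gain reported over classical severity scores is a gain in exactly this quantity.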
Patients with lower respiratory tract infections (LRTI) are a challenge for goal-directed antibiotic treatment: in the absence of a definitive microbiologic diagnosis, clinicians may presume symptoms are due to a noninfectious inflammatory condition and initiate empiric corticosteroids, which can exacerbate an occult infection. Furthermore, even with negative microbiologic testing, providers often continue empiric antibiotics because of concerns about falsely negative results, a practice that drives the emergence of antibiotic resistance and increases the risk of second-hit infections. Recent studies integrated data from patients' host-response markers with information on the individual microbiome to forecast the risk of LRTI in critically ill patients [5]. The research group performed metagenomic next-generation sequencing (mNGS) on tracheal aspirates from 92 adults with acute respiratory failure and simultaneously assessed pathogens, the airway microbiome, and the host transcriptome. This study suggests that a single streamlined protocol offering an integrated genomic portrait of pathogen, microbiome, and host transcriptome may hold promise as a tool for LRTI diagnosis. This kind of detailed patient phenotyping, performed in a population reflective of the true heterogeneity of ICU patients, including severely immunocompromised subjects and patients receiving broad-spectrum antibiotics, offers a view into the future. Studies in larger cohorts may further validate these findings, strengthen the utility of these models, and assess the impact on clinical outcomes.
Another exciting approach to better describe the pattern of infections in individual septic patients was recently introduced by investigators who presented a way to distinguish the sources of various bloodstream infections (BSI), which may facilitate more accurate tracking and prevention of hospital-acquired infections [6]. The researchers developed and applied a streamlined bioinformatic tool to match bloodstream pathogens precisely to a candidate source. They then leveraged this approach to interrogate the gut microbiota as a potential reservoir of bloodstream pathogens in a cohort of hematopoietic cell transplantation recipients. In conclusion, more precisely identifying the origins of BSIs may influence how hospitals and health care providers can most effectively work to prevent infections.
The previous two examples demonstrate the impact of modern techniques in better diagnosing patients' diseases. Of course, the question remains whether these approaches have any influence on therapeutic consequences, a rather new field of medicine described as theragnostics.
The following study, recently submitted by our own group, showed how phenotyping by a combination of specific biomarkers influences the effect of low-dose hydrocortisone (HC) in septic shock patients [7]. Although several trials have consistently reported faster shock resolution, the utility of HC in patients with septic shock remains controversial: whereas two French studies reported an outcome benefit, two international studies found no survival effect of HC. Corticosteroids are traditionally considered to induce immune suppression via the glucocorticoid receptor (GR) and its repressive effect on pro-inflammatory transcription factors. Thus, patients in an overall state of immunosuppression, which is presumed in many cases of septic shock, may be potentially compromised by the administration of an immunosuppressive drug. These diverging effects of steroids support the need for biomarkers to guide their application. We therefore applied machine learning to physiological and laboratory data from patients enrolled in former randomized studies to determine a theragnostic marker for hydrocortisone treatment. The ratio of serum interferon-γ (IFNγ) to interleukin-10 (IL-10) was able to identify specific sub-cohorts with increased and decreased survival upon treatment. Applying the rule to two further, smaller datasets showed the same tendency. Strengths of our study were that we used well-selected cohorts of patients with septic shock, that our marker showed very similar results across all studies, and that this approach showed potential clinical relevance as a future option to better phenotype septic shock patients before applying hydrocortisone.
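The kind of ratio-based theragnostic rule described above can be sketched as follows; the function name and the threshold of 1.0 are hypothetical placeholders for illustration, not the cutoff derived in the submitted study.

```python
def hc_rule(ifn_gamma, il_10, threshold=1.0):
    """Flag patients whose serum IFN-gamma / IL-10 ratio exceeds a cutoff.
    The default threshold is a HYPOTHETICAL placeholder, not the study's cutoff."""
    ratio = ifn_gamma / il_10
    return ratio, ratio > threshold

# Toy serum values (pg/mL); flagged patients would form one candidate sub-cohort
ratio, flagged = hc_rule(ifn_gamma=12.0, il_10=4.0)
print(f"IFN-gamma/IL-10 ratio = {ratio:.1f}, flagged = {flagged}")
```

The appeal of such a rule is its simplicity at the bedside: two cytokine measurements and a division, once a validated cutoff exists.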

Adjunctive Therapy in Sepsis: Old Friends vs. New Trials
James A. Russell
Professor of Medicine, University of British Columbia; Principal Investigator, Centre for Heart Lung Innovation, Vancouver, BC, Canada
The main adjunctive therapy in common use in septic shock, and the one with Surviving Sepsis Guideline (SSG) [1] recommendations, is corticosteroids. There are several exciting novel adjunctive agents that could accelerate the clearance and neutralization of lipopolysaccharide (LPS) and lipoteichoic acid (LTA). There are additional immune-enhancing strategies for patients in later sepsis who have signs of depressed immunity, and novel pleiotropic anticoagulant-type products (recombinant human thrombomodulin) that could also prove efficacious.

Corticosteroids
The SSG suggests "against using IV hydrocortisone to treat septic shock patients if adequate fluid resuscitation and vasopressor therapy are able to restore hemodynamic stability. If this is not achievable, we suggest IV hydrocortisone at a dose of 200 mg per day (weak recommendation, low quality of evidence)" [1]. Note the low quality of evidence and the weak recommendation.
Since these SSG were published, there are now two large pivotal RCTs of corticosteroids in septic shock with quite contrasting results and conclusions [2,3]. Annane and colleagues [2] found that hydrocortisone plus fludrocortisone decreased mortality compared with placebo (39% vs. 45.3%, respectively; p = 0.02), while Venkatesh and colleagues [3] reported that hydrocortisone did not decrease mortality compared with placebo (27.9% vs. 28.8%, respectively). One can see obvious differences between the RCTs: (1) Annane and colleagues used a combination of hydrocortisone plus fludrocortisone because of the altered steroid physiology of septic shock and because of the positive results of their prior steroid RCT [4]; (2) the pooled mortality of Annane's RCT was higher than that of Venkatesh's, suggesting that Annane's patients were sicker. The Surviving Sepsis Guidelines [1] and recent corticosteroid guidelines [5] do not include these RCTs and will need to be updated. We suggest that the combination of hydrocortisone and fludrocortisone be considered for patients in septic shock who do not respond adequately to norepinephrine infusion.

Clearance and neutralization of lipopolysaccharide (LPS) and lipoteichoic acid (LTA)
Decreased proprotein convertase subtilisin/kexin type-9 (PCSK9) activity increases LDL receptor density and clearance. Pathogen lipid (PL) clearance is related to endogenous lipid clearance; accordingly, PCSK9 regulates the clearance of PLs such as LPS and LTA. Pharmacologic inhibition of PCSK9 improved survival and reduced inflammation in murine polymicrobial peritonitis [6,8]. In several human septic shock cohorts, PCSK9 loss-of-function (LOF) genetic variants were associated with decreased mortality compared with PCSK9 wild-type patients. PCSK9 LOF also decreased inflammatory cytokines in septic shock patients and after LPS administration to healthy volunteers. The PCSK9 effect was mitigated in LDL receptor knock-out mice and in humans homozygous for an LDL receptor variant resistant to PCSK9. Thus, decreased PCSK9 function is associated with increased PL clearance via the LDL receptor, a decreased inflammatory response, and decreased mortality [8,9]. Patients with PCSK9 loss-of-function variants also have a decreased readmission rate for sepsis after an episode of sepsis, suggesting long-term benefits of impaired PCSK9 function [10].
Plasma PCSK9 levels are greatly increased in sepsis [11]. Plasma PCSK9 greater than 250 ng/mL decreases hepatocyte LPS clearance and such levels are associated with cardiovascular and respiratory failure [11].
Plasma LDL levels decrease in sepsis and are associated with increased mortality [12], but is this merely association, or do LDL levels causally contribute to increased mortality? We reasoned that an "instrumental variable" approach might help address this issue [13,14]. Recently, Ference et al. [13] used a genetic instrumental variable strategy to show that LDL levels causally contribute to cardiovascular risk. They used PCSK9 genotype and HMGCR genotype as instrumental variables to prove causality from association data. Surprisingly, we observed discordant associations between PCSK9 and HMGCR genotypes and sepsis mortality. This logically leads to the novel conclusion that low LDL levels are not causal; rather, the LDL clearance rate contributes to sepsis mortality.
As an interim summary, PCSK9 regulates PL clearance, so inhibition of PCSK9 activity is an attractive target in sepsis and septic shock; PCSK9 inhibitor(s) could be safe and effective.

Modulation of HDL levels in sepsis
The excessive host inflammation induced by PLs can lead to septic shock. Of all plasma lipoproteins, high-density lipoprotein (HDL) has the greatest affinity for PLs. Decreased HDL cholesterol (HDL-C) levels are associated with an increased risk of organ dysfunction and mortality, prolonged hospital admission, and nosocomial infection [15]. We wondered whether genetic variation in genes of HDL metabolism alters HDL-C levels and sepsis outcomes. Cholesteryl ester transfer protein (CETP) regulates HDL levels, and our major finding was that variants of CETP that decrease HDL increase mortality: more specifically, the rs1800777 gain-of-function (GOF) variant in CETP is associated with a greater decline in HDL-C, elevated CETP activity, and increased sepsis mortality. CETP also affected the development of acute kidney injury in septic shock [16].
Therefore, CETP activity plays a critical role in sepsis outcomes, and CETP could be a therapeutic target in sepsis. One CETP inhibitor, torcetrapib, significantly increased the risk of infection- and malignancy-associated mortality in patients at high cardiovascular risk. This may be an off-target effect of torcetrapib, because other large RCTs of the CETP inhibitors dalcetrapib, evacetrapib, and anacetrapib did not show an increased risk of severe infection.
CETP regulation of HDL-C levels during acute infection critically modulates mortality and acute kidney injury. CETP genotype may represent a novel tool to risk stratify patients with sepsis. Modulation of HDL-C levels by CETP inhibition could be a novel sepsis therapeutic target.

Immune enhancing strategies for patients in later sepsis who have signs of depressed immunity
The later stages of septic shock are characterized by depressed immunity and increased risk of nosocomial infection, organ dysfunction and death [17]. Accordingly, immune enhancing interventions could improve outcomes of such patients. There are ongoing RCTs of IL-7 and of XX that are designed to test the hypothesis that later intervention to enhance immunity could decrease mortality of sepsis and septic shock.

Recombinant human thrombomodulin
Sepsis-associated coagulopathy (SAC) is common in sepsis and septic shock, affecting about two-thirds of patients. Anticoagulation with activated protein C was successful in a pivotal RCT but not in a later Phase 3 RCT in septic shock [18]. Recombinant human thrombomodulin (rhTM) is a protein that binds to and activates protein C; rhTM is thus anticoagulant but also has anti-inflammatory and complement-inhibitory actions. A Phase 2 RCT of rhTM showed improved outcomes in patients who had an increased INR and thrombocytopenia. Accordingly, such patients were included in a subsequent pivotal Phase 3 RCT that was recently completed.

Marta Camprubí-Rimblas
In the first stages of ARDS, proinflammatory mediators inhibit natural anticoagulant factors, altering the normal balance between coagulation and fibrinolysis and leading to a procoagulant state [1]. This, together with the breakdown of the alveolar-capillary barrier, leads to proteinaceous edema, neutrophil infiltration into the alveolar compartment, and the activation of macrophages towards a pro-inflammatory phenotype.
Beneficial effects of anticoagulants have been proven in preclinical and clinical models of acute lung injury (ALI) and ARDS, although systemic bleeding offsets their positive effects. Anticoagulants could be effective through their anti-inflammatory activity in addition to their anticoagulant properties. Moreover, given the crosstalk between these pathways and their influence on permeability, anticoagulants could also restore the alveolar-capillary barrier.
Recent studies revealed that local administration of anticoagulants by nebulization could not only normalize coagulant activity in the lung but also prevent systemic bleeding [2]. Beneficial effects of nebulized anticoagulants have been shown in preclinical and clinical trials of ALI and ARDS, although results are controversial.
Local administration of tissue-type plasminogen activator or tissue factor pathway inhibitor by nebulization could maintain their properties while avoiding systemic adverse effects; however, further investigation of this form of delivery is needed [3,4].
Nebulized heparin and/or antithrombin reduced pulmonary inflammation and coagulation and avoided systemic bleeding in a model of ALI [5]. Treatment with nebulized heparin modulated alveolar macrophages, diminishing TGF-β and NF-κB effectors and the coagulation pathway, and reduced the recruitment of neutrophils into the alveolar space. Nebulization of antithrombin alone ameliorated coagulation, while combined antithrombin and heparin had a greater impact, reducing permeability and decreasing the infiltration of macrophages into the alveolar compartment. In injured human lung cell populations isolated from lung biopsies, heparin reduced the expression of pro-inflammatory markers in alveolar macrophages and deactivated the NF-κB pathway in alveolar type II cells, decreasing the expression of its mediators and effectors [6]. Also, in injured alveolar type II cells, the administration of antithrombin decreased the levels of pro-inflammatory mediators and increased tight junction expression. These studies support the translational potential of both anticoagulants in humans.
The local administration of heparin and antithrombin by nebulization might be a potential treatment for ARDS, as they act on different pathways and processes in the pathophysiology of this syndrome. Lung injury is attenuated by these locally administered anticoagulants, which decrease inflammation and coagulation and improve permeability without causing systemic bleeding.
Nevertheless, controversial results have been found in clinical studies of nebulized heparin. In patients with ARDS, nebulized heparin reduced the days of mechanical ventilation and did not affect systemic coagulation, although a trend towards increased aPTT levels was observed with the highest dose [7]. The multicenter HEPBURN trial of nebulized heparin in burn patients with inhalation trauma had to be stopped prematurely for safety reasons: increased blood in sputum and high aPTT levels [3].
Identifying subtypes within the heterogeneity of ARDS might improve the response of patients to a specific treatment. The nebulization of antithrombin and heparin, combined or alone, in the subtype of patients most likely to respond to the appropriate anticoagulant should also be studied. This, together with the proper time to initiate treatment, is of major importance in fighting ARDS. Moreover, we should bear in mind that animal models mimic human ARDS only in part, which could affect the relevance of the data.
ARDS is a highly complex syndrome in its pathophysiology, so single or combined therapies should address different pathways and processes to improve patient outcomes.

Study of the impact of stopping MRSA therapy in VAP patients with negative respiratory cultures.
Of the approximately one third of patients who were de-escalated, there was no increase in mortality, but there was a shorter length of hospital and ICU stay and a lower incidence of acute kidney injury (less vancomycin use) [6]. In 117 ICU patients with CAP, 72% had a defined etiology. Isolated bacterial and viral infections each occurred in slightly less than 30%, and mixed bacterial-viral infection in 15%. Mixed infection was associated with a nearly 14-fold increase in mortality.

Severe community-acquired pneumonia (CAP) has been defined as those cases that require intensive care unit (ICU) admission. Direct admission to an ICU is required for patients with septic shock or acute respiratory failure requiring invasive mechanical ventilation (IMV), which are defined as major severity criteria by the 2007 IDSA/ATS guidelines [1]. These guidelines developed a set of 9 minor severity criteria, on the basis of data on individual risks, to identify patients with severe CAP: respiratory rate > 30 breaths/min, PaO2/FiO2 < 250 mmHg, multilobar infiltrates, confusion and/or disorientation, uremia (BUN level > 20 mg/dL), leukopenia (WBC count < 4 × 10⁹ cells/L), thrombocytopenia (platelet count < 100 × 10⁹ platelets/L), hypothermia (core temperature < 36 °C), and hypotension (SBP < 90 mmHg requiring aggressive fluid resuscitation). Admission to an ICU was also recommended for patients with 3 or more of these minor severity criteria [1]. However, none of these minor severity criteria adequately distinguishes patients for whom ICU admission is necessary. Invasive mechanical ventilation was the main determinant of ICU admission in one study, followed by septic shock [2]. In the absence of major criteria, ICU admission was not related to survival of patients with minor severity criteria in this study [2]. Minor criteria probably identify patients who may require close monitoring rather than active life-support treatment.
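The 2007 IDSA/ATS decision rule described above is essentially a criteria count, and can be sketched as follows. This is an illustrative sketch only, not clinical software: the thresholds are those cited in the text, and the function and field names are hypothetical.

```python
# Illustrative sketch of the 2007 IDSA/ATS severity rule for severe CAP.
# Thresholds follow the text above; all names are hypothetical.

def count_minor_criteria(p):
    """p: dict of clinical measurements. Returns how many of the 9 minor criteria are met."""
    criteria = [
        p["respiratory_rate"] > 30,           # breaths/min
        p["pao2_fio2"] < 250,                 # mmHg
        p["multilobar_infiltrates"],
        p["confusion"],                       # confusion and/or disorientation
        p["bun"] > 20,                        # mg/dL (uremia)
        p["wbc"] < 4,                         # x10^9 cells/L (leukopenia)
        p["platelets"] < 100,                 # x10^9 /L (thrombocytopenia)
        p["core_temp"] < 36.0,                # °C (hypothermia)
        p["hypotension_needing_fluids"],      # SBP < 90 mmHg despite fluids
    ]
    return sum(bool(c) for c in criteria)

def icu_admission_recommended(p):
    """One major criterion (septic shock or IMV) or >= 3 minor criteria."""
    return (p.get("septic_shock", False)
            or p.get("invasive_ventilation", False)
            or count_minor_criteria(p) >= 3)
```

As the text notes, meeting 3 or more minor criteria flags a patient for close monitoring rather than guaranteeing a need for active life support.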

Miquel Ferrer
Severe CAP is present in about 18% of hospitalized patients with CAP [3]. Despite global efforts to improve outcomes, mortality remains high in severe CAP. Considering all cases who met criteria for severe CAP, whether or not admitted to the ICU, we recently reported a 30-day mortality of 22% [3]. Between 37% and 60% of patients with CAP in the ICU may require IMV [2,4-6]. The mortality rates of ICU patients with CAP range between 13% and 28%, depending on the series and on whether ICU or hospital mortality was reported.
The mortality rate of patients with severe CAP who require IMV is high: between 32% and 55% for ICU mortality [7,8], and between 33% and 56% for hospital mortality [3,9,10]. Even in patients with CAP treated with NIV, the hospital mortality of those intubated after NIV failure may be as high as 54% [11]. As expected, older age, co-morbidities, and higher severity indices of pneumonia and organ system dysfunction at admission were independently associated with mortality in these reports. To assess whether the use of IMV is simply a marker of more acute severe disease or a determinant of poor outcome, we recently reported a large, prospective and consecutive series of hospitalized patients with severe CAP, with special focus on the association of IMV with mortality [3]. Compared to non-intubated patients, those who received IMV did not have higher severity scores at hospital admission. However, the use of IMV independently predicted 30-day mortality. The contribution of IMV to mortality was reinforced by the finding that the actual mortality of these patients was substantially higher than that predicted by the APACHE-II score. In contrast, the actual mortality of non-intubated patients was lower than that predicted by this score. Whatever the cause, the use of IMV seems to confer excess mortality in this subgroup of severe CAP patients.
Septic shock was also an independent predictor of mortality in patients with severe CAP in our series [3]. This is not surprising considering that shock is an accepted major severity criterion of CAP and that it is associated with clinical failure [12]. However, the actual mortality of patients with septic shock as the only major severity criterion was only slightly higher than that predicted by the APACHE-II score [3]. Finally, the hospital mortality of patients with at least one major severity criterion, either septic shock, need for IMV or both, was higher than that of patients with minor criteria only (86 [29%] vs. 59 [16%], p < 0.001) [3].
Streptococcus pneumoniae is the leading cause of CAP; it is the underlying aetiological agent in 22% of patients requiring ICU admission [13], and about 30% of these patients develop pulmonary complications during their clinical course [14]. A French multicentre study of severe pneumococcal CAP patients admitted to ICUs reported an overall mortality rate of 29% [15]. The high mortality of severe CAP occurs despite the fact that the majority of patients receive early and adequate antibiotic treatment [3]. This is probably due, in part, to an imbalanced and disproportionate local and systemic inflammatory response that contributes to impairment of gas exchange, sepsis and end-organ dysfunction. Although clinical trials are heterogeneous, the adjuvant use of corticosteroids appears to be beneficial for patients with severe CAP, particularly in the presence of a high systemic inflammatory response [16,17].
Acute respiratory distress syndrome (ARDS) is a potential complication of severe CAP. Pneumonia was by far the most frequent cause of ARDS in a multicentre prospective epidemiological study [18]. There was limited information regarding the incidence of ARDS, associated pathogens, risk factors, and specific outcomes in hospitalized patients with severe CAP in the era of the current Berlin definition [19], according to which patients must be receiving positive-pressure ventilation. We therefore assessed prospectively the characteristics of mechanically ventilated patients with severe CAP and ARDS [20]. Among patients with CAP, ARDS was present in 2% of hospitalized patients, in 13% of ICU patients, and in 29% of mechanically ventilated patients, whether ventilated invasively or non-invasively. Greater organ system dysfunction and previous antibiotic use were independent risk factors for ARDS, while previous inhaled corticosteroid use was independently associated with a lower risk. The 30-day mortality was similar between patients with and without ARDS (25% vs. 30%, p = 0.25), confirmed by propensity-adjusted multivariate analysis. These results indicate that the expected association of ARDS with mortality seems more related to the need for mechanical ventilation in these patients with CAP than to ARDS itself, as we have recently reported that invasive mechanical ventilation in patients with severe CAP independently predicts mortality [3].

The human microbiome consists of about as many cells as the human body itself and has about 100 times the number of genes. The microbiome changes considerably during critical illness, and its composition is much more chaotic than under healthy circumstances. This may explain the much higher rectal carriage of potentially pathogenic microorganisms, which is associated with more bloodstream infections and pneumonia.
A loss of diversity of the microbiome is associated with infection and a worse outcome, and the microbiome may therefore be a target of interventions. Selective decontamination of the digestive tract (SDD) has a long history in the ICU and is considered standard practice in several hospitals throughout Europe. Through the application of topical antibiotics in the digestive tract, SDD aims to reduce the burden of potentially pathogenic microorganisms and thereby reduce the number of bloodstream and respiratory infections. Several studies that together randomized more than 10,000 patients showed that SDD is associated with fewer respiratory tract and bloodstream infections. It also resulted in a reduction in mortality, with an adjusted odds ratio of around 0.80. This makes SDD one of the most effective and best-studied interventions in the intensive care unit. However, adoption of SDD as standard practice has met considerable resistance from clinicians. We will explore the five most frequently used arguments against SDD in the lecture.

Consultant, President of the World Alliance Against Antibiotic Resistance (WAAAR). Paris, France
Antibiotic resistance is becoming a serious threat to humanity. Antimicrobial resistance (AMR) is far higher in ICUs than in conventional hospital wards or in the community. Particularly in ICU patients, the gut can be considered the epicenter, or the factory, of antibiotic resistance [1]. The presence of multidrug-resistant bacteria (MRB) in the gut is due either to their selection by antibiotics or to transmission of MRB from other patients or the environment via the hands. The usual ways to prevent the acquisition of those resistant strains in the gut of ICU patients are hygiene control measures, in particular hand disinfection, and antibiotic restriction/stewardship. However, some other techniques might be useful to prevent colonization of the gut with MRB, or to clear MRB from the gut of patients, in particular in case of ICU outbreaks of those strains. Selective Digestive Decontamination (SDD), including the use of non-absorbable antibiotics in the stomach, is one of those techniques. This technique achieves extremely high local levels of those non-absorbable antibiotics, far above the MIC of all Gram-negative strains, including the most resistant ones. The topic is very controversial, since SDD is suspected to increase AMR in the ICU; in fact, some studies from The Netherlands showed that resistance in the digestive flora was not increased with SDD, but rather decreased [2]. However, some increases in the MIC of aminoglycosides against Gram-negative bacteria have been described, with some strains being fully resistant to those antibiotics. Resistance to colistin has also been described. Oral non-absorbable antibiotics, which can be used without the other components of SDD, have also been able to stop ICU outbreaks of MRB, for example one due to a highly resistant strain of Klebsiella pneumoniae [3]. Several recent studies confirm this initial one. However, some studies are negative, and further data are needed.
Other methods have also been tried, with contrasting results, including probiotics, β-lactamases able to degrade antibiotics, encapsulated charcoal able to trap antibiotics at the end of the small intestine and thereby avoid their effect on the colonic flora [4], and, more recently, fecal transplantation, with very promising results [5]. The "search and destroy" concept was developed long ago in The Netherlands with excellent results, in particular for methicillin-resistant Staphylococcus aureus. Along the same lines, the "search, destroy and restore" concept could be one of the best solutions to treat gut colonization with MRB in the ICU. Prevention must of course be part of this strategy.

New diagnostic methods, based mainly on molecular technology, allow the detection of microorganisms in a much shorter time than traditional microbiology. In the field of respiratory infection in Intensive Care Units, new procedures have simplified the diagnosis of viruses, such as Influenza or Respiratory Syncytial Virus [1-5]. Tests are available that make possible not only the detection of bacteria, but also the quantification of the bacterial load and the rapid detection of various resistance mechanisms. Multiplex real-time Polymerase Chain Reaction tests are now commercially available and easy to use. They identify the most common viruses and Gram-positive and Gram-negative bacteria causing VAP in less than 4 h [6]. Rapid detection of the main resistance mechanisms, including carbapenemase-producing bacteria, is now available within hours, but most studies have been performed on isolated strains rather than on direct clinical samples [7-9].
With regard to fungal diseases, the new technology permits rapid identification of fungi of the genus Aspergillus and the family Mucoraceae, and has been greatly improved by molecular detection techniques for Pneumocystis jirovecii. In addition to tests applied directly to respiratory samples, fungal blood biomarkers such as 1,3-β-D-glucan and galactomannan are of unquestionable diagnostic help [10-12].
Regarding response times, as opposed to a traditional turnaround of 48-72 h to provide answers of practical clinical interest, we have moved to clinically useful response times of less than 8 h. Despite this, there are still not enough data to certify the clinical and economic impact of these tests, which in my opinion are, by far, cost-effective.
One of the problems lies not in the technique itself, but in the mechanisms of explanation and communication to the clinician that accompany it. With bad communication, the best techniques are incapable of making an adequate clinical impact [13]. However, the best markers with which to evaluate the clinical impact of microbiological laboratory techniques have yet to be determined.
A good example of the previous points is infection with methicillin-resistant S. aureus (MRSA). Today, the presence of the MRSA genome can be reported in approximately 2 h. In recent years, it has become accepted that MRSA pneumonia can be reasonably ruled out if nasal carrier status is excluded. This permits safe de-escalation of specific anti-MRSA drugs, with considerable stewardship impact [14].
Regarding etiological detection in the septic patient, the changes are also spectacular, although the new molecular technology has not succeeded in displacing traditional blood cultures as the reference technique [15,16]. Microbiology departments have an essential role in alerting to sepsis across the whole population of a general hospital. The simple receipt of a blood-culture request is a valuable sepsis alert, even before microbiological results, and a telephone intervention triggered upon reception of blood cultures has proven clinically useful [17].
A group of techniques that is making its way into the detection of specific microorganisms in about 2 h is T2 Magnetic Resonance, now available not only for the detection of candidemia but also for several of the bacteria that most frequently cause sepsis [18-22].
The direct detection of bacterial or fungal genomes in blood continues to be problematic. For this reason, an interesting approach is to use newly arrived blood cultures as a prompt to identify microorganisms in other samples, from sites other than the blood, sent simultaneously to the laboratory. In a proportion of those cases, the causal microorganism of sepsis can be found in samples such as urine, respiratory tract or skin and soft tissue, in which, because of their higher microbial load, molecular techniques are more sensitive [23,24].
The problem of rapid diagnosis of infectious diseases is increasingly shifting from the technique itself and its performance to administrative problems, derived fundamentally from a simplistic interpretation of its cost, and to availability problems derived from the working hours of Microbiology Services. The pending revolution is having 24/7 Microbiology, with the capacity to perform, interpret and transmit these results in real time [25].

Saad Nseir
Critical Care Center, CHU Lille, F-59000 Lille; Lille University, Medicine Faculty, F-59000 Lille, France; s-nseir@chru-lille.fr

Ventilator-associated tracheobronchitis (VAT) is a common ICU-acquired infection. Its incidence ranges from 1.4% to 19% of critically ill patients receiving invasive mechanical ventilation. This infection is considered an intermediate process between colonization and ventilator-associated pneumonia (VAP). Histological studies have shown a continuum between these two infections. Several definitions are available for VAT; however, all have some limitations. The most accepted and frequently used definition includes all of the following criteria: fever > 38 °C with no other cause, purulent tracheal secretions, positive tracheal aspirate (≥10⁵ cfu/mL), and absence of a new infiltrate on chest X-ray [1]. VAT is frequently caused by Gram-negative bacilli. Pseudomonas aeruginosa, Staphylococcus aureus, and Acinetobacter baumannii are the most common pathogens isolated from respiratory secretions of VAT patients.
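The four-part VAT definition cited above is a simple conjunction of criteria and can be sketched as follows; this is an illustrative sketch only, with hypothetical parameter names and the CFU threshold given in the text.

```python
# Sketch of the frequently used VAT definition [1]: all four criteria must be met.
# Parameter names are illustrative, not from any published scoring tool.

def meets_vat_definition(fever_no_other_cause: bool,
                         purulent_secretions: bool,
                         tracheal_aspirate_cfu_per_ml: float,
                         new_infiltrate_on_xray: bool) -> bool:
    return (fever_no_other_cause                      # fever > 38 °C with no other cause
            and purulent_secretions                   # purulent tracheal secretions
            and tracheal_aspirate_cfu_per_ml >= 1e5   # positive tracheal aspirate
            and not new_infiltrate_on_xray)           # a new infiltrate suggests VAP instead
```

The last condition is what separates VAT from VAP in this definition, which is why the reliability of portable chest X-ray, discussed below, matters so much.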
Previous studies reported prolonged duration of mechanical ventilation and ICU stay in VAT patients. This negative impact on outcome is related to increased inflammation of the lower respiratory tract and increased sputum production [2]. Extubation failure and difficult weaning could result from increased sputum production. In addition, higher rates of VAP were reported in patients with VAT compared with those without VAT.
Differentiating VAT from colonization or from VAP can be a difficult task. The use of a significant microbiological threshold (tracheal aspirate ≥10⁵ cfu/mL or bronchoalveolar lavage (BAL) ≥10⁴ cfu/mL), associated with local and systemic signs of infection, could be helpful to distinguish VAT from tracheobronchial colonization. Further, although a portable chest X-ray may not be accurate enough to diagnose a new infiltrate in critically ill patients, it probably allows differentiating severe (VAP) from less severe (VAT) ventilator-associated lower respiratory tract infections. Therefore, one could argue that the presence of a new infiltrate on chest X-ray, associated with clinical and biological signs of infection, should be considered a sign of severity that might trigger prompt empirical antibiotic treatment [3,4].
There are at least four reasons to suggest a continuum between VAT and VAP. First, the higher rates of VAP in patients with VAT compared with those with no VAT. Second, histological findings of postmortem animal and human studies clearly showed the coexistence of these two infections, describing them as bronchopneumonia. Third, the higher SOFA, CPIS, PCT levels, and mortality in VAP compared with VAT patients strongly suggest that VAT might be a precursor of VAP. Fourth, the pathophysiology of VAP also supports this hypothesis: as microaspiration of contaminated oropharyngeal secretions is a continuous phenomenon, lesions of differing severity might coexist in the lower airways of mechanically ventilated patients. However, in some patients VAP might occur without previous VAT, suggesting two different pathogenic pathways.
There is probably an overlap between these two infections, but no available examination can differentiate them at the bedside [5]. CT scan and lung ultrasound are more efficient than chest X-ray in diagnosing lung infiltrates. However, to diagnose a new infiltrate, a baseline examination is required. Additionally, fiberoptic bronchoscopy and BAL probably cannot be used to differentiate VAT from VAP, as previous studies reported frequently high bacterial burdens on BAL in chronically ventilated patients without local or systemic signs of infection. A post-hoc analysis of the large TAVeM international database evaluated the value of biomarkers in differentiating VAT from VAP [6]. Although PCT and CRP values were lower in VAT than in VAP, there was a marked overlap of both biomarkers between the two VA-LRTIs, not allowing adequate discrimination. Another recent large post-hoc study evaluated the accuracy of the Clinical Pulmonary Infection Score (CPIS) in differentiating VAT from VAP [7]. All patients with a microbiologically confirmed VAT or VAP were included. CPIS exhibited moderate accuracy for the diagnosis of VAP among patients with microbiologically confirmed ventilator infections.
There are at least two reasons to distinguish VAT from VAP: (1) antibiotic treatment for VAT is currently not recommended, because of the lack of good-quality data in favor of antibiotic treatment and because antibiotic treatment is a well-known risk factor for the emergence of multidrug-resistant bacteria; (2) if antibiotic treatment is beneficial in VAT patients, a short course of antibiotics (i.e., 3 days) might be sufficient to treat this infection. The recent large prospective multicenter multinational TAVeM study allowed validation of a highly specific definition of VAT and clearly showed that VAT and VAP are not associated with the same impact on outcome. The mortality rate was significantly higher in VAP patients than in those with VAT, and in those with no VA-LRTI. In our opinion, this is a key finding supporting the view that these two infections should be differentiated even if closely linked, and that VAT patients might benefit from a shorter duration of antibiotic treatment. The TAVeM2 randomized, double-blind, controlled study has recently started in France and is evaluating the impact of two durations of systemic antibiotic treatment (3 or 7 days) versus no antibiotic treatment in a large cohort of VAT patients.

Rosario Menéndez
University and Polytechnic Hospital La Fe, Valencia, Spain

Community-acquired pneumonia may cause cardiovascular complications during the acute phase, with a frequency of around 15%, and, interestingly, this risk remains elevated for the 10 years following discharge [1]. The association between cardiovascular complications and pneumonia has recently been recognized and demonstrated in several types of studies: (1) in animal pneumonia models, Streptococcus pneumoniae may directly cause necrosis of myocytes and scar formation [2]; (2) in case-control studies, after controlling for comorbid conditions and age, patients with pneumonia had a significantly higher risk of developing cardiovascular events; (3) in sepsis, a similar increase in cardiovascular events has been reported during the episode and in those who survive. In fact, some authors have stated that pneumonia should be considered a new cardiovascular risk factor. During bacteremia, Streptococcus pneumoniae can translocate across the vascular endothelium into the myocardium and form microlesions. Cardiotoxic products generated by S. pneumoniae may damage the myocardium, and heart failure and other complications may develop. Several molecular mechanisms underlying pneumococcal cardiac invasion have been described: cardiomyocyte death, scar formation and cardiac remodeling.
Biomarkers are important tools that allow quantification of biological processes such as inflammation or damage, with potential usefulness for diagnosis and prognostic evaluation. Several criteria define a good biomarker: accuracy, reliability and diagnostic impact. Several cardiac biomarkers have been evaluated in pneumonia with the purpose of measuring cardiovascular damage and/or mortality [3]. Thus, cardiac troponin is related to myocardial necrosis; endothelin-1 is secreted by endothelial cells and correlates with shear stress; the natriuretic peptides of myocardial stress (proBNP and proANP) are considered of similar accuracy for the diagnosis of heart failure; and proADM is a neurohormonal biomarker of pressure and volume overload in heart failure.
In general, most studies have measured cardiac biomarkers at CAP diagnosis, and there are few publications with determinations after discharge. Moreover, mortality is the most frequently studied outcome, whereas cardiovascular events have been less investigated. Kruger et al. [4], in a CAP cohort, found that cardiac biomarkers, and especially proADM, were associated with short- and medium-term mortality (at 90 days). Biteker et al. [5], in a prospective CAP study, evaluated echocardiographic findings along with NT-proBNP levels. They found that patients with raised NT-proBNP levels and reduced TAPSE had an increased probability of complications and death. Conversely, those with low NT-proBNP levels and no reduction of TAPSE had no complications. In a subsequent study, endothelin-1 provided additional information when studied together with proBNP.
Chang et al. [6] evaluated both NT-proBNP and troponin T for mortality prediction in CAP. They found that both NT-proBNP and troponin T predicted 30-day mortality in age-adjusted analyses. However, after adjustment for the Pneumonia Severity Index (PSI), only raised NT-proBNP persisted as an independent biomarker predictor. The authors pointed out that some degree of cardiac involvement may exist, and even remain under-recognized, during the pneumonia episode, and that its presence is related to poor outcome. Nowak et al. [7] compared the prognostic accuracy for mortality (comparing areas under the curve) of natriuretic peptides in CAP, showing that it was similar for short- and long-term mortality, although not higher than that of the PSI. In multivariable Cox-regression analysis, NT-proBNP remained an independent mortality predictor, and they suggested that combining this biomarker with the PSI would be helpful for stratifying mortality risk.
Soluble markers of platelet activation such as soluble P-selectin and sCD40 ligand have been found in patients with stable atherosclerosis or acute coronary syndromes, and they predict cardiovascular events. In CAP, Cangemi et al. [8] evaluated platelet activation and the development of myocardial infarction, showing that raised platelet-activation biomarkers were independent predictors. Recently, Mendez et al. [9] evaluated the usefulness of cardiac biomarkers in CAP for predicting early and late cardiovascular events (1-year follow-up). They reported that raised levels at presentation and at 30 days were independent predictors of cardiovascular events.
In summary, cardiac biomarkers, as surrogate markers of cardiovascular damage or stress, may help identify CAP patients more susceptible to developing cardiovascular events and/or dying. The increased damage and its persistence point to injury of the cardiovascular system provoked by CAP.

The rapid emergence and dissemination of antibiotic-resistant microorganisms in intensive care units (ICUs) worldwide threatens adequate antibiotic coverage of infected patients in this environment and may justify using regimens combining several broad-spectrum antimicrobials, even when the presumed infection probability is low. Numerous studies have indeed documented that a significant increase in mortality is observed when optimal antibiotic therapy is delayed in infected ICU patients, and some quality improvement initiatives that encouraged earlier prescribing have also reported decreases in mortality [1]. Unfortunately, this "spiraling empiricism" increasingly leads to undue antibiotic administration to many ICU patients without true infections, or with infections caused by very susceptible bacteria not requiring broad-spectrum antimicrobial agents, paradoxically driving the emergence of more antibiotic-resistant microorganisms and causing infections that are, in turn, associated with heightened mortality and morbidity. Receipt of unnecessary and prolonged broad-spectrum antibiotics can also cause significant direct harm, including antibiotic-associated adverse events, Clostridioides difficile infections, and changes to the digestive microbiome [2].
Although the Surviving Sepsis Campaign guidelines [3] recommend starting new antibiotics within one hour in ICU patients with sepsis, such an approach leads to unnecessary treatment in many non-infected patients and thus could be unwarranted in many cases [4]. In a quasi-experimental, before-and-after, observational cohort study of patients admitted to the University of Virginia surgical ICU, Hranjec and colleagues documented that delaying antimicrobials for hemodynamically stable patients with suspected infections until the infections were objectively documented was associated with more initially appropriate therapy and lower all-cause mortality than an aggressive strategy [5]. Thus, for clinically stable patients, this strategy might achieve better antibiotic use without impacting prognosis. Patients with mildly or moderately severe, early-onset infections and no specific risk factors (e.g., prolonged hospitalization, immunosuppression and/or recent prolonged antibiotics) can also receive narrow-spectrum drugs, such as a non-pseudomonal third-generation cephalosporin, as recommended in the ERS/ESICM/ESCMID/ALAT guidelines for the management of hospital-acquired pneumonia [6].
Obtaining specimens for appropriate cultures before antibiotic administration is essential to confirm infection, identify responsible pathogen(s) and enable therapy de-escalation in response to susceptibility profile(s). Having current and frequently updated knowledge of local bacteriological epidemiology increases the likelihood of prescribing appropriate initial antibiotics. Whether surveillance cultures could further improve empirical treatment selection for ICU patients with suspected hospital-acquired infections is still debated but should certainly be weighed when difficult-to-treat microorganisms abound, making initial choices particularly risky.
For ICU patients admitted with community-acquired or healthcare-associated infections, more restraint in antimicrobial-therapy selection is certainly possible. Available data suggest that the incidence of pathogens resistant to the usual in-patient IDSA-ATS guideline-recommended antibiotic regimen for pneumonia (i.e., a non-pseudomonal cephalosporin and a macrolide) is usually not significantly increased unless two, three or more risk factors are present, with prior antibiotic use or hospitalization and poor functional status being more important predictors of resistant bacteria than nursing-home residence alone. Using such an algorithm could lead to fewer pneumonia patients unnecessarily receiving broad-spectrum antibiotics [7].
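The risk-factor-counting approach described above can be sketched as follows. This is purely an illustrative sketch: the factor names and the two-factor escalation threshold are assumptions for demonstration, not a validated clinical rule.

```python
# Hypothetical sketch of a risk-factor-count approach to empirical
# antibiotic selection for pneumonia. Factor names and the >= 2
# threshold are illustrative assumptions, not a validated rule.
RISK_FACTORS = (
    "prior_antibiotics_90d",
    "recent_hospitalization",
    "poor_functional_status",
    "nursing_home_residence",
    "immunosuppression",
)

def suggested_regimen(patient: dict) -> str:
    """Count MDR risk factors; escalate only when several co-occur."""
    n = sum(bool(patient.get(f)) for f in RISK_FACTORS)
    if n >= 2:
        return "broad-spectrum (anti-pseudomonal, +/- MRSA cover)"
    return "standard (non-pseudomonal cephalosporin + macrolide)"

# A single factor alone (e.g. nursing-home residence) does not
# trigger escalation under this illustrative threshold.
print(suggested_regimen({"nursing_home_residence": True}))
print(suggested_regimen({"prior_antibiotics_90d": True,
                         "poor_functional_status": True}))
```

The point of the sketch is only that escalation is driven by the co-occurrence of several predictors rather than any single one.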
Within the past decade, the way clinical microbiology laboratories identify microorganisms was revolutionized, leaving behind slow, traditional methods based on phenotype characteristics (e.g., growth on defined media, colony morphology, Gram-staining, and biochemical reactions) incurring significant diagnosis delay, in exchange for new diagnostic techniques including real-time multiplex polymerase chain reaction and matrix-assisted laser-desorption ionization-time-of-flight mass spectrometry.
The latter, which makes possible rapid identification of pathogens and (at least for certain organisms) their antimicrobial-resistance patterns, could undoubtedly promote earlier therapy appropriateness and de-escalation [8].
Regardless of the diagnostic strategy used for suspected infections in the ICU, serial clinical and microbiological evaluations are highly relevant to re-assess therapy after 48-72 h and stop it if infection is unlikely [9]. For many ICU patients with infections, including late-onset infections, therapy can be de-escalated once respiratory tract, blood and/or other specimen culture results become available, if no resistant organism is recovered or if the isolated pathogen is sensitive to a narrower-spectrum antibiotic than that prescribed empirically. For example, if MRSA is not found, vancomycin and linezolid should be stopped, unless the patient is allergic to β-lactams or has developed an infection with Gram-positive bacteria susceptible only to them. Unfortunately, study results showed that, despite not being associated with any adverse outcomes, de-escalation was not consistently applied in many ICUs [10].
The two most commonly cited reasons to prescribe combined antibiotics for the entire treatment duration are to achieve synergy and prevent resistant-strain emergence. However, antibiotic synergy has only been shown to be valuable in vitro and in patients with neutropenia or a >25% probability of death. Randomized-controlled trial results on combined therapy showed its benefit to be inconsistent or null, even when they were pooled in meta-analyses or analysis was restricted to P. aeruginosa-infected patients. Importantly, such regimens did not prevent antimicrobial-resistance emergence under therapy, and were associated with significantly more nephrotoxicity. Based on those data, most patients' therapy could be safely switched to monotherapy after 3-5 days, provided that the initial therapy was appropriate, the clinical course evolved favorably, and that microbiological data did not indicate very difficult-to-treat microorganisms, as can be observed for some non-fermenting GNB and carbapenemase-producing Enterobacteriaceae.
Computerized decision-support programs linked to electronic patient records can facilitate the dissemination of information to physicians for immediate use in therapeutic decision-making and improve quality of care. Partially or non-automated protocols, often instigated by hospital-based quality-improvement teams, have also demonstrated efficacy, as has having an infectious-disease specialist interact regularly with the medical ICU team [11].
In summary, although no one disputes the fact that ICU patients with true bacterial infection and septic shock should receive appropriate antibiotics immediately, it is certainly possible to titrate the timing of antibiotic administration and to avoid using broad-spectrum agents in every case, according to each patient's risk of infection, risk factors for MDR pathogens and disease severity [12].
Broad-spectrum beta-lactam antibiotics are the cornerstone of antimicrobial therapy, both empirical and directed, for many patients in the intensive care unit (ICU), because of their broad spectrum covering a wide range of pathogens, low toxicity and low cost.
In recent years, there has been increased interest in PK/PD-optimized use of antibiotics, and research into optimized therapy has been particularly intense for beta-lactam antibiotics. The primary reason for this is the extensive body of literature demonstrating the variability of antibiotic concentrations in critically ill patients with standard dosing [1] (both dose and method of administration) and the clinical failure that is often observed despite presumed antibiotic efficacy. Furthermore, the worldwide increase in antibiotic resistance has prompted a search for innovative strategies to treat infections caused by pathogens with reduced susceptibility, and on occasion overtly resistant pathogens, and for methods to reduce the development of resistance during routine antibiotic therapy. Prolonged infusion of beta-lactam antibiotics has been proposed to address all of this: to improve outcome, treat multidrug-resistant (MDR) infections and reduce resistance.
Prolonged infusion covers different strategies; both extended infusion (infusion of an antibiotic dose over 2 h or more), and continuous infusion. Although both are often lumped together and may serve the same goal, they should not be used interchangeably. Most of the data currently available are on continuous infusion, and the results should not be extrapolated to extended infusion.
The key feature of prolonged infusion is that it avoids the peaks and troughs observed with intermittent infusion and prolongs the time during which the concentration remains above a certain threshold [2]. Considering that the efficacy of beta-lactam antibiotics is determined by the time above the minimal inhibitory concentration (MIC) of the causative pathogen, and that up to 100% time above this concentration is associated with increased bacterial killing, it should be evident that prolonged infusion results in more effective antibiotic therapy.
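As a rough illustration of why the infusion strategy matters, a one-compartment pharmacokinetic sketch can compare the fraction of the dosing interval spent above the MIC (fT>MIC) for intermittent bolus dosing versus a continuous infusion of the same daily dose. All parameters below are hypothetical, loosely inspired by values reported for piperacillin in critically ill patients, and are for illustration only.

```python
import math

def intermittent_ftmic(dose_mg, vd_l, ke_per_h, tau_h, mic_mg_l, dt=0.01):
    """Fraction of the dosing interval above MIC at steady state,
    for an instantaneous (bolus-like) dose in a one-compartment model
    with first-order elimination."""
    # Steady-state peak for a bolus repeated every tau hours
    cmax = (dose_mg / vd_l) / (1 - math.exp(-ke_per_h * tau_h))
    t, above = 0.0, 0.0
    while t < tau_h:
        if cmax * math.exp(-ke_per_h * t) > mic_mg_l:
            above += dt
        t += dt
    return above / tau_h

def continuous_css(daily_dose_mg, vd_l, ke_per_h):
    """Steady-state concentration for a continuous infusion:
    Css = infusion rate / clearance, with CL = ke * Vd."""
    rate_mg_h = daily_dose_mg / 24.0
    return rate_mg_h / (ke_per_h * vd_l)

# Hypothetical, illustrative parameters: 4 g q8h, Vd 30 L,
# elimination rate constant 0.7/h, MIC 16 mg/L.
dose, vd, ke, tau, mic = 4000.0, 30.0, 0.7, 8.0, 16.0

ft = intermittent_ftmic(dose, vd, ke, tau, mic)
css = continuous_css(3 * dose, vd, ke)  # same total daily dose
print(f"Intermittent q8h: fT>MIC = {ft:.0%}")
print(f"Continuous: Css = {css:.1f} mg/L vs MIC {mic} mg/L")
```

Under these assumed parameters, intermittent dosing spends well under half the interval above the MIC, while the continuous infusion maintains a steady-state concentration above it throughout, which is the pharmacodynamic rationale for prolonged infusion.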
The data on prolonged infusion can be divided into studies that have investigated the pharmacokinetics and studies that have looked at clinical outcomes. The literature documenting the changed PK is quite consistent and shows that PK is indeed changed and that antibiotic exposure is higher with prolonged infusion [2]. One limitation is that for higher targets (to treat more resistant pathogens), the time above the MIC may paradoxically be reduced with continuous infusion.
The clinical data on prolonged infusion are less clear. A number of studies have suggested a clinical advantage of continuous infusion, and when these were pooled in a meta-analysis even a reduced mortality was described [3], but the most robust data on the topic could not demonstrate superiority of continuous infusion. It can be said, however, that prolonged infusion appears to be safe and is not associated with worse outcomes. A large study (the Beta-Lactam Infusion Group-III study, BLING-III), aiming to include 7000 patients in Australia, the UK, Belgium, New Zealand, Sweden, France and Portugal, has recently started and should give a definitive answer to this important question.
A few reports have described the use of prolonged infusion to treat MDR infections, but these are purely descriptive, and the role of the infusion strategy is unclear.
Data on the effect on antimicrobial resistance are less clear; the data available so far could not demonstrate an impact of continuous infusion on the development of antibiotic resistance, but this conclusion is limited by the low quality of the evidence [4].
A number of practical issues and potential pitfalls should be considered before embarking on continuous infusion. Antibiotic stability is key, and not all antibiotics are suitable for prolonged infusion [5]. Most commonly used broad-spectrum beta-lactam antibiotics are stable for a number of hours but rarely beyond 12 h at room temperature; the involvement of a clinical pharmacist in this process is recommended. Continuous infusion requires a separate infusion line, which may pose practical problems; for extended infusion this is less of a challenge. A loading dose is required before starting continuous infusion and should not be forgotten. For extended infusion, care should be taken to use infusion lines with small dead spaces to avoid antibiotic solution remaining behind in the line. Finally, prolonged infusion should not be used for economic reasons, i.e., to achieve similar target attainment with lower doses.
Some areas of uncertainty remain. For some patients just changing the infusion strategy may be insufficient to reach the set target, and it is not clear if the PK target changes when switching to continuous infusion.
In conclusion, prolonged infusion of beta-lactam antibiotics (in fact continuous infusion) holds promise as it has been shown to lead to better PK target attainment. The effect on clinical outcomes is less clear, and although the intervention is cheap and relatively simple to implement, there is no compelling evidence to consider intermittent infusion of beta-lactam antibiotics obsolete.

Hyperoxia was associated with increased length of ICU stay and hospital stay. Furthermore, it was associated with an increase in mortality among septic shock patients. The severity of hyperoxia and the duration of exposure to hyperoxia could also influence the prognosis of septic shock patients.

Introduction
Antibiotics are the ONLY class of drug where giving them to one patient may change their efficacy in another patient. Resistance among the Gram-negative bacilli is increasing at an alarming rate. Antimicrobial resistance (AMR) is mobile via genes on plasmids, leading to spread through bacterial populations, which is augmented by increasing global air travel and migration. The direct consequences of infection with resistant microorganisms include more severe, longer illnesses, increased mortality, prolonged hospital stay and increased healthcare costs. Of particular concern is carbapenem resistance in the Gram-negative Enterobacteriaceae (Carbapenemase-Producing Enterobacteriaceae, CPE). There are few new antimicrobial agents in the pipeline that cover these Gram-negative pathogens. New agents are being developed to cover the serine carbapenemases (KPC, OXA); however, very few agents are available to cover the metallo-carbapenemases (NDM, VIM, IMP). Therefore, novel strategies to treat these infections must be employed.

Case report
We report a case of severe pneumonia in a patient transferred directly by air (medevac) from Turkey to our Adult Intensive Care Unit. Respiratory samples (BAL) grew multi-drug resistant Gram-negative bacilli: both Acinetobacter spp. carrying OXA-23 and OXA-51, and Klebsiella pneumoniae carrying OXA-48 and NDM, were detected. Both organisms were pan-resistant. In-house laboratory antimicrobial synergy testing was carried out to determine a treatment regimen, and a novel treatment combination was initiated (Figure 1).

Figure 1. Positive synergy test. The "Russian-doll" appearance implies synergy between ceftazidime-avibactam and aztreonam. Used in combination, these agents cover both serine (ceftazidime-avibactam) and metallo- (aztreonam) carbapenemases.

Conclusions
The WHO states that antimicrobial resistance "threatens the core of modern medicine and sustainability of an effective, global public health response". AMR results from systematic misuse and overuse of antibiotics. In view of the ease and volume of global travel, every nation is at risk. Carbapenem resistance in the Gram-negative population is increasing at an alarming rate, and very few new or replacement products are in the pipeline. Therefore, novel strategies to treat these infections are required. Combinations of antimicrobials, based on individualised resistance mechanisms and laboratory synergy testing, can be employed.

Introduction: Human Metapneumovirus (hMPV), from the paramyxovirus family, is frequently a cause of childhood respiratory tract infections but may rarely cause acute respiratory failure. hMPV can produce a spectrum of disease from mild coryza to bronchiolitis and pneumonia and is difficult to distinguish clinically from respiratory syncytial virus infection. Reinfection in adulthood is common and often asymptomatic but can occasionally be severe, particularly in elderly or immunocompromised patients. We report a case of severe acute respiratory failure in a patient newly diagnosed with promyelocytic leukaemia, who subsequently required Extra Corporeal Membrane Oxygenation (ECMO), and in whom the only pathogen identified was hMPV.
Case Report: A previously fit and well 23-year-old woman presented to her local A&E with a two-week history of bruising and menorrhagia. She was diagnosed with promyelocytic leukaemia, for which she received 10 days of idarubicin and arsenic treatment. Post-therapy she became neutropenic and developed a chest infection. Multiple sets of blood cultures were taken along with a serum galactomannan, all of which were negative. The only pathogen identified was hMPV, by PCR of a throat swab. She was started on a variety of antimicrobials, including intravenous ribavirin directed against hMPV. Over the next four days the patient's oxygen requirements increased, and she was moved to a level 2 bed for CPAP and, the next day, to the ICU. She was intubated one day after arriving in the ICU but remained hypoxic (PaO2 8-10 kPa) despite 90-95% oxygen. She was therefore referred for consideration of veno-venous (VV) ECMO support. She was accepted as a candidate for ECMO based on a Murray score of 9 and the potentially reversible aetiology of her acute respiratory failure. The patient was transferred and commenced on VV ECMO via femoro-femoral cannulation. A CT head on arrival showed a small left-sided bleed and some swelling, though her pupils remained 2 mm and reactive. However, at 4 am the next morning her pupils were found to be 7 mm and fixed. A repeat CT head showed catastrophic bleeding and brain-stem herniation. The decision was taken to withdraw care and she died later that day.
Discussion: Testing for hMPV is part of the routine viral PCR panel, but positive results may be disregarded in patients with acute respiratory failure, particularly when another organism is identified concurrently. This is despite accumulating evidence that hMPV can cause severe respiratory disease and ARDS even in immunocompetent patients. There is a paucity of evidence to support the use of any drug therapies to treat hMPV, although ribavirin and intravenous immunoglobulin have both shown an effect in vitro. As such, the mainstay of management is supportive care, which can include the use of VV ECMO. VV ECMO provides gas exchange, thereby increasing oxygenation while allowing the lungs to be rested from the insult caused by mechanical ventilation. Haemorrhage is the most common complication of ECMO due to associated clotting factor dilution, platelet dysfunction and the need for patients to be systemically anticoagulated. Here we report the case of a patient who, after cytotoxic chemotherapy, developed a fulminant pneumonia and ARDS due to hMPV infection and required VV ECMO support. She then had a large intracranial haemorrhage secondary either to ECMO or sepsis-induced coagulopathy, likely complicated by her underlying haematological malignancy.

Conclusion:
In adult patients with acute respiratory failure, hMPV infection should form part of the differential diagnosis. If hMPV is confirmed, physicians should prepare for the possibility of rapid deterioration and the potential need for ECMO support, which may itself cause significant complications.

Introduction: Over recent years, there has been an increase in the number and severity of Clostridium difficile infections (CDI) in all medical settings, including the intensive care unit (ICU). The current prevalence of CDI among ICU patients is estimated at 0.4-4% and has a severe impact on morbidity and mortality.
Objective: The main objective was to determine the incidence of C. difficile infection in a tertiary medical-surgical ICU, as well as the characteristics of the infected patients.
The secondary objective was to study the association between the given treatment and the maintained clinical cure until the hospital discharge.
Material and methods: We used a retrospective, observational and analytical design. All patients with CDI confirmed by positive microbiological tests, diagnosed in the ICU from January 2014 to December 2017, were included. Positive microbiological tests obtained within the period of antibiotic treatment were excluded, as were those with a negative toxin result or those that could not be confirmed by PCR.
Results: 19 patients were included. The descriptive analysis shows that 63.2% of the patients were men, with a mean age of 61.9 years; 52.6% of infected patients were over 65 years old.
Regarding comorbidities, diabetes mellitus was present in 31.6% of patients, acute kidney injury in 36.8%, and 47.4% were immunocompromised.
Regarding therapies, 94.7% had received antibiotic treatment before the Clostridium difficile infection, and 100% of the patients were receiving proton pump inhibitors.
It is striking that in the majority of patients diagnosed with Clostridium difficile infection, the antibiotic treatment they were previously receiving could not be interrupted. The analytical study showed that all patients whose antibiotics were stopped at the time of diagnosis of Clostridium infection evolved more favorably than those whose antibiotics could not be stopped, and that patients who responded early (<5 days) to treatment for C. difficile had a greater maintained clinical cure.
Finally, treatment with metronidazole in monotherapy was associated, although not significantly, with a greater maintained clinical response than combined treatment with metronidazole + vancomycin, which could be explained by a severity bias.
Conclusion: Clostridium difficile infection is more serious when it affects a critically ill patient. It is mandatory to periodically evaluate the need to continue prescribed treatments (antibiotic therapy, proton pump inhibitors . . .), as well as to make an early diagnosis to identify those patients at higher risk of recurrence and choose the most appropriate antibiotic therapy for them.

The prevalence of TBM in the Clinical University Hospital Lozano Blesa area was 2.36%. The study population was predominantly female, with a mean age of 45.57 ± 12.7 years. The most common symptoms were headache and altered mental status, without other neurological deficits. The duration of presenting symptoms varied from 1 day to 4 months. Critical care specialists were involved in 33% of patients.
All patients had a lumbar puncture (LP) done for cerebrospinal fluid (CSF) analysis. Mycobacterium tuberculosis was not identified on Ziehl-Neelsen staining in any of the samples. CSF mycobacterial culture was positive in 55%, and TB PCR was positive in 33% of them. No abnormal radiological findings were found.
Risk factors such as HIV co-infection were reported in 33% of cases and malignancy in 11%. Two-thirds of TBM patients had a Spanish background; the rest were from Africa. No other risk factors were observed.
At follow-up 6 months after discharge, 70% of patients had made a full recovery; however, 2 patients were lost to follow-up.

• The incidence of TB in Aragón dropped significantly between 2011 and 2017, and TBM was the most common form of extrapulmonary disease.
• TBM is a critical disease in terms of fatal outcome and permanent sequelae and requires rapid diagnosis and treatment.
Introduction: Sepsis and septic shock are frequent diagnoses in intensive care units, either as an admission diagnosis or as a consequence of respiratory, abdominal, urinary and other conditions. The incidence of these diagnoses is high and causes remarkable morbidity and mortality. It is therefore relevant to assess the functional outcome of patients hospitalized with sepsis and/or septic shock in the ICU of "Centro Hospitalar e Universitário do Algarve (CHUA)-Unidade de Portimão", and to identify the variables that contribute to their functional recovery after discharge.
The background variables, which are relevant to characterize the patients and their clinical condition before hospitalization, as well as the ICU variables, were selected and used to establish