
Current Issue

Year : 2026 – Volume: 16 Issue: 2

Current Issue Articles

Original Research Article

EFFECT ON THE NEWBORN OF USING MONOPOLAR CAUTERY DURING ABDOMINAL ENTRY IN CAESAREAN SECTIONS

http://dx.doi.org/10.70034/ijmedph.2026.2.1

Anshika Kashyap, Deepa Mathur, Alpa Bhosale, Ninad Gharat


Background: Caesarean section is one of the most frequently performed obstetric procedures, and electrosurgical devices such as monopolar cautery are increasingly used during abdominal entry to improve haemostasis and reduce operative time. Objective: To assess the effect on the newborn of using monopolar cautery during abdominal entry in elective caesarean sections compared with the conventional scalpel technique. Materials and Methods: This prospective randomized case–control study was conducted at a tertiary-level medical college and hospital from January to December 2025, including 100 women undergoing elective lower segment caesarean section at term. Results: In this study of 100 patients (50 in each group), baseline characteristics were comparable between the cautery and no-cautery groups. Mean maternal age was 29.9 ± 3.8 vs 30.2 ± 4.2 years, gestational age 38.27 ± 0.69 vs 38.37 ± 0.72 weeks, and birth weight 2.95 ± 0.25 vs 2.93 ± 0.26 kg. Apgar scores improved similarly from 8.40 ± 0.25 vs 8.27 ± 0.26 at 1 minute to 9.33 ± 0.67 vs 9.40 ± 0.53 at 10 minutes. Cord pH (7.29 ± 0.07 vs 7.30 ± 0.05), lactate (2.6 ± 0.9 vs 2.4 ± 0.8 mmol/L), and glucose (76.8 ± 9.7 vs 78.3 ± 10.2 mg/dL) were comparable. Previous LSCS was the most common indication (64% vs 62%), and NICU admissions were low (6% vs 8%). Conclusion: The use of monopolar cautery during abdominal entry in elective caesarean sections does not adversely affect immediate neonatal outcomes. Monopolar cautery appears to be a safe and effective technique that can be used without compromising newborn well-being. Keywords: Caesarean section, monopolar cautery, electrocautery, neonatal outcomes, Apgar score, cord blood pH.

Page No: 1-6 | Full Text


Original Research Article

A STUDY OF OUTCOME OF LONG BONE FRACTURES IN PAEDIATRIC PATIENTS MANAGED WITH TITANIUM ELASTIC NAILING SYSTEM IN TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.2

Suyog Patole, Sujay Mahadik, Mrigank Goel, Vedant Kadu, Satish Mehta


Background: The Titanium Elastic Nailing System (TENS) has become a preferred minimally invasive method for stabilizing paediatric diaphyseal long bone fractures, offering balanced flexible fixation that preserves the periosteal blood supply and allows early mobilization. This study assessed short-term functional and radiological outcomes, pain trajectory, malalignment, limb length discrepancy (LLD), and complications following TENS fixation in a tertiary care setting. Materials and Methods: A prospective cohort study was conducted over 12 months in the department of orthopaedics of a tertiary care hospital. The institutional ethics committee approved the study, and written informed consent was obtained from the guardians of the cases. Children with simple, minimally displaced diaphyseal long bone fractures treated with TENS were included on the basis of predefined inclusion and exclusion criteria. Convenience sampling yielded 25 patients (sample size based on a pilot proportion). Baseline demographics, injury mechanism, fracture pattern, and final diagnosis were documented. Follow-up evaluations were performed postoperatively, at 1 month, and at 3 months. Functional outcomes were graded using the TENS scoring system of Flynn et al. Radiographs were assessed for alignment and union. Complications, pain, malalignment grades, and LLD were documented. SPSS 29.0 was used for statistical analysis, and a p value less than 0.05 was considered statistically significant. Results: Mean age was 8.96 ± 2.91 years; 68% were 5–10 years old and 60% were male. Self-fall was the commonest mechanism (80%). Transverse fractures predominated (80%); femoral shaft fractures formed the largest diagnostic group. Malalignment <5° was maintained in 76% postoperatively and 80% at 3 months, with no significant change over follow-up (p=0.725). Pain decreased significantly to 0% at 3 months (p<0.0001). Minor complications occurred in 20% at 1 and 3 months (p=0.0139). Radiologically, 96% achieved union by 3 months. Malunion was reported in 1 (4%) patient. Mean LLD remained small (0.61–0.65 cm across intervals). Excellent functional outcomes were seen in 80% at 3 months. Conclusion: In selected paediatric diaphyseal long bone fractures, TENS provided reliable alignment control, rapid pain resolution, high union rates, minimal LLD, and predominantly excellent short-term functional outcomes with low minor complication rates. Keywords: Paediatric long bone fractures; femur fractures; intramedullary nailing; treatment outcome.

Page No: 7-13 | Full Text


Original Research Article

APRI SCORE AND FIB-4 INDEX AS NON-INVASIVE PREDICTORS OF ESOPHAGEAL VARICES IN LIVER CIRRHOSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.3

T. Sowjanya Lakshmi, Rahul Conjeevaram, Maganti Yamuna, Jyothi Conjeevaram, Padamtinti Anurag Rao


Background: Cirrhosis leads to portal hypertension and life-threatening complications such as esophageal variceal bleeding. While endoscopy is the gold standard for diagnosis, its invasive nature and cost necessitate reliable non-invasive predictors. Aim: To evaluate the Aspartate Aminotransferase to Platelet Ratio Index (APRI) and the Fibrosis-4 (FIB-4) index as non-invasive predictors of esophageal varices in liver cirrhosis. Materials and Methods: This observational study at Narayana Medical College Hospital (January–July 2024) included 105 cirrhotic patients. APRI and FIB-4 scores were calculated and compared against endoscopic findings. Results: Esophageal varices were present in 65.7% of patients. An APRI cut-off of ≥0.9 demonstrated 74.5% sensitivity, 61.1% specificity, and a significant association with high-risk varices (p = 0.0005). Binary logistic regression analysis revealed that patients with an APRI score of ≥0.9 had nearly five times the odds of harbouring high-risk varices compared to patients with an APRI score <0.9 (OR = 4.59; 95% CI: 2.00–10.58; p = 0.0003). A FIB-4 cut-off of ≥2.78 showed higher sensitivity (82.3%) but lower specificity (40.7%), with a significant association (p = 0.0174). Patients with a FIB-4 score ≥2.78 had more than five times the odds of having esophageal varices compared to patients with a score below that threshold (OR = 5.31; 95% CI: 2.15–13.10; p < 0.001). Conclusion: Both indices are valuable tools for risk stratification. APRI serves as a reliable positive predictor for identifying high-risk varices, while FIB-4's high sensitivity makes it an effective primary screening tool to rule out severe disease in resource-limited settings. Keywords: APRI, FIB-4, Esophageal varices, Non-invasive predictor, Cirrhosis.
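
For reference, both indices are computed from routine laboratory values. The standard definitions (as originally published, by Wai et al. for APRI and Sterling et al. for FIB-4, and assumed rather than restated in the abstract above) are:

\[ \mathrm{APRI} = \frac{\mathrm{AST}/\mathrm{ULN}_{\mathrm{AST}}}{\text{platelet count}\ (10^{9}/\mathrm{L})} \times 100 \qquad \mathrm{FIB\text{-}4} = \frac{\text{age (years)} \times \mathrm{AST\ (U/L)}}{\text{platelet count}\ (10^{9}/\mathrm{L}) \times \sqrt{\mathrm{ALT\ (U/L)}}} \]

As a worked illustration with hypothetical values: a 50-year-old patient with AST 80 U/L (laboratory ULN 40 U/L), ALT 64 U/L, and platelets 100 × 10⁹/L has APRI = (80/40)/100 × 100 = 2.0 and FIB-4 = (50 × 80)/(100 × √64) = 5.0, above both cut-offs used in this study.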

Page No: 14-20 | Full Text


Original Research Article

PREDICTORS OF WEIGHT CHANGE DURING METFORMIN THERAPY IN ADULTS WITH TYPE 2 DIABETES: AN OBSERVATIONAL COHORT STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.4

Ambili Remesh, Mohammed Jaseem Ibrahim


Background: Metformin is the most commonly prescribed first-line medication for people with type 2 diabetes mellitus (T2DM). In addition to its glucose-lowering effect, metformin is known to be associated with weight stabilisation or even slight weight reduction. However, the magnitude and direction of weight change with metformin are highly variable. This study aims to determine the weight-reducing potential of the drug and to identify factors associated with weight change, which could help improve treatment plans and metabolic outcomes. Materials and Methods: This prospective observational cohort study was conducted on an outpatient basis. We used consecutive sampling during routine outpatient visits to enrol 300 adult patients with type 2 diabetes mellitus who had been on metformin for at least six months. Demographic, clinical, and metabolic information was gathered at baseline. Participants were followed for three months and their body weight was recorded. Data were analysed using paired t-tests and regression modelling to assess weight changes and identify variables that may contribute to weight loss. Results: Participants had a mean age of 52.4 ± 9.3 years; 54% were men and 46% were women. Mean baseline body mass index (BMI) was 28.3 ± 4.2 kg/m². During the follow-up period, 42% of patients lost weight, 36% maintained their weight, and 22% gained weight. Higher baseline BMI and shorter duration of diabetes were strongly associated with weight loss on metformin. Age showed a substantial correlation with weight change, whereas gender did not. Conclusion: Metformin therapy is associated with moderate weight loss in a significant proportion of individuals with type 2 diabetes. Baseline BMI and diabetes duration appear to be major predictors of weight change, underscoring the importance of individualised assessment in diabetes management. With appropriate lifestyle changes, treatment results may be further optimised. Keywords: Metformin, Type 2 Diabetes Mellitus, Weight Change, Body Mass Index, Predictors, Glycemic Control.

Page No: 21-26 | Full Text


Original Research Article

POST-OPERATIVE EPIDURAL ANALGESIA AFTER TKR: A DOUBLE-BLIND RANDOMIZED COMPARISON OF ROPIVACAINE 0.2% AND LEVOBUPIVACAINE 0.125%

http://dx.doi.org/10.70034/ijmedph.2026.2.5

Harshwardhan Tikle, Megha Harshwardhan Tikle, Kadali Lakshmi Sudha, Parasmita Bhattacharjee


Background: Total knee replacement (TKR) is associated with significant postoperative pain, which can delay rehabilitation and prolong hospital stay. Epidural analgesia using long-acting local anesthetics such as levobupivacaine and ropivacaine is widely used for effective pain control. The aim is to compare the analgesic efficacy of epidural levobupivacaine (0.125%) and ropivacaine (0.2%) in patients undergoing TKR. Materials and Methods: This prospective, double-blind randomized study included 110 patients undergoing TKR, divided into two groups: Group L (levobupivacaine 0.125%, n=55) and Group R (ropivacaine 0.2%, n=55). Postoperative epidural infusion was administered, and patients were evaluated using Visual Analogue Scale (VAS) scores at rest and on movement at different time intervals. Hemodynamic parameters, sensory regression, motor blockade (Modified Bromage Scale), and adverse effects were also recorded. Results: Demographic parameters were comparable between groups. VAS scores at rest and during movement were similar at all time intervals (p>0.05). Group R demonstrated significantly lower pulse rate and systolic blood pressure at multiple time points (p<0.05), indicating better hemodynamic stability. Sensory regression to L1 was significantly prolonged in Group R (28.53 ± 13.463 hrs) compared to Group L (22.78 ± 11.263 hrs; p=0.018). Motor blockade and the incidence of side effects were comparable in both groups. Conclusion: Both levobupivacaine and ropivacaine provide effective postoperative epidural analgesia following TKR. However, ropivacaine offers better hemodynamic stability and longer sensory blockade, making it a preferable choice. Keywords: Total knee replacement, Epidural analgesia, Ropivacaine, Levobupivacaine, Visual Analogue Scale, Postoperative pain.

Page No: 27-32 | Full Text


Original Research Article

PREOPERATIVE NEUTROPHIL–LYMPHOCYTE RATIO AS A PREDICTOR OF OPERATIVE DIFFICULTY AND POSTOPERATIVE OUTCOMES IN OPEN CHOLECYSTECTOMY: A RETROSPECTIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.6

Manu Singh, Bheeni Bharti, Rahul Dhariwal


Background: Gallstone disease is a common gastrointestinal condition affecting a large proportion of adults worldwide. Although laparoscopic cholecystectomy is widely accepted as the standard treatment, open cholecystectomy remains essential in complicated cases. Identifying predictors of surgical difficulty and postoperative complications is therefore clinically important. The objective was to evaluate the association between the preoperative neutrophil–lymphocyte ratio (NLR) and surgical outcomes in patients undergoing open cholecystectomy. Materials and Methods: A retrospective observational study was conducted in the Department of Anesthesia, Kalyan Singh Government Medical College, Bulandshahr, Uttar Pradesh, India, from August 2024 to August 2025. Eighty patients undergoing open cholecystectomy were included. Patients were divided into two groups based on preoperative NLR values: NLR <3 and NLR ≥3. Operative time, difficult dissection, postoperative complications, and duration of hospital stay were recorded. Results: Patients with elevated NLR (≥3) had longer operative time (80 vs 55 minutes), a higher incidence of difficult dissection (35% vs 10%), higher surgical site infection rates (18% vs 4%), and prolonged hospital stay (6 vs 3 days). Conclusion: Preoperative NLR is a simple and inexpensive biomarker that may help predict operative difficulty and postoperative complications in open cholecystectomy. Keywords: Neutrophil–lymphocyte ratio, gallstone disease, open cholecystectomy, inflammation, surgical outcomes.
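
For reference, the NLR used here is the standard ratio obtained from a routine differential blood count (the definition is assumed rather than stated in the abstract):

\[ \mathrm{NLR} = \frac{\text{absolute neutrophil count}}{\text{absolute lymphocyte count}} \]

For example, a hypothetical count of 6.0 × 10⁹/L neutrophils and 1.5 × 10⁹/L lymphocytes gives NLR = 6.0/1.5 = 4.0, which would place the patient in this study's NLR ≥3 group.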

Page No: 33-35 | Full Text


Original Research Article

AGE-RELATED VARIATIONS IN LIP PRINT PATTERNS: A CHEILOSCOPIC STUDY IN A SUBURBAN POPULATION OF DAKSHINA KANNADA DISTRICT

http://dx.doi.org/10.70034/ijmedph.2026.2.7

Fouzia Hameed, Vina Vaswani, Viswakanth B, Shahnavaz Manipady, Zahida Niaz, Aysha Hadia


Background: Cheiloscopy, the study of lip print patterns, has emerged as a useful adjunct in forensic identification due to the unique and relatively stable nature of lip groove patterns. Although several studies have evaluated the distribution of lip print patterns, limited research has focused on age-related variations in these patterns. The present study aimed to evaluate the distribution of lip print patterns across different age groups in a suburban population of Dakshina Kannada district. Materials and Methods: A cross-sectional study was conducted on 300 individuals aged 12–60 years residing in a suburban region of Dakshina Kannada district. The study population was categorized into three age groups: 12–20 years, 21–40 years, and 41–60 years. Lip prints were recorded using the cellophane tape method after applying lipstick and were analyzed using the Suzuki and Tsuchihashi classification. Data were analyzed using SPSS software, and the Chi-square test was used to assess the association between age groups and lip print patterns. Results: Among the five lip print patterns identified, Type I was the predominant pattern (32.0%), followed by Type V (20.7%), Type IV (16.3%), Type II (15.7%), and Type III (15.3%). Age-wise analysis showed that the Type I pattern was more common in the younger age group (44.8%), while Type III and Type V patterns were relatively more frequent in the older age group. However, statistical analysis showed no significant association between age group and lip print pattern distribution (χ² = 12.84, p = 0.118). Conclusion: The findings of this study indicate that lip print patterns remain largely stable across different age groups, supporting their potential use as a supplementary tool for personal identification in forensic investigations. Keywords: Cheiloscopy; Lip prints; Age-related variation; Suzuki and Tsuchihashi classification; Dakshina Kannada.
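
For readers unfamiliar with the scheme used in this and the following cheiloscopy study: in the Suzuki and Tsuchihashi classification, as originally described (and assumed rather than restated in the abstract above), Type I denotes complete vertical grooves, Type I′ incomplete vertical grooves, Type II branched grooves, Type III intersecting grooves, Type IV reticular grooves, and Type V grooves that do not fit any of the preceding categories (undetermined).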

Page No: 36-41 | Full Text


Original Research Article

FORENSIC UTILITY OF CHEILOSCOPY IN PERSONAL IDENTIFICATION: A POPULATION-BASED STUDY FROM DAKSHINA KANNADA DISTRICT

http://dx.doi.org/10.70034/ijmedph.2026.2.8

Fouzia Hameed, Vina Vaswani, Viswakanth B, Shahnavaz Manipady, Zahida Niaz, Aysha Hadia


Background: Cheiloscopy, the study of lip print patterns, has been increasingly explored as an adjunct method for personal identification in forensic investigations. The characteristic grooves and fissures present on the vermilion border of the lips form distinct morphological patterns that may assist in identifying individuals. The present study aimed to evaluate the distribution of lip print patterns and assess the forensic utility of cheiloscopy in a population from Dakshina Kannada district. Materials and Methods: A population-based cross-sectional study was conducted on 300 individuals aged 12–60 years residing in suburban areas of Dakshina Kannada district. Lip prints were recorded using the cellophane tape method after applying lipstick and were analyzed according to the Suzuki and Tsuchihashi classification. Descriptive statistics were used to determine the frequency distribution of lip print patterns, and the Chi-square test was applied to assess the association between gender and lip print pattern distribution. Results: Among the five lip print patterns identified, Type I was the predominant pattern (32.0%), followed by Type V (20.7%), Type IV (16.3%), Type II (15.7%), and Type III (15.3%). Gender-wise comparison showed that Type I was the most common pattern in both males and females. However, statistical analysis revealed no significant association between gender and lip print pattern distribution (χ² = 5.236, p = 0.264). A majority of lip prints (79.3%) exhibited identifiable patterns (Type I–IV), indicating their potential usefulness for forensic identification. Conclusion: The findings of this study demonstrate that a large proportion of lip prints possess identifiable morphological patterns, supporting the use of cheiloscopy as a supplementary method for personal identification in forensic investigations. Keywords: Cheiloscopy; Forensic identification; Personal identification; Suzuki and Tsuchihashi classification; Dakshina Kannada.

Page No: 42-47 | Full Text


Original Research Article

A PROSPECTIVE OBSERVATIONAL STUDY OF PEDIATRIC PROXIMAL PHALANX FRACTURES: EPIDEMIOLOGY, CLINICAL FEATURES, MANAGEMENT, AND OUTCOMES

http://dx.doi.org/10.70034/ijmedph.2026.2.9

Kondamudi Srinivasu (MBBS, DNB, MCh), Palli Shirin (MBBS, MS, MCh, DrNB), Kommu Vijay Babu (MBBS, MS, MCh)


Background: Proximal phalanx fractures are the most frequent hand fractures in children. Although most injuries are treated nonoperatively, outcomes vary according to fracture pattern, anatomical location, degree of displacement, and the presence of associated soft tissue injury. The objective was to evaluate the clinical profile, management patterns, and functional and radiological outcomes of proximal phalanx fractures in children aged 12 years or younger. Materials and Methods: This prospective study was conducted from January 2024 to January 2025 and included 24 children with proximal phalanx fractures. Data were collected on age, sex, mechanism of injury, fracture location and type, associated injuries, treatment modality, and follow-up outcomes. Pain was assessed using the Visual Analog Scale (VAS), functional outcome was evaluated by range of motion, and residual deformity was assessed on follow-up radiographs. Results: The study included 24 children, comprising 16 males and 8 females, with a mean age of 9 years. Sports-related trauma was the most common mechanism of injury, accounting for 58.3% of cases. The base of the proximal phalanx was the most frequently affected site (50.0%). Conservative management with a volar slab was successful in 14 patients, whereas open reduction was required in 1 patient. At a mean follow-up of 9 months, 91.7% of patients were pain-free. Residual deformity was noted in 2 patients. Stiffness and restricted range of motion were primarily observed in children with associated soft tissue injury or extensor tendon involvement. Conclusion: Proximal phalanx fractures in children generally have favorable outcomes with conservative treatment. However, displaced fractures and injuries associated with soft tissue damage may require surgical intervention. Adherence to physiotherapy appears to play an important role in optimizing functional recovery. Keywords: children; proximal phalanx fracture; pediatric hand fractures; conservative management; functional outcome; range of motion; soft tissue injury.

Page No: 48-53 | Full Text


Case Report

DERMATOFIBROSARCOMA PROTUBERANS OF THE ANTERIOR CHEST WALL: A CASE REPORT

http://dx.doi.org/10.70034/ijmedph.2026.16.2.10

Divya, Dinesh Mahalingam, Lakshmi Narayanan Sankar, Chitra Tulasiram


Dermatofibrosarcoma protuberans (DFSP) is a rare, low-to-intermediate grade fibroblastic neoplasm of dermal origin characterized by locally aggressive behavior and a high propensity for recurrence if inadequately excised. It accounts for less than 0.1% of all malignancies and approximately 1% of soft tissue sarcomas. Although metastasis is rare (<5%), the infiltrative growth pattern with tentacle-like projections into surrounding tissue necessitates meticulous surgical management. We report the case of a 36-year-old male who presented with a slowly progressive plaque over the left anterior chest wall for five years. The lesion was initially misdiagnosed as tinea incognito. Clinical examination revealed a well-defined infiltrative plaque measuring 10 × 8 cm with nodular components, confined to the dermis and subcutaneous tissue. Punch biopsy demonstrated spindle cell proliferation arranged in a storiform pattern. Immunohistochemistry showed strong CD34 positivity and S-100 negativity, confirming DFSP. Contrast-enhanced CT scan revealed tumor confinement to superficial planes without deep muscle or bony invasion. The patient underwent wide local excision (WLE) with adequate margins followed by split skin graft reconstruction. Histopathology confirmed tumor-free margins with a low Ki-67 index (6%). Postoperative recovery was uneventful, with excellent graft uptake and no recurrence to date. This case emphasizes the importance of early recognition, histopathological confirmation, immunohistochemistry, and adequate surgical margins in achieving optimal outcomes. Long-term surveillance remains essential due to the risk of late recurrence.

Page No: 54-57 | Full Text


Original Research Article

A PROSPECTIVE OBSERVATIONAL STUDY ON MATERNAL AND PERINATAL OUTCOMES IN SINGLETON PREGNANCIES WITH ABNORMAL PRESENTATIONS IN TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.11

A Viplava, Swathi Rallabhandi, Amritha Aurora Meduri, Paka Sushritha


Aim: To study the maternal and perinatal outcomes in singleton pregnancies with abnormal presentations. Materials and Methods: The study was designed as a single-center prospective observational investigation at the Government Maternity Hospital, Sultan Bazar, focusing on maternal and perinatal outcomes in singleton pregnancies with abnormal presentations. Spanning 24 months, the research included 150 antenatal women from 37 weeks gestation onward, confirmed by clinical and ultrasound examination. Cases involving intrauterine death, congenital anomalies, and preterm births were excluded, and consecutive sampling of eligible participants was used. Data were collected using a semi-structured questionnaire, with thorough recording of demographic and obstetric details, including gravidity, parity, and complications from previous pregnancies. Results: In the present study, the mode of delivery was predominantly Lower Segment Caesarean Section (LSCS), which was necessary in 60% of the cases, reflecting the challenging nature of these pregnancies. Only 30% of the deliveries were normal vaginal births, indicating a tendency towards surgical intervention in complicated cases. Notably, age significantly influenced the mode of delivery: women under 20 years had a higher proportion of vaginal deliveries (55.6%), with LSCS least common (22.2%), a pattern that reversed with increasing age, LSCS dominating (84.3%) in those aged 30 years and above (P < 0.001). The AFI results highlighted varying delivery modes by fluid level, with lower AFI associated with a more balanced mix of delivery methods and higher AFI leading predominantly to LSCS (92.2%) (P < 0.001). However, the type of labor, whether spontaneous or induced, showed no significant difference in delivery mode (P = 0.657), suggesting that other factors have a more substantial impact on the choice of mode of delivery. Conclusion: The outcomes of this study reveal critical areas for potential improvement in perinatal care, particularly in the management of pregnancies complicated by abnormal presentations. The relatively high rates of NICU admission illustrate the gravity of these cases and the need for enhanced prenatal monitoring and intervention strategies. The findings advocate for refined guidelines to assist healthcare providers in choosing the most suitable delivery method, aiming to minimize complications and improve survival rates. Overall, this research contributes valuable insights into the impact of delivery mode on maternal and perinatal health, serving as a basis for future studies and policy-making in obstetric care. Keywords: LSCS, Perinatal care, NICU, AFI, Vaginal births.

Page No: 58-67 | Full Text


Original Research Article

A STUDY ON SERUM ADENOSINE DEAMINASE AS AN INDICATOR OF GLYCAEMIC STATUS IN TYPE 2 DIABETES MELLITUS

http://dx.doi.org/10.70034/ijmedph.2026.2.12

Thammadagoni Alivelu, Deva Mona, Earjala Raju, Shankar Chilumula


Background: Type 2 diabetes mellitus is a common metabolic disorder characterized by high blood glucose levels. Adenosine deaminase (ADA) is an enzyme that catalyses the irreversible deamination of adenosine to inosine. Adenosine is known to exert potent metabolic effects acting through its receptors, and increased serum ADA levels have been reported in individuals with type 2 diabetes mellitus. The study aims to understand the relationship between serum ADA and blood glucose levels. The objectives were to estimate serum adenosine deaminase (ADA) in patients with uncomplicated type 2 diabetes mellitus and in a non-diabetic control group, and to assess the association between serum ADA values and blood sugars among cases and controls. Materials and Methods: The study was conducted at Gandhi Medical College, Secunderabad. It involved 50 cases of uncomplicated type 2 diabetes mellitus and 50 age- and sex-matched healthy controls. Blood and urine samples were collected after obtaining informed written consent from cases and controls. Serum adenosine deaminase levels were estimated colourimetrically, based on the method described by Giusti and Galanti. Serum FBS, PPBS, HbA1c, blood urea, serum creatinine, lipid profile, and urinary sugar and proteins were also measured simultaneously using routine laboratory methods. Results: A clear elevation of serum ADA was found in diabetic subjects as compared to controls. In addition, a very large positive correlation was found between serum ADA and blood glucose levels. Conclusion: This study shows that the serum ADA level is significantly related to glycemic status in type 2 diabetes mellitus. An altered serum ADA level is an indirect expression of tissue adenosine levels, and adenosine, through its multiple metabolic effects, is involved in the pathophysiology of diabetes. The present study highlights the importance of this metabolite and the need for further studies. Keywords: Type 2 diabetes mellitus; Adenosine deaminase; Adenosine.
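
For context, the reaction underlying the assay, the hydrolytic deamination catalysed by ADA, is:

\[ \text{adenosine} + \mathrm{H_2O} \xrightarrow{\ \mathrm{ADA}\ } \text{inosine} + \mathrm{NH_3} \]

In the Giusti and Galanti method cited above, it is the liberated ammonia that is measured colorimetrically (via the Berthelot indophenol reaction), so colour intensity is proportional to ADA activity.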

Page No: 68-71 | Full Text


Systematic Review

AWARENESS AND PRACTICES RELATED TO SMOKING AND ALCOHOL CESSATION PRIOR TO SURGERY: A SYSTEMATIC REVIEW

http://dx.doi.org/10.70034/ijmedph.2026.2.13

Divya Ravikumar, Fikret Ahmadov


Background: Smoking and excessive alcohol consumption are well-established risk factors for adverse postoperative outcomes, including wound complications, pulmonary infections, prolonged hospital stay, and increased morbidity and mortality. Evidence suggests that preoperative cessation of tobacco and alcohol significantly reduces surgical complications and improves recovery. Despite established clinical guidelines recommending cessation at least 4–8 weeks prior to elective surgery, patient awareness, adherence, and implementation of cessation practices remain inconsistent across healthcare settings. However, current evidence on awareness and real-world practices is fragmented, and a comprehensive systematic review is therefore warranted. This systematic review aims to assess the level of awareness and the practices related to smoking and alcohol cessation prior to surgery among patients undergoing surgical procedures. Materials and Methods: This systematic review followed PRISMA guidelines and included English-language observational studies, randomized controlled trials, systematic reviews, meta-analyses, and guideline documents addressing awareness and practices related to smoking and alcohol cessation prior to surgery and postoperative outcomes. Studies were included according to predefined inclusion criteria. Databases including PubMed/MEDLINE, Scopus, and Web of Science were searched using relevant keywords. Study selection and data extraction were performed, the risk of bias was assessed using appropriate design-specific tools, and findings were synthesized narratively. Results: Preoperative smoking and alcohol cessation reduces postoperative complications. Smoking cessation for at least 4 weeks before surgery significantly lowers pulmonary, wound, and overall complications, with greater benefits seen with longer cessation. Intensive cessation programs combining behavioral support and pharmacotherapy improve abstinence rates and may reduce postoperative morbidity. Combined smoking and alcohol use is associated with the highest risk of postoperative complications, readmission, and reoperation. Conclusion: Preoperative smoking and alcohol cessation interventions, particularly those lasting 4–8 weeks and involving behavioral and pharmacological support, improve abstinence rates and reduce postoperative complications, with greater benefits seen with longer cessation. However, further large-scale studies with standardized methods and long-term follow-up are needed to determine optimal intervention timing and long-term outcomes. Keywords: Smoking cessation, Alcohol cessation, Awareness and practices, Surgery, Preoperative care, Postoperative outcomes.

Page No: 72-77 | Full Text


Systematic Review

LASER AND LIGHT-BASED THERAPIES FOR MELASMA: A SYSTEMATIC REVIEW

http://dx.doi.org/10.70034/ijmedph.2026.2.14

Alicia Kivork, Grigor Haryutunyan


Background: Melasma is a chronic facial hyperpigmentation disorder commonly affecting women of reproductive age, driven by ultraviolet and visible light exposure, hormonal factors, and dermal changes, with frequent recurrence. Although hydroquinone-based triple combination therapy remains the first-line treatment, lasers and intense pulsed light (IPL) are increasingly used. This systematic review evaluates the efficacy and safety of laser and light-based therapies for melasma. The objective is to determine the most effective and safest laser modalities, compare outcomes across technologies and protocols, and identify research gaps to inform future studies and guide evidence-based clinical decision-making in melasma management. Materials and Methods: This systematic review was conducted according to PRISMA guidelines. PubMed, Scopus, Web of Science, and Google Scholar were searched using predefined keywords related to melasma and laser/light-based therapies. Randomized controlled trials, observational studies, review articles, and meta-analyses were included. Two reviewers independently screened studies and extracted data. Owing to heterogeneity, results were synthesized qualitatively. Results: Laser and light-based therapies demonstrated variable efficacy in melasma management. Q-switched Neodymium-doped Yttrium Aluminium Garnet (Nd:YAG) lasers showed temporary improvement but were associated with recurrence and pigmentary adverse effects. Fractional lasers and IPL provided moderate benefit, with combination IPL therapies enhancing outcomes. Picosecond lasers, particularly with diffractive lens arrays, showed superior efficacy, better patient satisfaction, and favorable safety profiles. Combination approaches improved long-term outcomes but increased the risk of mild adverse events. Conclusion: Picosecond lasers and combination-based strategies appear to offer the most promising balance of efficacy and safety. However, recurrence and procedure-related pigmentary changes remain concerns, highlighting the need for standardized protocols and larger controlled trials to optimize long-term melasma management. Keywords: Melasma, Laser therapy, Picosecond laser, Q-switched Nd:YAG, Fractional laser, Ablative laser, Non-ablative laser, Intense pulsed light (IPL), Combination therapy.

Page No: 78-83 | Full Text


Original Research Article

RECENT CHANGES IN THE CLINICAL SYMPTOMATOLOGY OF COMMUNITY-ACQUIRED PNEUMONIA IN CHILDREN

http://dx.doi.org/10.70034/ijmedph.2026.2.15

Kabita Ghose, Papiya Mistry, Faiza Ali


Background: Community-acquired pneumonia (CAP) remains one of the most prevalent reasons for hospital admission in children. The clinical manifestation of pediatric community-acquired pneumonia appears to be changing due to widespread immunization, improved nutrition, and shifting respiratory pathogen patterns. Classical features such as high fever and chest indrawing may be less common now; instead, more cases present with wheezing and viral-like symptoms. The objective was to assess the present clinical symptomatology, severity indicators, and outcomes of community-acquired pneumonia in children admitted during a recent one-year timeframe. Materials and Methods: This retrospective observational study encompassed 100 children aged 1 month to 12 years admitted with CAP from February 2025 to January 2026 at a tertiary care hospital. We reviewed demographic information, symptoms, clinical indicators, oxygen saturation, laboratory results, radiographic findings, treatment, and outcomes. Results: Fever (93%) and cough (92%) were the most common presenting symptoms. Tachypnea was present in 76% of cases, although only 24% had chest indrawing. Wheezing was observed in 27% of children, suggesting an increasing wheeze-associated pneumonia phenotype. Hypoxia (SpO₂ <92%) occurred in 29% of cases. Complicated pneumonia was seen in 12% of patients, and 9% required intensive care. Viral or atypical causes were suspected in over half of the cases based on clinical characteristics and laboratory results. Conclusion: The clinical manifestation of pediatric CAP is evolving towards diminished classical bacterial characteristics and an increased prevalence of wheeze-dominant and viral-like presentations. These findings underscore the need to modify diagnostic and therapeutic strategies to align with current disease trends, thereby preventing overtreatment while facilitating the prompt recognition of severe cases. Keywords: Community-acquired pneumonia; pediatric population; clinical manifestations; wheezing; hypoxia; pediatric respiratory diseases.

Page No: 84-87 | Full Text


Original Research Article

COMPARISON OF SHORT-TERM CLINICAL OUTCOMES OF FIXED LOOP VERSUS ADJUSTABLE LOOP AS FEMORAL CORTICAL SUSPENSION DEVICES IN ARTHROSCOPIC ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION

http://dx.doi.org/10.70034/ijmedph.2026.2.16

Amit Garud, Nishant Gaonkar, Vaibhav Koli


Background: Various methods of femoral graft fixation are available, among which cortical loop fixation devices are most commonly used. This study aims to compare fixed loop and adjustable loop devices in terms of clinical outcomes. Materials and Methods: In this prospective randomized study, patients were divided into two groups based on the type of femoral fixation device used. Clinical outcomes were assessed at 6 months and 1 year using the International Knee Documentation Committee (IKDC) and Lysholm scores. Results: Both groups were comparable in terms of demographic characteristics and preoperative scores. Postoperatively, no statistically significant difference was observed between the two groups at 6 months or at 1 year. Conclusion: Arthroscopic ACL reconstruction yields comparable short-term clinical outcomes when either fixed loop or adjustable loop devices are used for femoral cortical fixation. Keywords: Fixed loop device, Adjustable loop device, Arthroscopic anterior cruciate ligament reconstruction.

Page No: 88-91 | Full Text


Original Research Article

A RETROSPECTIVE STUDY OF CONSERVATIVE MANAGEMENT IN SOLID ORGAN INJURY IN BLUNT ABDOMINAL TRAUMA WITH HEMOPERITONEUM AT A TERTIARY CARE CENTRE IN SOUTH GUJARAT

http://dx.doi.org/10.70034/ijmedph.2026.2.17

Bhavna P. Kala, Pavankumar M. Khunt, Aishwarya S. Champaneria


Background: Blunt abdominal trauma frequently results in solid organ injury with hemoperitoneum. In hemodynamically stable patients, non-operative management has increasingly become the preferred approach. This study aimed to evaluate the outcomes of conservative management and assess the utility of the Clinical Abdominal Scoring System (CASS) in such cases. Materials and Methods: A retrospective observational study was conducted at a tertiary care centre in South Gujarat from 2024 to 2026. A total of 50 hemodynamically stable patients with radiologically confirmed solid organ injury and hemoperitoneum were included. Data regarding demographics, mode of injury, clinical presentation, vital parameters, organ involvement, and CASS were collected from medical records and analyzed using descriptive statistics. Results: Among the 50 patients, 86% were males. The most affected age group was 18–25 years (36%). Road traffic accidents were the leading cause (60%). Most patients (68%) presented within 2 hours of injury. Abdominal pain was present in all cases, while vomiting and loss of consciousness were noted in 18% and 12%, respectively. The majority had a pulse rate <90/min (70%) and a systolic blood pressure between 90 and 120 mmHg (80%). The liver was the most commonly injured organ (62%), followed by the spleen (26%), kidney (8%), and combined injuries (4%). Most patients had CASS scores between 8 and 12 (56%), while 44% had scores <8. All patients were managed conservatively, with no requirement for operative intervention. Conclusion: Conservative management is highly effective in hemodynamically stable patients with solid organ injury. CASS is a valuable tool for clinical assessment and management planning. Keywords: Blunt abdominal trauma, hemoperitoneum, conservative management, solid organ injury, clinical abdominal scoring system.

Page No: 92-96 | Full Text


Original Research Article

RESISTANCE AND VIRULENCE IN CA-MRSA: CLINICAL AND MICROBIOLOGICAL SPECTRUM OF COMMUNITY-ACQUIRED SSTIs

http://dx.doi.org/10.70034/ijmedph.2026.2.18

Khushbu Sakure, Umesh Hassani, Vilas Thombare, Sunil Deshmukh, Ravindra Kadse


Background: Skin and soft tissue infections (SSTIs) are a major clinical burden, with community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) increasingly implicated. These strains often combine resistance determinants with virulence factors such as Panton-Valentine leukocidin (PVL), complicating management. The aim is to investigate the clinical profile, microbiological spectrum, genotypic characteristics, and antibiotic resistance patterns of CA-SSTIs, with emphasis on CA-MRSA. Materials and Methods: A prospective cross-sectional study was conducted on 350 clinically diagnosed CA-SSTI cases over two years. Samples were processed for culture, phenotypic identification, antibiotic susceptibility testing, and PCR-based genotypic analysis. Results: The mean patient age was 37.9 years, with a slight male predominance (56%). Abscesses (46%) and cellulitis (34%) were the most common presentations. Culture positivity was 61%, with S. aureus as the predominant isolate (43%). MRSA accounted for 16% of cases. Genotypic analysis revealed mecA positivity in 86% and PVL gene presence in 79% of MRSA isolates. Antibiogram showed high resistance to ciprofloxacin (59%), gentamicin (59%), and erythromycin (53%), moderate resistance to clindamycin (33%) and tetracycline (34%), while all isolates remained sensitive to linezolid and vancomycin. Conclusion: CA-MRSA strains in this cohort demonstrated both multidrug resistance and high PVL prevalence, underscoring their dual threat of resistance and virulence. Linezolid and vancomycin remain reliable therapeutic options, while clindamycin may retain partial utility. Continuous molecular surveillance and antibiotic stewardship are essential to guide rational therapy and curb the spread of resistant, virulent clones. Keywords: Community-acquired SSTIs, MRSA.

Page No: 97-101 | Full Text


Original Research Article

CHIKUNGUNYA VIRUS INFECTION IN GONDIA, A TRIBAL DISTRICT IN MAHARASHTRA, INDIA: A DEMOGRAPHICAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.19

Khushbu S Sakure, Vivek Patil, Ravindra Khadse, Sunil Deshmukh


Background: Chikungunya virus infection continues to pose a significant public health challenge in India, with recurrent outbreaks and marked seasonal variation. District-level laboratory surveillance is essential to understand local transmission dynamics and to guide targeted vector control measures. The objective was to describe the demographic, temporal, and spatial distribution of laboratory-confirmed chikungunya cases in Gondia district, Maharashtra, during 2024–2025 and to compare the month-wise trend with dengue during the same period. Materials and Methods: This study was conducted at the Department of Microbiology, Government Medical College, Gondia, which serves as a sentinel centre for vector-borne disease surveillance. Serum samples received from clinically suspected chikungunya cases during January 2024 to December 2025 were tested by chikungunya IgM-capture ELISA. Data on age, sex, month of reporting, and taluka of residence were analysed. Descriptive statistics were used to assess trends and distribution patterns. Results: A total of 6,507 samples were tested using IgM ELISA during the study period, of which 181 were laboratory confirmed for chikungunya virus infection, giving an overall positivity of 2.8%. The positivity rate was higher in 2024 than in 2025. Males constituted 52% of cases and females 48%, indicating a near-equal sex distribution. The majority of infections were observed among adults, particularly in the 21–30-year age group, followed by the 31–40-year group. Month-wise analysis demonstrated distinct seasonal peaks, with a major increase during the monsoon and post-monsoon period and a smaller rise during the early months of the year. Spatial analysis revealed marked clustering of cases in Gondia taluka (98 cases, 54.1%), followed by Sadak Arjuni, Goregaon, and Amgaon, while comparatively fewer cases were reported from peripheral talukas. Conclusion: The present laboratory-based surveillance highlights clear spatio-temporal clustering and a predominance of chikungunya infection among the adult population in Gondia district, with transmission peaks corresponding to favourable vector breeding seasons. The concentration of cases in specific talukas underscores the need for focused vector control and strengthened surveillance at the sub-district level. Continuous laboratory surveillance remains crucial for early detection of transmission trends and for guiding district-specific public health interventions. Keywords: Chikungunya, surveillance, seasonality, Maharashtra, dengue.

Page No: 102-106 | Full Text


Original Research Article

A STUDY ON INCIDENCE OF POST MASTECTOMY PAIN AND PHANTOM BREAST SYNDROME FOLLOWING MASTECTOMY

http://dx.doi.org/10.70034/ijmedph.2026.2.20

David Salivendra, Ponnuru Chandi Priya, Jyeshavath Venkateswara Naik, Blessy Rani Medikonda


Background: Mastectomy is a commonly performed surgical procedure for the treatment of breast cancer. Although it is effective in controlling the disease, many patients experience postoperative complications such as post-mastectomy pain syndrome and phantom breast syndrome. These conditions may lead to persistent pain, discomfort, and psychological distress, thereby affecting patients' overall quality of life. The aim was to determine the incidence of post-mastectomy pain and phantom breast syndrome in patients undergoing mastectomy for carcinoma of the breast. Materials and Methods: This observational study was conducted in the Department of General Surgery over a period of 2 years. A total of 100 patients who underwent mastectomy were included in the study. Patients above 35 years of age who underwent simple mastectomy or modified radical mastectomy were enrolled after obtaining informed consent. Patients were evaluated during follow-up visits approximately 8 to 12 weeks after surgery. A detailed clinical examination and a standardized questionnaire were used to assess the presence of postoperative pain and phantom breast symptoms. Data were entered in Microsoft Excel and analyzed using Epi Info and SPSS software. Results: Among the 100 patients studied, 67% underwent modified radical mastectomy and 33% underwent simple mastectomy. Post-mastectomy chest wall pain was observed in 21% of patients, while 11% experienced arm pain. Phantom breast pain was reported in 4% of patients, and phantom breast sensations in 6%. Most patients (66%) did not report any significant psychological impact following surgery. Conclusion: Post-mastectomy pain and phantom breast syndrome remain important postoperative complications, particularly following modified radical mastectomy. Early recognition, appropriate patient counseling, and improved surgical techniques may help reduce these complications and improve postoperative quality of life. Keywords: Breast cancer, Mastectomy, Post-mastectomy pain syndrome, Phantom breast pain, Phantom breast sensation.

Page No: 107-111 | Full Text


Original Research Article

PREVALENCE AND DETERMINANTS OF VACCINE HESITANCY AMONG ADULTS IN URBAN POPULATION

http://dx.doi.org/10.70034/ijmedph.2026.2.21

Samrat Prodhan, Suranjana Sarkar, Suman Kumar Roy, Arnab Ghosal, Ayan Ghosh


Background: Vaccine hesitancy has emerged as a major public health concern, even in urban populations with better access to healthcare services. It is influenced by multiple sociodemographic and behavioral factors, affecting vaccine uptake and the success of immunization programs. This study aimed to assess the prevalence and determinants of vaccine hesitancy among adults in an urban population. Materials and Methods: A community-based cross-sectional study was conducted among 351 adults in the urban field practice area of a tertiary care center in Kalyani, West Bengal. Participants were selected using multistage sampling. Data were collected using a pretested semi-structured questionnaire. Vaccine hesitancy was assessed based on the WHO SAGE framework. Statistical analysis included descriptive statistics, the Chi-square test, and logistic regression. Results: The prevalence of vaccine hesitancy was 28.8%. Higher hesitancy was observed among younger adults, males, individuals with lower education, and those of lower socioeconomic status. Fear of side effects (57.4%), social media misinformation (46.5%), and doubts about vaccine efficacy (40.6%) were the most common reasons. Multivariate analysis revealed that primary education (AOR = 2.48) and lower socioeconomic status (AOR = 2.31) were significant independent predictors of vaccine hesitancy. Conclusion: Vaccine hesitancy remains a significant challenge in urban populations, driven by educational, socioeconomic, and informational factors. Targeted health education, improved communication strategies, and efforts to counter misinformation are essential to enhance vaccine acceptance and strengthen immunization programs. Keywords: Vaccine hesitancy; Socioeconomic factors; Health education; Immunization.

Page No: 112-117 | Full Text


Original Research Article

ASSOCIATION OF ELEVATED FIRST-TRIMESTER SERUM URIC ACID LEVELS WITH THE DEVELOPMENT OF GESTATIONAL DIABETES MELLITUS: A PROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.22

Mandala Jyothi, Rekha M


Background: Gestational diabetes mellitus (GDM) is one of the most common metabolic complications of pregnancy, leading to significant maternal and fetal morbidity. Early detection and intervention can substantially reduce adverse outcomes. Uric acid, the final product of purine metabolism, has been associated with insulin resistance, endothelial dysfunction, and oxidative stress. The present study was undertaken to evaluate whether elevated first-trimester serum uric acid levels could serve as an early biochemical marker for predicting GDM. Materials and Methods: This prospective observational study was conducted in the Department of Obstetrics and Gynecology, St. Peter's Medical College, Hospital and Research Institute, Krishnagiri, from March 2025 to February 2026. A total of 150 antenatal women with gestational age <14 weeks were enrolled. Serum uric acid and fasting blood sugar were estimated at recruitment. Women were later screened for GDM between 24–28 weeks using a 75-gram oral glucose tolerance test (OGTT) according to IADPSG/DIPSI criteria. Statistical analysis was performed using SPSS (version __). ROC-curve analysis was used to determine the optimal uric acid cut-off for predicting GDM. Results: The mean age of participants was 23.8 ± 2.4 years, and the mean BMI was 22.0 ± 2.2 kg/m². The incidence of GDM was 14% (21 of 150). Women with serum uric acid >3.6 mg/dL in the first trimester had a significantly higher risk of developing GDM (P < 0.01). ROC analysis showed an AUC of 0.90 (SE = 0.05) with 91% sensitivity and 98% specificity, indicating excellent predictive performance. Serum uric acid correlated positively with GDM (r = 0.42, P < 0.01), independent of BMI and parity. Conclusion: An elevated first-trimester serum uric acid level (>3.6 mg/dL) is a strong and independent predictor of subsequent GDM. Because uric acid testing is inexpensive, simple, and widely available, it may be incorporated as an early screening tool in antenatal care to identify high-risk women well before routine glucose testing. Larger multicentric studies are recommended to validate its predictive accuracy and integrate it into early pregnancy screening protocols. Keywords: Gestational diabetes mellitus, uric acid, first trimester, insulin resistance, early prediction.
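
For context, the two screening criteria named above differ in design (these are the standard published thresholds, quoted here for convenience rather than restated in the abstract): DIPSI diagnoses GDM from a single 2-hour plasma glucose ≥140 mg/dL after a 75-g glucose load given irrespective of the last meal, whereas the IADPSG criteria use a fasting 75-g OGTT and diagnose GDM if any one value is met: fasting ≥92 mg/dL, 1-hour ≥180 mg/dL, or 2-hour ≥153 mg/dL.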

Page No: 118-123 | Full Text


Original Research Article

STUDY OF HISTOMORPHOLOGICAL PATTERNS OF DCIS ASSOCIATED WITH INVASIVE DUCTAL CARCINOMA

http://dx.doi.org/10.70034/ijmedph.2026.2.23

Muchalapuri Sravani, Nunna Kiranmai, Akuthota Chaitanya


Background: Ductal carcinoma in situ (DCIS) frequently coexists with infiltrating duct carcinoma (IDC) of the breast and represents the non-invasive precursor component within the spectrum of breast carcinogenesis. Evaluating the histomorphological patterns of DCIS occurring alongside IDC provides important insights into tumor biology, aggressiveness, and potential prognostic implications. The aim was to study the histomorphological patterns of DCIS associated with IDC. Materials and Methods: A cross-sectional, descriptive study was conducted in the Department of Pathology, SVS Medical College, Mahbubnagar, over a period of one year, from September 2024 to September 2025. Results: DCIS was identified in the majority of IDC cases, with solid being the most common architectural pattern, followed by comedo and cribriform patterns. Mixed patterns were frequent, accounting for a substantial proportion of cases in which more than one subtype of DCIS coexisted. High-grade DCIS showed a strong association with comedo necrosis and was more commonly present in tumors of higher histological grade. Cases with comedo or mixed architectural patterns demonstrated an increased frequency of necrosis and a higher likelihood of lymphovascular invasion in the invasive component. No significant difference was observed in age distribution or tumor size in relation to DCIS subtype. Conclusion: DCIS accompanying IDC displays diverse architectural patterns, with solid and comedo types being most prevalent. High-grade nuclear features and comedo necrosis are more frequently associated with aggressive invasive carcinoma characteristics. Understanding the histomorphological spectrum of DCIS in association with IDC is essential, as it may provide prognostic information and help refine therapeutic decision-making. Keywords: Ductal carcinoma in situ (DCIS), Infiltrating duct carcinoma (IDC).

Page No: 124-129 | Full Text


Original Research Article

CLINICAL CORRELATION BETWEEN UNILATERAL PTERYGIUM AND DRY EYE

http://dx.doi.org/10.70034/ijmedph.2026.2.24

Balusupati Aalekhya, Farha Jabeen, Udayasree G, Mohammed Ather


Background: The aim was to study dry eye in cases of unilateral pterygium. The objectives were to determine the incidence of dry eye among patients with unilateral pterygium and to find the clinical correlation between dry eye and unilateral pterygium. Results: This study demonstrates a significant association between pterygium and dry eye disease, as evidenced by measurable impairments in tear film stability and production. Using objective clinical parameters (Tear Break-Up Time [TBUT], Tear Meniscus Height [TMH], and Schirmer's Tests I and II), we found that patients with pterygium consistently exhibited reduced values across all indicators when compared to controls. Among these, TBUT showed the strongest correlation with both the presence and severity of pterygium, underscoring the progressive nature of tear film instability as pterygium advances in grade. While TMH and Schirmer's test values were also significantly lower in pterygium patients, their association with pterygium severity was less pronounced, likely due to sample size limitations and variability in tear secretion patterns. Nevertheless, these findings align with existing literature suggesting that pterygium contributes to both aqueous tear deficiency and tear film instability, key factors in the pathogenesis of dry eye disease. Conclusion: The present study highlights the clinical importance of routine screening for dry eye symptoms in patients with pterygium, particularly in moderate to severe cases. Early identification and management of tear film dysfunction may alleviate symptoms, enhance visual quality, and potentially reduce postoperative recurrence following pterygium excision. Future studies with larger cohorts and standardized diagnostic criteria are recommended to further validate and refine the association between pterygium and dry eye disease. Keywords: Pterygium, Dry eye, TBUT, TMH, Schirmer's test.

Page No: 130-136 | Full Text


Original Research Article

A CASE CONTROL STUDY TO FIND OUT CHILD FEEDING PRACTICES RESPONSIBLE FOR SEVERE ACUTE MALNUTRITION AMONG CHILDREN UNDER FIVE YEARS OF AGE AT VARANASI

http://dx.doi.org/10.70034/ijmedph.2026.2.25

Rituparna Ray


Background: Severe Acute Malnutrition (SAM) remains a major public health problem among under-five children in developing countries. Inappropriate child feeding practices play a crucial role in the development of malnutrition during early childhood. The aim was to identify and evaluate child feeding practices associated with Severe Acute Malnutrition among children under five years of age. Materials and Methods: This was a hospital-based case-control study conducted at the Nutritional Rehabilitation Centre of Pandit Deendayal Upadhyay Hospital, Pandeypur, Varanasi, over a period of 8 months, from May 2025 to December 2025. Results: Analysis of breastfeeding practices among the study participants revealed significant differences between cases and controls. Among the 50 cases with Severe Acute Malnutrition (SAM), only 20 (40%) children received early initiation of breastfeeding within one hour of birth, compared to 40 (80%) of the 50 controls (p<0.001). Exclusive breastfeeding for the first six months was observed in 18 (36%) cases and 35 (70%) controls (p<0.001), showing a significant protective effect of exclusive breastfeeding. Partial breastfeeding was more common among cases, reported in 25 (50%) children versus 12 (24%) controls (p=0.001). Children who were not breastfed at all included 7 (14%) cases and 3 (6%) controls, but this difference was not statistically significant (p=0.18). Conclusion: Suboptimal child feeding practices are major determinants of Severe Acute Malnutrition. Strengthening awareness and education regarding appropriate infant and young child feeding practices is essential to reduce the burden of malnutrition among under-five children. Keywords: Severe Acute Malnutrition, Under-five Children, Child Feeding Practices, Case-Control Study, Breastfeeding, Complementary Feeding, Dietary Diversity, Public Health.

Page No: 137-141 | Full Text

 

Original Research Article

CLINICAL SPECTRUM AND OUTCOMES OF ULTRASOUND-GUIDED INTERVENTIONS IN A TERTIARY CARE CENTER

http://dx.doi.org/10.70034/ijmedph.2026.2.26

Devidas Dahiphale, Shivaji Pole, Asmita Suryawanshi, Yash Tank

View Abstract

Background: Ultrasound (USG)-guided interventions have revolutionized minimally invasive diagnostic and therapeutic procedures by enabling real-time visualization, improving accuracy, and reducing complication rates. USG guidance improves overall clinical results, diagnostic yield, and procedural safety compared to blind procedures. The purpose of this study was to assess the complication profile, technical success, clinical spectrum, and diagnostic adequacy of USG-guided interventions carried out in a tertiary care hospital. Materials and Methods: This retrospective observational study was conducted in the Department of Radiology at MGM Medical College, Chhatrapati Sambhajinagar, Maharashtra, over a one-month period (01 July 2025 to 31 July 2025). It included 122 patients who underwent various USG-guided diagnostic and therapeutic procedures. Hospital records were used to gather information on demographics, procedure type, technical success, diagnostic yield, and complications. Outcomes were evaluated in terms of complication rates, diagnostic sample adequacy, and procedural success. Results: Among 122 patients, 55.7% were males and 44.3% were females, with an age range of 1–82 years. Pleural tapping was the most commonly performed procedure (32.8%), followed by thyroid FNAC (13.9%), breast biopsy (9.8%), and lymph node FNAC (9.0%). Technical success was achieved in 99.2% of cases, with a diagnostic yield of 95.1%. No complications were observed in 93.5% of patients. Minor complications occurred in 6.5% of cases, predominantly localized pain (62.5%) and minor bleeding (37.5%). No major complications or mortality were reported. Conclusion: USG-guided procedures demonstrate high technical success, superior diagnostic reliability, and low complication rates. These results indicate that they are safe, efficient, and standard practice in tertiary healthcare settings. Keywords: Ultrasound-guided interventions, Minimally invasive procedures, Diagnostic yield, Technical success rate, Complication profile.

Page No: 142-148 | Full Text

 

Original Research Article

SURGICAL OUTCOMES OF FLEXOR HALLUCIS LONGUS AND PERONEUS BREVIS TENDON TRANSFERS IN NEGLECTED ACHILLES TENDON RUPTURES

http://dx.doi.org/10.70034/ijmedph.2026.2.27

Veluri Atchuta Ramaiah, Harish Kodi, Kada Venika, Padigapati Venkata Abhilash Reddy

View Abstract

Background: Neglected Achilles tendon ruptures, defined as untreated for over 3 weeks, lead to tendon retraction, muscle atrophy, and functional deficits such as weak plantar flexion and gait instability. Surgical reconstruction via tendon transfers—flexor hallucis longus (FHL) or peroneus brevis (PB)—is standard when primary repair fails, but comparative outcomes remain underexplored. Materials and Methods: This prospective comparative study at a tertiary orthopedic department included 25 patients (13 FHL in Group A, 12 PB in Group B; aged 30-60 years) with chronic closed ruptures. FHL transfers followed the modified Wapner technique; PB transfers used standard posterior/lateral harvest. Postoperatively, patients received equinus casting for 6 weeks, then progressive weight-bearing and physiotherapy. Outcomes at 6 months used the Quigley scale, Leppilahti score, and AOFAS hindfoot-ankle score; complications were recorded. Results: Groups were balanced in age, sex, and injury-to-surgery time (3-5 weeks: 12 total; >5 weeks: 13 total). Group A showed superior outcomes: Quigley excellent (6 vs 2), Leppilahti 91-100 (8 vs 2), AOFAS 91-100 (6 vs 2); all Group A patients had good/excellent results vs 11/12 in Group B. Complications also favored Group A: superficial infections (2 vs 4) and no neurological issues (vs 3 in Group B); there were no reruptures. Conclusion: FHL transfer provides superior functional scores and fewer complications than PB transfer for neglected Achilles ruptures, positioning it as the preferred option. Keywords: Achilles tendon rupture, flexor hallucis longus transfer, peroneus brevis transfer, tendon reconstruction, Quigley scale, AOFAS score.

Page No: 149-155 | Full Text

 

Original Research Article

A STUDY ON THE PREVALENCE OF LOW T3 SYNDROME IN HEART FAILURE WITH REDUCED EJECTION FRACTION AT NORTHERN RAILWAY CENTRAL HOSPITAL, NEW DELHI

http://dx.doi.org/10.70034/ijmedph.2026.2.28

Gopika Krishnan B G, Celestina Dungdung, Sachin Madaan, Sanjay Joshi, Atul Gupta, Madhu Kaushal

View Abstract

Background: Heart failure (HF) is associated with high morbidity and mortality and is often accompanied by endocrine abnormalities, including thyroid dysfunction. Low T3 syndrome (LT3S), defined by reduced free triiodothyronine (FT3) with normal thyroid stimulating hormone (TSH) and free thyroxine (FT4), has been linked to adverse outcomes in HF. This study aimed to determine the prevalence of low T3 syndrome in patients with heart failure with reduced ejection fraction (HFrEF) and its association with clinical outcomes. Materials and Methods: This prospective observational study was conducted at Northern Railway Central Hospital, New Delhi over 18 months from July 2023 to December 2024. A total of 200 patients with HFrEF (LVEF <40%) were enrolled. Baseline clinical evaluation, thyroid function tests, and echocardiography were performed. Patients were categorized into low T3 syndrome and normal T3 groups and followed for six months to assess mortality, hospitalization, and major adverse cardiovascular events (MACE). Results: The mean age was 64.95 ± 12.99 years and 57.5% were males. Low T3 syndrome was observed in 40% of patients. Mortality at six months was significantly higher in the low T3 group compared to the normal T3 group (52.5% vs 16.67%, p<0.05). Hospitalization rates were also higher in the low T3 group (71.25% vs 49.17%, p<0.05). However, no significant association was found between low T3 syndrome and individual cardiovascular events. Conclusion: Low T3 syndrome is common in patients with HFrEF and is associated with increased mortality and hospitalization. Assessment of thyroid function may help identify high-risk patients. Keywords: Heart failure, Low T3 syndrome, HFrEF, Thyroid dysfunction, Mortality.

Page No: 156-160 | Full Text

 

Original Research Article

A COMPARATIVE STUDY OF INCIDENCE OF INCISIONAL HERNIA FOLLOWING EMERGENCY AND ELECTIVE SURGERIES

http://dx.doi.org/10.70034/ijmedph.2026.2.29

Ashfa Neelofer Mohammed, Ashwini Todewale

View Abstract

Background: The aim was to compare the occurrence of incisional hernia following emergency and elective surgeries at Shadan Institute of Medical Sciences. Objectives: 1) To compare the incidence of incisional hernia in emergency and elective surgeries. 2) To analyse the factors contributing to the development of incisional hernia in patients who have undergone elective and emergency surgeries at Shadan Institute of Medical Sciences. Materials and Methods: This was a case-control study conducted at the Department of General Surgery, Shadan Institute of Medical Sciences, Peerancheru, Hyderabad. It included patients admitted to the surgical wards of Shadan Institute of Medical Sciences with incisional hernia whose primary surgery had been performed within the preceding 5 years, and patients who had undergone emergency or elective surgery within the preceding 5 years. Results: In the present study, incisional hernia was more common after emergency surgeries than after elective surgeries. The significance of age as a risk factor for incisional hernia could not be determined in the present study; a further study with a greater sample size of affected as well as control groups is required to establish a causal relation. Incisional hernia was more common in males than in females, but male sex was not a risk factor for the development of incisional hernia. Post-operative wound infections, diabetes mellitus, smoking, BMI>25, and COPD can be considered risk factors in the development of incisional hernia. Conclusion: The present study concluded that the incidence of incisional hernia can be decreased by following proper suturing techniques, using appropriate suture material, observing sterile methods preoperatively, and achieving optimum glycaemic control. Keywords: Incisional Hernia, BMI, COPD, Glycemic control, Smoking.

Page No: 161-168 | Full Text

 

Original Research Article

DIAGNOSIS OF PLACENTA ACCRETA SPECTRUM IN POST-CAESAREAN PREGNANCY: A COMPARATIVE STUDY OF ULTRASOUND AND MAGNETIC RESONANCE IMAGING

http://dx.doi.org/10.70034/ijmedph.2026.2.30

Thendral G, Laishram Trinity Meetei, Laishram Deepak Kumar, Keisham Miranda Devi, Sheral Raina Tauro, Tamphasana Maimom

View Abstract

Background: Placenta accreta spectrum (PAS) is a life-threatening obstetric condition associated with abnormal placental invasion and rising caesarean section rates. Early diagnosis is essential for optimal management. This study evaluated the diagnostic performance of ultrasonography (USG) and magnetic resonance imaging (MRI) in post-caesarean pregnancies. Materials and Methods: A cross-sectional study was conducted over two years at RIMS, Imphal, including 84 pregnant women with prior caesarean section. All underwent USG, and MRI was performed in suspected cases. Histopathology was the gold standard. Data were analyzed using SPSS with chi-square test, t-test, and ROC analysis. Results: Most participants were aged 18–27 years (54%) with a mean age of 33.23 ± 5.91 years. Higher gravidity and multiple LSCS were significantly associated with PAS (p < 0.05). MRI showed higher sensitivity (65.3%) and specificity (77.3%) compared to USG (55.2% and 63.6%). ROC analysis indicated moderate diagnostic accuracy (MRI AUC = 0.614; USG AUC = 0.682). Conclusion: USG and MRI are effective for PAS diagnosis, with MRI showing better diagnostic performance. Multiparity and prior LSCS significantly increase PAS risk, highlighting the need for early detection and multidisciplinary management. Keywords: Placenta accreta spectrum, ultrasonography, MRI, caesarean section, multiparity, diagnostic accuracy.

Page No: 169-174 | Full Text

 

Original Research Article

COMPARATIVE STUDY OF EFFICACY, MOTOR, SENSORY BLOCKADE AND DURATION OF BLOCKADE OF INTRATHECAL HYPERBARIC LEVOBUPIVACAINE WITH FENTANYL AND HYPERBARIC ROPIVACAINE WITH FENTANYL IN LOWER ABDOMINAL AND LOWER EXTREMITY SURGERY

http://dx.doi.org/10.70034/ijmedph.2026.2.31

K. Deepa, Mangesh Suresh Gore, Anil Parde, Kadali Lakshmi Sudha

View Abstract

Background: Spinal anesthesia is widely used for lower abdominal and lower extremity surgeries due to its rapid onset, reliable sensory and motor blockade, and favorable safety profile. Newer local anesthetics such as levobupivacaine and ropivacaine are increasingly preferred because of their lower cardiotoxicity and neurotoxicity compared to bupivacaine. The addition of opioids like fentanyl as an adjuvant enhances the quality and duration of spinal anesthesia. Aim: To determine and compare the characteristics of subarachnoid block induced by hyperbaric levobupivacaine 0.5% with fentanyl and hyperbaric ropivacaine 0.75% with fentanyl in patients undergoing lower abdominal and lower extremity surgeries. Materials and Methods: This comparative study included 80 patients who were randomly divided into two groups of 40 each. Group L received intrathecal hyperbaric levobupivacaine 0.5% with fentanyl, while Group R received hyperbaric ropivacaine 0.75% with fentanyl. The onset and level of sensory blockade, duration of sensory and motor blockade, motor block intensity using the Modified Bromage Scale, duration of analgesia, hemodynamic parameters, and adverse effects were recorded and analyzed. Results: The mean age of patients was comparable between the groups (p = 0.63). Group L demonstrated significantly longer duration of sensory blockade (187 ± 16 min vs 134 ± 13.1 min, p < 0.001), longer motor block duration (145 ± 15.6 min vs 109 ± 9.44 min, p < 0.001), and prolonged analgesia compared to Group R. The onset of sensory block was faster in Group R (8.55 ± 2.80 min) than in Group L (10.7 ± 3.57 min, p = 0.004). Hemodynamic parameters were largely comparable between groups. Conclusion: Both drug combinations provided effective spinal anesthesia; however, levobupivacaine with fentanyl produced longer sensory and motor blockade, while ropivacaine with fentanyl allowed earlier recovery, making it suitable for shorter procedures. Keywords: Spinal anesthesia, Levobupivacaine, Ropivacaine, Fentanyl, Sensory blockade, Motor blockade.

Page No: 175-180 | Full Text

 

Original Research Article

COMPARATIVE STUDY OF EFFICACY BETWEEN FOLEY’S INDUCTION WITH MISOPROSTOL AND MIFEPRISTONE WITH MISOPROSTOL IN SECOND TRIMESTER ABORTION

http://dx.doi.org/10.70034/ijmedph.2026.2.32

Rekha M, Mandala Jyothi

View Abstract

Background: Termination of pregnancy in the second trimester is associated with increased maternal risks. The MTP Act, 1971 (amended in 2020) defines the ideal place and person who can conduct MTP. The safest, low-cost method of inducing MTP is preferred. This study was conducted to evaluate the efficacy and outcome of two medical methods of MTP, i.e., use of Foley’s bulb with misoprostol versus mifepristone with misoprostol. Materials and Methods: This observational study was conducted in the Department of Obstetrics and Gynaecology, St. Peter’s Medical College, Hospital and Research Institute, Krishnagiri, over a 12-month period. A total of 150 patients in their second trimester eligible for MTP were included in this study. Patients were divided into 2 groups of 75 members each. Group A was induced with Foley’s bulb and misoprostol, and Group B was induced with mifepristone and misoprostol. Results: Most of the patients belonged to the 20-30 years age group, and 86% of the study population was parous. The majority of the study patients were married and belonged to a lower socioeconomic class. Most of the patients were at 17-20 weeks of gestational age. Request for MTP was the most common indication. The required number of misoprostol doses decreased significantly with increasing parity. The induction to abortion time interval was lower for Group B, and patients in Group B had a 100% success rate with minimal complications. Conclusion: Although Foley’s bulb with misoprostol is low cost, it is inferior in efficacy to mifepristone with misoprostol. Keywords: Abortion, second trimester, Foley’s, misoprostol, mifepristone.

Page No: 181-185 | Full Text

 

Original Research Article

EVALUATING COMPLIANCE WITH THE HOUR-1 SEPSIS BUNDLE AND THE IMPACT OF A RAPID RESPONSE INTERVENTION IN SUSPECTED SEPSIS PATIENTS - A PROSPECTIVE INTERVENTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.33

Chodarapu Vedasri, Vijaya Kumar Punnapu, Yasoda Devi Kakaraparthi

View Abstract

Background: Sepsis remains a leading cause of preventable mortality worldwide, particularly in resource-limited settings where delayed recognition and inconsistent adherence to evidence-based guidelines are common. The Surviving Sepsis Campaign recommends implementation of the Hour-1 Sepsis Bundle to improve outcomes. However, compliance remains suboptimal in many tertiary-care hospitals. Objective: To evaluate compliance with the Hour-1 Sepsis Bundle in suspected sepsis patients and to assess the impact of a simple Rapid Response Card intervention on improving bundle adherence. Materials and Methods: This prospective interventional study was conducted over two months (October–December 2025) in the Department of General Medicine and ICUs of a tertiary-care teaching hospital in Visakhapatnam. Ninety-two adult patients with newly diagnosed sepsis (46 pre-intervention and 46 post-intervention) were included. Baseline compliance with individual Hour-1 bundle components—blood cultures, intravenous (IV) antibiotics, IV fluids, and vasopressors (when indicated)—was recorded. Serum lactate measurement was excluded due to non-availability. During the intervention phase, a bedside Sepsis Rapid Response Card was introduced to prompt timely action. Compliance rates and patient outcomes were compared using the Chi-square test, with p < 0.05 considered statistically significant. Results: IV antibiotic compliance improved significantly from 34.8% pre-intervention to 84.8% post-intervention (χ²=21.88, p<0.001). Complete Hour-1 bundle compliance increased from 13% to 32.6% (χ²=3.95, p=0.047). Blood culture collection improved from 15.2% to 30.4%, and IV fluid administration increased from 63% to 73.9%, though these changes were not statistically significant. Mortality decreased from 21.7% in the pre-intervention group to 13% post-intervention, representing a clinically meaningful reduction. Discharge rates increased from 71.7% to 82.6%. Conclusion: Implementation of a low-cost bedside Rapid Response Card significantly improved compliance with key components of the Hour-1 Sepsis Bundle, particularly timely IV antibiotic administration and overall bundle completion. Although full compliance remains suboptimal, structured, simple quality-improvement interventions can enhance early sepsis management and potentially improve clinical outcomes in resource-constrained settings. Keywords: Sepsis, Hour-1 bundle, Rapid response intervention, Antibiotic compliance, Quality improvement, Tertiary care hospital.
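The reported chi-square for IV antibiotic compliance can be reproduced from the percentages alone: 34.8% of 46 patients is 16 pre-intervention and 84.8% of 46 is 39 post-intervention. A minimal sketch, assuming scipy's chi2_contingency, whose default Yates continuity correction for 2x2 tables yields the quoted value:

    from scipy.stats import chi2_contingency

    # Rows: pre- vs post-intervention; columns: IV antibiotics within 1 hour yes / no.
    table = [[16, 30],   # pre-intervention: 16 of 46 compliant (34.8%)
             [39, 7]]    # post-intervention: 39 of 46 compliant (84.8%)

    chi2, p, dof, _ = chi2_contingency(table)  # Yates-corrected for 2x2 by default
    print(f"chi2 = {chi2:.2f}, p = {p:.2e}")   # chi2 = 21.88, p < 0.001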

Page No: 186-194 | Full Text

 

Original Research Article

CARDIOVASCULAR RISK FACTOR BURDEN IN PATIENTS UNDERGOING CORONARY ARTERY BYPASS GRAFTING: A CLINICAL ANALYSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.34

Neha Sharma, Lokendra Sharma, Dhruva Sharma, Neelima Sharma, Preksha Sharma, Uma Advani

View Abstract

Background: Coronary artery disease (CAD) remains one of the leading causes of morbidity and mortality worldwide. In patients with severe or advanced disease, coronary artery bypass grafting (CABG) is an established surgical procedure that helps restore adequate blood flow to the heart. Understanding the demographic characteristics and cardiovascular risk factor profile of patients undergoing CABG is important for improving clinical management and long-term outcomes. The objective is to evaluate the demographic profile, clinical parameters, and major cardiovascular risk factors among patients undergoing coronary artery bypass grafting. Materials and Methods: This observational descriptive study included 200 patients who underwent CABG at a tertiary care center. Data on demographic variables, clinical parameters, and comorbid conditions such as hypertension, diabetes mellitus, dyslipidemia, chronic kidney disease, and heart failure were collected. Continuous variables were expressed as mean ± standard deviation, while categorical variables were presented as frequency and percentage. Chi-square test, independent sample t-test and Pearson correlation were applied where appropriate, with a p-value <0.05 considered statistically significant. Results: The mean age of patients was 61.8 ± 10.7 years, and males constituted the majority of the study population (78%). Hypertension was the most common risk factor (55%), followed by diabetes mellitus (33.5%), hypercholesterolemia (24%), hypertriglyceridemia (21%), heart failure (18%), and chronic kidney disease (13.5%). A significant positive correlation was observed between age and systolic blood pressure (r = 0.28, p = 0.002) as well as between age and blood glucose levels (r = 0.21, p = 0.01). Conclusion: Patients undergoing CABG were predominantly older males with a high burden of cardiovascular risk factors, particularly hypertension and diabetes. Early identification and effective management of these modifiable risk factors may help reduce disease progression and improve clinical outcomes. Keywords: Coronary artery disease, Coronary artery bypass grafting, Cardiovascular risk factors, Hypertension, Diabetes mellitus, Dyslipidemia.

Page No: 195-199 | Full Text

 

Original Research Article

COMPARATIVE STUDY OF CLINICAL AND BIOCHEMICAL PROFILES IN ALCOHOLIC VS. NON-ALCOHOLIC STEATOHEPATITIS-RELATED CIRRHOSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.35

Aman Arora, Shweta Arya

View Abstract

Background: Differentiating Alcoholic Steatohepatitis (ASH) from Non-Alcoholic Steatohepatitis (NASH) is challenging. This study evaluates the diagnostic utility of MRI, focusing on perivascular branching heterogeneity, in distinguishing ASH from NASH. Materials and Methods: This retrospective cohort study included 90 MRI exams from 60 NASH and 30 ASH patients, with both MRI and liver biopsy performed within 13 months. MRI findings were independently scored by two radiologists, and intraclass correlation coefficients (ICC) and receiver operating characteristic (ROC) analysis were used to assess diagnostic accuracy. Results: The mean age of NASH and ASH patients was 60.5 ± 9.38 and 54.1 ± 11.48 years, respectively (p=0.012). Perivascular branching heterogeneity was observed in 63% of ASH patients and 30% of NASH patients (Reader 1). The ICC was 0.69 (0.46–0.82), and ROC analysis showed an area under the curve (AUC) of 0.69 for Reader 1 and 0.72 for Reader 2. The positive predictive value (PPV) for perivascular branching was 65% (Reader 1) and 67% (Reader 2). Conclusion: MRI, particularly perivascular branching heterogeneity, can aid in differentiating ASH from NASH. However, the moderate diagnostic accuracy suggests it should be used alongside clinical and biochemical data for more reliable diagnosis. Further studies with larger cohorts and advanced imaging are needed. Keywords: Steatohepatitis, Alcoholic Liver Disease, Non-Alcoholic Steatohepatitis (NASH), Liver Cirrhosis, Biochemical Profile, Clinical Profile.

Page No: 200-204 | Full Text

 

Original Research Article

RADIOLOGICAL AND LABORATORY CORRELATION OF PLASMA LEAKAGE IN DENGUE INFECTION USING CHEST AND ABDOMINAL ULTRASONOGRAPHY

http://dx.doi.org/10.70034/ijmedph.2026.2.36

Shweta Arya, Aman Arora

View Abstract

Background: Dengue Fever is a rapidly spreading vector-borne illness in tropical and subtropical regions. Plasma leakage due to increased vascular permeability is a key feature of severe dengue and may lead to complications such as pleural effusion, ascites, and shock. Early identification of plasma leakage is essential for timely clinical management. Ultrasonography of the chest and abdomen has emerged as a useful non-invasive modality for detecting early radiological evidence of plasma leakage. The objective is to evaluate the radiological and laboratory correlation of plasma leakage in dengue infection using chest and abdominal ultrasonography. Materials and Methods: This hospital-based observational study included 100 patients with serologically confirmed dengue infection. Demographic details, clinical findings, and laboratory parameters such as platelet count and hematocrit levels were recorded. All patients underwent thoracoabdominal ultrasonography to detect radiological signs of plasma leakage, including pleural effusion, ascites, gallbladder wall thickening, and pericardial effusion. The ultrasonographic findings were correlated with laboratory parameters. Statistical analysis was performed using appropriate tests, and a p-value <0.05 was considered statistically significant. Results: The mean age of the patients was 34.6 ± 11.2 years, with a male predominance (58% males). Ultrasonographic findings suggestive of plasma leakage were observed in 54% of patients. The most common findings were gallbladder wall thickening (38%), followed by ascites (32%) and pleural effusion (28%). Patients with ultrasonographic evidence of plasma leakage had significantly lower platelet counts (58,300 ± 18,200 cells/mm³) and higher hematocrit levels (44.2 ± 4.9%) compared to those without plasma leakage (p < 0.001). Conclusion: Chest and abdominal ultrasonography is a valuable, non-invasive tool for the early detection of plasma leakage in dengue infection. When combined with laboratory parameters, it can help identify patients at risk of severe disease and support timely clinical management. Keywords: Dengue fever, Plasma leakage, Ultrasonography, Pleural effusion, Hematocrit.

Page No: 205-210 | Full Text

 

Original Research Article

PREVALENCE, RISK FACTORS AND CLINICAL PROFILE OF CLOSTRIDIOIDES DIFFICILE INFECTION IN ANTIBIOTIC-ASSOCIATED DIARRHOEA: A PROSPECTIVE OBSERVATIONAL STUDY FROM NORTHERN KERALA

http://dx.doi.org/10.70034/ijmedph.2026.2.37

Bonitta Rachel Abraham, Sajaad Manzoor, Tinu Abraham Kuruvilla

View Abstract

Background: Clostridioides difficile infection (CDI) is a major cause of antibiotic-associated diarrhoea (AAD), particularly in elderly hospitalized patients. Indian data on CDI from semi-urban regions remain limited. Objectives: To determine the prevalence of CDI among patients with antibiotic-associated diarrhoea and to identify associated risk factors and clinical characteristics. Materials and Methods: This prospective observational study was conducted in a tertiary care hospital in Northern Kerala during a period of 15 months. A total of 125 hospitalized adult patients with diarrhoea following antibiotic exposure were included. Stool samples were tested for glutamate dehydrogenase antigen and Clostridioides difficile toxins A and B using a rapid immunoassay. Statistical analysis was performed using SPSS version 20.0. Results: The prevalence of CDI was 17.6%. CDI was more common among males (68.2%), with a mean age of 73.9 ± 10.02 years. Significant associations were observed with advanced age, exposure to broad-spectrum antibiotics—particularly glycopeptides, fluoroquinolones, carbapenems and lincosamides—acid suppressive therapy, steroid use and Ryle’s tube feeding. Both metronidazole and vancomycin were associated with favourable clinical outcomes. Conclusion: CDI is a significant cause of antibiotic-associated diarrhoea among elderly hospitalized patients. Early identification and judicious antibiotic use are essential to reduce morbidity. Keywords: Clostridioides difficile, antibiotic-associated diarrhoea, CDI, risk factors, India.

Page No: 211-214 | Full Text

 

Original Research Article

A STUDY ON PERCEIVED BARRIERS TO HEALTHY EATING IN HIGH SCHOOL ADOLESCENTS IN URBAN SRIKAKULAM

http://dx.doi.org/10.70034/ijmedph.2026.2.38

K. Sree Pardhvi Sarma, Y. Neelima

View Abstract

Background: Healthy eating in childhood is essential for growth and prevention of long-term non-communicable diseases. Children face unique barriers shaped by taste preferences, peers, family habits, and food marketing. Understanding these barriers helps in designing child-centred nutrition interventions. Objectives: • To identify the barriers to healthy eating in urban high school adolescents in Srikakulam. • To give evidence-based recommendations regarding healthy eating. Materials and Methods: A descriptive cross-sectional study was conducted over 3 months among 550 high school students in urban Srikakulam. Using multistage sampling, five schools were selected randomly and students were chosen using simple random sampling. Data were collected through a self-administered questionnaire on sociodemographic factors and perceived barriers. Data were analysed in Excel and Epi Info using descriptive statistics. Results: The majority of the participants (73.6%) were in the age group of 12-14 years and 60.4% of them were males. Guidance from parents on healthy food choices (mean score = 4.28), guidance from teachers on nutrition and healthy food choices (mean score = 3.93), consumption of at least 3 meals a day (mean score = 3.88), complexity in understanding information on food labels (mean score = 3.73), and easy availability of unhealthy foods (samosas, chips) (mean score = 3.62) were found to be the main perceived barriers. Conclusion: Innovative behaviour change strategies are needed to inculcate healthy eating habits with engagement of parents and teachers. The school curriculum needs to be modified to impart nutritional knowledge as well as motivation for behaviour change. Keywords: Adolescents, Healthy eating, Perceived barriers.

Page No: 215-220 | Full Text

 

Case Series

EVALUATING THE SAFETY OF INTRACRANIAL CAROTID ARTERY STENTING: OUR EXPERIENCE

http://dx.doi.org/10.70034/ijmedph.2026.2.39

Nidhi Roy, Ajay Sharma, Nakul Rathore, Akshay Pol, Abu Shahma, Pandurang Barve, D.K. Tyagi

View Abstract

Intracranial atherosclerotic disease is a major cause of ischemic stroke in Asian populations, and while aggressive medical therapy remains the standard of care, a subset of patients with severe intracranial stenosis continues to experience recurrent ischemic events. This retrospective case series evaluates the safety and short-term outcomes of intracranial carotid artery stenting in five medically refractory patients with cavernous ICA stenosis, including four patients with right-sided disease presenting with recurrent TIAs and one patient with left-sided stenosis presenting with persistent headache and dizziness. All patients underwent detailed clinical and angiographic evaluation, followed by endovascular treatment using balloon angioplasty and/or intracranial stenting under standardized institutional protocols. The degree of stenosis ranged from 68% to 90%, and technical success was achieved in all cases with satisfactory luminal expansion and restoration of antegrade flow. No peri-procedural complications, including stroke, haemorrhage, or death, were observed. All patients demonstrated complete resolution of presenting ischemic or haemodynamic symptoms and remained free of recurrent transient ischemic attacks or stroke during 3–6 months of follow-up. These findings suggest that intracranial carotid artery stenting can be performed safely in carefully selected, medically refractory patients when undertaken in experienced centres, although larger prospective studies with longer follow-up are required to define its long-term role. Keywords: Intracranial atherosclerotic disease, Cavernous internal carotid artery, Intracranial carotid artery stenting, Endovascular treatment, Ischemic stroke.

Page No: 221-228 | Full Text

 

Original Research Article

IMPACT OF DIETARY AND LIFESTYLE MODIFICATION ON NUTRITIONAL STATUS IN CHILDREN WITH HEMOPHILIA- A LONGITUDINAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.40

C Karthikeyan, V Vahini, Dhiya P. Reji

View Abstract

Background: Hemophilia is a hereditary bleeding disorder characterized by recurrent bleeding episodes, particularly into joints and muscles, leading to chronic arthropathy and functional limitation. With improved survival due to factor replacement therapy and prophylaxis, long-term issues such as nutritional status, physical activity, and quality of life have gained increasing importance. Children with hemophilia are vulnerable to both undernutrition and overweight/obesity due to reduced mobility, recurrent pain, fear of physical activity, and socioeconomic factors. Addressing dietary habits and lifestyle practices may play a crucial role in improving overall health outcomes in this population. Aim: To evaluate the impact of dietary and lifestyle modification on nutritional status among children with hemophilia in a tertiary care center. Materials and Methods: This longitudinal study included 60 children aged 0–18 years with hemophilia A or B registered at the Integrated Centre for Hemophilia and Hemoglobinopathy, Coimbatore Medical College Hospital. The study was conducted over 6 months (October 2024 to March 2025). Baseline anthropometric measurements were recorded, and nutritional status was assessed using WHO and IAP growth charts as appropriate for age. Individualized dietary counseling and lifestyle modifications, including physiotherapy-based strengthening and range-of-motion exercises, were provided. Follow-up assessments were conducted at the end of 6 months to evaluate changes in nutritional status. Clinical profile, treatment pattern, joint health score (HJHS), and social impact were also analyzed. Results: At baseline, 13.3% of children were overweight, 13.3% were underweight, and 11.7% were obese. After intervention, overweight reduced by 62.5% (p=0.001) and underweight reduced by 75% (p=0.036), while obesity remained unchanged (p=0.699). Nutritional abnormalities were more common in older children. The majority had severe hemophilia (75%) and were on prophylaxis (63.3%). Joint health improved in 60.5% of children receiving prophylaxis. Functional limitations and school absenteeism were common. Conclusion: Dietary and lifestyle modification significantly improved undernutrition and overweight among children with hemophilia, though obesity persisted. Integrating structured nutritional and physiotherapy interventions into comprehensive hemophilia care may enhance functional outcomes and overall well-being. Keywords: Hemophilia, Nutritional status, Dietary modification, Lifestyle intervention, Children.

Page No: 229-235 | Full Text

 

Original Research Article

SERUM VITAMIN D3 AND IGF-1 AS COMPOSITE MARKERS OF DISEASE SEVERITY AND BONE METABOLIC DYSFUNCTION IN LIVER CIRRHOSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.41

Kamlesh Kumar Sharma, Vandana Sharma, Sandeep Nijhawan, Sandhya Mishra

View Abstract

Background: Liver cirrhosis is associated with multiple endocrine abnormalities. Vitamin D3 and Insulin-like Growth Factor-1 (IGF-1) are both metabolised in the liver and are known to play key roles in bone metabolism. Their individual deficiency has been documented in cirrhosis, but their combined utility as composite markers of disease severity and bone metabolic dysfunction remains underexplored. Materials and Methods: A case-control study was conducted at SMS Medical College and Hospitals, Jaipur. Fifty diagnosed cirrhotic patients were enrolled as cases and 50 age- and sex-matched healthy subjects as controls. Serum Vitamin D3 and IGF-1 levels were measured and correlated with Child-Turcotte-Pugh (CTP) class and Model for End-Stage Liver Disease (MELD) score. Results: Serum Vitamin D3 was deficient in 84% of cirrhotic patients, with mean levels of 18.30±14.38 ng/ml vs 40.85±23.63 ng/ml in controls (p<0.001). Serum IGF-1 was low in 58% of patients (68.16±60.14 vs 182.62±129.41 ng/ml; p<0.001). Both parameters declined progressively across CTP classes A, B and C: Vitamin D3 (89.2 → 23.65 → 7.56 ng/ml) and IGF-1 (128.67 → 81.68 → 43.36 ng/ml). Vitamin D3 showed the strongest negative correlation with disease severity (r = -0.803 with CTP; r = -0.816 with MELD). The composite deficiency of both markers was significantly associated with advanced disease (CTP Class C and MELD ≥15). Conclusion: Serum Vitamin D3 and IGF-1 are significantly depleted in liver cirrhosis, and their levels correlate strongly with disease severity. The combined assessment of these two markers offers a clinically practical, non-invasive index of hepatic-bone metabolic dysfunction that tracks CTP and MELD progression. Routine supplementation and monitoring of Vitamin D3 and IGF-1 should be considered an integral part of cirrhosis management. Keywords: Liver Cirrhosis, Vitamin D3, IGF-1, CTP Score, MELD Score, Bone Metabolic Dysfunction, Disease Severity.
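The r values quoted above are Pearson correlation coefficients between marker levels and severity scores. A minimal sketch of how such a coefficient is computed, using invented illustrative values rather than the study's data (scipy assumed):

    from scipy.stats import pearsonr

    # Illustrative paired observations (not the study's data): MELD scores and
    # serum Vitamin D3 (ng/ml) for six hypothetical cirrhotic patients.
    meld      = [9, 12, 15, 18, 22, 28]
    vitamin_d = [38.0, 25.5, 19.2, 12.8, 9.1, 6.0]

    r, p = pearsonr(meld, vitamin_d)
    print(f"r = {r:.3f}, p = {p:.4f}")  # strongly negative, mirroring the reported trend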

Page No: 236-240 | Full Text

 

Original Research Article

BURDEN OF NON-COMMUNICABLE DISEASE RISK FACTORS AND THEIR ASSOCIATION WITH LIFESTYLE PRACTICES AMONG ADULT POPULATION IN THE FIELD PRACTICE AREAS OF SINDHUDURG DISTRICT

http://dx.doi.org/10.70034/ijmedph.2026.2.42

Sachin Sharma, Santosh Gadadnavar

View Abstract

Background: Non-communicable diseases (NCDs) are the leading cause of morbidity and mortality in India, largely driven by modifiable behavioral and lifestyle risk factors. Early identification of these risk factors at the community level is essential for effective prevention and control. The present study was conducted to assess the burden of major NCD risk factors and their association with lifestyle practices among adults in the field practice areas of Sindhudurg district, Maharashtra. Objectives: 1. To estimate the prevalence of selected NCD risk factors among the adult population. 2. To assess lifestyle practices related to diet, physical activity, tobacco, and alcohol use. 3. To determine the association between lifestyle practices and NCD risk factors. Materials and Methods: A community-based cross-sectional study was conducted from January to June 2026 in the field practice areas of SSPM Medical College, including RHTC Pandur (population 3,250), RHTC Kasal (population 4,399), and UHTC Kudal (population 23,600), covering service areas of RH Malvan, SDH Kankavli, and RH Kudal. Adults aged ≥18 years residing in the study area for at least one year were included using multistage sampling. Data were collected using a pretested structured questionnaire based on the WHO STEPwise approach. Information on socio-demographic profile and lifestyle practices was obtained. Anthropometric measurements and blood pressure were recorded using standard protocols. Data were analyzed using appropriate descriptive statistics and Chi-square test, with p<0.05 considered statistically significant. Results: A total of 600 adults participated in the study. The prevalence of tobacco use, alcohol consumption, physical inactivity, and inadequate fruit and vegetable intake was 32.5%, 21.3%, 46.8%, and 72.4%, respectively. Overweight/obesity (BMI ≥25 kg/m²) was observed in 28.7% of participants, while 24.5% had hypertension and 11.2% reported known diabetes. Significant associations were found between physical inactivity and overweight/obesity (p<0.001), tobacco use and hypertension (p=0.02), and unhealthy diet with both obesity and diabetes (p<0.05). Clustering of two or more risk factors was observed in 41.6% of the study population. Conclusion: A high burden of behavioral and metabolic NCD risk factors was observed among adults in the field practice areas of Sindhudurg district, with significant associations between unhealthy lifestyle practices and major risk conditions. Community-based lifestyle modification strategies, regular screening, and strengthened primary health care interventions are urgently needed to reduce the future burden of NCDs. Keywords: Non-communicable diseases, Risk factors, Lifestyle practices, WHO STEPS, Community-based study, Hypertension, Obesity.

Page No: 241-246 | Full Text

 

Original Research Article

PREDICTIVE FACTORS AND DEVELOPMENT OF A NOMOGRAM FOR POSTOPERATIVE PANCREATIC FISTULA FOLLOWING PANCREATIC RESECTION: A PROSPECTIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.43

Haritha Gorantla, Sanjay Kumar Gurjar, Vimal Kumar Mittal, Sitaram Yadav, Niranjan Kumar Gandhi, Sindhura Bukka

View Abstract

Background: Postoperative pancreatic fistula (POPF) remains a major contributor to morbidity following pancreatic resections. Identifying reliable predictors is essential for risk stratification and improved perioperative management. This study aimed to evaluate risk factors associated with POPF and develop a predictive nomogram. Materials and Methods: A prospective study was conducted on 140 patients undergoing pancreatic resection between June 2023 and November 2024 at a tertiary care center. Preoperative, intraoperative, and postoperative variables were analyzed. Logistic regression analysis was used to identify predictors of POPF. A nomogram was constructed and validated using receiver operating characteristic (ROC) analysis. Results: The overall incidence of POPF was 15.7%. No significant association was observed with most clinicopathological variables. Significant predictors included higher body mass index (BMI), increased postoperative white blood cell count, soft pancreatic texture, smaller pancreatic duct size, greater postoperative albumin decrease, and elevated drain amylase on postoperative day 1. The nomogram demonstrated excellent discrimination with an area under the curve (AUC) of 0.97. Conclusions: The proposed nomogram provides an effective tool for individualized prediction of POPF risk and may assist in optimizing perioperative strategies in pancreatic surgery. Keywords: Pancreatic fistula; Pancreatic resection; Risk factors; Nomogram; Albumin; Drain amylase.
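A nomogram of this kind is essentially a graphical rendering of a fitted multivariable logistic regression model. A minimal sketch of the underlying workflow on synthetic data, with predictor names borrowed from the abstract but all values and coefficients invented for illustration (numpy and scikit-learn assumed):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 140  # same cohort size as the study; the data themselves are synthetic

    # Synthetic predictors named after those the study found significant.
    bmi           = rng.normal(25, 4, n)
    duct_size_mm  = rng.normal(4, 1.5, n)
    drain_amylase = rng.lognormal(6, 1, n)
    X = np.column_stack([bmi, duct_size_mm, drain_amylase])

    # Synthetic outcome: higher BMI/amylase and a smaller duct raise POPF risk.
    logit = 0.15*(bmi - 25) - 0.8*(duct_size_mm - 4) + 0.002*(drain_amylase - 400) - 1.7
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"apparent AUC = {auc:.2f}")  # the study reports 0.97 on its own cohort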

Page No: 247-253 | Full Text

 

Original Research Article

PREVALENCE, PATTERN AND DETERMINANTS OF DRUG-RESISTANT TUBERCULOSIS AMONG ADULT TB PATIENTS REGISTERED AT SELECTED TUBERCULOSIS UNITS IN JHANSI DISTRICT

http://dx.doi.org/10.70034/ijmedph.2026.2.44

Vimal Arya, Mahendra Chouksey, Bharti

View Abstract

Background: Drug-resistant tuberculosis (DR-TB) poses a major challenge to tuberculosis control in India. Evidence regarding the burden and determinants of DR-TB at the sub-district level is limited. Objectives: To estimate the prevalence and identify the socio-clinical determinants of drug-resistant tuberculosis among adult patients with TB registered at selected tuberculosis units in the Jhansi district. Materials and Methods: A retrospective record-based analytical cross-sectional study was conducted among 300 adult patients with TB, registered at the Mauranipur and Bangra Tuberculosis Units between May 2024 and May 2025. Data regarding sociodemographic characteristics, clinical history, and drug susceptibility testing were extracted from the Nikshay portal records. Associations between determinants and DR-TB were assessed using the chi-square test and odds ratio, with p < 0.05 considered statistically significant. Results: The overall prevalence of drug-resistant tuberculosis was 26% (78/300). DR-TB was significantly higher among males (30.5%) compared to females (15.6%) (p < 0.001). Patients aged >50 years had a higher prevalence (32.9%) than younger patients (23.9%) (p = 0.028). Previous TB treatment was strongly associated with DR-TB (55% vs. 6.7%; OR = 17.5, p < 0.001). HIV infection and a history of smoking were also significantly associated with drug resistance. No significant differences were observed between the Mauranipur and Bangra tuberculosis units regarding the major determinants. Conclusion: A considerable burden of drug-resistant tuberculosis was observed among patients with TB in the study area. Previous TB treatment, HIV infection, smoking, and older age were important determinants. Strengthening early detection, adherence to treatment, and targeted interventions for high-risk groups are essential to control DR-TB in the region. Keywords: Drug-resistant TB, MDR-TB, Jhansi, Risk factors, NTEP.
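The odds ratio for previous TB treatment can be checked from the quoted proportions alone (DR-TB in 55% of previously treated vs 6.7% of new patients). A minimal sketch in plain Python; the small gap to the reported 17.5 reflects rounding of the percentages:

    def odds(p):
        """Convert a proportion to odds."""
        return p / (1 - p)

    p_previously_treated = 0.55    # DR-TB prevalence among previously treated patients
    p_new_patients       = 0.067   # DR-TB prevalence among patients without prior treatment

    odds_ratio = odds(p_previously_treated) / odds(p_new_patients)
    print(f"OR = {odds_ratio:.1f}")  # ~17.0 vs the reported 17.5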

Page No: 254-258 | Full Text

 

Original Research Article

BONE MARROW INVOLVEMENT IN NON-HODGKIN LYMPHOMA AT A TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.45

Akula Anitha, C. Aruna, Puppala Srilatha, Akarsh, Annapoorna Sirisha, Triveni Bhopal

View Abstract

Background: Non-Hodgkin’s lymphomas (NHLs) represent a heterogeneous group of disorders that arise from the lymphoid system. NHL occurs in individuals at virtually all ages, but it is uncommon in children. Low-grade NHLs (follicular lymphoma and chronic lymphocytic leukemia) are more common in the USA, whereas Diffuse Large B cell Lymphoma (DLBCL) is most common in India. Bone marrow aspiration and biopsy are an important component of the staging workup in NHL. Materials and Methods: This was a prospective study conducted over a period of 24 months. A total of 50 cases of NHL were studied, with bone marrow correlation. CD3 and CD20 IHC was done to differentiate between B and T cell lymphoma. Results: In the present study, out of 50 cases, 22 had both bone marrow aspiration and bone marrow biopsy. Of these 22 cases, BMB showed positive infiltration in 16 (72.7%), while the remaining 6 (27.3%) were negative for infiltration and showed a normal bone marrow study. Of the 50 cases of NHL, there were 49 cases of B cell NHL, constituting 98%, and only 1 case of T cell NHL. Of the 49 cases of B-cell NHL, the major subtype was chronic lymphocytic lymphoma. Conclusion: Bone marrow aspiration is a simple and rapid procedure that serves as an alternative to biopsy, supported by its easier and earlier diagnostic availability in the study of NHL. B cell lymphomas were more frequent than T cell lymphomas, with a predominance of chronic lymphocytic lymphoma and paratrabecular infiltration in the present study. Keywords: Non Hodgkin lymphoma, Bone marrow aspiration and Bone marrow biopsy.

Page No: 259-265 | Full Text

 

Original Research Article

A PROSPECTIVE OBSERVATIONAL STUDY IN CHILDREN BETWEEN AGE GROUP OF 1 MONTH TO 12 YEARS IN MEASUREMENT OF SERUM FERRITIN, LIVER ENZYMES - SERUM GLUTAMIC OXALOACETIC TRANSAMINASE (SGOT), SERUM GLUTAMIC PYRUVIC TRANSAMINASE (SGPT) AS A PROGNOSTIC INDICATOR OF SEVERE DENGUE IN CHILDREN ADMITTED IN TERTIARY CARE CENTRE

http://dx.doi.org/10.70034/ijmedph.2026.2.46

Sakthipriya A.R., V. Vahini, Karthikeyan C

View Abstract

Background: Dengue is a major pediatric public health problem in tropical countries, with clinical presentation ranging from dengue with warning signs to severe dengue with shock and organ dysfunction. Early identification of children at risk for severe dengue is essential for timely monitoring and management. Serum ferritin and liver enzymes such as serum glutamic oxaloacetic transaminase (SGOT/AST) and serum glutamic pyruvic transaminase (SGPT/ALT) may serve as useful prognostic biomarkers, but their serial evaluation in hospitalized children remains clinically relevant. The aim is to assess the prognostic value of serum ferritin, SGOT, and SGPT in predicting severe dengue among children aged 1 month to 12 years admitted to a tertiary care centre. Materials and Methods: This was a prospective observational study conducted over 1 year in a tertiary care centre, including 86 children aged 1 month to 12 years with dengue (as per WHO 2023 criteria) and/or NS1Ag/IgM positivity. Children with severe anemia, alternate causes of thrombocytopenia, other confirmed febrile serologies, and chronic transfusion-dependent hemolytic anemia were excluded. Clinical history, examination findings, and investigations including CBC, ferritin, liver function tests, serum albumin, IgM dengue, USG abdomen, and chest X-ray were recorded. Serum ferritin, SGOT, SGPT, and albumin were measured on Day 1 and Day 3 of admission. Data were analyzed to compare dengue with warning signs (DWS) and severe dengue, and ROC analysis was performed to determine diagnostic cut-offs. Results: Of 86 children, 66 (76.7%) had DWS and 20 (23.3%) had severe dengue; no child had dengue without warning signs. DSS constituted 18 (20.93%) and DHF 2 (2.33%) cases. Altered sensorium (p=0.001), tender hepatomegaly (p=0.027), and clinical fluid accumulation (p=0.022) were significantly associated with severe dengue. Mean ferritin levels were significantly higher in severe dengue on Day 1 (719.46 ± 245.13 vs 378.66 ± 163.53 ng/mL; p=0.001) and Day 3 (567.71 ± 148.95 vs 433.29 ± 135.72 ng/mL; p=0.006). SGOT and SGPT were also significantly elevated in severe dengue on both days (p<0.05). On ROC analysis, Day 1 ferritin showed the best performance (cut-off 536 ng/mL, sensitivity 95.0%, specificity 78.5%, Youden index 0.735), outperforming SGOT, SGPT, and albumin. Conclusion: Serum ferritin is a strong prognostic biomarker for severe dengue in children, with SGOT and SGPT providing additional supportive value. Serial measurement of ferritin and liver enzymes, along with clinical assessment, can help in early risk stratification and close monitoring of pediatric dengue cases in tertiary care settings. Keywords: Dengue; Severe dengue; Serum ferritin; SGOT; SGPT.
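The Day 1 ferritin cut-off of 536 ng/mL is the threshold that maximises the Youden index (sensitivity + specificity - 1) on the ROC curve. A minimal sketch of that selection procedure on synthetic values, not the study's measurements (numpy and scikit-learn assumed):

    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(1)

    # Synthetic Day-1 ferritin values (ng/mL), shaped like the reported group means.
    ferritin_dws    = rng.normal(380, 160, 66)  # dengue with warning signs
    ferritin_severe = rng.normal(720, 245, 20)  # severe dengue

    y_true  = np.r_[np.zeros(66), np.ones(20)]
    y_score = np.r_[ferritin_dws, ferritin_severe]

    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    best = np.argmax(tpr - fpr)  # Youden index J = sensitivity + specificity - 1
    print(f"cut-off = {thresholds[best]:.0f} ng/mL, "
          f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")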

Page No: 266-273 | Full Text

 

Original Research Article

EFFICACY OF FRACTIONAL CO2 LASER VS COMBINATION OF FRACTIONAL CO2 LASER WITH 50% TCA IN THE TREATMENT OF ATROPHIC ACNE SCARS - A COMPARATIVE STUDY IN A TERTIARY CARE CENTER

http://dx.doi.org/10.70034/ijmedph.2026.2.47

Chaduvula Jahnavi, Kalagarla Sravani, Penugonda N S Sulakshmi Raajitha, Pasagadugula Venkata Krishna Rao, Pemmaraju Ramana Murty

View Abstract

Background: Acne, the most common skin disease, is a disorder of the pilosebaceous units that mainly affects adolescents. Scarring is one of the main sequelae of acne and causes a significant psychosocial burden on affected individuals. Acne scars are classified into atrophic and hypertrophic scars, atrophic scars being the more common of the two. Even though many cosmetic procedures are available for the treatment of acne scars individually, the review of literature revealed a very meagre number of studies pertaining to combination procedures. Hence the present study of combination therapy was taken up to evaluate the advantage of combined therapy over monotherapy in the treatment of atrophic acne scars. Materials and Methods: Patients attending the out-patient department of DVL from January 2024 to June 2025 presenting with facial atrophic acne scars were considered for this study. Written informed consent was obtained. Details of symptoms, duration, site involved, type of scars, skin type, and any evolving dermatosis were recorded. Photographs were taken before and after completion of treatment. Qualitative grading was done before and after treatment. Patients meeting the inclusion criteria were randomly categorized into two treatment groups. Group A included patients on Fractional CO2 laser alone. Group B included patients on Fractional CO2 laser plus 50% TCA CROSS. Results: Forty patients were recruited after applying the inclusion and exclusion criteria. The results were assessed based on three parameters: reduction in Goodman and Baron grade, patient assessment, and patient satisfaction. There was a statistically significant difference in Goodman and Baron grade reduction and patient satisfaction between the two groups. Conclusion: Fractional CO2 laser gives the best results for superficial boxcar and rolling scars. TCA CROSS gives the best results for ice pick and deep boxcar scars. Fractional CO2 laser followed by 50% TCA CROSS sequentially as a combination therapy gave excellent results with immense patient satisfaction. Keywords: Atrophic acne scars, Fractional CO₂ laser, Trichloroacetic acid (TCA), Combination therapy, Acne scar management, Skin resurfacing, Dermatological procedures.

Page No: 274-279 | Full Text

 

Review Article

AN OVERVIEW OF BURDEN OF DEATH DUE TO CONGENITAL ANOMALIES IN INDIA

http://dx.doi.org/10.70034/ijmedph.2026.2.48

Prerna Chandra, Neha Yadav, Vivek Mishra, Pradip Kharya, Oshin Verma

View Abstract

Background: Congenital anomalies are a significant but often overlooked cause of morbidity and mortality in India. These birth defects, ranging from structural to functional abnormalities, contribute notably to neonatal and under-five deaths, especially in low- and middle-income countries. The objective of this study is to assess the burden, mortality patterns, and regional variations of congenital anomalies in India and to highlight actionable strategies for prevention, early detection, and management. Materials and Methods: This review synthesizes national and regional data from published studies, reports from the World Health Organization, and Government health surveillance documents. Mortality statistics by anomaly type and age group were analyzed using data from 2015 to 2021. Regional prevalence patterns were compiled from hospital-based studies conducted across different Indian states. Results: Among congenital anomalies, the highest mortality was observed in children under five years of age. Congenital heart anomalies were the leading cause, followed by neural tube defects and chromosomal disorders such as Down syndrome. Survival rates vary widely depending on the condition, with Down syndrome showing high survival and congenital heart diseases showing lower survival within the first year of life. Conclusion: Congenital anomalies remain a major public health concern in India. Strengthening antenatal screening, neonatal diagnosis, birth defect surveillance, and health worker training is critical to reducing the associated mortality and long-term disability. Public awareness, integrated care systems, and national-level registries are essential to address the current gaps in prevention and management. Keywords: Congenital anomalies India, Congenital heart defects, Neural tube defects, Birth defect surveillance, Antenatal screening.

Page No: 280-286 | Full Text

 

Original Research Article

ANGIOGRAPHIC SPECTRUM OF INTRACRANIAL VASCULAR PATHOLOGIES ON CEREBRAL DIGITAL SUBTRACTION ANGIOGRAPHY IN A TERTIARY CARE CENTRE

http://dx.doi.org/10.70034/ijmedph.2026.2.49

Ishaan Arora, Shivaji Pole, D. B. Dahiphale, Asmita Pravin Suryawanshi

View Abstract

Background: Cerebrovascular diseases remain a leading cause of morbidity and mortality worldwide. Accurate imaging of intracranial vasculature is essential for diagnosis and management of these conditions. Digital subtraction angiography (DSA) remains the gold standard for evaluating intracranial vascular pathology due to its superior spatial resolution and dynamic assessment of cerebral circulation. Objective: To evaluate the angiographic spectrum of intracranial vascular pathologies detected on cerebral digital subtraction angiography in patients presenting to a tertiary care centre. Materials and Methods: This retrospective observational study included 34 patients who underwent cerebral digital subtraction angiography between January 2025 and June 2025 at a tertiary care centre. Demographic characteristics, clinical indications, and angiographic findings were obtained from archived angiography reports. Angiographic findings were categorized into intracranial arterial stenosis or occlusion, intracranial aneurysm, vertebrobasilar disease, arteriovenous malformation (AVM), and normal angiographic findings. Results: A total of 34 patients were included, comprising 20 males (58.8%) and 14 females (41.2%), with a mean age of 58 years. The most common indication for cerebral DSA was evaluation of ischemic stroke (55.9%), followed by subarachnoid hemorrhage (20.6%). Intracranial arterial stenosis or occlusion was the most common angiographic finding (52.9%), predominantly involving the internal carotid artery and middle cerebral artery. Intracranial aneurysms were detected in 17.6% of cases, vertebrobasilar disease in 14.7%, and arteriovenous malformation in 2.9%. Normal angiographic findings were observed in 11.8% of cases. Conclusion: Cerebral digital subtraction angiography remains an essential imaging modality for evaluation of intracranial vascular diseases. Intracranial arterial stenosis and occlusion were the most frequent angiographic findings in this study, highlighting the importance of DSA in the diagnosis and management of cerebrovascular disorders. Keywords: Digital subtraction angiography, intracranial vascular disease, cerebral angiography, aneurysm, stroke.

Page No: 287-294 | Full Text

 

Original Research Article

COMPARISON OF PROPOFOL VERSUS SEVOFLURANE ON POST-OPERATIVE COGNITIVE DYSFUNCTION IN GERIATRIC PATIENTS UNDERGOING GENERAL ANAESTHESIA

http://dx.doi.org/10.70034/ijmedph.2026.2.50

Jyotirmayee Patel, Kashish Ahuja, Anjali Singh

View Abstract

Background: Postoperative cognitive dysfunction (POCD) is a common complication in geriatric patients undergoing general anesthesia, leading to impaired memory, attention, and executive function. The choice of anesthetic agent may influence the incidence and severity of POCD. The aim is to compare the effects of propofol and sevoflurane on postoperative cognitive dysfunction in geriatric patients undergoing general anesthesia. Materials and Methods: This prospective comparative study included geriatric patients undergoing elective surgeries under general anesthesia. Patients were divided into two groups based on anesthetic technique: propofol-based total intravenous anesthesia and sevoflurane-based inhalational anesthesia. Cognitive function was assessed preoperatively and on postoperative day 7 using MMSE and MoCA scores. Results: Both groups demonstrated a decline in postoperative cognitive scores; however, the reduction was more significant in the sevoflurane group. The incidence of POCD was higher among patients receiving sevoflurane compared to propofol. Propofol was associated with faster recovery, reduced neuroinflammatory impact, and better preservation of cognitive function. Duration of anesthesia and advanced age showed a positive correlation with POCD incidence. Conclusion: Propofol appears to be associated with a lower incidence of postoperative cognitive dysfunction compared to sevoflurane in geriatric patients. Its neuroprotective, anti-inflammatory, and antioxidant properties may contribute to better cognitive outcomes. Careful anesthetic selection may help reduce POCD risk in elderly surgical patients. Keywords: Postoperative cognitive dysfunction, Propofol, Sevoflurane, Geriatric anesthesia, Cognitive function, General anesthesia.

Page No: 295-301 | Full Text

 

Original Research Article

ROLE OF ULTRASOUND AS THE PRIMARY IMAGING MODALITY AND COMPUTED TOMOGRAPHY IN SUSPECTED ACUTE APPENDICITIS: A RETROSPECTIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.51

Sahithi Reddy Almela, Parthasarathi A, Gautham Muthu

View Abstract

Background: Acute appendicitis is a common surgical emergency in which delayed diagnosis increases the risk of perforation and sepsis, whereas inaccurate diagnosis may lead to negative appendicectomy. Imaging plays an important role when clinical findings are atypical. This study assessed the diagnostic performance of ultrasonography (USG) as the primary imaging modality and the additional value of computed tomography (CT) in selected patients with suspected acute appendicitis. Materials and Methods: This retrospective record-based study included 60 patients who underwent appendicectomy for suspected acute appendicitis over a 2-year period. Data were obtained from hospital electronic records, radiology archives, operative notes, and histopathology reports. USG was performed in all patients as the initial imaging test and categorized as positive, equivocal/non-diagnostic, or negative. CT was performed only in patients in whom USG was equivocal or in whom there was a persistent clinical suspicion despite negative USG. Final diagnosis was confirmed by operative findings and/or histopathology. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy were calculated. Results: Of the 60 patients, 34 (56.7%) were males and most patients were aged 11-30 years (56.7%). Appendicitis was confirmed in 52 patients (86.7%), while 8 (13.3%) represented negative appendicectomy. USG was positive in 38 (63.3%), equivocal/non-diagnostic in 10 (16.7%), and negative in 12 (20.0%) patients. When equivocal studies were considered non-positive, USG showed sensitivity of 69.2%, specificity of 75.0%, PPV of 94.7%, NPV of 27.3%, and diagnostic accuracy of 70.0%. CT was performed in 22 patients and demonstrated sensitivity of 94.4%, specificity of 75.0%, PPV of 94.4%, NPV of 75.0%, and diagnostic accuracy of 90.9%. Conclusion: Ultrasonography is a useful first-line investigation in suspected acute appendicitis because of its availability and absence of radiation, but its diagnostic performance is limited by operator dependence, occasional equivocal examinations, and moderate sensitivity. CT provides substantially higher diagnostic accuracy and is an effective problem-solving modality in unresolved cases. A staged USG-first, selective-CT approach appears practical and clinically appropriate. Keywords: Appendicitis, Ultrasonography, Computed Tomography, Sensitivity and Specificity, Appendectomy.
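
For readers who wish to verify the arithmetic, the diagnostic indices quoted above follow directly from a 2×2 confusion matrix. The short Python sketch below back-calculates cell counts consistent with the reported percentages (treating equivocal USG studies as non-positive); the individual counts are our reconstruction from the abstract, not figures published as such.

```python
# Diagnostic indices from a 2x2 confusion matrix. Cell counts are
# back-calculated from the reported percentages (a reconstruction,
# not published figures); equivocal USG studies count as non-positive.
def diagnostics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV, and accuracy."""
    return (tp / (tp + fn),                   # sensitivity
            tn / (tn + fp),                   # specificity
            tp / (tp + fp),                   # positive predictive value
            tn / (tn + fn),                   # negative predictive value
            (tp + tn) / (tp + fp + fn + tn))  # overall accuracy

# USG (n = 60): 36 TP, 2 FP, 16 FN, 6 TN
print("USG:", ["%.1f%%" % (100 * x) for x in diagnostics(36, 2, 16, 6)])
# -> 69.2 / 75.0 / 94.7 / 27.3 / 70.0, matching the abstract

# CT (n = 22): 17 TP, 1 FP, 1 FN, 3 TN
print("CT: ", ["%.1f%%" % (100 * x) for x in diagnostics(17, 1, 1, 3)])
# -> 94.4 / 75.0 / 94.4 / 75.0 / 90.9
```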

Page No: 302-308 | Full Text

 

Original Research Article

AUDIT OF REPEAT IMAGING REQUESTS AND FACTORS CONTRIBUTING TO REPEAT CT/MRI EXAMINATIONS: AN OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.52

Ravi Elango, Kalpana.R, Ajit Kumar, Annitha Elavarasi J

View Abstract

Background: Repeat CT and MRI examinations are frequently performed in tertiary care hospitals for clinical reassessment, follow-up, and treatment monitoring. While many repeat studies are appropriate and clinically necessary, a considerable proportion may be avoidable due to poor documentation, non-availability of prior imaging, or communication gaps between departments and institutions. Such repetition can increase patient burden, resource utilization, and workload in radiology services. Auditing repeat imaging requests is therefore important to identify patterns of use and factors contributing to unnecessary duplication. Aim: To audit repeat imaging requests and identify factors contributing to repeat CT and MRI examinations in a tertiary care hospital. Materials and Methods: This observational study was conducted in the Department of Radiology of a tertiary care hospital and included 150 patients who underwent repeat CT or MRI examinations. Data were collected from radiology request forms, imaging registers, hospital information systems, and picture archiving and communication system records using a structured proforma. Variables recorded included age, sex, imaging modality, patient status, source of referral, referring department, anatomical region examined, interval between previous and repeat imaging, availability of prior imaging, adequacy of clinical details, mention of prior imaging, reason for repeat examination, and justification status. Data were analyzed using SPSS version 27.0. Categorical variables were expressed as frequencies and percentages, and associations were tested using chi-square or Fisher’s exact test. A p-value of less than 0.05 was considered statistically significant. Results: Among the 150 patients, the largest proportion belonged to the 31–50 years age group (38.67%), and males predominated (61.33%). CT constituted 64.00% of repeat imaging, while MRI accounted for 36.00%. Most repeat examinations were performed in inpatients (68.00%). Brain imaging was the most commonly repeated examination (30.67%), followed by abdomen and pelvis (25.33%). The most frequent interval between scans was 8–30 days (31.33%). Previous imaging was unavailable in 45.33% of cases, and clinical details were inadequate in 38.67%. Disease progression (26.67%) and follow-up/post-treatment evaluation (21.33%) were the commonest reasons for repeat imaging. Overall, 58.67% of repeat examinations were clinically justified, whereas 28.00% were potentially avoidable and 13.33% were likely unnecessary. Avoidable or unnecessary repeat imaging was significantly associated with non-availability of previous imaging (p<0.001), inadequate clinical details (p<0.001), and referral from other departments or institutions (p=0.003). Conclusion: Repeat CT and MRI examinations were common in this tertiary care setting and were influenced by both genuine clinical need and preventable system-related factors. Improved documentation, better access to prior imaging, and stronger interdepartmental communication may reduce unnecessary repeat imaging and enhance the quality of radiology services. Keywords: Repeat imaging; Computed tomography; Magnetic resonance imaging; Radiology.

Page No: 309-316 | Full Text

 

Original Research Article

IMAGING PATTERNS OF MUSCULOSKELETAL INFECTIONS ON MRI/USG AND THEIR CLINICAL CORRELATION: AN OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.53

Ajit Kumar, Annitha Elavarasi J, Ravi Elango, Kalpana.R

View Abstract

Background: Musculoskeletal infections encompass a wide spectrum of conditions involving bones, joints, muscles, and soft tissues, which can lead to significant morbidity if not diagnosed early. Clinical presentation is often variable and nonspecific, making imaging modalities such as magnetic resonance imaging (MRI) and ultrasonography (USG) crucial for early detection, characterization, and management. Understanding imaging patterns and their correlation with clinical and laboratory findings is essential for improving diagnostic accuracy and guiding treatment. Aim: To evaluate the imaging patterns of musculoskeletal infections on MRI and USG and to correlate these findings with clinical presentation and laboratory parameters in patients presenting to a tertiary care hospital. Materials and Methods: This observational study included 50 patients with clinically suspected musculoskeletal infection who underwent MRI and/or USG. Clinical data including symptoms, examination findings, and comorbidities were recorded. Laboratory parameters such as total leukocyte count, erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), and culture results were analyzed. Imaging findings on MRI and USG were assessed systematically for features such as soft tissue edema, abscess formation, joint effusion, marrow edema, cortical destruction, and deep fascial extension. Statistical analysis was performed using SPSS version 27.0, and correlation between imaging findings and clinical/laboratory parameters was evaluated. Results: The largest proportion of patients was in the 21–40 years age group (38.00%), with male predominance (64.00%). Pain (92.00%), tenderness (82.00%), and swelling (78.00%) were the most common clinical features. Raised CRP (74.00%) and ESR (70.00%) were frequently observed. Lower limb involvement (48.00%) and osteomyelitis (28.00%) were the most common anatomical and diagnostic patterns, respectively. MRI demonstrated significant superiority in detecting marrow edema, cortical destruction, and deep fascial extension (p<0.05), whereas USG was comparable in identifying soft tissue edema, joint effusion, and abscesses. Abscess formation showed significant association with fever, elevated inflammatory markers, positive culture, and need for surgical intervention (p<0.05). Conclusion: MRI and USG are complementary modalities in the evaluation of musculoskeletal infections. MRI is superior for detecting deep and osseous involvement, while USG is valuable for superficial lesions and procedural guidance. Imaging findings, especially abscess formation, correlate well with clinical severity and laboratory parameters, aiding in timely diagnosis and management. Keywords: Musculoskeletal infections, Magnetic resonance imaging, Ultrasonography, Osteomyelitis, Clinical correlation.

Page No: 317-324 | Full Text

 

Original Research Article

COMPARISON OF ENDOBRONCHIAL CRYOBIOPSY AND CONVENTIONAL FORCEPS BIOPSY IN SUSPECTED LUNG CARCINOMA: A CROSS-SECTIONAL STUDY FROM A TERTIARY CARE CENTRE

http://dx.doi.org/10.70034/ijmedph.2026.2.54

Davinder Kumar Kundra, Vikas Jaiswal, Sachin Baliyan

View Abstract

Background: Accurate histopathological and molecular diagnosis is essential in the management of lung carcinoma, particularly in the era of targeted therapy and immunotherapy. Conventional endobronchial forceps biopsy, though widely used, often yields small and fragmented specimens with crush artifacts. Cryobiopsy has emerged as a promising alternative that may provide larger and better-preserved tissue samples. This study compared the diagnostic yield, specimen adequacy, and safety of cryobiopsy versus forceps biopsy in patients with suspected lung carcinoma presenting with visible endobronchial lesions. Materials and Methods: This hospital-based cross-sectional study included 76 adult patients undergoing flexible bronchoscopy for suspected lung carcinoma with visible endobronchial growth. All patients underwent both forceps biopsy and cryobiopsy during the same procedure. Primary outcome was diagnostic yield. Secondary outcomes included specimen size, adequacy for immunohistochemistry (IHC) and molecular testing, crush artifact, and procedure-related complications. Statistical analysis was performed using paired tests, with p < 0.05 considered significant. Results: Cryobiopsy demonstrated significantly higher diagnostic yield compared to forceps biopsy (88.2% vs 68.4%; p = 0.002). Mean specimen size was larger with cryobiopsy (6.8 ± 1.9 mm vs 3.2 ± 1.1 mm; p < 0.001), with significantly lower crush artifact (7.9% vs 42.1%; p < 0.001). Adequacy for IHC (85.5% vs 57.9%; p < 0.001) and molecular testing (81.6% vs 50.0%; p < 0.001) was superior with cryobiopsy. Bleeding was more frequent with cryobiopsy (39.5% vs 23.7%; p = 0.041), though predominantly mild and manageable. No major procedure-related morbidity or mortality was observed. Conclusion: Endobronchial cryobiopsy provides significantly higher diagnostic yield and superior molecular adequacy compared to conventional forceps biopsy, with an acceptable safety profile. Incorporation of cryobiopsy into routine bronchoscopic practice may enhance diagnostic precision and reduce the need for repeat procedures in lung carcinoma evaluation. Keywords: Cryobiopsy; Forceps biopsy; Lung carcinoma; Endobronchial lesion; Diagnostic yield.
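
Because both techniques were applied to the same patients, the yield comparison above is a paired analysis; McNemar's test is the standard choice for paired binary outcomes. The sketch below is illustrative only: the discordant-pair split is an assumption chosen to be consistent with the reported marginal yields (67/76 for cryobiopsy, 52/76 for forceps), not data from the paper.

```python
# McNemar's test for paired diagnostic yield (cryobiopsy vs forceps).
# The cell split below is an assumed example consistent with the
# reported marginals (cryo 67/76 positive, forceps 52/76 positive).
from statsmodels.stats.contingency_tables import mcnemar

#                forceps+  forceps-
table = [[48,       19],   # cryobiopsy positive  (48 + 19 = 67)
         [ 4,        5]]   # cryobiopsy negative  (48 + 4 = 52 forceps+)

result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(f"p = {result.pvalue:.4f}")    # ~0.003 with this assumed split;
                                     # the paper reports p = 0.002
```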

Page No: 325-331 | Full Text

 

Original Research Article

ROTAVIRUS DIARRHOEA IN THE POST-VACCINATION PERIOD: CLINICAL PROFILE AND GENOTYPING STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.55

Naresh Jindal, Vibha Madan, Pooja, Shisher Agrawal

View Abstract

Background: Rotavirus remains a leading cause of acute gastroenteritis in children under five years of age, despite the introduction of rotavirus vaccination into national immunization programs. In the postvaccination period, understanding changes in clinical presentation, disease severity, and circulating rotavirus genotypes is essential for assessing vaccine impact and guiding public health strategies. Materials and Methods: This hospital-based observational study was conducted among children aged 6 weeks to 59 months presenting with acute diarrhoea at a tertiary care center during the postvaccination period. Demographic details, clinical features, vaccination status, and disease severity were recorded using a structured proforma. Stool samples were tested for Group A rotavirus antigen by enzyme-linked immunosorbent assay (ELISA). Rotavirus-positive samples underwent G and P genotyping using reverse transcription polymerase chain reaction (RT-PCR). Disease severity was assessed using the Vesikari Clinical Severity Score. Statistical analysis was performed to compare clinical and severity parameters between groups. Results: Out of 163 children with acute diarrhoea, 68 (41.7%) were rotavirus positive. Rotavirus positivity was significantly higher among partially vaccinated and unvaccinated children compared to fully vaccinated children (p = 0.031). Rotavirus-positive children had significantly higher rates of vomiting, frequent stools, dehydration, and hospital admission (p < 0.05). Severe disease (Vesikari score ≥11) was observed in 36.8% of rotavirus-positive cases compared to 18.9% of rotavirus-negative cases (p = 0.006). Among rotavirus-positive children, fully vaccinated children had significantly fewer severe cases and shorter hospital stays compared to partially vaccinated or unvaccinated children (p = 0.011 and p = 0.002, respectively). Genotype analysis revealed predominance of G9 (25.0%) and G12 (22.1%) genotypes, with P[8] as the most common P type (61.8%). G9P[8] and G12P[6] were the most frequent genotype combinations. Conclusion: Rotavirus continues to contribute substantially to acute diarrhoeal illness in young children in the postvaccination era. While vaccination does not completely prevent infection, it significantly reduces disease severity and hospitalization. The predominance of emerging genotypes such as G9 and G12 underscores the need for continued clinical and molecular surveillance to monitor vaccine impact and guide future immunization strategies. Keywords: Rotavirus; Acute gastroenteritis; Postvaccination period; Vesikari score; Childhood diarrhoea.

Page No: 332-338 | Full Text

 

Original Research Article

IMPLANT PROSTHODONTIC INTERFACE: A REVIEW OF THE IMPORTANCE OF ABUTMENT DESIGN AND MATERIALS

http://dx.doi.org/10.70034/ijmedph.2026.2.56

Gundabathina Sarala Devi, V Vamsi Krishna Reddy, T. Pavan Kumar, V. Dileep Nag

View Abstract

Dental implants have rapidly become recognized as a dependable and efficient solution for replacing lost teeth, enhancing functionality, aesthetics, and patient satisfaction. The implant-abutment interface serves as the crucial link between the implant fixture and the prosthetic restoration in implant prosthodontics. The design and material of the abutment are essential in determining the long-term efficacy of implant-supported restorations, as they influence mechanical stability, biological integration, and cosmetic outcomes. The abutment facilitates a solid peri-implant soft tissue seal, minimal micro-movement, and optimal load distribution. Suboptimal design and material choices can result in compromised aesthetics, marginal bone resorption, peri-implant irritation, and screw loosening. Consequently, to enhance implant functionality, it is crucial to understand the influence of abutment parameters. This review aims to critically evaluate the role of abutment materials and design in the implant prosthodontic interface. Commonly employed materials, including titanium, zirconia, and hybrid composites, are discussed along with surface characteristics, emergence profile, and connection geometry. The review concludes that the biological and mechanical success of implants is significantly determined by abutment design and material. The choice of material influences biocompatibility, bacterial adherence, and aesthetic integration, while conical connection designs and platform switching have shown improved stability and reduced crestal bone loss. Zirconia offers superior aesthetics in anterior regions, whereas titanium's reliability and robustness establish it as the gold standard. New biomaterials and CAD/CAM technology enhance both clinical outcomes and personalization. To achieve optimal patient outcomes and ensure the longevity of implants, meticulous selection of abutment design and material is crucial. Keywords: Dental implants, Abutment design, Implant–abutment interface, Titanium, Zirconia, Biocompatibility.

Page No: 339-348 | Full Text

 

Case Series

FUNCTIONAL OUTCOME FOLLOWING INTERNAL FIXATION OF PROXIMAL HUMERUS SURGICAL NECK TWO- AND THREE-PART FRACTURES WITH POLARIS NAIL

http://dx.doi.org/10.70034/ijmedph.2026.2.57

Ramesh M, Manesh Chacko Philip, Ranju Raj

View Abstract

Background: Proximal humerus fractures are common injuries accounting for approximately 4–6% of all adult fractures. Displaced fractures involving the surgical neck may result in significant functional impairment if not managed appropriately. Intramedullary fixation has emerged as a minimally invasive alternative to plate fixation, offering stable fixation while preserving soft tissue and vascular supply. The Polaris intramedullary nail has been specifically designed for proximal humerus fractures and provides multiple proximal locking options to improve stability. The objective is to evaluate the clinical and radiological outcomes of Polaris intramedullary nailing in the management of displaced two-part and three-part surgical neck fractures of the proximal humerus. Materials and Methods: This prospective case series included 25 patients with displaced proximal humerus fractures treated with Polaris intramedullary nailing at a tertiary care orthopaedic centre over a period of two years. Patients with Neer two-part and three-part surgical neck fractures were included. Functional outcomes were assessed using the Constant–Murley shoulder score, and radiological union was evaluated using serial radiographs during follow-up. Statistical analysis was performed using IBM SPSS software version 26.0. Results: The mean patient age was 54.6 ± 12.8 years. The majority of patients were male (64%). Two-part fractures accounted for 60% of cases, while three-part fractures constituted 40%. Radiological union was achieved in 24 patients (96%), with most fractures healing within 10–12 weeks. The mean Constant–Murley score at final follow-up was 78.6 ± 9.4. Good to excellent functional outcomes were observed in 80% of patients. The mean forward flexion was 142.5° and mean abduction was 135.4°. Postoperative complications occurred in three patients (12%), including shoulder stiffness, screw irritation, and delayed union. No cases of deep infection, implant failure, or avascular necrosis were observed. Conclusion: Polaris intramedullary nailing provides reliable fixation for displaced two-part and three-part surgical neck fractures of the proximal humerus. The technique allows early mobilization and results in high union rates with satisfactory functional outcomes and a low complication rate. Intramedullary fixation may therefore be considered an effective surgical option for selected proximal humerus fractures. Keywords: Proximal humerus fracture; Intramedullary nailing; Polaris nail; Surgical neck fracture; Constant–Murley score; Shoulder fracture fixation.

Page No: 349-355 | Full Text

 

Original Research Article

KAP STUDY ON AI IN THE MEDICAL FIELD WITH SPECIAL FOCUS ON AI VS HUMAN DOCTOR

http://dx.doi.org/10.70034/ijmedph.2026.2.58

Shounak Gupta, Harshpreet Singh, Prachi Jain

View Abstract

Background: Artificial intelligence (AI) is swiftly changing healthcare; however, its effect depends on the knowledge, attitudes, and practices (KAP) of medical practitioners. We organized a cross-sectional KAP survey of medical students and doctors to understand their awareness of and attitudes towards AI in medicine, and their perceptions of AI compared with human doctors. Previous research has indicated a high level of awareness but little action among doctors and students. Materials and Methods: A structured questionnaire was used to survey 178 medical students and 30 doctors in a tertiary care institute. It covered demographics, knowledge of AI (e.g. awareness and education), attitudes (e.g. confidence in AI applications, fears of replacement), and practices (e.g. use of AI tools, training). Responses were summarized descriptively, and chi-square tests (significance at p<0.05) were used for categorical comparisons between students and doctors. Results: Nearly all participants had heard of AI: 99.4% of students and 100% of doctors reported awareness. However, formal AI education was scarce (18.2% of students vs 0% of doctors had AI in their curriculum). Around 86–87% in both groups knew of AI’s use in medicine. Use of AI tools differed (33.3% of doctors vs 60.0% of students reported occasional or frequent use for study purposes). Positive attitudes were common: 93.8% of students agreed that AI is beneficial in medical education, and 90% of doctors felt AI would benefit patients. In contrast, fewer believed AI could replace human doctors (23.3% of doctors, 39.7% of students). Chi-square tests showed significant differences in key attitudes: doctors were significantly more likely than students to view AI as beneficial to patient care (90% vs 50.6%, p<0.001) and to express privacy or ethical concerns (86.7% vs 38.4%, p<0.001), while students were more optimistic about AI replacing traditional learning methods (39.7% vs 23.3%, p<0.001). Conclusion: Both medical students and physicians showed high awareness of AI and generally favorable attitudes toward its role in healthcare, but actual training and use remain limited. Students were enthusiastic about learning AI, whereas doctors emphasized potential ethical issues and the need for guided integration. Our findings underscore the need to incorporate AI education into medical training (as recommended in similar KAP studies) and to foster “human-AI” collaboration rather than seeing AI as a replacement for clinicians. Keywords: Artificial intelligence; medical education; knowledge, attitudes, and practices (KAP); physicians; medical students; AI vs doctor; healthcare technology.
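
To make the chi-square comparisons concrete, the sketch below tests the doctors-versus-students difference in ethical/privacy concerns. The cell counts are approximate back-calculations from the reported percentages (26/30 ≈ 86.7% for doctors; 68/178 ≈ 38.2% for students) and serve only to illustrate the analysis step.

```python
# Chi-square test of "expressed ethical/privacy concerns" by group.
# Counts are approximate back-calculations from the reported
# percentages, used only to illustrate the analysis.
from scipy.stats import chi2_contingency

#             concern  no concern
observed = [[26,        4],    # doctors  (n = 30)
            [68,      110]]    # students (n = 178)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.1e}")  # p < 0.001
```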

Page No: 356-360 | Full Text

 

Case Report

ANGIOLEIOMYOMA OF UTERUS – A RARE BENIGN SMOOTH MUSCLE TUMOUR

http://dx.doi.org/10.70034/ijmedph.2026.2.59

Isha Pareek, Sudhamani S, Shreya Shah

View Abstract

Uterine angioleiomyoma (vascular leiomyoma) is an uncommon smooth muscle tumor derived from blood vessels; it is reported almost exclusively in the subcutaneous soft tissues of the extremities and is very rare in the female genital tract. The radiological and clinical characteristics that distinguish uterine angioleiomyoma from conventional leiomyoma are not well documented in the literature. Herein, we describe a 32-year-old woman with complaints of menorrhagia and pelvic pain. Ultrasonography showed an intramural mass suggesting a leiomyoma. The mass was excised through an open myomectomy procedure. The pathologic findings were characteristic of angioleiomyoma, and the differential diagnosis from conventional leiomyoma is discussed. Angioleiomyoma should be considered a rare variant of uterine smooth muscle tumors because of its particular potential to cause heavy menstrual bleeding by virtue of its vascular nature, and because of its increased risk of hemorrhage during surgical excision. Keywords: Angioleiomyoma; CD34; Leiomyoma; Menorrhagia; Myomectomy; Uterus; Vascular leiomyoma

Page No: 361-363 | Full Text

 

Original Research Article

EFFECT OF DURATION OF DISEASE AND GLYCEMIC CONTROL ON ATTENTION, EXECUTIVE FUNCTION AND VISUAL REACTION TIME IN TYPE 2 DIABETES MELLITUS

http://dx.doi.org/10.70034/ijmedph.2026.2.60

Santosh Mayannavar, Hardik Nagar, Nayan Mali

View Abstract

Background: The objective is to assess the effect of disease duration and glycemic control on attention, executive function, and visual reaction time in patients with T2DM. Materials and Methods: This cross-sectional study included 60 participants with T2DM, divided into two groups based on their Glycosylated Hemoglobin (HbA1c) levels: Group A (Good Glycemic Control, HbA1c < 7.0%, n=30) and Group B (Poor Glycemic Control, HbA1c ≥ 7.0%, n=30). Participants were further sub-grouped based on disease duration (< 5 years and ≥ 5 years). All participants were assessed using the Digit Letter Substitution Test (DLST) for attention, the Stroop Color and Word Test (SCWT) for executive function, and a computer-based Visual Reaction Time (VRT) test for processing speed. Statistical analysis was performed using Pearson's correlation coefficient and independent t-tests. Results: The group with poor glycemic control (Group B) demonstrated significantly poorer cognitive performance compared to Group A, evidenced by lower DLST scores (p < 0.01), higher Stroop interference scores (indicating poorer executive function, p < 0.01), and prolonged Visual Reaction Times (p < 0.001). Furthermore, a longer duration of disease (≥ 5 years) was significantly correlated with worse performance on all three cognitive parameters, independent of glycemic control. A significant positive correlation was found between HbA1c levels and VRT (r = 0.52, p < 0.001), while a negative correlation was observed between HbA1c and DLST scores (r = -0.45, p < 0.01). Conclusion: Both poor glycemic control and a longer duration of T2DM are independently and significantly associated with deficits in attention, executive function, and psychomotor speed. These findings underscore the importance of stringent, early glycemic management to potentially mitigate the risk of cognitive decline in the diabetic population. Keywords: Type 2 Diabetes Mellitus, Glycemic Control, HbA1c, Attention, Executive Function, Visual Reaction Time, Cognition.

Page No: 364-369 | Full Text

 

Original Research Article

HEALTH LITERACY AND PATIENT ACTIVATION AMONG ADULTS WITH MULTIMORBIDITY IN URBAN INDUSTRIAL INDIA: A CROSS-SECTIONAL STUDY FROM SURAT WITH DIRECT IMPLICATIONS FOR NURSE-LED CHRONIC DISEASE MANAGEMENT

http://dx.doi.org/10.70034/ijmedph.2026.2.61

Aarjukumari Ghoghari, Bhavesh Chauhan

View Abstract

Background: Non-communicable diseases impose an expanding burden on primary healthcare systems across urban India, particularly in industrial cities where large numbers of working-age migrants live with multiple co-existing chronic conditions. Surat, a major diamond-processing and textile manufacturing hub in South Gujarat, is home to hundreds of thousands of such workers. Despite the scale of this population and the complexity of its health needs, evidence on whether these individuals possess adequate knowledge, skill, or confidence to manage their own conditions — or whether the health information they receive is accessible to them — remains limited. This study was designed to measure health literacy and patient activation in adults with multimorbidity attending outpatient clinics across four Surat hospitals, to assess blood pressure control as a direct clinical outcome, and to identify the characteristics most strongly associated with poor self-management capacity. Materials and Methods: A cross-sectional observational study was conducted between September and December 2024 across four outpatient clinical settings in Surat: Sardar Multispeciality Hospital, Shraddha Multispeciality Hospital, SIMS Hospital, and Dhameliya Kidney Hospital. Adults aged 30 to 65 years with at least two confirmed ICD-10-defined chronic conditions were enrolled using consecutive sampling (n=287). Patient activation was measured with the PAM-13 (Cronbach’s alpha = 0.87); health literacy with the HLS-EU-Q16 (Cronbach’s alpha = 0.83); and medication adherence with the MMAS-8. Blood pressure was measured following triplicate protocol using a calibrated automated sphygmomanometer. Multivariable logistic regression with VIF diagnostics was used to identify independent predictors of poor self-management. A sensitivity analysis excluding obesity and anxiety/depression as sole qualifying conditions was performed. Reporting follows STROBE guidelines. Structured patient education and literacy-adapted communication strategies were applied during data collection to ensure patient comprehension of study instruments. Results: Mean PAM-13 score was 52.4 (SD 14.7; 95% CI 50.7–54.1). Over three in five participants (61.7%; 95% CI 56.0–67.1%) registered poor self-management capacity at PAM Level 1 or 2. Low or problematic health literacy affected 68.3% of the sample (95% CI 62.7–73.5%). Among 205 hypertensive participants, 120 (58.5%) had uncontrolled blood pressure. Only 23.6% of those on prescribed medication showed high MMAS-8 adherence. In the adjusted regression model, low health literacy carried the strongest association with poor self-management (aOR 3.84; 95% CI 2.41–6.12), followed by low education (aOR 2.67), unskilled occupation (aOR 2.19), and three or more concurrent conditions (aOR 2.02). All VIFs were below 2.3; Nagelkerke R²=0.38. Findings were robust in sensitivity analysis. No participant had ever accessed a nurse-led chronic disease service; only 18.1% had received any structured self-management education. Conclusion: Low health literacy, poor patient activation, and inadequate medication adherence are common in this population and are associated with objectively poorer blood pressure control. These findings point to an unmet need for literacy-adapted, nurse-led chronic disease management services in this setting.
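
The adjusted odds ratios and VIF diagnostics reported above come from a standard multivariable logistic regression workflow. A minimal sketch follows; the data are simulated placeholders (the predictor names mirror the abstract, but the values and resulting estimates are not the study's):

```python
# Sketch of the adjusted-odds-ratio workflow: multivariable logistic
# regression with VIF diagnostics. Data are simulated placeholders;
# only the predictor names mirror the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 287
X = pd.DataFrame({
    "low_health_literacy":   rng.integers(0, 2, n),
    "low_education":         rng.integers(0, 2, n),
    "unskilled_occupation":  rng.integers(0, 2, n),
    "three_plus_conditions": rng.integers(0, 2, n),
})
# Synthetic outcome weighted toward low health literacy
lin = -0.5 + 1.3 * X["low_health_literacy"] + 0.6 * X["low_education"]
y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

Xc = sm.add_constant(X)
fit = sm.Logit(y, Xc).fit(disp=0)
print(np.exp(fit.params))        # adjusted odds ratios (aOR)
print(np.exp(fit.conf_int()))    # 95% CIs for the aORs
for i, col in enumerate(Xc.columns):
    if col != "const":           # VIF for each predictor
        print(col, round(variance_inflation_factor(Xc.values, i), 2))
```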

Page No: 370-379 | Full Text

 

Original Research Article

PROSPECTIVE OBSERVATIONAL STUDY ON SOCIOECONOMIC STATUS AND DIETARY FACTORS RESPONSIBLE FOR NON-INITIATION OR DELAYED INITIATION OF LACTATION

http://dx.doi.org/10.70034/ijmedph.2026.2.62

Agnimita Giri Sarkar, Apurpa Ghosh, Surupa Basu

View Abstract

Background: Lactogenesis II is a hormonally regulated process dependent on coordinated prolactin, insulin, cortisol, and progesterone withdrawal. Increasing evidence suggests that metabolic dysfunction may impair secretory activation despite adequate nutritional intake. Delayed initiation of lactation remains common in India, yet the endocrine–metabolic determinants underlying this phenomenon are insufficiently explored. This study aimed to evaluate the association of socio-economic status (SES) and dietary factors with delayed or non-initiation of lactation, with emphasis on potential endocrine–metabolic mechanisms. Materials and Methods: This analytical cross-sectional study included 400 postpartum mothers (within six months of delivery), comprising 200 cases (delayed/non-initiation of lactation) and 200 controls (normal initiation). SES was classified using the Modified Kuppuswamy Scale 2023. Dietary pattern was categorized as vegetarian or non-vegetarian, and daily caloric intake was estimated using structured dietary recall. Independent variables included maternal age, parity, education, mode of delivery, and antenatal care utilization. Continuous variables were analyzed using independent t-tests and categorical variables using Chi-square tests. A p-value ≤ 0.05 was considered statistically significant. Results: Mothers with delayed lactation were significantly older than controls (28.63 ± 4.67 vs. 24.83 ± 4.08 years; p < 0.0001). Mean caloric intake was significantly higher in the delayed group (2723.77 ± 456.68 kcal) compared to the normal group (1852.99 ± 320.40 kcal; p < 0.0001). A strong association was observed between SES and lactation status (p < 0.001), with delayed lactation more prevalent among upper-middle and lower-middle socio-economic strata. Conclusion: Delayed initiation of lactation is significantly associated with advancing maternal age, higher caloric intake, and middle SES. The findings support the hypothesis that endocrine–metabolic dysregulation—particularly impaired insulin sensitivity and altered prolactin responsiveness—may underlie delayed lactogenesis. In socio-economically transitioning populations, metabolic risk factors may exert greater influence on breastfeeding initiation than caloric insufficiency. Integrating metabolic screening with structured lactation support in accordance with World Health Organization recommendations may improve early breastfeeding outcomes and neonatal health indicators. Keywords: Delayed lactation, Lactogenesis II, Socioeconomic status, Caloric intake, Insulin resistance, Endocrine dysfunction, Maternal age, Metabolic imbalance, Breastfeeding initiation.

Page No: 380-384 | Full Text

 

Original Research Article

VOICES FROM THE FRONTLINE: EXPLORING HEALTH WORKERS’ EXPERIENCES IN NEPAL’S LYMPHATIC FILARIASIS MASS DRUG ADMINISTRATION PROGRAM

http://dx.doi.org/10.70034/ijmedph.2026.2.63

Achut Babu Ojha, Nafisul Hasan, Damaru Prasad Paneru, Sagar Parajuli, Sudip Raj Khatiwada, Md Makabul Rain, Sanjeev Kumar, Bibek Paudel

View Abstract

Background: Lymphatic filariasis (LF) remains a major public health challenge in Nepal. While Mass Drug Administration (MDA) is the cornerstone of elimination, the perspectives and lived experiences of health workers—those who implement the program—are often overlooked. Materials and Methods: This study employed a qualitative-dominant mixed-methods design, with emphasis on focus group discussions (FGDs) among health workers in three endemic districts. Thematic analysis was conducted using Braun and Clarke’s six-step framework. Quantitative survey data (n=257) provided contextual support, but the primary lens was qualitative. Results: Five overarching themes emerged: (1) health workers as strategic implementers, (2) contextually competent resources, (3) community mediators, (4) compliance facilitators, and (5) program optimizers. Health workers described challenges such as fear of side effects, rumors, inadequate supervision, and logistical constraints, but also highlighted their adaptive strategies, community trust building, and commitment to elimination goals. Conclusion: Health workers are not passive conduits of policy but active agents shaping program outcomes. Their narratives reveal both systemic barriers and opportunities for strengthening LF elimination. Policies must prioritize training, supportive supervision, incentives, and community engagement strategies that leverage health workers’ unique position at the frontline. Keywords: Lymphatic Filariasis, Mass Drug Administration, Qualitative Research, Health Workers, Nepal, Community Engagement.

Page No: 385-387 | Full Text

 

Original Research Article

HPLC-BASED EVALUATION OF HEMOGLOBINOPATHIES IN PEDIATRIC POPULATION: A RETROSPECTIVE CROSS-SECTIONAL STUDY FROM A TERTIARY CARE CENTER IN SOUTH INDIA

http://dx.doi.org/10.70034/ijmedph.2026.2.64

Komati Poornima, Arshiya Anjum, Shirish Kumar Patewar

View Abstract

Background: Hemoglobinopathies are among the most common inherited disorders worldwide and contribute significantly to pediatric morbidity, particularly in developing countries like India. Early detection using reliable diagnostic modalities is essential for timely intervention and genetic counseling. The objective was to evaluate the prevalence and pattern of hemoglobinopathies in pediatric patients using High-Performance Liquid Chromatography (HPLC) and to analyze their hematological characteristics. Materials and Methods: This retrospective cross-sectional study included 150 pediatric patients (0–14 years) over a period of 12 months (January–December 2025). Blood samples collected in EDTA were analyzed using an automated hematology analyzer for red cell indices and the BIORAD D-10 HPLC system for hemoglobin variant detection. Cases were categorized based on hemoglobin fractions and retention times. Results: Of 150 cases, 109 (72.7%) showed a normal hemoglobin pattern, 36 (24.0%) had hemoglobinopathies, and 5 (3.3%) were indeterminate. Among the abnormal cases, sickle cell disease (HbSS) accounted for 11 cases (7.3%), beta thalassemia trait for 7 (4.7%), sickle cell trait for 6 (4.0%), and compound heterozygous states for 6 (4.0%). A slight male predominance (52.7%) was observed. The most affected age group was 6–10 years (38.9%). Approximately 80% of cases showed microcytic hypochromic indices. Conclusion: Hemoglobinopathies, particularly sickle cell disease and beta thalassemia trait, are prevalent in the pediatric population. HPLC is a reliable and effective tool for screening and diagnosis. Early detection combined with genetic counseling and preventive strategies is essential to reduce the disease burden. Keywords: Hemoglobinopathies, Sickle Cell Disease, Beta Thalassemia Trait, High-Performance Liquid Chromatography (HPLC), Pediatric Anemia, Hemoglobin Variants, Red Cell Indices, Screening.

Page No: 388-394 | Full Text

 

Original Research Article

COMPARISON BETWEEN 5% EMLA CREAM AND LIDOCAINE JELLY ON POST OPERATIVE SORE THROAT FOLLOWING GENERAL ANAESTHESIA WITH ENDOTRACHEAL INTUBATION - A PROSPECTIVE RANDOMIZED DOUBLE BLIND CONTROLLED STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.65

Yerraboina Madhavi, Sujani Gooty, Sunkesula Sonia Begum, Saya Raghavendra prasad, Chiruvella Sunil

View Abstract

Background: Postoperative sore throat (POST) is the most common complication after endotracheal intubation. In this study we compared the efficacy of 5% EMLA cream and 2% lidocaine jelly on the incidence and severity of postoperative sore throat, hoarseness of voice (HOV), and post-extubation cough (PEC) following general anaesthesia with endotracheal intubation. Materials and Methods: This study included 90 patients aged 18 to 65 years, of either sex, with American Society of Anesthesiologists (ASA) physical status I or II, scheduled to receive 5% EMLA cream or 2% lidocaine jelly applied over the endotracheal tube cuff. POST was graded as none (0), mild (1), moderate (2), or severe (3); a score of ≥2 was considered significant POST. The incidence and severity of POST at the sixth postoperative hour was the primary outcome. Secondary outcomes included the incidence of POST, HOV, and PEC at zero, one, and 24 hours. Results: The incidence of postoperative sore throat of any grade at the first and sixth postoperative hours was significantly lower in the EMLA group than in the lidocaine group (35.6% vs 60%, p = 0.0202; 15.6% vs 35.6%, p = 0.0296, respectively). The difference in the incidence of HOV was statistically significant at the sixth postoperative hour (6.6% vs 22.2%, p = 0.0358). The incidence of PEC was similar in both groups at all time points. Conclusion: 5% EMLA cream applied to the ETT cuff effectively reduced the incidence and severity of POST and HOV in the first 6 postoperative hours compared with 2% lidocaine jelly. Keywords: Postoperative sore throat (POST), 5% EMLA, 2% lidocaine jelly, hoarseness of voice, post-extubation cough.

Page No: 395-400 | Full Text

 

Original Research Article

FACTORS INFLUENCING SURVIVAL AND SHORT-TERM OUTCOMES OF VERY LOW BIRTH WEIGHT INFANTS IN A TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.66

Paramjeet

View Abstract

Background: Very low birth weight (VLBW) infants remain at high risk of mortality and morbidity, particularly in developing countries. Despite advances in neonatal care, outcomes vary widely depending on maternal, perinatal, and neonatal factors. Identifying these factors is essential for improving survival and short-term outcomes. The objective was to evaluate the factors influencing survival and short-term outcomes among very low birth weight infants admitted to a tertiary care hospital. Materials and Methods: This retrospective observational study was conducted in the Neonatal Intensive Care Unit (NICU) of a tertiary care hospital. All neonates with a birth weight of 500 to 1500 grams, born between 1 January 2022 and 31 December 2023, and admitted within 24 hours of birth were included. Data on maternal, perinatal, and neonatal variables were collected from medical records. The primary outcome was survival to discharge, while secondary outcomes included major neonatal morbidities. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: A total of 1,018 VLBW infants were included, of whom 664 (65.2%) survived and 354 (34.8%) died. Survival was significantly higher among infants with adequate antenatal care, exposure to antenatal corticosteroids, and cesarean delivery. Higher birth weight, gestational age, head circumference, and Apgar scores were significantly associated with improved survival (p < 0.001). Conclusion: Survival of VLBW infants is influenced by a complex interplay of antenatal, perinatal, and neonatal factors. Strengthening antenatal care, ensuring timely interventions, and improving neonatal intensive care practices are crucial to enhancing survival and reducing morbidity in this vulnerable population. Keywords: Very low birth weight, neonatal mortality, NICU, antenatal care, prematurity, neonatal outcomes

Page No: 401-406 | Full Text

 

Case Report

SEVERE MINERAL BONE DISEASE IN A CHILD WITH A SOLITARY KIDNEY AND NEGLECTED END-STAGE RENAL DISEASE ON MAINTENANCE HAEMODIALYSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.67

Sivaani.U, Ramya.N, Meyyan.A, Punitha.J , Karuppiah.K, Ganavi. Ramagopal

View Abstract

Chronic kidney disease–mineral and bone disorder (CKD-MBD) is a systemic complication of advanced renal dysfunction characterized by abnormalities in calcium, phosphate, and vitamin D metabolism and by secondary hyperparathyroidism, leading to skeletal and cardiovascular complications. We report a neglected pediatric case of end-stage renal disease (ESRD) with a solitary kidney that developed severe mineral bone disease due to a prolonged absence of follow-up and lack of supplementation. A male child in late childhood with a history of vesicoureteral reflux–related renal damage and prior nephrectomy presented with progressive severe bone pain, immobility, and reduced urine output. The child had been advised dialysis several years earlier but was lost to follow-up and had not received mineral or vitamin D supplementation. Clinical evaluation suggested advanced CKD with severe skeletal involvement consistent with CKD-MBD. The patient was initiated on maintenance hemodialysis with supportive metabolic management. This case highlights the devastating skeletal consequences of untreated CKD-MBD in children and emphasizes the importance of early detection of reflux nephropathy, long-term nephrology follow-up, and strict metabolic monitoring in pediatric CKD patients. Keywords: CKD–mineral bone disorder, End-stage renal disease, Solitary kidney, Hemodialysis, Pediatric renal disease.

Page No: 407-410 | Full Text

 

Original Research Article

EVALUATING PROTEINURIA'S PROGNOSTIC VALUE IN CHILDHOOD DENGUE INFECTIONS IN A TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.68

Khizerulla Sharief, Venkatesh K S, Veeralokanadha Reddy M

View Abstract

Background: Dengue is a major public health problem in tropical countries, with children being particularly vulnerable to severe forms of the disease. Early identification of prognostic markers is essential in tertiary care settings to reduce morbidity and mortality. Proteinuria, resulting from endothelial dysfunction and plasma leakage, has emerged as a potential indicator of disease severity in dengue infection. Objectives: To correlate the urine protein creatinine ratio (UPCR) with severity of illness in children diagnosed with dengue fever, and to assess whether proteinuria can serve as a predictor of progression to severe dengue. Materials and Methods: A hospital-based prospective observational study was conducted in the Department of Pediatrics over a period of one year. A total of 140 children aged 1 month to 12 years with laboratory-confirmed dengue infection were included. Daily early morning urine samples were collected from day 3 to day 9 of fever or until discharge. Proteinuria was assessed using urine dipstick and quantified by the urine protein creatinine ratio (UPCR). Patients were categorized as probable dengue, dengue with warning signs, or severe dengue as per WHO criteria. Data were analyzed using SPSS version 19.0. The Pearson Chi-square test and ANOVA were applied; p < 0.05 was considered statistically significant. Results: Of 140 children, 55% had probable dengue, 24.2% had dengue with warning signs, and 20.7% had severe dengue. Mortality within the severe dengue group was 38%. Both dipstick proteinuria and UPCR values were significantly higher in severe dengue compared with the other groups (p < 0.05). Proteinuria peaked between days 5–6 of illness and declined during recovery. Patients who died demonstrated markedly elevated UPCR and dipstick values compared with survivors. No significant association was observed between platelet count and UPCR. Conclusion: Proteinuria, particularly when quantified by UPCR, shows a significant correlation with dengue severity in children. Serial monitoring of UPCR may serve as a useful prognostic marker for early identification of severe disease and guide clinical management in tertiary care settings. Keywords: Dengue; Proteinuria; Urine Protein Creatinine Ratio; Pediatric Dengue; Severe Dengue; Prognostic Marker; Renal Involvement.
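
As an illustration of the ANOVA step in the methods above, the sketch below compares UPCR across the three WHO severity categories. Group sizes follow the reported proportions of the 140 children, but the UPCR values are simulated placeholders, not study data.

```python
# One-way ANOVA of UPCR across WHO dengue categories. Group sizes
# follow the reported proportions of 140 children; UPCR values are
# simulated placeholders for illustration.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
probable = rng.normal(0.2, 0.1, 77)   # ~55% of 140
warning  = rng.normal(0.4, 0.2, 34)   # ~24.2% of 140
severe   = rng.normal(0.9, 0.4, 29)   # ~20.7% of 140

f_stat, p = f_oneway(probable, warning, severe)
print(f"F = {f_stat:.1f}, p = {p:.2e}")
```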

Page No: 411-416 | Full Text

 

Systematic Review

ROLE OF HISTOPATHOLOGY IN FORENSIC DIAGNOSIS OF SUDDEN CARDIAC DEATH: A SYSTEMATIC REVIEW AND META-ANALYSIS

http://dx.doi.org/10.70034/ijmedph.2026.2.69

Pradeep Kumar Nayak, Sudhanshu Sekhar Sethi, Ajith Antony, Priyatosh Dash, Debdas Samantaray, Mahendra Singh, Pratyush Mishra

View Abstract

Background: Sudden cardiac death (SCD) is one of the leading causes of natural death encountered in forensic practice, and it often poses a diagnostic puzzle because the heart may show no significant changes at autopsy. Histopathological analysis has long been a central part of the postmortem investigation of sudden cardiac death, but the growing number of biochemical, imaging, and molecular methods makes it necessary to reassess the contribution of histology within such a multimodal forensic setting. This systematic review and meta-analysis aimed to assess the role of histopathology in the forensic diagnosis of sudden cardiac death and to compare its diagnostic performance with emerging adjunctive methods. Materials and Methods: A comprehensive literature search was carried out in major electronic databases to find studies on histopathological examination and adjunctive diagnostics in SCD. The selection criteria admitted systematic reviews, meta-analyses, and original forensic investigations reporting microscopic cardiac changes or comparative diagnostic results. Information was collected on study type, histopathological findings, and diagnostic performance. Where possible, data were quantitatively summarized to estimate the pooled diagnostic effect and the comparative accuracy of the different modalities. Results: Eight publications comprising a total of 1,485 cases were examined. Histopathology directly determined the cause of death in 72% of cases and served as supporting evidence in another 18%. The leading microscopic changes were myocardial ischemia/necrosis (41%), interstitial fibrosis (18%), cardiomyopathic remodeling (13%), myocarditis (11%), contraction band necrosis (9%), and vascular abnormalities (8%). Histopathology showed a high pooled sensitivity (88%) and specificity (92%), exceeding the performance of single adjunctive methods such as cardiac biomarkers, postmortem imaging, and molecular analyses. A multimodal diagnostic approach combining histopathology with ancillary techniques yielded the highest overall diagnostic accuracy. Conclusion: Histopathological examination remains essential in the forensic diagnosis of sudden cardiac death, serving as the main confirmatory method that discloses structural and cellular changes of the myocardium. Newer biochemical, imaging, and molecular methods may increase diagnostic certainty, but they are most valuable when used together with traditional microscopy. Standardization of sampling protocols, together with the application of advanced techniques, may improve diagnostic accuracy and reduce the number of unexplained sudden cardiac deaths in forensic medicine. Keywords: Sudden cardiac death, Forensic histopathology, Postmortem diagnosis, Myocardial injury, Molecular autopsy, Multimodal forensic investigation.
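
To make the pooling step concrete, the sketch below computes a naive pooled sensitivity and specificity from per-study 2×2 counts. The counts are hypothetical, chosen only so the pooled values land near the reported 88% and 92%; a formal diagnostic meta-analysis would fit a bivariate random-effects model instead.

```python
# Naive pooled sensitivity/specificity across studies. Per-study
# counts are hypothetical; a formal meta-analysis would use a
# bivariate random-effects model.
def pooled(pairs):
    """pairs: (events, total) per study; returns the pooled proportion."""
    return sum(e for e, _ in pairs) / sum(t for _, t in pairs)

sens_counts = [(88, 100), (170, 190), (45, 52)]    # (TP, diseased)
spec_counts = [(92, 100), (60, 65), (110, 120)]    # (TN, non-diseased)

print(f"pooled sensitivity = {pooled(sens_counts):.1%}")  # ~88.6%
print(f"pooled specificity = {pooled(spec_counts):.1%}")  # ~91.9%
```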

Page No: 417-423 | Full Text

 

Original Research Article

COMPARATIVE STUDY ON WATER, SANITATION AND HYGIENE (WASH) PRACTICES AMONG URBAN AND RURAL HOUSEHOLDS IN THE FIELD PRACTICE AREA OF TERTIARY CARE INSTITUTE

http://dx.doi.org/10.70034/ijmedph.2026.2.70

Kethana Poola, H. Hemachandra, M.K. Sasikala, Sunita. Sreegiri

View Abstract

Background: Safe WASH practices are important for human health and well-being, contribute to healthy nutrition, and help people live in healthy environments. As per WHO, the burden of illness associated with WASH accounts for 4.0% of all deaths and 5.7% of the total disease burden worldwide, and for 7.5% of all mortality in India. Appropriate WASH practices can prevent at least 9.1% of the global disease burden and 6.3% of all deaths by reducing the burden of water-borne diseases and under-nutrition, and can contribute to the economic development of the country. Development of infrastructure and proper regulation will improve WASH practices and reduce the disease burden. This study was conducted to assess and compare water, sanitation and hygiene (WASH) practices among urban and rural households. Materials and Methods: A community-based analytical cross-sectional study was conducted in the urban and rural field practice areas of a tertiary care institute, Tirupati, during July–September 2025. Using multistage random sampling, 170 households were selected. Data were collected on sociodemographic details and WASH practices by the interview method, using the WHO core questions on water, sanitation and hygiene for household surveys. Data were entered in an MS Excel spreadsheet and analysed using R software version 4.5.2. Results: Among the 170 households (85 urban and 85 rural), the main source of drinking water in urban areas was piped water into the dwelling (37.6%), whereas among rural households it was a water kiosk (70.6%). The most common sanitation facility in urban areas was a flush toilet connected to a piped sewer system (46.8%), whereas in rural areas the majority (91.1%) had flush toilets draining to a septic tank. Safe drinking water practices were significantly associated with locality (p = 0.002, Pearson chi-square test). Similarly, more rural households (90.5%) had improved sanitation facilities than urban households (88.2%). Conclusion: A significantly higher proportion of rural households had access to improved drinking water sources and sanitation facilities compared with urban households. Overall, safe drinking water practices were strongly associated with locality, emphasizing the need for context-specific interventions to further enhance WASH standards in both settings. Keywords: Drinking water, Sanitation, Hygiene, WASH, Urban and rural households.

Page No: 424-428 | Full Text

 

Original Research Article

CYTOMORPHOLOGICAL PATTERN ANALYSIS OF VARIOUS LYMPHADENOPATHIES IN A TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.71

Hassan Sona Rai, Bhargavi Mohan, Manjunath HK, Mythri B M, Vinitra K, Gudrun Koul, Suma HV, Lakshmi Devi M

View Abstract

Background: The human body contains approximately 600 lymph nodes. The tonsils, adenoids, spleen and Peyer’s patches are other components of lymphoid tissue, and their role is to provide immunity against various pathogens. Peripheral lymph nodes lie in the subcutaneous tissue and become palpable when any disease process causes them to enlarge. Lymph node enlargement is one of the commonest clinical presentations and encompasses a wide spectrum ranging from inflammation to malignant lymphoma and metastatic malignancy. Fine needle aspiration cytology (FNAC) is a rapid, simple, reliable, minimally invasive and cost-effective procedure that can be performed in an outpatient setting; however, histopathological examination remains the gold standard in the evaluation of lymphadenopathy. The aim of the study was to evaluate the cytomorphological patterns of various diseases causing lymphadenopathy. Materials and Methods: A three-year study of 100 cases of lymphadenopathy presenting to the Department of Pathology from 1st January 2023 to 31st December 2025 was undertaken. FNAC was performed using a 23/24-gauge needle and a 5 ml syringe. Smears were fixed in 95% ethyl alcohol and stained with Hematoxylin and Eosin, Papanicolaou, Leishman and Giemsa stains. Special stains, such as periodic acid–Schiff (PAS) for mucin and Ziehl–Neelsen (ZN) stain for acid-fast bacilli (AFB), were performed wherever required. Results: Over the three-year period, 100 cases were studied. Patients’ ages ranged from 1 to 78 years; 68 were male and 32 female. Cervical lymph nodes were the most common site of involvement. The most common lesion was reactive hyperplasia (25%), followed by granulomatous lymphadenitis (22%), tuberculous lymphadenitis (20%), metastatic deposits (12%), suppurative lymphadenitis (10%), necrotic lesions (5%), lymphoma (5%) and parasitic lymphadenitis (1%). Conclusion: Our study highlights the various cytomorphological patterns of lymphadenopathy and demonstrates that fine needle aspiration is a safe, accurate and valuable tool in their evaluation; in our analysis FNAC achieved an accuracy of 95%, which is valuable given that histopathological diagnosis requires an invasive procedure. Keywords: Lymph node, Lymphadenopathy, Fine needle aspiration cytology, Granulomatous, Malignant.

Page No: 429-433 | Full Text

 

Original Research Article

STUDY OF SERUM TSH AND URIC ACID LEVELS IN PREECLAMPSIA OF PREGNANCY

http://dx.doi.org/10.70034/ijmedph.2026.16.2.72

Vishwajit Shivaji Shinde, Abhay Nagdeote, Sudhir Sase

View Abstract

Background: Preeclampsia, a major hypertensive disorder of pregnancy, is significantly associated with fetal and maternal morbidity. Alterations in thyroid function and purine metabolism have been implicated in its pathophysiology, reflected by changes in serum thyroid-stimulating hormone (TSH) and serum uric acid levels. Objective: To estimate and compare serum TSH and serum uric acid levels in women with pre-eclampsia and in age-matched normal pregnant women as controls, and to determine the correlation between these parameters. Materials and Methods: The study was conducted at a tertiary-level hospital and included 61 pre-eclamptic women and 61 normotensive pregnant controls. Serum TSH and serum uric acid levels were analyzed from venous blood samples collected from the study participants. Urine samples were assessed for proteinuria. Statistical analysis included the Chi-square test, the unpaired t-test, and Pearson’s correlation. Results: Cases and controls were comparable with respect to age (p > 0.05). Serum uric acid levels were significantly higher in pre-eclamptic women than in controls, both in categorical distribution and in mean values (5.48 ± 1.40 mg/dL vs. 4.01 ± 1.00 mg/dL; p < 0.05). Although the categorical distribution of TSH levels did not differ significantly between groups (p > 0.05), mean serum TSH was significantly higher in the pre-eclamptic group (3.11 ± 0.98 µIU/mL vs. 2.50 ± 1.06 µIU/mL; p < 0.05). A statistically significant but mild positive correlation was seen between serum TSH and serum uric acid levels in the pre-eclamptic group (r = 0.30, p < 0.05), while no significant correlation was found in the control group (p > 0.05). Conclusion: Preeclampsia was associated with significantly elevated serum uric acid and mean TSH levels, and with a positive correlation between these parameters. Estimation of serum uric acid and TSH may serve as useful biochemical markers in the evaluation and monitoring of preeclampsia. Keywords: Pregnancy, Preeclampsia, TSH, Uric Acid, Complications in pregnancy, Eclampsia.
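
The unpaired t-tests above can be reproduced directly from the published summary statistics (mean, SD, n = 61 per group), as the sketch below shows; the Pearson correlation (r = 0.30) cannot, since it requires individual-level data.

```python
# Unpaired t-tests reproduced from the summary statistics reported
# in the abstract (mean, SD, n = 61 per group); no raw data needed.
from scipy.stats import ttest_ind_from_stats

# Serum uric acid: 5.48 +/- 1.40 vs 4.01 +/- 1.00 mg/dL
t, p = ttest_ind_from_stats(5.48, 1.40, 61, 4.01, 1.00, 61)
print(f"uric acid: t = {t:.2f}, p = {p:.1e}")   # p << 0.05

# Serum TSH: 3.11 +/- 0.98 vs 2.50 +/- 1.06 uIU/mL
t, p = ttest_ind_from_stats(3.11, 0.98, 61, 2.50, 1.06, 61)
print(f"TSH:       t = {t:.2f}, p = {p:.4f}")   # p < 0.05
```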

Page No: 434-438 | Full Text

 

Original Research Article

OUTCOMES OF LIMBERG FLAP RECONSTRUCTION IN PILONIDAL SINUS DISEASE: A PROSPECTIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.16.2.73

Govardhan M. Gaikwad, Obaid Syed, Ameenuddin Ali Syed, Syed Naureen Nazar

View Abstract

Background: Pilonidal sinus disease (PSD) is a chronic inflammatory condition of the sacrococcygeal region associated with significant morbidity and recurrence. Off-midline closure techniques have demonstrated improved outcomes compared to conventional midline closure. Objective: To evaluate clinical outcomes, postoperative complications, and recurrence following Limberg flap reconstruction in patients with PSD. Methods: This prospective observational study was conducted at a tertiary care center in India from January 2021 to January 2026. A total of 56 adult patients with chronic or recurrent PSD underwent rhomboid excision with Limberg flap reconstruction. Postoperative outcomes, complications, and recurrence were assessed during follow-up. Results: Among 56 patients, 80.4% were male. Surgical site infection occurred in 5.4% of patients, seroma in 3.6%, and wound dehiscence in 1.8%. Transient paraesthesia was observed in 7.1% of patients. No flap necrosis was noted. No recurrence was observed among patients with available follow-up. Conclusion: Limberg flap reconstruction is a safe and effective technique for PSD, associated with low complication rates and favorable short to mid-term outcomes. Keywords: Pilonidal sinus disease; Limberg flap; Rhomboid flap; Off-midline closure; Sacrococcygeal region.

Page No: 439-443 | Full Text

 

Original Research Article

HAND HYGIENE AWARENESS, PRACTICES, AND PERCEIVED BARRIERS AMONG OUTPATIENTS AND THEIR CAREGIVERS: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.16.2.74

Patil Sujata Rajendra, Salve Shobha Bansi, Andrea Almeida

View Abstract

Background: Hand hygiene remains a critical yet underutilized public health intervention for preventing infectious diseases and curbing antimicrobial resistance, especially in low-resource settings. Despite increased awareness during the COVID-19 pandemic, sustained behavioural change in community settings remains limited. This study assesses the attitudes and practices related to hand hygiene among patients in outpatient departments and identifies common barriers across sociodemographic groups. Materials and Methods: A cross-sectional study was conducted among 1109 participants attending OPDs at multiple centres in a tertiary care hospital’s field practice area. Data were collected through face-to-face interviews using a structured, pre-validated, interviewer-administered questionnaire. Convenience sampling was used; descriptive and inferential statistics were applied to assess associations. Results: Most participants were from rural areas (72.1%). Awareness was highest for handwashing after toilet use (91.4%), but lower for after coughing/sneezing (33.6%) and caregiving (38.5%). Barriers included soap unavailability (49%), self-reported lack of awareness (38.4%), time constraints (32.6%), and lack of clean water (30.7%). Urban residents reported significantly higher consistent soap use than rural counterparts (p = 0.023). Positive attitudes were significantly associated with better hand hygiene practices (p < 0.00001). Conclusion: Despite good awareness of key hand hygiene moments, actual practices remain inconsistent, with significant rural-urban disparities and structural barriers. Interventions must go beyond information dissemination to address access, affordability, and behaviour change, especially in rural settings. Keywords: Hand hygiene, knowledge-attitude-practice (KAP), outpatient care, caregivers, barriers, rural health, infection prevention, India.

Page No: 444-449 | Full Text

 

Original Research Article

HPV DNA TESTING AS A PRIMARY TOOL FOR CERVICAL SCREENING IN THE COMMUNITY AND ITS CORRELATION WITH CERVICAL CYTOLOGY

http://dx.doi.org/10.70034/ijmedph.2026.2.75

Ganapuram Sindhuja, Archana Dinesh B, Banda Divya

View Abstract

Background: Cervical cancer remains one of the leading causes of cancer-related morbidity and mortality among women worldwide. Early detection of cervical epithelial abnormalities through screening methods such as the Papanicolaou (Pap) smear and Human Papillomavirus (HPV) DNA testing plays a crucial role in identifying precancerous lesions and preventing disease progression. Objectives: To evaluate the prevalence of cervical cytological abnormalities, determine HPV DNA positivity, and analyze the correlation between HPV DNA detection and Pap smear findings among women undergoing cervical cancer screening. Materials and Methods: This cross-sectional study included 286 women who underwent cervical screening. Data were analyzed with respect to demographic characteristics, parity, menstrual history, presenting complaints, per speculum examination findings, Pap smear results, and HPV DNA detection. Cytological findings were classified according to standard Pap smear reporting, and HPV DNA testing was performed to determine viral positivity. Statistical analysis was conducted to assess the association between cytological abnormalities and HPV DNA positivity. Results: Most participants were 41–45 years old, and multiparity (≥3 deliveries) was common. The most frequent symptom was white discharge per vagina, and 22% had an unhealthy cervix on examination. Pap smear showed normal cytology in 52.4%, while 47.6% had abnormalities, mainly inflammatory changes with few cases of ASCUS, LSIL, and HSIL. HPV DNA positivity strongly correlated with abnormal cytology, with all ASCUS, LSIL, and HSIL cases testing HPV positive. The association between abnormal Pap smear findings and HPV DNA positivity was highly statistically significant (χ² = 221.45, p < 0.0001). Conclusion: The study demonstrates a strong correlation between cervical cytological abnormalities and HPV DNA positivity. Combined screening with Pap smear and HPV DNA testing enhances the detection of premalignant cervical lesions and can improve early diagnosis and prevention strategies for cervical cancer. Keywords: Cervical cancer screening, Pap smear, Human Papillomavirus Deoxyribonucleic Acid (HPV DNA), Atypical Squamous Cells of Undetermined Significance (ASCUS), Low-Grade Squamous Intraepithelial Lesion (LSIL), High-Grade Squamous Intraepithelial Lesion (HSIL).
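The strength of association reported above (χ² = 221.45) comes from a standard chi-square test of independence on the Pap-result-by-HPV-status contingency table. A minimal Python sketch of that computation, using an illustrative 2×2 table rather than the study's actual counts:

from scipy.stats import chi2_contingency

# Rows: Pap smear (abnormal, normal); columns: HPV DNA (positive, negative).
# Counts below are hypothetical placeholders, not data from this study.
table = [[60, 76], [5, 145]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.4g}")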

Page No: 450-454 | Full Text

 

Original Research Article

EVALUATION OF MODIFIED TENSION BAND TECHNIQUE IN THE MANAGEMENT OF PATELLAR FRACTURES

http://dx.doi.org/10.70034/ijmedph.2026.2.76

S Khaderulla Basha, Uma Maheshwar Reddy, T Naveen Babu, Madamanchi Harsha

View Abstract

Background: Patellar fractures are common, constituting about 1% of all skeletal injuries, and result from either direct or indirect trauma. The subcutaneous location of the patella makes it vulnerable to direct trauma, as in dashboard injuries or a fall on the flexed knee, whereas violent contraction of the quadriceps results in indirect fractures. These fractures are usually transverse and are associated with tears of the medial or lateral retinacular expansions. In this study, a series of 30 cases of patellar fracture treated with the modified tension band wiring technique was studied. Material and Methods: This prospective study was done in the Department of Orthopaedics at Sri Balaji Medical College Hospital & Research Institute, Tirupati, in patients enrolled between July 2024 and June 2025. The study comprised 30 cases of patellar fracture treated by modified tension band wiring, selected on the basis of inclusion and exclusion criteria. Inclusion criteria: (1) all closed and type I and type II open displaced transverse patellar fractures; (2) transverse fractures with displacement of more than 2 to 3 mm and an articular step of more than 2 mm; (3) comminuted fractures where reconstruction and fixation by modified tension band wiring is possible. Exclusion criteria: (1) type III compound fractures; (2) grossly comminuted, vertical or marginal fractures; (3) old fractures (more than 2-3 weeks); (4) pathological fractures. Conclusion: Our study shows that modified tension band wiring is a definitive procedure in the management of displaced transverse patellar fractures, with minimal complications, and facilitates early postoperative mobilization. We observed excellent results in 86.6%, good in 10% and poor in 3.3% of cases; 4 of 30 cases had complications. Early postoperative physiotherapy is essential to success in the management of these fractures, helping to reduce complications such as knee stiffness and to restore good function. Long-term follow-up is necessary to assess late complications such as osteoarthritis and late functional outcome. Keywords: Patellar fractures, modified tension band technique, West’s criteria

Page No: 455-458 | Full Text

 

Original Research Article

A STUDY TO COMPARE ONDANSETRON AND DEXAMETHASONE FOR PREVENTION OF INTRAOPERATIVE NAUSEA AND VOMITING DURING CESAREAN DELIVERY UNDER SUBARACHNOID BLOCK

http://dx.doi.org/10.70034/ijmedph.2026.2.77

Sherin Soni, Vignesh Rajan, Urmila Keshari, Anuj Keshari, K Gopiraju

View Abstract

Background: Intraoperative nausea and vomiting (IONV) are frequent and distressing complications during cesarean delivery performed under subarachnoid block. These symptoms can compromise maternal comfort, surgical conditions, and overall patient satisfaction. Effective prophylaxis is therefore essential. Ondansetron and dexamethasone are commonly used antiemetics, but comparative evidence during cesarean delivery under spinal anesthesia remains limited. Materials and Methods: This prospective, randomized, double-blind, placebo-controlled study was conducted on 100 term parturients undergoing elective cesarean delivery under spinal anesthesia. Participants were randomly allocated into three groups: placebo (normal saline), dexamethasone 8 mg IV, or ondansetron 4 mg IV administered 20 minutes prior to spinal anesthesia. The primary outcomes were the incidence and severity of nausea, retching, and vomiting. Secondary outcomes included hemodynamic changes, ephedrine requirement, sedation scores, need for rescue antiemetics, and neonatal APGAR scores. Results: Ondansetron significantly reduced the incidence of nausea (18.2%), retching (9.1%), and vomiting (3.0%) compared with dexamethasone and placebo (p < 0.05). Dexamethasone demonstrated moderate efficacy compared with placebo but was less effective than ondansetron. The severity of nausea and requirement for rescue antiemetics were lowest in the ondansetron group. Hemodynamic parameters, sedation scores, and neonatal APGAR scores at 1 and 5 minutes were comparable across all groups. Conclusion: Ondansetron 4 mg IV is superior to dexamethasone 8 mg IV and placebo for preventing intraoperative nausea and vomiting during cesarean delivery under spinal anesthesia. Dexamethasone offers moderate benefit and may serve as an alternative or adjunct. Both agents are safe for maternal and neonatal outcomes. Keywords: Intraoperative Nausea and Vomiting, Ondansetron, Dexamethasone, Cesarean Section, Spinal Anesthesia.

Page No: 459-462 | Full Text

 

Original Research Article

HEARING LOSS IN CHRONIC KIDNEY DISEASE PATIENTS ON DIALYSIS: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.78

Jubina Puthen Purayil, Kalpana RajivKumar, Swapnil Gosavi, Ketki Pimpalkhute

View Abstract

Background: Chronic kidney disease (CKD) is a progressive systemic disorder that affects multiple organs, including the auditory system. Structural and physiological similarities between the cochlea and the kidney may predispose patients with renal dysfunction to hearing impairment. Objectives: The present study aimed to determine the prevalence and severity of hearing loss among CKD patients undergoing dialysis and to evaluate its association with serum creatinine levels and duration of dialysis. Materials and Methods: A cross-sectional observational study was conducted among 50 patients diagnosed with CKD and receiving maintenance dialysis. Audiological assessment was performed and hearing loss was categorized as none, mild, moderate, or severe. Serum creatinine levels and duration of dialysis were recorded for each patient. Statistical analysis included descriptive statistics, Chi-square test for association, and Pearson correlation to determine the relationship between creatinine levels and severity of hearing loss. A p-value of <0.05 was considered statistically significant. Results: Out of 50 patients, 29 (58%) demonstrated hearing loss. Moderate to severe hearing impairment was observed in 21 patients (42%). A statistically significant association was identified between duration of dialysis and severity of hearing loss (p = 0.03). Serum creatinine levels showed a moderate positive correlation with hearing loss severity (r = 0.62), indicating that higher creatinine levels were associated with greater auditory impairment. Conclusion: Hearing loss is a common complication among CKD patients undergoing dialysis. Both elevated serum creatinine levels and longer duration of dialysis are associated with increased severity of hearing impairment. Incorporating routine audiological screening into CKD management may facilitate early detection and timely intervention. Keywords: Chronic kidney disease, Sensorineural hearing loss, Dialysis, Creatinine, Audiometry.

Page No: 463-466 | Full Text

 

Original Research Article

EMERGING TRENDS IN URINARY CANDIDIASIS: SPECIES SPECTRUM AND AZOLE RESISTANCE IN A CROSS-SECTIONAL STUDY AT TERTIARY CARE HOSPITAL, TAMILNADU

http://dx.doi.org/10.70034/ijmedph.2026.2.79

Ravikumar S, Ravichandran B, Sathiya M, Aarthy S

View Abstract

Background: Candidiasis is the most common cause of fungal infection, ranging from muco-cutaneous infections to life-threatening invasive disease. Opportunistic infections by Candida species, often with antifungal resistance, are becoming common in hospitals today, and Candida species account for almost 10-15% of nosocomial UTIs. Candida albicans was the commonest infecting species for many years, but indiscriminate use of the azole group of drugs has led to an increase in Non-albicans Candida infections and to antifungal resistance in Candida species. Aim: To determine the isolation pattern, species distribution and antifungal susceptibility of Candida species in urine samples, particularly their azole resistance pattern, in the Department of Microbiology, Chengalpattu Medical College and Hospital. Materials and Methods: Between August 2022 and July 2023, 71 Candida isolates were obtained from a total of 644 urine samples, speciated by various phenotypic methods, and subjected to antifungal susceptibility testing under standard mycological procedures. Results: A predominance of Non-albicans Candida species was evident, with C. tropicalis (34%) the predominant species, followed by C. albicans (23%). Fluconazole resistance was found in 37% of isolates, a high figure attributable to the prevalence of Non-albicans Candida species. Conclusion: This study highlights the predominance of Non-albicans Candida (NAC) species as an important cause of urinary tract infections. NAC species are more resistant to antifungal drugs than Candida albicans, owing to a higher prevalence of azole resistance. Species identification of Candida isolates along with their antifungal susceptibility pattern can therefore help the clinician treat patients with candiduria more effectively. Appropriate management with antifungal agents can significantly decrease the length of stay of hospitalized patients, who may already be immunocompromised, thereby reducing the chance of acquiring nosocomial infections and lowering healthcare costs. Keywords: Susceptibility, candidiasis, antifungal resistance, immunocompromised.

Page No: 467-472 | Full Text

 

Original Research Article

MORPHOMETRIC ANALYSIS AND CONGENITAL ANOMALIES OF KIDNEY AND URETER IN ADULT CADAVERS: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.80

Vanajakshi Bothsa, K Vijayalakshmi

View Abstract

Background: Measurement of morphometric parameters of the kidney and ureter plays a crucial role in clinical practice, as existing variations can influence clinical diagnosis and surgical procedures. Materials and Methods: A descriptive cross-sectional study was conducted in the Department of Anatomy at a tertiary medical institute over a period of five years, from February 2020 to March 2025. A total of 50 pairs of adult cadaveric kidneys and ureters were examined. Morphometric parameters including renal length, width, thickness, weight, and ureteric length were measured using vernier callipers and a digital weighing machine. Statistical analysis was performed using IBM SPSS version 27. Results: The left kidney consistently demonstrated greater length, width, thickness, and weight compared to the right kidney (p < 0.0001). No statistically significant difference was observed in ureteric length between sides (p = 0.949). Pearson’s correlation analysis showed a weak positive correlation between renal length and ureteric length on both sides, which was not statistically significant. Congenital anomalies were observed in 14% of cadavers. Conclusion: The study provides baseline morphometric data of kidneys and ureters in adult cadavers and highlights the presence of congenital anomalies that could have clinical significance in surgical and radiological practice. Keywords: Kidney morphometry, Cadaveric study, Congenital renal anomalies, ureter morphometry.

Page No: 473-478 | Full Text

 

Case Report

ACUTE BASAL GANGLIA HEMORRHAGE AS THE INITIAL PRESENTATION OF EXTRA-ADRENAL PARAGANGLIOMA IN AN ADOLESCENT: A CASE REPORT

http://dx.doi.org/10.70034/ijmedph.2026.2.81

Vivek Kumar Tripathi, Mahim Mittal

View Abstract

Background: Intracerebral hemorrhage (ICH) in adolescents is uncommon and often indicates an underlying secondary etiology. Among these, catecholamine-secreting tumors such as paragangliomas are rare but potentially life-threatening causes of severe hypertension and vascular complications. Case Presentation: We report the case of a 15-year-old male who presented with sudden onset left-sided hemiparesis and slurred speech. On admission, he was found to have severe hypertension (200/110 mmHg). Neuroimaging revealed a right basal ganglia hemorrhage. Further evaluation uncovered a history of episodic headache, palpitations, sweating, and abdominal pain suggestive of catecholamine excess. Abdominal imaging demonstrated a well-defined enhancing mass in the aortocaval region adjacent to the right adrenal gland. Biochemical analysis showed significantly elevated urinary metanephrine levels, confirming the diagnosis of an extra-adrenal paraganglioma. Management and Outcome: The patient was managed conservatively for intracerebral hemorrhage with blood pressure control using intravenous labetalol and osmotherapy. Rehabilitation therapy was initiated, and the patient was subsequently referred for definitive surgical management of the tumor. Conclusion: This case highlights the importance of considering paraganglioma as a rare but critical cause of hypertensive intracerebral hemorrhage in adolescents. Early recognition through clinical suspicion, biochemical testing, and imaging is essential to prevent morbidity and mortality. Keywords: Intracerebral hemorrhage; basal ganglia; paraganglioma; adolescent hypertension; catecholamine excess; stroke in young.

Page No: 479-483 | Full Text

 

Original Research Article

A STUDY OF ASSOCIATION OF CLINICAL SEVERITY OF ACUTE ISCHEMIC STROKE WITH SERUM CALCIUM LEVELS AT THE TIME OF ADMISSION: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.82

Nadgatti Vinayak Umesh, Rahul Arya, Mukesh Kumar

View Abstract

Background: Acute ischemic stroke is a leading cause of mortality and disability worldwide. Dysregulation of calcium homeostasis is a key mechanism in ischemic injury. This study aimed to investigate whether serum calcium levels at the time of hospital admission correlate with the clinical severity of acute ischemic stroke as assessed by the National Institutes of Health Stroke Scale (NIHSS). Materials and Methods: A cross-sectional observational study was conducted at T.S. Misra Medical College & Hospital, Lucknow, enrolling 50 consecutive patients diagnosed with acute ischemic stroke by CT imaging, presenting within 24 hours of symptom onset. Serum total calcium, albumin-corrected calcium, and ionized calcium were measured at admission. Stroke severity was assessed using NIHSS at admission, day 4, and discharge. Patients were classified as mild (NIHSS 1–4), moderate (NIHSS 5–15), or severe (NIHSS 16–42). Infarct size was categorized on CT imaging. Statistical correlation and group comparisons were performed across severity strata and discharge status. Results: The mean age was 61.66 ± 13.88 years, with a male predominance of 70%. Mean serum total calcium was 9.09 ± 0.90 mg/dl; corrected calcium was 9.28 ± 0.98 mg/dl; and ionized calcium was 1.15 ± 0.12 mmol/L. At admission, 68% had moderate strokes. A statistically significant inverse relationship was found between total serum calcium and stroke severity (p = 0.035), with mild stroke patients having the highest calcium (9.45 ± 0.61 mg/dl) and severely affected patients the lowest (8.80 ± 0.85 mg/dl). Ionized calcium also correlated significantly with stroke severity (p = 0.044). Patients who deteriorated clinically had significantly lower total calcium (7.84 ± 0.58 mg/dl) compared to those who improved (9.44 ± 0.64 mg/dl; p < 0.001). The mean NIHSS improved from 7.16 at admission to 5.82 at discharge. No significant association was found between ionized calcium and infarct size (p = 0.565). Conclusion: Lower admission serum calcium was significantly associated with greater stroke severity and poorer short-term outcomes. Serum total calcium may serve as a simple, cost-effective, and widely available prognostic biomarker in acute ischemic stroke. Larger prospective studies are needed. Keywords: Acute ischemic stroke, serum calcium, corrected calcium, NIHSS score, stroke severity, infarct size, prognostic biomarker.
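For readers unfamiliar with the derived quantities used above, both are simple computations. A minimal Python sketch, assuming the widely used Payne formula for albumin-corrected calcium (the abstract does not state which correction formula was applied) and the NIHSS severity bands quoted in the abstract; all names are illustrative:

def corrected_calcium(total_ca_mg_dl, albumin_g_dl):
    # Payne formula (an assumption): add 0.8 mg/dL for each 1 g/dL of albumin below 4.0 g/dL
    return total_ca_mg_dl + 0.8 * (4.0 - albumin_g_dl)

def nihss_severity(nihss):
    # Severity bands as defined in the abstract
    if 1 <= nihss <= 4:
        return "mild"
    if 5 <= nihss <= 15:
        return "moderate"
    if 16 <= nihss <= 42:
        return "severe"
    raise ValueError("NIHSS outside the 1-42 range used here")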

Page No: 484-489 | Full Text

 

Original Research Article

CLINICAL CHARACTERISTICS AND OUTCOMES OF ROAD TRAFFIC INJURY PATIENTS AT A TERTIARY CARE HOSPITAL

http://dx.doi.org/10.70034/ijmedph.2026.2.83

Chirag Dineshbhai Amin, Dhruvesh Laljibhai Katara, Meet Pravinbhai Prajapati, Ashish Chaudhary, Bhargav Valangar

View Abstract

Background: Road traffic injuries are a major cause of morbidity and mortality worldwide, particularly affecting young adults. Objective: To assess the characteristics and outcomes of non-polytraumatized road traffic injury patients presenting to the emergency department. Materials and Methods: A prospective observational study was conducted on 150 patients with road traffic injuries. Demographic details, injury characteristics, and outcomes were recorded and analyzed. Results: Males constituted 68.0% of cases, with peak incidence in the 31–40 years age group (22.7%). Commercial vehicles were the most common mode of transportation involved. Head injuries (30.7%) and fractures (20.0%) were the most frequent injury types. Higher admission rates were observed in truck-related injuries (80.0%), while mortality was higher in motorcycle-related cases (12.0%). Conclusion: Road traffic injuries predominantly affect young adult males and vary in outcome depending on vehicle type and injury pattern. Early intervention and preventive strategies are essential to improve outcomes. Keywords: Road traffic injury, Emergency department, Trauma, Outcome analysis.

Page No: 490-493 | Full Text

 

Original Research Article

ASSESSMENT OF CLINICORADIOLOGICAL PATTERN AND ETIOLOGICAL FACTORS AMONG PATIENTS WITH BRONCHIECTASIS

http://dx.doi.org/10.70034/ijmedph.2026.16.2.84

Jatin Arya, Aashutosh Asati, P. K. Baghel, Pramod Kushwaha, Sudeept Kumar Dwivedi

View Abstract

Background: Bronchiectasis is a chronic suppurative lung disease characterized by irreversible bronchial dilatation, recurrent infections, and progressive respiratory impairment. High-resolution computed tomography (HRCT) has emerged as the gold standard for diagnosis, enabling detailed assessment of disease extent and patterns. Identifying clinicoradiological correlations and etiological factors is crucial for targeted management and prognostication. Objectives: To assess the clinical presentation, radiological patterns on HRCT chest, and etiological factors among patients with bronchiectasis, and to evaluate their association with disease severity. Materials and Methods: This observational cross-sectional study was conducted over a period of one year at Shyam Shah Medical College and Sanjay Gandhi Memorial Hospital, Rewa, Madhya Pradesh, and included 100 diagnosed cases of bronchiectasis. Detailed clinical evaluation, sputum microbiology, pulmonary function testing, and HRCT chest were performed. Radiological patterns (cylindrical, varicose, cystic, mixed), lobar distribution, and associated findings were analyzed. Etiology was categorized as post-infective, tuberculosis-related, idiopathic, or congenital. Results: The mean age of patients was 60.05 ± 11.64 years, with a female predominance (55%). The most common symptoms were chronic cough (92%), expectoration (52%), and recurrent hemoptysis (27%). HRCT revealed varicose bronchiectasis as the predominant pattern (34%), followed by mixed (26%) and cylindrical and cystic (20% each) types; varicose bronchiectasis was the most common HRCT pattern in both smokers and non-smokers. On the FACED score, 45% of patients scored 0–2, 28% scored 3–4, and 27% scored 5–7, the last representing severe disease with poorer outcomes. Post-tubercular bronchiectasis was the leading etiology (30%), followed by COPD-associated (28%), idiopathic (25%) and ABPA (17%). Patients with cystic and mixed patterns showed significantly lower FEV1 values and a higher frequency of Pseudomonas aeruginosa isolation (p < 0.01). There was a significant negative correlation between oxygen saturation (SpO₂) and FACED score across all severity categories. Conclusion: Bronchiectasis in this cohort predominantly affected middle-aged and elderly patients and commonly presented with chronic productive cough. Varicose bronchiectasis was the most frequent HRCT pattern, and post-infective and post-tubercular etiologies remain the leading causes. HRCT patterns correlate significantly with clinical severity and microbiological profile, emphasizing the role of HRCT in comprehensive disease assessment and management. Keywords: Bronchiectasis; High-Resolution Computed Tomography; Clinicoradiological Correlation; Etiological Factors; Post-infective Lung Disease.
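The FACED bands cited above follow the published FACED system, which sums five weighted items. A hedged Python sketch of that scoring, assuming the originally published cut-offs (FEV1 <50% predicted = 2, age >70 years = 2, chronic Pseudomonas aeruginosa colonization = 1, more than 2 lobes involved = 1, mMRC dyspnoea grade III–IV = 1); the function and argument names are illustrative, not taken from the study:

def faced_score(fev1_pct, age_years, pseudomonas, lobes_affected, mmrc_grade):
    # Each component contributes its published weight; total ranges 0-7
    score = 0
    score += 2 if fev1_pct < 50 else 0        # F: FEV1 % predicted
    score += 2 if age_years > 70 else 0       # A: age
    score += 1 if pseudomonas else 0          # C: chronic P. aeruginosa colonization
    score += 1 if lobes_affected > 2 else 0   # E: radiological extension (lobes)
    score += 1 if mmrc_grade >= 3 else 0      # D: mMRC dyspnoea grade
    return score  # 0-2 mild, 3-4 moderate, 5-7 severe, as banded in the abstract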

Page No: 494-500 | Full Text

 

Original Research Article

A STUDY ON CAN (CLINICAL ASSESSMENT OF NUTRITION) SCORE AND ITS CORRELATION WITH GESTATIONAL AGE

http://dx.doi.org/10.70034/ijmedph.2026.2.85

Suma Priya M, Sandupatla Kiran Kumar, Pogula Arun Reddy, B. Venkatachalam, Mettu Pradeep Reddy

View Abstract

Background: The primary objective of the study was to assess the nutritional status of newborn babies using the clinical assessment of nutrition score (CAN SCORE) within 48 hours after birth and categorise them as nourished or malnourished. The secondary objective was to find the association of the CAN SCORE with neonatal clinical parameters. Materials and Methods: A comparative, hospital-based cross-sectional observational study was done on 200 newborn babies delivered at Mallareddy Medical College for Women and Hospital. The study was done over 1 year, from December 2020 to December 2021. The newborn parameters used to assess nutritional status were the CAN SCORE and anthropometric measurements. Analysis used the Statistical Package for the Social Sciences (SPSS), and a p-value of < 0.05 was considered significant. Results: Based on the CAN SCORE, 127 (63.5%) babies had fetal malnutrition (CANSCORE < 25) and 73 (36.5%) were well nourished (CANSCORE > 25). The association of the CAN SCORE with different parameters was then evaluated: 70% of late pre-term and 65% of term babies were malnourished, a statistically significant association. Whereas the CAN SCORE identified 63.5% of the newborns as malnourished, weight for gestational age identified 38% as SGA and 62% as AGA, with a p-value of 0.017. The association between neonatal BMI and CAN SCORE was statistically significant, with a p-value of 0.016 (< 0.05), as was the association between Ponderal Index and CAN SCORE, with a p-value of 0.008 (< 0.05). Conclusion: The CANSCORE has a statistically significant association with gestational age, birth weight according to gestational age, neonatal BMI, ponderal index and maternal anaemia status. Keywords: Nutritional Score, Malnourished, New Born babies.
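Two of the anthropometric indices compared against the CAN SCORE above are direct formulas. A minimal Python sketch, assuming the conventional Rohrer formulation of the ponderal index and using the CANSCORE cut-off of 25 stated in the abstract; function names are illustrative:

def ponderal_index(weight_g, length_cm):
    # Rohrer's index, the usual neonatal formulation: g x 100 / cm^3
    return weight_g * 100 / length_cm ** 3

def neonatal_bmi(weight_kg, length_m):
    # Weight (kg) divided by length (m) squared
    return weight_kg / length_m ** 2

def nutrition_status(canscore):
    # Cut-off used in the abstract: CANSCORE below 25 indicates fetal malnutrition
    return "malnourished" if canscore < 25 else "well nourished"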

Page No: 501-506 | Full Text

 

Original Research Article

IMMUNOHISTOCHEMICAL EVALUATION OF P63 EXPRESSION IN THE GRADING OF UROTHELIAL NEOPLASMS

http://dx.doi.org/10.70034/ijmedph.2026.2.86

Suganthi P, Dhivya M, J. Maheswari

View Abstract

Background: Urothelial carcinoma (UC) is the most common malignancy of the urinary bladder and demonstrates wide morphological and biological variability. Although tumor grading is essential for prognosis and treatment decisions, interobserver variability remains a problem. It has been suggested that p63, a nuclear transcription factor involved in epithelial development, might be used as an additional marker to grade urothelial neoplasms. The aim is to evaluate the immunohistochemical expression of p63 in varying grades of urothelial neoplasms and to assess its correlation with histopathological grade and clinicopathological parameters. Materials and Methods: This laboratory-based prospective and retrospective observational study included 50 cases of primary urothelial carcinoma diagnosed between July 2015 and June 2019 at the Department of Pathology, Dhanalakshmi Srinivasan Medical College and Hospital. Formalin-fixed paraffin-embedded tissue sections were subjected to immunohistochemical staining using mouse monoclonal antibody against p63 (clone 4A4). Nuclear staining in more than 10% of tumor cells was considered increased expression, while less than 10% was considered decreased expression. Statistical analysis was performed using SPSS version 21.0, and the Chi-square test was applied with p < 0.05 considered significant. Results: High-grade urothelial carcinoma constituted 62% of cases, while 38% were low grade. Increased p63 expression was predominantly observed in low-grade tumors (64.3%), whereas decreased expression was mainly associated with high-grade tumors (95.5%). A statistically significant inverse correlation was found between p63 expression and tumor grade (χ² = 18.662, p < 0.001). No significant association was observed between p63 expression and age or tumor stage. Conclusion: p63 expression shows a significant inverse correlation with histological grade of urothelial carcinoma. Decreased expression is associated with high-grade tumors, suggesting its utility as an adjunct immunohistochemical marker in grading urothelial neoplasms. Larger studies with long-term follow-up are recommended to further establish its prognostic significance. Keywords: Urothelial carcinoma; p63; Immunohistochemistry; Histological grading; Bladder cancer; Tumor differentiation; WHO/ISUP classification; Muscle invasion; Prognostic marker.

Page No: 507-513 | Full Text

 

Original Research Article

COMPARISON OF INSERTION CHARACTERISTICS OF LMA PROSEAL AND AMBU AURAGAIN IN ADULT PATIENTS UNDER CONTROLLED VENTILATION: A RANDOMISED STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.87

Jairam Panicker, Mathews. P. Oommen

View Abstract

Background: Supraglottic airway devices are widely used for airway management in patients undergoing surgery under general anaesthesia. The LMA ProSeal and Ambu AuraGain are second-generation supraglottic airway devices designed to provide improved airway seal and facilitate gastric access. Although both devices are commonly used in clinical practice, their insertion characteristics and performance under controlled ventilation require comparative evaluation. The objective is to compare the insertion characteristics of LMA ProSeal and Ambu AuraGain in adult patients undergoing surgery under general anaesthesia with controlled ventilation, with respect to ease of insertion, number of attempts, time taken for successful insertion, oropharyngeal leak pressure, and perioperative complications. Materials and Methods: This prospective, randomized study was conducted over a period of 12 months in a tertiary care hospital. A total of 100 adult patients, aged 18–60 years and belonging to ASA physical status I and II, scheduled for elective surgeries under general anaesthesia with controlled ventilation were enrolled. Patients were randomly allocated into two groups of 50 each: Group P (LMA ProSeal) and Group A (Ambu AuraGain). The primary outcomes assessed were number of insertion attempts, time taken for successful device insertion, and ease of insertion. Secondary outcomes included oropharyngeal leak pressure, adequacy of ventilation, and incidence of perioperative complications such as sore throat, blood staining of the device, and airway trauma. Results: Both devices were successfully inserted in the majority of patients. The Ambu AuraGain demonstrated a higher first-attempt success rate and shorter insertion time compared to the LMA ProSeal. Oropharyngeal leak pressure was comparable between the two groups. The incidence of minor complications was low and did not differ significantly between the groups. Conclusion: Both LMA ProSeal and Ambu AuraGain are effective and safe supraglottic airway devices for use in adult patients under controlled ventilation. However, Ambu AuraGain appears to offer advantages in terms of ease and speed of insertion, with comparable airway seal and complication profile. Keywords: LMA ProSeal, Ambu AuraGain, supraglottic airway device, controlled ventilation, randomized study, airway management.

Page No: 514-519 | Full Text

 

Original Research Article

COMPARISON OF MEDIAN AND PARAMEDIAN TECHNIQUE OF THORACIC EPIDURAL ANAESTHESIA IN PATIENTS UNDERGOING LAPAROTOMY UNDER COMBINED GENERAL AND EPIDURAL ANAESTHESIA: A PROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.88

Mathews. P. Oommen, Jairam Panicker

View Abstract

Background: Thoracic epidural anaesthesia, when combined with general anaesthesia, is widely used for perioperative analgesia in patients undergoing laparotomy. The two commonly employed approaches for epidural space identification are the median and paramedian techniques. Although both techniques are routinely practiced, they differ in anatomical approach, technical ease, and potential complication profile. Comparative data on their performance in the thoracic region, particularly in the context of combined general and epidural anaesthesia, remain limited. The objective is to compare the median and paramedian techniques of thoracic epidural catheter placement in patients undergoing laparotomy with respect to technical success, number of attempts, ease of catheter placement, procedure-related complications, and quality of perioperative analgesia. Materials and Methods: This prospective observational study included 120 adult patients scheduled for elective laparotomy under combined general and thoracic epidural anaesthesia. Patients underwent thoracic epidural catheter placement using either the median or paramedian approach based on the attending anaesthesiologist’s routine practice. Data collected included demographic variables, number of attempts required for successful epidural placement, time taken for the procedure, incidence of complications such as vascular puncture, dural puncture, and paraesthesia, and intraoperative as well as postoperative analgesic efficacy. Postoperative pain scores and requirement of rescue analgesia were recorded to assess the quality of analgesia. The two groups were compared using appropriate statistical tests. Results: Both techniques were found to be effective for thoracic epidural placement. The paramedian approach was associated with a higher first-attempt success rate and fewer needle redirections, whereas the median approach required fewer anatomical landmarks to be negotiated. Procedure-related complications were infrequent in both groups. Postoperative analgesia, as assessed by pain scores and rescue analgesic requirements, was comparable between the two techniques. Conclusion: Both median and paramedian approaches for thoracic epidural anaesthesia are safe and effective in patients undergoing laparotomy under combined general and epidural anaesthesia. The paramedian approach may offer technical advantages in terms of ease of placement and first-attempt success, while analgesic outcomes remain comparable between the two techniques. Keywords: Thoracic epidural anaesthesia, Median approach, Paramedian approach, Laparotomy, Combined general and epidural anaesthesia, Postoperative analgesia.

Page No: 520-526 | Full Text

 

Original Research Article

COMPARISON OF DEXAMETHASONE AND KETOROLAC AS AN ADJUNCT TO 0.5% (H) LEVOBUPIVACAINE IN ULTRASOUND AND PNS GUIDED AXILLARY BRACHIAL PLEXUS BLOCK IN PATIENTS UNDERGOING HAND AND FOREARM SURGERY

http://dx.doi.org/10.70034/ijmedph.2026.2.89

Shraddha More, Priya Ragini Prasad, Charu Neema

View Abstract

Background: Regional anesthesia has become an essential component of modern anesthetic practice, particularly in upper limb surgeries where it provides excellent analgesia, muscle relaxation, reduced opioid consumption, and early mobilization with fewer adverse effects compared to general anesthesia. Among the various regional anesthesia techniques, the axillary brachial plexus block is widely used for hand and forearm surgeries due to its safety, effectiveness, and ease of administration. The objective is to compare dexamethasone and ketorolac as an adjunct to 0.5% hyperbaric levobupivacaine in ultrasound and peripheral nerve stimulator guided axillary brachial plexus block in patients undergoing hand and forearm surgery. Materials and Methods: This prospective randomized comparative study was conducted in patients undergoing hand and forearm surgeries under axillary brachial plexus block. Patients were divided into two groups receiving levobupivacaine with dexamethasone or levobupivacaine with ketorolac. Sensory block, motor block, duration of analgesia, hemodynamic parameters, and rescue analgesia were recorded and analyzed statistically. Results: The onset of sensory and motor block was faster in the dexamethasone group compared to the ketorolac group. The duration of sensory and motor block was significantly prolonged in the dexamethasone group. Duration of postoperative analgesia was longer in patients who received dexamethasone as an adjuvant compared to ketorolac. Hemodynamic parameters remained stable in both groups throughout the study period. The number of rescue analgesic doses required was lower in the dexamethasone group compared to the ketorolac group. No significant adverse effects were observed in either group. Overall, dexamethasone provided better block characteristics and longer postoperative analgesia compared to ketorolac. Conclusion: Dexamethasone and ketorolac are effective adjuvants when added to levobupivacaine for axillary brachial plexus block. However, dexamethasone provides faster onset of block, longer duration of sensory and motor block, and prolonged postoperative analgesia compared to ketorolac. The addition of dexamethasone to levobupivacaine improves the quality of block and reduces postoperative analgesic requirement. Keywords: Axillary Brachial Plexus Block, Levobupivacaine, Dexamethasone, Ketorolac

Page No: 527-534 | Full Text

 

Original Research Article

ASSESSMENT OF COMPLIANCE WITH THE CIGARETTES AND OTHER TOBACCO PRODUCTS ACT (COTPA), 2003 IN PUBLIC PLACES OF SOUTH ANDAMAN DISTRICT, INDIA

http://dx.doi.org/10.70034/ijmedph.2026.2.90

Amrita Burma, N Haney Nickyson, Deepak Kumar, Anagha J

View Abstract

Background: The Cigarettes and Other Tobacco Products Act (COTPA), 2003 is a key legislative tool in India to curb tobacco use. However, its enforcement remains inconsistent. This study assessed the compliance of COTPA sections 4, 5, 6, 7, and 8 in public places and evaluated awareness among tobacco vendors in South Andaman district. Materials and Methods: A cross-sectional observational study was conducted over two months (September–October). Using two-stage cluster sampling, 106 public places (accommodation facilities, educational institutions, health facilities, public transport, shops) across three regions (North, Central, South) of South Andaman were observed. Additionally, 30 tobacco vendors were interviewed using a structured questionnaire. Data were analyzed using descriptive statistics and chi-square tests. Results: Only 23.3% of vendors were aware of COTPA. Overall compliance was poor: smoking in public places (Section 4) was observed in 41.5% of sites; 93.4% lacked mandatory signboards; 43.4% sold tobacco to minors (Section 6a); 27.4% sold within 100 yards of educational institutions (Section 6b). While 89.6% of tobacco packets displayed pictorial health warnings (Section 7), 93.4% sold loose cigarettes, violating the ban. Compliance with specified font and color for warnings (Section 8) was 83%. Regional variations existed, with poorer compliance in rural southern and northern areas for sales to minors, and urban central areas for advertising and packaging violations. Conclusion: COTPA compliance in South Andaman is inadequate, reflecting poor awareness and weak enforcement. Strengthened inter-sectoral coordination, regular monitoring, stricter penalties, and targeted public awareness campaigns are urgently needed. Keywords: COTPA, tobacco control, compliance, public places, Andaman and Nicobar Islands, tobacco vendors, enforcement.

Page No: 535-542 | Full Text

 

Original Research Article

A STUDY OF ROLE OF PLATELET TO LYMPHOCYTE RATIO [PLR] AND ITS CORRELATION WITH NATIONAL INSTITUTE OF HEALTH STROKE SCALE [NIHSS] FOR PREDICTION OF SEVERITY IN PATIENTS OF ACUTE ISCHEMIC STROKE

http://dx.doi.org/10.70034/ijmedph.2026.2.91

Srinivasa J, Umesh G Rajoor, Mohd Naveed Khan

View Abstract

Background: Despite advances in clinical management, there is no robust prognostic marker in acute stroke. The Platelet-to-Lymphocyte Ratio (PLR) is considered an important marker for predicting stroke severity and its outcomes. This study aims to evaluate the correlation between PLR and the National Institutes of Health Stroke Scale (NIHSS) for predicting stroke severity and prognosis. Materials and Methods: This prospective observational study was conducted at Koppal Institute of Medical Sciences, Koppal, over one year (May 2022 to April 2023). Fifty patients with acute ischemic stroke, presenting within seven days of symptom onset, were enrolled. Detailed clinical and laboratory data, including platelet and lymphocyte counts, were collected at admission and at day 3/time of discharge. Stroke severity was assessed using the NIHSS at both time points. Statistical analysis was performed using SPSS version 21, with Pearson's correlation coefficient used to analyze the relationship between PLR and NIHSS. Results: The study found a significant positive correlation between PLR and NIHSS at both admission (r = 0.874, p < 0.001) and day 3/discharge (r = 0.907, p < 0.001). Changes in PLR from admission to discharge were also strongly correlated with changes in NIHSS (r = 0.938, p < 0.001). These findings suggest that higher PLR values are associated with greater stroke severity and that changes in PLR reflect changes in stroke severity. Conclusion: PLR is a simple inflammatory marker for predicting stroke severity and aids prognostication in patients with acute ischemic stroke. The significant correlations of PLR with NIHSS at both admission and day 3/discharge underscore the potential of PLR to enhance clinical assessment and guide therapeutic decisions. Further research is required to validate these findings in larger, multi-centre cohorts and to elucidate the mechanisms underlying the relationship between PLR and stroke outcomes. Keywords: Acute ischemic stroke, Platelet to Lymphocyte Ratio, National Institute of Health Stroke Scale, Stroke severity.
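The PLR itself is a simple ratio of two values from a routine complete blood count, and the r values reported above come from Pearson correlation. A minimal Python sketch with hypothetical per-patient numbers, not the study's data:

from scipy.stats import pearsonr

def plr(platelets_per_ul, lymphocytes_per_ul):
    # Platelet-to-lymphocyte ratio from absolute counts on the same sample
    return platelets_per_ul / lymphocytes_per_ul

# Illustrative values only
admission_plr = [plr(310_000, 1_800), plr(420_000, 1_200), plr(250_000, 2_100)]
admission_nihss = [6, 14, 4]
r, p = pearsonr(admission_plr, admission_nihss)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")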

Page No: 543-547 | Full Text

 

Original Research Article

PATTERN OF LYMPH NODE LESIONS CLASSIFIED BY THE SYDNEY SYSTEM: A TERTIARY CARE CENTER EXPERIENCE IN SOUTHERN TELANGANA

http://dx.doi.org/10.70034/ijmedph.2026.2.92

Pavuluri Divya, Siva Chaithanya Bangi, Anusha Priyadarshani, Rama Devi

View Abstract

Background: Fine needle aspiration cytology (FNAC) is a rapid, minimally invasive and cost-effective diagnostic modality for the evaluation of lymphadenopathy. Historically, the absence of a standardized reporting system resulted in variability in interpretation and communication. The Sydney Reporting System was introduced at the 20th International Congress of Cytology held in Sydney in 2019 to provide a uniform framework for lymph node FNAC reporting. Materials and Methods: This prospective study was conducted in the Department of Pathology, Government Medical College, Wanaparthy, from 2023 to 2025. All lymph node FNAC cases received during the study period were reviewed and categorized according to the Sydney Reporting System. Demographic details, lymph node site, and cytological diagnosis were analyzed. Results: A total of 90 lymph node FNAC cases were evaluated. Benign lesions constituted the majority, with reactive lymphadenitis being the most common diagnosis, followed by granulomatous and suppurative lymphadenitis. Malignant lesions formed a smaller proportion of cases. Cervical lymph nodes were the most frequently involved site. Conclusion: The Sydney Reporting System is a practical and reproducible framework for standardized lymph node FNAC reporting. Its application improves diagnostic clarity and facilitates effective clinician–pathologist communication, even in FNAC-only settings. Keywords: FNAC, Lymph node, Sydney system, Cytology.

Page No: 548-550 | Full Text

 

Systematic Review

CLINICAL PROFILE AND MANAGEMENT OUTCOMES OF EUGLYCEMIC DIABETIC KETOACIDOSIS: A SYSTEMATIC REVIEW

http://dx.doi.org/10.70034/ijmedph.2026.2.93

Habiba Mohammed Usman Dudhmal

View Abstract

Background: Euglycemic diabetic ketoacidosis (EDKA) is a variant of diabetic ketoacidosis marked by metabolic acidosis and ketosis despite normal or only mildly elevated blood glucose levels. In contrast to classical diabetic ketoacidosis (DKA), the lack of significant hyperglycemia can obscure recognition and delay timely management. The growing use of sodium–glucose cotransporter-2 (SGLT2) inhibitors, together with triggers such as infection, prolonged fasting, pregnancy, and missed insulin doses, has led to an increasing number of reported cases. Therefore, a systematic review is needed to integrate current evidence and strengthen clinical recognition and understanding of this emerging condition. The objective is to comprehensively evaluate the demographic and clinical characteristics, precipitating factors, management strategies, and outcomes—including recovery, complications, intensive care requirement, and mortality—of patients diagnosed with euglycemic diabetic ketoacidosis. Materials and Methods: This study was conducted as a systematic review in accordance with PRISMA guidelines. A comprehensive literature search was performed in PubMed/MEDLINE, Scopus, Web of Science, Embase, and the Cochrane Library using relevant keywords related to euglycemic diabetic ketoacidosis. Observational studies, case series, randomized controlled trials, case reports, review articles, and meta-analyses were included. Titles, abstracts, and full-text articles were independently screened according to predefined eligibility criteria. Results: Euglycemic diabetic ketoacidosis affects all age groups and is increasingly linked to SGLT2 inhibitor use. It presents with milder acidosis and lower glucose levels, which may delay diagnosis. Triggers include reduced intake, infection, pregnancy, insulin omission, surgical stress, low-carbohydrate diets, and sodium–glucose cotransporter-2 (SGLT2) inhibitors. Outcomes are comparable to classical DKA with appropriate treatment, though hypoglycemia during therapy is more frequent. Early recognition and prompt management are essential to reduce complications. Management included fluid resuscitation, electrolyte correction, insulin infusion, and dextrose supplementation. Euglycemic DKA required a shorter insulin duration but carried a higher hypoglycemia risk. Mortality was under 5% with standardized treatment. Conclusion: Euglycemic DKA is a serious diabetic emergency that may be overlooked due to near-normal glucose levels and is increasingly associated with SGLT2 inhibitor use, particularly in perioperative settings. Early recognition, patient education, routine ketone monitoring, and preventive strategies such as the STICH and STOP DKA protocols are essential to reduce risk. Prompt management and structured follow-up remain key to preventing recurrence and improving outcomes. Keywords: Euglycemic diabetic ketoacidosis, SGLT2 inhibitors, Diabetic ketoacidosis, Type 1 diabetes mellitus, Insulin therapy, Clinical profile, Management outcomes.

Page No: 551-556 | Full Text

 

Original Research Article

A POSTMORTEM CHRONICLE OF HANGING DEATHS IN A DECADAL SOUTHERN INDIAN STUDY (2011-2020)

http://dx.doi.org/10.70034/ijmedph.2026.2.94

Shodhan Rao Pejavar, Prashantha Bhagavath, Tanush Shetty, Rashmi R Aithal, Rishab Shetty, Narasimha Pai D, Arun Pinchu Xavier, Francis Nanda Prakash Monteiro

View Abstract

Hanging is one of the common methods employed for suicide worldwide, accounting for around 55.6% of all suicides globally, with an estimated 700,000 deaths annually. The incidence is particularly high in Asia, where hanging is responsible for nearly 60% of suicides, influenced by sociocultural, economic, and legal factors. This retrospective descriptive study aims to analyse the demographic, circumstantial and forensic characteristics of hanging deaths autopsied at a tertiary medical institution in southern India over ten years (2011–2020). A total of 169 confirmed cases of hanging deaths were examined using archived post-mortem reports, police inquest documents, and victim profiles. Data on demographics, scene findings, external and internal autopsy findings, and inferred reasons for suicide were collected and analysed descriptively using SPSS. Victims ranged in age from 11 to 63 years, predominantly affecting those aged 20–49 years (74.56%). Males accounted for 63.91% of cases. Most hangings occurred at home (86.98%) and involved complete suspension (75.74%). Rope was the most common ligature material (66.86%). External autopsy findings showed typical ligature mark characteristics, including atypical knot positioning (85.21%), running knots (73.96%), and oblique ligature marks (95.86%). Internal examination revealed sternocleidomastoid muscle haemorrhages in 78.70% of cases and hyoid bone fractures in 17.16%. The leading inferred reasons for suicide were relationship or marital conflicts (43.20%), mental illness (13.02%), and academic or professional pressure (12.43%). This study provides comprehensive forensic and demographic insights into hanging deaths in southern India, highlighting the predominance of young adults and males, the home setting, and relationship conflicts as major contributing factors. The findings underscore the need for culturally sensitive suicide prevention strategies and improved mental health services in the region. Keywords: Hanging, Ligature Mark, Mental Illness, Suicide.

Page No: 557-564 | Full Text

 

Original Research Article

IMPACT OF DISEASE DURATION, EXTENT, AND INFLAMMATORY SEVERITY ON COLORECTAL CANCER RISK IN INFLAMMATORY BOWEL DISEASE: A SYSTEMATIC REVIEW

http://dx.doi.org/10.70034/ijmedph.2026.2.95

Rani Vijayan

View Abstract

Background: Inflammatory Bowel Disease (IBD), including Ulcerative Colitis and Crohn’s Disease, is associated with an increased risk of Colorectal Cancer. The risk is believed to be influenced by several disease-related factors, particularly duration of illness, anatomical extent of colonic involvement, and severity of chronic inflammation. While multiple observational studies have explored these associations, the magnitude and consistency of these risk factors remain variably reported. A comprehensive systematic review is therefore warranted to synthesize current evidence and clarify the relative contribution of these factors to colorectal cancer risk in IBD patients. Objectives: The objective of this systematic review is to evaluate the impact of disease duration, extent of colonic involvement, and inflammatory severity on the risk of colorectal cancer in patients with inflammatory bowel disease (IBD), and to identify high-risk patient subgroups while synthesizing available evidence to inform surveillance strategies and risk stratification. Materials and Methods: This systematic review was conducted in accordance with the PRISMA guidelines. A comprehensive literature search was performed in electronic databases including PubMed, Scopus, and Web of Science. Observational studies (cohort and case–control studies), randomized controlled trials, review articles, and meta-analyses evaluating the association between disease duration, disease extent, inflammatory severity, and the risk of colorectal cancer in patients with inflammatory bowel disease were included. Titles, abstracts, and full texts were screened according to predefined eligibility criteria. Data extraction included the impact of disease duration, disease extent, and measures of inflammatory burden. The methodological quality of the included studies was assessed using appropriate risk-of-bias tools, and a qualitative synthesis was performed. Results: Longer disease duration, extensive colitis, and persistent or severe inflammation were associated with an increased risk of colorectal cancer (CRC) in patients with inflammatory bowel disease. The risk increases approximately 8–10 years after diagnosis and rises with longer disease duration, particularly in pancolitis. Additional risk factors include primary sclerosing cholangitis, early disease onset, and family history of colorectal cancer, while surveillance colonoscopy improves dysplasia detection. Conclusion: The risk of colorectal cancer (CRC) in ulcerative colitis increases with longer disease duration, extensive colitis, severe inflammation, dysplasia, and primary sclerosing cholangitis. Surveillance colonoscopy, including advanced techniques such as chromoendoscopy, improves early detection and may reduce CRC risk. Further research is needed to better understand disease mechanisms and develop improved biomarkers and preventive strategies. Keywords: Inflammatory Bowel Disease (IBD), Ulcerative Colitis, Crohn’s Disease, Colorectal Neoplasia, Chronic Inflammation, Disease Duration, Pancolitis, Surveillance Colonoscopy.

Page No: 565-571 | Full Text

 

Original Research Article

PREDICTORS OF OUTCOME OF NON INVASIVE VENTILATION IN ACUTE EXACERBATION OF CHRONIC OBSTRUCTIVE PULMONARY DISEASE: A PROSPECTIVE COHORT STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.96

Nini Navakumar, Jayaprakash B, Rajan D

View Abstract

Background: Non-invasive ventilation (NIV) is a proven modality for managing acute exacerbations of chronic obstructive pulmonary disease (COPD) with respiratory failure. It reduces morbidity, hospital stay, and mortality. However, failure of NIV and delayed intubation can worsen outcomes. The objective is to identify clinical and biochemical predictors of NIV outcome in patients with acute exacerbation of COPD (AECOPD) admitted with respiratory failure. Materials and Methods: This prospective cohort study was conducted in the Intensive Respiratory Care Unit, Department of Respiratory Medicine, Government Medical College, Trivandrum, from January 2017 to July 2018. Eighty-five AECOPD patients with pH 7.2-7.35 and/or PaCO₂ ≥45 mmHg treated with NIV were included. ABG parameters were analyzed at 0, 2, 4, 24, and 48 hours. Patients showing clinical and ABG improvement at 48 hours were categorized as successes; those requiring intubation or dying were classified as failures. Results: NIV success was seen in 65 patients (76.4%). Poor Glasgow Coma Scale (GCS <8), serum bilirubin >1.2 mg/dL at admission, pH ≤7.3, and PaCO₂ ≥75 mmHg at 4 hours were independent predictors of NIV failure. Multivariate logistic regression identified pH ≤7.3 at 4 hours (OR 12.67; 95% CI 2.6-60.6; p=0.001), serum bilirubin >1.2 mg/dL (OR 8.3; 95% CI 1.2-54.7; p=0.02), GCS <8 (OR 4.46; 95% CI 1.0-19.5; p=0.04), and PaCO₂ ≥75 mmHg at 4 hours (OR 4.14; 95% CI 1.06-16.21; p=0.04) as significant predictors. Conclusion: Early reassessment of ABG after 4 hours of NIV initiation is essential. Persistently low pH, high PaCO₂, poor GCS, and elevated bilirubin levels predict NIV failure and can guide early intubation decisions. Keywords: ABG parameters, Bilirubin, pH, Hypercapnia, GCS, NIV failure predictors, Respiratory failure, Acute exacerbation, COPD.
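The unadjusted form of the odds ratios quoted above can be illustrated with the usual 2×2 cross-product computation (the study's multivariate adjustment aside). A minimal Python sketch with hypothetical counts, not the study's data:

def odds_ratio(a, b, c, d):
    # 2x2 table: a = predictor present & NIV failed, b = present & NIV succeeded,
    #            c = predictor absent & NIV failed, d = absent & NIV succeeded
    return (a * d) / (b * c)

# Illustrative counts only
print(odds_ratio(12, 8, 8, 57))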

Page No: 572-578 | Full Text

 

Original Research Article

EFFECT OF INTRAOPERATIVE LOCAL INFILTRATION ANESTHESIA ON POSTOPERATIVE PAIN FOLLOWING TOTAL KNEE REPLACEMENT: A COMPARATIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.97

Y. Hari Prasad Reddy, P. Devanatha Reddy

View Abstract

Background: Total knee replacement is a commonly performed procedure for end-stage knee osteoarthritis, providing significant pain relief and functional improvement. However, postoperative pain remains a major concern, often affecting early mobilization, rehabilitation, and overall patient satisfaction. Effective pain control strategies are therefore essential. Intraoperative local infiltration anesthesia has emerged as a simple and effective technique aimed at reducing postoperative pain and opioid requirements. The objective is to evaluate the effect of intraoperative local infiltration anesthesia on postoperative pain and functional recovery in patients undergoing total knee replacement. Materials and Methods: This comparative study included patients undergoing total knee replacement, divided into two groups based on the use of intraoperative local infiltration anesthesia. One group received local infiltration anesthesia during surgery, while the control group underwent the standard surgical procedure without infiltration. Postoperative pain was assessed using a standardized pain scoring system at predefined intervals. Secondary outcomes included time to mobilization, duration of hospital stay, and requirement of rescue analgesics. Patients were followed up during the immediate postoperative period for assessment of outcomes. Results: Patients who received intraoperative local infiltration anesthesia demonstrated significantly lower postoperative pain scores in the early postoperative period compared to the control group. The requirement for rescue analgesics was reduced in the infiltration group. Early mobilization was achieved more effectively, and patients in this group showed improved comfort during physiotherapy sessions. Duration of hospital stay was also shorter in patients receiving local infiltration anesthesia. No significant increase in complications was observed. Conclusion: Intraoperative local infiltration anesthesia is an effective and safe technique for reducing postoperative pain following total knee replacement. It facilitates early mobilization, decreases analgesic requirements, and improves overall patient recovery. Incorporation of this technique into standard perioperative pain management protocols can enhance postoperative outcomes. Keywords: Total knee replacement, Local infiltration anesthesia, Postoperative pain, Analgesia, Early mobilization, Orthopedic surgery

Page No: 579-584 | Full Text

 

Original Research Article

COMPARATIVE STUDY BETWEEN INTERLOCKING NAILING AND DYNAMIC COMPRESSION PLATING IN HUMERAL DIAPHYSEAL FRACTURES WITH RESPECT TO FUNCTIONAL AND SURGICAL OUTCOMES

http://dx.doi.org/10.70034/ijmedph.2026.2.98

P. Devanatha Reddy, Y. Hari Prasad Reddy

View Abstract

Background: Fractures of the humeral diaphysis are commonly encountered in orthopedic practice and can be managed by both operative and non-operative methods. Among surgical options, dynamic compression plating and interlocking intramedullary nailing are widely used techniques. While plating provides stable fixation with direct fracture visualization, interlocking nailing offers a minimally invasive approach with preservation of soft tissue biology. However, there remains ongoing debate regarding the superiority of either technique in terms of functional recovery, union rates, and complication profile. The objective is to compare the functional and surgical outcomes of interlocking nailing and dynamic compression plating in the management of humeral diaphyseal fractures. Materials and Methods: This comparative study included patients with humeral shaft fractures managed surgically using either interlocking intramedullary nailing or dynamic compression plating. Patients were divided into two groups based on the surgical technique employed. Functional outcomes were assessed using standard scoring systems, while surgical outcomes were evaluated based on duration of surgery, intraoperative blood loss, time to union, and complications. Patients were followed up at regular intervals for clinical and radiological assessment. Results: Both treatment modalities achieved satisfactory fracture union. Interlocking nailing demonstrated advantages in terms of shorter operative time and reduced intraoperative blood loss. However, dynamic compression plating showed superior functional outcomes, particularly with respect to shoulder function and range of motion. Time to union was comparable between the two groups, although a slightly earlier union trend was observed in the plating group. Complication rates varied between the groups, with shoulder stiffness being more common in the nailing group and infection-related complications observed more frequently in the plating group. Conclusion: Both interlocking nailing and dynamic compression plating are effective methods for the management of humeral diaphyseal fractures. While interlocking nailing offers advantages in surgical parameters, dynamic compression plating provides better functional outcomes, especially in terms of shoulder mobility. The choice of procedure should be individualized based on fracture characteristics, patient factors, and surgeon expertise. Keywords: Humerus shaft fracture, Interlocking nailing, Dynamic compression plating, Functional outcome, Surgical outcome, Fracture union, Orthopedic trauma.

Page No: 585-590 | Full Text

 

Original Research Article

MRI PROTOCOL FOR EPILEPSY IMAGING IN THE DEVELOPING WORLD: A PRACTICAL PROPOSAL

http://dx.doi.org/10.70034/ijmedph.2026.2.99

Nilesh Kumar Sinha, Gaurav Raj, Rudra Prasad Ghosh, Shubhlaxmi Srivastava

View Abstract

Background: Epilepsy disproportionately affects low- and middle-income countries (LMICs), where infectious etiologies such as neurocysticercosis and tuberculomas are common. Standard MRI protocols are primarily optimized for non-infectious causes and may reduce diagnostic yield in these settings. Purpose: To evaluate a modified MRI epilepsy protocol incorporating post-contrast T1-weighted imaging and susceptibility-weighted imaging (SWI) for improved lesion detection in LMICs. Materials and Methods: In this retrospective study, 102 patients with epilepsy underwent 3T MRI using a protocol including 3D T1, T2, FLAIR, DWI, SWI, and post-contrast T1 sequences. Lesion types and sequence-wise detection were analyzed, and stepwise rank analysis assessed cumulative diagnostic yield. Results: Inflammatory granulomas were the most common lesions (39.2%). Post-contrast T1-weighted imaging showed the highest standalone detection (45.1%), particularly for infectious and neoplastic lesions. FLAIR and T2 sequences were essential for gliosis, demyelination, and mesial temporal sclerosis, while SWI was critical for detecting calcified granulomas and cavernomas. Combined use of T1+C and FLAIR detected 73.5% of lesions, increasing to 100% with additional sequences. Conclusion: Inclusion of post-contrast T1 and SWI significantly improves detection of epileptogenic lesions in LMICs. A region-specific MRI protocol enhances diagnostic yield and may improve epilepsy management in resource-limited settings. Keywords: MRI, epilepsy, granuloma, neurocysticercosis, tuberculoma, epilepsy surgery, epilepsy protocol.

Page No: 591-594 | Full Text

 

Original Research Article

CLINICOPATHOLOGICAL STUDY OF SALIVARY GLAND LESIONS WITH REFERENCE TO HISTOLOGICAL TYPES, AGE, SEX AND SITE DISTRIBUTION

http://dx.doi.org/10.70034/ijmedph.2026.2.100

Urmi Chakravarty Vartak, Shital Mahure, Shailesh Vartak, Shailaja Puri

View Abstract

Background: Salivary gland lesions comprise a diverse group of pathological conditions ranging from non-neoplastic inflammatory processes to benign and malignant neoplasms. Due to their varied histological patterns and clinical presentations, accurate diagnosis remains a challenge. The aim is to study the clinicopathological spectrum of salivary gland lesions with reference to histological types, age, sex, and site distribution. Materials and Methods: This hospital-based observational study included 79 cases of salivary gland lesions diagnosed over a defined study period at a tertiary care center. Clinical details such as age, sex, and site were recorded. Specimens were processed using standard histopathological techniques, and lesions were classified according to WHO guidelines. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: The mean age of patients was 38.9 ± 16.4 years, with the highest incidence in the 41–60 years age group (43.0%). There was no significant gender predilection, with males (50.6%) and females (49.4%) being almost equally affected. The parotid gland (62.0%) was the most commonly involved site. Benign lesions constituted the majority (49.4%), followed by non-neoplastic (39.2%) and malignant lesions (11.4%). Pleomorphic adenoma (44.3%) was the most common benign tumor, while mucoepidermoid carcinoma (10.1%) was the most frequent malignant lesion. Non-neoplastic lesions such as sialadenitis and abscess were also commonly observed. Conclusion: Salivary gland lesions show a wide clinicopathological spectrum with a predominance of benign tumors, particularly pleomorphic adenoma. The parotid gland is the most commonly affected site, and lesions are most frequent in middle-aged individuals. Histopathological examination remains essential for accurate diagnosis and appropriate management. Keywords: Salivary gland lesions, Pleomorphic adenoma, Mucoepidermoid carcinoma.

Page No: 595-601 | Full Text

 

Original Research Article

ANTHROPOMETRY AND HAEMATOLOGICAL PROFILE OF BABIES BORN TO MOTHERS WITH PRE-ECLAMPSIA VERSUS NORMAL MOTHERS

http://dx.doi.org/10.70034/ijmedph.2026.2.101

Aswathy Sunil, Adheena Mini Augustine, Bini Mariam Chandy

View Abstract

Background: Neonatal survival depends on a wide range of factors, of which birth weight, head circumference, and hematological parameters at birth are key determinants. To tackle the high rates of neonatal deaths related to low birth weight, prematurity, and related conditions, it is important to understand the contributing factors, of which maternal hypertension is one of the common causes. The objective is to compare the hematological profile and anthropometric values of neonates born to mothers with preeclampsia and neonates born to healthy mothers. Materials and Methods: This was a cross-sectional study comprising 76 newborns who met the inclusion criteria. Neonates were divided equally into those born to preeclamptic mothers and those born to healthy mothers, with 38 in each group. Immediately after birth, 2 mL of blood was collected from the umbilical cord into an EDTA-anticoagulated vacutainer, and the following parameters were studied: hemoglobin, total count, differential count, platelet count, and the red cell indices (MCV, MCH, MCHC). The proforma was filled in and all values were collected and entered systematically. All categorical data were analysed with the Chi-square test. Results: On comparing newborns born to mothers with preeclampsia with those born to normal mothers, the former group had higher hemoglobin and hematocrit values and lower leukocyte and platelet counts than the latter group. MCV, MCH, and MCHC were similar in both groups. Conclusion: Leukopenia and thrombocytopenia were found at a higher rate in newborns of hypertensive mothers. Newborns of hypertensive mothers carry a risk for complications, primarily infection and bleeding. Keywords: Preeclampsia, Neonatal Hematological Profile, Anthropometric Measurements in Newborns, Birth Weight and Head Circumference, Maternal Hypertension, Neonatal Leukopenia.

Page No: 602-605 | Full Text

 

Original Research Article

A STUDY OF TISSUE DOPPLER IMAGING FOR ASSESSMENT OF LEFT VENTRICULAR DYSFUNCTION IN CORONARY ARTERY DISEASE

http://dx.doi.org/10.70034/ijmedph.2026.2.102

Bilal Khan, Prashant Udgire

View Abstract

Background: Coronary artery disease (CAD) remains the leading cause of morbidity and mortality worldwide. Early detection of left ventricular (LV) systolic and diastolic dysfunction is crucial for prognostic stratification and therapeutic decision-making. Conventional Doppler echocardiography assesses LV function using parameters such as transmitral E/A ratio and left ventricular ejection fraction (LVEF). However, Tissue Doppler Imaging (TDI) provides direct quantitative assessment of myocardial velocities and may detect subclinical dysfunction earlier. The objective is to determine left ventricular mitral annular systolic (Sa) and early diastolic (Ea) velocities using TDI and compare them with conventional Doppler echocardiographic parameters in patients with CAD. Materials and Methods: This hospital-based observational study included 60 patients (39 males, 21 females) with newly or previously diagnosed CAD admitted to Khaja Banda Nawaz Teaching and General Hospital, Kalaburagi. All subjects underwent standard two-dimensional echocardiography, conventional pulsed-wave Doppler, and pulsed-wave TDI using a GE LOGIQ F8 system. Mitral annular velocities were recorded from the lateral annulus in the apical four-chamber view. Peak systolic (Sa), early diastolic (Ea), and late diastolic (Aa) velocities were measured and compared with conventional parameters including LVEF and transmitral E/A ratio. Statistical significance was defined as p < 0.05. Results: TDI-derived Ea detected diastolic dysfunction in 75% of CAD patients compared to 60% detected by conventional E/A ratio (p < 0.001). For systolic dysfunction, conventional LVEF identified 83.3% of cases, whereas Sa velocity detected 61.7%. Ea showed superior sensitivity in identifying diastolic dysfunction, while LVEF remained a stronger predictor of systolic dysfunction. Conclusion: TDI-derived Ea is a more sensitive parameter for detecting LV diastolic dysfunction in CAD patients compared to conventional Doppler E/A ratio. However, LVEF remains superior to Sa velocity in identifying systolic dysfunction. TDI parameters (Ea and Sa) provide valuable complementary diagnostic and prognostic information in CAD patients. Keywords: Coronary artery disease; Tissue Doppler Imaging; Left ventricular dysfunction; Mitral annular velocity; Ejection fraction.

Page No: 606-611 | Full Text

 

Original Research Article

CLINICAL PROFILE, EPIDEMIOLOGY, AND OUTCOME OF PATIENTS WITH CHLORPYRIFOS COMPOUND POISONING ADMITTED IN KIMS TEACHING HOSPITAL, KOPPAL: A PROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.103

Bhagath Kumar V, Umesh G Rajoor, Tanveer Ahmed T M, Gavisiddesh V Ronad

View Abstract

Background: Chlorpyrifos is a chlorinated organophosphate compound commonly implicated in pesticide poisoning in India. This study aimed to evaluate the clinical profile, epidemiology, and outcomes of patients with chlorpyrifos poisoning admitted to KIMS Teaching Hospital, Koppal. Materials and Methods: This prospective observational study was conducted at Koppal Institute of Medical Sciences from August 2024 to July 2025. Fifty-four patients with confirmed chlorpyrifos poisoning were enrolled. Demographic data, clinical features, laboratory parameters, treatment details, and outcomes were analyzed using descriptive statistics, the chi-square test, and the independent t-test. Results: The mean age was 32.10±9.95 years with male predominance (79.6%). The majority were suicidal ingestions (92.6%), most commonly among farmers (63.0%). Common clinical features included altered sensorium (48.1%), sweating (40.7%), and excess salivation (27.8%). The mortality rate was 33.3% among treated patients. Significant predictors of mortality included low GCS (p<0.001), high POP score (p<0.001), low SpO2 (p<0.001), mechanical ventilation requirement (p=0.002), respiratory distress (p=0.004), altered sensorium (p=0.002), and intermediate syndrome (p=0.008). Conclusion: Chlorpyrifos poisoning carries significant mortality. Early identification of high-risk patients using GCS and POP scoring, with prompt intensive care, may improve outcomes. Keywords: Chlorpyrifos; Organophosphate poisoning; Intermediate syndrome; Pseudocholinesterase; Mortality.

Page No: 612-616 | Full Text

 

Original Research Article

PROFILE OF SUPPURATIVE CORNEAL ULCER

http://dx.doi.org/10.70034/ijmedph.2026.2.104

Anil Kumar Verma, Anushree Gupta

View Abstract

Background: The purpose was to analyze the prevalence of bacterial and fungal corneal ulcers among diagnosed cases of suppurative corneal ulcer and to compare the role of direct microscopy and culture in the etiological diagnosis of suppurative corneal ulcer. Materials and Methods: A total of 31 patients with clinically diagnosed suppurative corneal ulcer, of all ages and either sex, who presented during a period of one year were studied. A detailed history and ocular examination were performed for each patient. All corneal ulcers were scraped for direct microscopy and culture. Results: In this study, 19 cases (61.3%) were culture positive. Pure bacterial growth was observed in 15 cases (48.4%) and pure fungal growth in 3 cases (9.7%); in one case, both bacterial and fungal growth were observed. The most common bacterial isolate detected was Staphylococcus aureus, and the most common fungal isolate was Acremonium species. Using culture as the gold standard, the sensitivity and specificity of the KOH wet mount were 25% and 92.6%, respectively, and those of the Gram-stained smear were 33.3% and 95.5%, respectively. Conclusion: We conclude that suppurative corneal ulcer is a major cause of preventable monocular blindness; educational strategies can reduce avoidable risks such as trauma, but treatment protocols are required to manage established disease. Keywords: Corneal ulcer, hypopyon, bacterial, fungal, stromal, conjunctival, microscopy, culture.

Page No: 617-621 | Full Text

 

Case Report

HYPONATREMIC HYPERTENSIVE SYNDROME WITH RENAL ISCHEMIA SECONDARY TO TAKAYASU VASCULITIS: A RARE PEDIATRIC CASE

http://dx.doi.org/10.70034/ijmedph.2026.2.105

Himanshi Aggarwal, Prasun Bhattacharje, Aditya Bhattacharjee

View Abstract

Takayasu arteritis is a chronic inflammatory large-vessel vasculitis predominantly affecting the aorta and its major branches, commonly seen in young females and frequently involving the renal arteries in pediatric patients from India and the Far East. Hyponatremic hypertensive syndrome is a rare but important clinical entity characterized by severe hypertension, hyponatremia, and polyuria, typically driven by unilateral renal ischemia with excessive renin secretion and pressure natriuresis from the contralateral kidney; early recognition is essential to prevent complications. This report describes a 4-year-old girl admitted to the pediatric intensive care unit with multiple afebrile convulsions and a history of polyuria and polydipsia, found to have absent bilateral brachial and radial pulses, higher lower-limb than upper-limb blood pressure, and grade 1 hypertensive retinopathy. Laboratory evaluation demonstrated persistent hyponatremia with polyuria and normal serum creatinine. Brain MRI showed multiple hypodensities, while CT angiography revealed multiple stenotic lesions of the right and left subclavian arteries and left vertebral artery, along with significant right renal artery stenosis and a small right kidney; echocardiography demonstrated left ventricular hypertrophy consistent with chronic hypertension. These findings supported a diagnosis of Takayasu arteritis complicated by renal artery stenosis and hyponatremic hypertensive syndrome, likely mediated by RAAS activation, contralateral pressure diuresis, and ADH-related volume effects exacerbating hyponatremia. Treatment with corticosteroids and methotrexate led to clinical improvement with normalization of blood pressure and serum sodium; however, repeat CT angiography showed progression of renal artery stenosis, prompting planning for renal angioplasty. The case underscores that refractory stage 2 hypertension with persistent hyponatremia in children should prompt evaluation for renal artery stenosis and vasculitis, with careful pulse examination and early imaging-based diagnosis to enable timely immunosuppression and, when needed, vascular intervention to reduce life-threatening complications and long-term morbidity. Keywords: Hyponatremic hypertensive syndrome (HHS), Takayasu arteritis (TA), renal artery stenosis, pediatric hypertension, EULAR.

Page No: 622-624 | Full Text

 

Original Research Article

CLINICAL PROFILE, ETIOLOGY, COMORBIDITIES, AND TREATMENT PATTERNS AMONG PATIENTS WITH CHRONIC KIDNEY DISEASE ATTENDING A TERTIARY CARE HOSPITAL IN ODISHA: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.106

Ramakanta Panda, Lopamudra Das, Shakti Prakash Mishra

View Abstract

Background: Chronic kidney disease (CKD) represents an increasing public health and clinical challenge in India, driven largely by the rising burden of non-communicable diseases such as diabetes and hypertension. Delayed diagnosis and limited access to renal replacement therapy contribute to significant morbidity, mortality, and financial hardship. The objective is to assess the clinical profile, etiology, comorbidities, treatment status, and direct treatment-related costs among CKD patients attending a tertiary care hospital in Odisha. Materials and Methods: A hospital-based cross-sectional study was conducted among 180 adult CKD patients attending the nephrology outpatient department and dialysis unit of PGIMER & Capital Hospital, Bhubaneswar, Odisha. Data on sociodemographic characteristics, CKD etiology, comorbidities, and treatment modalities were collected using a semi-structured schedule and verified through medical records. Descriptive statistics and Chi-square tests were used for analysis. Institutional ethical approval and informed consent were obtained. Results: The mean age of patients was 54.1 ± 13.8 years, with males comprising 72.2%. Diabetic nephropathy (40.0%) and hypertensive nephropathy (27.2%) were the leading etiologies. Hypertension (61.7%) and type 2 diabetes mellitus (48.9%) were the most common comorbidities. Advanced CKD (Stages 4–5) constituted 61.1% of the study population. Hemodialysis was the predominant treatment modality (35.0%), while peritoneal dialysis (3.3%) and renal transplantation (3.3%) were less frequently utilized. A majority of dialysis services were accessed from private hospitals. Median monthly expenditure for dialysis was ₹10,500, in addition to transportation and medication costs. Utilization of financial protection schemes was partial. The study highlights a substantial burden of CKD among individuals in their productive years, with diabetes and hypertension as key etiological and comorbid factors. Limited public sector dialysis capacity and reliance on private services contribute to financial stress among patients and families. Conclusion: Strengthening early CKD detection in primary care, improving NCD management, expanding renal replacement capacity, and enhancing financial protection mechanisms are essential to reduce the clinical and economic burden of CKD in resource-constrained settings. Keywords: Chronic kidney disease, Diabetes mellitus, Hypertension, Comorbidity, Hemodialysis, Renal replacement therapy, Renal transplantation, Financial burden.

Page No: 625-630 | Full Text

 

Original Research Article

TO COMPARE THE EFFECTIVENESS OF DAILY VERSUS ALTERNATE DAY ORAL IRON THERAPY FOR OBSTETRIC PATIENTS WITH IRON DEFICIENCY ANAEMIA: RANDOMIZED CONTROLLED TRIAL

http://dx.doi.org/10.70034/ijmedph.2026.2.107

Kanika Bansal, Chandana C, Sushma R

View Abstract

Background: Iron deficiency anaemia remains a major public health concern among obstetric patients, with significant maternal and fetal consequences. Conventional daily oral iron therapy is often limited by poor compliance due to gastrointestinal side effects. Alternate-day dosing has been proposed to improve iron absorption and tolerability. This study aimed to compare the effectiveness of daily versus alternate-day oral iron therapy in obstetric patients with iron deficiency anaemia. Materials and Methods: This randomized controlled trial was conducted on 46 obstetric patients with haemoglobin <11 g/dL, allocated into two groups: daily (n=23) and alternate-day (n=23) oral iron therapy (100 mg elemental iron). Patients were followed for 8 weeks. Haemoglobin, haematological indices, serum ferritin, gastrointestinal side effects, and compliance were assessed. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: Both groups showed significant improvement in haemoglobin levels from baseline to 8 weeks (p < 0.001). At 8 weeks, haemoglobin was higher in the alternate-day group (11.16 ± 1.15 g/dL) compared to the daily group (10.73 ± 1.07 g/dL), though not statistically significant (p > 0.05). Serum ferritin improved in both groups without significant intergroup difference. Gastrointestinal side effects, particularly epigastric pain and constipation, were significantly higher in the daily group (p < 0.05). Compliance was significantly better in the alternate-day group (85.15% vs 72.49%, p = 0.0006). Conclusion: Alternate-day oral iron therapy is as effective as daily therapy in improving haemoglobin and iron stores, with better tolerability and compliance, making it a suitable alternative in obstetric patients. Keywords: Iron deficiency anaemia, pregnancy, oral iron therapy, alternate-day dosing, haemoglobin, compliance, randomized controlled trial

Page No: 631-636 | Full Text

 

Original Research Article

MICROBIAL FLORA OF HEALTH CARE WORKERS' ACCESSORIES - A POTENTIAL SOURCE OF HEALTHCARE-ASSOCIATED INFECTIONS

http://dx.doi.org/10.70034/ijmedph.2026.2.108

K. Snehaa, Sukriti Singh, Vineeta, Shwetha V R, Ruby Thomas

View Abstract

Background: Healthcare-associated infections (HCAIs) pose a significant risk to hospitalized patients, with healthcare workers (HCWs) and their accessories often acting as potential sources of transmission. This study aimed to assess the microbial contamination of common HCWs' accessories, such as mobile phones, stethoscopes, pens, and finger rings, to better understand their role in spreading infections and antimicrobial resistance. Materials and Methods: This cross-sectional study was conducted from September to October 2023 at Andaman and Nicobar Islands Institute of Medical Sciences, Sri Vijayapuram. A total of 63 HCWs (52 doctors and 11 nurses) participated. Samples were collected from mobile phones, stethoscopes, finger rings, and pens using sterile cotton-tipped swabs. The samples were processed for microbial growth per standard bacteriological protocols, followed by antimicrobial susceptibility testing using the Kirby-Bauer disc diffusion method. Results: Among the 193 samples collected, 76.1% were contaminated with pathogens, with mobile phones (84.1%) showing the highest contamination rate, followed by finger rings (79.2%), stethoscopes (74.4%), and pens (68.3%). The predominant microorganisms isolated were Coagulase Negative Staphylococcus (CONS) (55.1%) and Staphylococcus aureus (42.1%). A significant proportion of these pathogens were resistant to common antibiotics, with 48.38% of S. aureus being methicillin-resistant (MRSA) and 44.44% of CONS being methicillin-resistant (CONS-MR). Conclusion: This study highlights the high prevalence of bacterial contamination of HCWs' accessories, particularly mobile phones. It underscores the need for routine disinfection of these items and improved hand hygiene practices to prevent the transmission of harmful pathogens in hospital settings. The findings also suggest the potential role of these accessories in the spread of antimicrobial-resistant organisms, warranting stricter infection control protocols in healthcare settings. Keywords: Healthcare-associated infections, healthcare workers, microbial contamination, mobile phones, stethoscopes, finger rings, pens, antimicrobial resistance

Page No: 637-643 | Full Text

 

Original Research Article

EXPLORING THE LINK BETWEEN GLYCEMIC STATUS AND PARATHYROID HORMONE IN TYPE 2 DIABETES

http://dx.doi.org/10.70034/ijmedph.2026.2.109

Zeba Tanveer, Katta Vani, B Aradhana, Geetha Meenakshi

View Abstract

Background: Parathyroid hormone (PTH) is an 84-amino-acid hormone secreted by parathyroid cells. It plays a vital role in mineral and bone metabolism by promoting bone resorption, inhibiting urinary calcium (Ca) loss, and accelerating vitamin D activation.[1] High blood glucose non-enzymatically glycates haemoglobin at numerous sites, and because the red cell lifespan is about 120 days, this property of Hb is used to track the average amount of glucose in the blood and measures a patient's glycaemic state over the past 3 months.[2] Objectives: 1. To measure HbA1c levels in both cases and controls. 2. To measure PTH levels in cases and controls. 3. To assess the association between HbA1c and PTH among cases and controls. Materials and Methods: A case–control study of 100 participants, comprising 50 cases of type 2 diabetes mellitus and 50 normal healthy individuals. HbA1c levels were measured using HPLC (BIORAD D10) and PTH levels were measured using CLIA (SIEMENS ADVIA CENTAUR XP). Spearman's rho was used to assess the correlation between HbA1c and PTH in cases, and Pearson's correlation analysis was used to assess the correlation between HbA1c and PTH in controls. Results: One hundred participants (50 type 2 diabetes patients and 50 age- and sex-matched controls) were studied. HbA1c and PTH differed significantly between groups, being higher in cases. Correlations between HbA1c and PTH were weak in both groups, with large (HbA1c) and moderate (PTH) effect sizes, and PTH levels were higher in the higher quartiles. Conclusion: Patients with type 2 diabetes showed significantly higher HbA1c and PTH levels than controls. However, only a weak correlation existed between these variables, suggesting multifactorial regulation of PTH beyond glycemia. Elevated PTH may contribute to skeletal complications in diabetes, warranting further large-scale longitudinal studies. Keywords: HbA1c, PTH, Duration of diabetes mellitus.
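As context for the correlation analysis named above, the following is a minimal sketch, run on entirely synthetic values rather than the study data, of how Spearman's rho (for the cases) and Pearson's r (for the controls) are computed:

```python
# Minimal sketch on synthetic values (not the study data): Spearman's rho
# for the diabetic cases and Pearson's r for the controls, as in the abstract.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
hba1c_cases = rng.normal(8.5, 1.5, 50)                      # hypothetical HbA1c (%)
pth_cases = 40 + 2.0 * hba1c_cases + rng.normal(0, 15, 50)  # hypothetical PTH (pg/mL)
hba1c_controls = rng.normal(5.4, 0.3, 50)
pth_controls = 35 + rng.normal(0, 10, 50)

rho, p_rho = spearmanr(hba1c_cases, pth_cases)   # rank-based, robust to skew
r, p_r = pearsonr(hba1c_controls, pth_controls)  # assumes roughly linear, normal data
print(f"Cases:    Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
print(f"Controls: Pearson r    = {r:.2f}, p = {p_r:.3f}")
```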

Page No: 644-648 | Full Text

 

Original Research Article

EVALUATING THE IMPACT OF AN EDUCATIONAL INTERVENTION ON KNOWLEDGE OF THE ‘PROTECTION OF CHILDREN FROM SEXUAL OFFENCES’ (POCSO) ACT AMONG HIGH SCHOOL STUDENTS

http://dx.doi.org/10.70034/ijmedph.2026.2.110

Keerthiga S, N. Banerji, S. Sunitha

View Abstract

Background: Child sexual abuse (CSA) remains a major public health concern globally and in India. According to UNICEF (2024), 1 in 5 girls and 1 in 7 boys experience sexual violence during childhood. The National Crime Records Bureau (2022) reported that only 39.7% of crimes against children were registered under the POCSO Act. Limited awareness of laws, social inequalities, and cultural norms contribute significantly to vulnerability. School-based educational interventions can empower children with knowledge on safety, reporting mechanisms, and legal protection. Materials and Methods: This study was conducted among 92 students of classes 8th–10th in a government school in the UHTC field practice area of Tirupati. Students willing to participate with assent and parental consent were included. A questionnaire validated by legal experts was used to assess pre- and post-intervention knowledge. The educational intervention included two awareness sessions on the POCSO Act. Statistical analysis included paired t-tests and chi-square tests. Results: Participants were predominantly male (70.7%). Pre-test results showed that only 11 students (12%) had adequate knowledge; post-intervention, this increased significantly to 56 students (60.9%). The mean knowledge score also improved significantly, from 7.15 to 10.79 (p < 0.001), reflecting the impact of the educational intervention. Conclusion: The educational intervention significantly improved awareness and understanding of the POCSO Act among high school students. Incorporating legal-awareness programs into school health initiatives is crucial to empower adolescents, enhance reporting, and prevent child sexual abuse. Keywords: POCSO, education intervention, knowledge assessment, child sexual abuse, awareness, high school students, Tirupati.

Page No: 649-654 | Full Text

 

Case Report

MULTIDISCIPLINARY APPROACH TO SALVAGING A PRECIOUS DIABETIC LIMB WITH CONTRALATERAL LIMB CONSIDERATIONS: A CASE REPORT

http://dx.doi.org/10.70034/ijmedph.2026.2.111

Premkumar E, Aravind P

View Abstract

Background: Diabetic foot ulcers are a leading cause of non-traumatic lower limb amputations, particularly in patients with poor glycemic control and peripheral vascular disease. Limb preservation becomes critically important when the contralateral limb is already compromised, as further amputation can result in significant functional disability. Case Presentation: We present the case of a 56-year-old male with long-standing uncontrolled diabetes mellitus who developed an infected ulcer over the right foot. The patient had a prior history of contralateral forefoot amputation, rendering the affected limb functionally critical. Clinical evaluation revealed a deep ulcer with surrounding cellulitis, peripheral neuropathy, and reduced distal perfusion. Laboratory investigations showed poor glycemic control and elevated inflammatory markers. Imaging confirmed soft tissue infection without advanced osteomyelitis. A multidisciplinary approach involving diabetology, surgery, vascular assessment, and wound care was implemented. The patient underwent prompt surgical debridement, culture-directed antibiotic therapy, strict glycemic control, and structured wound care with appropriate offloading strategies. Conclusion: This case highlights the importance of early multidisciplinary intervention in salvaging a high-risk diabetic limb, especially in the presence of contralateral limb compromise. Coordinated care can effectively prevent major amputation and preserve functional independence. Keywords: Diabetic foot ulcer; Limb salvage; Multidisciplinary care; Peripheral arterial disease; Wound management.

Page No: 655-659 | Full Text

 

Original Research Article

PATTERNS OF VACCINATION COVERAGE AT THE MODEL IMMUNIZATION CENTER: INSIGHTS FROM SIMS/SERVICES HOSPITAL LAHORE

http://dx.doi.org/10.70034/ijmedph.2026.2.112

Zohra Khanum, Fatima Tahira

View Abstract

Background: Vaccination is a cornerstone of public health, effectively reducing morbidity and mortality from infectious diseases. In Pakistan, the Expanded Program on Immunization (EPI) targets multiple vaccine-preventable diseases, yet significant disparities persist due to socioeconomic, geographic, and cultural barriers. Urban centers like the Model Immunization Center at SIMS/Services Hospital Lahore provide an opportunity to assess immunization patterns and inform strategies for improving vaccine uptake. Methodology: A retrospective, observational study was conducted at the Model Immunization Center over a four-month period (September 26, 2024, to January 18, 2025). Vaccination records were reviewed and categorized by type (routine, traveler’s, emergency, and COVID-19), shift (morning, evening, night), and gender. Descriptive statistics were used to analyze trends in coverage. Results: Out of 12,200 total vaccinations, traveler’s vaccines accounted for 9,500 (77.8%), followed by routine (1,200; 9.8%), emergency (800; 6.6%), and COVID-19 vaccines (700; 5.7%). The majority were administered during the morning shift (60–80%), indicating time-of-day preference. Males received a higher proportion of all vaccine types (55–70%), particularly in traveler’s and COVID-19 categories. Routine and emergency vaccinations showed moderate uptake, while COVID-19 vaccination coverage remained relatively low despite ongoing public health efforts. Conclusion: The study reveals distinct vaccination patterns at an urban immunization center, highlighting the high demand for traveler’s vaccines and moderate uptake of routine and emergency immunizations. Gender disparities and limited uptake during evening and night shifts suggest the need for more inclusive and flexible vaccination strategies. Keywords: Vaccination, EPI, immunization.

Page No: 660-664 | Full Text

 

Original Research Article

A COMPARATIVE STUDY OF NASOGASTRIC VERSUS NASOJEJUNAL FEEDING IN CRITICALLY ILL PATIENTS ADMITTED TO A TRAUMA CARE UNIT

http://dx.doi.org/10.70034/ijmedph.2026.2.113

Sagar Gawali, Aditya Patil, Renuka Purohit

View Abstract

Background: Enteral nutrition is a cornerstone in the management of critically ill patients. While nasogastric (NG) feeding is commonly used, it is associated with complications such as feed intolerance and aspiration. Nasojejunal (NJ) feeding may offer advantages by bypassing the stomach and improving nutrient delivery. The aim is to compare the efficacy and safety of nasogastric and nasojejunal feeding in critically ill patients admitted to a trauma care unit. Materials and Methods: This prospective observational study was conducted in a trauma care unit over a period of two years. A total of 120 critically ill patients requiring enteral nutrition were included and divided equally into NG (n=60) and NJ (n=60) feeding groups. Parameters assessed included time to achieve target nutrition, gastric residual volume, serum albumin levels, achievement of feeding goals, complications, ICU stay, and mortality. Results: The NJ group achieved target nutrition significantly faster than the NG group (23.6 ± 6.8 vs 35.6 ± 3.4 hours, p<0.05). Gastric residual volume was significantly lower in the NJ group (517.8 ± 107.9 vs 917.4 ± 155.3 mL, p<0.05). Post-feeding serum albumin levels were significantly higher in the NJ group (3.76 ± 1.05 vs 3.12 ± 0.78 g/dL, p<0.05), and a higher proportion achieved feeding goals (88.3% vs 65.0%, p<0.05). No significant differences were observed in ICU stay, mortality, or complication rates between the groups. Conclusion: Nasojejunal feeding improves nutritional delivery and feeding efficiency compared to nasogastric feeding but does not significantly affect clinical outcomes such as mortality or ICU stay. It may be preferred in patients at high risk of feed intolerance. Keywords: Enteral nutrition; Nasogastric feeding; Nasojejunal feeding; Critical care; Trauma ICU.

Page No: 665-669 | Full Text

 

Original Research Article

COMPARATIVE STUDY OF MACHINE LEARNING–BASED PREDICTION MODELS VERSUS CONVENTIONAL CLINICAL RISK ASSESSMENT FOR POSTPARTUM HEMORRHAGE

http://dx.doi.org/10.70034/ijmedph.2026.2.114

Arushi Sharma, Arihant Sharma, Advaita Sharma

View Abstract

Background: Postpartum hemorrhage (PPH) remains one of the leading causes of maternal morbidity and mortality worldwide and continues to pose a major challenge in obstetric practice, particularly in tertiary care settings where high-risk pregnancies are frequently managed. Conventional clinical risk assessment is routinely used to identify women at risk; however, its predictive ability may be limited because of the multifactorial nature of PPH. With the growing use of digital health records and advanced analytics, machine learning–based prediction models have emerged as a promising approach for improving early identification of high-risk cases. Aim: To compare the predictive performance of machine learning–based prediction models with conventional clinical risk assessment for postpartum hemorrhage among women delivering at a tertiary care hospital. Materials and Methods: This hospital-based comparative observational study was conducted in the Department of Obstetrics and Gynecology at a tertiary care hospital. A total of 70 pregnant women admitted for vaginal delivery or cesarean section were included. Data were collected using a structured proforma from patient history, clinical examination, case records, labor room notes, operative records, and laboratory investigations. Demographic, obstetric, clinical, and delivery-related variables were recorded, including age, parity, body mass index, anemia, hypertensive disorders, previous cesarean section, placental abnormalities, induction of labor, prolonged labor, mode of delivery, macrosomia, and uterine atony. All patients were assessed by both conventional clinical risk assessment and machine learning–based prediction methods. Data were entered in Microsoft Excel and analyzed using SPSS version 27.0. Descriptive and comparative statistical analyses were performed, and model performance was compared using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and area under the receiver operating characteristic curve (AUROC). Results: Out of 70 patients, 14 (20.00%) developed postpartum hemorrhage. Significant factors associated with PPH included obesity (50.00% vs 19.64%, p=0.024), previous cesarean section (50.00% vs 21.43%, p=0.036), anemia (57.14% vs 21.43%, p=0.011), hypertensive disorders of pregnancy (35.71% vs 14.29%, p=0.049), placenta previa/abruption (28.57% vs 8.93%, p=0.041), prolonged labor (42.86% vs 16.07%, p=0.029), and uterine atony (64.29% vs 10.71%, p<0.001). The machine learning model outperformed conventional clinical risk assessment with higher sensitivity (85.71% vs 64.29%), specificity (89.29% vs 73.21%), positive predictive value (66.67% vs 37.50%), accuracy (88.57% vs 71.43%), and AUROC (0.91 vs 0.74). Conclusion: Machine learning–based prediction models demonstrated superior performance over conventional clinical risk assessment in predicting postpartum hemorrhage. Their use may enhance early risk stratification, clinical preparedness, and timely intervention in tertiary obstetric care. Keywords: Postpartum hemorrhage; machine learning; clinical risk assessment; prediction model; tertiary care hospital.
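The diagnostic metrics reported above (sensitivity, specificity, PPV, NPV, accuracy, AUROC) all follow from a confusion matrix plus a threshold-free ranking of risk scores. A minimal illustrative sketch, assuming hypothetical model outputs rather than the study's actual predictions:

```python
# Illustrative computation of the reported metric types from hypothetical
# model outputs; the 14/70 PPH prevalence matches the abstract, but the
# scores and the 0.5 threshold are assumptions for demonstration only.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(1)
y_true = np.array([0] * 56 + [1] * 14)           # 14 of 70 women developed PPH
risk = np.clip(0.35 * y_true + rng.random(70) * 0.65, 0.0, 1.0)  # assumed scores
y_pred = (risk >= 0.5).astype(int)               # example decision threshold

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)         # true-positive rate
specificity = tn / (tn + fp)         # true-negative rate
ppv = tp / (tp + fp)                 # positive predictive value
npv = tn / (tn + fn)                 # negative predictive value
accuracy = (tp + tn) / y_true.size
auroc = roc_auc_score(y_true, risk)  # threshold-free discrimination
print(sensitivity, specificity, ppv, npv, accuracy, auroc)
```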

Page No: 670-677 | Full Text

 

Original Research Article

EVALUATION OF PREOPERATIVE DEXMEDETOMIDINE NEBULISATION ON THE PRESSOR RESPONSE TO LARYNGOSCOPY AND INTUBATION

http://dx.doi.org/10.70034/ijmedph.2026.2.115

Mohit Gedekar, Rahul Saxena, Geeta Bhandari, Jahanara, Deep Chilana

View Abstract

Background: Direct laryngoscopy and endotracheal intubation trigger a sympathetic surge, leading to increases in heart rate and blood pressure, which can be risky in patients with cardiovascular or neurological comorbidities. Dexmedetomidine, a selective α2-adrenergic agonist, has been shown to provide sedation, analgesia, and hemodynamic stability. Nebulized administration offers a non-invasive, patient-friendly alternative. Materials and Methods: This prospective, double-blind, randomized study included 88 adult patients (ASA I–II) undergoing elective surgery under general anesthesia. Patients were randomly allocated to receive nebulized dexmedetomidine (1 µg/kg, n=44) or saline (5 mL, n=44) 30 minutes before induction. Heart rate, blood pressure, SpO₂, propofol requirement, sedation scores (Ramsay), and side effects were recorded from pre-nebulization to 10 minutes post-intubation. Results: Baseline demographics were comparable. Dexmedetomidine significantly attenuated the rise in heart rate, systolic, diastolic, and mean arterial pressures during induction and intubation (p < 0.05). Propofol requirement was lower (84.45 ± 8.51 mg vs 101.75 ± 16.7 mg, p < 0.001), and Ramsay Sedation Scores were higher (3.18 ± 0.62 vs 1.93 ± 0.54, p < 0.001) in the dexmedetomidine group. Minor side effects, including bradycardia (6.9%) and hypotension (4.6%), were observed, with no significant differences compared to saline. Oxygen saturation remained stable. Conclusion: Preoperative nebulized dexmedetomidine is a safe, effective, and non-invasive method to attenuate the pressor response to laryngoscopy and intubation, reduce anesthetic requirement, and enhance sedation without compromising safety. Keywords: Dexmedetomidine, Nebulization, Laryngoscopy, Hemodynamic, General Anesthesia.

Page No: 678-682 | Full Text

 

Original Research Article

COMPARATIVE ANALYSIS OF SERUM ELECTROLYTE PROFILE AND SERUM URIC ACID LEVELS IN DIAGNOSED HYPERTENSIVE AND NORMOTENSIVE PATIENTS

http://dx.doi.org/10.70034/ijmedph.2026.2.116

Devaki R.N, Prajna K, Vinay Kumar K, Naresh, Mahesh V

View Abstract

Background: Hypertension is a major global health concern and a leading risk factor for cardiovascular morbidity and mortality. Alterations in serum electrolyte levels and uric acid have been implicated in the pathophysiology of hypertension, influencing vascular tone, fluid balance, and renal function. Understanding these biochemical changes may aid in improving disease management and prevention strategies. The objective is to compare serum electrolyte profile (sodium, potassium, and chloride) and serum uric acid levels between hypertensive and normotensive individuals, and to assess the prevalence of associated biochemical abnormalities. Materials and Methods: This case-control study included 126 participants, comprising 63 hypertensive and 63 normotensive individuals. Baseline demographic and clinical data were recorded. Serum sodium, potassium, chloride, and uric acid levels were measured using standard laboratory methods. Statistical analysis was performed using the unpaired Student’s t-test, with a p-value < 0.05 considered statistically significant. Results: Hypertensive subjects demonstrated significantly higher mean serum sodium (140.75 ± 3.70 mmol/L), chloride (104.05 ± 2.70 mmol/L), and uric acid levels (4.47 ± 1.40 mg/dL) compared to normotensive individuals (135.08 ± 6.11 mmol/L, 99.62 ± 5.09 mmol/L, and 3.71 ± 1.14 mg/dL, respectively; p < 0.05). Serum potassium levels were significantly lower in hypertensive subjects (3.83 ± 0.56 mmol/L) than in normotensive controls (4.06 ± 0.48 mmol/L; p = 0.016). Additionally, the prevalence of hypernatremia, hypokalaemia, hyperchloremia, and hyperuricemia was higher among hypertensive individuals. Conclusion: Hypertension is associated with significant alterations in serum electrolyte levels and elevated uric acid. These findings highlight the importance of routine biochemical monitoring and suggest that correcting electrolyte imbalance and hyperuricemia may improve blood pressure control and reduce complications. Keywords: Hypertension; Electrolytes; Sodium; Potassium; Chloride; Uric acid; Hyperuricemia.

Page No: 683-686 | Full Text

 

Original Research Article

A STUDY OF VARIATIONS IN SUPRAMEATAL SPINE AND OTHER LANDMARKS ON THE LATERAL SURFACE OF TEMPORAL BONE AMONG CSOM PATIENTS

http://dx.doi.org/10.70034/ijmedph.2026.2.117

B.V.N.Muralidhar Reddy, George Pallapati, B. Zaiba kousar

View Abstract

Background: Chronic Suppurative Otitis Media (CSOM) is a common middle ear disease requiring surgical management. Identification of reliable anatomical landmarks, particularly the suprameatal spine (Henle’s spine), is essential for safe mastoidectomy. The objective is to evaluate the variations in the suprameatal spine and its relationship with lateral temporal bone landmarks in patients with CSOM. Materials and Methods: A hospital-based cross-sectional study was conducted on 120 patients with CSOM undergoing mastoidectomy at a tertiary care center. Intraoperative assessment included morphology of the suprameatal spine, mastoid pneumatization, ossicular status, and morphometric measurements of key anatomical landmarks. Data were analyzed using SPSS version 19. Results: The crest type of suprameatal spine was most common (68.3%), followed by triangular type (15%) and absence (16.7%). Absence of the spine was significantly associated with non-pneumatized mastoids (p < 0.05). The mean distance from Henle’s spine to the lateral semicircular canal and sinodural angle was 15.2 mm and 16.8 mm, respectively. Ossicular erosion was observed in 30% of cases, with the incus most commonly affected. Most patients had mild to moderate conductive hearing loss. Conclusion: Although the suprameatal spine is a valuable surgical landmark, its variability—especially in sclerotic mastoids—necessitates the use of multiple anatomical references during mastoidectomy. Understanding these variations can improve surgical safety and outcomes. Keywords: Chronic Suppurative Otitis Media; Suprameatal Spine; Henle’s Spine; Mastoidectomy; Hearing Loss.

Page No: 687-691 | Full Text

 

Original Research Article

ASSOCIATION BETWEEN SCREEN TIME AND BEHAVIOURAL HEALTH PROBLEMS AMONG TEENAGERS ATTENDING A TERTIARY CARE HOSPITAL IN LUCKNOW: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.118

Mehak Singh, Saksham Srivastava, Kamran Ahmad, Dhananjay Kumar

View Abstract

Background: The increasing penetration of digital media has significantly altered the lifestyle of teenagers, raising concerns regarding its potential impact on behavioural health. This study was conducted to estimate screen time and assess its association with behavioural health problems among teenagers aged 11–16 years. Materials and Methods: A hospital-based cross-sectional study was carried out from January to December 2024 in the pediatric outpatient department of a tertiary care hospital in Lucknow. A total of 159 teenagers were enrolled using convenience sampling. Screen time was assessed using a structured questionnaire and behavioural health was evaluated using the parent-reported Strengths and Difficulties Questionnaire (SDQ). Statistical analysis was performed using SPSS version 29, applying chi-square and independent t-tests, with a p-value <0.05 considered statistically significant. Results: The mean age of participants was 13.3 ± 1.7 years, with males comprising 52.2%. Overall, 68.6% of teenagers reported screen time exceeding 2 hours per day. Higher screen time was significantly associated with increasing age and personal smartphone ownership (p<0.05). Teenagers with screen time greater than 2 hours per day had significantly higher SDQ total difficulty scores (16.5 ± 5.0 vs 11.2 ± 4.1; p<0.001). Emotional problems, conduct problems, hyperactivity/inattention, and peer problems were significantly more prevalent in the higher screen time group. Conclusion: Excessive screen time is significantly associated with behavioural health problems among teenagers, highlighting the need for parental regulation and early intervention strategies. Keywords: Screen time, teenagers, behavioural health, SDQ, smartphone use.

Page No: 692-695 | Full Text

 

Original Research Article

A CLINICAL STUDY OF ACUTE PANCREATITIS AND EVALUATION OF RANSON'S SCORE IN DIAGNOSIS, COMPLICATIONS, AND MANAGEMENT

http://dx.doi.org/10.70034/ijmedph.2026.2.119

Sourabh Prajapat, Dipanshi Shah, Shashank Jain, Sandesh Deolekar, Madhav Lakhotia, Akshata Thakur

View Abstract

Background: Acute pancreatitis is a common gastrointestinal emergency with a wide spectrum of severity, ranging from mild self-limiting disease to severe life-threatening illness. Early prediction of disease severity is crucial for optimizing management and improving outcomes. Ranson’s scoring system is one of the earliest and most widely used tools for prognostication; however, its applicability in different populations remains debatable. The aim was to study the clinical profile, etiology, and natural history of acute pancreatitis and to evaluate the utility of Ranson’s score in predicting disease severity, complications, and management outcomes. Materials and Methods: This prospective observational study was conducted in the Department of General Surgery at Dr. D. Y. Patil Hospital, Navi Mumbai, over a period of two years (March 2022–March 2024). A total of 50 patients diagnosed with acute pancreatitis were included. Clinical evaluation, laboratory investigations, and imaging studies were performed. Ranson’s score was calculated at admission and after 48 hours. Patients were managed according to standard protocols and followed for outcomes including complications, need for intervention, and mortality. Statistical analysis was performed using SPSS version 25.0, and associations were assessed using appropriate tests. Results: The majority of patients were above 50 years (64%) with a male predominance (58%). Alcohol (46%) and gallstones (44%) were the leading etiological factors. Abdominal pain (96%) was the most common presenting symptom. Based on Ranson’s score, 62% of patients were classified as having severe acute pancreatitis. Most patients (88%) were managed conservatively, and 82% did not develop complications. Among complications, pancreatic pseudocyst (8%) was the most common. No statistically significant association was observed between Ranson’s score and etiology (p = 0.66), management (p = 0.51), or complications (p = 0.14). Conclusion: Ranson’s score, although useful in assessing severity, demonstrated limited predictive value for complications and management outcomes in this study. Conservative management was effective in the majority of cases, including those classified as severe. These findings suggest that Ranson’s score should be used in conjunction with other clinical parameters and scoring systems for better prognostication in acute pancreatitis. Keywords: Acute pancreatitis, Ranson’s score, severity assessment, complications, conservative management.

Page No: 696-702 | Full Text

 

Original Research Article

UTILITY OF MID-UPPER ARM CIRCUMFERENCE AS A SCREENING TOOL FOR PREDICTING LOW BIRTH WEIGHT IN NEONATES

http://dx.doi.org/10.70034/ijmedph.2026.2.120

Nahid Akhtar, Saksham Srivastava, Singh Bhawana Komal, Kamran Ahmad

View Abstract

Background: Low birth weight (LBW) is associated with a high risk of infections, breathing difficulty, hypothermia, and feeding problems. Therefore, LBW should be detected early to allow newborns to receive appropriate care soon after delivery. However, recording an accurate birth weight may not always be possible due to inadequate equipment in resource-poor settings, especially in rural areas. This study was conducted to determine the efficacy of mid-upper arm circumference (MUAC) as a screening tool to identify low birth weight babies where no weighing scales are available. The aim is to study the correlation between mid-upper arm circumference and birth weight in neonates in a tertiary care hospital in Lucknow. Materials and Methods: A cross-sectional observational study was conducted at a tertiary care center. A total of 100 newborns were included in this study. MUAC and birth weight were measured. Descriptive statistics including mean, standard deviation, and range were calculated for MUAC and birth weight. The correlation between MUAC and birth weight was assessed. Results: A total of 100 neonates were assessed within 24 hours of life. Correlation analysis demonstrated that birth weight showed a significant positive association with mid-upper arm circumference (p<0.001), indicating that lower birth weight is closely linked with lower mid-upper arm circumference. Conclusion: Neonates with low MUAC values consistently fell into lower birth weight categories, whereas high MUAC values were strongly associated with higher birth weights. MUAC showed a very strong positive correlation with birth weight. Keywords: Mid upper arm circumference (MUAC), Low birth weight (LBW), Newborns, Screening tool.

Page No: 703-706 | Full Text

 

Original Research Article

ASSESSMENT OF STABILITY AND HEALING IN MANDIBULAR CONDYLAR FRACTURES MANAGED WITH LOCKING PLATE SYSTEMS VERSUS CONVENTIONAL MINIPLATES WITH EVALUATION OF BONE HEALING PATTERNS AND PATHOPHYSIOLOGICAL CHANGES

http://dx.doi.org/10.70034/ijmedph.2026.2.121

Noor ul Wahab, Irfan Qureshi, Muhtada Ahmad, Riaz Gul, Aamir Sharif, Fizza Tariq

View Abstract

Background: Mandibular condylar fractures are among the most common fractures of the maxillofacial region and pose significant challenges in achieving optimal functional and anatomical outcomes. Objective: To assess and compare the stability and healing outcomes of mandibular condylar fractures managed with locking plate systems versus conventional miniplates, focusing on bone healing patterns and associated biological responses. Materials and Methods: This comparative observational study included patients with unilateral or bilateral mandibular condylar fractures treated surgically using either locking plate systems or conventional miniplates. Postoperative evaluation was conducted during a defined follow-up period to assess fracture stability, radiographic evidence of bone healing, occlusal alignment, and functional recovery. Healing patterns were analyzed using radiographic imaging, while clinical assessment included pain, mouth opening, and complications such as infection or malocclusion. Results: Patients treated with locking plate systems demonstrated improved initial stability and more uniform bone healing patterns compared to those treated with conventional miniplates. Radiographic evidence indicated faster callus organization and reduced micromovement at the fracture site in the locking plate group. Clinically, better functional outcomes, including improved mouth opening and reduced postoperative discomfort, were observed. Complication rates were comparatively lower in the locking plate group. Conclusion: Locking plate systems provide superior biomechanical stability and promote more favorable bone healing patterns in mandibular condylar fractures compared to conventional miniplates. Their use is associated with improved functional outcomes and reduced postoperative complications, making them a preferable option in appropriately selected cases. Keywords: Mandibular condylar fracture, locking plate system, miniplates, bone healing, fracture stability, maxillofacial trauma.

Page No: 707-711 | Full Text

 

Original Research Article

CLINICAL AND WOUND RELATED FACTORS AFFECTING SPLIT THICKNESS SKIN GRAFT UPTAKE IN SOFT TISSUE DEFECTS

http://dx.doi.org/10.70034/ijmedph.2026.2.122

Rupal Nanda, Divyashree Singh, Ravi Bilunia

View Abstract

Background: Split thickness skin grafting is commonly used for coverage of ulcers and raw areas where primary closure is not possible. Graft uptake depends on local wound condition, infection status and patient-related factors. Identification of predictors of poor graft uptake may help in better preoperative preparation. The aim is to assess the clinical, laboratory and wound-related factors affecting split thickness skin graft uptake in patients with soft tissue defects. Materials and Methods: This prospective observational study included 166 patients undergoing split thickness skin grafting for ulcers and raw areas. Demographic profile, wound etiology, wound size, anatomical site, comorbidities, laboratory parameters and preoperative wound culture status were recorded. Graft uptake was assessed clinically on postoperative day 7. Good graft uptake was defined as ≥80% uptake and poor graft uptake as <80% uptake. Data were analysed using the chi-square test and multivariable logistic regression. Results: Good graft uptake was seen in 125 patients (75.3%), while poor graft uptake was seen in 41 patients (24.7%). Good uptake was seen in 90 patients (81.8%) with wounds ≤10 × 10 cm compared with 12 patients (50.0%) with wounds >20 × 10 cm. Diabetic foot ulcers had lower good uptake, seen in 14 patients (46.7%). Culture-positive wounds showed poor uptake in 29 patients (47.5%), while sterile wounds showed poor uptake in only 12 patients (11.4%). On multivariable analysis, culture-positive wound, wound size >20 × 10 cm, serum albumin <3.5 g/dL and diabetes mellitus were independent predictors of poor graft uptake. Conclusion: Poor split thickness skin graft uptake was mainly associated with culture-positive wound, large wound size, hypoalbuminemia and diabetes mellitus. Preoperative correction of infection, nutritional status and glycaemic control may improve graft outcome. Keywords: Split thickness skin graft, graft uptake, wound culture, diabetic foot ulcer, serum albumin, wound size, soft tissue defect.
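To make the analysis plan concrete, the following is a minimal sketch of a multivariable logistic regression for poor graft uptake of the kind the abstract describes; the column names and synthetic data are assumptions for illustration, not the study's dataset.

```python
# Hedged sketch: mirrors the abstract's analysis plan (multivariable logistic
# regression for poor graft uptake). Column names and data are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 166
df = pd.DataFrame({
    "culture_positive": rng.integers(0, 2, n),
    "large_wound": rng.integers(0, 2, n),   # wound > 20 x 10 cm
    "albumin_low": rng.integers(0, 2, n),   # serum albumin < 3.5 g/dL
    "diabetes": rng.integers(0, 2, n),
})
# Synthetic outcome loosely tied to the predictors (illustration only).
logit = -2 + 1.5*df.culture_positive + 1.2*df.large_wound + 0.9*df.albumin_low + 0.8*df.diabetes
df["poor_uptake"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["culture_positive", "large_wound", "albumin_low", "diabetes"]])
model = sm.Logit(df["poor_uptake"].astype(int), X).fit(disp=0)
# Exponentiated coefficients are adjusted odds ratios with 95% CIs.
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```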

Page No: 712-717 | Full Text

 

Case Series

PERIPHERAL OSSIFYING FIBROMA: A CASE SERIES

http://dx.doi.org/10.70034/ijmedph.2026.2.123

Shweta Goyal

View Abstract

Background: The objective is to evaluate the clinical presentation and histopathological characteristics of Peripheral Ossifying Fibroma (POF) through a case series, emphasizing its clinical variability and diagnostic features. Materials and Methods: This descriptive case series was conducted in the Department of Pathology at Shri Jagannath Pahadia Medical College, Bharatpur, Rajasthan, from 2023 to 2025. A total of six patients clinically diagnosed with gingival overgrowth and confirmed histopathologically as peripheral ossifying fibroma were included. Clinical records were reviewed for demographic details, lesion characteristics, dimensions and associated symptoms. All lesions were surgically excised and sent for histopathological confirmation. Results: Of the six cases, five (83.3%) were females and one (16.7%) was male, showing a strong female predominance. Age ranged from 11 to 79 years. Lesions were predominantly located on the gingiva, especially in the anterior and interdental regions. Duration varied from 2–3 months to 5–6 years. Most lesions were firm, with one case showing bony-hard consistency. Both pedunculated and sessile forms were observed. Conclusion: Peripheral ossifying fibroma is a reactive gingival lesion with diverse clinical presentations but consistent demographic and histopathological features. Early diagnosis and complete surgical excision with elimination of local irritants are essential to prevent recurrence. Keywords: Peripheral ossifying fibroma, gingival lesion, reactive lesion, oral pathology, case series.

Page No: 718-722 | Full Text

 

Original Research Article

DIAGNOSTIC UTILITY OF FROZEN SECTION IN OPERATIVE SURGICAL SPECIMENS

http://dx.doi.org/10.70034/ijmedph.2026.2.124

Deepa Janghel, Shende Pranali Kisandas

View Abstract

Background: Intraoperative consultation by the frozen section technique is an invaluable tool for immediate diagnosis. Its accuracy and limitations vary with different anatomical sites. Correlation of the intraoperative frozen section diagnosis with the final diagnosis on routine histopathological sections is an integral part of quality assurance in surgical pathology. Materials and Methods: 106 tissue specimens from 64 cases were studied over a period of one year. The diagnostic accuracy of frozen section, and its morphological quality and reliability in comparison with histopathology, were evaluated for overall morphology. Turnaround time, limitations in section preparation and the problems encountered were assessed. Results: The diagnostic accuracy of frozen section was 98.43%. Intraoperative consultation using frozen section added an addendum in 29.68% of cases. Conclusion: Intraoperative frozen section diagnosis is very useful and highly accurate, and provides the rapid, reliable, cost-effective information necessary for optimum patient care. Keywords: Frozen section, Oncopathology, Surgical pathology.

Page No: 723-726 | Full Text

 

Original Research Article

STUDY TO COMPARE THE EFFICACY OF BREASTFEEDING AND ORAL SUCROSE ON THE PAIN EXPERIENCE OF NEONATES AND INFANTS DURING VENIPUNCTURE USING NEONATAL INFANT PAIN SCALE

http://dx.doi.org/10.70034/ijmedph.2026.2.125

Sandupatla Kiran, M. Suma Priya, Tumukunta Vaishnavi

View Abstract

Background: Aim of the Study: To compare the efficacy of breastfeeding and oral sucrose on the pain experience of neonates and infants during venipuncture using the neonatal infant pain scale. Objectives: 1. To determine and compare the degree of pain in subjects receiving breastfeed and subjects receiving oral sucrose using the neonatal infant pain scale. 2. To determine the association between demographic variables and pain perception. Materials and Methods: This was a randomized interventional study carried out at the Department of Paediatrics, Malla Reddy Narayana Multi-Specialty Hospital, Suraram, Telangana, over a period of 18 months (September 2022 to February 2024). Results: This randomized interventional study compared the efficacy of breastfeeding and oral sucrose on the pain experience of neonates and infants during venipuncture using the neonatal infant pain scale (NIPS). 85 subjects each were allocated randomly to two groups. Group 1 received breastfeeding 10 minutes prior to the procedure, and group 2 received 2 ml of 24% oral sucrose two minutes prior to the procedure. The two groups were compared for pain experience during venipuncture using NIPS. According to gestational age, the breastfeeding group had 35 pre-term and 50 term babies, and the sucrose group had 42 pre-term and 43 term babies. In the breastfeeding group, 33 were LBW and 52 had normal birth weight, with a mean of 2.79 ± 0.6 kg; in the sucrose group, 31 were LBW and 54 were normal, with a mean of 2.74 ± 0.6 kg. Mean NIPS in breastfed subjects was 1.12 ± 0.8, and in the sucrose group 1.12 ± 0.8; the overall mean NIPS was 1.33 ± 0.6. A significant association was seen between the groups, suggesting sucrose-fed babies had higher pain. In the breast-fed group, males had a mean NIPS score of 1 ± 0.8 and females 1.1 ± 0.8, a statistically insignificant difference (p > 0.05). Similarly, in the sucrose-fed group, males had a mean NIPS score of 1.5 ± 0.9 and females 1.5 ± 0.8, also statistically insignificant. When the two groups were compared, a statistically significant (p < 0.05) difference was noted, suggesting the sucrose-fed group had higher NIPS with respect to gestation, and a statistically significant (p = 0.04) difference was noted with respect to normal and low birth weight. Conclusion: According to the findings of the present study, the lowest pain score and crying time were in breastfed neonates. Considering that breastfeeding is a natural, useful and free intervention that needs no special facility, this method is suggested for pain management and control during painful procedures in infants. Keywords: Pain score, NIPS, Oral sucrose, Breast feeding, Venipuncture.

Page No: 727-733 | Full Text

 

Original Research Article

ASSESSMENT OF ADVERSE DRUG REACTIONS OF CISPLATIN, CARBOPLATIN AND OXALIPLATIN: A PROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.126

G. Arathi, Jayashree Ganesan

View Abstract

Background: Platinum-based chemotherapy agents—cisplatin, carboplatin, and oxaliplatin—are widely used in the treatment of solid malignancies but are associated with varying toxicity profiles that may impact treatment outcomes. Materials and Methods: This prospective observational study included 165 adult patients with histologically confirmed cancers treated at a tertiary care hospital in Chennai over one year. Patients received standard platinum-based regimens and were monitored each cycle. Toxicities were assessed using CTCAE criteria, neurotoxicity with EORTC QLQ-CIPN-20, and performance status by WHO scale. Statistical analysis was performed using ANOVA and chi-square tests. Results: Carboplatin showed higher hematological toxicity, including severe anemia, leucopenia, and thrombocytopenia. Cisplatin was associated with increased nephrotoxicity and severe nausea and vomiting. Oxaliplatin demonstrated lower hematological and renal toxicity but higher incidence of peripheral neuropathy and diarrhoea. Differences were statistically significant (p < 0.05). Conclusion: Distinct toxicity profiles exist among platinum agents. Individualized selection can help minimize adverse effects and improve patient outcomes. Keywords: Toxicity profile, Chemotherapy, Nephrotoxicity, Neurotoxicity, Hematological toxicity.

Page No: 734-739 | Full Text

 

Original Research Article

SCORE BASED SEVERITY ASSESSMENT AND RISK STRATIFICATION OF ACUTE PANCREATITIS IN A TERTIARY CARE CENTRE: AN OBSERVATIONAL, COHORT STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.127

Ritvik D. Jaykar, Santoshkumar N. Deshmukh, Aakanksha A. Giri

View Abstract

Background: Acute pancreatitis outcomes are scored on the basis of the bedside index of severity in acute pancreatitis (BISAP), Ranson criteria, acute physiology and chronic health evaluation (APACHE-II) and modified CT severity index (CTSI) score. The utility of these parameters is limited by their result-related variability. We studied these scores along with the clinical characteristics of patients admitted to our tertiary care centre with acute pancreatitis. Materials and Methods: Patients with acute pancreatitis were included based on clinical features, pancreatic enzyme levels, and radiology. Data about clinical features, investigations and outcomes (mortality and complications) were obtained. Ranson criteria, BISAP, APACHE-II and modified CTSI scores were also assessed where feasible. Results: Of the 100 patients included, 52% were male, with a median age of 51 years. Nearly one-fifth (21%) of the patients died in hospital, of whom 71.43% were female. Common complications were shock (44%), pleural effusion (43%) and acute respiratory distress syndrome (ARDS) (37%). Regression analysis of the Ranson criteria at admission showed higher odds of complications in tertiles 1 and 2 (crude OR: 1.62, 3.6), even after adjustment for covariates (adjusted OR: 1.63, 4.81). Ranson criteria at 48 hours could predict mortality risk in tertiles 2 and 3 (unadjusted OR: 4.17, 3.2; adjusted OR: 5.98, 1.25). Conclusion: The modified Ranson criteria proved useful in predicting complications based on the admission score and in-hospital mortality based on the cumulative 48-hour score. Other scores, namely modified CTSI, BISAP and APACHE-II, also showed consistent associations with poor outcomes. Keywords: Pancreatitis, complications, mortality, organ dysfunction scores, outcome assessment.
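For readers unfamiliar with the scores being compared, the sketch below encodes the widely used BISAP components (one point each for BUN > 25 mg/dL, impaired mental status, SIRS, age > 60 years, and pleural effusion); the example patient values are hypothetical, and the ≥3 cut-off is the commonly cited high-risk threshold rather than a value taken from this study.

```python
# Minimal BISAP calculator: one point per standard criterion.
from dataclasses import dataclass

@dataclass
class BisapInputs:
    bun_mg_dl: float               # blood urea nitrogen
    impaired_mental_status: bool   # e.g., GCS < 15
    sirs_present: bool             # >= 2 SIRS criteria
    age_years: int
    pleural_effusion: bool

def bisap_score(p: BisapInputs) -> int:
    # Each True/met criterion contributes one point (score range 0-5).
    return sum([
        p.bun_mg_dl > 25,
        p.impaired_mental_status,
        p.sirs_present,
        p.age_years > 60,
        p.pleural_effusion,
    ])

# Hypothetical example patient.
patient = BisapInputs(bun_mg_dl=32, impaired_mental_status=False,
                      sirs_present=True, age_years=66, pleural_effusion=True)
score = bisap_score(patient)
print(score, "high risk" if score >= 3 else "lower risk")
```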

Page No: 740-744 | Full Text

 

Original Research Article

HISTOPATHOLOGICAL VERSUS CYTOLOGICAL EVALUATION OF BONE MARROW: A STUDY ON THE CONCORDANCE BETWEEN ASPIRATION AND TREPHINE BIOPSY

http://dx.doi.org/10.70034/ijmedph.2026.2.128

Trupti Gorte, Keshao Hiwale, Sunita Vagha, Amrita Singh Chauhan

View Abstract

Background: Bone marrow evaluation (BME) is crucial in the diagnosis of haematological and certain non-haematological conditions. BME not only confirms clinically suspected diagnoses but can also reveal previously unsuspected ones. The cytological preparation of bone marrow, obtained by aspiration, reveals cellular morphology and allows ancillary testing (flow cytometry, molecular genetics), while the histological sample, usually obtained with a Jamshidi needle, allows optimal evaluation of cellularity, fibrosis or infiltrative disease. Bone marrow aspirate and trephine biopsy specimens are nowadays considered complementary, and when both are obtained they provide a thorough examination of the bone marrow. Materials and Methods: During the 1-year study period, 111 BMEs, comprising both bone marrow aspiration (BMA) and bone marrow biopsy (BMB), were performed for various indications. Results: The mean age was 40.6 years, and the male-to-female ratio was 1.3:1. In this study of 111 patients, the majority were in the 21–40 years age group (42 cases, 37.8%), followed by the 41–60 years group (28 cases, 25.2%) and the 2–20 years group (24 cases, 21.6%); the least representation was from the 61–80 years age group (17 cases, 15.3%). A slight male predominance was observed overall, with 62 males (55.9%) and 49 females (44.1%). Assessment of bone marrow cellularity revealed that normocellular marrow was most common (42%), followed by hypercellular (34%), hypocellular (19%) and diluted samples (16%). Conclusion: This study reaffirms that while aspiration remains a valuable diagnostic tool, trephine biopsy is indispensable for comprehensive assessment, especially in complex, inconclusive, or architecturally dependent marrow pathologies. For optimal diagnostic accuracy and patient care, the combined use of both BMA and BMB should remain standard practice. Keywords: Bone Marrow Aspiration, Bone Marrow Biopsy, Leukaemia.
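To illustrate how aspiration–biopsy concordance of the kind the title describes can be quantified, here is a minimal sketch using raw agreement and Cohen's kappa; the paired diagnosis labels are hypothetical, as the abstract reports distributions rather than paired raw calls.

```python
# Hedged concordance sketch: paired aspiration vs trephine diagnoses.
# The label lists below are hypothetical stand-ins, not study data.
from sklearn.metrics import cohen_kappa_score

aspirate = ["normal", "leukaemia", "hypoplastic", "leukaemia", "normal", "infiltrate"]
trephine = ["normal", "leukaemia", "hypoplastic", "infiltrate", "normal", "infiltrate"]

# Raw percentage agreement between the two modalities.
agreement = sum(a == t for a, t in zip(aspirate, trephine)) / len(aspirate)
# Cohen's kappa corrects that agreement for chance.
kappa = cohen_kappa_score(aspirate, trephine)
print(f"raw agreement = {agreement:.2f}, Cohen's kappa = {kappa:.2f}")
```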

Page No: 745-749 | Full Text

 

Original Research Article

CLINICAL PROFILE OF ACUTE PANCREATITIS AT TERTIARY CARE HOSPITAL IN NORTH KASHMIR

http://dx.doi.org/10.70034/ijmedph.2026.2.129

Arvind Kumar Gautam, Shair Ahmad Dar, Zaffar Iqbal Kawoosa, Mir Mushtaq, Ifrah Reshi, Sajad, Nowsheen Nazir Parray, Aisha ahmad Dar

View Abstract

Background: Acute pancreatitis is a common and potentially life-threatening inflammatory condition of the pancreas with a wide spectrum of clinical presentations, ranging from mild, self-limiting disease to severe systemic illness. The etiology and clinical profile vary across geographic regions. Aims & Objectives: This study aimed to evaluate the clinical characteristics, etiological factors, and outcomes of acute pancreatitis in patients presenting to a tertiary care hospital in North Kashmir. Materials and Methods: This was a hospital-based observational study conducted at a tertiary care center in North Kashmir. Patients diagnosed with acute pancreatitis based on standard diagnostic criteria (clinical features, elevated pancreatic enzymes, and radiological findings) were included. Demographic details, etiological factors, clinical presentation, laboratory parameters, imaging findings, severity assessment, complications, and outcomes were recorded and analyzed using appropriate statistical methods. Results: A total of 200 patients were included in the study. The majority were females (68%), and the most common age group was 31–40 years (35%). The most common etiological factor was a biliary cause (58.5%), followed by idiopathic (24.5%), hypertriglyceridemia (4%), drugs (7%), hyperparathyroidism (2%), post-ERCP (2%), anatomical causes (1%), trauma (1%) and viral causes (1%). The predominant presenting symptom was abdominal pain (100%), often associated with vomiting and abdominal tenderness. Based on severity classification, 47.5% had mild, 32.5% had moderately severe, and 20% had severe pancreatitis. The most prevalent complication was hypocalcemia (27%), followed by ascites and kidney injury (9% each), pleural effusion (8%), and organ failure (4%). The overall mortality rate was 6%, with higher mortality associated with severe disease and organ dysfunction. Conclusion: Acute pancreatitis in this region shows distinct etiological and clinical patterns, with gallstones/alcohol being the leading causes. Early diagnosis, severity stratification, and timely management are crucial in reducing morbidity and mortality. Regional data such as these are essential for improving clinical outcomes and guiding preventive strategies. Keywords: Acute pancreatitis; Clinical profile; Etiology; Severity; Complications; North Kashmir; Tertiary care; Outcome.

Page No: 750-756 | Full Text

 

Original Research Article

PREDICTORS OF OUTCOME IN ACUTE SUBDURAL HEMATOMA IN A RESOURCE-LIMITED SETTING: A RETROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.130

Arvind Kumar Bodda, KVV Satyanarayana, K Chandrashekhar Reddy

View Abstract

Background: Acute subdural hematoma is one of the critical brain injuries and carries a very high mortality rate. Its outcome depends on a variety of factors, such as age, comorbidities, the medications the patient is on, mechanism of injury, time of presentation, and time to surgical evacuation or initiation of conservative measures. Although mortality has come down with current timely interventions, there is still a long way to go. The primary objective is to study the factors responsible for possible neurological deterioration in a case of acute subdural hematoma. The secondary objective is to study the correctness of this prediction during the course of treatment in the hospital. Materials and Methods: A prospective study was conducted at a tertiary care hospital, enrolling all patients with acute subdural hematoma. Detailed evaluations, including history and clinical and radiological examinations, were performed to determine a possible SDH risk score and to categorize patients as definitely requiring surgery, borderline, or low risk. Borderline patients were divided into surgical and non-surgical groups, and the non-surgical group was monitored for neurological deterioration. Results: 80 patients with acute subdural hematoma meeting the inclusion and exclusion criteria were enrolled. Patients with a high-risk score underwent surgical evacuation. Patients with a borderline score were given the option of continuing conservative management or undergoing surgical evacuation, after the risks of conservative management and of surgery were fully explained. Outcomes in both the surgical and conservative borderline groups were closely followed for a period of 14 days. Five patients fell into the borderline category. 38 patients underwent surgery, and 42 were treated conservatively. Five patients on conservative management in the borderline category showed neurological deterioration and eventually underwent surgery. Conclusion: Acute subdural hematoma is a leading cause of mortality in traumatic brain injury. The SDH risk score is a good tool for suggesting the best line of management, and with timely intervention borderline cases can also be saved by surgical evacuation. However, a larger sample size is needed to extend the findings to all traumatic acute subdural hematoma patients. Keywords: Brain injury, trauma, Acute subdural hematoma.

Page No: 757-759 | Full Text

 

Original Research Article

A CROSS SECTIONAL STUDY ON BREASTFEEDING PRACTICES AMONG POSTNATAL WOMEN IN FIELD PRACTICE AREA OF TERTIARY HEALTH CARE CENTRE, HYDERABAD, TELANGANA

http://dx.doi.org/10.70034/ijmedph.2026.2.131

Bhavana Laxmi Surity, Singirikonda Srikanth, Aruna Ch, Arundhathi Baki

View Abstract

Background: Breastfeeding is nature's way of nurturing the child, creating a strong bond between mother and child. Breast milk is the first, best, ideal natural food, more easily digested and absorbed by the infant than formula milk. Under normal conditions, Indian mothers secrete 450–600 ml of milk daily, with 1.1 g of protein and 70 kcal of energy per 100 ml of human milk [1]. Breast milk is regarded as the 'gold' standard for good infant nutrition at birth. The aim is to study breastfeeding practices among postnatal mothers in the urban and rural field practice areas of a tertiary health care centre, Hyderabad, Telangana. The objectives are to study the prevalence of exclusive breastfeeding practice among the study subjects, to study the sociodemographic factors influencing breastfeeding practices among the study subjects, and to compare the breastfeeding practices of study subjects in the urban and rural field practice areas. Materials and Methods: A community-based cross-sectional study was conducted among N=520 postnatal mothers in the urban and rural field practice areas of the Department of Community Medicine, Osmania Medical College. Postnatal mothers of infants were taken into the study after obtaining consent, with the study population selected by simple random sampling. A pretested, semi-structured questionnaire was used to collect data on breastfeeding practices by interviewing mothers of infants. Inclusion criteria: postnatal mothers of infants aged 0–12 months of either sex who gave consent for the study. Exclusion criteria: postnatal mothers of infants with congenital birth defects; infants in whom breastfeeding is contraindicated (e.g., galactosemia, maternal psychosis); and postnatal mothers who did not give informed consent. Results: In the present study, the largest proportion of infants, 45.5% (237), belonged to the 6–9 months age group. The prevalence of early initiation of breastfeeding within 1 hour of childbirth was 45% (117) in the urban area, 43.5% (113) in the rural area and 44.2% (230) in the total study population. The prevalence of exclusive breastfeeding among infants aged 0–12 months was 71.9% (187 of 260) in urban areas and 74.23% (193 of 260) in rural areas. Conclusion: Antenatal care (ANC) counselling during pregnancy to encourage proper breastfeeding practices should be strengthened and provided as a continuum of care by appropriately trained health-care professionals and community-based lay and peer breastfeeding counsellors. Counselling should anticipate and address important challenges and contexts for breastfeeding, in addition to establishing skills, competencies and confidence among mothers. Keywords: Exclusive Breast Feeding, Antenatal Mother, Infant.

Page No: 760-766 | Full Text

 

Original Research Article

KNOWLEDGE, ATTITUDE AND PERCEPTION REGARDING SCHIZOPHRENIA AMONG UNDERGRADUATE STUDENTS: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.16.2.132

Abhishek Dhawan, Yashaswi Ninawe, Adchitre SA, Akshita Sharma, Vishal Pahune, Andrea Almeida

View Abstract

Background: Schizophrenia is a chronic psychiatric disorder associated with significant stigma, social exclusion, and poor health-seeking behavior. Understanding awareness and attitudes among young adults is essential for early intervention and stigma reduction. Aim: To assess knowledge, attitude, and perception regarding schizophrenia among undergraduates. Objectives: 1. To assess the level of knowledge regarding schizophrenia among undergraduate students, including its nature, causes, clinical features, and course of illness. 2. To evaluate attitudes of undergraduate students toward individuals with schizophrenia, particularly in social, familial, and occupational contexts. 3. To explore perceptions and beliefs related to stigma, chronicity, treatability, and functional impact of schizophrenia among undergraduate students. 4. To determine the association between socio-demographic factors (such as gender and age) and knowledge, attitude, and perception regarding schizophrenia. Materials and Methods: A cross-sectional study was conducted among 112 undergraduate students using a pre-validated, self-administered questionnaire. Data were analysed using proportions and percentages. The chi-square test was applied to assess gender-wise differences. A p-value <0.05 was considered statistically significant. Results: The majority (96.4%) identified schizophrenia as a psychiatric illness. Females demonstrated a more positive attitude towards employment and social inclusion. Misconceptions regarding dangerousness and heredity persisted. The gender difference was statistically significant for employment-related attitude (χ² = 4.12, p = 0.04). Conclusion: Despite adequate baseline knowledge, stigma and uncertainty persist. Structured mental health education at the undergraduate level is strongly recommended. Keywords: Schizophrenia, Knowledge, Attitude, Undergraduate students, Mental health literacy.
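As a hedged illustration of the gender-wise comparison the abstract reports (χ² = 4.12, p = 0.04 for employment-related attitude), the sketch below runs a chi-square test on a 2×2 table; the cell counts are hypothetical, since the abstract publishes only the summary statistic.

```python
# Illustrative chi-square test of independence on a 2x2 gender-by-attitude
# table; the counts are hypothetical placeholders, not the study's data.
from scipy.stats import chi2_contingency

#                 positive attitude   negative/unsure
table = [[38, 18],    # female respondents (hypothetical counts)
         [28, 28]]    # male respondents   (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```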

Page No: 767-775 | Full Text

 

Original Research Article

EFFECT OF HEALTH EDUCATION INTERVENTION ON AVERAGE DAILY STEP COUNT AMONG NURSING OFFICERS IN A GOVERNMENT MEDICAL COLLEGE & HOSPITAL OF HARYANA

http://dx.doi.org/10.70034/ijmedph.2026.2.133

Bharti, Sumit Chawla, Bharti, Manju Rani, Muskan Saini, Leena Yadav

View Abstract

Background: Physical inactivity is a major modifiable risk factor for non-communicable diseases such as cardiovascular disease, diabetes mellitus, obesity, and certain cancers. Health education interventions have been recognized as an effective strategy to promote physical activity by increasing knowledge, motivation, and self-efficacy. Interventions focusing on simple, achievable goals, such as increasing daily step count, are particularly suitable for busy healthcare workers. Materials and Methods: This was a quasi-experimental, before–after interventional study conducted over a period of four months, from August 2024 to November 2024. A total of 112 nurses were included in the study. Appropriate statistical tests were used for the calculation of the various parameters. Results & Conclusion: The telephonic health education intervention resulted in a statistically significant increase in average daily step count among nursing officers, from 3030.00 ± 1424.51 steps before the intervention to 3360 ± 1393.40 steps after it. Given the sedentary baseline activity levels observed, structured behavioral reinforcement through periodic telephonic contact may serve as a feasible and cost-effective workplace strategy to promote physical activity among healthcare professionals. Keywords: Health education, Physical inactivity, Steps.
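As a hedged sketch of how a before–after step-count comparison like this one can be tested, the code below runs a paired t-test on synthetic per-nurse data matching the reported means; the individual values are stand-ins, and the paired t-test itself is an assumption, since the abstract does not name the specific test used.

```python
# Synthetic before-after comparison (means chosen to echo the reported
# 3030.00 +/- 1424.51 pre vs 3360 +/- 1393.40 post); illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 112
pre = rng.normal(3030, 1424, n)        # hypothetical baseline daily steps
post = pre + rng.normal(330, 400, n)   # hypothetical post-intervention steps

# A paired t-test is a usual choice for before-after data on the same subjects.
t, p = stats.ttest_rel(post, pre)
print(f"mean change = {np.mean(post - pre):.0f} steps, t = {t:.2f}, p = {p:.4f}")
```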

Page No: 776-781 | Full Text

 

Original Research Article

MORBIDITY PATTERN, HEALTH SEEKING BEHAVIOUR AND ITS RELATED FACTORS AMONG THE ELDERLY URBAN DWELLERS OF NAGAON TOWN, ASSAM, INDIA

http://dx.doi.org/10.70034/ijmedph.2026.2.134

Hiyeswar Borah, Nibir Nath Sarma, Kalyan Borah, Jahnabi Das, Simmy Gavel

View Abstract

Background: With India projected to have nearly 20% of its population in the geriatric age group, the changing demographic structure underscores the crucial role of older adults in nation-building through community engagement, knowledge dissemination, and shared experiences. Ensuring graceful ageing requires health-promotional activities such as recreation, physical activity, and social contact. Comprehensive geriatric assessment and coordinated screening programs are necessary for an independent life in later stages. This study aims to assess the prevalence of morbidity patterns, their related factors and the health-seeking behaviour of elderly people, while also fostering awareness about health-promotional activities. Materials and Methods: A community-based cross-sectional study was conducted from May to October among elderly (≥60 years) people in urban areas of Nagaon town, Assam. A sample size of 384 was taken from 9 wards, selected randomly from the 28 registered wards. Data were collected using a pre-designed, pre-tested and semi-modified proforma by interviewing the elderly after taking informed consent. Ethical clearance was obtained from the Institutional Ethical Committee. Elderly people not willing to participate or giving incomplete information were excluded. Results: In total, 54.7% males and 45.3% females were included in the study. The majority (64.6%) were in the 60–74 years age group. Overweight was found in 42.9%, and 16.6% were obese. Hypertension was the commonest morbidity (63.8%) among the elderly, followed by DM (37.5%). About 16.4% of the elderly had at least two co-morbidities. Hypertension was more prevalent among people above 85 years (80%) than in the other age groups. Conclusion: Hypertension and diabetes were the commonest morbidities among the participants, and overweight and obesity were also common. Most of the elderly in our study lacked physical activity, implying a need to create awareness among the general public of the importance of physical activity for healthy ageing. Keywords: Elderly, Physical activity, Morbidity, Health seeking behaviour, Urban dwellers.

Page No: 782-786 | Full Text

 

Original Research Article

COMPARISON OF CONTINUOUS EPIDURAL ANALGESIA VERSUS CONTINUOUS FASCIA ILIACA BLOCK FOR POST OPERATIVE ANALGESIA IN PATIENTS UNDERGOING HIP SURGERIES: A RANDOMISED CONTROLLED STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.135

V S V Bhavya Krishna Prasad, K. Mohan, V. Prem Kumar

View Abstract

Background: Effective postoperative pain management is essential following major hip surgeries to facilitate early mobilisation, reduce complications, and improve functional recovery. While continuous epidural analgesia has long been considered the gold standard, it is associated with haemodynamic instability. The fascia iliaca compartment block (FICB) has emerged as a promising alternative, offering effective analgesia with potentially fewer systemic effects. This study compares the efficacy and safety of continuous epidural analgesia and continuous fascia iliaca block in patients undergoing hip surgeries. Materials and Methods: This prospective randomized controlled study was conducted over 18 months and included 90 patients undergoing elective hip surgeries. Patients were randomly allocated into two groups: Group E (continuous epidural analgesia) and Group F (continuous suprainguinal FICB), with 45 patients in each group. All patients received standardized spinal anaesthesia intraoperatively. Postoperatively, continuous analgesia was maintained via epidural catheter in Group E and ultrasound-guided fascia iliaca catheter in Group F. Pain was assessed using the Visual Analogue Scale (VAS) at predefined intervals. Haemodynamic parameters, rescue analgesic requirement, and adverse effects were recorded. Statistical analysis was performed using independent t-test and Chi-square test, with p <0.05 considered significant. Results: Baseline characteristics were comparable between the two groups (p >0.05). VAS scores were significantly lower in the epidural group from 1 hour to 24 hours postoperatively (p <0.001), indicating superior analgesic efficacy. Rescue analgesic requirement was significantly reduced in Group E (820 ± 215 mg) compared to Group F (1320 ± 260 mg) (p <0.001). Mean arterial pressure was significantly lower in the epidural group throughout the postoperative period (p <0.001), with a higher incidence of hypotension (20.0% vs 4.4%, p = 0.03). Heart rate, oxygen saturation, and respiratory rate were comparable between groups. No cases of respiratory depression were observed. Conclusion: Continuous epidural analgesia provides superior and sustained postoperative analgesia with reduced analgesic requirements but is associated with increased risk of hypotension. Continuous fascia iliaca block offers better haemodynamic stability with adequate pain control, making it a valuable alternative, particularly in patients where haemodynamic stability is a concern. Keywords: Epidural analgesia; Fascia iliaca compartment block; Hip surgery; Postoperative pain; Visual Analogue Scale.

Page No: 787-790 | Full Text

 

Original Research Article

PARENTAL AWARENESS, KNOWLEDGE AND PREVALENCE OF REFRACTIVE ERRORS AMONG CHILDREN OF HOSPITAL PERSONNEL IN A TERTIARY CARE CENTRE

http://dx.doi.org/10.70034/ijmedph.2026.2.136

Suman M G, Sahana A, Zoya Mohsin, Arjuman Sadaf A, Govindaraj M

View Abstract

Background: Refractive errors are the leading cause of preventable visual impairment in children. Early detection and timely correction are essential to prevent long-term visual disability. This study aimed to determine parental awareness about refractive errors, and their prevalence and associated risk factors, among children of hospital personnel at a tertiary care centre. Materials and Methods: A hospital-based cross-sectional study was conducted from May to July 2025 among 100 children aged 6 months to 18 years. A comprehensive ocular examination, including cycloplegic refraction, was performed. Parental awareness of the need for screening, age at first evaluation, parental refractive status and screen time were recorded. Data were analysed using SPSS version 22, and associations were tested using the Chi-square or Fisher's exact test. Results: Parental awareness was significantly higher among doctors compared to paramedical staff (p < 0.001). Among children diagnosed with amblyopia, 75% of parents were unaware of the condition. The overall prevalence of refractive error was 47%, with myopia (34%) and astigmatism (33%) being the most common types. No significant association was observed with age or gender. A significant association was found between parental refractive error and refractive error in children (p = 0.003). Screen time was significantly associated with overall refractive error (p = 0.006), though not specifically with myopia. Conclusion: Despite proximity to healthcare, gaps in parental awareness remain, particularly among non-medical staff. Regular screening and targeted educational interventions are essential for early detection and prevention of avoidable visual impairment. Parental refractive status and near-work exposure are important risk factors for childhood refractive errors. Keywords: early detection, amblyopia, screening, astigmatism, screen time.

Page No: 791-797 | Full Text

 

Original Research Article

IMPACT OF MATERNAL OBESITY ON FETOMATERNAL OUTCOMES IN PREGNANCY

http://dx.doi.org/10.70034/ijmedph.2026.2.137

Momy Gul, Madhu Bala, Asma Kashif, Musrat Mustafa Shah, Farzana Gul, Irum Naz

View Abstract

Background: The objective was to determine the frequency of maternal and fetal complications in pregnancies complicated by maternal obesity, to evaluate its association with adverse maternal outcomes such as gestational diabetes mellitus, hypertensive disorders, and cesarean delivery, and to assess fetal outcomes including macrosomia, preterm birth, and neonatal complications, while also analyzing the relationship between different body mass index categories and pregnancy outcomes, and its impact on the mode of delivery and overall perinatal outcomes. Duration and Place of Study: This study was conducted at Bolan Medical College, Quetta, from February 2025 to February 2026. Materials and Methods: A cross-sectional study was conducted on 200 pregnant women with BMI >25 kg/m², gestational age ≥24 weeks, aged 20–35 years, and singleton pregnancies. Women with pre-existing chronic diseases, hypertension, pre-gestational diabetes, infections, or multiple gestations were excluded. Data on demographics, obstetric history, BMI, and clinical findings were collected using a structured proforma. Maternal outcomes included gestational hypertension, gestational diabetes, mode of delivery, and intrapartum complications, while fetal outcomes included macrosomia, stillbirth, and neonatal complications. Data were analyzed using SPSS v26, with continuous variables presented as mean ± SD, categorical variables as percentages, and chi-square tests applied for significance (p<0.05). Results: The study found several fetomaternal complications among pregnant women with obesity. Gestational hypertension was observed in 94 (47%) participants, and gestational diabetes in 80 (40%). Assisted births occurred in 80 (40%) cases, while cesarean sections were performed in 162 (81%). Fetal macrosomia was noted in 54 (27%) cases, and stillbirth in 16 (8%). Multiparous women had higher rates of assisted births compared to primiparous women. Gestational hypertension, cesarean delivery, and stillbirth were more frequent in women with a BMI above 30 kg/m² than in those with a BMI of 25.6–30 kg/m². Conclusion: Maternal obesity is strongly associated with adverse pregnancy and birth outcomes. Higher BMI increases the risk of maternal and fetal complications, obstetric interventions, and fetal macrosomia. Careful clinical assessment and antenatal counseling are recommended, and overweight and obese women should be considered high-risk. Keywords: Maternal obesity, Fetomaternal complications, Maternal complications, BMI, Obesity, Hypertension.

Page No: 798-802 | Full Text

 

Original Research Article

GLANDIN E2 GEL VS GLANDIN E2 TABLET FOR INDUCTION OF LABOR AND SUCCESSFUL DELIVERY

http://dx.doi.org/10.70034/ijmedph.2026.2.138

Sanobar Baloch, Kanta Bai Ahuja, Azra, Shahneela Moosa, Saba Ijaz, Ghazala Arshad

View Abstract

Background: Induction of labour is one of the most frequently performed obstetric interventions worldwide, accounting for nearly 20% of deliveries. Vaginal dinoprostone (prostaglandin E2) is widely used for cervical ripening and labour induction. Different formulations, including gel and pessary (tablet), are available; however, their comparative effectiveness remains a subject of clinical interest. Objective: To compare the efficacy and safety of Glandin E2 gel versus Glandin E2 tablet (pessary) for induction of labour and achievement of successful vaginal delivery. Study design: A randomised controlled study. Duration and place of study: This study was conducted at Indus Medical College, Tando Muhammad Khan, from January 2025 to January 2026. Materials and Methods: This randomised controlled trial was conducted in the Department of Obstetrics and Gynaecology. A total of 120 pregnant women with singleton, live, cephalic presentation pregnancies at ≥37 weeks of gestation with intact membranes were enrolled. Participants were randomly allocated into two equal groups (n = 60 each). Group A received the Glandin E2 tablet (dinoprostone pessary), while Group B received Glandin E2 gel, both administered intravaginally in the posterior fornix. Data were analysed using SPSS version 20. Maternal age and gravidity were recorded, along with induction outcomes. Results: A total of 120 females were included in this study and divided into two groups of equal size (n=60). The mean age was 27.8 years in group A and 26.4 years in group B. The majority of the females in both groups were primigravida. Conclusion: Glandin E2 gel appears to be more effective than the Glandin E2 tablet in achieving successful induction of labour and vaginal delivery. Keywords: Labour induction; Dinoprostone; Prostaglandin E2; Cervical ripening; Glandin E2 gel; Glandin E2 pessary; Randomised controlled trial.

Page No: 803-806 | Full Text

 

Original Research Article

THE INTERPREGNANCY INTERVAL AS A DETERMINANT OF ADVERSE OBSTETRIC OUTCOMES

http://dx.doi.org/10.70034/ijmedph.2026.2.139

Soniya Mehtab, Kanwal Atif, Madhu Bala, Samina Bugti, Banadi, Nosheen Mushtaq

View Abstract

Background: The objective is to determine the relationship between interpregnancy interval (IPI) and obstetric outcomes in women with more than one pregnancy, and whether spacing between pregnancies plays a role in negative maternal and perinatal outcomes. This study was conducted at Liaquat University of Medical and Health Sciences, Jamshoro, from January 2025 to January 2026. Materials and Methods: This prospective observational study was carried out among 200 booked pregnant women with a parity of 1–4. Participants were divided into two groups according to interpregnancy interval: less than 18 months and 18–24 months. Women who had chronic medical conditions, multiple pregnancies or grand multiparity were excluded. Each participant was observed during pregnancy until delivery, and maternal and newborn outcomes were measured. Standard tests of significance were used for statistical analysis, and p less than 0.05 was regarded as significant. Results: The majority of the women (112; 56%) were aged 20–30 years, and 118 (59.0%) were from the lower socioeconomic levels. Caesarean birth was the commonest mode of delivery, occurring in 148 (74%) women. Among the neonatal outcomes, 52 (26%) infants were born with low birth weight and 30 (15%) were preterm. Maternal complications comprised anemia in 38 (19%), gestational diabetes in 17 (8.5%) and hypertensive disorders in 14 (7%). A significant correlation was found between short IPI (<18 months) and the adverse outcomes of anemia (p = 0.001), preterm birth (p = 0.000) and low birth weight (p = 0.000). None of the sociodemographic characteristics was significantly associated with IPI. Conclusion: Very short intervals between pregnancies, especially less than 18 months, are closely linked to the risk of maternal anemia, preterm birth and low birth weight. Postpartum counseling and easy access to family-planning services can be very useful in encouraging optimal birth spacing of 18–24 months, which in turn can lead to significant improvement in the health outcomes of both mother and child. Keywords: Interpregnancy interval, Obstetric outcome, Maternal morbidity, Neonatal outcome, Birth spacing.

Page No: 807-812 | Full Text

 

Original Research Article

PULMONARY DYSFUNCTION IN TYPE 2 DIABETES MELLITUS: INTEGRATED ANALYSIS OF GLYCEMIC BURDEN, ADIPOSITY, AND SPIROMETRIC IMPAIRMENT

http://dx.doi.org/10.70034/ijmedph.2026.2.140

Mahboob Fatima Mohd Sirajuddin Ahmed Siddiqi, Aamir Naushad, Anand N. Badwe

View Abstract

Background: Type 2 Diabetes Mellitus (T2DM) is a chronic metabolic disorder associated with multiple systemic complications. Emerging evidence suggests that the lung may represent an additional target organ; however, pulmonary involvement remains under-recognized. The objective is to evaluate pulmonary function in patients with Type 2 Diabetes Mellitus and to assess its association with glycemic parameters and anthropometric indices. Materials and Methods: This analytical case–control study included 200 participants comprising 100 patients with T2DM and 100 age- and sex-matched healthy controls. Anthropometric parameters including body mass index (BMI), waist circumference, hip circumference, and waist–hip ratio (WHR) were recorded. Glycemic status was assessed using fasting blood sugar (FBS) and postprandial blood sugar (PPBS). Pulmonary function tests were performed using spirometry, measuring FVC, FEV₁, FEV₁/FVC ratio, FEF 25–75%, and PEF. Statistical analysis was conducted using Z-test and Pearson’s correlation. Results: Patients with T2DM exhibited significantly higher BMI, waist circumference, hip circumference, and WHR compared to controls (p < 0.0001). Glycemic parameters (FBS and PPBS) were markedly elevated in the diabetic group (p < 0.0001). Pulmonary function tests revealed significant reductions in FVC and FEV₁ in diabetic patients (p < 0.0001), with a predominantly restrictive pattern of lung impairment observed. Significant negative correlations were found between glycemic parameters and pulmonary function indices, indicating progressive decline in lung function with increasing glycemic burden. Conclusion: Type 2 Diabetes Mellitus is associated with significant impairment in pulmonary function, predominantly of a restrictive nature. Chronic hyperglycemia and increased adiposity appear to play a central role in this dysfunction. These findings support the inclusion of pulmonary function testing in the routine assessment of patients with T2DM. Keywords: Type 2 Diabetes Mellitus; Pulmonary Function; Spirometry; Glycemic Burden; Adiposity; Restrictive Lung Disease; Forced Vital Capacity; Insulin Resistance.

Page No: 813-820 | Full Text

 

Original Research Article

PREVALENCE & PREDICTORS OF MDR TB AMONG PULMONARY TB PATIENTS IN AN ASPIRATIONAL TRIBAL DISTRICT: A CROSS-SECTIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.141

Mittal C. Balat, Nehal L. Damor

View Abstract

Background: Multidrug-resistant tuberculosis (MDR-TB) poses a significant challenge to tuberculosis control, particularly in vulnerable and underserved populations. Tribal communities residing in aspirational districts often experience poor healthcare access, malnutrition, and treatment non-adherence, increasing the risk of drug resistance. The objective is to determine the prevalence and predictors of MDR-TB among pulmonary TB patients in an aspirational tribal district. Materials and Methods: A cross-sectional observational study was conducted among 100 pulmonary TB patients registered under the National TB Elimination Programme in an aspirational tribal district. Socio-demographic, clinical, and treatment-related variables were collected using a structured proforma. Drug susceptibility testing was performed using CBNAAT and Line Probe Assay. Statistical analysis was conducted using SPSS. Associations were assessed using the Chi-square test and independent t-test, and odds ratios were calculated with 95% confidence intervals. Results: The prevalence of MDR-TB was 18% (95% CI: 11.5-26.7%). Tribal status (p = 0.019), HIV positivity (p = 0.012), and low BMI (p = 0.015) were significantly associated with MDR-TB. Treatment-related factors such as previous TB treatment (p = 0.003), treatment default (p = 0.002), irregular drug intake (p = 0.001), and substance abuse (p = 0.014) were strong predictors of drug resistance. Age and diabetes mellitus were not significantly associated with MDR-TB. Conclusion: The study revealed a substantial burden of MDR-TB in an aspirational tribal district, with treatment-related non-adherence emerging as the strongest predictor. Strengthened adherence monitoring, early molecular diagnosis, nutritional support, and targeted interventions in tribal communities are essential to reduce the burden of MDR-TB and achieve TB elimination goals. Keywords: Multidrug-resistant tuberculosis; Tribal population; Drug resistance predictors.

Page No: 821-825 | Full Text

 

Original Research Article

FUTURE OF MONOCLONAL GAMMOPATHY DIAGNOSIS: MASS SPECTROMETRY, A TRANSFORMATION IN PERSPECTIVE BEYOND ELECTROPHORESIS

http://dx.doi.org/10.70034/ijmedph.2026.2.142

E. Maruthi Prasad, Rajesh Bendre

View Abstract

Immunofixation electrophoresis (IFE) and serum protein electrophoresis (SPEP) have been the major diagnostic tools for monoclonal gammopathies (MGs) in diagnostic laboratory settings for over five decades. These techniques are necessary but have limitations in resolution, subjectivity, and sensitivity. Recently, clinical laboratories have increasingly applied high-resolution technologies such as mass spectrometry (MS) and next-generation sequencing (NGS). This article discusses, for the first time, the historic role and operational limitations of SPEP and IFE, along with the analytical challenges they pose in diagnostic settings. We also discuss successor approaches, such as MS-based immunoglobulin profiling and a standard method for the detection and quantification of MS-based monoclonal proteins. Additionally, analyzing the B-cell repertoire through NGS provides an opportunity to examine the genome for clonal detection, and integrating biochemical and genomic data with biomarkers such as abnormal glycosylation patterns and heavy/light-chain ratios can help evaluate the risk of myelomas. A framework for early intervention, precise surveillance, and individualized treatment approaches in myeloma management is urgently needed. This transformation affects laboratories operationally and strategically, and opens a new era for early disease diagnosis, tracing minimal residual disease, and making personalized predictions about how conditions such as Monoclonal Gammopathy of Undetermined Significance (MGUS) will progress. Keywords: Monoclonal gammopathy, mass spectrometry, serum protein electrophoresis, immunofixation electrophoresis, Next-generation sequencing, multiple myeloma, MGUS, proteomics, clonality.

Page No: 826-834 | Full Text

 

Review Article

PRESCRIPTION WRITING SKILLS IN FRESH MEDICAL GRADUATES

http://dx.doi.org/10.70034/ijmedph.2026.2.143

Ramesh Lolla, Shilpa Karkera, Yoga Rajamani, Kapili Dada

View Abstract

Prescription writing is one of the most important clinical responsibilities entrusted to fresh medical graduates and serves as a key indicator of their readiness for independent practice. It is not merely a technical act of writing the name of a medicine, but a complex professional skill that integrates pharmacological knowledge, clinical reasoning, ethical responsibility, legal awareness, communication, and patient safety. This review examined the concept, scope, and educational significance of prescription writing skills in fresh medical graduates, with particular attention to the transition from undergraduate learning to real-world prescribing. The review highlights that new graduates frequently encounter difficulties in converting theoretical knowledge into safe and rational prescribing decisions in actual clinical settings. Common concerns include incomplete prescriptions, unclear instructions, inadequate dose individualization, weak patient counselling, and insufficient safety checks. These problems are influenced not only by individual inexperience, but also by limitations in undergraduate training, lack of repeated practical exposure, and the pressures of busy clinical environments. The review further emphasizes that good prescription writing is grounded in regulatory compliance, ethical prescribing, completeness of documentation, rational drug selection, and sound clinical reasoning before the order is written. It also underlines the importance of communication and patient-centredness, as a prescription must be understandable and usable for patients, pharmacists, nurses, and other healthcare professionals. Keywords: Prescription writing, fresh medical graduates, rational prescribing, patient safety, prescribing education.

Page No: 835-844 | Full Text

 

Original Research Article

INTEGRATED TEACHING IN MEDICAL COLLEGES: ADVANTAGES AND LIMITATIONS

http://dx.doi.org/10.70034/ijmedph.2026.2.144

Ramesh Lolla, Shilpa Karkera, Hannah Chung, David Dunn

View Abstract

Integrated teaching has become an important reform in undergraduate medical education because it addresses the long-standing problem of fragmented learning in traditional discipline-based curricula. In conventional teaching, basic sciences and clinical subjects are often delivered separately, making it difficult for students to connect foundational knowledge with patient care. Integrated teaching attempts to overcome this gap by linking related disciplines, clinical relevance, professional competencies, and emerging healthcare themes within a more coherent curricular structure. This review examines the concept of integrated teaching in medical colleges, with particular focus on its major advantages and limitations. The review highlights that integrated teaching supports meaningful learning by promoting conceptual linkage between subjects, strengthening horizontal and vertical integration, and improving the relevance of preclinical knowledge through early clinical exposure. It also broadens the educational scope of the curriculum by incorporating patient safety, systems thinking, professionalism, digital health, telehealth, simulation, leadership, nutrition, research literacy, humanities, and future-oriented healthcare needs. Such an approach encourages the development of clinical reasoning, reflective capacity, communication, teamwork, and professional identity in a more connected and learner-centred manner. At the same time, integrated teaching presents several practical and academic challenges. These include the need for clear curricular models, faculty development, strong interdepartmental coordination, resource support, assessment alignment, and protection against superficial or tokenistic implementation. Inadequate planning may lead to confusion, loss of disciplinary depth, curricular overload, and weak educational outcomes. Keywords: Integrated teaching; medical education; curriculum integration; undergraduate medical curriculum; competency-based education.

Page No: 845-854 | Full Text

 

Original Research Article

A COMPARATIVE STUDY OF DESFLURANE VERSUS SEVOFLURANE IN OBESE PATIENTS UNDERGOING LAPAROSCOPY SURGERY: EFFECT ON RECOVERY PROFILE

http://dx.doi.org/10.70034/ijmedph.2026.2.145

Akanksha Jain, Anchal Tiwari, Gaurita Shrivastava

View Abstract

Background: Obesity presents significant anesthetic challenges due to altered respiratory mechanics, pharmacokinetics, and increased perioperative risks. Volatile anesthetics such as desflurane and sevoflurane are commonly used due to their low blood–gas solubility, but their comparative effects on recovery profiles in obese patients remain an area of clinical interest. Aim: To compare desflurane and sevoflurane with respect to recovery characteristics, cognitive function, hemodynamic stability, and postoperative adverse effects in obese patients undergoing laparoscopic surgery. Materials and Methods: This prospective, randomized, double-blind study included 80 obese patients (BMI ≥30 kg/m²), aged 18–60 years, undergoing elective laparoscopic surgery. Patients were allocated into two groups: desflurane (Group D) and sevoflurane (Group S). Standardized anesthesia protocols were followed. Primary outcomes included recovery parameters (eye opening, extubation, orientation), Modified Aldrete Score, and Mini-Mental State Examination (MMSE). Secondary outcomes included intraoperative hemodynamics, adverse events, and post-anesthesia care unit (PACU) stay duration. Results: Baseline characteristics were comparable between groups. Desflurane demonstrated significantly faster recovery, with shorter times to eye opening (4.6 ± 1.4 vs 7.5 ± 1.5 min), verbal response (5.2 ± 1.3 vs 8.1 ± 1.4 min), and extubation (6.3 ± 1.1 vs 9.2 ± 1.6 min) (p < 0.001). Cognitive recovery was also faster, with earlier return to baseline MMSE (39.6 ± 12.8 vs 51.1 ± 13.5 min; p = 0.001). Modified Aldrete Scores were significantly higher in the desflurane group at all intervals, indicating quicker recovery. Hemodynamic parameters and respiratory variables remained stable and comparable in both groups. The incidence of adverse effects was low and statistically insignificant. PACU discharge readiness was achieved earlier with desflurane (45.8 ± 7.5 vs 58.2 ± 8.4 min; p < 0.001). Conclusion: Desflurane provides significantly faster emergence, improved early cognitive recovery, and reduced PACU stay compared to sevoflurane in obese patients undergoing laparoscopic surgery, without compromising hemodynamic stability or safety. Keywords: Desflurane, Sevoflurane, Obesity, Laparoscopic surgery, Recovery profile, MMSE, PACU.

Page No: 855-863 | Full Text

 

Original Research Article

PRESSURE-CONTROLLED VERSUS VOLUME-CONTROLLED VENTILATION IN ROBOT-ASSISTED PELVIC SURGERIES: A COMPARATIVE ANALYSIS OF RESPIRATORY MECHANICS AND HEMODYNAMIC EFFECTS

http://dx.doi.org/10.70034/ijmedph.2026.16.2.146

Praveen, Henjarappa K S, Mamatha H S, Arathi B H, V. B. Gowda, Srihari S S

View Abstract

Background: Robot-assisted pelvic surgeries require pneumoperitoneum and steep Trendelenburg positioning, which significantly affect respiratory mechanics and cardiovascular physiology. Selection of an optimal ventilation strategy is crucial to minimize pulmonary complications and maintain hemodynamic stability. Pressure-controlled ventilation and volume-controlled ventilation are commonly used modes; however, their comparative intraoperative effects remain inadequately defined in robotic surgical settings. Objectives: To compare the effects of pressure-controlled ventilation and volume-controlled ventilation on respiratory mechanics and hemodynamic parameters in patients undergoing robot-assisted pelvic surgeries. Materials and Methods: This prospective comparative study included 88 adult patients undergoing elective robot-assisted pelvic surgeries, who were randomly allocated into two groups: PCV group (n = 44) and VCV group (n = 44). Standardized anesthetic protocols were followed. Respiratory parameters including peak airway pressure, dynamic compliance, tidal volume, minute ventilation, end-tidal carbon dioxide, and oxygen saturation were recorded during pneumoperitoneum and Trendelenburg positioning. Hemodynamic parameters such as heart rate, systolic blood pressure, diastolic blood pressure, and mean arterial pressure were also evaluated. Safety outcomes including ventilator alarms, intraoperative hypertension episodes, and postoperative pulmonary complications were analyzed. Statistical analysis was performed using appropriate parametric and non-parametric tests. Results: The PCV group demonstrated significantly lower peak airway pressures and higher dynamic lung compliance compared to the VCV group (p < 0.001). Oxygenation and ventilation efficiency remained comparable between the two groups. Hemodynamic parameters were largely similar; however, diastolic blood pressure stability was significantly better in the PCV group (p < 0.001). PCV was associated with fewer airway pressure alarms, reduced need for ventilator adjustments, and a lower incidence of intraoperative hypertension episodes (p < 0.05). Postoperative pulmonary complications were numerically lower in the PCV group, though not statistically significant. Conclusion: Pressure-controlled ventilation provides superior respiratory mechanics and improved intraoperative safety compared to volume-controlled ventilation in robot-assisted pelvic surgeries, while maintaining stable hemodynamics and effective gas exchange. PCV may be considered the preferred ventilation strategy in robotic surgical procedures requiring pneumoperitoneum and steep Trendelenburg positioning. Keywords: Pressure-controlled ventilation. Volume-controlled ventilation. Robot-assisted pelvic surgery.

Page No: 864-869 | Full Text

 

Original Research Article

CLINICAL PROFILE AND SPECTRUM OF CARDIAC DISEASE IN PREGNANCY IN TERTIARY CARE CENTRE IN NORTH INDIA

http://dx.doi.org/10.70034/ijmedph.2026.2.147

Amit Varshney, Shubhangi Gupta, Preeti Singh

View Abstract

Background: Cardiovascular disease in pregnancy is an important cause of indirect maternal morbidity and mortality, with a shifting pattern in developing countries due to the coexistence of rheumatic and congenital heart diseases. Objective: To evaluate the clinical profile and spectrum of heart disease in pregnancy at a tertiary care center. Materials and Methods: This prospective observational study was conducted at a tertiary care hospital from February 2024 to February 2026. A total of 300 pregnant women were evaluated for diagnosed or suspected cardiac disease. Detailed clinical evaluation, electrocardiography, and echocardiography were performed. Data were analyzed using descriptive statistics. Results: Among the 300 pregnancies, 13 cases (4.33%) had cardiac disease. Valvular heart disease was the most common (2.00%), with mitral stenosis being predominant (1.33%). Cardiomyopathy was observed in 1.00%, congenital heart disease in 0.67%, while arrhythmia and post-surgical cases each accounted for 0.33%. Most women had vaginal delivery (70%), while 30% underwent cesarean section. Conclusion: Cardiac disease in pregnancy was relatively uncommon but clinically significant. Valvular heart disease remains the predominant lesion. Early diagnosis, multidisciplinary care, and close antenatal monitoring are essential to improve maternal and fetal outcomes. Keywords: Pregnancy; Cardiac disease; Valvular heart disease; Rheumatic heart disease; Cardiomyopathy; Maternal outcome; Echocardiography.

Page No: 870-872 | Full Text

 

Original Research Article

GENDER VARIATIONS IN CARDIOVASCULAR AUTONOMIC FUNCTION: A CROSS-SECTIONAL STUDY IN MEDICAL STUDENTS

http://dx.doi.org/10.70034/ijmedph.2026.2.148

Bhumi M. Hirani, Hitesh Kumar Solanki

View Abstract

Background: Gender-related differences in cardiovascular autonomic regulation have been widely studied, yet findings remain inconsistent, particularly in young healthy populations. This study aimed to evaluate variations in cardiovascular autonomic function between male and female medical students. Materials and Methods: A cross-sectional study was conducted among 162 undergraduate medical students. Resting heart rate and blood pressure were recorded, along with autonomic function tests assessing parasympathetic activity (deep breathing difference, E:I ratio, 30:15 ratio) and sympathetic activity (blood pressure response to standing, sustained handgrip test, and cold pressor test) using standardized protocols. Data were analysed using the independent t-test and the Mann–Whitney U test; a p-value ≤ 0.05 was considered statistically significant. Results: Females exhibited a significantly higher resting heart rate compared to males (80.47 ± 14.45 vs 76.17 ± 12.65 bpm; p = 0.036). In contrast, males had significantly higher systolic (120.46 ± 7.14 vs 114.35 ± 6.19 mmHg; p < 0.001) and diastolic blood pressure (77.19 ± 4.20 vs 74.71 ± 5.97 mmHg; p = 0.022). Parasympathetic function parameters, including deep breathing difference, E:I ratio, and 30:15 ratio, were slightly higher in males but did not differ significantly. Similarly, sympathetic function tests, including fall in systolic blood pressure on standing, rise in diastolic blood pressure during handgrip, and rise in systolic blood pressure during cold pressor test, showed no statistically significant gender differences. Conclusion: While significant gender differences were observed in resting cardiovascular parameters, autonomic function indices were comparable between males and females, suggesting similar autonomic regulation in young healthy individuals. Keywords: Cardiovascular autonomic function, gender differences, heart rate variability, sympathetic activity, parasympathetic activity, medical students.
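The parasympathetic indices above are simple ratios over R-R intervals. The sketch below shows one common way to compute the E:I ratio and the 30:15 ratio, assuming Ewing-style definitions; the beat windows and interval series are illustrative assumptions, not the authors' protocol.

```python
# A minimal sketch of two parasympathetic indices named in the abstract,
# computed from R-R intervals in seconds. Definitions follow common
# Ewing-style formulations; the windows and interval series are
# illustrative assumptions, not the authors' protocol.

def e_i_ratio(expiration_rr: list, inspiration_rr: list) -> float:
    """E:I ratio: mean of the longest R-R intervals during expiration
    over the mean of the shortest R-R intervals during inspiration."""
    return (sum(expiration_rr) / len(expiration_rr)) / (
        sum(inspiration_rr) / len(inspiration_rr))

def ratio_30_15(rr_after_standing: list) -> float:
    """30:15 ratio: longest R-R around the 30th beat after standing
    divided by the shortest R-R around the 15th beat."""
    around_30 = rr_after_standing[25:35]
    around_15 = rr_after_standing[10:20]
    return max(around_30) / min(around_15)

print(round(e_i_ratio([1.05, 1.10, 1.08], [0.80, 0.78, 0.82]), 2))  # 1.35
rr = [0.70 + 0.002 * i for i in range(40)]  # gradual slowing after standing
print(round(ratio_30_15(rr), 2))            # 1.07
```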

Page No: 873-877 | Full Text

 

Original Research Article

STUDY OF MICROALBUMINURIA IN OBESE INDIVIDUALS AS AN EARLY INDICATOR OF NEPHROPATHY

http://dx.doi.org/10.70034/ijmedph.2026.2.149

Sanketh Janardhan, Mohit A Kalyankar, Aminta Albert, Relvin Roy, Rabeca Johnson, Sanabil S P, Abdul Haque Usman Pulath Puthanath

View Abstract

Background: Obesity is a major global health concern associated with metabolic, cardiovascular, and renal complications. Increasing evidence suggests that obesity alone, even in the absence of hypertension and diabetes, can lead to early renal damage through mechanisms such as endothelial dysfunction, glomerular hyperfiltration, and chronic inflammation. Microalbuminuria, defined as urinary albumin excretion of 30–300 mg/g creatinine, is a sensitive and early marker of both renal impairment and generalized endothelial dysfunction. The aim is to evaluate endothelial dysfunction due to obesity using microalbuminuria as a marker and to study obesity-induced nephropathy in normotensive, non-diabetic obese individuals. The primary objective is to evaluate microalbuminuria as an early, non-invasive indicator of nephropathy in this population; secondary objectives are to determine the prevalence of microalbuminuria, to assess its correlation with body mass index (BMI), and to explore the need for routine screening for early detection of subclinical renal impairment. Materials and Methods: This descriptive cross-sectional observational study was conducted in the Department of General Medicine at Sri Chamundeshwari Medical College Hospital & Research Institute over a period of 6 months from November 2025 to April 2026. A total of 87 obese individuals (BMI >23 kg/m² as per Asia-Pacific classification), aged 18–70 years, who were normotensive and non-diabetic were included. Detailed clinical evaluation, anthropometric measurements, and laboratory investigations including fasting blood sugar and HbA1c were performed. Microalbuminuria was assessed using spot urine albumin-to-creatinine ratio (ACR) by the Roche immunoturbidimetric method and was defined as ACR 30–300 mg/g creatinine. Statistical analysis was carried out using SPSS version 20, with p < 0.05 considered statistically significant. Results: Out of 87 participants, 31 (35.6%) were found to have microalbuminuria. The prevalence increased with rising BMI, from 16.7% in overweight individuals to 56.0% in those with BMI ≥30 kg/m². A statistically significant association was observed between BMI categories and microalbuminuria (p = 0.007). Additionally, a moderate positive correlation was found between BMI and urinary ACR (r = 0.48, p = 0.001), indicating increasing renal involvement with higher adiposity. Conclusion: Microalbuminuria is significantly prevalent among normotensive, non-diabetic obese individuals and correlates positively with BMI. It serves as an early, non-invasive marker of obesity-related endothelial dysfunction and nephropathy. Routine screening for microalbuminuria in obese individuals may facilitate early detection and timely intervention, thereby preventing progression to overt renal disease. Keywords: Obesity, Microalbuminuria, Nephropathy, Endothelial Dysfunction, Body Mass Index (BMI), Albumin-Creatinine Ratio (ACR)
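The analysis above hinges on two derived quantities, the spot urine ACR and the BMI category. A minimal sketch follows, using the thresholds stated in the abstract (ACR 30–300 mg/g; overweight from 23 kg/m²); the obese cut-off of 25 kg/m² and the unit convention are assumptions, and the example inputs are illustrative.

```python
# A minimal sketch of the two derived quantities the analysis relies on:
# spot urine ACR and the Asia-Pacific BMI class. Thresholds follow the
# abstract (ACR 30-300 mg/g; overweight from 23 kg/m2); the obese
# cut-off of 25 kg/m2 and the unit convention (albumin in mg/dL,
# creatinine in g/dL) are assumptions. Example inputs are illustrative.

def acr_mg_per_g(urine_albumin_mg_dl: float,
                 urine_creatinine_g_dl: float) -> float:
    """Spot urine albumin-to-creatinine ratio in mg albumin per g creatinine."""
    return urine_albumin_mg_dl / urine_creatinine_g_dl

def has_microalbuminuria(acr: float) -> bool:
    """Microalbuminuria as defined in the abstract: ACR 30-300 mg/g."""
    return 30 <= acr <= 300

def asia_pacific_bmi_class(bmi: float) -> str:
    """Simplified Asia-Pacific classification (underweight omitted)."""
    if bmi < 23:
        return "normal"
    if bmi < 25:
        return "overweight"
    return "obese"

acr = acr_mg_per_g(4.5, 0.09)          # 50.0 mg/g
print(acr, has_microalbuminuria(acr))  # 50.0 True
print(asia_pacific_bmi_class(27.4))    # obese
```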

Page No: 878-882 | Full Text

 

Original Research Article

EFFECTIVENESS OF CALCIUM DOBESILATE COMPARED TO PLACEBO IN CHRONIC VENOUS INSUFFICIENCY: A COMPARATIVE STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.150

Abhishek Gaur, Nidhi Gaur

View Abstract

Background: Chronic venous insufficiency (CVI) refers to a condition in which the venous system fails to effectively return blood to the heart in a unidirectional manner, thereby compromising tissue drainage, temperature regulation, and overall hemodynamic balance, irrespective of body position or physical activity. Several randomized controlled trials have demonstrated the therapeutic benefits of calcium dobesilate in managing CVI. The present study aimed to evaluate and compare the efficacy of calcium dobesilate with placebo in patients diagnosed with chronic venous insufficiency. Materials and Methods: This study was conducted in the Department of Surgery at Muzaffarnagar Medical College, Muzaffarnagar, Uttar Pradesh, India, over a period of two years from March 2020 to March 2022. A total of 150 participants were included and equally divided into two groups: one group received calcium dobesilate (500 mg) twice daily for 8 weeks, while the other group was administered a placebo. Results: Significant improvement (P < 0.05) was observed in symptoms such as pain, leg heaviness, fatigue, tingling, itching, and muscle cramps in the calcium dobesilate group compared to the placebo group. Both patient-reported outcomes and investigator assessments showed better overall improvement in the treatment group. Conclusion: Calcium dobesilate significantly alleviates symptoms in patients with clinically diagnosed CVI, irrespective of concurrent use of compression stockings. Keywords: Calcium dobesilate, Chronic venous insufficiency, Venous disorders.

Page No: 883-886 | Full Text

 

Original Research Article

A COMPARATIVE EVALUATION OF FISTULECTOMY WITH SPHINCTEROPLASTY VERSUS FISTULECTOMY WITH SPHINCTEROPLASTY COMBINED WITH MARTIUS FLAP REPAIR IN THE MANAGEMENT OF RECTOVAGINAL FISTULA

http://dx.doi.org/10.70034/ijmedph.2026.2.151

Abhishek Gaur, Nidhi Gaur

View Abstract

Background: Surgical intervention remains the primary and definitive treatment for anal fistula, a condition recognized for more than two millennia. Despite advances in operative techniques, management continues to be challenging due to variable recurrence rates and the potential risk of postoperative incontinence. The aim is to compare the efficacy, clinical effectiveness, and postoperative outcomes of fistulectomy with sphincteroplasty versus fistulectomy with sphincteroplasty combined with Martius flap repair. Materials and Methods: This study included a total of 24 patients diagnosed with anal fistula. Among them, 13 patients underwent fistulectomy with sphincteroplasty, while 11 patients were treated with fistulectomy, sphincteroplasty, and Martius flap repair. Patients were evaluated for surgical success, recurrence, and postoperative complications. Results: The success rate in the fistulectomy with sphincteroplasty group was 38.4% (5 patients), with recurrence observed in 8 patients. In contrast, the group treated with the addition of Martius flap repair demonstrated a higher success rate of 72.7% (8 patients). Recurrence occurred in 3 patients in this group; however, these cases were managed conservatively for two weeks, resulting in complete healing. Conclusion: The findings of this study suggest that the incorporation of Martius flap repair with fistulectomy and sphincteroplasty provides superior postoperative outcomes, including lower recurrence rates, reduced risk of incontinence, decreased wound infection, and improved patient satisfaction. Keywords: Anal fistula, Fistulectomy, Sphincteroplasty, Martius flap repair.

Page No: 887-890 | Full Text

 

Original Research Article

RETROSPECTIVE ANALYSIS OF PREDICTABILITY OF CORONARY CALCIUM SCORE IN ESTABLISHED CORONARY ARTERY DISEASE PATIENTS UNDERGOING CORONARY ARTERY BYPASS GRAFTING

http://dx.doi.org/10.70034/ijmedph.2026.2.152

K S Harikrishna, Tella RamaKrishna Dev, Sai Surabhi P, RV Kumar

View Abstract

Background: Coronary artery disease (CAD) remains a leading cause of morbidity and mortality worldwide. Early identification of significant coronary artery involvement is crucial for timely intervention. Coronary artery calcium (CAC) scoring using computed tomography has emerged as a non-invasive tool for assessing atherosclerotic burden and predicting CAD severity. Objectives: The primary objective of this study was to determine the predictability of significant coronary artery disease using coronary calcium scoring (Agatston score). The secondary objectives were to correlate calcium scores with the degree of coronary vessel occlusion and with associated cardiovascular risk factors. Materials and Methods: This observational study included 72 patients diagnosed with triple vessel coronary artery disease who underwent coronary artery bypass grafting (CABG) over a period of six months at our center. All patients were evaluated using conventional coronary angiography and CT coronary angiography for calcium scoring. Demographic data, risk factors, and angiographic findings were recorded and analyzed. Results: Among the study population, 52 (72.2%) were males and 20 (27.8%) were females, with a mean age of 57 years. Diabetes mellitus was present in 61% of patients, hypertension in 68%, and a positive family history of CAD in 64%. A history of smoking and alcohol consumption was noted in 66% and 61% of patients, respectively. Significant left main coronary artery disease (>50% stenosis) was observed in 7% of patients. Approximately 10% of patients had complete occlusion of the left anterior descending artery, while 80–90% stenosis was seen in nearly 65% of patients. Only 7% of patients had low Agatston scores, whereas the majority demonstrated moderate to high scores, indicating a strong association between elevated coronary calcium scores and the severity of coronary artery disease. Conclusion: Coronary calcium scoring is a valuable non-invasive tool for predicting the presence and severity of significant coronary artery disease. Higher Agatston scores correlate well with the degree of vessel occlusion and the presence of conventional cardiovascular risk factors, making it a useful adjunct in risk stratification and clinical decision-making in patients with suspected or established CAD. Keywords: Coronary artery disease, Coronary calcium score, Agatston score, CT coronary angiography, Triple vessel disease, Coronary angiography, Coronary artery bypass grafting (CABG), Atherosclerosis.
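For readers unfamiliar with the Agatston method referenced above, the sketch below shows the standard per-lesion weighting (area × density factor for lesions ≥130 HU); the lesion values are hypothetical, not data from the study.

```python
# A minimal sketch of Agatston scoring, assuming the standard method:
# each calcified lesion with peak attenuation >= 130 HU contributes
# (area in mm^2) x a density weight. Lesion values are hypothetical.

def density_weight(peak_hu: float) -> int:
    """Standard Agatston density factor from peak Hounsfield units."""
    if peak_hu < 130:
        return 0  # below the calcium detection threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions: list) -> float:
    """Sum of area_mm2 * density weight over (area_mm2, peak_hu) pairs."""
    return sum(area * density_weight(peak_hu) for area, peak_hu in lesions)

# Three hypothetical lesions: 12*2 + 8*4 + 5*1 = 61
print(agatston_score([(12.0, 250), (8.0, 450), (5.0, 150)]))  # 61.0
```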

Page No: 891-895 | Full Text

 

Original Research Article

THE ROLE OF ALBUMIN AND LIPID PROFILE AS BIOMARKERS FOR PREDICTING MORTALITY IN SEPSIS PATIENTS IN INTENSIVE CARE UNIT

http://dx.doi.org/10.70034/ijmedph.2026.2.153

G. Nivetha, K. Sumitra Vellaiammal, K. Sujatha, A. P. Thiyagarajan

View Abstract

Background: Worldwide, sepsis continues to be one of the main reasons for intensive care unit (ICU) admissions and deaths, especially in low- and middle-income countries. There is an urgent need for reliable, accessible biomarkers that improve patient care by enabling risk stratification. Serum albumin and lipid profile parameters are increasingly being investigated as potential prognostic indicators in sepsis. The aim was to assess the ability of serum albumin and lipid profile parameters to predict mortality in sepsis patients admitted to the ICU, compared with two well-established severity scores (APACHE II and SOFA). Materials and Methods: This study was performed over twelve months in a tertiary hospital on patients admitted to the intensive care unit with sepsis. A total of 150 patients were included; data collected comprised demographic information, comorbid conditions, and serial clinical measurements. The APACHE II and SOFA scores were used to grade the severity of illness. Serum albumin and lipid profile (total cholesterol, HDL-C, LDL-C, and triglycerides) were measured at admission and every 24 hours for the following three days. All patients were categorized as survivors or non-survivors, and analyses were performed on these two groups. Results: Older adults (>70 years), patients with diabetes mellitus, and those with chronic liver disease had considerably higher mortality rates than younger patients without these comorbidities. Non-survivors had considerably higher APACHE II (26.02 ± 5.62 vs 19.20 ± 5.59) and SOFA scores (11.93 ± 3.69 vs 7.34 ± 2.47; p < 0.001) than survivors. Serum albumin concentrations were consistently lower in non-survivors than in survivors throughout the ICU stay. In addition, total cholesterol, HDL-C, and LDL-C were significantly lower in non-survivors than in survivors at all study time points, while triglyceride levels showed no appreciable relation to mortality. Conclusion: Low serum albumin and decreased total cholesterol, high-density lipoprotein cholesterol (HDL-C), and low-density lipoprotein cholesterol (LDL-C) levels in individuals with sepsis substantially improve prognostic accuracy when combined with previously established severity scores. These markers may also aid clinical decision making by giving the physician an accurate, cost-efficient assessment of the patient's prognosis. Keywords: Sepsis, ICU, Mortality, Albumin, Lipid Profile, HDL, LDL, Biomarkers, APACHE II, SOFA.
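The survivor versus non-survivor comparisons reported above are standard two-group tests. Below is a minimal sketch of such a comparison on admission serum albumin using SciPy's independent-samples t-test; the values are synthetic illustrations, not the study's data.

```python
# A minimal sketch of the survivor vs non-survivor comparison the
# abstract describes, applied to admission serum albumin with an
# independent-samples t-test from SciPy. The values are synthetic
# illustrations, not the study's data.

from scipy import stats

survivors_albumin = [3.4, 3.1, 3.6, 3.3, 3.5, 3.2, 3.7, 3.0]      # g/dL
non_survivors_albumin = [2.4, 2.7, 2.2, 2.6, 2.5, 2.3, 2.8, 2.1]  # g/dL

t_stat, p_value = stats.ttest_ind(survivors_albumin, non_survivors_albumin)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a low p suggests separation
```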

Page No: 896-901 | Full Text

 

Original Research Article

HEMATOLOGICAL PARAMETERS AS PREDICTORS OF DISEASE SEVERITY AND OUTCOMES IN COVID-19 PATIENTS: A PROSPECTIVE OBSERVATIONAL STUDY

http://dx.doi.org/10.70034/ijmedph.2026.2.154

G. Kayalvizhi, S. Sridevi, D. Dhivya

View Abstract

Background: Coronavirus disease 2019 (COVID-19) has a wide spectrum of clinical manifestations ranging from mild illness to severe disease and death. Early identification of patients at risk of severe disease is crucial for timely intervention. Hematological parameters and inflammatory markers have emerged as potential predictors of disease severity and outcomes. The aim is to evaluate hematological parameters as predictors of disease severity and clinical outcomes in COVID-19 patients. The objectives are to assess hematological parameters in patients with COVID-19 infection, to correlate these parameters with disease severity, and to evaluate their association with clinical outcomes such as ICU admission, ventilator requirement, and mortality. Materials and Methods: This prospective observational study was conducted on 150 RT-PCR confirmed COVID-19 patients admitted to a tertiary care hospital. Patients were categorized into severe and non-severe groups based on clinical criteria. Hematological parameters including hemoglobin, total leukocyte count, differential counts, platelet count, neutrophil-to-lymphocyte ratio (NLR), and platelet-to-lymphocyte ratio (PLR), along with inflammatory markers such as C-reactive protein (CRP) and serum ferritin, were recorded. Statistical analysis was performed using appropriate tests, and p < 0.05 was considered statistically significant. Results: Severe COVID-19 patients were significantly older and had a higher prevalence of comorbidities. Hematological findings revealed significant leukocytosis, neutrophilia, lymphopenia, and thrombocytopenia. NLR and PLR were significantly elevated in severe cases. Inflammatory markers such as CRP and ferritin were markedly higher in severe patients and non-survivors. These parameters showed strong associations with ICU admission, ventilator requirement, and mortality. Conclusion: Hematological parameters, particularly lymphocyte count, NLR, PLR, CRP, and ferritin, are valuable predictors of disease severity and outcomes in COVID-19 patients. These readily available markers can aid in early risk stratification and improve clinical management. Keywords: COVID-19; Neutrophil-to-Lymphocyte Ratio; Hematological Parameters.
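NLR and PLR, the derived ratios highlighted above, come directly from a routine complete blood count. The sketch below shows the arithmetic; the counts are illustrative, not study data.

```python
# A minimal sketch of the two derived ratios the abstract highlights,
# computed from routine complete blood count values. The counts below
# are illustrative, not study data.

def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio (absolute counts, same units)."""
    return neutrophils / lymphocytes

def plr(platelets: float, lymphocytes: float) -> float:
    """Platelet-to-lymphocyte ratio (absolute counts, same units)."""
    return platelets / lymphocytes

# Counts in 10^3 cells/uL: neutrophilia plus lymphopenia drives NLR up,
# the pattern the study reports in severe disease.
print(nlr(8.4, 0.8))   # 10.5
print(plr(210, 0.8))   # 262.5
```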

Page No: 902-909 | Full Text