#Juniper Publishers Indexing Sites
Let’s Talk and Grow Together: A Bidirectional Communication between Granulosa- and Oocyte-Derived Factors in the Ovary
Abstract
Reproduction, one of the most active and appealing areas of research for endocrinologists and reproductive biologists for many years, has several faces that remain to be unmasked in terms of its regulatory aspects. Available information on the regulation of oocyte development and maturational competence is incomplete and needs elucidation to achieve the utmost quality of eggs, a major area of concern. The notion of somatic follicular cells merely providing an appropriate microenvironment for the development of the oocyte throughout its journey has been replaced by the current perception of a complex yet regulated cross-talk between granulosa- and oocyte-derived factors that orchestrates follicle development. Interestingly, the actions of FSH and LH are mediated or modulated by locally produced non-steroidal peptide factors from the follicular layer and the oocyte itself (insulin-like growth factors (IGFs), epidermal growth factor (EGF) family members, TGFβ superfamily members, etc.), forming an intimate regulatory network within the ovarian follicles. The present article provides a deeper insight into the need for, and underlying mechanisms of action of, these growth factors in the intraovarian network to sustain a healthy oocyte.
Read More About This Article: https://juniperpublishers.com/gjorm/GJORM.MS.ID.555569.php
Read More Juniper Publishers Google Scholar Articles: https://scholar.google.com/citations?view_op=view_citation&hl=en&user=xBu_HGEAAAAJ&authuser=1&citation_for_view=xBu_HGEAAAAJ:Zph67rFs4hoC
Seasonal Prediction of Marine Ecosystems: How Close Are We?- Juniper Publishers
Abstract
The primary purpose of this article is to provide definitive statements regarding current skill in seasonal prediction, with emphasis on marine ecosystem indicators and how forecasts could be used for societal benefit. To enhance skill and set priorities for the further development and application of existing dynamical models for seasonal prediction, one has to recognize that this process requires robust interactions between the biophysical science and applications communities, and a delicate balance between scientific feasibility and application requirements. Recommendations for improving seasonal prediction skill and for enhancing the use of seasonal prediction information in applications are also outlined.
Introduction
A marine ecosystem is a complex nonlinear dynamical system, with significant spatial variability, strongly linked to the circulation of the atmosphere and oceans, and with temporal variability ranging from hours to decades. Marine ecosystem forecasters interact with society mainly because of the latter's demands for accurate and detailed ocean environmental forecasts. The complexity of the oceanic ecosystem means that quantitative predictions can only be made using comprehensive numerical models, which encode the relevant laws of dynamics, thermodynamics, and biochemistry for a complex dynamical system. Typically, such models comprise several million scalar equations, describing the interaction of physics and biochemistry on scales ranging from tens of kilometres to thousands of kilometres. These equations can only be solved on large computers.
Predictability - Theory and Models
The maximum predictability of a marine ecosystem is yet to be achieved in operational seasonal forecasting. This position is based upon the recognition that model error continues to limit forecast quality and that interactions among the nonlinear terms of the ecosystem set the limits of predictability. That model error continues to be problematic is evident from the need for calibration efforts and the effective use of empirical techniques to improve dynamical model forecasts. Essentially, there is untapped predictability, since we may not currently take into account important interactions among the physical and biological components of the ecosystem. The maximum achievable predictability is unknown, and assessing this limit requires much additional research.
Chaos theory developed in an attempt to demonstrate the limited predictability of atmospheric variations [1]. In the past, the topic of predictability has been somewhat theoretical and idealized, and to some extent not used in the practicalities of prediction. The predictability problem can be formulated, e.g., through a Liouville equation [2]. In practice, however, estimates of predictability are derived from multiple (ensemble) forecasts of comprehensive atmospheric and ocean prediction models [3-6]. The uncertainty, mainly in model error and initialization, is quantified by constructing ensemble members, where the individual members of the ensemble differ by small perturbations [7]. For instance, the predictability of weather is largely determined by uncertainty in a forecast's starting conditions [8,9], whilst the predictability of ecosystem variations is also influenced by uncertainty in the computational representation of the equations that govern the biogeochemical model (for example, to what extent phytoplankton species should be represented, and what the resulting model error will be). Furthermore, chaos theory shows that these kinds of environmental forecasts have to be expressed probabilistically [7]; the laws of physics dictate that accurate long-term weather and ocean forecasts cannot be expected. These probability forecasts quantify uncertainty in weather and ecosystem prediction. The forecaster has to strive to estimate reliable probabilities, not to disseminate predictions with a precision that cannot be justified scientifically. Examples have shown that, in practice, the economic value of a reliable probability forecast (produced from an ensemble prediction system) exceeds the value associated with a single deterministic forecast of uncertain accuracy [10,11].
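The ensemble procedure described above can be sketched with a toy model: perturb the initial condition, integrate every member forward, and report the fraction of members exceeding a threshold as the probability forecast. This is a minimal illustration only, with a chaotic logistic map standing in for a real coupled ecosystem model; all names, parameters, and thresholds are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def step(x, r=3.8):
    """One step of a toy nonlinear (logistic-map) 'model' in the chaotic regime."""
    return r * x * (1.0 - x)

def ensemble_forecast(x0, n_members=500, n_steps=30, init_sigma=1e-3):
    """Integrate an ensemble started from slightly perturbed initial conditions."""
    members = np.clip(x0 + rng.normal(0.0, init_sigma, n_members), 0.0, 1.0)
    for _ in range(n_steps):
        members = step(members)
    return members

final = ensemble_forecast(x0=0.35)
# Probability forecast of an "event": fraction of members above a threshold
p_exceed = float(np.mean(final > 0.7))
print(f"P(state > 0.7 after 30 steps) = {p_exceed:.2f}")
```

Even with perturbations of only 1e-3, the chaotic dynamics spread the members apart, which is exactly why the output must be stated as a probability rather than a single deterministic value.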
Scientific evidence suggests unequivocally that predictions should be provided only as probabilities, using either ensembles with dynamical models or appropriate alternatives together with empirical models. Metrics for probability estimates, and their interpretation, are more complex than deterministic ones. Given the steadily expanding demand for and use of model estimates, the necessity of a framework covering all aspects of seasonal ecosystem forecasting is incontestable. Hind-casts will remain the standard method for increasing sample sizes. However, hind-casts also raise challenging concerns that must be tackled, such as insufficient initialization data, the non-stationary nature of observing systems, and the non-stationary nature of the marine ecosystem.
Producing reliable probability forecasts from ensembles of atmospheric/biophysical model integrations puts enormous demands on computer resources. Computer power is essential to resolve the details of such a system [12]. It has been argued that, as a result of the nonlinear nature of a given system, systematic mistreatment of small-scale phenomena may lead to systematic mistreatment of large-scale phenomena. However, there are also reasons for studying small-scale phenomena in their own right. From an ecosystem prediction perspective, it is essential to be able to simulate such details if models are to forecast significant events like spring blooms or fish stocks. This poses a significant dilemma given current computing resources. To simulate extreme events, models of considerable complexity and resolution are required; on the other hand, to estimate changes reliably in the probability distributions of extreme and relatively rare events, a large number of ensemble members has to be used. One fact is inevitable: the greater the need to provide reliable forecasts of uncertainty in predictions of weather and ecosystems, the more the demand for computer power exceeds availability. Indeed, the call for quantitative predictions of uncertainty is a pertinent consideration in the design of future generations of supercomputers; ensemble prediction is a perfect application for parallel computing.
Model errors, particularly in areas exhibiting strong interannual variability of ocean circulation and vertical mixing, such as during the winter-spring period, continue to hamper seasonal prediction skill. The benefit of reducing model error cannot be overstated. There is a range of strategies for improving models, including better representation of the interactions among the elements of the ecosystem (tuning and customization to reduce model bias), better description of biogeochemical cycles, and substantially increased spatial resolution. All of these strategies ought to be vigorously pursued. Beyond model error, seasonal forecast quality remains hindered by a wide range of factors, including:
I. Scarcity: Unlike daily forecasts, seasonal predictions can be obtained only a few times per year, creating difficulties in accumulating the information required to provide stable estimates of forecast quality (together with the expense of producing the predictions when ensembles are used),
II. Changes in variability: Predictability of ecosystem variables varies between years, across the Mediterranean, Atlantic, Baltic and Black Sea with strong evidence that these variations are also related to atmospheric and hydrodynamic variability,
III. Seasonal quality: Most metrics of seasonal forecast quality are technical in nature and not easily communicated to audiences outside the seasonal forecasting community.
Several broad qualitative and quantitative outcomes have been produced from the experiments, suggesting that seasonal predictions are more skilful in specific regions. Furthermore, climatology, or persisting recent seasonal anomalies, can in many cases provide useful information. The quality of seasonal predictions varies on an inter-annual basis, partly connected to inter-annual variability of the specific marine ecosystem dynamics; average quality also differs between seasons. Among the primary seasonal variables of interest, predictions of hydrodynamic variables (temperature, salinity, etc.) are usually of better quality than those of biogeochemical variables such as chlorophyll, nutrients, and primary production.
Uncertainty and Recommendations - Decision-Making Issues, Societal Benefits
An overall recommendation, with exceptions, is that initial conditions provided by data assimilation and used for the initialization of seasonal forecasts improve the model projection, but information on spatial patterns is lost quite quickly. In some regions and during some seasons seasonal predictions have demonstrable quality, but their translation into useful information for end-users is far from optimal; a concerted effort is essential to engage customers and seek their quantitative definition of value, so that the forecasts can be used in decision-making. Since a direct connection between seasonal forecast quality and value has not been established, appropriate processes need to be engaged to measure value in specific decision-making instances independently of the assessment of quality. It is worth noting that a better understanding of climate variability (e.g., the probabilities of various scenarios, rather than forecasts alone) could aid applications, planning, and management. The application of forecasts requires trust in the overall quality of the forecasts and knowledge of forecast uncertainty.
Forecast initialization is an area that requires active research. Ocean data assimilation has improved forecast quality; however, coupled data assimilation remains a field of active research that needs enhanced support and perhaps international coordination. There is significant evidence that coupled atmosphere-ocean-biochemical data assimilation should improve forecast quality. Multi-model methodologies [8,13,14] are a useful and practical approach for quantifying forecast uncertainty arising from the model formulation. Still, there are open questions associated with the multi-model approach. For example, the approach is ad hoc, meaning that the choice of models is not optimized; nor has the community converged on a best strategy for combining the models. Multi-model calibration activities tend to yield positive results, but considerable work remains and additional research is required. The multi-model approach should not be used to avoid the need to improve models.
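The equal-weight pooling that makes the multi-model approach "ad hoc" can be illustrated in a few lines: each model contributes an event probability from its own ensemble, and the combined forecast is simply their average. The three "models" and all ensemble statistics below are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensembles of a seasonal chlorophyll anomaly from three models
ensembles = {
    "model_a": rng.normal(0.2, 0.5, 200),
    "model_b": rng.normal(0.1, 0.7, 200),
    "model_c": rng.normal(0.3, 0.4, 200),
}

threshold = 0.5  # event of interest: "above-normal bloom"

# Event probability estimated from each model's ensemble
per_model = {name: float(np.mean(ens > threshold)) for name, ens in ensembles.items()}

# Ad hoc multi-model probability: equal-weight average of the member models
multi_model_p = sum(per_model.values()) / len(per_model)
print(per_model, multi_model_p)
```

The open question noted above is precisely that the equal weights are not optimized; a calibrated combination would replace the plain average with weights learned from hind-cast performance.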
Validation has to be undertaken routinely on seasonal dynamical application models. These models ought to be complex enough to capture non-linear interactions, while remaining sufficiently simple to avoid over-tuning through unconstrained parameters. The propagation of forecast quality through application models, such as fish models driven by meteorological and biophysical models, is often highly non-linear. Consequently, quality in the prediction of seasonal chlorophyll might not translate into quality in the prediction of mean fish stock, for example. Such application models, like fish models, should have additional metrics of forecast quality, and these metrics should be suitable for the selected user group.
Many applications require information at local scales (in space and time). Further effort is essential to provide and improve such information, e.g., through statistical and/or dynamical downscaling. Although there are numerous examples of seasonal forecast application (e.g., atmosphere, ocean, water management), there is potential to do considerably more. More progress has to be achieved in bringing seasonal prediction providers and users together. More effort is required in the development, production, and understanding of probabilistic forecasts, as is more understanding of what exactly is predictable and what is not. The ability to predict 'extremes' can also be incredibly valuable.
The societal benefit is not fully realized, partly because there have not been adequate interactions among the physical scientists involved in seasonal prediction research and production, applications scientists, decision-makers, and operational seasonal prediction providers. The issues go beyond merely improving forecast quality and making forecasts readily available. Physical scientists should actively seek out and understand users' requirements in order to provide enhanced climate information, prediction products, and services leading to enhanced applications. Users, in turn, have to maintain an active dialogue with the physical scientists and forecast providers so that their climate information needs are taken into account. Decision-makers should take seasonal forecast information into account; however, we ought to be aware that our products are not the only factor they consider in the decision process. Successfully communicating uncertainty and the limitations of seasonal forecasts is essential to making seasonal forecasts useful. It is challenging for many users to make explicit use of seasonal forecasts, especially when the forecasts do not match their decision-making needs. On several occasions, our best forecast may simply be climatology, which can itself be useful information. Access to hind-cast data is also essential to help users assess model performance and the potential benefit of the forecasts. These data often contain quantitative information that users will try to convert into terms that more closely meet their needs. Similar issues can arise regarding the spatial scale of forecast information, where the resolution of regional ecosystem models is considered too coarse for managers or decision-makers.
Seriously misleading situations can occur when users take information appropriate for the large scale and apply it to local scales without considering the additional uncertainty associated with such action. Application models can be used with seasonal forecasts to provide a metric that combines quality and value; for example, the quality of a seasonal forecast of fish stocks is both a proper measure of skill and a measure of potential value. Seasonal predictability research should be encouraged. Collaborations and interactions with the climate change community need to be supported and have the potential for significant benefit. Indeed, seasonal prediction is expected to be addressed in the context of a changing climate.
For more about Juniper Publishers please click on: https://twitter.com/Juniper_publish
For more about Oceanography & Fisheries please click on: https://juniperpublishers.com/ofoaj/index.php
The Effect of Ramadan Fasting on Oral Anticoagulant: Experience of Tertiary Academic Hospital- Juniper Publishers
Authored by Fakhr AlAyoubi
Abstract
Introduction: Direct oral anticoagulants (DOACs) have been approved for stroke prevention in non-valvular atrial fibrillation (AF). The outcomes achieved in clinical trials depend on the pharmacokinetics and pharmacodynamics of these drugs, which are taken orally in regular doses once or twice daily. During routine practice we face an obligatory choice when prescribing twice-daily NOACs during the month of Ramadan, both in our own practice and for Muslim patients everywhere in the world during the holy month. Objective: To evaluate the effectiveness and safety of NOACs, dosed either once or twice daily, during the month of Ramadan for stroke prevention in non-valvular atrial fibrillation. Methods: An observational study in a tertiary university hospital in Saudi Arabia of 114 patients with non-valvular atrial fibrillation on NOACs during the month of Ramadan 1441 H/2019, with laboratory follow-up before Ramadan and one month after for each patient who completed at least 15 days of fasting during this holy month. Results: Of the 114 patients, 40 were on Rivaroxaban and 74 on Dabigatran. Baseline characteristics were similar, except that history of myocardial infarction (28%) and heart failure (58%) in the Rivaroxaban arm were higher than in the Dabigatran arm (9% and 12%, respectively). Regarding concomitant medications, Aspirin was used by 33% of the Rivaroxaban arm and 11% of the Dabigatran arm, and NSAIDs by 8% of the Dabigatran arm only. There was no significant difference in stroke and bleeding events between the two arms. Conclusion: The data show that direct oral anticoagulants are effective and safe to use during Ramadan for non-valvular atrial fibrillation regardless of dosing frequency; further studies are needed to support these findings in non-valvular atrial fibrillation and other indications.
Keywords: Anticoagulant; Rivaroxaban; Dabigatran; Direct oral anticoagulant (DOAC); Ramadan; Fasting; Bleeding; Efficacy; Stroke prevention; Safety of direct oral anti-coagulant; non-valvular atrial fibrillation
Abbreviations: DOAC: Direct Oral Anticoagulant; AF: Atrial Fibrillation; NOAC: Non-Vitamin K Antagonist Oral Anticoagulant; TIA: Transient Ischemic Attack
Introduction
Around the world, the Muslim population fasts during the ninth month of the Islamic lunar calendar (Ramadan) every year. Fasting during Ramadan means abstaining from food, beverages, smoking, and all oral drugs (with some restrictions on intravenous drugs) from dawn until sunset. Ramadan migrates through the seasons because the Islamic lunar year is shorter than the solar year by 10 to 11 days. Most Muslims change their lifestyle during Ramadan, including dietary and sleep patterns [1,2], leading to changes in the hemodynamic profile of Ramadan fasters. Ramadan fasting produces significant metabolic, hormonal, and inflammatory changes [3]. Dehydration has been confirmed in Ramadan fasters during daylight hours, particularly in summertime [4], although Ramadan fasting has a positive impact on fasting blood glucose and serum lipid profile [5,6]. The effectiveness of oral drugs may change during Ramadan, especially for renally excreted drugs [7-9], as dehydration may decrease renal function, which increases the toxicity of renally excreted drugs.
Methods
We conducted an observational cohort study at King Saud University Medical City (KSUMC), Riyadh, Saudi Arabia. The study included all patients with non-valvular atrial fibrillation on a direct oral anticoagulant during Ramadan 1440 in the Hijri calendar. a) Inclusion criteria: patients with non-valvular atrial fibrillation documented by ECG, a CHA2DS2-VASc score of 1 or more, and treatment with Rivaroxaban or Dabigatran started at or before the 1st of Shaban 1440 H, one month before Ramadan. Each patient included in the study had to fast at least 15 days during the holy month of Ramadan. b) Exclusion criteria: active liver disease; pregnancy; transient ischemic attack (TIA), stroke, or major bleeding within 14 days of recruitment; non-valvular atrial fibrillation treated with warfarin. To study the effectiveness of direct oral anticoagulants in Ramadan fasting patients, we calculated the incidence of the clinical outcomes before the start of fasting and one month after fasting. Data were analyzed using SAS version 9.2 (SAS Institute, Inc, Cary, NC) for Windows®, with a p-value of <0.05 considered statistically significant. Descriptive statistics are reported as means and medians ± standard deviation or as frequencies and percentages, as appropriate. Chi-squared tests were used to determine associations between qualitative variables.
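As an illustration of the chi-squared association test described above (the study itself used SAS 9.2), here is a minimal Python sketch on a hypothetical 2x2 table of bleeding events by treatment arm. The counts are invented for the example and are not the study's data.

```python
# Hypothetical 2x2 contingency table of bleeding events by NOAC arm
# rows: Rivaroxaban, Dabigatran; columns: event, no event
table = [[3, 37], [5, 69]]

def chi2_2x2(t):
    """Pearson chi-squared statistic for a 2x2 table of observed counts."""
    (a, b), (c, d) = t
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, row in enumerate(t):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

stat = chi2_2x2(table)
significant = stat > 3.841  # chi-squared critical value at alpha = 0.05, df = 1
print(f"chi2 = {stat:.3f}, significant = {significant}")
```

For these invented counts the statistic is far below the 0.05 critical value, the same qualitative outcome the study reports for stroke and bleeding events between arms.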
Discussion
During Ramadan fasting through the daylight hours, practicing Muslims are undoubtedly dehydrating at a rate determined by the loss of body water minus the amount of metabolic water produced over this period. Losses in body mass over a long period, and any other changes in body composition, may bias the calculation. Many studies showed that daytime urine osmolality tended to increase progressively throughout the month of intermittent fasting, and it was even higher before breaking the fast, especially when fasting exceeded 12 hours, as is the case in summertime [10]. As recommended by the International Diabetes Federation and the Diabetes and Ramadan International Alliance, patients on chronic dialysis or with chronic kidney disease stages 3-5 are considered very-high-risk and high-risk categories, respectively, and are exempted from fasting [11,12]. Fasting for one month in Ramadan (June-July 2015, Rize, Turkey) for approximately 17.5 hours each day was not associated with worsening of renal function in patients with stage 3-5 CKD. These findings were similar to results from other observational studies, both in patients with CKD and in renal transplantation; however, elderly patients may still be at higher risk [13]. One study showed a significant association between Ramadan fasting and stroke, identifying fasting as an independent risk factor for stroke, and specifically ischemic stroke [14]. Atrial fibrillation is a type of irregular heartbeat; it means the heart may not be pumping as well as it should. As a result, blood clots are more likely to form in the heart, and if a clot blocks one of the arteries leading to the brain, it can cause a stroke or transient ischemic attack (TIA). Direct oral anticoagulants are one way to reduce the risk of stroke by decreasing the risk of blood clots forming [15], and they have been approved for stroke prevention in non-valvular atrial fibrillation [16].
In all patients with AF, the CHA2DS2-VASc score should be systematically assessed to identify the risk of stroke and to decide on the initiation of appropriate direct oral anticoagulation therapy accordingly; technology tools may assist in this undertaking. Oral anticoagulation therapy is recommended in all male patients with a CHA2DS2-VASc score of 2 or more, and in all female patients with a score of 3 or more. Oral anticoagulation therapy should be considered in male patients with a score of 1 and in female patients with a score of 2, considering individual characteristics and patient preferences. When initiating anticoagulation, a non-vitamin K antagonist oral anticoagulant (DOAC), which includes Dabigatran, Rivaroxaban, Apixaban, and Edoxaban, is recommended over warfarin in DOAC-eligible patients with AF (except those with moderate-to-severe mitral stenosis or a mechanical heart valve) [17]. Patients treated with warfarin during Ramadan are at risk of developing a supra-therapeutic INR [17-19]. The outcomes achieved in clinical trials of direct oral anticoagulants depend on the pharmacokinetics and pharmacodynamics of these drugs, which are taken orally in regular doses once or twice daily. Across many diseases, studies have shown that taking medication at specific times of day may make therapy more effective [20-23]. In Riyadh, Ramadan fell in summer at the time of this study. Adult Muslims who join the holy fast refrain from taking any food, beverages, or oral drugs between dawn and sunset for up to 15 to 19 hours a day. In fact, drug doses can be taken only between sunset and dawn, and the time span between them is shorter than outside Ramadan. It is therefore not surprising that potentially invasive studies have been challenging to conduct, leading to a relative paucity of direct evidence regarding the physiological effects of Ramadan fasting.
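The sex-adjusted score thresholds described above amount to a simple decision rule, sketched here for illustration. The function name and return strings are hypothetical and this is not a clinical tool; it only encodes the thresholds stated in the text.

```python
def oac_recommendation(score: int, female: bool) -> str:
    """Map a CHA2DS2-VASc score to an anticoagulation recommendation,
    using the sex-adjusted thresholds described in the text."""
    threshold_recommend = 3 if female else 2  # OAC recommended at/above this score
    threshold_consider = 2 if female else 1   # OAC considered at/above this score
    if score >= threshold_recommend:
        return "OAC recommended (DOAC preferred over warfarin if eligible)"
    if score >= threshold_consider:
        return "OAC should be considered (individual characteristics/preferences)"
    return "No OAC indicated by score alone"

print(oac_recommendation(2, female=False))  # recommended: male, score 2
print(oac_recommendation(2, female=True))   # considered: female, score 2
```

The sex adjustment reflects that female sex contributes one point to the score itself, so the action thresholds shift up by one for female patients.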
However, over the past few years some studies have examined glucose metabolism, lipid profiles, circadian rhythms, sleep, and aspects of hormone physiology during Ramadan, as well as hypertension, along with a small number of studies on stroke in fasting patients.
The objective of this study is to evaluate the effectiveness and safety of direct oral anticoagulants used by Ramadan-fasting Muslim patients for stroke prevention in non-valvular atrial fibrillation at King Saud University Medical City (KSUMC), Riyadh, Saudi Arabia. A retrospective, observational study, supplemented with physician and patient questionnaires and with data captured before, during, and after Ramadan, revealed that 64% of diabetic patients reported fasting every day of Ramadan and 94.2% fasted for at least 15 days [24]. Dehydration may be a significant risk factor for the physically and economically costly outcome of ischemic stroke, reinforcing the need to better understand how dehydration increases the risk of ischemic stroke, especially in patients at higher risk of stroke due to comorbid conditions such as atrial fibrillation [25]. A study conducted in Turkey concluded that although Ramadan fasting had an adverse effect on diabetic patients with ischemic stroke, there was no negative effect on stroke frequency, and the hypertensive hemorrhagic stroke ratio was lower [26]. A small study conducted in the State of Qatar in the Arabian Gulf demonstrated no significant difference in the number of hospitalizations for stroke while fasting during the month of Ramadan compared with the non-fasting months [27].
Results
Out of two hundred thirty-four (234) patients visiting KSUMC anticoagulant clinics and receiving Dabigatran or Rivaroxaban during the three months before Ramadan 1441 H/2019, only 114 patients were eligible for our study; 120 patients were excluded due to various reasons (Figure 1). There were no clinically significant differences in laboratory values before and after fasting in patients who received Dabigatran or Rivaroxaban (Tables 1-4).
*Significant at level ≤ 0.05.
Conclusion
The data show no clinically significant differences between patients who received DOACs before and after Ramadan (fasting), meaning that direct oral anticoagulants (DOACs) are effective and safe to use during Ramadan for non-valvular atrial fibrillation regardless of the dose frequency of the anticoagulant; further studies are needed to support these findings in non-valvular atrial fibrillation and other indications [28,29].
To know more about Juniper Publishers please click on: https://juniperpublishers.com/aboutus.php
For more articles in Open Access Journal of Reviews & Research please click on: https://juniperpublishers.com/arr/index.php
To know more about Open Access Journals please click on: https://juniperpublishers.com/journals.php
PPARα as Potential Therapeutic Target for Neurodegenerative Diseases-Juniper Publishers
JUNIPER PUBLISHERS-OPEN ACCESS JOURNAL OF DRUG DESIGNING & DEVELOPMENT
Abstract
Peroxisome proliferator-activated receptor α (PPARα) is a ligand-activated transcription factor belonging to the nuclear receptor family. It plays a key role in lipid metabolism and glucose homeostasis and is important in the prevention and treatment of metabolic diseases. PPARα also has protective effects against brain cell death, attributed to its anti-inflammatory and antioxidant properties. In the present work, we discuss PPARα involvement in neurodegenerative pathologies and its potential as a therapeutic target for these diseases.
Keywords: Alzheimer's disease; Parkinson's disease; Neuroprotection
Abbreviations: PPARs: Peroxisome Proliferator-Activated Receptors; RXR: Retinoid X-Receptor; SARs: Structure-Activity Relationships; PEA: Palmitoylethanolamide; LPS: Lipopolysaccharide
Mini Review
Peroxisome proliferator-activated receptors (PPARs) are members of the nuclear receptor superfamily and are ligand-activated transcription factors. They are involved in the regulation of metabolic pathologies such as cardiovascular disease, obesity, lipid disorders, hypertension, and diabetes [1]. PPARs exist as three subtypes, commonly designated PPARα, PPARγ, and PPARβ/δ. All PPAR isoforms, once within the nucleus, heterodimerize with the retinoid X-receptor (RXR) and bind to specific DNA-response elements in the promoters of target genes. When a ligand binds to a PPAR, a conformational change in the receptor causes the removal of co-repressors and the recruitment of co-activators; this leads to chromatin remodeling, which allows the initiation of DNA transcription [2].
PPARα, PPARγ, and PPARβ/δ are expressed in different tissues and have distinct binding ligands, co-activators, and co-repressors. PPARα, mainly expressed in tissues involved in lipid oxidation such as kidney, liver, and skeletal and cardiac muscle, plays an important role in fatty acid oxidation and lipoprotein metabolism; PPARγ is expressed predominantly in adipose tissue and vascular smooth muscle; PPARβ/δ is expressed broadly, particularly in tissues associated with fatty acid metabolism, but also in the small intestine, liver, colon, and keratinocytes [3]. Many studies have shown that PPARs are also expressed in the brain, in particular in neurons and glia [4]; for this reason, the potential use of PPAR agonists as neuroprotective agents in neurodegenerative disorders has been suggested. Neurodegenerative diseases are incurable pathologies with a progressive degeneration of neurons associated with motor and cognitive damage. These conditions are characterized by oxidative stress, mitochondrial and transcriptional dysregulation, and apoptosis [5]. Because oxidative stress and neuroinflammation are involved in cell death, these dysfunctions are key factors in the development of the most common neurodegenerative disorders, such as Parkinson's disease, Alzheimer's disease, and amyotrophic lateral sclerosis.
Therefore, novel therapeutic approaches are needed to obtain a reduction of symptoms and to slow progression of the pathology. In this context, PPARα is emerging as a promising pharmacological target for the treatment of neurodegenerative diseases. Fibrates are PPARα agonists widely studied especially for the treatment of hyperlipidemias (Figure 1). Some of these, such as gemfibrozil, ciprofibrate, WY-14643, or fenofibrate, selectively activate only PPARα; others are not isoform-selective. For example, GFT505 is a dual PPARα/δ agonist, and bezafibrate, which activates all three isoforms, is a pan-agonist [6].
In recent years, the development of new fibrates that activate PPARs has been an important objective, both to better understand structure-activity relationships (SARs) and to obtain new drugs with a better pharmacological profile [7,8]. In this context, the neuroprotective effects of PPARα agonists have been studied; some researchers attribute this effect largely to the antioxidant and anti-inflammatory properties of PPARα, but also to its positive effects on lipid metabolism and glucose homeostasis [9].
The anti-inflammatory properties of PPARα have been shown especially in astrocytes and microglia [10,11]. Several authors demonstrated that PPARα agonists such as ciprofibrate, fenofibrate, gemfibrozil and WY-14643 reduce NO production, especially in mouse microglia stimulated by lipopolysaccharide (LPS). Furthermore, it has been demonstrated that treatment with palmitoylethanolamide (PEA) reduces oxidative stress in astrocytes in a PPARα-mediated manner [12]. The anti-inflammatory role of PPARα in the brain is also evidenced by the reduction of LPS-induced TNFα, IL-1β, IL-6 and COX-2 [13].
The anti-inflammatory effect mediated by PPARα has also been identified in reactive astrocytes, where PPARα attenuates inflammation by decreasing NO and pro-inflammatory cytokines. In addition, PPARα has an important role in other glial cells, such as microglia and ependymal cells, in the response to injury [14]. PPARα also has an antioxidant effect associated with a reduction of cerebral oxidative stress, which depends on increased activity of antioxidant enzymes such as Cu/Zn superoxide dismutase and glutathione peroxidase. This activity causes a decrease in lipid peroxidation and in ischemia-induced reactive oxygen species production [9].
The anti-inflammatory and antioxidant properties of PPARα explain its neuroprotective effects, especially in Parkinson's disease and Alzheimer's disease [15]. For these reasons, PPARα could be a therapeutic target for Parkinson's disease; indeed, a neuroprotective, inflammation-reducing effect has been established in the brains of animals treated with fenofibrate. Uppalapati et al. showed that fenofibric acid, the active metabolite of fenofibrate (Figure 2), was present in the brains of animals treated with fenofibrate, suggesting that the compound was metabolized and crossed the blood-brain barrier in vivo [16].
It was discovered that fenofibrate prevents the loss of dopaminergic neurons in the substantia nigra and attenuates the loss of tyrosine hydroxylase immunoreactivity in the striatum [17]. Many studies have suggested that PPARα could also have a therapeutic effect in Alzheimer's disease, even if this conclusion remains controversial. Some researchers demonstrated that PPARα has a protective effect against beta-amyloid-induced neurodegeneration [18], but others found that fenofibrate increases beta-amyloid production in vitro; this effect of fenofibrate may not be connected with PPARα activation [19]. Further, PPARα activation induces vascular protection through an improvement of cerebral artery sensitivity [20].
To conclude, neurodegenerative diseases induce progressive loss of cognitive functions, and current drugs only furnish temporary symptomatic relief without blocking disease progression. In recent years, growing interest has been directed towards PPARs, which can positively regulate gene expression and thereby modulate several molecular pathways responsible for neurodegenerative diseases. In particular, different researchers have shown the beneficial involvement of PPARα in neurodegenerative disease.
The useful effects are principally due to the anti-inflammatory and antioxidant properties of PPARα, but also to its capacity to restore vascular and endothelial integrity. The potential use of PPARα agonists as neuroprotective agents against neurodegenerative disorders is an important starting point for finding new drugs that could definitively cure these pathologies. Though more laboratory and clinical studies are needed to understand all the mechanisms involved in the neuroprotective actions of PPARα agonists, this receptor is a promising target for neurodegenerative disorders.
For more Open Access Journals in Juniper Publishers please click on: https://www.crunchbase.com/organization/juniper-publishers
For more articles in Open Access Novel Approaches in Drug Designing & Development please click on: https://juniperpublishers.com/napdd/index.php
For more Open Access Journals please click on: https://juniperpublishers.com
To know more about Juniper Publishers please click on:
https://juniperpublishers.business.site/
Quality by Design in Enzyme-Catalyzed Reactions - Juniper Publishers
Journal of Chemistry - Juniper Publishers
Abstract
Quality by Design is the new-age path chosen towards achieving the demanding quality standards of the pharmaceutical industry. The present paper aims to throw light on pharmaceutical Quality by Design (QbD) and how its implementation will help manufacture better-quality pharmaceuticals. Quality by Design is introduced along with its key elements to make the understanding process easier. To attain built-in quality is the primary objective of Quality by Design. Finally, it can be said that quality achieved by end-product testing cannot be guaranteed, unlike the quality assurance provided by Quality by Design.
Keywords: Quality by Design (QbD); Quality Target Product Profile; Design Space; Critical Quality Attributes
Introduction
“Quality Can Be Planned.”-Joseph Juran
The above quote is self-explanatory when it comes to product quality in the pharmaceutical manufacturing industry. Quality by Design (QbD) is not very old but a recent inclusion in the pharmaceutical industry. Its sole objective is to achieve better quality standards, which is especially important in the pharmaceutical industry. The QbD approach consists of various components, important ones being assessment and management of the identified risks, design of experiments (DoE), the quality target product profile (QTPP), and establishing a control strategy to keep the product within the design space created in the QbD study [1]. Of all the components, many pharmaceutical development studies have incorporated DoE for a more rational approach [2].
The target of the analytical QbD approach is to establish a design space (DS) of critical process parameters (CPPs) in which the critical quality attributes (CQAs) of the method are assured to fulfil the desired requirements with a selected probability [3,4].
The principles involved in pharmaceutical development that are relevant to QbD are all described in the ICH guidelines (ICH Q8-Q11) [5].
Any Pharmaceutical Development Process Typically Covers the Following Sections:
a) Complete portfolio including all the details as well as analysis of the Reference Listed Drug Product
b) Quality Target Product Profile (QTPP) compilation.
c) Figuring out the Critical Quality Attributes (CQA)
d) Complete characterization of the API and identification of its CMAs (components of the drug product)
e) Excipient selection and excipient CMA identification
f) Formulation Development
g) Manufacturing Process Development [6]
Quality Target Product Profile (QTPP) describes the design criteria for the product, and should therefore form the basis for development of the CQAs, CPPs, and control strategy.
Critical Quality Attribute (CQA) – a physical, chemical, biological, or microbiological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality (ICH Q8).

Critical Process Parameter (CPP) – a process parameter whose variability has an impact on a CQA and therefore should be monitored or controlled to ensure the process produces the desired quality (ICH Q8). CPPs are identified and their impact analyzed by conducting a preliminary risk analysis for every process parameter (PP) involved in the individual unit operations.
Need for QbD in Pharmaceutical Industry [7,8]:
a) To integrate patient needs, quality requirements and scientific knowledge all in one design while the pharmaceutical product is still under development, extending further to the manufacturing process.
b) To have a better understanding of the impact of raw materials and process parameters on the quality of the final product. This is especially important for biopharmaceutical products, since raw materials like cell culture media can be a source of variability, affecting important factors like cellular viability, cell growth and specific productivity.
c) To collaborate closely with the rest of the industry and the regulators and successfully keep up with regulatory reviews.
d) To maintain harmonization in all the regions so that a single CMC submission worldwide is all that is needed.
e) To encourage continuous quality improvement for the benefit of patients.
f) To enable better product design with fewer problems during manufacturing, thus facilitating more efficiency in the manufacturing process.
g) To make post-approval changes easier since it will be contained within a pre-defined design space, thus resulting in regulatory flexibility.
Every production process in the pharmaceutical industry has to implement certain control strategies with the ultimate goal of a robust process. A robust process is the gateway to high product quality at the end of the day [9]. Process variability stands as a hurdle to process robustness, and it originates from lack of control over the process parameters. Thus, QbD steps in to avoid batch-to-batch variability in pharmaceutical products [10].
The net outcome of a detailed QbD study (applied to any product) is the segregation of process parameters with respect to their criticality and the finalization of a proven acceptable range (PAR) for every operation. The knowledge gained from the QbD evaluation encompasses every minute detail of the operational process as well as the product in general, and leads to the definition of a Design Space. This way, the impact that the manufacturing process might have on the variability of the CQAs becomes apparent, which helps in strategizing the testing, quality and monitoring of batches [11].
Process Evaluation: Linking Process Parameters to Quality Attributes
It is important to evaluate the process completely before applying QbD to it. The better knowledge you have of the process, the more effective your QbD will be. Moreover, process characterization is required to specify the proven acceptable ranges (PAR) for critical process parameters (CPPs). In the traditional approach implemented in biopharmaceutical production, existing empirical process knowledge is used on a daily basis. However, this approach leads to laborious and time-consuming post-approval changes during process adaptation and any new technology implementation that may have become necessary for raising the efficiency of the process. Also, the effects a process scale-up can have on the quality of the final product cannot be predicted when using empirical process development.
This can increase costs and also cause difficulty in implementing any changes in the established manufacturing process. Thus was born a way to achieve deeper understanding of processes, which would lead to greater flexibility and freedom to effect changes. The concept of operating under a pre-defined design space gives this flexibility. Design space is a concept that is part of the "Quality by Design" (QbD) paradigm. Now, manufacturers are to follow science-based process development rather than its empirical counterpart.
The QbD Concept is Best Explained in this Flowchart Below
a) Define a Quality Target Product Profile (QTPP) for product performance.
b) Identify its Critical Quality Attributes (CQAs).
c) Create an experimental design (DoE).
d) Analyze the impact of Critical Process Parameters (CPPs) on the CQAs.
e) Identify and control the sources of variability.
Process characterization sets the ball rolling in any process development, which employs a sound risk assessment rating the various critical process parameters according to their importance [12-14].
Downstream Processing in Biotransformation
Downstream processes of biopharmaceutical industry essentially include the following steps:
a) Harvesting
b) Isolation
c) Purification
Various unit operations that constitute any biopharmaceutical process follow a designed sequence to form an integrated process [15]. Thus, a change in any one unit operation can affect the functioning of the subsequent unit operations. This is the reason why interaction effects between participating parameters across unit operations should also be taken into account during process development. Interactions are said to happen when the setting of one parameter affects the response to another parameter. Due to this dependence between the parameters, the combined effects of any two parameters hailing from different unit operations cannot be predicted from their individual effects. Regulatory authorities demand inclusion of parameter interactions within the QbD approach during any process optimization [16].
Example: Downstream processing of 1, 3-propanediol
Process: Fermentation
A fermentation broth processed by flocculation, reactive extraction, and distillation was studied. Flocculation of the soluble protein and cellular debris present in the broth was carried out using optimal concentrations of chitosan (150 ppm) and polyacrylamide (70 ppm). The soluble protein present in the broth decreased to 0.06 g L-1, and the recovery ratio (supernatant liquor : broth) was found to be greater than 99% (Figure 1) [17,18].
The above flowchart shows a typical fermentation process broken down in steps. Glycerol fermentation process is taken as example for the illustration [19].
Case Study for API
API product development from the very nascent stages requires a lot of planning when implementing QbD at every stage. Whether it is a two-step process or a multi-step process, each and every operation and parameter needs to be scrutinized before creating a relevant design space. Brainstorming every possible roadblock that might threaten the quality of the final pharma product is what will help design a top-quality process. A futuristic vision is important in the initial steps of QbD planning. The most important part is to pay sufficient attention to detail, lest critical aspects be missed. This is best done by sitting with the entire development team and taking every minor detail into account. Given below is a case study for an API intermediate development process carried out with the help of QbD, highlighting the important steps and how to go about implementing it from the very beginning of your research. QbD works best when it is implemented from the very nascent stage of product development.
Quality Target Product Profile
When making your QTPP, make sure you list down everything from your vendor details to target costing. This step basically asks you to think of every aspect of your product and make a comprehensive profile of it. The specification of quality must be highlighted here with all the challenging impurities that might threaten your quality. Everything from stability testing requirements to raw material quality [20] is encompassed in this stage of QbD.
CQA Determination
Given below are some typical CQA parameters that are considered in most of the enzymatic methods of API intermediate preparations.
a) Purity
b) Chiral purity
c) Enzyme residue
d) Assay
e) Appearance
f) Residual Solvent
g) Yield
h) Polymorphic forms
i) Moisture content
j) Melting point (Table 1)
Initial Risk Assessment
The risk assessment can be done in various ways and is the customizable step in QbD. This part calls for a group discussion or a team meeting where everyone can list down all possible risks related to the project in question and grade each one in the list with the amount of risk that it poses. The simplest module suggests you number them 1, 2, 3 in increasing or decreasing order of risk. A more detailed risk assessment requires linking of CQAs and CPPs to highlight the risk of their interdependence (Figure 2) [21].
After the risk assessment come the control strategies to be followed to tackle the probable risks. The control strategies are for you to think through and execute to achieve your target quality specifications.
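The graded-list idea above can be formalized; one common sketch is an FMEA-style risk priority number (severity, occurrence, and detectability scored on the simple 1-3 scales and multiplied). The parameters and scores below are illustrative assumptions, not values from the study:

```python
# FMEA-style risk grading for process parameters. The 1-3 scales mirror
# the simple grading module described above; the parameter list and
# scores are illustrative, not from the actual risk assessment.
risks = [
    # (process parameter, severity, occurrence, detectability)
    ("reaction pH",     3, 2, 1),
    ("temperature",     3, 1, 1),
    ("agitation (RPM)", 2, 2, 2),
    ("enzyme load",     1, 1, 2),
]

def rpn(severity, occurrence, detectability):
    """Risk priority number: product of the three 1-3 scores."""
    return severity * occurrence * detectability

# Rank parameters so the riskiest come first.
ranked = sorted(risks, key=lambda r: rpn(*r[1:]), reverse=True)
for name, s, o, d in ranked:
    print(name, rpn(s, o, d))
```

Parameters at the top of the ranking would then receive the tightest control strategies.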
Design of Experiment
This is a valuable tool for channeling your experimental work and moving ahead in a systematic manner. Designs of experiments can be of several types: comparative, screening, response surface modeling, and regression modeling [1].
Comparative Experiments: The aim of this study is simple, i.e., picking the better of two options. The selection can be made from the comparison data generated, which is the average of the sample of data.
Screening Experiments: If you want to zero in on the key factors affecting a response, screening experiments are the best bet. For this, draw up a concise list of factors that might have critical effects on the response you desire. This model serves as a preliminary analysis during development studies.
Response Surface Modeling: Once you have identified the critical factors that affect your desired response, response surface modeling comes in handy to identify a target and/or minimize or maximize a response.
Regression Modeling: This is used to estimate the dependence of a response variable on the process inputs.
A step by step guide is given for the DoE step of the QbD process (Figure 3).
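As a rough illustration of the design-creation step, the sketch below builds a two-level full factorial design in plain Python. The factors (pH and temperature) and their levels are assumptions for illustration, not the actual settings used in the study:

```python
from itertools import product

# Two-level full factorial design for two assumed factors.
# Levels are illustrative, not taken from the case study.
factors = {
    "pH": [6.0, 8.0],
    "temp_C": [30.0, 37.0],
}

def full_factorial(factors):
    """Return one run (a dict of factor settings) per level combination."""
    names = list(factors)
    runs = []
    for combo in product(*(factors[n] for n in names)):
        runs.append(dict(zip(names, combo)))
    return runs

design = full_factorial(factors)
for i, run in enumerate(design, start=1):
    # Response columns are filled in after the experiments are run.
    print(i, run)
```

A 2-factor, 2-level design gives 2^2 = 4 runs; adding replicates or center points would extend the same table.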
Response columns were filled post experimentation as per the design creation (Figure 4).
Factorial Design Analysis Done as Given Below
Analysis Done First for One of the Responses, “Yield”: (Table 2)
P-Values Were Checked for Significance and Higher P-Value Term Eliminated First to Create a Reduced Model:
(Table 3) (Figures 5 & 6)
Observation
From the above graph, significant interaction between the two terms can be inferred.
Analysis Done for the Response “Diacid”: (Table 4)
P-values Checked for Significance and Higher P-Value Term Eliminated First to Create a Reduced Model: (Table 5) (Figure 7)
Observation
From the above graph, significant interaction between the two terms can be inferred.
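The interactions inferred from the graphs can be quantified directly from a two-level factorial table: a main effect is the average response at the high level minus that at the low level, and the interaction effect compares the runs where the coded factors agree with those where they disagree. The coded runs and yield values below are illustrative, not the study's data:

```python
# Coded 2^2 factorial analysis: estimate main effects and the
# interaction effect from run responses. Yields are illustrative.
runs = [
    # (pH_code, temp_code, yield_pct)
    (-1, -1, 62.0),
    (+1, -1, 74.0),
    (-1, +1, 70.0),
    (+1, +1, 93.0),
]

def effect(runs, column):
    """Main effect: mean response at the high level minus at the low level."""
    hi = [y for *x, y in runs if x[column] == +1]
    lo = [y for *x, y in runs if x[column] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

def interaction(runs):
    """AB interaction: mean response where codes agree minus where they differ."""
    same = [y for a, b, y in runs if a * b == +1]
    diff = [y for a, b, y in runs if a * b == -1]
    return sum(same) / len(same) - sum(diff) / len(diff)

print("pH effect:", effect(runs, 0))              # 17.5
print("temp effect:", effect(runs, 1))            # 13.5
print("pH*temp interaction:", interaction(runs))  # 5.5
```

A term whose effect is small relative to the noise (equivalently, whose p-value is high in Minitab's output) would be the first candidate for elimination when building the reduced model.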
Response Optimizer Was Used to Optimize Both Terms with Respect to the Given Responses - Yield and Diacid: (Table 6) (Figure 8)
The optimized parameters predicted for maximum yield and minimum impurity (diacid) were found to be pH 8 and 37 °C.
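A response optimizer of this kind essentially searches the fitted response surfaces for settings that jointly maximize yield and minimize the impurity. The quadratic models and the composite score below are assumptions chosen so that the optimum lands at the reported pH 8 and 37 °C; they are not the fitted Minitab models:

```python
# Grid-search "response optimizer" over two assumed response surfaces.
# Coefficients are illustrative, constructed so the joint optimum sits
# at pH 8 and 37 degrees C as reported above.

def yield_model(ph, t):
    """Assumed fitted yield surface (%), peaking at pH 8, 37 C."""
    return 90 - 2 * (ph - 8) ** 2 - 0.1 * (t - 37) ** 2

def diacid_model(ph, t):
    """Assumed fitted diacid impurity surface (%), lowest at pH 8, 37 C."""
    return 1.5 + 0.3 * (ph - 8) ** 2 + 0.02 * (t - 37) ** 2

best = None
for ph10 in range(60, 91):        # pH 6.0 ... 9.0 in steps of 0.1
    for t in range(30, 43):       # 30 ... 42 C in steps of 1
        ph = ph10 / 10
        # Composite desirability: reward yield, penalize impurity.
        score = yield_model(ph, t) - 10 * diacid_model(ph, t)
        if best is None or score > best[0]:
            best = (score, ph, t)

_, ph_opt, t_opt = best
print(f"optimum: pH {ph_opt}, {t_opt} C")
```

In practice the weight on the impurity term encodes how much yield one is willing to trade for purity; Minitab's optimizer does the analogous search with desirability functions.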
Case Study 2
As mentioned before, regression analysis is another important tool that can be used to study existing data. This means that if you have done some experiments (without designing them beforehand), you can quickly run a regression analysis of the collected data to derive a relationship between CPPs and the reaction results.
A lot of times, when one follows the one-factor-at-a-time optimization process, by the time any CPP is optimized, a lot of data stands generated. Instead of just tabulating the data and wasting time manually making sense out of them, regression analysis can come to your rescue. As always, graphical data representation seems much easier to understand and also saves your valuable time.
The effect of pH was studied [22] separately in the preparation of deoxynojirimycin base (stage III). The reaction involved N-formyl amino sorbitol, water, oxygen and whole cells of Gluconobacter oxydans DSM2003. Subsequent treatment with sodium hydroxide and sodium borohydride gave rise to deoxynojirimycin, and further work-up followed by 2-methoxyethanol-facilitated crystallization yielded the deoxynojirimycin base. In this experiment, the pH of the reaction was varied to find out its role during the reaction, and a regression analysis was run using Minitab to study this effect.
Observations recorded showed that the reaction did not occur at pH 2, and at pH 8 the reaction did not reach completion. The optimum pH range, between 4 and 6, showed a certain effect on yield and purity. The significance of pH variation during the reaction was thus established as described below (Graphs 1-3):
When the null-hypothesis p-test was carried out, no significant effect of pH was found on product purity, impurity 1 or impurity 2, but its significant influence was seen in minimizing impurity 3.
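The p-test on a single factor reduces to a t-test on the slope of a simple linear regression. The sketch below does this in plain Python; the pH/impurity-3 data points are made up to mimic the reported trend (higher pH lowering impurity 3) and are not the batch data:

```python
import math

# Simple linear regression of impurity-3 level on reaction pH, with a
# t-test on the slope. Data points are illustrative only.
ph   = [4.0, 4.5, 5.0, 5.5, 6.0]
imp3 = [9.6, 8.9, 8.1, 7.2, 6.4]

n = len(ph)
mx, my = sum(ph) / n, sum(imp3) / n
sxx = sum((x - mx) ** 2 for x in ph)
sxy = sum((x - mx) * (y - my) for x, y in zip(ph, imp3))
slope = sxy / sxx
intercept = my - slope * mx

# Residual-based standard error of the slope, then t for H0: slope = 0.
resid = [y - (intercept + slope * x) for x, y in zip(ph, imp3)]
se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
t_stat = slope / se
# |t| far beyond the two-sided critical value for 3 df (about 3.18)
# means pH has a significant effect on this response.
print(f"slope = {slope:.3f}, t = {t_stat:.1f}")
```

Minitab reports the equivalent p-value; a p below 0.05 corresponds to |t| exceeding the critical value for the residual degrees of freedom.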
Furthermore, the large-scale batches conducted were statistically analyzed as well, to achieve a better understanding of the influence of the listed parameters on the output obtained. The following parameters were studied during the stage III reaction described above:
a) pH, RPM and Oxygen cylinders consumed during the course of the reaction.
Their effect on the output and reaction completion time was studied. It was seen that only RPM showed a statistically significant effect on the reaction completion time; the rest of the factors did not contribute any significant effect on the output or reaction completion time.
During the biotransformation process, i.e. during oxidation of N-formyl amino sorbitol using Gluconobacter oxydans DSM2003 whole cells, three main unknown impurity peaks were observed in the HPLC chromatogram during reaction monitoring. The process is capable of removing these three impurities during downstream processing, work-up and isolation to the levels mentioned below:
a) Impurity 1 (has defined RRT on HPLC chromatogram) not more than 3%
b) Impurity 2 (any other unknown impurity) not more than 1%
c) Impurity 3 (has defined RRT on HPLC chromatogram) not more than 10 %
Since higher levels of impurities affect the yield of the process, efforts were made to study the factors which can reduce the formation of process impurities.
Conclusion
The concept of Quality by Design (QbD) is highly reliable when it comes to achieving foolproof quality of your product. It is a modern tool that is gaining wide adoption in the pharmaceutical industry, especially because this industry demands high quality standards and tolerates no compromise when it comes to quality. Breaking down QbD, it essentially comes down to identifying the critical parameters of the process and assigning a particular design space to every critical attribute. Thus, QbD can be considered an intelligent approach to quality that yields robust processes. QbD also ensures that there is continuous improvement in the process during the entire lifecycle of a pharmaceutical product [23].
Acknowledgement
Our group would like to thank the Department of Scientific and Industrial Research India, Dr. Hari Babu (COO Mylan), Sanjeev Sethi (Chief Scientific Officer, Mylan Inc); Dr. Abhijit Deshmukh (Head of Global OSD Scientific Affairs); Dr. Yasir Rawjee (Head - Global API); Dr. Sureshbabu Jayachandra (Head of Chemical Research); Dr. Suryanarayana Mulukutla (Head, Analytical Dept, MLL API R&D) as well as the analytical development team of Mylan Laboratories Limited for their encouragement and support. We would also like to thank Dr. Narahari Ambati (AGC - India IP) and his intellectual property team for their support.
Effect of Addition of Silica Fume and Oil Palm Fiber on the Engineering Properties of Compressed Earth Block- Juniper Publishers
Abstract
In recent years, the use of fired bricks and concrete blocks has increased, raising their demand and price. People are therefore turning back to earth construction, using new techniques such as mixing the earth block with silica fume and oil palm fiber to improve the compressed earth block and make it usable, especially for people with low income. The compressed earth block has weak resistance to the compressive loads applied on it and a high percentage of water absorption due to the voids in the block. This study aimed to improve the durability and compressive strength of compressed earth block by using additives such as silica fume and oil palm fiber, and to investigate which of the additives is more appropriate for obtaining the highest compressive strength while reducing the percentage of voids. The purpose of this study is to mix the compressed earth block with cement and additives such as silica fume and oil palm fiber to achieve the objectives mentioned previously, and to identify the highest compressive strength of compressed stabilized earth block. The experimental program comprised the significant tests: moisture content, Atterberg limits, compaction, compressive strength and water absorption. A total of 48 compressed earth block cubes were prepared for the experiments. The results showed that compressed earth block mixed with additives such as silica fume and oil palm fiber has 62% higher compressive strength than compressed earth block without additives. In addition, the water absorption of the compressed earth block mixed with silica fume and oil palm fiber was 0.3% lower than that of the mix without additives. The silica fume and oil palm fiber in the compressed earth block reduce the spaces among the particles more than in compressed earth block without additives.
Therefore, the results show that earth block with additives is better than earth block without additives; moreover, silica fume is more appropriate than oil palm fiber for use in compressed earth block.
Keywords: Compressed earth block; Compressive strength; Water absorption; Silica fume; Oil palm Fiber
Introduction
Earth is an ancient source of material that has produced many different materials used in several ways around the globe. The suitable scientific usage of these materials and resources could make a very significant difference to the efforts and solutions being offered today to cover the housing shortage among low-income populations around the world, or in countries suffering from high prices of construction materials. Historically, soil was the most popular material used to construct buildings, and it plays a significant role in creating several construction materials such as the compressed earth block. The compressed earth block (CEB) is the modern descendant of the shaped earth block, also called the adobe block. CEB was used for the first time at the beginning of the 19th century in Europe by the architect Francois Cointreau. CEB offers an alternative to conventional building practices that is relatively cheaper and uses local resources, as has recently been found in some countries. It was used worldwide, and in the 70's and 80's a new generation of manual presses appeared, prompting the emergence today of a real market for producing the compressed earth block. Meanwhile, the rising price of, and demand for, concrete blocks drives people to improve the compressed earth block with additives in order to obtain compressive strength equal to that of concrete blocks. Additives such as silica fume and oil palm fiber are in popular use: silica fume has a good surface area and a good percentage of silicon dioxide, and its average particle size is 100 to 150 times smaller than that of Portland cement. Oil palm fiber, on the other hand, is easy to obtain because oil palm is the highest-yielding oil crop in the world; it is cultivated in 42 countries on 11 million ha worldwide, including Malaysia.
Taallah, Guettala & Kriker [1] studied the properties of compressed earth block with added date palm fiber and found that the best results for compressive strength were achieved with 0.05% date palm fiber content, 8% cement and a compaction pressure of 10 MPa. They also stated that water absorption generally decreased as cement content increased and date palm fiber content decreased.
Izemmouren, Guettala & Guettala [2] aimed to test the mechanical properties of steam-cured stabilized compressed earth block. Their objectives were to analyze the impact of the curing conditions and curing period on the mechanical durability of compressed earth block stabilized with lime, with particular attention given to the effects of the combined treatment on the mechanical qualities, strength and resistance of stabilized compressed earth block subjected to steam curing. The materials used in their tests were soil, crushed sand, water, lime and stabilizers, with lime contents of 6%, 8% and 10%. The results show that with 10% lime and 24 hours of curing the block was stronger than with 6% or 8%.
Cid-Falceto, Mazarrón & Cañas [3] studied the rain resistance of industrially produced compressed earth blocks made in Spain, with the purpose of analyzing the utility of universal tests. They used three types of compressed earth blocks, all solid: the first with no additives, the second with 6% cement added, and the third with 8% cement and quicklime. The result for the first block was negative because it did not meet the evaluation standard, while the results for the second and third blocks were positive because they met the standards.
Al-Sakkaf [4] reported the relationships between soil properties, the several additive proportions, and the compressive strength of the soil. The tests were carried out over 1, 7 and 28 days for stabilized compressed earth block, and the materials used were soil 78.25%, cement 10%, bitumen 6%, lime 5% and calcium silicate 0.75%. The purpose of the laboratory tests was to compare the properties of the normal earth block with those containing additives. The results showed the best compressive strength at the assumed percentages of additives; in addition, the highest compressive strength was obtained with the lime-and-cement earth block mixture.
Quagliarini & Lenci [5] used straw fiber to investigate its influence on the mechanical properties of Roman bricks. The outcomes were in agreement with previous research in that the straw fiber did not have any positive influence on compressive strength. On the other hand, they mentioned that the straw fiber controlled the plastic behavior and caused the Roman brick specimens to break more easily.
Bouhicha, Aouissi & Kenai [6] studied four different types of soil mixed with barley straw fiber in various percentages. The results indicated that a 1.5% increase in reinforcement led to an improvement of the compressive strength by 10-20%, depending on the type of soil. Furthermore, they found that the compressive strength was reduced by 45% at 3.5% barley straw fiber. The results showed that decreasing fiber size weakened the compressive strength and raised the capability of the blocks to deform at the failure phase.
Qu et al. [7] reported seismic behavior tests of stabilized compressed earth block walls controlled by bending. Four wall specimens with a height of 1.8 m were constructed and tested using the standard unit, a 100 x 150 x 300 mm (4" x 6" x 12") block known as the "Rhino Block", with three grout channels and two locking bolts. The materials were mixed to provide the best resistance, the required constructability, and the most stable pressure, using sand 10.0%, soil 74.3% and cement 6.2%. The results showed that the second and fourth walls went according to plan, but when testing the first and third walls mechanical failures occurred, causing damage including cracks and permanent distortion; the third wall was damaged at the top of the sample as a result. Ghavami, Toledo Filho & Barbosa [8] used coconut and sisal fiber as reinforcement to examine the behavior of the soil. The fiber proportions used were 4% sisal fiber and 4% coconut fiber for two different specimens. The results of this research showed that the use of Taperoa soil with a 28% W/S ratio and 4% of sisal or coconut fiber slightly increases the compressive strength.
Morel, Pkla & Walker [9] reported on methods for determining the compressive strength of compressed earth blocks, including the RILEM test. Earth blocks have mostly been used for small two-storey houses, but they have also been used to build ten-storey buildings in Yemen, and over the last 50 years earth-block construction has developed in many countries such as Mayotte. Adding 4-10% cement to the earth block improved the compressive strength and water resistance compared with the unstabilized block. The purpose of the tests was to reach the critical failure limit, and the RILEM test was used to measure compressive strength. The results showed that the cement gave the earth block a significant defect, because tensile stress led to splitting along the line of the load in the block.
Laursen, Herskedal, Jansen & Qu [10] set out three goals in their research: first, to document the experimental behavior of stabilized compressed earth block walls subject to slip; second, to test the flexural strength of the existing wall and relate it to current codified analytical procedures for predicting resistance; and last, to develop an analytical procedure for prediction. The materials used to make the blocks were 74.3% soil, 6.2% Portland cement, 10% sand and 9.5% water. Overall, the results indicated that the first wall (A) and the second wall (B) were approximately identical, but the third wall (C) had a greater resistance due to its increased thickness.
Sharma, Marwaha & Vinayak [11] used vernacular fibers (Pinus roxburghii and Grewia optiva) together with cement to improve the sustainability of rural adobe houses. The maximum increase in compressive strength was observed with 2% Grewia optiva fiber and 2.5% cement, in the range of 225-235% for both cubical and cylindrical samples (base soil strengths of 1.17 and 0.85 N/mm² respectively), followed by the sample mixed with 2% Pinus roxburghii and 2.5% cement, which showed an increase in compressive strength of 87-145% for both cylindrical and cubical samples (base soil strengths of 1.17 and 0.85 N/mm² respectively).
Walker & Stace [12] carried out an experimental study to estimate the properties and similarity of lime- and cement-stabilized soil, and to check the average compressive strength. They ran two tests, the first using 95.0% soil and 5.0% cement and the second using 90.0% soil and 10.0% cement. The results show that the compressive strength with 5.0% cement reached 3.67MPa, while with 10.0% cement it was 7.11MPa.
Millogo, Morel, Aubert & Ghavami [13] carried out an experimental analysis of compressed adobe blocks reinforced with Hibiscus cannabinus fiber. The amounts used were 0.2-0.8wt% of fiber at 30mm and 60mm lengths. They reported that the compressive strength increased by 16% for the 30mm fibers and by 8% for the 60mm fibers, and also stated that the addition of 60mm fiber had an undesirable effect on the compressive strength.
This research indicates that compressed earth blocks contain many voids that allow water to enter, making their resistance very weak. Therefore, cement and additives were added to the compressed earth block to achieve the maximum possible compressive strength, using some of the proportions proposed in previous studies. In addition, a comparison was made to obtain the best results among compressed earth blocks using silica fume and oil palm fiber.
Methodology
This study aims to compare compressed earth blocks without additives against blocks with additives, namely silica fume and oil palm fiber. To identify the better additive, compressive strength and water absorption tests were carried out to find the compressed earth block with the highest compressive strength and resistance to water absorption. The tests performed in this study include the water absorption test, moisture content, Atterberg limit test, classification tests, compaction test and compressive strength test. Eighteen cubes of compressed earth block without additives, 18 cubes with silica fume and 18 cubes with oil palm fiber were prepared, for a total of 54 cubes. The experiments were carried out in IKRAM's laboratory. The general procedures for the laboratory work are illustrated in Figure 1.
Preparation of Materials
Materials used for this study were soil, water, cement, and additives such as silica fume and oil palm fiber.
Soil
Soil is one of the main materials in this study, making up about 80% of the components of the compressed earth block. It was obtained from behind the Unipark Condominium in Kajang, Selangor, Malaysia; the excavation was 1.5m deep below ground level. The soil was tested in several experiments as follows:
Moisture content
The moisture content test is a method for determining the water content of a soil as a percentage of its oven-dried weight. It is important as a guide to classifying the soil, and it is useful for calculating other properties such as the Atterberg limits.
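The computation behind this test is a simple mass ratio; a minimal sketch (the masses are hypothetical examples, not values measured in this study):

```python
def moisture_content(wet_mass_g, dry_mass_g):
    # Water content as a percentage of the oven-dried mass (BS 1377 convention)
    return (wet_mass_g - dry_mass_g) / dry_mass_g * 100.0

# Hypothetical example: 120 g of moist soil drying to 100 g
print(moisture_content(120.0, 100.0))  # 20.0 (%)
```

A result of 20% here happens to match the average moisture content reported later in the Results section.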
Atterberg limits
A. The liquid limit (LL) is the lowest water content at which fine-grained soil behaves like a viscous mud, flowing under its own weight. It is the transition water content between the plastic and liquid states. Along with the plastic limit, it provides a means of soil classification and is useful in determining other soil properties. At the liquid limit, the soil has very little strength.
B. The plastic limit (PL) is the moisture content below which a soil becomes too dry to be plastic; it is the lowest water content at which the soil remains in a plastic state, as determined by the plastic limit test.
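Together the two limits give the plasticity index, PI = LL − PL, the range of water contents over which the soil is plastic. A small sketch using the average limits reported later in the Results section (LL = 46%, PL = 30%):

```python
def plasticity_index(liquid_limit_pct, plastic_limit_pct):
    # PI = LL - PL: the range of water contents over which the soil behaves plastically
    return liquid_limit_pct - plastic_limit_pct

# Average Atterberg limits reported for the soil in this study
print(plasticity_index(46.0, 30.0))  # 16.0 (%)
```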
Classification tests
Hydrometer analysis: Hydrometer analysis is the procedure generally adopted for determining the particle size distribution in a soil for the fraction finer than the No. 200 sieve size (0.075mm). The lower limit of particle size determined by the procedure is about 0.001mm. In hydrometer analysis, a soil specimen is dispersed in water; in the dispersed state, the soil particles settle individually.
Sieve analysis test: The sieve analysis test allows the soil to be classified into coarse and fine fractions, and by the uniformity coefficient the gradation of the soil can be classified as uniform, well graded or poorly graded.
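The gradation judgment mentioned above is conventionally based on the uniformity coefficient Cu = D60/D10 read off the grading curve; a sketch with hypothetical D-values (not readings from this study):

```python
def uniformity_coefficient(d60_mm, d10_mm):
    # Cu = D60 / D10, from the particle sizes at 60% and 10% passing
    return d60_mm / d10_mm

# Hypothetical grading-curve readings
cu = uniformity_coefficient(0.60, 0.06)
print(round(cu, 1))  # 10.0 -> a Cu well above ~4-6 suggests a well-graded soil
```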
Compaction test: Compaction is a mechanical procedure by which the soil particles are packed closely together by reducing the air voids. By reducing the air voids, the density and the shear strength of the soil are increased.
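The compaction result is usually reported as dry density, γd = γbulk/(1 + w). A sketch that reproduces the maximum dry density reported in the Results section from a hypothetical bulk density at the optimum moisture content:

```python
def dry_density(bulk_density_mg_m3, moisture_content_pct):
    # Dry density from bulk density: gamma_d = gamma / (1 + w), w as a decimal
    return bulk_density_mg_m3 / (1.0 + moisture_content_pct / 100.0)

# Hypothetical bulk density of 2.09 Mg/m3 at the optimum moisture content of 12.5%
print(round(dry_density(2.09, 12.5), 3))  # 1.858 Mg/m3
```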
Additives
Silica fume: Silica fume, also known as microsilica, is a pozzolanic material added to concrete and block mixes. It is a by-product of the production of ferrosilicon metal in electric arc furnaces, recovered by condensation from the furnace flue smoke. It has been noted that blocks containing 5% silica fume showed an increase in shrinkage of 13%, rising to as much as 84% at 20% silica fume; in short, shrinkage increases with the proportion of silica fume. The reason is the effectiveness of the pozzolanic materials contained in silica fume, which react slowly, causing a volume reduction from the reaction and increasing shrinkage as a result of drying.
Oil palm fiber: The oil palm is the main edible oil crop in the world. It is grown in 42 countries on 11 million hectares worldwide; Southeast Asian countries such as Malaysia and Indonesia, together with India, Latin America and West Africa, are the main oil palm growers. Each year, 1ha of oil palm produces about 55 tons of dry matter in the form of fibrous biomass, which yields 5.5 tons of oil.
Water
The common specification regarding the quality of the mixing water is that the water should be potable. Such water should have an inorganic solids content of less than 1000ppm.
Cement
The cements used in construction are generally inorganic, often based on calcium silicate or lime, and can be characterized as hydraulic or non-hydraulic, depending on the ability of the cement to set in the presence of water.
Mix Proportion
The mix proportions were specified by the researcher after the experiments were carried out and the results analyzed, and they were chosen according to the suitability and workability of both the oil palm fiber and silica fume data. The mix proportion for oil palm fiber was 10% cement, 10% water, 0.03% oil palm fiber and 79.97% soil; for silica fume it was 7% cement, 10% water, 3% silica fume and 80% soil. This is the most rational approach to the selection of mix proportions with specific materials in mind possessing more or less unique characteristics, and it results in the production of compressed earth blocks with the most economically appropriate properties.
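The percentages above can be turned into batch masses for any batch size; a minimal sketch (the percentages are taken from the text, but the 50 kg batch size and the helper itself are hypothetical):

```python
# Mix proportions by mass, as stated in the text (percent of total)
MIXES = {
    "oil_palm_fiber": {"cement": 10.0, "water": 10.0, "fiber": 0.03, "soil": 79.97},
    "silica_fume": {"cement": 7.0, "water": 10.0, "silica_fume": 3.0, "soil": 80.0},
}

def batch_masses(mix_name, total_kg):
    mix = MIXES[mix_name]
    assert abs(sum(mix.values()) - 100.0) < 1e-6  # proportions must close to 100%
    return {component: total_kg * pct / 100.0 for component, pct in mix.items()}

# Hypothetical 50 kg batch of the silica fume mix
print(batch_masses("silica_fume", 50.0))
# {'cement': 3.5, 'water': 5.0, 'silica_fume': 1.5, 'soil': 40.0}
```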
Compressive strength test
The compressive strength test was carried out to determine the limit of compressive stress that leads to failure of the compressed earth block.
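The test reduces to stress = failure load / loaded area; a minimal sketch of the unit conversion (the load and cube size are hypothetical, chosen only for illustration):

```python
def compressive_strength_mpa(failure_load_kn, width_mm, depth_mm):
    # Load in kN over area in mm^2 gives N/mm^2, i.e. MPa
    return failure_load_kn * 1000.0 / (width_mm * depth_mm)

# Hypothetical: a 100 mm cube failing at 23.8 kN
print(round(compressive_strength_mpa(23.8, 100, 100), 2))  # 2.38 MPa
```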
Water absorption test
Water absorption is one of the main factors affecting compressed earth blocks. It is therefore important to identify the rate of water absorption, because a high absorption rate may weaken the stabilized compressed earth block, resulting in a loss of strength over time.
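The absorption rate is a mass ratio between the soaked and oven-dry block; a minimal sketch with hypothetical masses (not values measured in this study):

```python
def water_absorption_pct(saturated_mass_g, dry_mass_g):
    # Absorbed water as a percentage of the oven-dry block mass
    return (saturated_mass_g - dry_mass_g) / dry_mass_g * 100.0

# Hypothetical: a 2000 g dry block weighing 2156 g after soaking
print(round(water_absorption_pct(2156.0, 2000.0), 1))  # 7.8 (%)
```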
Results
The experiments were carried out in accordance with the standard procedures for laboratory testing of soil stated in BS 1377. The purpose of the tests is to achieve the goals of this research. The laboratory tests carried out on the soil samples (S1, S2) obtained from the locations described above were moisture content, Atterberg limits, hydrometer test, sieve analysis and compaction test. The average moisture content of the samples is 20%; the Atterberg limits consist of the liquid limit and plastic limit, with average results of 46% and 30% respectively; and the compaction test gave a maximum dry density (γd) of 1.858Mg/m³ at an optimum moisture content of 12.50%. The results of the hydrometer and sieve analysis tests are shown in Figure 2.
The compressive strength results after 7, 21 and 28 days show that the earth block mixed with cement and silica fume is stronger than the control and the oil palm fiber block: the averages for the control are 0.66MPa, 1.3MPa and 1.47MPa; for oil palm fiber, 0.62MPa, 1.71MPa and 1.81MPa; and for the block mixed with cement and silica fume, 0.80MPa, 2.2MPa and 2.38MPa. After 21 and 28 days the difference becomes much larger, because the cement and silica fume mixed with the soil increase the density and strength of the earth block. Figure 3 shows the difference between the control sample and the mixes with stabilizing materials at 7, 21 and 28 days.
A low water absorption rate indicates a good block, while a high rate indicates a poorer one. The results show that after 7, 21 and 28 days, the average water absorption for the control sample is 2.0%, 7.6% and 8.1%; for oil palm fiber, 2.7%, 9.4% and 9.7%; and for the earth block mixed with cement and silica fume, 1.9%, 7.4% and 7.8%. The absorption rate for the blocks mixed with silica fume is thus lower than for the control and oil palm fiber blocks, because silica fume bonds the laterite particles together, reducing the pores and hence the amount of water that can flow into the blocks. The percentages of water absorbed by the blocks are shown in Figure 4.
Conclusion
The conclusions of this research are given based on the objectives stated at its beginning. The purpose of this study was to improve the durability and compressive strength of compressed earth blocks by using additives such as silica fume and oil palm fiber, and to investigate which additive is more appropriate for obtaining the highest compressive strength while reducing the percentage of voids. Two tests were carried out on the compressed earth blocks after adding the additives, which showed that the durability of the blocks improved and their strength increased. A comparison between the compressed earth blocks with the two additives, silica fume and oil palm fiber, showed that the block mixed with silica fume achieved the highest compressive strength, 2.38MPa, compared with 1.81MPa for the block mixed with oil palm fiber. A water absorption test was also performed; according to the recorded results, the average water absorption with silica fume and oil palm fiber was 7.8% and 9.71% respectively, showing that the compressed earth block with silica fume is much better than the one with oil palm fiber.
For more open access journal, please click on Juniper Publishers
For more civil engineering articles, please click on Civil Engineering Research Journal
https://juniperpublishers.com/cerj/CERJ.MS.ID.555684.php
Antimicrobial finishes for Textiles - Juniper Publishers
To know more about Journal of Fashion Technology - https://juniperpublishers.com/ctftte/index.php
To know more about open access journals, please click on Juniper Publishers
Abstract
Infestation by micro-organisms instigates cross-infection by pathogens and odor development in fabrics that have direct contact with the skin. Moreover, discoloration, stains and loss of functional characteristics of textiles are consequences of microbial damage. Antibacterial-finished textile is an important area for medical and hygienic applications, and there is an enormous need for non-toxic and eco-friendly antimicrobial agents. The synthetic biocide finishes most extensively reported are polyhexamethylene biguanide (PHMB), quaternary ammonium compounds (QACs), metals (including metal oxides and salts), triclosan and N-halamines, whereas the natural biocides (aromatic compounds, dyes, essential oils), chitosan and antimicrobial peptides (AMPs) are mainly considered among plant-based extracts. This paper briefly reviews the latest research work on antimicrobial finishing, the types of finishing agents, and various current developments in antimicrobial finishing to minimize the risks associated with the application of organic, inorganic and plant-based antimicrobial finishes.
Keywords: Antimicrobial; Organic and inorganic finishes; Life; Textiles
Introduction
The major use of antimicrobials has been in the medical and pharmaceutical industries; however, newer applications are possible. Textile fibers are these days increasingly treated with antimicrobial reagents. Other examples include applications in food packaging and storage, and in medical, surgical and hygienic products [1-3]. With the improvement of living standards, the demand for hygienic products with biocidal finishes is increasing in textiles (sportswear, undergarments, bed linen) and water filtration. Antibacterial finishing treatment has become a vital area of medical, surgical and healthcare activities due to the potentially pathogenic microorganisms present in the hospital environment that cause cross-infection diseases [4-8]. Micro-organisms include different kinds of organisms such as viruses, bacteria, unicellular plants and animals, and certain algae and fungi. Bacteria are classified as gram-positive, gram-negative, spore-bearing or non-spore-bearing. Some bacteria are pathogenic and may cause infections in humans [9]. A microbe (e.g. a bacterium or fungus) is normally protected by an outer cell wall composed of polysaccharides. The cell wall maintains the integrity of cellular components and protects the cell from the extracellular environment; below the cell wall is a semi-permeable membrane that holds intracellular organelles, enzymes and nucleic acids. Chemical reactions within the cell wall take place due to the enzymes present in it, and the nucleic acids hold the entire genetic directory of the organism [10]. The microorganisms responsible for microbial damage are generally present in the surroundings; in addition, the composition of the substrates and chemical processes may encourage growth of the microbes, and a moist, warm environment further exaggerates the problem [11].
A gram-positive bacterium contains peptidoglycan and teichoic acid; peptidoglycan comprises 90% of the cell wall and is made of amino acids and sugars. One example of gram-positive bacteria is Staphylococcus aureus, which occurs in pairs, short chains or grape-like clusters. Its size ranges from 0.5μm to 1.0μm, and it grows in the temperature range of 35 to 40 °C.
Staphylococcus aureus is a major cause of cross-infection in the hospital environment and of 19% of total surgical infections. It is also responsible for boils and causes scalded skin infections. Other gram-positive bacteria are Staphylococcus epidermidis, Streptococcus pneumoniae, Streptococcus pyogenes and Streptococcus viridans. Gram-negative bacteria are similar to gram-positive bacteria apart from an outer membrane, affixed to the peptidoglycan by lipoproteins, which is used to transport low-molecular-weight substances. Gram-negative bacteria are harder to destroy than gram-positive bacteria because of their extra cell walls. An example of a gram-negative bacterium is Escherichia coli (E. coli); it is bacillus-shaped and dwells in the human intestine. Escherichia coli can be spread through the eating and/or handling of raw foodstuffs. The symptoms of E. coli infection include severe diarrhea (especially in children) and kidney damage. Other bacteria of this class are Klebsiella pneumoniae, Pseudomonas aeruginosa, Salmonella typhi, Salmonella enteritidis and Haemophilus influenzae [12]. Infestation by micro-organisms instigates cross-infection by pathogens and odor development in fabrics that are worn next to the skin or have direct contact with the body. Moreover, discoloration, stains and loss of functional characteristics of textiles are consequences of microbial damage [13-17]. Fungi, moths and mildews are organisms with a lower growth rate; they stain the substrate and damage the fabric's functional characteristics. Algae are typical microbes, either fungal or bacterial in character, that generate darker stains on the fabric surface [18]. Dust mites dwell in home textiles and bed linen items, including blankets, bed sheets and pillows, and especially in mattresses and carpets. Dust mites feed on human skin and cause allergic reactions through their waste products.
Antimicrobial Finishing Process
“The antimicrobial finishing process imparts the ability, to textile substrate, to inhibit the growth (-static) or reproduction of at least some types of microorganisms or to kill (-cidal) at least some types of microorganisms” [19,20]. Therefore, an antimicrobial finish should be capable of killing microbes by breaching the cell wall or altering cell membrane permeability, obstructing the synthesis of microbial proteins, or blocking production of the enzymes necessary for the microbes' food. A few established antimicrobial agents, e.g. silver, quaternary ammonium compounds (QACs), N-halamines, triclosan [21-28] and polyhexamethylene biguanide (PHMB), are all essentially biocides [29-31].
Finishing Mechanism
Three finishing mechanisms may be recognized, based on the antimicrobial function performed by the particular finish on the textile: controlled release, regeneration and barrier-block. The first two mechanisms have problems in use.
The problems with the controlled-release mechanism are its durability after laundering and the leaching of the antimicrobial agent from the fabric, which can come into contact with the wearer's skin. These agents have the potential to affect normal skin, which could lead to extreme skin irritation and allergy issues. Similar problems can occur with fabrics using a regeneration mechanism, as these agents require chlorine bleach to reactivate their antimicrobial properties after laundering; chlorine bleach not only damages cotton fabric but is also harmful to human skin. The barrier-block mechanism does not pose the problems associated with the other two methods: these agents are bonded to the fabric surface and do not leach, thereby killing the bacteria that come into contact with the fabric [32].
Classification of Antimicrobial Finishing Process
The antimicrobial finish can be applied by physical and chemical methods, and by adding functional agents onto textile fibers. Such functional finishes can be of two main types, i.e. temporary and durable antimicrobial finishes. Temporary finishes may be lost easily on contact with skin or body fluids, or during washing, because of the weak bonding between the finishing agent and the fiber surface. A durable finish can generally be achieved by adding an antimicrobial finishing agent to the fiber or textile during wet processing, a method also known as the controlled-release mechanism. In such a treatment, the finishing agent itself is bonded to the fiber surface, or a bonding substance may be used. The treated textiles deactivate bacteria by slowly releasing the biocide from the fiber or fabric surface [32].
Types of Antimicrobial Finish
A variety of chemical agents are available that can impart a significant effect on textile fibers to inhibit the growth of microorganisms. The important types of antimicrobial chemical agents are described in the following sections.
Organic antimicrobial agents such as quaternary ammonium compounds (QACs), N-halamines, polyhexamethylene biguanide, triclosan, silicon-based quaternary agents [33], iodophors, phenols and thiophenols, heterocyclics, inorganic salts, nitro compounds, urea, amines and formaldehyde derivatives have been applied for the antimicrobial treatment of textiles [34]. QACs have been tested for antimicrobial activity on protein-based wool, cellulose-based cotton, and synthetic polyamides and polyester; at MIC values of 10-100mg/l they presented good reproducibility and good washing durability. These formulations kill microbes by altering cell membrane permeability, obstructing the synthesis of microbial proteins, and blocking production of the enzymes necessary for the microbes' food. N-halamine compounds are used for the development of antimicrobial cotton fabric through a pad-dry-cure process followed by exposure to chlorine bleach. The chlorinated sample showed potent antimicrobial ability against gram-positive and gram-negative pathogens. It was found that after 15 days' storage 85% of the chlorine could be recharged, which shows that N-halamine compounds have good biocidal efficiency for healthcare textiles. Another organic antimicrobial agent, triclosan, has been investigated for its antimicrobial ability on polyester, nylon, regenerated cellulose and acrylic fibers, with MIC values below 10ppm against bacteria. Triclosan has excellent durability in use and after washing, and it prevents microbial growth by obstructing lipid biosynthesis. The most widely accepted organic agent used in healthcare procedures and the pharmaceutical and food industries is polyhexamethylene biguanide (PHMB). It is efficient against both types of bacteria, in addition to yeasts and fungi. PHMB is only slightly toxic, and few skin infection issues have been reported. It is used in a variety of products, including undergarment and towel fabrics, to obstruct microbial growth, and it exhibits good washing durability.
PHMB is bacteriostatic at 1-10mg/l, but at elevated concentrations its bactericidal activity and inhibition rate rise together. The utmost antibacterial inhibition action of PHMB is obtained between pH 5 and 6 [35,36].
Inorganic antimicrobial agents
Inorganic finishing agents such as metal oxides and metals including copper, zinc, titanium, magnesium, silver and gold have been applied for antimicrobial effects on textiles. These agents exhibited good durability on cellulose, protein, regenerated and synthetic materials, with MIC values of 0.05-0.1mg/l against the gram-negative bacterium E. coli. Silver is the most widely accepted inorganic antimicrobial agent; it kills microorganisms by blocking and disengaging the intracellular proteins. However, silver is a slightly toxic agent, it is released slowly, and it can wear out of the fabric [37-40].
Zeolites of the chabazite type, with optimal morphology and the lowest silicon-to-aluminum ratio (Si/Al), exchanged with different combinations of silver, copper and zinc ions to prepare single, binary and ternary metal-cation-modified zeolites, have been tested; the silver-based zeolites exhibited more antimicrobial activity than the others and demonstrated suitable mechanical characteristics and an excellent biocidal effect against food-borne bacteria and fungi in injection-molded composites based on green polyethylene. Further, the results confirmed their capability to control the propagation of dangerous pathogens in food processing and storage environments. Thus, these innovative antimicrobial materials are prospects for hygienic surfaces, kitchen accessories and packaging applications [41].
Limitations of inorganic and organic agents
In general, the antibacterial property of any inorganic finishing agent is established by its chemical components. The biocidal efficiency of inorganic agents slowly drops in use and during washing. Most such agents provide only a limited intensity of microbial inhibition; moreover, they are poisonous, initiate skin problems in humans, and are difficult to decompose in effluent streams [42,43]. To reduce the risks allied with the application of such inorganic agents, there is an enormous need for substitute agents for the antimicrobial treatment of textiles. As mentioned earlier, a wide range of organic antimicrobial agents is available for textile treatment, but of these only triclosan, quaternary ammonium compounds and polyhexamethylene biguanide have been used on a commercial scale. Polyhexamethylene biguanide carries slight toxicity concerns and is hard to decompose in effluent streams; in the US Preregistration Eligibility Decision for PHMB by the US Environmental Protection Agency, the discharge of effluents containing PHMB is not allowed without mandatory treatment.
Eco-Friendly Antimicrobial Agents (Natural Plant and Fruit Extracts)
Plant extracts provide an attractive source of eco-friendly antimicrobial finishes. Natural cures using plant extracts are receiving increasing interest in the development of antimicrobial textiles. One plant-based source, belonging to the Meliaceae family, is neem (Azadirachta indica); it is one of the most prominent naturally gifted sources of antimicrobial compounds. All parts of the neem tree are established as containing potential antimicrobial constituents, and the extract from each part presents active antimicrobial effectiveness in blocking the proliferation of bacteria. Currently, only a small number of studies have demonstrated the use of neem on textiles to evaluate its antimicrobial activity; however, treatment of cotton and cotton/polyester blended fabric with seed and bark extracts has been reported [44-46]. Moreover, cotton fabric has been imparted with neem-leaf-extract-loaded nanoparticles [47], and silver nanoparticles synthesized using neem leaf extract have been used for cotton treatment [48]. Another plant-based source, belonging to the Liliaceae family, is aloe vera (Aloe barbadensis); its leaf extract has antibacterial and antifungal potential and has been used for dressing gauzes, sutures and other medical textile applications [49,50]. Similar to neem applications in textiles, only a few studies of aloe vera application for cotton fabric treatment have been articulated, and more research is required. Antimicrobial finishing of cotton and cellulose fibers is significantly useful and important in medical textile utilization [51]. Another plant-based source is Ginkgo biloba of the Ginkgoaceae family (Mantissa Plantarum Altera). The Ginkgo biloba tree has flourished in forests for 150-250 million years and is assumed to be one of the oldest living species on earth [52-54]. The standardized extract formulation of ginkgo leaf in use holds 5-7% ginkgolides and bilobalide (BB) [55]. It is an excellent candidate for the antimicrobial treatment of healthcare cotton textiles.
The standard values of the Ginkgo biloba extract formulation were enforced because cytotoxicity issues were reported beyond these limits [56-58]. Jang and Lee investigated the antimicrobial activity of ginkgo leaf extract on Tencel fabric, in an extract formulation containing a silicone softener along with a crosslinking agent.
The study concluded that Ginkgo biloba extract is an eco-friendly antimicrobial agent, and its application has been investigated for health and medicinal purposes. Precisely because it lacks the toxicity characteristics linked with other such agents, it is a potential candidate for the antimicrobial finishing of the institutional textiles range, including home accessories, hospital bed sheets, nurses' uniforms, surgical gowns and drapes [59]. Among plant-based natural fruit sources reported for antimicrobial properties, pineapple (Ananas comosus) juice has been investigated against harmful microbes [60]; the antimicrobial activity was evaluated through the agar diffusion method. Another plant-based source reported is papaya (Carica papaya). Its fleshy tissues hold three influential antioxidants, vitamins A, C and E. Further, it contains proteolytic enzymes that have good antimicrobial activity against bacteria, fungi and viruses. The papaya fruit seeds are spicy and very strong, which renders them almost indigestible; these seeds have more potential pharmaceutical worth than the flesh and are effective against bacterial infection. The juice presented excellent antimicrobial ability against a number of gram-negative bacteria [60-63]. However, the use of these fruit juices/extracts on textiles to assess their biocidal or biostatic activity has not been reported.
Parts of the medicinal plants clove (Eugenia caryophyllata), false daisy (Eclipta alba), leadwort (Plumbago zeylanica) and mint (Mentha arvensis) were dried, powdered, ground and extracted with solvents, then applied through pad-dry-cure and microencapsulation techniques. The fabric samples were subjected to antimicrobial testing, and the bacterial growth was analyzed after 5, 10, 15 and 20 washing cycles; the antimicrobial activity of the microencapsulated finish remained effective up to 15 wash cycles [64]. Several other plant-based dyes are reported for their antimicrobial and antifungal activity, such as henna (Lawsonia inermis), walnut, alkanet (Anchusa tinctoria), curcumin, pomegranate, cutch, red onion peel and a mixture of red onion peel/curcumin (40g/L, 50%) [65]. Extracts of the neem (Azadirachta indica), lam (Butea monosperma) and gaetin (Litchi chinensis) trees were used to check the antibacterial and antifungal activity and the aesthetic properties (stiffness and appearance) of 100% silk fabric. It was confirmed that the antimicrobial finish formulation improves the aesthetic properties; it is further reported that the treated finish showed suitable results, and an 89% reduction in microbial growth was maintained for up to 25 washes [66]. Application of plant-based dyestuffs is the art of imparting hues and tints to textile substrates. Dyestuffs or coloring matters acquired from natural resources have been tested for the antibacterial activity of fabrics, and the dyed fabrics presented effective antibacterial activity. Although synthetic dyes offer a range of vibrant colors and are extensively used, natural dyes are nowadays gaining interest because of the strict environmental standards enforced by a number of European states, owing to the carcinogenicity and photosensitization issues of synthetic dyes. Natural dyes are considered eco-friendly and non-toxic and have medicinal features [67].
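Reduction figures such as the 89% quoted above are conventionally computed as R = 100·(A − B)/A, where A and B are the viable counts recovered from untreated and treated fabric (the AATCC 100-style form; the CFU values below are hypothetical, not data from the cited studies):

```python
def percent_reduction(untreated_cfu, treated_cfu):
    # R = 100 * (A - B) / A, the usual quantitative antibacterial reduction
    return 100.0 * (untreated_cfu - treated_cfu) / untreated_cfu

# Hypothetical counts: 1.0e6 CFU from the control swatch, 1.1e5 from the treated one
print(round(percent_reduction(1.0e6, 1.1e5), 1))  # 89.0 (%)
```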
Essential oils of rosemary (Rosmarinus officinalis) and orange (Citrus sinensis), obtained by steam distillation of rosemary plant matter and orange peel, were used to impart antimicrobial activity to a textile substrate (56% cotton/44% polyester) at concentrations of 1%, 3% and 5% of each oil, and activity was assessed against each test strain. The results support textiles functionalized with rosemary and orange essential oils as effective antimicrobial barriers, with maximum reductions of 56.99% for rosemary and 92.48% for orange essential oil [68].
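Reduction figures such as these are typically derived from viable colony counts recovered from an untreated control and from the treated fabric, as in AATCC 100-type quantitative tests. A minimal sketch of that calculation follows; the colony counts used here are hypothetical and chosen only for illustration, not taken from the cited study:

```python
def percent_reduction(control_cfu: float, treated_cfu: float) -> float:
    """Percent reduction R = (A - B) / A * 100, where A is the viable
    count (CFU) from the untreated control and B from the treated fabric."""
    if control_cfu <= 0:
        raise ValueError("control count must be positive")
    return (control_cfu - treated_cfu) * 100.0 / control_cfu

# Hypothetical colony counts (CFU/mL), for illustration only
print(round(percent_reduction(2.0e6, 1.5e5), 2))  # → 92.5
```

A complete kill gives `percent_reduction(A, 0) == 100.0`; a treated count equal to the control gives 0, i.e. no antimicrobial effect.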
Plant-based bamboo material is well known for its antimicrobial ability. The reported study evaluated the antimicrobial property of plasma-treated bamboo fabric imparted with a combinatorial herbal extract. Knitted bamboo fabrics were plasma-treated at the most appropriate settings to improve hydrophilicity, and the resulting changes in hydrophilic, physical and chemical characteristics were measured by standard tests. The combinatorial herbal powder was subjected to different solvent extractions, and the antimicrobial efficiency of the extracts against pathogens was evaluated. The ethanol herbal extract showed the highest antimicrobial activity, with zones of inhibition of 12 mm against E. coli and 14 mm against S. aureus, and wash-durability tests showed activity retained up to 25 washes [69,70].
Chitosan and alginate have also been used for antimicrobial finishing of textiles. Chitosan is a water-soluble derivative of chitin, a polysaccharide based on amino sugars. In acid solution its amine groups become quaternary amino units that inhibit microbial growth: these units act as a shield that blocks proteins and slows proliferation by disrupting the cell membrane, allowing the cell contents to escape from the bacterial cell, with death of the bacterium as the consequence. The antimicrobial activity of chitosan has been reported in many studies, and it is a widely accepted antimicrobial agent [71]. β-Cyclodextrin, chitosan citrate and β-cyclodextrin-grafted chitosan with lavender essential oil were also used to evaluate the combined fragrance and antimicrobial effect on cotton textiles by the pad-dry method. β-CD was found to be highly soluble in 0.6 g/L NaOH solution, and 80 g/L β-CD with 6% lavender essential oil was the most suitable combination for fragrance and antimicrobial properties [72]. In another study, the common conducting polymer polypyrrole (PPy) was used for its environmental stability, ease of synthesis, and interesting chemical, electrical, electrochemical and optical properties [73]. A polypyrrole-graft-chitosan copolymer was chemically synthesized, and its composition and morphological characteristics were evaluated. The results revealed strong interactions between the polypyrrole and chitosan chains; grafting raised the electrical conductivity of chitosan to the semiconducting level, and the thermal stability and crystallinity of the polypyrrole-graft-chitosan copolymer increased compared with chitosan.
The copolymer was tested against various bacterial and fungal strains at different concentrations and compared with reference antibiotics; the polypyrrole-graft-chitosan copolymer showed stronger antibacterial activity than either polypyrrole or chitosan alone, and activity increased further at higher concentrations [74]. Monica Periolatto reported that sound fastness and stability were attained with both photo-grafted chitosan and polypyrrole coatings on textiles, and proposed a synergic effect of the polypyrrole-chitosan finish exploitable in textiles [75].
Challenges Associated with Plant Finishes
Chitosan is one of the recognized bioactive agents used on a commercial scale for antimicrobial fabric finishing. Its effectiveness, however, is limited by several factors: it depends on the chitosan molecular weight, pH, ionic strength, addition of non-aqueous solvents and the degree of deacetylation. Moreover, chitosan treatments for textiles are efficient only at high concentrations, which reduce the air permeability of the fabric and impart stiffness. More generally, herbal extracts (including chitosan) evaluated for textile antimicrobial activity present several issues, such as difficulty in extraction and separation of the bioactive substances, treatment of textiles with the bioactive agents and, most importantly, poor finish durability in use and during washing. Despite these challenges, plant-based antimicrobial finish formulations remain appealing for their non-toxic and environmentally friendly characteristics [76,77].
Role of Nanotechnology in Antimicrobial Finishing
Nanotechnology may provide finishes to combat infectious pathogens. Its application involves controlling, manipulating and assembling nanoscale constituents to develop materials, systems or devices. Studies report that silver nanoparticles exhibit excellent antimicrobial properties against microorganisms. For example, antimicrobial metals such as zinc, copper and silver were incorporated into an FDA-approved polymer (polycaprolactone, PCL) to produce filaments: pellets obtained by vacuum-drying solutions of PCL and the different metals were hot-melt extruded to manufacture homogeneously metal-loaded filaments, and wound dressings of different shapes were produced from filaments containing different metal concentrations. The antibacterial efficacy of the wound dressings was tested using a thermal activity monitor system, revealing that silver and copper wound dressings had the most potent bactericidal properties [78].
Nowadays, metal oxide nanoparticles (MeO-NPs) are becoming a potential alternative for combating infectious agents, including those substantially resistant to different types of antibiotics [79]. Nano-structured ZnO particles were applied to the cotton textile surface with different surfactants to stabilize and homogenize the coating; this improved the durability of the ZnO NPs, decreased their leaching, and gave the highest antibacterial and antifungal activities against different pathogenic bacterial and fungal species, with reductions above 90% [80]. In another approach, ZnO nanoparticles were prepared with soluble starch as the capping agent; the antimicrobial activity was governed by the type of capping agent, and this route achieved a lower particle size of 3-5 nm and a higher antimicrobial rate than other capping agents [81].
Copper nanoparticle/nanocomposite materials prepared in a glycerol-polyvinyl alcohol matrix, in gel and moldable plastic form, can be produced and easily shaped at high temperature. The materials show very good long-term stability in air, protecting the copper nanoparticles from oxidation, and inhibition of bacterial proliferation of both Escherichia coli and Enterococcus faecalis was proven in the presence of the nanocomposite [82]. Biocompatible gold nanoparticles (AuNPs) have gained considerable attention for potential applications in nano-medicine owing to their characteristic size-dependent chemical, electronic and optical properties; they displayed antibacterial efficacy towards different bacterial species, and the MIC against S. aureus was evaluated as 960 μL/mL [83].
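An MIC such as the one quoted above is conventionally determined by a two-fold serial broth dilution: the agent is tested over a halving concentration series, and the MIC is the lowest concentration at which no visible growth occurs. The sketch below illustrates only that bookkeeping; the concentration series and growth readings are hypothetical, not data from the cited study:

```python
def mic(concentrations, growth_observed):
    """Return the minimum inhibitory concentration: the lowest tested
    concentration at which no growth was observed, or None if growth
    occurred at every tested concentration."""
    inhibitory = [c for c, grew in zip(concentrations, growth_observed) if not grew]
    return min(inhibitory) if inhibitory else None

# Two-fold serial dilution starting at 1920 (arbitrary units, illustration only)
series = [1920 / 2**i for i in range(5)]   # 1920, 960, 480, 240, 120
growth = [False, False, True, True, True]  # hypothetical turbidity readings
print(mic(series, growth))  # → 960.0
```

Because the series halves at each step, a reported MIC is only resolved to within a factor of two: growth at 480 and none at 960 pins the true threshold somewhere between those two values.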
To overcome the toxicity and washing-durability problems associated with plant-based extracts, chitosan-neem nanocomposites were applied to develop antimicrobial cotton. Silver nanoparticle microgels based on poly(N-isopropylacrylamide) and chitosan [84], and chitosan nanoparticles loaded with Fe2+ or Fe3+ (surfactant-assisted chitosan chelating Fe2+/Fe3+, and ionic-gelation chitosan), showed very high antimicrobial activity at lower concentrations than chitosan alone [85]. In another study, AgNPs biosynthesized using pre-hydrolysis liquor of eucalyptus wood were confirmed as effective growth inhibitors against microbes for various biomedical applications [86]. Further, a chitosan- and acrylic acid-bi-grafted polypropylene melt-blown nonwoven membrane immobilized with silver nanoparticles presented excellent antibacterial and hydrophilic properties [87].
Conclusion
An interesting variety of antimicrobial finishing agents is available; however, limitations remain in meeting performance, environmental and cost requirements. Most inorganic antimicrobial agents are toxic, degrade poorly in the environment, inhibit only a limited range of microbes and possess poor laundering durability, whereas organic agents generally have fewer adverse effects. The use of nanoparticles has improved the efficiency of some antimicrobial agents in current use, reduced the environmental issues associated with them (such as toxicity and washing durability) and delivered excellent antimicrobial activity against microorganisms. Moreover, despite the washing-durability challenge associated with natural plant-based antimicrobial finishes, they are widely accepted for textile finishing because of their eco-friendly and non-toxic characteristics. The use of plant-based nanoparticle antimicrobial agents is growing in many fields, primarily owing to their advanced characteristics and protection against pathogens compared with conventional biocides, and such value-added finishes may provide sustainable healthcare applications in textiles.
To know more about Journal of Fashion Technology-https://juniperpublishers.com/ctftte/index.php
To know more about open access journals Publishers click on Juniper Publishers
juniperpublishersoa · 4 years
Juniper Publishers | Overview and Future of Hemo-Components and Natural Guided Regeneration
Journal of Surgery - Juniper Publishers
The history of platelet-rich fibrin (hemocomponents) started in 1970, when Matras described a fibrin glue, formed by polymerizing fibrinogen with thrombin and calcium, used to improve skin wound healing in a rat model [1]. Because of the low concentration of fibrinogen in plasma, the stability and quality of fibrin glue were low. A few years later, several research groups proposed an upgraded concept for the use of blood extracts, termed "platelet-fibrinogen-thrombin mixtures" or "gelatin platelet-gel foam" [2,3]. In this new concept, the fibrin glues contained a significant concentration of platelets within the final preparation. The idea was first to reinforce the fibrin gel naturally, and also to combine the healing properties of the platelets with those of the fibrin. This improvement allowed more natural products to be prepared, integrating more natural blood constituents as they should. These products were the first platelet-rich plasma gels. These new strategies emphasized the role of platelets within the fibrin gel and offered excellent preliminary results in ophthalmology, neurosurgery and general surgery. Whitman developed this technique further in 1997, as did Marx et al. [4,5] in particular in 1998. The leukocyte- and platelet-rich fibrin (L-PRF) clot was often described as an "optimized blood clot" that can be surgically handled and used.
The rationale for using this glue/membrane, and its success, lies in fibrin, platelets, slow release of growth factors, leukocytes and other cells: all these components are key active actors of the natural healing process and, combined, form a kind of engineered tissue extracted from the circulating blood tissue [6]. Unfortunately, at the moment there is no international standard for the characterization, classification and identification of surfaces in implantable materials [7,8]; in particular, standardization is needed to obtain optimal and reproducible results. The current classification of platelet-rich concentrates is based on their fibrin architecture and cell content. It consists of two main groups of products, platelet-rich plasma (PRP) and platelet-rich fibrin (PRF), both of which are available in a pure or leukocyte-enriched form (L-PRP and L-PRF) [9]. Each product has a unique biological profile that dictates its clinical applications. L-PRF concentrates provide slow release of many growth factors and can be easily prepared during surgery [10-14]. They are inexpensive and autologous; therefore, they avoid the complications associated with allogenic blood use. Pure platelet-rich plasma (P-PRP) products are preparations without leukocytes and with a low-density fibrin network after activation. One widely advertised P-PRP method is known under the commercial name PRGF [Plasma Rich in Growth Factors, Preparations Rich in Growth Factors, or EndoRet; Biotechnology Institute BTI (dental implant company), Vitoria, Spain] and was tested in many clinical situations, particularly in sports medicine. P-PRP gel released most of its growth factors in the first hours and completely dissolved in the medium after 3 days, even after maximum artificial fibrin polymerization. Leukocyte- and platelet-rich plasma (L-PRP) products are preparations with leukocytes and with a low-density fibrin network after activation.
The methods to prepare PRP membranes require one or two centrifugations; there are, in fact, some new, faster machines such as Arthrex ACP®, but an anticoagulant is always needed. PRP families are not well adapted (complicated, expensive, of mixed clinical relevance) to daily oral applications. PRP families substitute for fibrin glues in most other surgeries, particularly to improve skin wound healing. Gelling the PRP on the surgical site makes it an adequate surgical adjuvant in many clinical situations, even if the exact effects, in comparison with fibrin glues, remain largely debated. PRP solutions also have the advantage of being liquid before activation, and can therefore be injected or placed during gelling on a skin wound or suture (similar to the use of fibrin glues) in various sports medicine or orthopedic applications. In this regenerative medicine strategy, the platelet suspensions are injected like other pharmaceutical preparations. The results of this method remain largely debated in the literature, however, probably because of the large number of different protocols [14-16]. Pure platelet-rich fibrin (P-PRF), or leukocyte-poor platelet-rich fibrin, preparations are without leukocytes and have a high-density fibrin network. These products exist only in a strongly activated gel form and cannot be injected or used like traditional fibrin glues. However, because of their strong fibrin matrix, they can be handled like a real solid material for other applications. The L-PRF membrane remains solid and intact after 7 days and continuously releases a large quantity of growth factors, a significant part of which is produced by the cell population within the membrane. The L-PRF family fits the needs of applications in oral and maxillofacial surgery, as L-PRF clots and membranes present a volume and shape easy to combine with most surgical techniques, as a filling and interposition healing biomaterial or as a protective healing membrane.
The fibrin architecture of L-PRF is constituted by connected trimolecular junctions, due to slow polymerization of the platelet concentrate and the absence of heterologous thrombin. The result of this process is a flexible fibrin network able to promote the gradual release of growth factors and leukocyte migration over an extended period. It is easy to prepare in large quantity and inexpensive, which makes it particularly adapted to daily clinical practice. PRF families in general are usable in other disciplines with interesting results, particularly for the treatment of chronic skin wounds and ulcers. The methods to prepare PRF never require an anticoagulant, and a lower G-force is needed (around 400 g). PRF products cannot be used as injectable products, in sports medicine for example [12,17]. Some groups advocated that the presence of leukocytes may be negative for the therapeutic outcome, owing to a potential risk of stimulating the inflammatory process after membrane placement in a wounded site [18]. Other researchers insisted on the need for some leukocyte population in injectable PRP in order to increase growth factor production, the release of anti-pain mediators and natural anti-infectious activity. Some kinds of leukocytes, lymphocytes in particular, play a key role as a regulatory turntable of the healing and inflammatory process, and there is no reason to discard them. Leukocytes are not only inflammatory cells: they also exert anti-nociceptive effects through different chemokines, anti-inflammatory cytokines (IL-4, IL-10 and IL-13) and opioid peptides (β-endorphin, met-enkephalin and dynorphin-A), and can therefore promote a clinically relevant inhibition of pathological pain [19-21]. The classification described above is the only nomenclature that considers all forms of platelet concentrates for surgical use.
However, other classification systems have been proposed in recent years; they are limited because they refer only to platelet-rich plasma products and sports medicine applications. Both proposals are not significantly evidence-based and do not improve the current terminology [22]. Most publications on growth factor and platelet concentrations showed the relative lack of significance of these parameters, owing to many inter-individual variations and their short-term effects: platelets are activated and active for only a very short time, and the growth factors are released, consumed locally or dissolved into the blood circulation within minutes or hours of their release [23,24]. Platelet concentrates for surgical use are a system of all blood elements within a logical healing platform, including the fibrin matrix, the platelets, the mediators and the cells, all together to reach a clear and reproducible clinical result [25]. In a systematic review, Castro found favorable effects on hard and soft tissue healing and reduced postoperative discomfort when L-PRF was used, but noted a lack of standardization of the protocol in regenerative procedures [26]. Temmerman et al. [27] compared ridge preservation with L-PRF socket filling against natural healing following tooth extraction after 3 months; the use of L-PRF as a socket-filling material was beneficial for all parameters considered (vertical height changes, width reduction, mineralized bone) during the 3-month observation period. Furthermore, the use of L-PRF resulted in less post-operative discomfort and pain for the patients. Multiple surgical specialties have recognized the potential advantages of platelet-rich concentrates. Their use has been described in ophthalmology, neurosurgery, general surgery [22], orthopedic surgery, sports medicine [28] and oral and maxillofacial surgery [29].
Several applications of L-PRF concentrate have been described in the literature, including postoperative hand wound healing with faster re-epithelialization and treatment of androgenic alopecia with diminished hair loss, among others [30-32]. The role of L-PRF in endoscopic endonasal skull base surgery defect reconstruction was investigated by Soldatova et al. [33], who demonstrated the potential benefits of L-PRF membranes for the reconstruction of skull base defects, with an encouraging rate of healing progression as measured by the crusting score. In recent years, platelet concentrates for surgical use from the PRF (platelet-rich fibrin) family have become very popular in some surgical fields. The main product is classified as L-PRF and is used in oral and maxillofacial applications in particular. Many systems are available on the global market, but only one system to date is duly CE-marked and FDA-cleared (Intra-Spin System, Intra-Lock, Boca Raton, FL, USA) [34]. The impact of centrifuge characteristics and centrifugation protocols on the cells, growth factors and fibrin architecture of L-PRF was investigated by comparing four different centrifuges. The results showed significant differences in vibration level at each rotational speed between the four tested machines; the CE-marked and FDA-cleared device was the most stable in all configurations, remaining under the resonance threshold, unlike the three other machines [35]. In another study, M.F-Kobayashi demonstrated in vitro that reducing the centrifugation speed favored an increase in growth factor release from PRF clots, which in turn may directly influence tissue regeneration by increasing fibroblast migration, proliferation and collagen mRNA levels [36].
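Centrifugation protocols like those above are specified either as rotor speed (rpm) or as relative centrifugal force in multiples of g (e.g. the ~400 g quoted for PRF), and the two are related by the standard formula RCF = 1.118 × 10⁻⁵ × r × rpm², with r the rotor radius in cm. A minimal sketch of the conversion; the 10 cm radius below is a hypothetical value for illustration, not the radius of any device named in the text:

```python
import math

def rcf_from_rpm(rpm: float, radius_cm: float) -> float:
    """Relative centrifugal force (in g): RCF = 1.118e-5 * r[cm] * rpm^2."""
    return 1.118e-5 * radius_cm * rpm ** 2

def rpm_from_rcf(rcf: float, radius_cm: float) -> float:
    """Invert the formula: rotor speed needed to reach a target RCF."""
    return math.sqrt(rcf / (1.118e-5 * radius_cm))

# Hypothetical 10 cm rotor radius, for illustration only
print(round(rpm_from_rcf(400, 10.0)))  # → 1892, rpm needed for ~400 g
```

This is why comparing protocols by rpm alone is misleading: the same rpm on rotors of different radii yields different g-forces, which is one reason centrifuge characteristics matter in the studies cited above.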
Conclusion
L-PRF treatment offers additional advantages: favorable effects on hard and soft tissue healing, reduced postoperative discomfort, simple harvesting, simplicity of use, no need for primary closure, and no risk of early membrane exposure. The economic implication for the final cost of a treatment also has to be taken into consideration. In vitro and molecular biology studies are very useful for understanding which molecules are present in the clot and for hypothesizing their role in the healing and regenerative process; however, more standardized clinical studies are needed to determine what quantity of growth factor is actually necessary to significantly improve regenerative processes. The literature's results are often discordant, and practitioners report different clinical experiences and mixed clinical outcomes. These unpleasant facts are due to a chaotic market and a lack of standardization of the procedure. Further research and clinical trials under a rigid protocol are needed to fully understand the potential and optimal effect of L-PRF in regenerative procedures. To read more articles in Journal of Surgery Please Click on: https://juniperpublishers.com/oajs/index.php For More Open Access Journals in Juniper Publishers Click on: https://juniperpublishers.com/journals.php
Juniper Publishers Indexing Sites List

ICI World of Journals: All journals may be registered in the ICI World of Journals database. The database gathers information on international scientific journals, divided into sections: general information, contents of individual issues, detailed bibliography (references) for every publication, as well as full texts of publications in the form of attached files (optional). Within the ICI World of Journals database, each editorial office may access, free of charge, the IT system which allows it to manage its journal's passport: updating the journal's information, presenting main fields of activity and sharing publications with almost 200 thousand users from all over the world. The idea behind the ICI World of Journals database is to create a place where scientific journals from all over the world undergo verification against 'predatory journal' practices by the scientific community. The database allows journals which care about the completeness and topicality of their passports to build their citation rates and international cooperation. https://journals.indexcopernicus.com/search/details?id=48074

Scilit: The name Scilit combines components of the words "scientific" and "literature". This database of scholarly works is developed and maintained by the open access publisher MDPI. Scilit is a comprehensive, free database for scientists that uses a new method to collate data and index scientific material. Its crawlers extract the latest data from CrossRef and PubMed on a daily basis, which means that newly published articles are added to Scilit immediately. https://www.scilit.net/publisher/4378

Publons: Publons is a commercial website that provides a free service for academics to track, verify, and showcase their peer review and editorial contributions for academic journals. It was launched in 2012, and by 2018 more than 500,000 researchers had joined the site, adding more than one million reviews across 25,000 journals.
Publons' mission is to "speed up science by harnessing the power of peer review". Publons claims that by turning peer review into a measurable research output, academics can use their review and editorial record as evidence of their standing and influence in their field. Publons says its business model is based on partnering with publishers. Publons produces a verified record of a person's review and editorial activity for journals. This evidence is showcased on reviewers' online profiles and can be downloaded for inclusion in CVs, funding and job applications, and promotion and performance evaluations. Publons also provides:
• tools for publishers to find, screen, contact, and motivate peer reviewers;
• data and publications about global peer review behaviour;
• peer review training for early-career researchers; and
• features for academics to discuss and evaluate published research.
https://publons.com/publisher/6250/juniper-publishers

Sindexs: Scientific Indexing Services (SIS) was founded by renowned scientists; a group of 70 scientists from various countries and disciplines started SIS with the specific objective of providing quality information to researchers. SIS offers academic database services to researchers, journal editors and publishers, focusing mainly on citation indexing and citation analysis, and maintains citation databases covering thousands of academic journals, books, proceedings and other approved documents. SIS provides quantitative and qualitative tools for ranking, evaluating and categorizing journals for academic evaluation and excellence. This factor is used for evaluating the prestige of journals. The evaluation is carried out by considering factors such as paper originality, citation, editorial quality, regularity and international presence.
SIS performs an in-depth analysis. The acceptance and rejection rates of journals can be a determining factor: journals with low acceptance rates and high rejection rates are considered the best and most prestigious, as their acceptance criteria are of a high quality standard. Many journals and societies have web pages that give publication data and style requirements and often include acceptance/rejection rates. The paper copy of the journal occasionally includes this data and will always provide current contact information. Whether a journal is indexed in the major indexing/abstracting services in the field is another criterion that can be used to assess its worth and quality.

ResearchBib: ResearchBib is an open access, high-standard indexing database for researchers and publishers. Research Bible may freely index journals, research papers, calls for papers and research positions. It aims to build research communities that discover and promote great research resources from around the world and maximize researchers' academic social impact.

Google Scholar: Google Scholar is a freely accessible web search engine that indexes the full text or metadata of scholarly literature across an array of publishing formats and disciplines. Released in beta in November 2004, the Google Scholar index includes most peer-reviewed online academic journals and books, conference papers, theses and dissertations, preprints, abstracts, technical reports, and other scholarly literature, including court opinions and patents. While Google does not publish the size of Google Scholar's database, scientometric researchers estimated it to contain roughly 389 million documents, including articles, citations and patents, making it the world's largest academic search engine in January 2018. Previously, the size was estimated at 160 million documents as of May 2014.
An earlier statistical estimate published in PLOS ONE, using a mark-and-recapture method, estimated approximately 80-90% coverage of all articles published in English, with an estimate of 100 million documents; it also determined how many documents were freely available on the web.

WorldCat: WorldCat is the world's largest network of library content and services. WorldCat libraries are dedicated to providing access to their resources on the Web, where most people start their search for information. You can search for popular books, music CDs and videos, all of the physical items you're used to getting from libraries. You can also discover many new kinds of digital content, such as downloadable audiobooks; article citations with links to their full text; authoritative research materials, such as documents and photos of local or historic significance; and digital versions of rare items that aren't available to the public. Because WorldCat libraries serve diverse communities in dozens of countries, resources are available in many languages. https://www.worldcat.org/search?qt=affiliate_wc_org_all&ai=Directory_ashok.drji%252540gmail.com&fq=&q=Orthopedics+and+Rheumatology+Open+Access+Journal&wcsbtn2w=Go

Crossref: Crossref (formerly styled CrossRef) is an official Digital Object Identifier (DOI) Registration Agency of the International DOI Foundation. It is run by the Publishers International Linking Association Inc. (PILA) and was launched in early 2000 as a cooperative effort among publishers to enable persistent cross-publisher citation linking in online academic journals. Crossref is a not-for-profit association of about 2000 voting member publishers who represent 4300 societies and open access publishers, including both commercial and not-for-profit organizations. Crossref includes publishers with varied business models, including those with both open access and subscription policies.
Crossref does not provide a database of full-text scientific content; rather, it facilitates links between distributed content hosted at other sites. Crossref interlinks millions of items from a variety of content types, including journals, books, conference proceedings, working papers, technical reports, and data sets. Linked content includes materials from Scientific, Technical and Medical (STM) and Social Sciences and Humanities (SSH) disciplines. The expense is paid by Crossref member publishers. Crossref provides the technical and business infrastructure for this reference linking using Digital Object Identifiers (DOIs), and offers deposit and query services for its DOIs. In addition to the DOI technology linking scholarly references, Crossref enables a common linking contract among its participants: members agree to assign DOIs to their current journal content and to link from the references of their content to other publishers' content. This reciprocity is an important component of what makes the system work. Non-publisher organizations can participate in Crossref by becoming affiliates; such organizations include libraries, online journal hosts, linking service providers, secondary database providers, search engines and providers of article discovery tools. https://www.crossref.org/06members/50go-live.html#Juniper+Publishers

ICMJE: The ICMJE Recommendations (full title: Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals) are a set of guidelines produced by the International Committee of Medical Journal Editors for standardising the ethics, preparation and formatting of manuscripts submitted for publication by biomedical journals. Compliance with the ICMJE Recommendations is required by most leading biomedical journals.
As of 2017, about 3,274 journals worldwide followed the Uniform Requirements.
http://www.icmje.org/journals-following-the-icmje-recommendations/

Scribd: Scribd began as a site to host and share documents. While at Harvard, Trip Adler was inspired to start Scribd after learning about the lengthy process required to publish academic papers: his father, a doctor at Stanford, was told it would take 18 months to have his medical research published. Adler wanted to create a simple way to publish and share written content online. He co-founded Scribd with Jared Friedman and attended the inaugural class of Y Combinator in the summer of 2006. There, Scribd received its initial $120,000 in seed funding and then launched in a San Francisco apartment in March 2007. Scribd was called "the YouTube for documents", allowing anyone to self-publish on the site using its document reader, which turns PDFs, Word documents and PowerPoints into Web documents that can be shared on any website that allows embeds. In its first year, Scribd grew rapidly, reaching 23.5 million visitors as of November 2008, and ranked as one of the top 20 social media sites according to Comscore. In June 2009, Scribd launched the Scribd Store, enabling writers to easily upload and sell digital copies of their work online. That same month, the site partnered with Simon & Schuster to sell e-books on Scribd; the deal made digital editions of 5,000 titles available for purchase, including books from bestselling authors like Stephen King, Dan Brown and Mary Higgins Clark. In October 2009, Scribd launched its branded reader for media companies including The New York Times, Los Angeles Times, Chicago Tribune, The Huffington Post, TechCrunch and MediaBistro. ProQuest began publishing dissertations and theses on Scribd in December 2009.
In August 2010, many notable documents hosted on Scribd began to go viral, including the California Proposition 8 ruling, which received over 100,000 views in about 24 minutes, and HP's lawsuit over Mark Hurd's move to Oracle.

Citefactor: Citefactor is a service that provides access to quality-controlled Open Access journals. The directory's indexing of journals aims to be comprehensive and to cover all open access scientific and scholarly journals that use an appropriate quality control system; it is not limited to particular languages or subject areas. The aim of the directory is to increase the visibility and ease of use of open access scientific and scholarly journals, thereby promoting their increased usage and impact.
https://www.citefactor.org/journal/index/14860/orthopedics-and-rheumatology-open-access-journal#.XeDgu-gzbIV

PlagScan: PlagScan is plagiarism detection software, mostly used by academic institutions. PlagScan compares submissions with web documents, journals and internal archives. The software was launched in 2009 by Markus Goldbach and Johannes Knabe. PlagScan is offered as Software as a Service and as an on-premise solution. Users can register either as a single user or as an organization. Upon first-time registration, single users receive a free test credit and can purchase additional credits for future submissions after the completion of a satisfactory trial. Organizational users verify the organization's address prior to using the software; an obligation-free quote can be requested immediately on the website. Organizations can choose from a variety of options and create multiple administrators and groups, for example to divide different departments within one institution. After scanning a submission for plagiarism, PlagScan provides users with a detailed report that indicates potential plagiarism and lists the matched sources.
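Crossref, described earlier, exposes a public REST API through which any DOI it has registered can be resolved to its bibliographic metadata (the `https://api.crossref.org/works/{doi}` endpoint). The sketch below is a minimal illustration of that lookup and is not part of any publisher's official tooling; the field names follow Crossref's documented JSON response format.

```python
import json
import urllib.request

def fetch_crossref_metadata(doi: str) -> dict:
    """Query the public Crossref REST API and return the 'message' record
    (the bibliographic metadata) for the given DOI."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["message"]

def summarize(message: dict) -> str:
    """Build a one-line citation summary from a Crossref 'message' record.
    Title and container-title are lists in Crossref's schema."""
    title = (message.get("title") or ["(untitled)"])[0]
    container = (message.get("container-title") or [""])[0]
    year = message.get("issued", {}).get("date-parts", [[None]])[0][0]
    return f"{title} - {container} ({year})"

# Example (requires network access; supply any registered DOI):
# print(summarize(fetch_crossref_metadata(some_doi)))
```

The `summarize` helper is a hypothetical convenience function for this sketch; real integrations would typically also handle HTTP errors and rate limiting.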
Genamics: Genamics is a software and web development firm dedicated to ensuring scientists have access to all the computer tools and resources available today. As science becomes increasingly reliant on the plethora of new ways computers improve our productivity, it is essential that scientists can readily apply this technology to their work. Our tight communication with scientists and computer technologists enables us to provide both down-to-earth and cutting-edge ways of achieving this goal. The products and services we create at Genamics are built on three core foundations: ease of use, high technology and future foresight.

1. Ease of use: At Genamics we have made every effort to design our products and services to be as easy as possible to use without compromising their power and flexibility. We have taken great care and thought in creating user interfaces that are highly intuitive and easy to understand. Perhaps most importantly of all, we listen to our users and respond to their suggestions and requirements. Only by this constant refinement can we create products that are genuinely friendly and fulfilling to our users.

2. High technology: Computers and biotechnology are perhaps the fastest-moving industries of our times. The utilization of the latest technology is a key factor in keeping our products and services at the forefront of their field. By adopting cutting-edge programming tools, we have been able to drastically reduce development time and gain the additional capacity to rapidly steer our applications in new directions. Our broad knowledge of computers and science allows us to select the best technologies to meet our goals and ultimately provide the best experience for our users. Applications built at Genamics are developed using a highly object-oriented approach.
This has allowed us to build up a large library of general components and controls, which can readily be re-used for new and upcoming projects. Our Visual J++ Developer Center provides the medium by which we maintain contacts and support with Visual J++ programmers on an international scale. Being largely open source, the Genamics Library mutually benefits programmers and us by allowing it to be extended in ways that would not be possible within a single company.

3. Future foresight: In the fast-moving industries we are involved in, predicting and understanding future directions is especially important to us. Consequently, the solutions we create do not just solve the problems of our customers today, but are also ready to handle the problems of tomorrow. The products and services we build are designed within a highly open framework, with many of these future considerations in mind. Similarly, the tools and technologies we adopt are chosen not just for how much can currently be achieved with them, but also for their own future potential and capacity to meet future challenges. At Genamics we are continually prospecting for new innovations and technologies, and we already have a number of exciting new projects underway which we hope to bring to you in the near future.

Semantic Scholar: Semantic Scholar is a project developed at the Allen Institute for Artificial Intelligence. Publicly released in November 2015, it is designed to be an AI-backed search engine for scientific journal articles. The project uses a combination of machine learning, natural language processing and machine vision to add a layer of semantic analysis to the traditional methods of citation analysis, and to extract relevant figures, entities and venues from papers.
In comparison to Google Scholar and PubMed, Semantic Scholar is designed to highlight the most important and influential papers, and to identify the connections between them. As of January 2018, following a 2017 project that added biomedical papers and topic summaries, the Semantic Scholar corpus included more than 40 million papers from computer science and biomedicine. In March 2018, Doug Raymond, who developed machine learning initiatives for the Amazon Alexa platform, was hired to lead the Semantic Scholar project. As of August 2019, the number of included papers had grown to more than 173 million after the addition of the Microsoft Academic Graph records.
https://www.semanticscholar.org/

DRJI: DRJI provides ready access to education literature to support the use of educational research and information to improve practice in learning, teaching, educational decision-making and research. The Directory of Research Journals Indexing is a free online service that helps you find web resources for your articles and research. With millions of resources available on the Internet, it can be difficult to find useful material; we have reviewed and evaluated thousands of resources to help you choose key websites in your subject. Our indexed journals are submitted to social networks and to the world's leading indexing services, and they are displayed in major electronic libraries. In short, all journals reach all continents.
http://olddrji.lbp.world/JournalProfile.aspx?jid=2473-554X

ORCID: The ORCID (Open Researcher and Contributor ID) is a nonproprietary alphanumeric code that uniquely identifies scientific and other academic authors and contributors. It addresses the problem that a particular author's contributions to the scientific literature, or publications in the humanities, can be hard to recognize: most personal names are not unique, names can change, cultural conventions of name order differ, first-name abbreviations are used inconsistently, and different writing systems are employed. ORCID provides a persistent identity for humans, similar to that created for content-related entities on digital networks by Digital Object Identifiers (DOIs). The ORCID organization, ORCID Inc., offers an open and independent registry intended to be the de facto standard for contributor identification in research and academic publishing. On 16 October 2012, ORCID launched its registry services and started issuing user identifiers.

BASE: BASE is one of the world's most voluminous search engines, especially for academic web resources. BASE provides more than 150 million documents from more than 7,000 sources, and the full texts of about 60% of the indexed documents are freely accessible (Open Access). BASE is operated by Bielefeld University Library. It indexes the metadata of all kinds of academically relevant resources - journals, institutional repositories, digital collections etc. - which provide an OAI interface and use OAI-PMH for providing their contents (see the Golden Rules for Repository Managers). The index is continuously enhanced by integrating further sources (you can suggest a source which is not yet indexed). Several new features are in development, such as a claiming service for authors within the ORCID DE project. BASE is a registered OAI service provider. Database managers can integrate the BASE index into their local infrastructure (e.g. meta search engines, library catalogues).
Further on, there are several tools and services for users, database managers and repository managers.
https://www.base-search.net/Search/Results?lookfor=juniper+publishers&name=&oaboost=1&newsearch=1&refid=dcbasen

Sciforum: Sciforum is an event-planning platform that supports open science by offering the opportunity to host and participate in academic conferences. It provides an environment for scholarly exchange, discussion of topics of current interest, building of networks and establishing collaborations. Sciforum was launched in 2009 by MDPI, an academic open-access publisher with headquarters in Basel, Switzerland. Sciforum not only offers the possibility to participate in conferences, but also invites scientists to organize their own. Organizers reduce their administrative effort thanks to an online tool that supports all aspects of conference organization, including setting up and maintaining the conference website, managing the peer-review process, publishing the conference proceedings, and handling and coordinating the conference schedule, registration, billing, sponsors, etc. Organizers can choose between physical and online conferences and whether they require administrative support from Sciforum staff.

ScienceOpen: ScienceOpen is an interactive discovery environment for scholarly research across all disciplines. It is freely accessible for all and offers hosting and promotional services within the platform for publishers and institutes. The organization is based in Berlin and has a technical office in Boston. It is a member of Crossref, ORCID, the Open Access Scholarly Publishers Association, the STM Association and the Directory of Open Access Journals.
The company was designated as one of "10 to Watch" by research advisory firm Outsell in its report.
https://www.scienceopen.com/user/9f93b14f-c289-4e21-b3c8-8f448ea424ab

CiteSeerX: CiteSeerX is a public search engine and digital library for scientific and academic papers, primarily in computer and information science.

Sindexs: Scientific Indexing Services (SIS) is a journal indexing directory that maintains lists of indexed journals.
http://sindexs.org/JournalList.aspx?ID=6090

Get more information about Juniper Publishers by clicking on the link below:
https://juniperpublishers.com/member-in.php
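The ORCID iDs described above are not arbitrary strings: the final character is a check digit computed with the ISO 7064 MOD 11-2 algorithm, which ORCID documents publicly. Below is a minimal validation sketch (an illustration, not an official ORCID library):

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the ORCID check character (ISO 7064 MOD 11-2) for the
    first 15 digits of an ORCID iD."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    # A result of 10 is represented by the letter 'X'.
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD such as '0000-0002-1825-0097'."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_check_digit(digits[:15]) == digits[15]
```

For example, ORCID's own published sample iD 0000-0002-1825-0097 validates, while changing any digit breaks the checksum.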
Juniper Publishers | Novel Contemporary Guidelines and Dimensions of Occupational Psycho-Social Stress, Psychological Risk Factors and Patterns of the XXI Century: Time is in us and we are in time. It changes us and we change it
Journal of Psychology - Juniper Publishers
Introduction
Generations of the XXI century, living in a complex, globally oriented world, are exposed not only to the traditional, well-researched model of psycho-social job stress but also to novel guidelines, dimensions, schemes and characteristics of occupational stress. Psycho-social characteristics are bound and linked with the human value system and human nature. Human values are intransient and can be retained and maintained across generations, but the processes of historical and social development change the nature of the human value system in today's complex, globally oriented world. The major psychological risk factors, guidelines, dimensions and patterns of the XXI century depend on recent stress-induced social and political processes and are dictated by the new imperatives of our modernity. Socio-political processes affect and shape the lives and behavior of modern generations in normal conditions and particularly under exposure to life and occupational stress. Why, then, should individuals think about the changes arising in worldly existence and working life today, and about their future consequences? Is there anything special taking place today compared to what previous generations have suffered and experienced? Do we need to learn from the experience, knowledge and complicated historical development, and the related mistakes, of previous generations? Are we to believe in the eternal human values? Yes, I think there is, and we do. Advances in science, information technology, automation and all spheres of intellectual, occupational and social development of the society of the XXI century have inspired us with the limitless possibilities of the human spirit, intellect and creativity.
Discussion
Major occupational psycho-social stressors, psychological risk factors, guidelines, dimensions and patterns of the XXI century
Differentiation and discrimination of risk factors and risk occupations. Genesis, exacerbation and deterioration of health status of risk individuals. Studying physiological regulatory mechanisms inducing diseased states:
One of the methods for assessment of occupational psycho-social stress that we apply in our work is the NIOSH job stress questionnaire [1], adapted for Bulgarian conditions and language. Applying this questionnaire to individuals from different professional groups - operators of chemical plants, air traffic controllers, telecommunications operators, military pilots, computer operators, healthcare employees, sound producers, employees of the gas construction industry and many other contingents - allows their comparison and facilitates the differentiation and discrimination of risk occupations and of risk factors for triggering, developing and accelerating the stress response, and for determining working performance. Another important method for the study and assessment of occupational psycho-social stress that I apply in my research is the job stress questionnaire that I translated and adapted for Bulgarian conditions and language, and with which I am authorized to work in Bulgaria by the Job Stress Center. Apart from the basic traditional psychological approach to occupational stress - assessment of psycho-social risk factors at specific working places on the one hand, and of the impact of psycho-social factors in the work and living environment on the degree of subjectively experienced stress (mental strain, psycho-somatic complaints) on the other - we are witnessing novel guidelines, dimensions and patterns dictating a different scheme and characteristics of occupational stress. A growing emphasis in medical research and in the screening of risk factors is the essential role of psycho-social stress in the genesis, exacerbation and deterioration of the health status of risk contingents of individuals, and in determining their performance. For example, the NORA occupational research program of NIOSH emphasizes the physiological regulatory mechanisms (autonomic, immunological, hormonal, neural) that mediate and induce a wide range of diseases as a result of the impact of occupational stress [2,3]. The functional role of autonomic cardiovascular control (investigated by analysis of Heart Rate Variability) as a mechanism mediating the risk of cardiovascular diseases (CVD) under psycho-social occupational stress has been studied in foreign [4-6] and our own [7] research. In this regard, the psycho-physiological significance of Heart Rate Variability (HRV) lies in the possibility of using it as a reliable method and indicator for occupational stress and workload research; for determining the pattern of functional strain when studying the relationship between autonomic cardiovascular control and psychological and social work-related factors; for identifying those psycho-physiological dependencies that indicate functional strain and exhaustion and that might induce CVD; and for assessment and prediction of cardiovascular risk and health promotion at the working place [2-5,8,9]. In our research we are interested in determining those psycho-social stressors of the working environment that accelerate the process of functional strain and exhaustion under chronic prolonged exposure and mediate diseases with stress etiology: cardiovascular diseases, depression and neurosis. The process of functional strain is characterized by stress-induced effects on cardiovascular functional status mediated by neural control mechanisms; it might be accelerated under the influence of psycho-social occupational stress factors and might lead to CVD. In our study on cardiovascular risk and psycho-physiological assessment of occupational stress in shift-working telephone operators, we observed a functional dependence of mean heart rate on skill utilization, and of short-term heart rate variability on cognitive skills and skill utilization.
The indicated factors, skill utilization and cognitive skills, might exert a stress-generating effect, affect the autonomic parasympathetic function and induce CVD. These results are in agreement with other studies revealing an association between psycho-social occupational factors and the cardiovascular functional status controlled by the autonomic and central neural mechanisms mediating CVD incidence, and support the suggestion of Kristal-Boneh (1995) that HRV might be used for assessment and prediction of cardiovascular risk and performance under the effect of cognitive load, stress and psycho-social occupational risk factors.
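Time-domain HRV indices of the kind referred to above (mean heart rate as an overall strain marker, and short-term variability as a marker of parasympathetic function) can be computed directly from a series of RR intervals. The sketch below is a generic illustration of the standard SDNN and RMSSD formulas, not the authors' actual analysis pipeline:

```python
import math
import statistics

def hrv_time_domain(rr_intervals_ms):
    """Compute basic time-domain HRV indices from RR intervals (in ms):
    mean heart rate, SDNN (overall variability, here the population SD of
    the RR series) and RMSSD, a standard marker of short-term,
    parasympathetically mediated variability."""
    mean_rr = statistics.mean(rr_intervals_ms)
    mean_hr = 60000.0 / mean_rr                  # beats per minute
    sdnn = statistics.pstdev(rr_intervals_ms)    # SD of all RR intervals
    # RMSSD: root mean square of successive RR differences
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(statistics.mean(d * d for d in diffs))
    return {"mean_hr": mean_hr, "sdnn": sdnn, "rmssd": rmssd}
```

For example, for the (made-up) RR series [800, 810, 790, 805, 795] ms this yields a mean heart rate of 75 bpm; a real analysis would first clean artifacts and ectopic beats from the RR series.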
Socio-economic and technological changes affecting working activity and health:
Another new dimension and trend: as opposed to the 1950s, when psychological studies focused primarily on the difficulties and adaptation of employees to the working environment, present studies highlight and emphasize major socio-economic and technological changes affecting working activity, and their effect on the psychological, functional and social status of individuals.
Socio-political transition:
In each new generation, the idea is underlined many times and in many ways that people think and feel they were born and live in a very special period of human history, and that living conditions are changing faster than ever before. Old ideas bound to the corresponding public and political development are replaced by new ones with increasing speed. If this is true, then each new generation is actually exposed to a faster pace and rhythm of life than previous generations. In Bulgaria, my generation lives in the years of socio-political transition from a totalitarian communist regime to a democratic one. The generation before us, that of our parents, also lived through years of complex socio-political transition, twice: first from a democratic regime and monarchy to a totalitarian communist regime, and second from the totalitarian communist regime back to a democratic one. The decision of Franklin D. Roosevelt, Winston Churchill and Joseph Vissarionovich Stalin at the Yalta Conference was detrimental and fateful for the lives of millions of people worldwide. It is still difficult to overcome the wounds of the decision to redistribute Europe and to shake off the order imposed by the old communist dictatorship for the management of post-totalitarian societies in Eastern and Central Europe in the period 1944-1989. These facts impose a high mental load on the cognitive functioning of free-thinking people and of those generations of individuals who do not profess communist ideology, and acquire novel dimensions of stress. The cognitive mental load of these categories of people deepens and increases in all spheres of intellect, and the resulting action of the thinking working man determines the concrete and global integrated thought, worldview and ideology of the human being.
The new socio-political experiment of the XXI century:
The new socio-political experiment of the XXI century is a new form of communism, transformed into socialism, which is professed and declared by some categories of people in post-totalitarian societies and in Western societies. In the ideology of socialism the essence of the old communist ideology is encoded, dressed in a new form that does not change and that is not detected or understood by people living in Western societies. The transformation is invisible and unnoticeable, especially for some representatives of Western societies and for the youngest generation of post-totalitarian societies who have not lived under communism, and it hides and brings a risk to democratic values, the democratic order and their stability. A further fact and characteristic of the new socio-political experiment of the XXI century is the newly formed socio-political religious order, which is based on dictatorship and terrorism and which causes, and will continue to cause, an increasing burden of human mental load and psycho-social stress. These transformations in socio-political life characterize other novel dimensions of psycho-social stress and cognitive load.
Fanatical human intolerance of retrograde regimes toward democratic countries, the strongest democratic regimes of the XXI century and democratic institutions:
In our modernity, the increasingly fanatical, frantic and frenetic intolerance of retrograde regimes toward the strong democratic states and regimes of the XXI century stands out and takes shape: intolerance toward their democratic value system, their prosperous and highly developed economies, their innovative technologies, ideology, religion and culture. This is an act of attack on civilization; intolerance degenerates and converts into an attack on civilization. This intolerance became apparent only after the beginning of the terrorist attacks on democratic Western societies and their value system. The human value system of Western societies had not allowed for, or assumed, the existence of a primitive "value system" and the consolidation of a world of evil. The barrier between the two value systems is hard to overcome, and the failure to cope with the retrograde mindset loads the human psyche and creates a novel guideline, level and form of stress.
The new sense of the generations of the XXI century, which live under a democratic regime and value system, faces new imperatives, missions, roles, political compromises and shattering, shocking political developments:
The new mind and way of thinking of the generations of the XXI century are faced with a rapidly changing socio-political reality. This reality dictates new imperatives, creates new roles, demands complex missions and political compromises, and in certain cases socio-political developments undergo shattering shock effects. Awareness of these guidelines and collision with these trends cause a pronounced degree and level of psycho-social stress, and burden cognitive workload and functioning.
Generations of the XXI century face a novel level and guideline of psycho-social stress due to continuously growing and unmanageable conflicts, refugee and immigrant crises, and the lack of stability and balance:
The democratic institutions and the generations of the XXI century are facing a new level and guideline of psycho-social stress due to the lack of stability and balance arising from the obscurity and uncertainty surrounding refugee waves, immigrant crises and the unmanaged conflicts of our modernity. Some countries are, or have always been, prominent generators of foreign conflicts or of their suppression.
Parallel and simultaneous life and work under a constantly growing risk of terrorist attacks:
Our generation must reckon with avant-garde but at the same time detrimental patterns in the field of psycho-social stress and with the new face of occupational stress that has arisen after the terrorist attacks on the World Trade Center in New York and the Pentagon in 2001, the terrorist attacks in Paris in 2015 and in Nice during the National Holiday of France in 2016, the terrorist attacks in Germany in Munich, Berlin and Cologne in 2016, and the terrorist attack in Brussels, Belgium, in 2016. The strong democratic states and regimes of the XXI century and their democratic value system are exposed to this type of stress to a greater extent.
The role of the level of control in cognitive functioning under risk of exposure to terrorist attacks:
The role of terrorism as a constant stressor, and the newly emerging stressor of parallel and simultaneous life and work under a constantly growing risk of terrorist attacks, should be reviewed by the theorists of psycho-social occupational stress as a new guideline of stress. The theories are strong and well developed, but practice is different. Consider the combination of high working demands and a high level of control, which describes an active working situation, and the combination of low working demands and a high level of control, which describes work with a low level of strain: when we add exposure to the threatening risk of terrorist attacks, we have to ask whether the working situation will remain active, or, in the second combination, whether the work will continue to be characterized by a low level of strain. We also need to revisit the response, and the ratio of the level of control to the exposure to threatening risk of terrorist attacks, in the two remaining job situations: high demands combined with low control over the working process, which creates high occupational stress, and a low level of working demands combined with a low level of occupational control, which describes passive work. The response of control, one of the most vulnerable variables and stressors, to the risk of terrorist attacks will be crucial for the emerging level of cognitive load of the working population. This novel form of stress significantly increases the required level of cognitive functioning, with consequences for the mental and functional status of working individuals.
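The four demand-control combinations discussed above follow the classic job demand-control (Karasek) scheme; as a compact summary of that mapping, here is a simple classifier (an illustrative sketch of the model's quadrants, not a validated assessment instrument):

```python
def demand_control_quadrant(high_demands: bool, high_control: bool) -> str:
    """Map a job situation onto the four classic demand-control
    combinations referred to in the text."""
    if high_demands and high_control:
        return "active work"          # high demands, high control
    if not high_demands and high_control:
        return "low-strain work"      # low demands, high control
    if high_demands and not high_control:
        return "high-strain work"     # high demands, low control
    return "passive work"             # low demands, low control
```

The text's open question is, in effect, whether an added third dimension (exposure to terrorist risk) leaves these labels intact or shifts every quadrant toward higher strain.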
Human value system and psychological and social characteristics of totalitarian societies and societies with an unknown face:
When we analyze the dependence of terrorism on human factors, we should not ignore, or underestimate, the mentality of the "little man" and of his advisors and leaders, nurtured on malicious feelings, nor the socio-political environment in which he lives and builds. The psychological, social and community-based foundations of totalitarian societies and societies with an unknown face create conditions for the dominance of this kind of people. The system and nature of human values in democratic, well-developed Western and post-totalitarian societies are preserved and thoroughly organized. The human value system of totalitarian societies and societies with an unknown face is also retained for most categories of people. But there are some categories in which the human value system and human nature are dictated and confessed, bound to the corresponding socio-political development, which changes the mentality and creates people who unshakably believe in and profess ideologies detrimental to the democratic community, creating its hidden and evident enemies. Such ideological elements were also observed in former post-totalitarian societies. The European Union, NATO, the United Nations, the world democratic community and the new democratic institutions of the XXI century have defended the enduring principles of civilization: freedom, progress, pluralism and tolerance, without which humanity cannot continue to develop normally. The psychological and social characteristics and value system of the indicated categories of the above-described societies are deeply embedded and rigid, even impossible to change, and form new guidelines of stress. To live and work with this system is a real and major challenge to the democratic world, since its roots run deep and have been built up over many generations and decades.
Stressors that induce novel socio-political processes:
In the recent decades in which the generations of the XXI century live and work, significant changes have occurred in occupational life, inducing novel dimensions and patterns of psycho-social stress. The more significant professional changes, such as the introduction of new technologies at work; the accelerated transition from physical workload to mental, cognitive workload; the generation and triggering of occupational stress and work under exposure to job stressors; work in functional and psychological interface and collision with automated and intelligent systems; the topic, extremely significant for our times, of life and work under a constantly growing threat of terrorist attacks; recent complicated social and political dimensions; and the opening of new ways of communication, together induce a new socio-political process of novel challenges to the human spirit and creativity. A key theme of medical science and ethics in the XXI century is the consideration and assurance of optimal health status, optimal physical and psychological efficiency, performance and work ability, and highly reliable expertise and professionalism, in order to consolidate freedom in the world, preserve human peace, guarantee the defense of safety and security against terrorism and tyranny, and preserve the principles and values of the world democratic community. A work by Giddens offers a description of some social aspects of self-identity and modernity in the late XX century, which, however, has hues and shades of egocentric abstract thinking and lacks thought for our reality. It is appropriate to make an adequate comparison between the stress-induced psycho-social stress of the social and political reality of the XX century, especially of its last half and last decades, when important changes and shocks occurred in socio-political processes, and that of the XXI century.
The beginning of the XXI century, and the modern times in which we live, sharply accelerates and transforms social and political processes, and life and psycho-social occupational stress acquire novel guidelines, dimensions, shapes and tendencies. It is difficult to determine the level of stress in this comparison, and which stressors will weigh more: those of the XX century or those of the XXI.
Characteristic patterns of occupational psycho-social stress of the post-totalitarian society. Similarities and differences between post-totalitarian societies and Western societies:
Societies change their pace and characteristics of development. Post-totalitarian societies have come a long way, and this development affects how their people live and work. Regardless of potential similarities between employees (the human being/operator) in their adaptation to the increasing demands of work, there are pronounced differences between the occupational tasks posed by complex, professionally bound and environmentally influenced systems in well-developed Western societies and those in post-totalitarian societies. The difference lies in the high level of psycho-social stress in the latter, and this increased level of stress acts on mental and cognitive processes. Operators and workers in Western societies also experience stress, but in most cases it concerns a particular social group rather than the entire society, and the stressors are different. In those societies, social and economic processes evolved and settled gradually, in an evolutionary manner, and did not cause the kind of conflict and upheaval seen in our society. In the post-totalitarian society we are witnessing a revolution in thinking, work and life, as the totalitarian way of life is totally rejected. Yet work is now performed under conditions of economic collapse, which makes adaptation to the new requirements difficult: the need to acquire new skills, knowledge and abilities; working with new technologies and systems; the withdrawal of the state's role in employment and social status; staff reductions in enterprises and in government-supported and related structures; and so on. The health situation of individuals in Central and Eastern Europe, and its relationship with social and economic factors and with risk factors for cardiovascular and other diseases, has been discussed extensively. The essential difference between post-totalitarian and Western societies is that the former passed through, and experienced, a communist regime as a consequence of the Yalta Conference.
This circumstance is the reason for the high level of life and occupational psycho-social stress. Post-totalitarian societies are by nature democratic societies, but they were subjected to a repressive totalitarian communist regime and to the obscurantist terror that originated from it. In our modernity, when totalitarian societies still exist and societies of obscure and unknown origin and face have appeared, concealing risks of terrorism, terror and terrorist attacks, post-totalitarian societies can be very useful to the contemporary democratic process: their rich experience of the totalitarian past can help determine the correct course of the contemporary democratic socio-political process. We remember the past, we do not forget it, and knowing the history between 1944 and 1989 we appeal to thinking people that it must not happen again, because it conceals terror and fanaticism, which generate primitivism. The traces of these events are still fresh, as is the memory of the people who passed through the communist concentration camps and of those who never returned. During those years about 500000 people passed through the camps in Bulgaria without charge, trial or judgment. From 1944 to 1962, 44 concentration camps operated in Bulgaria; among the most notorious were those at the Rosica dam, the uranium mines in Buhovo, Belene, Saint Vrach, Kutsiyan, Dupnitsa, Bobovdol, Lieutenant Chunchevo, Bosnia, Nojarevo, Chernevo, Green dol, Zagrad, Skravena, Koprinka, Lovech and many others. The number of people killed by the communist government without trial, court or judgment immediately after 9.09.1944 is unknown, as is the number killed in the concentration camps. Those killed and detained in the camps were politicians, intellectuals, clerics, students, officers and wealthy Bulgarians. The motto of the camps echoed the words of Stalin, "Where there is a man, there is a problem; no man, no problem!", and of Dzerzhinsky, "The enemy is never educated; he is destroyed, always and everywhere!"
Many of those killed, and of those who died in these camps, were fed to pigs so that no traces would remain, and bestial methods of torture and murder were applied. Much has been written about the horrors of these camps; many survivors have told of the tragic fates, and their accounts are collected in thick volumes. The Bulgarian death camps were created on the Soviet model. Russia had extensive experience of totalitarian communist regime and terror in the creation of concentration camps after the Bolshevik Revolution of 1917. Alexander Solzhenitsyn describes the terror, repression and killings in the Bolshevik and communist death camps of the GULag, after their creation in 1918, in his book The GULag Archipelago. The hallmark and distinctive mark of the repressive system of communism, and its most powerful metaphor, became the GULag: the network-conglomerate of concentration camps in the USSR. The GULag allows even those who know nothing of Marxism-Leninism to face the terrible statistics of genocide that characterize the "most humane social order" in human history. We risk repeating the past if we do not know it. Knowing this history and these facts helps us to comprehend, assimilate, sift and realize the actual causes, guidelines, dimensions and patterns of the psycho-social stress of our modernity, and the guidelines of the contemporary socio-political process. The distinction of the stress of our times is that the transformations in socio-political processes notably and significantly increase the level of stress in both post-totalitarian and Western societies.
Genetic and cognitive information in different societies:
Genetics and consciousness transfer from the past to modernity information that is concrete and essential for the socio-political process, and which induces a novel guideline and level of psycho-social and life stress. Societies differ from one another precisely in the specific, concrete nature of this information; qualitative changes in it lead to qualitative changes in the society to which it relates. Notions of historical fate form the basis of historical memory, upon which a certain volume of real facts and events creates and maintains, in the individual, a sense of belonging to his community. When the community has its own political organization, the consciousness of belonging among its members merges with the consciousness of state affiliation.
The impact of intellectual, political, social, and electronic revolutions on human mental workload:
Mental workload, like stress related to environmental and psychological factors, affects cognitive functioning, as reflected in the human functional state and in the performance of working tasks. The functional response of operators and workers is determined by the balance between the demands placed on the individual, the resources available, and the ability to cope with the requirements of working tasks. One important reason why I consider the last decades of human development outstanding is the intellectual, political, social and electronic revolutions that dramatically changed the lives of people within the last generation. This pattern, however, developed and built up gradually: it took considerable time before large groups of people were able to work with, and use, the results and outcomes of these revolutions (ideas, creativity, productivity, innovation, technology, equipment, facilities, etc.). The impact of the intellectual, political, social and electronic revolutions on human mental workload is a novel guideline and pattern of life and psycho-social occupational stress.
Psycho-social risks affecting health status:
Occupational stress at the working place is a serious risk factor for the psychological and physiological health of working people, and the consequences of its effects appear in other areas: effectiveness of work; morale; the nature and quality of human and collegial relationships; the socio-medical context of work; performance; and the quality and level of social and medical health. A number of research studies have found that stressors comprise all the demands that the occupational environment and working place present to people. These in-depth studies, and the extensive discussion arising from them, conclude that individuals practicing occupations characterized by a high level of job demands (workload, role conflict, high responsibility, cognitive demands, skills utilization, and interpersonal conflicts within and between working groups), insufficient opportunity to participate in the management of the working process, a low degree of control over the working process, a low level of social support from supervisor, colleagues, family and friends, insufficient job satisfaction, and underload or overload, are at risk of developing cardiovascular diseases and other conditions affecting health status (ulcer disease and other gastrointestinal pathology, neurosis, disturbance of the autonomic nervous system, and others), damage to psychological condition (depression, post-traumatic stress disorder, anxiety, fatigue, hostility, social isolation, and others), and changes in behavioral activity (smoking, excessive drinking, accidents, and others). Psycho-social risks affecting health status, psychological condition and behavioral activity form new dimensions of stress. According to the programme of the European Union, reduction of occupational stress in the workplace and in the working environment is not only a moral and legal matter.
Reducing this risk factor also has an economic, social and medical impact, and an effect on the quality of public health. A good economic and social impact presupposes an adequate health and medical status, performance of working activity, work ability and safety.
Work in the large-scale, complex, sophisticated human-socio-technical system. Factors affecting the failure and collapse of the system:
Conditions in the more developed countries are complex and cover interactions between individual, organizational, social, and international political and social processes. A novel contemporary dimension and pattern of occupational activity is work in the large-scale, complex, sophisticated human-socio-technical system, forming the interaction human/operator - social characteristics - automation/technology. Progress in the scientific medical, psychological and social cognition of the human being, the design and development of new information and intelligent technologies and agents, the introduction of automation into work activity and into all areas of avant-garde intellectual professional thought, and the dynamics of social development in the XXI century induce novel guidelines of work in today's global world. Contemporary working life provides opportunities to improve our economies and health, as well as to control and manage novel health risks. Physical occupational risks are declining, but mental and musculoskeletal diseases and injuries are increasing significantly, especially among young women and groups of people with social and economic problems. Contemporary theoretical and practical issues treated by research in the psycho-biology and physiology of occupational stress include the impact of mental workload and occupational stress in the complex human/operator - socio-technical system. To predict future problems and to prevent difficulties in the interaction and coordination of the human/operator - automation system, it is necessary to examine dynamically the three components of the complex human/operator - socio-technical system: the human/operator, the social characteristics, and the automation/technology.
To understand the success and collapse of the complex human/operator - automation system, we must determine and evaluate the effects of mental workload and stress in complex operating environments, explore psycho-social factors in the work environment, and design, control and regulate automated and intelligent systems so that they promote and enhance the work activity of the human/operator and ensure the security and safety of life and work. We defined a model discriminating human-system from human-computer interaction: controllability of the work situation, quantitative workload, variance in workload and work satisfaction discriminate human-system from human-computer interaction. The functional dependence of cognitive demands and work satisfaction on vagal activity, assessed by the spectral power of cardio intervals in the respiratory sinus arrhythmia band (Prsa), could promote future research on the association between cognitive workload, work stress and performance, considering the role of Prsa as an indirect measure of performance and as a direct indicator of the cardiovascular protective function. In our study we observed a functional relationship between cognitive workload, assessed by heart rate variability and health risk, and work stress, measured with work-related psycho-social factors. Our result is consistent with that of Hockey et al., who reported a functional association between workload and work stress. The present century poses demands for adequate functioning, interaction and coordination between human beings and intelligent systems. Each side of this interaction can contribute to work safety and defense against terrorism, and to the elimination of critical micro- and macro-ergonomic errors in the performance of strategic and/or occupational tasks.
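In practice, the spectral power of cardio intervals in the respiratory sinus arrhythmia band (Prsa) mentioned above is computed from a recorded RR-interval series. The sketch below is a minimal illustration of that computation, not the authors' actual procedure: the 0.15-0.40 Hz band limits, the 4 Hz resampling rate and the Welch settings are common conventions assumed here, and `rsa_band_power` is a hypothetical helper name.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.integrate import trapezoid
from scipy.signal import welch

def rsa_band_power(rr_ms, band=(0.15, 0.40), fs=4.0):
    """Spectral power (ms^2) of an RR-interval series in the RSA band."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    beat_times = np.cumsum(rr_ms) / 1000.0          # beat instants in seconds
    # Resample the unevenly spaced tachogram onto a uniform time grid,
    # since Welch's method assumes evenly sampled data.
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = interp1d(beat_times, rr_ms, kind="cubic")(grid)
    rr_even -= rr_even.mean()                       # remove the DC offset
    f, psd = welch(rr_even, fs=fs, nperseg=min(256, len(rr_even)))
    in_band = (f >= band[0]) & (f < band[1])
    return trapezoid(psd[in_band], f[in_band])      # integrate PSD over band
```

An RR series modulated at a typical respiratory frequency (around 0.25 Hz) concentrates its power inside the band, while slower oscillations contribute little, which is what makes this band power usable as an index of vagal activity.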
Past examples of failure and collapse of such systems stem from non-compliance with the critical factors: damage and failure in system components and their interactions; defects and disorder in the system; disturbance of environmental and work-safety factors; and failure to respect human factors, as in the accidents at Chernobyl, Bhopal, Three Mile Island and the Exxon Valdez. We in Bulgaria continue, and will continue, to experience consequences for our health status from the Chernobyl accident, which resulted from human error and lack of competence. In addition to the high incidence of cardiovascular diseases, a trend of increasing neoplasms is observed in our country. Such data were found in a study of the impact of occupational stress on the health status of sound operators working at the Bulgarian National Radio; the high incidence of neoplasms is attributable both to the Chernobyl accident and to the high level of psycho-social occupational stress.
Patterns of cognitive load. Sub-optimal functional status could induce deterioration and degradation in performance and risk of complications of health status:
Optimization and innovation are critical processes for successful work performance in complex systems. Work within automated human/operator-socio-technical systems is characterized by the potential variability of system components and parameters. Successful management of automated systems depends on the capacity of the human/operator to act and make decisions under normal, sub-normal and emergency situations. As a consequence, the role of the operator has changed from one who must follow certain rules and laws to one who solves problems and takes action, and the required skills and abilities have shifted from psychomotor actions to cognitive actions and problem solving. Highly automated tools, introduced into the system to help the human/operator make decisions, impose novel cognitive and informational demands for action at work. Monitoring and management of the system over prolonged periods induce a high level of mental workload, which increases further as a result of the cognitive effects of monitoring and managing automated processes. The implementation of professional tasks in interaction between the human operator and automated and intelligent systems with cognitive content, where the cognitive load they impose can change the functional state and, under prolonged impact, induce a sub-optimal functional state, is a new guideline and pattern of psycho-social stress. Variations in workload require determination of the cognitive capacity needed for the operation and functioning of the operator. An essential novel element of regular human interaction with automated systems is the evaluation and prediction of the optimal functional status of the operator in the working environment. A sub-optimal functional status can induce deterioration and degradation in performance and a risk of complications of health status.
Specific psycho-physiological measures are reliable indicators for the research and assessment of mental workload and stress. These indicators reflect changes in the level of mental effort and in the mental and cognitive load of the operator; they reveal the level of mental effort required of the operator to maintain the necessary degree of cognitive performance. Testing and study of the human factor, and the evaluation of complex automated civilian, military and aerospace systems, rest on specific, specialized psycho-physiological indicators. These indicators assess the major effects of compensatory effort in performing job tasks and work activities. Performance of mental tasks requiring a compensatory effort is associated with significant changes in psycho-physiological parameters during task execution under laboratory and field conditions, and it is hypothesized that these changes reflect a defense reaction related to disturbance and reduction of the mechanisms of regulation. In working activity involving interaction with highly automated tools that impose cognitive load, operators are exposed to the influence of psycho-social factors of the work environment. Work involving an interface with automated systems requires a greater degree of activation of cognitive resources: memory committed to the performance of the working task or activity, attentional processes, decision making and planning. The high level of cognitive demands required for the performance of work can cause a stress response if it is associated with a low level of control over the work environment. The requirements of work in interaction with automated and intelligent systems expose the human/operator to an increased level of mental workload and work-related occupational stress.
The goal in work with automated systems is to achieve an optimal functional state, to regulate and optimize the level of workload, and to improve performance. Prolonged work under increased workload and occupational stress, as well as work activity in critical conditions, can induce a dysfunctional degree of regulation. A disturbed level of regulatory control is associated with a sub-optimal functional state, which can cause degradation and failure in performance and collapse of the system.
Conclusion
The professional work environment focuses, and at the same time reflects through its prism, the complex socio-political processes of transformation in modern society, and it determines the behavior and actions of people under optimal, sub-optimal and critical conditions and under the effects of cognitive workload and occupational psycho-social stress. The human operator working in the complex human/operator-socio-technical system is a critical element of this transformation. On the one hand, this complex interaction leads to new efficiency, operativeness and performance and, most importantly, to new opportunities for human and social progress. On the other hand, the socio-political characteristics of the complex, globally focused world in which we live delineate another group of risk factors, social, professional and psychological, that reduce the opportunities for psychological and physiological human adaptation. The work activity of the individual under exposure to specific risk factors may be reflected not only in a change of health status, but also in the performance of the individual.
The contemporary, avant-garde research dimensions and patterns of occupational psycho-social stress discussed and treated by the psycho-biology and physiology of occupational stress, such as the introduction of new technologies at work, the accelerated transition from physical to mentally based cognitive workload, the generation and triggering of occupational stress and work under exposure to job stressors, work in functional and psychological interface and collision with automated and intelligent systems, the opening of new channels of communication, and the impact of cognitive workload and occupational stress on the human/operator in complex automated and intelligent systems and in the complex occupational system of human/operator, social characteristics and automation/technology, inspire us to continue in-depth studies in this area. Progress in the scientific medical, psychological and social cognition of the human being, the design and development of new information and intelligent technologies, the introduction of automation into work and into all areas of avant-garde intellectual occupational thought, and the dynamics of social and political development in the XXI century generate novel ideas for creativity and action in a globalized world. Critical analysis of the existing psycho-social and psycho-physiological literature, together with knowledge of the historical past and present, gives us the opportunity to show insight into human nature and to define a new group of risk factors that characterize our modernity and that have not previously been determined and described. The novel contemporary guidelines, dimensions, major psychological risk factors and patterns of occupational psycho-social stress of the XXI century defined by us are:
i. Differentiation and discrimination of risk factors and risk occupations; genesis, exacerbation and deterioration of the health status of risk individuals.
ii. Study of the physiological regulatory mechanisms inducing diseased states.
iii. Socio-economic and technological changes affecting working activity and health.
iv. The socio-political transition, and the new socio-political experiment of the XXI century.
v. The fanatical intolerance of the more retrograde and strongest regimes of the XXI century toward democratic countries and democratic institutions.
vi. The new sense of the generations of the XXI century, who live under a democratic regime and value system and face new imperatives, missions, roles, political compromises and shattering political developments.
vii. The new level and guideline of psycho-social stress faced by the generations of the XXI century, owing to continuously growing and unmanageable conflicts, refugee and immigrant crises, and the lack of stability and balance.
viii. Life and work under a constantly growing risk of terrorist attacks, and the role of the level of control on cognitive functioning during exposure to that risk.
ix. The human value system and the psychological and social characteristics of totalitarian societies and societies with an unknown face.
x. Stressors that induce a novel socio-political process.
xi. Characteristic patterns of occupational psycho-social stress in the post-totalitarian society, and the similarities and differences between post-totalitarian and Western societies.
xii. Genetic and cognitive information in different societies.
xiii. The impact of the intellectual, political, social and electronic revolutions on human mental workload.
xiv. Psycho-social risks affecting health status.
xv. Work in large-scale, complex, sophisticated human-socio-technical systems, and the factors affecting the failure and collapse of the system.
xvi. Patterns of cognitive load: a sub-optimal functional status can induce deterioration and degradation in performance and a risk of complications of health status.
Humanity faces increasingly high barriers and a growing level of perceived life and occupational stress. Stressful exposure and its impact across the human life span, and especially the expansion of occupational stress in the recent workforce, pose challenges to social and occupational development in the spheres of health, psychology and social progress. These challenges affect present and future generations, who must understand and follow the main line and the truth about human behavior and existence in conditions of growing cognitive, mental thought oriented toward building a civilized society in which work is performed under eustress, aiming at the creative expression and realization of the individual and contributing to the individual's prosperity and to social progress. A number of disciplines of contemporary medicine and psychology, with their specific research approaches, are aimed strategically at the issues of individual life and occupational stress, the consequences of its expansion, and its aggressive as well as favorable stressor effects on human existence in life- and work-related situations. Advanced medical and psychological thought counteracts, with reliable and adequate approaches, the scale that occupational stress has reached and may well reach in the future.
The presented range of innovative searches of the human intellect, together with the knowledge and experience of experts in the field of occupational psycho-social stress and cognition, gives us a realistic and, to a certain extent, optimistic attitude toward the modes, approaches, methods and resources for neutralizing stressor impact and for optimizing the physical and mental status of the individual in work activity and in the collision with the changing characteristics of today's workforce and the complicated work it performs. The unlimited possibilities of the freedom of the human spirit, intellect and creativity contribute to the study and development of the history and genesis of the human spirit; to the advance and embodiment of innovative creativity in mentally based and focused workload and occupational activity; and to eustress as a stimulus for life, work and success in the interface and collision with intelligent systems. To predict future problems, to prevent difficulties in the interaction and coordination of the system, and to prevent its disturbance, degradation and collapse, it is necessary to dynamically control and manage the contemporary dimensions of occupational psycho-social stress and the novel psychological risk factors and patterns; to reckon with, and where possible direct, the guidelines of modern psycho-social stress; and to examine the three components of the complex occupational system (human/operator, social characteristics and automation/technology), as well as the human factor, with the goal of defending human freedom, health status, efficiency, work ability and safety at work, averting the real risk of terrorist attacks that exists in our contemporaneity, and not repeating the past.
To read more articles in the Journal of Psychology and Behavior, please visit: https://juniperpublishers.com/pbsij/index.php
For more Open Access journals in Juniper Publishers, visit: https://juniperpublishers.com/journals.php
Impact of Coastal Flooding on Fish Production in Brass, Niger Delta, Nigeria: Implications for Coastal Resource Management- Juniper Publishers
Abstract
In recent times, the deteriorating state of the rivers has begun to gain prominence, because there has been a reduction in fish catch and in the economy of the fishermen and women who depend on fishing as their source of livelihood. Owing to the fragile nature of the Niger Delta and the effects of climate change, which has led to excessive rainfall and intense heat, the area has suffered greatly as its only source of livelihood is affected. In this study two fishing ports were used, and water sample analysis showed that although flooding in the area affects water quality and the state of the fishes to an extent, the greatest impact comes from the activities of man: the use of water bodies as dumpsites for refuse and as toilets, oil spills from the industrial activities of multinational companies, and sand dredging. The study therefore recommends that government and the private sector ensure that, within these delicate areas, developmental projects with a true bearing on the lives of the people are put in place, so as to reduce the use of water bodies as the only means of disposing of refuse and sewage, and that fishermen and women in these areas be encouraged, through training and the provision of modern fishing equipment, to meet the demand for fish and improve their own economy.
Keywords: Flooding; Coastal development; Climate change; Anthropogenic activities
Introduction
The increasing frequency, and sometimes intensity, of unusual weather-linked phenomena as a result of climate change is evident in recent times. The United Nations Framework Convention [1] attributes climate change directly or indirectly to human activity that alters the composition of the global atmosphere, leading to unprecedented weather conditions. This, in turn, affects food production over time.
There is at present limited understanding of how climate variability impacts food systems and the associated livelihoods [2,3]. This needs to be better understood before the impact of climate change on food security can be assessed. The impact of long-term trends in climate change, in particular related global warming, is less well understood in fisheries but is beginning to receive attention.
The Food and Agriculture Organization (FAO) of the United Nations (UN) has developed expertise and experience in the rapid appraisal of the impacts of disasters on local fishing communities and aquatic ecosystems, and of the immediate and longer-term remedial action required. Long-term climate change has important feedback loops to global ocean circulation patterns, sea level rise and changes in ocean salinity, all of which affect the biological properties and distribution of species. The Intergovernmental Panel on Climate Change (IPCC) has linked the rise in sea level to climate change. Between 1960 and 1970, a mean sea level rise of 0.462m was recorded along the Nigerian coastal waters [4]. Current environmental problems in the coastal area of the country include flooding, which arises from high rainfall, runoff from rivers and urban channels, tidal movement and wind [5]. With this problem already common, sea level rise occasioned by global climate change will exacerbate it, in addition to its potential to cause permanent inundation, beach erosion and salinity. The inundation arising from the rise in sea level will worsen floods and the intrusion of sea water into freshwater sources and ecosystems, destroying such stabilizing systems as mangroves and affecting agriculture, fisheries and general livelihoods [6]. Flooding of low-lying areas in the Niger Delta region has been observed. The phenomenon of flooding is of concern when places important to humans are affected. Floods may be caused by different factors, such as a rise in stream level, inadequate drainage during and after prolonged precipitation, and coastal storms, tides and high waves.
Flooding is one of the most costly and serious hazards to be faced in the use of land resources. Damage is of three categories:
i. Water immersion.
ii. The impact of moving water and debris carried by water.
iii. Erosion and depositional activities of streams as their volume and velocity rise and fall.
Floods have continued to be an annual menace to the inhabitants of the Niger Delta region. The people have resigned themselves to accepting the situation and to adjusting their activities, agricultural, social or otherwise, according to the dictates of the floods, about which they have been able to do very little. Some communities in the worst affected areas have to build their houses on stilts that rise above the local flood levels. Because of the threat of the floods to property and life, the area has not offered a favourable atmosphere for meaningful investment. Houses in most parts have continued to be temporary structures of mud and thatched roofs, not considered too great an economic loss to the owner in the event of damage by flood or erosion.
The Niger Delta, like any other delta, is built up by deposition of sediments due to the decrease of flow velocity as the inflowing rivers (the Niger and Benue) enter a larger and calmer body, in this case the Atlantic Ocean. The resultant topography is one of very gentle slopes leading to numerous criss-crossing distributaries and estuaries with pronounced meanders, which have a rich fertile soil with high agricultural potential. But unlike many other deltas the world over, the Niger Delta is about the least utilized with regard to its potential for large-scale agricultural development. The factors militating against such development include the difficulty of access, being mainly by water, and the non-availability of large tracts of flood-free land. The Niger Delta can be delimited as the area with Aboh as the apex, extending to the Atlantic coast. The main characteristics of the area include the levees of the rivers as the highest land, sloping gradually away from the banks to lowlands called back swamps. Such back swamps are usually filled during the floods as the level of the water rises in river channels, usually leaving only a narrow stretch of land between the riverbanks and back swamps uninundated.
Bayelsa State, of which the study area is a local government area, is located in the central Niger Delta, so all the aforementioned characteristics of, and problems affecting, fish production in the Niger Delta occur in the state. Flooding is more serious in the state than in the other Niger Delta states.
Fisheries ecosystems and fishing-based livelihoods are subject to a range of climate-related variability, from extreme weather events, floods and droughts, through changes in aquatic ecosystem structure and productivity, to changing patterns and abundance of fish stocks. Fishery resources are of particular significance in Nigeria as they provide a considerable amount of dietary protein in the country, and the sector also serves as a major source of employment and labour for a large proportion of Nigerians in the riverine areas.
Fishing is a major livelihood activity of the people living in the coastal areas of Nigeria. This is because most of their land area is covered by water bodies with very little percentage of the land area suitable for other agricultural production activities such as crop and livestock production. Many households in these coastal areas are most vulnerable to the impact of climate change as a result of their low adaptive capacity to climate change [7]. Recent occurrences of flood have been reported in these coastal areas [7,8]. The consequences of flood due to climate change such as loss of fish, change in fish species, erosion of human habitat, and land will have greater impact on the welfare and the livelihood of fishermen most especially in terms of income realized from fishing activities.
As a result of flooding, some species of fish have migrated to other locations while others have died. This leads to low productivity and consequently reduced fish catch and a low standard of living. Flooding has also increased the spread of different diseases among the fishers and their households, including malaria, typhoid and others. All these, coupled with the problems faced by fishers, have led them to adopt different coping strategies.
Francisco et al. [9] conducted a survey on the perception of the riverine population of flood occurrences in the lower Sao Francisco river municipalities, especially with regard to the 2004 flood. To the riverine population, natural floods were historically always recognized as positive, making the practice of agriculture on the flooded lands possible and also working as a nursery area for fish, promoting local biodiversity conservation. With the regularization of the river discharge throughout the year, a decrease in fish quantity, biodiversity and waterlogged land farming (marginal lagoon) was reported. Having stated this, the interest of the present study is to examine coastal flooding and its effect on fish production in Brass Local Government Area, alongside identifying the factors responsible for coastal flooding in the area, identifying the extent of flooding, determining the impact of flooding on the economy of fishermen and women in the area, and proffering solutions to the menace.
Literature of the Problem
Water is an indispensable ingredient of life on earth. Its supply is essentially constant and beyond the scope of humans to increase or alter. Any threat of reduction in the availability, or lessening of the quality, of a material so basic to our very lives as water is certain to arouse strong emotion and deep concern [10].
Water is a basic constituent of the biotic community. In nature, it occurs on land, below its surface, in the atmosphere and in the biosphere. 97% of the total volume of water available is in the ocean, 2% is stored in the form of ice-sheets and less than 1% is available as fresh water. Of this, water in rivers, lakes and swamps constitutes only 0.36% of the world's fresh water supplies and is the portion easily accessible and available to man [11]. Although man can alter the form and distribution of this usable water, or at best improve its quality for better human use, its distribution is uneven and fixed.
As population grows, so does the demand for fresh water for the various uses mentioned above; in turn, population density typically affects the availability and quality of water resources in an area. The problem, therefore, is not with the global amount of water but with its distribution, availability and quality.
According to Caddle in [10], the problem of water lies in four areas, namely quality, quantity, reliability and financing. Reduced reliability and availability of supply are echoed in a reduced quality of the world's fresh water inventory; increased silt loads of streams, pollution of surface and groundwater supplies, and lakes acidified and biologically dead or prematurely filled by siltation and algae growth are evidence of adverse human impact on an indispensable component of the biosphere [12].
Although human activities are the major causes of water quality degradation, nature also contaminates water to a certain degree. Natural water is never pure in the chemical sense; the most basic reason is the role of water as a "universal solvent". Through precipitation and surface runoff, minerals, gases and particulate matter are carried into water bodies and in turn contaminate the water. Natural events such as torrential rainfall and hurricanes can lead to excessive erosion and landslides, which in turn increase the content of suspended materials in affected rivers. Seasonal overturn of the water in some lakes can bring water with little or no dissolved oxygen to the surface.
Flooding is one of the hazardous phenomena associated with human settlements, especially those built on lowlands, which are generally regarded as floodplains. Floods may result from excessive rainfall, the bursting of dams or, in marine environments, tidal influences. Coastal floods may be distinguished from river floods; the two differ in cause, duration of the event and predictability. Flooding brings about deposition of sediment; the level of the levee as well as the river channel may be raised.
The walls of the channels of a flowing river are often steeper than the surrounding landmass, while the adjoining floodplain is usually flatter land. The floods that inundate the floodplains represent water that is in excess of the channel's holding capacity. When this occurs, water outside the channel is referred to as floodwater, while the river itself is said to be in flood or at flood stage. The area of the channel and adjoining flat ground is referred to as the floodway. A rarer situation is where a large valley has well-defined floodplains; when this happens, a flood fringe is identified. Inundation of a flood fringe occurs in very high flood situations with a return period of 100 years, as in the U.S.A [13].
Perhaps the commonest definition of flood is that of Perkins and Stembridge, who defined flooding as occurring "when land becomes covered with water". This happens mostly during the rainy season, when the level of the river rises and water covers the land nearby. The rivers are then in flood, while the mangrove swamps in the area are flooded when the tide rises.
Defining flood is difficult, partly because floods are complex phenomena and partly because they are viewed differently by different people. Floods occur in many ways, usually in valley bottoms and coastal areas, and can be produced by a number of influencing conditions. Their locations and magnitudes vary considerably, and as a result they have markedly different effects upon the environment. For most practical purposes, and certainly in popular usage, a meaningful flood definition will incorporate the notions of "damage" and "inundation".
As might be expected, most flood definitions relate to river floods, and the one by Chow [14] is not untypical: "A flood is a relatively high flow, which overtaxes the natural channel provided for the runoff."
In fact, many stream channels have been so artificially improved that the definition of Rost V et al. [14] is probably most appropriate: "A flood is any high stream flow which overtops the natural or artificial banks of a stream."
However because the banks of a stream vary in height throughout its course, there is no single bank full level above which the river is in flood and below which it is not. In a strictly hydrological sense therefore, a flood may be any relatively high water level or discharge above an arbitrarily selected flood level or flood discharge.
The most common cause of flooding is climatological in nature. Keller [15], Akintola [16] and Ward [17], to mention a few, illustrate this assertion in their studies on floods. They all hold that excessive and prolonged rainfall and the incidence of snowfall are the most universal causes of floods. In other types of floods, climatological factors are partly or indirectly responsible.
Schulz & Cleaves [18] assert that soil characteristics can also affect flooding through the infiltration rate. The portion of rain that sinks below the surface has been estimated to range from about 50% in humid regions to nearly 100% in arid regions, minus, of course, soil evaporation. Hence a prolonged input of water can lead to the soil becoming saturated, resulting in floods as further infiltration becomes impossible [17]. Thus, where rainfall intensity exceeds infiltration capacity, flood peak discharge increases, resulting in floods. This model of quick flow generation, originally propounded by Horton [19], was thought to have universal application, but recent work demonstrates that it relates only to conditions of low infiltration and increased human activity [18].
Coates [13] has conveniently divided flood-causing agencies into geologic and human. A comprehensive list of geomorphologic causes includes excessive precipitation, snowmelt, ice dams, landslides and glaciers. Of these, only two are common in tropical environments like Nigeria: excessive precipitation and landslides. According to Coates, excessive precipitation is the principal cause of most floods and the ultimate cause of almost every flood. The mechanism operates as follows: stream channels, gullies and rills are often developed during the fluvial cycle of erosion to hold only the runoff produced during the heaviest rainfall periods. Storms of greater magnitude or intensity may exceed the bank storage and overflow onto the floodplain and adjoining terrain.
Analytical Techniques
The study is a survey to determine the effect of coastal flooding on fish production in Brass, as well as its effect on the economy of fishermen and women in the area. The researcher made use of secondary data: rainfall data for the area over a period of 8 years (2004-2011), rainfall being a determinant of flooding, and a water sample analysis of the river within the period of flooding, to see whether the quality of the river meets world standards for aquaculture. Among the parameters tested are pH, dissolved oxygen, temperature, biological oxygen demand, chemical oxygen demand, total suspended solids, total dissolved solids and turbidity.
Three fishing ports were identified in the study area as the sample population, of which two, Mbikiri and Igbabele, were randomly selected; it was in these two fishing ports that the study was conducted. In the analysis of the data, the multiple regression statistical technique was used to determine the influence of flooding on water quality in the area. Secondly, a descriptive statistical analysis was done to ascertain whether the quality of the water in the area meets the world standard for aquaculture (Table 1) [21].
Source: extracted from Weli [21].
The table above shows rainfall data and the calculated rainfall intensity for a period of eight (8) years. The calculated rainfall intensity record shows variations in the intensity of rainfall among the years, with 2010 having the highest intensity of 78.2mm/h, followed by 2007 with an intensity of 58.6mm/h and 2006 with 55.4mm/h. The implication is that, with this intensity of rainfall and the area being a coastal community ordinarily liable to flooding, the area has always experienced flooding, but the extent has been determined by the intensity of the rainfall.
Water Sample Analysis Report
The result of the water sample analysis of the two fishing ports in Brass, the study area, is shown below (Table 2).
In this section the multiple regression statistical technique is used to determine the influence of flooding on the water quality of the two fishing ports, using the equation;
Y = a + b1X1 + b2X2 + e (Equation 1)
where;
Y = flooding (rainfall intensity)
a = regression constant
b1-b2 = regression co-efficient
X1 = water quality parameters for Igbabele
X2 = water quality parameters for Mbikiri
e = error term
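Once the yearly series are assembled, the coefficients of Equation 1 can be estimated by ordinary least squares. The sketch below is a minimal illustration of that fit; the numeric values are hypothetical placeholders, not the study's measured data (which are in Tables 1-3):

```python
import numpy as np

# Hypothetical yearly data (NOT the study's actual measurements):
# y  = flooding proxy (rainfall intensity, mm/h)
# x1 = a water quality index for Igbabele
# x2 = a water quality index for Mbikiri
y  = np.array([38.0, 42.5, 55.4, 58.6, 40.1, 35.2, 78.2, 35.1])
x1 = np.array([6.8, 7.0, 7.2, 7.1, 6.9, 6.7, 7.4, 6.8])
x2 = np.array([7.1, 7.3, 7.8, 7.6, 7.2, 7.0, 8.2, 7.1])

# Design matrix with an intercept column: Y = a + b1*X1 + b2*X2 + e
X = np.column_stack([np.ones_like(y), x1, x2])
coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coef
print(f"a = {a:.3f}, b1 = {b1:.3f}, b2 = {b2:.3f}")
```

The study itself used SPSS; the point here is only that the same constant a and coefficients b1, b2 fall out of a least-squares solve on the design matrix.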
This is because there is an interplay among the various parameters and their degree of relationship, especially as it concerns flooding; that is, the aim is to ascertain whether the water quality parameters of the two fishing ports are influenced by flooding in the area. From the data generated, the mean flooding (rainfall intensity) was 47.8875 with a standard deviation of 16.82247 (Table 3). Moreover, a table of the means and standard deviations of the various independent variables, together with the dependent variable of flooding (rainfall intensity), is shown below.
A view of the output of the SPSS computer model reveals, among other things, the correlation matrix showing the relationship between flood (rainfall intensity) and the water quality parameters of the two fishing ports.
Table 4 above displays the correlation matrix of the two independent variables, water quality parameters for Igbabele and water quality parameters for Mbikiri, against the dependent variable of flood (rainfall intensity) in the area.
From Table 4, it is revealed that a relationship exists between the independent variable of water quality parameters for Mbikiri and the dependent variable of flood (rainfall intensity). The Student's "t" statistic for this correlation is 2.855, and its corresponding table value at 8 degrees of freedom and the 0.05 significance level is 2.31. Since the computed value exceeds the table value, there is a statistically significant relationship between flood (rainfall intensity) and the water quality parameters for Mbikiri.
From Table 4, the water quality parameters for Igbabele have a negative correlation coefficient with flood (rainfall intensity). The calculated "t" statistic for this correlation is 0.199, which is less than the table value of 2.31 at the 0.05 significance level and 8 degrees of freedom. Hence, there is no statistically significant relationship between flood (rainfall intensity) and the water quality parameters for Igbabele.
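The significance test applied above, comparing a computed Student's t for a correlation against the tabulated value of 2.31 at 8 degrees of freedom, follows the standard formula t = r*sqrt(n-2)/sqrt(1-r^2). A minimal sketch; the correlation coefficients below are illustrative values chosen only to yield t statistics of roughly the reported magnitudes, not figures taken from Table 4:

```python
import math

def t_for_correlation(r: float, n: int) -> float:
    """Student's t for testing H0: no correlation, with n - 2 degrees of freedom."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

# 8 degrees of freedom implies n = 10 observations; the two-tailed
# critical value at the 0.05 level is about 2.31, as used in the text.
T_CRITICAL = 2.31

# Illustrative correlation coefficients (assumptions, not the study's values).
for port, r in [("Mbikiri", 0.71), ("Igbabele", -0.07)]:
    t = t_for_correlation(r, n=10)
    verdict = "significant" if abs(t) > T_CRITICAL else "not significant"
    print(f"{port}: t = {t:.3f} -> {verdict}")
```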
Conclusively, the implication of the above is that flooding influences the water quality of Mbikiri more than that of Igbabele. This is a result of the type of land use in the area, namely intense commercial and residential land use and the presence of several multinationals carrying out oil exploration activities. For Igbabele, flooding has been identified as a major cause of the state of the water quality, arising from surface runoff and anthropogenic actions such as using water bodies as toilets and refuse dumps. Looking at Table 2 above, it is observed that the pollutant load for each parameter is heavier for Mbikiri than for Igbabele (Table 5).
From the table above, it is observed that the water quality of the two areas meets the WHO standard for pH, temperature (°C), dissolved oxygen and biological oxygen demand, with Mbikiri having the highest values. For the other parameters, total suspended solids, chemical oxygen demand, total dissolved solids and turbidity, the two areas have amounts higher than the WHO standard, with Mbikiri again having the highest amount for each parameter. This shows that for fishing Igbabele is the better point: although some of its parameters do not fall within acceptable standards, it is far better than Mbikiri.
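The parameter-by-parameter screening against guideline limits described above is easy to mechanize. In the sketch below, every limit and measured value is a placeholder invented for illustration; they are not the WHO figures or the values in Tables 2 and 5:

```python
# Hypothetical guideline upper limits (placeholders, not actual WHO limits).
LIMITS = {"pH": 8.5, "turbidity_NTU": 5.0, "TDS_mg_L": 1000.0}

# Hypothetical measured values for the two fishing ports (placeholders).
samples = {
    "Igbabele": {"pH": 7.2, "turbidity_NTU": 6.1, "TDS_mg_L": 820.0},
    "Mbikiri":  {"pH": 7.9, "turbidity_NTU": 9.4, "TDS_mg_L": 1150.0},
}

for port, values in samples.items():
    # Collect every parameter whose measurement exceeds its limit.
    exceeded = [p for p, v in values.items() if v > LIMITS[p]]
    if exceeded:
        print(f"{port}: exceeds limits for {', '.join(exceeded)}")
    else:
        print(f"{port}: within all limits")
```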
Conclusion
In conclusion, arising from the results of the study, it is important to state that flooding in its totality may not be responsible for the problems of fishing being experienced in the Niger Delta. As shown by the study, the actions of man have led to the deteriorating state of the water quality in our rivers, in terms of using water bodies as refuse dumps, toilets and avenues for effluent discharge, as well as the activities of oil companies, such as oil spills in rivers, and the current boom in sand dredging, leading to loss of aquatic life and a reduction in the economic power of fishermen and women in the area.
Hence, the following recommendations are put forward;
a. There is need to create an agency saddled with the responsibility of coastal zone development, so as to give the area its rightful place, noting its fragile nature.
b. Legislation should be put in place and enforced in the interest of aquatic life, which is fast declining, so that it does not go into extinction.
c. Government and the private sector should see that, within these delicate areas, developmental projects that have a true bearing on the lives of the people are put in place, so as to reduce the use of water bodies as the only sites for dumping refuse and as toilets.
d. Fishermen and women in these areas should be encouraged by way of training and the provision of modern fishing equipment, so as to meet the demand for fish and improve their own economy.
For more about Juniper Publishers please click on: https://twitter.com/Juniper_publish
For more about Oceanography & Fisheries please click on: https://juniperpublishers.com/ofoaj/index.php
Does Foliar Application of Macro and Micronutrients Have Any Impact on Roses Production? A Review- Juniper Publishers
  Juniper Publishers- Open Access Journal of Annals of Reviews & Research
Authored by Muhammad Adnan
Abstract
Roses are among the most widely cultivated and prized of all ornamental plants worldwide. However, production and quality are often poor due to insufficient knowledge, lack of technical skill and, most importantly, inadequate doses and methods of fertilizer application. The quality and yield of roses depend directly on the balanced and proper application of macro- and micronutrients. Most traditional farmers prefer to apply nutrients to the soil, but this causes many losses; application of fertilizer through an efficient method allows the plant to absorb the nutrients in a shorter time. Foliar fertilization is the most appropriate method of providing balanced plant nutrition in horticulture: it provides immediate translocation of nutrients to various plant organs via the leaf tissues under various nutrient deficiencies. The present review focuses on the role of foliar application of macro- and micronutrients in the yield and quality of roses and demonstrates that foliar application of macro- and micronutrients has a great impact on rose growth, yield and quality.
Keywords: Roses; Poor Production; Foliar Application; Macro and Micronutrients
Introduction
Rose plants are among the most important ornamental plants in the world [1]. Rose (Rosa spp.) belongs to the family Rosaceae; it is native to the northern hemisphere and is mainly grown in temperate zones. Rose is one of the most important cut flowers [2]. Roses are cultivated chiefly for aesthetic purposes, but numerous species also have important pharmaceutical applications. Many cultivars and species are available and are cultivated in various environments across the world [3]. However, production and quality are often poor due to insufficient knowledge, lack of technical skill and, most importantly, inadequate doses and methods of fertilizer application. Most soils have a high (alkaline) pH, which hinders the absorption of micronutrients [4]. When a nutrient shortage cannot be corrected through soil application, foliar nutrition is an alternative method to correct the problem [5-7]. Given the high pH of sandy desert soils, application of fertilizers through foliar spray may be helpful in avoiding soil fixation of some micronutrients [8]. Foliar application is a very important strategy for plant nutrient management [9] and can be a useful method of providing balanced plant nutrition in horticulture [10]. Many studies have highlighted the benefits of foliar fertilization in improving plant growth, crop yield, nutrient uptake and product quality. This technique can ensure immediate translocation of nutrients to various plant organs via leaf tissues under various nutrient deficiencies [11,12]. According to the literature, many reports are available on foliar fertilizers applied to plants such as chrysanthemum, rose, tuberose and iris [13,14]. Micronutrients such as Fe, Mn, B, Cu, Zn, Mo, Ni and Co are necessary in much smaller amounts than the primary nutrients but are essential for plant growth [15].
They are essential because of their immense importance in, and contribution to, enzyme systems in metabolism. Foliar application of micronutrients may be six to 20 times more efficient than soil application in increasing crop production and other growth parameters [16]. The present review focuses on the role of foliar application of macro- and micronutrients in the yield and quality of roses and demonstrates that foliar application of macro- and micronutrients has a great impact on rose growth, yield and quality.
Effect of Foliar Application of Macro Nutrients on Roses
Macronutrients play a very important role in plant growth and development. Their functions range from structural units to redox-sensitive agents. Generally, application of macronutrients increases the yield, growth and quality of crops. In recent years, plant physiologists, biotechnologists and eco-physiologists have been working to investigate other, less-studied features of these minerals and their future prospects, because nutrients are involved in every step of plant life, and every macronutrient has its own character and is therefore involved in different metabolic processes. [17] performed a trial to check the effect of foliar application of macronutrients on rose plants and reported that combined application of macronutrients significantly affected rose production. [18] conducted a trial to check the response of growth, yield and flowering of the rose cultivar "Gruss-an-Teplitz" to sole and combined applications of NPK. They studied plant height, flower diameter, number of branches, number of flowers per plant and flower weight. The research concluded that sole application of N and P significantly affected both vegetative and reproductive parameters; moreover, flower weight, diameter and yield were also affected by K, while combined application of NPK was most effective for flower weight and diameter. [19] conducted an experiment to check the effect of soil and foliar application of NPK and (NH4)2SO4 on several growth parameters of roses. Different concentrations were prepared for foliar and soil application, and the fertilizers were applied at three stages: first at the seedling stage, second at the juvenile stage and third at the pre-flowering stage. Sole foliar application, however, was not satisfactory. [20] conducted research to check the response of rose plants to different doses of NPK and reported that the fertilizer level NK (20:12) was best for achieving maximum production.
Effect of Foliar Application of Micronutrients on Roses
In plant science, the great significance of micronutrients is undeniable, since plants rely on micronutrients, which have a profound influence on an array of plant activities. Although micronutrients are abundantly present in the soil, plants usually acquire them in relatively trace amounts; hence they are regarded as trace elements. B, Cu, Fe, Mn and Zn are micronutrients required in minute amounts by plants but inexorably play an eminent role in plant growth and development. [21] conducted research to test the effect of foliar sprays of micronutrients on the quality and quantity of roses (floribunda). They observed that combinations of micronutrients gave significant results in the yield and quality of floribunda roses, and among the treatments boric acid (0.5%) + ZnSO4 (0.75%) + FeSO4 (1.5%) + MgSO4 (0.5%) + MnSO4 (1%) + CuSO4 (0.3%) showed superior results. [22] performed research to check the effect of foliar application of iron and zinc on the quantity and quality of flowers of tuberose and reported that, among all ZnSO4 doses, 0.4% enhanced flowering parameters such as stalk length, minimum days to initiation and flower quality; 0.4% FeSO4 also proved the best dose among the iron doses. The interaction of the Fe and Zn doses was non-significant, so they concluded that 0.4% of both Fe and Zn is the best level for enhancing quality and yield. [23] conducted an experiment to check the response to foliar application of micronutrients and reported that the level 0.5% ZnSO4 + 1.0% FeSO4 + 0.5% MnSO4 + 0.2% boric acid + 0.1% citric acid showed significant results for silkworm, increasing larval weight, rearing rate, cocoon weight, shell weight and cocoon yield, compared with the other treatments. [24] conducted research to check the effect of foliar sprays of N, B, Mn and Zn on the physiological characteristics of plants.
They observed several parameters, such as plant spread, number of leaves, leaf length, leaf width and total number of suckers per plant. Results indicated that ZnSO4 + MnSO4 + FeSO4 (0.2% + 0.2% + 0.1%) showed more significant results for total number of leaves, plant spread and leaf length than the other treatments, while ZnSO4 + MnSO4 + FeSO4 (0.4% + 0.4% + 0.3%) produced the maximum leaf width.
A trial was conducted to check the response of damask rose to the foliar application of copper, magnesium and zinc sulphate under Western Himalayan conditions [25]. It reported that ZnSO4 and MgSO4 gave the better results for oil composition and flowering of damask rose. [26] performed research to check the response of rose bushes, in terms of growth, flowering and yield, to different iron doses. The result was that the Fe doses had a positive effect, and the recommended dose of 3.6 mg kg-1 enhanced the growth and yield of the rose bush cultivar "Shiny Terrazza". [27] carried out research to check the response of roses to foliar application of two levels of micronutrients (ZnSO4 and FeSO4). They reported that, among all treatments, the 0.4% ZnSO4 level enhanced flowering parameters such as stalk length, minimum days to initiation and flower quality, and the 0.4% FeSO4 level likewise proved the best among the Fe doses; the research therefore recommended a dose of 0.4% for both Fe and Zn for better production. [28] conducted research on growth, stalk length, flower diameter, days to flower initiation and plant length in response to the application of salicylic acid and boron. They reported that SA at the rate of 3 mM showed the best result, and B at 15 ppm showed the better result among all the other doses. [29] conducted research to study the effect of foliar application of zinc and copper on growth parameters and the shelf life of Lilium. A positive response to the micronutrients was observed: Zn 0.4% + Cu 0.4% enhanced leaf area; Zn 0.2% + Cu 0.2% gave the maximum chlorophyll content and the maximum fresh and dry weight; and Zn 0.4% + Cu 0.2% increased the shelf life of Lilium and maximized solution uptake.
Research was conducted to check the results of foliar sprays of micronutrients on the growth and yield of medicinal plants, including the fresh and dry weight of roots and shoots [30]. After fertilizer application, the results showed that a nutrient concentration of 400 ppm proved the best of all: at this concentration oil formation, growth, yield, and fresh and dry weight all showed positive effects, and this dose was therefore recommended. [31] performed an experiment to check the effect of foliar sprays of micronutrients on the flowering and growth of Gerbera jamesonii. They reported that almost all nutrient levels had a positive effect on the plants compared with the control, and concluded that the nutrients improved the flowering and growth of Gerbera jamesonii. [32] conducted an experiment to check the response of yield, growth and nutrient concentration in the grains of two registered varieties to foliar application of some macronutrients. Both varieties responded significantly to the macronutrients, and nutrient content in the grains was significantly affected by Ca, Fe, Zn, Mn and K [33]. Research was also conducted to investigate the effect of foliar application of zinc and manganese sulphate on the yield, fruit quality and leaf nutrient status of pomegranate. The experiment showed that Zn and Mn gave positive results for most characters, but their interaction was non-significant. Mn at 0.3 and 0.6% showed better results for characters such as growth, fruit size and quality; similarly, Zn played a significant role for characters such as TSS, TSS/TA ratio and juice content. The combination of 0.3% ZnSO4 and 0.6% MnSO4 was the suitable dose for better growth and yield of pomegranate.
Further research investigated the response of rose growth and flowering to foliar application of micronutrients [34]. A nutrient combination at 0.3% gave significant results for vegetative growth, including the number of primary and secondary branches, number of leaves per shoot and number of leaves per plant, and also increased flower yield and quality, though it gave the minimum result for initiation of the first flower bud. [35] studied the effect of micronutrients on growth, yield and quality of rose under controlled conditions and concluded that the variety Grand Gala outperformed Corvette in parameters such as flower yield, stalk length, chlorophyll content and leaf area. [36] conducted an experiment on the effect of foliar application of different micronutrient combinations on plant resistance to salinity, reporting significant results for root features, nutrient reserves and plant growth. [37] examined the effect of foliar micronutrients on plant growth and flowering of three rose cultivars, Kardinal, Amalia and Rosy Cheeks; foliar micronutrients improved plant growth, quality and flower yield, with B+Zn and the three-nutrient combination (B+Zn+Fe) most effective, leading to the conclusion that combined application of these nutrients gives better results than individual application. [38] studied the effect of foliar manganese and iron with uniconazole on yield, plant chemical content and photosynthetic pigments under salinity; salinity affected most parameters, including plant height, number of tillers, spike weight, grain weight and biological yield.
It also influenced the photosynthetic pigments and the K concentration in grains, and the experiment showed that application of Mn and Fe increased leaf photosynthetic pigments and the yield parameters. [39] studied the effect of zinc and boron application on sunflower, comparing physiological parameters, yield and nutrient absorption. Application of these nutrients had a positive effect on most parameters, and the dose of Zn 15 kg ha-1 and B 1.5 kg ha-1 along with NPK proved the best for sunflower, increasing dry matter, leaf area, flower size and seed rate; these rates were therefore recommended for production. [40] studied the response of vase life in two rose cultivars, Trika and Whisky Mac, to different concentrations of sucrose and silver nitrate; flowers treated with these solutions lasted 3 to 4 days longer than the control.
Effect of Foliar Application of Combined Macro- and Micronutrients
An experiment was conducted on the response of a Lilium hybrid cultivar to foliar application of macro- and micronutrients and a vitamin solution, using Murashige and Skoog (MS) medium [41]. Application of MS-macro, MS-micro and MS-vitamin solutions gave significant results for qualitative and quantitative parameters such as flower length, flower bud length, flower bud width and bulb weight, while shelf life improved with the MS-macro solution alone. [42] tested foliar application of nitrogen, boron, manganese and zinc on maize and its physiological characteristics; foliar application had a positive effect at all three growth stages: Zn applied at tasseling gave the highest grain weight, B applied at seed formation gave significant results, and foliar N also outperformed the water-sprayed control. [43] studied the effect of nitrogen, potassium and some micronutrients on yield and plant chemical composition, reporting positive effects on yield, tiller number, plant length, spike weight and grains per spike, and concluding that soil application of N and K combined with foliar micronutrients suited plant growth and yield. [44] examined the effect of foliar micronutrient sprays on photosynthetic rate and yield of citrus fruit; foliar Zn, Cu and B at 0.1%, 0.2% and 0.3% respectively improved photosynthetic rate, chlorophyll content, transpiration rate and stomatal conductance, while Zn, Cu and B at 0.3%, 0.1% and 0.2% respectively proved best for fruit quality and yield.
Research was also carried out on the effect of foliar application of macro- and micronutrients on growth and yield [45]. Foliar application gave significant results: the treatments improved flower number, flower weight, leaf area and flower diameter; NPK (15:32:7) + micro power improved growth, and NPK (15:32:7) + chelated mixed micronutrients improved branch length and plant height, so combined macro- and micronutrient foliar sprays proved best for growth and yield. [46] studied the response of the rose cultivars "Cardinal Mac" and "Whisky Mac" to different doses of macro- and micronutrients; NPK (15:32:7) and NPK + micro power proved the best combination, while NPK (15:20:15) + chelated mixed micronutrients + VC-10 (vermicompost) was less effective because VC-10 increased soil acidity, giving low growth and low quality. Comparing the two cultivars, "Cardinal Mac" responded better than "Whisky Mac".
Conclusion
Roses are among the most widely cultivated and prized of all ornamental plants worldwide, yet production often remains poor owing to inadequate fertilizer application methods. The present review concludes that foliar application has a great impact on rose production and is the easiest and most economical way to improve the quality of rose flower production.
To know more about Juniper Publishers please click on: https://juniperpublishers.com/aboutus.php
For more articles in Open Access Journal of Reviews & Research please click on: https://juniperpublishers.com/arr/index.php
To know more about Open Access Journals please click on: https://juniperpublishers.com/journals.php
Synthesis and Characterization of Low-Cost Activated Carbons for Pollutants Removal from Automotive Emissions-JuniperPublishers
Journal of Chemistry-JuniperPublishers
Abstract
Air purification is one of the most widely known environmental applications of activated carbons. In order to guarantee the successful removal of contaminants and pollutants on activated carbons, the development of new adsorbents has been increasing in recent years. This paper presents a systematic study for cleaning vehicle emissions of CO, SO2, NO2 and H2S using physical adsorption on novel adsorbents obtained from tropical biomasses. This simple method is a valuable alternative for meeting emission standards in Developing Countries. The agricultural wastes studied here are a feasible alternative for preparing granular activated carbons for pollutant removal during engine operation, approaching the efficiency of commercial Catalytic Converters.
Keywords: Combustion gases purification; Activated carbons; Adsorption and adsorbents; Pollutants removal
Introduction
“Quality Can Be Planned.”-Joseph Juran
Wastes cannot be introduced into the environment in unlimited amounts, especially in the case of air pollutants. Different measures have been taken to limit pollutant emissions, e.g. the elimination of technological processes that generate a lot of waste and the introduction of new technologies that minimize contamination. If it is not possible to reduce emissions, the waste gas must be purified [1,2]. Nowadays, the economic conditions of Developing Countries do not allow all individuals to own a new automobile. It is necessary to develop alternatives to reduce the negative environmental impact associated with the operation of obsolete engines. The best way to address this is by reducing certain exhaust gas components during fuel combustion. The answer therefore is to look at vehicles as an integral whole to identify which solutions would be most feasible. Taking this holistic approach to vehicle improvement as a basis, three main exhaust emission control strategies can be defined:
a.Reduction of fuel consumption;
b.Exhaust gas treatment, and
c.Performance monitoring.
Of these three alternatives, the second is currently the most effective for air quality improvement. The main gas treatment currently used is the Catalytic Converter, which typically comprises an expensive porous ceramic substrate with a large surface area [3]. Unfortunately, some users in Central America and the Caribbean tend to remove the Catalytic Converter from their vehicles to get better power loads. Over the last decade, the study of combustion gas treatment has focused on more sophisticated Catalytic Converters. Consequently, the study of other alternatives for exhaust gas purification is important. There are a few methods to purify harmful combustion gases, such as physical adsorption [4,5], chemical absorption [6] and catalytic methods [7,8]. It is necessary, then, to select a purification method suitable for Developing Countries. Besides, emission standards for vehicles become more stringent day by day. The most feasible alternative would be the development of customized activated carbon filters for the betterment of the environment. This can be accomplished by reducing the emissions that contribute to smog and acid rain [9].
Activated carbons can be obtained from different precursors, with benefits to the environment [10-12]. Due to its chemical composition, forest biomasses are valuable sources in the synthesis of adsorbents materials. Several examples of activated carbons preparation can be found in the open literature [13,14]. They have been used among others in the purification of pollutants gases such as carbon dioxide, sulfur dioxide, hydrogen sulfide, nitrogen oxides and mercury [15-17]. Taking into account this background, the main objectives of this work can be summarized as follows:
I.Study the feasibility of some agricultural wastes as raw material for activated carbons production with high specific surface area, high mechanical resistance and wide availability.
II. Definition of the best experimental conditions for “chemical activation” with H3PO4 such as “physical activation” with steam water for each precursor.
III. Study of the elimination of pollutant gases in vehicles engines with filters of the adsorbents produced.
IV. Proposal of a methodology for filters evaluation in the removal of undesirable pollutants (CO, SO2, NO2 and H2S) during engines operation.
The practical aspects addressed in this research cover the broad spectrum of applications for air cleaning. Better engines performances can be obtained with an adsorption technique of activated carbons, through an extremely economic method.
Materials and Methods
The raw materials selected for the study are presented in Tables 1-3.
Preparation of Activated Carbons
The starting raw materials were cut into small pieces and then subjected to pyrolysis. This process was carried out in a tubular reactor under a nitrogen atmosphere. The samples were heated (10 °C/min) from room temperature to the final pyrolysis temperature of 500 °C, held at that temperature for 60 minutes, and then cooled down. The solid products of pyrolysis were next subjected to physical activation [18]. In the case of chemical activation, the raw materials were the original precursors, which were impregnated overnight in H3PO4 and then pyrolyzed in a stainless steel reactor 30 cm long and 3 cm in diameter. Once the reactor reached the desired temperature, the samples were kept at the final temperature according to the conditions of the specific experiment. The activated products were then cooled down and washed with enough water to reach a neutral pH. Finally, the products were dried at 120 °C and stored [19]. Two different processes were used for the synthesis of the activated carbons from the chars previously obtained by pyrolysis [18,19]. The experimental conditions used were:
Factorial experimental designs 32 were executed to evaluate the simultaneous influence of the activation conditions on the final product features [14,20]. Following the details for both synthesis processes:
Key properties of the activated carbons prepared were analyzed:
a. Raw material availability;
b. High specific surface area;
c. High mechanical resistance;
d. High adsorption speed.
Characterization of the Raw Materials and Synthetized Activated Carbons
Elemental Analysis: The amounts of carbon, hydrogen, nitrogen and oxygen in the raw materials were determined with a flash-combustion Elemental Analyzer. The samples were first dried in an oven at 110 °C before the measurement was carried out. For C, H and N analysis, the material was burned at a temperature of 1000 °C in flowing oxygen. The CO2, H2O and NOx combustion gases were passed through a reduction tube, with helium as the carrier gas, to convert the NOx nitrogen oxides into N2 and bind the free oxygen. CO2 and H2O were measured by a selective IR detector. After absorption of these gases, the remaining nitrogen content was determined by thermal conductivity detection. Oxygen was calculated as the difference from carbon, hydrogen and nitrogen.
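The oxygen-by-difference step is simple enough to sketch. The percentages below are illustrative values, not measurements from this study, and the optional ash term is an assumption for cases where ash content is also known:

```python
def oxygen_by_difference(c_pct, h_pct, n_pct, ash_pct=0.0):
    """Oxygen content (wt%) estimated by difference, as described for
    the elemental analysis: O = 100 - (C + H + N [+ ash])."""
    o = 100.0 - (c_pct + h_pct + n_pct + ash_pct)
    if o < 0:
        raise ValueError("component percentages exceed 100%")
    return o

# Illustrative composition (not data from this work)
print(round(oxygen_by_difference(48.5, 6.1, 0.4), 1))  # 45.0
```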
Apparent Density Measurement: Apparent density is a measure of the mass per unit volume of a material. Also called bulk density, it provides a measure of the "fluffiness" of a material in its natural form. In this work, Standard ASTM D1895 was used: the material is poured into a cylinder of known volume (e.g. a 100 mL pipette) and then weighed. Apparent density was calculated as the mass of material divided by the volume occupied in the cylinder [21].
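The ASTM D1895-style bulk-density calculation reduces to a mass/volume ratio; a minimal sketch, with illustrative numbers rather than measured values:

```python
def apparent_density(mass_g, volume_ml):
    """Bulk (apparent) density per the ASTM D1895 procedure described
    above: mass of material poured into a container of known volume,
    divided by that volume (g/mL)."""
    if volume_ml <= 0:
        raise ValueError("volume must be positive")
    return mass_g / volume_ml

# Illustrative example: 48.2 g of granules filling a 100 mL pipette
print(round(apparent_density(48.2, 100.0), 3))  # 0.482
```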
Specific Surface Area Measurement: In order to examine the structure of the synthetized materials, the specific surface area of the activated carbons was measured from gas adsorption isotherms using a Sorptometer and the BET model. All samples were degassed at 200 °C prior to the N2 adsorption measurements. Specific surface area was determined by the multipoint BET method using adsorption data in the relative pressure range 0.05-0.3 [22,23].
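The multipoint BET evaluation can be sketched as a linear fit of the transformed isotherm over the 0.05-0.3 relative-pressure range. The isotherm points below are hypothetical, not data from this study; the constants (N2 cross-sectional area 0.162 nm2, molar gas volume 22414 mL/mol at STP) are the standard ones:

```python
def bet_surface_area(p_rel, v_ads):
    """Multipoint BET: linear fit of x/(v(1-x)) vs x (x = p/p0),
    then S_BET = v_m * N_A * sigma / V_molar, with v_ads in mL(STP)/g."""
    ys = [x / (v * (1.0 - x)) for x, v in zip(p_rel, v_ads)]
    n = len(p_rel)
    mx, my = sum(p_rel) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(p_rel, ys))
             / sum((x - mx) ** 2 for x in p_rel))
    intercept = my - slope * mx
    v_m = 1.0 / (slope + intercept)   # monolayer capacity, mL(STP)/g
    N_A, sigma, V_molar = 6.022e23, 0.162e-18, 22414.0
    return v_m * N_A * sigma / V_molar  # m2/g

# Hypothetical isotherm points in the 0.05-0.3 relative-pressure range
p_rel = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]
v_ads = [120.0, 140.0, 155.0, 167.0, 177.0, 186.0]
print(round(bet_surface_area(p_rel, v_ads), 1))
```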
Mechanical Resistance Measurement: The mechanical resistance of the obtained activated carbons was measured through a simple method. A known mass of the granular material was impacted by six glass balls in a semispherical stainless steel container. The percentage ratio between the fragmented mass retained on a 0.5 mm mesh and the initial mass was used to estimate the mechanical resistance [24].
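The mechanical-resistance estimate is again a simple percentage ratio; the masses below are illustrative, not measured values:

```python
def mechanical_resistance(initial_mass_g, retained_mass_g):
    """Percent of the impacted granular mass retained on a 0.5 mm mesh,
    relative to the initial mass (higher = more attrition-resistant)."""
    return 100.0 * retained_mass_g / initial_mass_g

# Illustrative example: 9.2 g of 10.0 g survive the ball-impact test
print(round(mechanical_resistance(10.0, 9.2), 1))  # 92.0
```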
Adsorption Speed Evaluation: The adsorption speed was determined using the Arrhenius approach. The rate of the adsorption process is written as:

dX/dt = k'ads · α   (1)

where dX/dt is the speed of the adsorption process studied, α is the residual concentration of the adsorbate, and k'ads is the apparent kinetic constant of the adsorption process, which can be determined from:

k'ads = k0 · exp(−EA / RT)   (2)

Taking the logarithm of Equation 2 converts the exponential relation into a linear dependence:

ln k'ads = ln k0 − EA / (RT)   (3)

Plotting ln k'ads vs 1/T, the activation energy (EA) and the pre-exponential factor (k0) of the adsorption process studied can be calculated from the slope and intercept. In the Results discussion, k'ads will be referred to as K for practical reasons.
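The Arrhenius analysis amounts to a least-squares fit of ln K against 1/T, with EA obtained from the slope and k0 from the intercept. The rate constants and temperatures below are hypothetical illustrative values, not data from this work:

```python
import math

def arrhenius_fit(temps_K, k_values):
    """Least-squares fit of ln k vs 1/T.
    Slope = -EA/R, intercept = ln k0; returns (EA in J/mol, k0)."""
    R = 8.314  # gas constant, J/(mol*K)
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(k) for k in k_values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope * R, math.exp(intercept)

# Hypothetical apparent rate constants at three temperatures
EA, k0 = arrhenius_fit([298.0, 308.0, 318.0], [0.012, 0.021, 0.035])
print(round(EA / 1000.0, 1), "kJ/mol")
```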
Designing Process of Activated Carbon Filters
Figure 1 illustrates the customized assembly of the activated carbon units. These filters are very useful for studying the elimination of pollutant gases from automotive engines with the adsorbents produced [25]. The samples, in the form of granules of 2-5 mm in diameter, were packed into a steel column (length 300 mm, internal diameter 90 mm). The gas was passed through the bed of the adsorbent at 0.50 L/min. The concentrations of CO, SO2, NO2 and H2S were monitored using a Gas Chromatograph with a standard TCD detector, and were calculated by integration of the area above the curves.
Pollutants Monitoring at Laboratory Scale
There is no method available in the open literature capable of simultaneously determining all the combustion gas components evaluated here [2], so the method developed in this work is an innovative alternative for Developing Countries. In order to determine the suitability of the obtained adsorbents for the elimination of CO, SO2, NO2 and H2S, the pollutant removal rate was determined. Figure 2 shows a schematic diagram of the customized laboratory system for pollutant monitoring. The system includes, among others:
a. 6 cylinder automotive engine;
b. Activated carbon filter;
c. Exhaust gas analyzer device (Gas Chromatograph);
d. Computer system for data acquisition and recording, etc.
Gas Monitoring System: A standard gas chromatography was used with the following specs:
i. Detector: TCD;
ii. Carrier: Helium;
iii. Column: Porapak Q and Molecular Sieve 5A;
iv. Oven Temperature: 100°C;
v. Sample volume: 1 ml;
vi. Carrier Flow: 25 ml/min;
vii. Detector Temperature: 120°C
Under the described chromatographic conditions, the four gases could be easily separated and quantified [25,26].
The methodology started with the preparation of a calibration gas sample by injecting known volumes of each of the four pure gases (CO, SO2, NO2 and H2S), with balance nitrogen, into a gas sampling bag through the bag's rubber septum. One mL of the calibration gas mixture and of the combustion gases was analyzed on the GC system. Random measurements of the combustion gases before and after the purification process were made in the same manner in order to evaluate the removal rate of the undesirable pollutants.
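The removal-rate evaluation reduces to comparing each gas concentration before and after the filter. The sketch below assumes concentrations have already been quantified from the GC peak areas; all numerical values are hypothetical, not measurements from this study:

```python
def removal_rate(c_before, c_after):
    """Percent removal of a pollutant across the activated-carbon
    filter, from concentrations measured before and after the bed."""
    return 100.0 * (c_before - c_after) / c_before

# Hypothetical ppm values for the four monitored gases
gases = {"CO": (850.0, 160.0), "SO2": (1200.0, 210.0),
         "NO2": (940.0, 175.0), "H2S": (310.0, 55.0)}
for gas, (before, after) in gases.items():
    print(gas, round(removal_rate(before, after), 1), "%")
```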
Results and Discussion
Composition and Physical Properties of Raw Materials and Synthetized Products
Table 4 summarizes the chemical composition (elemental analysis) and some physical properties of the 5 precursors studied. Elemental nitrogen, carbon and hydrogen were determined with the flash-combustion elemental analyzer, while oxygen was determined by difference. Table 4 shows that carbon was the most abundant element in the raw materials (except for corncob, which contained less carbon than oxygen), followed by oxygen, with nitrogen the least abundant. The lower carbon content of corncob can be attributed to a higher content of volatiles in its structure, translating into a high porosity of the raw material. None of the precursors contain sulfur, which is very favorable from the ecological and technological points of view.
dap: apparent density; dr: real density; P: porosity
The activation process increased the carbon content (~20-40%) after modification. In contrast, the oxygen content was reduced (~20-30%) after both physical and chemical modification, and the hydrogen content was also reduced. The nitrogen content was very small in all materials. Table 4 shows that the initial porosity of all materials except corncobs is low, below 0.5, which makes them suitable for activated carbon preparation by chemical or physical activation. The best products synthetized by each process are reported in Tables 5 & 6.
S: specific surface; K: apparent kinetic constant of the adsorption process; Y: yield; Mr: mechanical resistance; dap: apparent density.
S: specific surface; K: apparent kinetic constant of the adsorption process; Y: yield; Mr: mechanical resistance; dap: apparent density.
Activated carbon from Central American Mahogany was the most reactive material, with significant porosity development (S = 847 m2/g). It should also be noted that activated carbon from Mamey Zapote was the best adsorbent (S = 940 m2/g) and had the highest mechanical resistance, yield and apparent density, all important for filter durability, though it is the least available material. Finally, common corncobs, a widely available agricultural by-product in Central American countries, showed the worst results for all adsorbent properties, attributable to the high porosity of the initial raw material (P = 0.79%).
Table 6 shows that activated carbon from Central American Mahogany was again the most reactive material, now registering the highest yield and adsorbent area (S = 832 m2/g). Activated carbon from Mamey Zapote was the second-best adsorbent (S = 805 m2/g) and again had the highest mechanical resistance and apparent density. Once more, activated carbon from common corncobs was the worst adsorbent (S = 470 m2/g), a consequence of poor porous structure development during activation. Further analysis of the data in Tables 5 & 6 indicates that, irrespective of the variant used, the activation process leads to further changes in the structure of the carbonaceous material. The activated carbons synthetized from the different materials studied here differ significantly, mainly in specific surface area development. The adsorbents differ not only in their surface properties but also in their texture and morphology, which depend first of all on the activation variant and the pyrolysis conditions of the initial material. Figure 3 illustrates the differences in specific surface development between the two methods for the same materials.
Figure 3 clearly shows that larger specific surface area development was achieved with the physical activation process; those products were the better adsorbents for removing the undesirable pollutants. It also confirms that the factorial experimental designs used are well suited to optimizing the conditions for activated carbon preparation. Textural parameters significantly affect the adsorption properties of the samples studied [5]. This observation suggests that the functional groups of the surface also have considerable influence on the ability to remove combustion gases. All adsorbents studied produced a rapid decrease in CO, SO2, NO2 and H2S concentration after the gases interacted with the corresponding filters. The high intensity of these harmful gas reductions at ambient conditions can be attributed to much better adsorption on higher-surface-area products. This explains the lower gas removal efficiency of the chemically activated carbons. The chemical activation process has the additional disadvantage of requiring the product to be washed after preparation, which inevitably adds cost.
Gas Monitoring System
Analysis of the calibration gas through the regression equations obtained from triplicate analyses of the gas mixtures at identical concentrations revealed excellent agreement with the known concentrations. The pollutant monitoring system and analytical method used were effective for the simultaneous analysis of the four toxic combustion gases. Figure 4 shows two chromatograms obtained during the analysis of combustion gases purified with physically activated carbon from Mamey Zapote. A significant difference before and after the gases interact with the adsorbents can clearly be observed: the adsorbents remove a large amount of these undesirable gases, with the associated environmental benefits.
The analyses of the combustion gases revealed moderate concentrations of H2S and CO but very high concentrations of SO2 and NO2. The most effective adsorbents for removing these gases were the physically activated ones. At the present state of knowledge we can only speculate about the reasons for the poor results obtained with the chemically activated samples; most probably the reason is the presence of a large number of acidic groups on their surface, in contrast to the physically activated samples, which probably carry more basic functional groups. Other chemicals present in the combustion atmospheres did not appear to interfere with the analyses. The chromatographic peaks were well separated and defined, and the gases were present in amounts that could easily be determined. Excellent precision, with relative standard deviations well below 2%, was achieved in all gas monitoring analyses. The speed, sensitivity and selectivity of the method make it suitable for analyzing combustion gas mixtures of the four gases studied. Table 7 shows the overall average values of pollutant removal with activated carbons (A.C.) during automotive engine combustion.
Table 7 clearly shows that the monitored SO2 and NO2 amounts are remarkably higher than the average 24 h limit values. The good news is that the activated carbons studied can efficiently remove about 80% of the pollutants in exhaust gases from automotive engines, with the added value that the harmful gas concentrations fall below the limit values. Figure 5 shows the correlation between pollutant removal rate and specific surface area of the activated carbons during automotive engine operation: higher specific surface area translates into higher pollutant removal rates, which could be estimated by Equation 4 with a correlation coefficient R2 = 0.995:
A proper choice of the chemical and physical activation parameters, such as temperature, activation time and activating agent, permits obtaining universal adsorbents with very good adsorption properties towards pollutants such as SO2, CO, NO2 and H2S; however, more studies are needed.
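The coefficients of Equation 4 are not reproduced in this text, but the kind of linear correlation between removal rate and specific surface area shown in Figure 5 can be fitted as follows. The (surface area, removal) pairs below are hypothetical values in the ranges reported in Tables 5-7, not the study's actual data:

```python
def linear_fit(xs, ys):
    """Ordinary least squares y = a*x + b; returns (a, b, R2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical (specific surface m2/g, removal %) pairs
S = [470.0, 805.0, 832.0, 847.0, 940.0]
removal = [52.0, 74.0, 76.0, 77.0, 83.0]
a, b, r2 = linear_fit(S, removal)
print(round(a, 4), round(b, 2), round(r2, 3))
```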
Conclusion
The agricultural wastes studied here are a feasible alternative for the synthesis of activated carbons for pollutant removal during automotive engine operation. The main features that make these products viable for reducing automotive engine emissions are their high adsorption capacity, which approaches the efficiency of commercial Catalytic Converters, as well as their lower cost and renewability. Based on these results, the granular activated carbons studied, produced in large amounts, are fully exploitable for combustion gas treatment. The complex flue gas mixture of SO2, CO, NO2 and H2S can be successfully analyzed with good compound separation and repeatability. The method used in this investigation would also be suitable for combustion toxicology research and could easily be modified to analyze these gases when they are liberated from biological sources [27].
Acknowledgement
The author wishes to acknowledge Maria Andrea Camerucci and Ana Lia Cavalieri from Mar del Plata University, Argentina, they provided a crucial help in the experiments of this work.
To know more about Journal of Chemistry, click here: https://juniperpublishers.com/omcij/index.php
To know more about Juniper Publishers, click here: https://juniperpublishers.com/index.php
Use of Moringa (Moringa Stenopetala) Seed Extract for Removal of Some Anionic Dyes (Direct and Reactive Dyes) in Textile Wastewater - Juniper Publishers
Abstract
Textile wastewater, one of the major sources of pollution, contains high levels of colour, BOD, COD and several other pollutants, and poses a serious problem to the ecological environment. This study focuses on evaluating the efficacy of the natural absorbent Moringa in removing dyes such as reactive dyes, direct dyes, mixtures of dye wastes and mixtures of industrial and dye wastewaters. The seeds of the Moringa tree contain a coagulant protein that can be used in the treatment of industrial wastewater. Seed extracts (coagulant) obtained by two methods, simple extraction with distilled water and extraction with saline water, were used in the study. The effect of several operating parameters on coagulation, namely pH, coagulant dosage, mixing time, colour removal and turbidity, was studied. Colour removal for direct red and reactive red dyes was 94.45 and 98.4% respectively with the distilled-water seed extract, and 96.6 and 97.3% respectively with the saline extract; these values were optimized at 70 ml/L of coagulant and pH 10. For the mixture of industrial wastewater and dye wastes, colour and turbidity removal at the optimized point was 85.8% with the simple extract and 53% with the saline extract. Moringa stenopetala seed has demonstrated high removal ability for anionic dyes.
Keywords: Moringa stenopetala; Natural coagulant;Colour removal; Wastewater; Treatment; Coagulation; Flocculation
Introduction
In textile wastewater, dyes are considered the most important pollutants. The high volume of wastewater produced by the textile industry causes water pollution. Dyes in wastewater from textile and dyestuff industries are generally difficult to remove because they are usually synthetic and have complex aromatic structures, which make them more stable and consequently harder to biodegrade [1]. The removal of dyes from textile wastewater is one of the major environmental challenges [2]. For many years, researchers have worked on ways of removing dyes from wastewater, and different procedures have been developed, for example physical and chemical degradation, adsorption onto materials such as activated carbon, and a large number of other techniques such as Fenton's oxidation, electrochemical degradation and ozonisation [3]. The methods most commonly used in the textile industry are chemical methods employing oxidizing agents such as hydrogen peroxide and ozone, and purification by the physicochemical process of coagulation-flocculation, in which chemical compounds, most often iron and aluminium salts, are used [4]. This physicochemical process is widely used in both developed and developing countries for its easy operation and low cost. Nevertheless, when applied to textile wastewater, it generates large volumes of sewage sludge and decolorizes some soluble dyes ineffectively.
The Moringa tree belongs to the family Moringaceae, shrub trees cultivated across the whole tropical belt including Ethiopia and used for a variety of purposes such as food, medicine and others [5]. The seeds of the Moringa tree contain a coagulant protein that can be used in the treatment of industrial wastewater. Moringa seeds are also used as a coagulant/flocculant agent for drinking water clarification, owing to their high content of a water-soluble cationic protein able to reduce turbidity. Oil extracted from Moringa can be used for water treatment and drinking water clarification, and it is also used for textile wastewater treatment [6]. Natural coagulants are used in developing countries as substitutes for external chemical coagulants such as aluminium sulfate and ferric chloride; the water-soluble extract of the dry Moringa seed is one such natural coagulant. Moringa is used for water treatment in two different ways: as a primary source of activated carbon, and through seed extraction to produce a product acting as a coagulant/flocculant agent [2].
At low working temperatures, the performance of these customary chemical coagulants is dubious, and ongoing developments have led to the introduction of polymerized aluminium coagulants. Owing to its superior performance and lower consumption of alkalinity, polyaluminium chloride (PACl) has garnered a growing market. Flocculants such as the common polyacrylamide, an organic synthetic polymer available on the market, offer a wider selection of chemical coagulants to cater for the diverse requirements of individual water treatment plants [7]. Currently, there is increased interest in the decolorization and decontamination of industrial textile wastewater. Different treatment technologies have been studied to solve the problems caused by the toxic substances contained in industrial textile wastewater, such as electrocoagulation, adsorption, photocatalytic processes, ozonation, membrane bioreactors and anaerobic/aerobic biological treatment [8]. However, these methods are neither economically nor technologically suitable for large-scale use and normally require the combination of two or three methods to achieve an appropriate level of colour removal [9-11]. The various coagulants attempted for colour removal and their efficiencies are listed in Table 1. To overcome the drawbacks of inorganic coagulants and synthetic polymers, associated with growing environmental concerns worldwide, other potential alternatives for textile wastewater treatment need to be considered in order to minimize environmental damage and improve the wellbeing of human populations. Therefore, researchers have shown considerable interest in the development of natural polymers as coagulants in the recent past [7,12-14].
RTW: Real Textile Wastewater; SS: Suspended Solids; SSP: Surjana Seed Powder; MSP: Maize Seed Powder; TW, Cr: Tannery Waste, Chromium
Plant materials as coagulants offer several advantages over conventional coagulants such as aluminium sulphate as stated below [8].
1. Activity is maintained over a wide range of influent pH values - no pH correction required
2. Natural alkalinity of the raw water is unchanged following coagulation-no addition of alkalinity required
3. Sludge production is greatly reduced and is essentially organic in nature with no aluminium residuals – sludge volumes are reduced by a factor of up to 5.
4. Minimal coagulant dosage requirement
5. Efficiency at low temperature
6. Unlike chemical coagulants, which are generally more expensive, toxic and of low biodegradability.
Many researchers have studied Moringa seed for water and wastewater treatment, both as the seed itself and as cake powder. As these studies report, Moringa seeds contain protein (26.50% - 32.00%), fiber (5.80% - 9.29%), ash (5.60% - 7.50%), fat (42% - 45%) and moisture (8.7% - 9.1%) [15]. Moringa seed is also described as one of the most effective natural coagulants, applied to transform water constituents into forms that can be separated out physically. Significant quantities of high-molecular-weight water-soluble proteins present in the seed of Moringa carry a positive charge [16]. When the crushed seeds are added to raw water, the protein’s positive charges act like magnets, attracting the predominantly negatively charged particles (such as clay, silt, bacteria, and other pollutants). Under proper agitation, these bound particulates then grow in size to form flocs, which may be removed by filtration or left to settle by gravity [17-19]. In developing countries, Moringa seed is viewed favorably as a way to reduce the costs of wastewater treatment in comparison with chemical coagulants [17,20-22]. In addition, the sludge produced when Moringa seed is used as a coagulant is reported to be innocuous and 4-6 times smaller in volume than that produced by chemical coagulants. In the present attempt, a systematic study has been carried out to assess the efficacy of colour removal from textile wastewater containing anionic dyes, such as the direct and reactive dyes most widely used for coloration of cotton, using Moringa stenopetala seed extracts as coagulant.
Materials and Method
Materials
A domestic mill for grinding the seeds into powder, a nylon sieve filter, Whatman filter paper 4, a digital pH meter, a UV spectrophotometer (Perkin Elmer, Model Lambda 2500), and Moringa seed were used during the experimental work. An HI93703 turbidity meter was used to measure turbidity as per the ISO 7027 method. The dyes used in this study were direct blue/red and reactive red/blue dyes and a mixture of both classes: C.I Direct Red 81 (λmax 497nm), C.I Direct Blue 86 (λmax 346nm), C.I Reactive Blue 19 (λmax 315nm) and C.I Reactive Red 195 (λmax 532nm). Finally, all dyes were mixed together (λmax 362nm) and also combined with industrial wastewater obtained from Bahir Dar Textile Share Company, Bahir Dar, Ethiopia (λmax 333nm). Various chemicals, as presented in Table 2, were used for making buffer solutions of different pH, and sodium chloride and ethanol were used for Moringa seed extraction.
Methods
Moringa (M. stenopetala) seed collection: Moringa seeds were collected manually from the dried pods of trees. The seeds were dried in sunlight for three days. After drying, the hull was removed from the seed surface and the wings from the kernels. The kernels were ground into a medium-fine powder with a domestic mill to achieve solubilization of the active ingredients in the seed.
Coagulant extracts: Moringa seed can be used as a coagulant either as whole seed or after extracting the oil from the seed (defatted cake). To enhance effectiveness in wastewater treatment, some studies have recommended using defatted Moringa seed [23,24]. In line with this recommendation, defatted Moringa seed was prepared for this study. Extracts were prepared by two methods: a simple extract with distilled water and a saline extract with sodium chloride, as explained below.
1. Simple extract
Production process of simple extract coagulant: Grinding of dry seed without shell → Aqueous dissolution of seed in distilled water → Stirring → Filtration. To prepare 1 L of simple extract coagulant in aqueous solution, 50 g of seed powder was dissolved in 1 L of distilled water by mixing vigorously for 45 min with a magnetic stirrer at room temperature. The mixture was then filtered twice: once through commercial filter paper on a funnel and again through a fine-filtering system (Whatman filter paper). The result was a clear, milk-like liquid, which was used as coagulant without further purification.
2. Saline extract
Production process of saline extract coagulant: Grinding of dry seed without shell → Dissolution of seed in a solution of 0.5M NaCl → Stirring → Filtration
To prepare 1 L of saline extract coagulant in aqueous solution, 50 g of powder was dissolved in 1 L of 0.5M NaCl solution by mixing vigorously at pH 7 for 45 min with a magnetic stirrer at room temperature. The mixture was then filtered twice: once through commercial filter paper on a funnel and again through a fine-filtering system (Whatman filter paper). The result was a clear, milk-like liquid, which was used as coagulant without further purification. After the seed kernels had dried, a domestic mill was used to grind them into a fine powder to achieve solubilization of the active ingredients in the seed. For oil extraction, 100 g of the powder was soaked in 500 ml of 95% ethanol for 45 minutes at room temperature, with occasional stirring. To obtain the defatted cake, the solution was filtered using filter paper. The remaining solids (pressed cake) on the filter were then dissolved in water, followed by stirring and filtration in the same way. The cake was then dried in an oven at 40 °C for 24 hours, during which the ethanol was removed from the seed cake powder. Finally, the dried seed powder was stored at room temperature until used for the coagulation experiments (Figure 1).
a. Yield of Moringa seed in coagulant preparation
The yield is calculated as given in equation (1).
Yield of Moringa (%) = ((Mi − Mf) / Mi) × 100 (1)
Where Mi is the original weight of Moringa powder and Mf is the final weight of Moringa powder retained after filtration (undissolved solids in the solution).
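As a quick check, the yield relation in equation (1) can be sketched in Python; the 3.3 g residue below is a hypothetical figure chosen to reproduce the simple-extract yield reported in the Results, not a measured value from the study.

```python
def moringa_yield(mi_g, mf_g):
    """Percentage of Moringa seed powder dissolved during extraction.

    mi_g: original weight of powder (g)
    mf_g: weight of undissolved solids retained after filtration (g)
    """
    return (mi_g - mf_g) / mi_g * 100.0

# 50 g of powder leaving 3.3 g undissolved gives a 93.4% yield,
# matching the simple-extract value reported later in the Results.
print(round(moringa_yield(50.0, 3.3), 1))  # 93.4
```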
b. Preparation of synthetic Effluent samples
In order to test the coagulant extracts, synthetic samples were first prepared in the laboratory using two types of dye, all azo dyes but of different application classes (direct and reactive), tested individually, as mixtures, mixed with industrial wastewater, and as a mixture of dyes added to industrial wastewater. The dye stock solutions were prepared by dissolving accurately weighed dyes in water to concentrations of 100-300 mg/L; dyeing of a 100% cotton fabric sample was then carried out following standard procedures, and the dye wastewater was collected after dyeing. Different concentrations were prepared from the collected dye waste (250, 500 and 1000 ml in beakers) for treatment with the coagulant.
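The stock-solution step above amounts to a mass = concentration × volume calculation; a minimal sketch (the 300 mg/L and 500 mL figures are illustrative, not values prescribed by the procedure):

```python
def dye_mass_mg(conc_mg_per_l, volume_ml):
    """Mass of dye (mg) to weigh out for a stock solution of the
    given concentration (mg/L) made up to the given volume (mL)."""
    return conc_mg_per_l * volume_ml / 1000.0

# A 300 mg/L stock made up to 500 mL requires 150 mg of dye.
print(dye_mass_mg(300, 500))  # 150.0
```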
c. Effluent analysis
Total solids, dissolved solids, suspended solids, BOD5, COD were measured following standard procedure (AMHA, 1995 and Standard methods, 1995) and SPSS Statistical Data Analysis Software was used for analyzing the data.
d. Optimisation of process parameters
In order to optimize the various process parameters, coagulant dosage range of 10-80ml/L, mixing time of 30-45 minutes, pH in the range of 2-12 were used.
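The ranges above define a jar-test grid; a hedged sketch of how such a grid might be enumerated (the step sizes chosen here are assumptions for illustration, not the exact combinations run in the study):

```python
from itertools import product

# Assumed step sizes over the stated parameter ranges
doses_ml_per_l = range(10, 90, 10)   # 10-80 ml/L
mixing_times_min = (30, 45)          # 30-45 minutes
ph_values = range(2, 13, 2)          # pH 2-12

grid = list(product(doses_ml_per_l, mixing_times_min, ph_values))
# 8 doses x 2 mixing times x 6 pH levels = 96 candidate jar tests
print(len(grid))  # 96
```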
e. Colour Measurements
Colour removal was determined from the difference in absorbance before and after treatment, measured with a UV-VIS spectrophotometer. The results are presented as graphs against the various process parameters. The percentage removal efficiency was calculated using formula (2).
Colour removal efficiency (%) = ((Ao − A) / Ao) × 100 (2)
Where Ao is the Absorbance value before treatment and A is the Absorbance value after treatment.
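Equation (2) can be sketched directly in Python; the absorbance values below are illustrative only, not readings from this study:

```python
def colour_removal_efficiency(a0, a):
    """Percentage colour removal, per equation (2).

    a0: absorbance before treatment; a: absorbance after treatment.
    """
    return (a0 - a) / a0 * 100.0

# Illustrative absorbances: 1.25 before and 0.02 after treatment
print(round(colour_removal_efficiency(1.25, 0.02), 1))  # 98.4
```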
Results and Discussion
Yield of extracts
It was observed that the yield of Moringa seed powder dissolved in the solution was 93.4% and 92.5% for the simple and saline extracts respectively. There was no significant difference in yield between the extraction methods, and it is evident that the maximum percentage of Moringa powder dissolves and is thus available for the water and wastewater treatments in this study.
Characteristics of Raw Textile Effluent
An initial experiment was carried out to determine the preliminary characteristics of the textile effluent for examining the effectiveness of M. stenopetala as a coagulant. The pH of the effluent was found to be in the range of 9 to 11.5 for all dye effluents and for industrial wastewater containing dyes (Table 3), indicating that effluent/dye waste from the textile industry is alkaline in nature. The measurements show that total solids reached maximum levels of 9000 ppm and 8000 ppm for the two direct dyes, Red 81 and Blue 86, respectively; these decreased to 3575 ppm and 3830 ppm after treatment with the simple extract and to 3500 ppm and 2200 ppm with the saline extract, as shown in Table 4. The experiment also confirmed that TDS is higher than TSS in the textile wastewater samples. Furthermore, the wastewater samples showed COD values higher than BOD5 values, indicating that textile wastewater contains 3 to 4 times more non-biodegradable than biodegradable organic matter. For instance, Direct Red 81 and Blue 86 had the higher COD values, which decreased to 750 and 190 respectively with the simple extract. Likewise, the highest BOD5 values were found for Reactive Red 195 and Blue 19, which were reduced to 180 and 205 respectively with the simple extract. Reductions in COD and BOD5 were also observed with the saline extract treatment. The treated effluent was characterized, and the values of the various parameters (pH, total dissolved solids, turbidity, BOD5 and COD) were compared with those of the raw effluent. The characteristics of the raw textile effluent and of the effluent after treatment with the simple and saline extracts are summarized in Tables 3 & 4. One liter of wastewater out of the total solution was used for each treatment, and all samples were mixed for 30-45 minutes.
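The COD-to-BOD5 comparison above is essentially a biodegradability ratio; a minimal sketch (the 1200 and 300 mg/L figures are hypothetical, not the measured values in Tables 3 & 4):

```python
def cod_bod_ratio(cod_mg_per_l, bod5_mg_per_l):
    """COD/BOD5 ratio; ratios around 3-4, as observed for this
    textile wastewater, indicate a largely non-biodegradable load."""
    return cod_mg_per_l / bod5_mg_per_l

# Hypothetical raw-effluent values: COD 1200 mg/L, BOD5 300 mg/L
print(cod_bod_ratio(1200, 300))  # 4.0
```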
From the tables, it is also clear that removal of turbidity, TS, COD and BOD5 is lower for industrial wastes and for dye solutions mixed with industrial wastewater, owing to the many other waste components present in industrial wastewater.
The Effects of processing parameters on Coagulation
Effect of coagulant dose
Coagulant dosage is one of the most important parameters considered in determining the optimum conditions for the performance of coagulants in coagulation and flocculation. The coagulant dosage indicates the concentration of M. stenopetala seed extract in the water; this distinction is important to note since much of the seed mass was separated during the filtration step when preparing the extract. Essentially, an insufficient dose or an overdose results in poor flocculation performance. It is therefore important to determine the optimum dosage in order to minimize dosing cost and sludge formation and to obtain optimum treatment performance. The effect of coagulant dose (10-80 ml/L) on the removal of reactive and direct dyes using Moringa stenopetala coagulant, at a flocculation time of 30-45 min, is shown in Figures 2-5. There was continuous removal of these dye colours with increasing coagulant dose up to 70 ml/L, as shown in Figures 2 & 3. Beyond 70 ml/L the colour removal decreased, confirming the optimal dose of 70 ml/L. This may be a result of re-suspension of solids at higher concentrations. Furthermore, high concentrations (>25.0 mg/L) of the coagulant confer positive charges on the particle surface (a positive zeta potential), thus redispersing the particles.
It is also assumed that an increase in the coagulant dose may decrease the pH of the system, as a result of hydrolysis of the coagulants. Low pH values of the coagulated system are usually attributed to neutralization of the negatively charged surfaces of wastewater colloids, leading to their destabilization by H+ ions. However, acidification of coagulated wastewater may disturb sorption or could increase the solubility of freshly formed sludge. The highest percentage colour removals were found to be 98.4, 86, 94.45, 89.3, 90.5 and 85.8% for the simple extract and 97.3, 84.45, 96.6, 84.8, 87.73 and 84.6% for the saline extract, for Reactive Red 195, Reactive Blue 19, Direct Red 81, Direct Blue 86, industrial wastewater and the mixture respectively. This confirms that Moringa seeds have adsorbent properties and are effective for colour removal.
Colour removal
After characterization of the effluent, the coagulants prepared from Moringa stenopetala seeds were added to the effluent to determine the effectiveness of the extract on the textile effluent. The results show that the extract removes turbidity and colour from the textile effluent: a colour removal efficiency of up to 98.4% was reached using 70 ml of Moringa stenopetala coagulant extract. The use of Moringa seeds has an added advantage over chemical treatment of water because it is biological, and the seeds have been reported to be edible [25]. All dyes selected for this study were prepared in the laboratory for treatment and were also mixed with industrial wastewater. A dye solution was prepared as required and its pH measured using a pH meter. One liter of the initial solution was put into each of two beakers, and various doses of coagulant were added (10, 20, 30, 40, 50, 60, 70 and 80 ml/L). The beakers were placed one by one on a standard magnetic stirrer. The solution was stirred for 10 min at high speed, then slowly mixed at 60 rpm for 30-35 min, after which it was left for 45-60 min to settle. The supernatant after settling was filtered through Whatman filter paper (pore size 20-25 μm). The reduction in colour concentration was measured at the maximum-absorbance visible wavelength of each dye solution using a UV-VIS spectrophotometer. The colour removal results are shown in Figures 2 & 3. Colour removal from industrial wastewater is lower because the effluent contains a high content of dyestuffs, surfactants and other additives, generally organic compounds with complex structures. These wastewaters are collected from different sections using such different compounds and combined in the wastewater plant for treatment.
Effect of flocculation time
The time of macrofloc formation (flocculation time) is one of the operating parameters given great consideration in any water treatment plant involving coagulation-flocculation operations. Figures 4 & 5 present the effect of flocculation time at different coagulant doses on the removal of reactive dyes, direct dyes and the mixture of industrial wastewater and dyes from textile effluents. Removal increased consistently with flocculation time up to 60 min and then decreased; the optimum flocculation time was therefore found to be 60 min. The highest removals of the selected dye colours with the simple extract at 60 minutes were 98.4, 86, 94.45, 89.3, 90.5 and 81.5% for reactive red, reactive blue, direct red, direct blue, mixed dyes and the mixture of dyes and industrial wastewater respectively. Likewise, for the saline extract the maximum colour removals were 97.3, 84.41, 96.6, 84.8, 87.73 and 80.3% for reactive red, reactive blue, direct red, direct blue, mixed dyes and the mixture of dyes and industrial wastewater respectively.
Ebeling et al. reported that removal of turbidity and soluble reactive phosphorus (SRP, orthophosphate) increased as settling time increased from 5 to 45 min [26]. A series of jar tests was conducted to determine the effect of settling time (0, 5, 10, 15, 20, 25 and 30 min) on the removal efficiencies of BOD5, COD and TSS using low-alkalinity wastewater; the results show that small particles settle out quickly within the first 5 min, with little change in the values up to 15 min. The differences in BOD5, COD and TSS removal were not significant after 20 min of settling, as indicated by other studies [27,28]. The settling time results are given in Figures 4 & 5.
Effect of mixing time
Studies were made to find the effect of varying the mixing time on colour removal efficiency. Mixing times ranging from 15 to 60 minutes were used, and the results are reported in Figures 6 & 7. The experimental results show continuous removal of colour, turbidity, TDS and TSS as the mixing time increased from 15 to 45 minutes; similar results were observed by Patel and Vashi for other dyes [29]. When the mixing time is short (<45 minutes), collisions between the coagulants and colloids are not efficient enough to precipitate suspended solids in the wastewater. On the other hand, mixing longer than 45 min leads to increased floc breakage and limits the size of the flocs formed, producing small flocs that are not dense enough to settle and finally causing the sample to become turbid again. In sum, mixing times that are too long or too short result in poor performance of the coagulant seed for binding and bridging; similar results were obtained in experiments by other researchers [16]. At short contact times (≤15 min) the maximum colour removal efficiency reached only 63.6 and 63.7% for mixed dyes and reactive red in the simple and saline extract methods respectively. When the reaction time was increased to 45 minutes, the efficiency was nearly 95%, and it decreased when the time was increased beyond 45 minutes [30-35].
Effect of pH
The aqueous dye waste solutions were treated with a constant coagulant dose of 70 ml/L for half a day at varying pH (2-12), maintained with buffer solutions. Figures 8 & 9 show the effect of the pH of the dye solution on the decolorization percentage within the range 2-12 for both the simple and saline extract methods. The results showed that decolorization reached a maximum between pH 8 and 10 for the selected dye wastes and the industrial wastewater at all dosages. pH was one of the crucial parameters for determining the optimum level in order to minimize dosing cost and obtain optimum treatment performance. By comparison, pH variation had a significant effect on the decolorization of reactive dye, direct dye and their mixture with textile wastewaters by Moringa stenopetala seed extract.
From these figures, it can be seen that the highest colour removal was at pH 10 for all dye wastes: as the pH increases from 2 to 10, colour removal from textile wastewater increases, reaching its highest point at pH 10. Dye removal is greater at higher pH because the surface of the activated coagulant is negatively charged. A decrease in adsorption capacity at low pH values is expected, as the acidic medium increases the hydrogen ion concentration, which neutralizes the negatively charged coagulant surface and thereby decreases the adsorption of positively charged ions through a reduction in the force of attraction between adsorbate and adsorbent [36-40].
Removal of Turbidity
Varying doses of the seed extract were added to the effluent, followed by mixing for the optimum time. The mixture was then filtered through filter paper and the turbidity measured with a turbidity meter. The results show that the higher turbidity removal efficiency was obtained with the simple extract method, 84.6%, against 83.7% with the saline extract method, for the reactive blue dye (Reactive Blue 19) at a dosage of 70 ml/L, as shown in Figures 10 & 11.
Effect of Dye types
The comparative colour removal efficiency for reactive dyes, direct dyes and their mixture with industrial wastewater is shown as a bar chart in Figure 12. The average colour removal efficiency of reactive dyes is higher than that of direct dyes. Reactive dyes are water-soluble owing to the negative charge of the sulphonate group (SO3-), whereas direct dyes have planar molecular structures in which positive charges outnumber negative charges, and are also water-soluble. Direct dyes also carry sulphonate (SO3-) functionality, but in their case only to improve solubility, as the negative charges on dye and fibre repel each other. The colour shade affects the efficiency of reactive colour removal by the natural coagulant. For the mixed dye, the average colour removal lies between those of the reactive and direct dyes, because all the reactive and direct dyes are mixed together and the interchange of charges affects colour removal [41-43].
Conclusion
Seeds of Moringa stenopetala contain materials that are effective as a coagulant. Coagulant dose and coagulation pH are important factors influencing the mechanism of coagulation, and the coagulation process varies depending upon the type of dye. The optimized parameters for the coagulation of textile wastewater using M. stenopetala were pH 10 and a dosage of 70 ml/L, which removed 98.4 and 94.45% of colour with the simple extract, and 97.3 and 96.6% with the saline extract, for the reactive and direct red dyes respectively. Moringa seed can also remove a maximum of about 85% of the turbidity of textile dye wastes. It can be concluded that the natural coagulant had a significant impact on the physical treatment of textile wastewater. Colour removal from the mixture of industrial wastewater and the reactive and direct dye solutions was 85.8 and 84.6% with the simple and saline extracts respectively; this lower removal reflects the industrial effluent's high content of dyestuffs, surfactants and other additives, generally organic compounds with complex structures, collected from different sections that use different compounds.
To know more about Journal of Fashion Technology-https://juniperpublishers.com/ctftte/index.php
To know more about open access journals Publishers click on Juniper Publishers
0 notes
Link
Index Copernicus :
All journals may be registered in the ICI World of Journals database. The database gathers information on international scientific journals which is divided into sections: general information, contents of individual issues, detailed bibliography (references) for every publication, as well as full texts of publications in the form of attached files (optional). Within the ICI World of Journals database, each editorial office may access, free of charge, the IT system which allows you to manage your journal's passport: updating journal’s information, presenting main fields of activity and sharing the publications with almost 200 thousand users from all over the world. The idea behind the ICI World of Journals database is to create a place where scientific journals from all over the world would undergo verification for ‘predatory journals’ practices by scientific community. The ICI World of Journals database allows journals which care about completeness and topicality of their passports to build their citation rates and international cooperation.
https://journals.indexcopernicus.com/search/details?id=48074
Scilit :
The name Scilit uses components of the words “scientific” and “literature”. This database of scholarly works is developed and maintained by the open access publisher MDPI.Scilit is a comprehensive, free database for scientists using a new method to collate data and indexing scientific material. Our crawlers extract the latest data from CrossRef and PubMed on a daily basis. This means that newly published articles are added to Scilit immediately.
https://www.scilit.net/publisher/4378
Publons :
Publons is a commercial website that provides a free service for academics to track, verify, and showcase their peer review and editorial contributions for academic journals. It was launched in 2012 and by 2018 more than 500,000 researchers have joined the site, adding more than one million reviews across 25,000 journals. Publons' mission is to "speed up science by harnessing the power of peer review". Publons claims that by turning peer review into a measurable research output, academics can use their review and editorial record as evidence of their standing and influence in their field. Publons says its business model is based on partnering with publishers.Publons produces a verified record of a person's review and editorial activity for journals. This evidence is showcased on reviewers' online profiles and can be downloaded to include in CVs, funding and job applications, and promotion and performance evaluations.Publons also provides:• tools for publishers to find, screen, contact, and motivate peer reviewers;• data and publications about global peer review behaviour;• peer review training for early-career researchers; and• features for academics to discuss and evaluate published research.
https://publons.com/publisher/6250/juniper-publishers
Sindexs:
Scientific Indexing Services (SIS) was founded by renowned scientists. A group of 70 scientist from various countries in different disciplines are started SIS with specific objective of providing quality information to the researcher. SIS offering academic database services to researcher. It's mainly: citation indexing, analysis, and maintains citation databases covering thousands of academic journals, books, proceedings and any approved documents SIS maintains academic database services to researchers, journal editors and publishers. SIS focuses on: citation indexing, citation analysis, and maintains citation databases covering thousands of academic journals. SIS Provides Quantitative And Qualitative Tool For Ranking, Evaluating And Categorizing The Journals For Academic Evaluation And Excellence. This Factor Is Used For Evaluating The Prestige Of Journals. The Evaluation Is Carried Out By Considering The Factors Like Paper Originality, Citation, Editorial Quality, and Regularity & International Presence. We Perform The In-Depth Analysis Method. The Acceptance And Rejection Rates Of Journals Can Be A Determining Factor. Low Acceptance Rate, High Rejection Rate Journals Are Considered The Best And Most Prestigious Journals As The Acceptance Criteria Is Of High Quality Standard. Many Journals And Societies Have Web Pages That Give Publication Data And Style Requirements And Often Includes Acceptance/Rejection Rates. The Paper Copy Of The Journal Occasionally Includes This Data And Will Always Provide Current Contact Information. Whether A Journal Is Indexed In The Major Indexing/Abstracting Service In The Field Is Another Criteria That Can Be Used To Assess The Worth And Quality Of A Journal.
Researchbib :
ResearchBib is open access with high standard indexing database for researchers and publishers. Research Bible may freely index journals, research papers, call for papers, research position.We share a passion to build research communities to discover and promote great research resources from around the world and maximize researchers’ academic social impacts.
Google Scholar :
Google Scholar is a freely accessible web search engine that indexes the full text or metadata of scholarly literature across an array of publishing formats and disciplines. Released in beta in November 2004, the Google Scholar index includes most peer-reviewed online academic journals and books, conference papers, theses and dissertations, preprints, abstracts, technical reports, and other scholarly literature, including court opinions and patents. While Google does not publish the size of Google Scholar's database, scientometric researchers estimated it to contain roughly 389 million documents including articles, citations and patents making it the world's largest academic search engine in January 2018. Previously, the size was estimated at 160 million documents as of May 2014.] An earlier statistical estimate published in PLOS ONE using a Mark and recapture method estimated approximately 80–90% coverage of all articles published in English with an estimate of 100 million. This estimate also determined how many documents were freely available on the web.
Worldcat :
WorldCat is the world's largest network of library content and services. WorldCat libraries are dedicated to providing access to their resources on the Web, where most people start their search for information.You can search for popular books, music CDs and videos—all of the physical items you're used to getting from libraries. You can also discover many new kinds of digital content, such as downloadable audiobooks. You may also find article citations with links to their full text; authoritative research materials, such as documents and photos of local or historic significance; and digital versions of rare items that aren't available to the public. Because WorldCat libraries serve diverse communities in dozens of countries, resources are available in many languages.
https://www.worldcat.org/search?qt=affiliate_wc_org_all&ai=Directory_ashok.drji%252540gmail.com&fq=&q=Orthopedics+and+Rheumatology+Open+Access+Journal&wcsbtn2w=Go
Crossref:
Crossref (formerly styled CrossRef) is an official Digital Object Identifier (DOI) Registration Agency of the International DOI Foundation. It is run by the Publishers International Linking Association Inc. (PILA) and was launched in early 2000 as a cooperative effort among publishers to enable persistent cross-publisher citation linking in online academic journals. Crossref is a not-for-profit association of about 2,000 voting member publishers who represent 4,300 societies and open access publishers, including both commercial and not-for-profit organizations. Crossref includes publishers with varied business models, including those with both open access and subscription policies. Crossref does not provide a database of full-text scientific content. Rather, it facilitates the links between distributed content hosted at other sites. Crossref interlinks millions of items from a variety of content types, including journals, books, conference proceedings, working papers, technical reports, and data sets. Linked content includes materials from Scientific, Technical and Medical (STM) and Social Sciences and Humanities (SSH) disciplines. The expense is paid for by Crossref member publishers. Crossref provides the technical and business infrastructure for this reference linking using Digital Object Identifiers (DOIs), along with deposit and query services for its DOIs. In addition to the DOI technology linking scholarly references, Crossref enables a common linking contract among its participants: members agree to assign DOIs to their current journal content, and they also agree to link from the references of their content to other publishers' content. This reciprocity is an important component of what makes the system work. Non-publisher organizations can participate in Crossref by becoming affiliates; such organizations include libraries, online journal hosts, linking service providers, secondary database providers, search engines and providers of article discovery tools.
https://www.crossref.org/06members/50go-live.html#Juniper+Publishers
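As a concrete illustration of the DOI linking described above, a registered DOI can be used in two standard ways: resolved through the doi.org proxy to the publisher's landing page, or looked up in Crossref's public REST API to retrieve the deposited metadata. A minimal sketch (the example DOI 10.1000/xyz123 is purely illustrative):

```python
from urllib.parse import quote


def doi_links(doi: str) -> dict:
    """Build the standard resolver URL and the Crossref REST API
    metadata URL for a DOI. The resolver redirects to the publisher's
    landing page; the API URL returns the registered metadata as JSON."""
    doi = doi.strip().lower()  # DOIs are case-insensitive
    return {
        "resolver": f"https://doi.org/{quote(doi)}",
        "metadata": f"https://api.crossref.org/works/{quote(doi)}",
    }


links = doi_links("10.1000/xyz123")
print(links["resolver"])  # https://doi.org/10.1000/xyz123
```

Fetching either URL requires network access; the sketch only constructs the request targets.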
ICMJE:
The ICMJE Recommendations (full title, Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals) are a set of guidelines produced by the International Committee of Medical Journal Editors for standardising the ethics, preparation and formatting of manuscripts submitted for publication by biomedical journals. Compliance with the ICMJE Recommendations is required by most leading biomedical journals. As of 2017, over 3,274 journals worldwide followed the recommendations (formerly known as the Uniform Requirements).
http://www.icmje.org/journals-following-the-icmje-recommendations/
Scribd:
Scribd began as a site to host and share documents. While at Harvard, Trip Adler was inspired to start Scribd after learning about the lengthy process required to publish academic papers. His father, a doctor at Stanford, was told it would take 18 months to have his medical research published. Adler wanted to create a simple way to publish and share written content online. He co-founded Scribd with Jared Friedman and attended the inaugural class of Y Combinator in the summer of 2006. There, Scribd received its initial $120,000 in seed funding and then launched in a San Francisco apartment in March 2007. Scribd was called "the YouTube for documents", allowing anyone to self-publish on the site using its document reader. The document reader turns PDFs, Word documents, and PowerPoints into Web documents that can be shared on any website that allows embeds. In its first year, Scribd grew rapidly, reaching 23.5 million visitors as of November 2008. It also ranked as one of the top 20 social media sites according to Comscore. In June 2009, Scribd launched the Scribd Store, enabling writers to easily upload and sell digital copies of their work online. That same month, the site partnered with Simon & Schuster to sell e-books on Scribd. The deal made digital editions of 5,000 titles available for purchase on Scribd, including books from bestselling authors like Stephen King, Dan Brown, and Mary Higgins Clark. In October 2009, Scribd launched its branded reader for media companies including The New York Times, Los Angeles Times, Chicago Tribune, The Huffington Post, TechCrunch, and MediaBistro. ProQuest began publishing dissertations and theses on Scribd in December 2009. In August 2010, many notable documents hosted on Scribd began to go viral, including the California Proposition 8 ruling, which received over 100,000 views in about 24 minutes, and HP's lawsuit over Mark Hurd's move to Oracle.
Citefactor:
Citefactor is a service that provides access to quality-controlled Open Access journals. The directory aims to be comprehensive, covering all open access scientific and scholarly journals that use an appropriate quality control system; it is not limited to particular languages or subject areas. The aim of the directory is to increase the visibility and ease of use of open access scientific and scholarly journals, thereby promoting their increased usage and impact.
https://www.citefactor.org/journal/index/14860/orthopedics-and-rheumatology-open-access-journal#.XeDgu-gzbIV
PlagScan:
PlagScan is plagiarism detection software, mostly used by academic institutions. PlagScan compares submissions with web documents, journals and internal archives. The software was launched in 2009 by Markus Goldbach and Johannes Knabe. PlagScan is offered as Software as a Service and as an on-premise solution. Users can register either as a single user or as an organization. Upon first-time registration, single users receive a free test credit and can purchase additional credits for future submissions after the completion of a satisfactory trial. Organizational users verify the organization's address prior to using the software; an obligation-free quote can be requested immediately on the website. Organizations can choose from a variety of options and create multiple administrators and groups, for example to divide different departments within one institution. After scanning a submission for plagiarism, PlagScan provides users with a detailed report that indicates potential plagiarism and lists the matched sources.
Genamics:
Genamics is a software and web development firm dedicated to ensuring scientists have access to all the computer tools and computer resources available today. As science becomes increasingly reliant on the plethora of new ways computers improve our productivity, it is essential that scientists can readily apply this technology to their work. Our tight communication with scientists and computer technologists enables us to provide both down-to-earth and cutting-edge ways of achieving this goal.
Foundations
The products and services we create at Genamics are built on three core foundations: 1. Ease of Use; 2. High Technology; 3. Future Foresight.
1. Ease-of-Use
At Genamics we strive to design our products and services to be as easy as possible to use, without compromising their power and flexibility. We have taken great care and thought in creating user interfaces that are highly intuitive and easy to understand. Perhaps most importantly of all, we listen to our users and respond to their suggestions and requirements. It is only by this constant refinement that we can create products that are genuinely friendly and fulfilling to our users.
2. High Technology
Computers and biotechnology are perhaps the fastest moving industries of our times. The utilization of the latest technology is a key factor in keeping our products and services at the forefront of their field. By adopting cutting-edge programming tools, we have been able to drastically reduce development time and have the additional capacity to rapidly steer our applications in new directions. Our broad knowledge in computers and science allows us to select the best technologies to meet our goals and ultimately provide the best experience for our users. Applications built at Genamics are developed using a highly object-oriented approach. This has allowed us to build up a large library of general components and controls, which can readily be re-used for new and upcoming projects. Our Visual J++ Developer Center provides the medium by which we can maintain contacts and support with Visual J++ programmers on an international scale. Being largely open-source, the Genamics Library mutually benefits programmers and us, by allowing it to be extended in ways that would not be possible within a single company.
3. Future Foresight
With the fast moving industries that we are involved in, predicting and understanding their future directions is especially important to us. Consequently, the solutions we create don't just solve the problems of our customers today, but are also ready to handle the problems of tomorrow. The products and services we build are designed within a highly open framework, with many of these future considerations in mind. Similarly, the tools and technologies we adopt to create our solutions are chosen not just for how much can be achieved currently with them, but also for their own future potential and capacity to meet future challenges. At Genamics we are continually prospecting for new innovations and technologies. Already, we have a number of exciting new projects underway, which we hope to bring to you in the near future.
Semantic Scholar:
Semantic Scholar is a project developed at the Allen Institute for Artificial Intelligence. Publicly released in November 2015, it is designed to be an AI-backed search engine for scientific journal articles. The project uses a combination of machine learning, natural language processing, and machine vision to add a layer of semantic analysis to the traditional methods of citation analysis, and to extract relevant figures, entities, and venues from papers. In comparison to Google Scholar and PubMed, Semantic Scholar is designed to highlight the most important and influential papers, and to identify the connections between them. As of January 2018, following a 2017 project that added biomedical papers and topic summaries, the Semantic Scholar corpus included more than 40 million papers from computer science and biomedicine. In March 2018, Doug Raymond, who developed machine learning initiatives for the Amazon Alexa platform, was hired to lead the Semantic Scholar project. As of August 2019, the number of included papers had grown to more than 173 million after the addition of the Microsoft Academic Graph records.
https://www.semanticscholar.org/
DRJI :
DRJI provides ready access to education literature to support the use of educational research and information to improve practice in learning, teaching, educational decision-making, and research. The Directory of Research Journals Indexing is a free online service that helps you to find web resources for your articles and research. With millions of resources available on the Internet, it can be difficult to find useful material; we have reviewed and evaluated thousands of resources to help you choose key websites in your subject. Our indexed journals are submitted to social networks and to the world's top indexing services, and they are displayed in leading electronic libraries. In short, all journals will reach all continents.
http://olddrji.lbp.world/JournalProfile.aspx?jid=2473-554X
ORCID:
The ORCID (Open Researcher and Contributor ID) is a nonproprietary alphanumeric code to uniquely identify scientific and other academic authors and contributors. This addresses the problem that a particular author's contributions to the scientific literature or publications in the humanities can be hard to recognize, as most personal names are not unique: they can change (for example, with marriage), have cultural differences in name order, contain inconsistent use of first-name abbreviations and employ different writing systems. It provides a persistent identity for humans, similar to that created for content-related entities on digital networks by digital object identifiers (DOIs). The ORCID organization, ORCID Inc., offers an open and independent registry intended to be the de facto standard for contributor identification in research and academic publishing. On 16 October 2012, ORCID launched its registry services and started issuing user identifiers.
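The structure of an ORCID iD can be checked locally: the identifier is 16 characters long, and the final character is an ISO 7064 MOD 11-2 check digit ('X' represents the value 10). A minimal validation sketch, using the example iD 0000-0002-1825-0097 from ORCID's own documentation:

```python
def orcid_checksum(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the first
    15 digits of an ORCID iD."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)


def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated or bare 16-character ORCID iD."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_checksum(digits[:15]) == digits[15]


print(is_valid_orcid("0000-0002-1825-0097"))  # True
print(is_valid_orcid("0000-0002-1825-0098"))  # False (bad check digit)
```

A passing checksum only confirms the iD is well-formed; whether it is actually registered can only be confirmed against the ORCID registry itself.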
BASE :
BASE is one of the world's most voluminous search engines, especially for academic web resources. BASE provides more than 150 million documents from more than 7,000 sources. You can access the full texts of about 60% of the indexed documents for free (Open Access). BASE is operated by Bielefeld University Library. We index the metadata of all kinds of academically relevant resources – journals, institutional repositories, digital collections etc. – which provide an OAI interface and use OAI-PMH for providing their contents (see our Golden Rules for Repository Managers). The index is continuously enhanced by integrating further sources (you can suggest a source which is not indexed yet), and we are working on several new features, such as a claiming service for authors within the ORCID DE project. BASE is a registered OAI service provider. Database managers can integrate the BASE index into their local infrastructure (e.g. meta search engines, library catalogues). Further on, there are several tools and services for users, database and repository managers.
https://www.base-search.net/Search/Results?lookfor=juniper+publishers&name=&oaboost=1&newsearch=1&refid=dcbasen
Sciforum:
Sciforum is an event planning platform that supports open science by offering the opportunity to host and participate in academic conferences. It provides an environment for scholarly exchange, discussion of topics of current interest, building of networks and establishing collaborations. Sciforum was launched in 2009 by MDPI, an academic open-access publisher with headquarters in Basel, Switzerland. Sciforum does not only offer the possibility to participate in conferences, but also invites scientists to organize their own conferences. The organizers reduce their administrative efforts thanks to an online tool that supports all aspects of conference organization, including setting up and maintaining the conference website, managing the peer-review process, publishing the conference proceedings, handling and coordinating the conference schedule, registration, billing, sponsors, etc. Organizers can choose between physical and online conferences and whether they require administrative support from Sciforum staff.
ScienceOpen:
ScienceOpen is an interactive discovery environment for scholarly research across all disciplines. It is freely accessible to all and offers hosting and promotional services within the platform for publishers and institutes. The organization is based in Berlin and has a technical office in Boston. It is a member of Crossref, ORCID, the Open Access Scholarly Publishers Association, the STM Association and the Directory of Open Access Journals. The company was designated as one of "10 to Watch" by research advisory firm Outsell in its report.
https://www.scienceopen.com/user/9f93b14f-c289-4e21-b3c8-8f448ea424ab
Citeseerx:
CiteSeerX is a public search engine and digital library for scientific and academic papers, primarily in computer and information science, developed and hosted at Pennsylvania State University.
Sindexs:
Sindexs (Scientific Indexing Services) is a journal indexing service that evaluates and indexes scholarly journals to increase their visibility and ease of use; the linked page lists the journals indexed under the service.
http://sindexs.org/JournalList.aspx?ID=6090
Get More Information About Juniper Publishers Please Click on Below Link:
https://juniperpublishers.com/member-in.php