Sunday, September 8, 2019
Government Job vs Private Job Research Paper Example | Topics and Well Written Essays - 1250 words - 1
Government Job vs Private Job - Research Paper Example The "game" presents no investment in the future of the company or corporation; it is merely an exchange of wealth. This scenario contributes many problems to the current business operating atmosphere. Mergers bring about immediate economic problems that include loss of markets to foreign competitors, continuing trade deficits, inadequate operating capital, declining productivity, debt-heavy corporations, and the loss of many jobs. The debt is due mainly to the financing required to carry out the merger. These problems, plus lagging research and development, add to the complications of business operations after the merger is finalized. Another factor that has played a significant role in the current state of federal and state government budgets is government financial bailouts. The first of these was the Savings and Loan Bailout of 1989, prompted by the failure of more than half of America's Savings and Loans between 1986 and 1989, primarily due to lax government lending policies. These business bailouts have directly affected the budgets, deficits, and economic stability of our federal and state governments. The US passed the Emergency Economic Stabilization Act in October 2008 for a $700 billion financial sector bailout. This resulted in the bank rescue of 2008, which called for a $250 billion cash infusion into the banking system. The bailout of Bear Stearns in April 2008 ended with the government lending $29 billion to JP Morgan to buy the troubled firm. Fannie Mae and Freddie Mac collapsed in the late summer of 2008, and the federal government committed up to $200 billion to save both these giant mortgage lenders. In addition, $100 billion in cash credits was guaranteed to each of them to prevent bankruptcy. American International Group (AIG) was one of the largest insurance companies in the world; the government took control of the company and guaranteed it $85 billion in loans.
Saturday, September 7, 2019
Implementation of ecotourism principles in Pembrokeshire Coast Research Paper
Implementation of ecotourism principles in Pembrokeshire Coast National Park. Success or failure - Research Paper Example The bar graph above shows the responses to the question: What best describes Ecotourism? A total of 25 visitors indicated that it meant minimal impact on the environment as well as ecological protection and preservation. Seventeen (17) visitors indicated that it meant travelling to natural destinations and appreciating nature, as well as creating environmental awareness for locals and tourists. Twelve (12) visitors indicated that it provides direct financial benefits for conservation and that it involves community participation, while nine (9) visitors indicated that it provides financial benefits for locals. From the diagram above it can be seen that of the five modes of transportation (car, train, bus, bicycle and foot), only two were used by the visitors surveyed. The car was the most popular method of transportation for visitors to the park. Of the 29 visitors surveyed, 22 or 76% used a car while 7 or 24% used the train. The pie chart above shows the types of accommodation that they used. Six types of accommodation were specified in the questionnaire and visitors were asked to state any other option. The chart shows that the guesthouse was the most popular type of accommodation used. Of the 29 visitors surveyed, 12 or 42% used guest houses, 7 or 24% used hotels, 6 or 21% used relatives'/private houses, 3 or 10% used cottages, and 1 or 3% made a day trip. No visitor used a camping site or any other unspecified accommodation. All of the persons surveyed were motivated to visit the park by its natural beauty or scenery.
Fourteen (14) persons were motivated by the good lodging facilities and services, eight (8) because of the hospitality of the people, six (6) because of its convenience and quality, five (5) because of the local food and beverage as well as the educational aspect of the park. The responses by the visitors indicated that on a scale
Friday, September 6, 2019
History of Mexican Revolution Essay Example for Free
History of Mexican Revolution Essay The novel transports readers to a ghost town on the desert plains in Mexico, and there it weaves together tales of passion, loss, and revenge. The village of Comala is populated by the wandering souls of former inhabitants, individuals not yet pure enough to enter heaven. Like the character Juan Preciado, who travels to Comala and suddenly finds himself confused, as readers we are not sure about what we see, hear, or understand. But the novel is enigmatic for other reasons. Since its publication in 1955, the novel has come to define a style of writing in Mexico. Sparse language, echoes of orality, details heavy with meaning, and a fragmentary structure transformed the literary representation of rural life; instead of the social realism that had dominated in earlier decades, Rulfo created a quintessentially Mexican, modernist gothic. The haunting effect of Pedro Paramo derives from the fitful story of Mexican modernity, a story that the novel tells in a way that more objective historical and sociological analyses cannot. As an aesthetic expression characterized by imaginative understanding, the novel explores Mexican social history of the late nineteenth and early twentieth centuries. The decadent remnants of a quasi-feudal social order, violent revolutions, and a dramatic exodus from the countryside to the city all gave rise to ghost towns across Mexico. Pedro Paramo tells the stories of three main characters: Juan Preciado, Pedro Paramo, and Susana San Juan. From the point of view of Juan Preciado, the novel is the story of a son's search for identity and retribution. Juan's mother, Dolores Preciado, was Pedro Paramo's wife. Although he does not bear his father's name, Juan is Pedro's only legitimate son. Juan has returned to Comala to claim "[j]ust what's ours," as he had earlier promised his dying mother.
Juan Preciado guides readers into the ghost story as he encounters the lost souls of Comala, sees apparitions, hears voices, and eventually suspects that he too is dead. We see through Juan's eyes and hear with his ears the voices of those buried in the cemetery, a reading experience that evokes the poetic obituaries of Edgar Lee Masters' Spoon River Anthology (1915). Along with Juan Preciado, readers piece together these fragments of lives to construct an image of Comala and its demise. Interspersed among the fragments recounting Juan's story are flashbacks to the biography of Pedro Paramo. Pedro is the son of landowners who have seen better days. He also loves a young girl, Susana San Juan, with a desire that consumes his life into adulthood. "I came to Comala because I had been told that my father, a man named Pedro Paramo, lived there." (page 3) Although the story line in these biographical fragments follows a generally chronological order, the duration of time is strangely distorted; brief textual passages that may read like conversational exchanges sometimes condense large historical periods. Moreover, the third-person narrative voice oscillates between two discursive registers. On the one hand, poetic passages of interior monologue capture Pedro's love for Susana and his sensuality; on the other, more exterior descriptions and dialogues represent a domineering rancher determined to amass wealth and possessions. Within this alternation between the first- and third-person narrative voices, readers must listen for another voice and reconstruct a third story, that of Susana San Juan. We overhear bits of her tale through the ears of Juan Preciado, listening with him to the complaints that Susana, in her restless death, gives forth in the cemetery of Comala. I was thinking of you, Susana. Of the green hills. Of when we used to fly kites in the windy season.
We could hear the sounds of life from the town below; we were high above on the hill, playing out string to the wind. Help me, Susana. And soft hands would tighten on mine. Let out more string. (page 12) Poetic sections evoke her passion for another man, Florencio, and Pedro never becomes the object of Susana's affection. Juan Preciado, Pedro Paramo, and Susana San Juan are all haunted by ghosts; in turn, they become ghosts who haunt the realities of others. They say that when people from there die and go to hell, they come back for a blanket. (page 6) Although as readers we have the sense of lives once lived by these characters, they emerge for us as phantasms, as partially known presences who are not immediately intelligible and who linger with inexplicable tenacity. Reading Pedro Paramo creates a transformative recognition of Mexico's move toward modernity in the early twentieth century; more than the objective lessons learned from social and cultural history, the novel produces a structure of feeling that immerses readers in the experience of haunting. As ghosts, Pedro, Susana, and Juan point outward to the social context of Mexico in its difficult movement toward modernization, toward social arrangements that never completely die as a newer social order is established. Pedro's accumulation of land as a rancher harks back to the trends of capital accumulation during the benign dictatorship of President Porfirio Diaz (1876-1911). The Porfiriato strove to modernize the nation through the development of infrastructure and investment; it allowed for anomalies such as the creation of the Media Luna ranch and strong local power brokers such as Pedro Paramo, who shared the interests of the elite and helped maintain a thinly veiled feudal social order. Within this context, Susana San Juan and other individuals murmur their complaints in ghostly whispers. Indeed, at one point, Rulfo planned to call the novel Los murmullos, "the murmurs."
Speaking in the streets of Comala, overheard in dreams, and groaning in the cemetery, these spectral murmurs bespeak a reality hidden beneath the facade of Porfirian progress. The Mexican Revolution of 1910-1920 gave expression to repressed peasants, the campesinos of rural Mexico, and put an end to the Porfiriato. Susana San Juan, in turn, reveals the repressed role of women in a patriarchal order. In this world women are chattel, and ranch-owners can forcibly populate the countryside with bastard children by asserting feudal rights to the bodies of peasant women living on their lands. Peasant revolutionaries, and Susana San Juan as well, are all manipulated by Pedro Paramo. He can force events to keep them all in the places where he would have them, but he cannot control their desires and their pleasures. The peasants celebrate festivals, and after the revolution they eventually rebel again by participating in the Cristero Revolt of 1926-1929. Susana suffers guilt and remembers pleasure in evocative passages that underscore her erotic ties to Florencio, a man unknown to others in the novel, perhaps a dead soldier from the revolution, the man Pedro would have had to be in order to have Susana's love. The sky was crowded with fat, swollen stars. The moon had come out for a little while and then vanished. It was one of those sad moons that nobody looks at or even notices. It hung there for a little while, pale and disfigured, and then hid itself behind the mountains. -Juan Rulfo References Carol Clark D'Lugo, The Fragmented Novel in Mexico: The Politics of Form (Austin: University of Texas Press, 1997), 70-81. Patrick Dove, "Exigele lo nuestro: Deconstruction, Restitution and the Demand of Speech in Pedro Paramo," Journal of Latin American Cultural Studies 10.1 (2001): 25-44.
Ethical Integrity Essay Example for Free
Ethical Integrity Essay This paper will deal with the concept of ethical integrity relative to the economic crisis of 2009. In order for this concept to make any sense, it must be a social ethic, a guide to life and behavior for living in society. But the current state of western economics has made it clear that revolutionary ideas need to be introduced into our conceptions of ethics, which are largely utilitarian and relativist. In this paper, the damage done to western economics, and to the public perception of economics, will be seen through the eyes of four very different but complementary authors: John Locke, Pierre Proudhon, Murray Bookchin and GWF Hegel. All four will be used to deal with the elements of ethical integrity in a time of radical dissatisfaction with the status quo: a status quo in which the state and the corporate governance of the western world are coming into question like never before. Proudhon was a revolutionary who functioned in the tradition of Locke. He takes the contract of free peoples that was so dear to Locke in forming the state and takes it one step further: the state, as outlined by Locke, is not necessary at all if the main basis of it is the contract in defense of natural rights. The state, in this view, seems to be an unnecessary middleman that always grows far beyond the bounds within which libertarians like Locke seek to imprison it (George, 1922, 534). For Proudhon, then, all politics is coercive and power hungry, and hence Locke's libertarian theory just provides the groundwork for later tyranny and statism. Proudhon is the creator of a system of exchange called, for lack of a better phrase, "mutualist anarchism." What Proudhon saw in his day (the mid-19th century) was the wild industrialization of life, the making of quick fortunes and the basic instability of life that was the lot of the average worker and small businessman. Such a view would fit our own day as well.
But what Proudhon envisaged is the dismantling of the central state and the large corporate behemoth into a mutualist federation of communities (George, 1922, 535). For him, man was not a citizen, for that was a mystification with no meaning. He was primarily a producer: an industrial worker, farmer, fisherman or banker. It was here that his economic worth was found. All others, the state and the corporate boss, were mere parasites that produced nothing. But if the ethical option of revolution is a proper one, then what would replace the huge modern state? This is the essence of mutualism: the morally integral person manifests his integrity by making and keeping contracts with other people and communities (George, 1922, 538). Anarchy for Proudhon is the moral force that binds individuals and communities to contracts, contracts which represent mutual agreement. If this is the case, then the state makes little sense: the force that binds is the community, whose moral force, as well as one's reputation, serves to cement ties from one person (producer) to another. In other words, each community of producers, functioning in the larger community of diverse members, has its worth in its skills in a trade or producer's association. The function of this skill in society requires a moral approach to contracts: refusing to hold up one's side of the bargain will expose the person in question as morally fraudulent and hence outside the system of mutual exchange, and hence, needless to say, broke. Mutualism means moral integrity because one's ability to exchange goods and services by way of contract is the basis of an orderly society, not the direction of the state or the creation of needs by corporate bosses. The nature of revolution, then, is the gradual taking of political power away from the state and the corporate boards by these societies of mutual aid: producers' organizations of farmers, mechanics, etc.
Hence, what Locke began as the contract among free property holders to create a state is taken to its next level: workers and producers protecting their autonomy by joining in associations to function on the basis of mutual aid, guaranteed by contract and personal reputation. In other words, Proudhon takes Locke to the next level: from the mutual aid of property holders to the mutual aid of all producers (Proudhon, 1977, 12ff). In both cases, the idea of contract and mutual aid is central, but, since Proudhon is writing in an already industrialized time (Locke, right at the beginning), much has changed since Locke wrote, and the world of industry and finance has destroyed individual autonomy, not enhanced it. As in our own times, both the state and the corporate actors have grown into a symbiotic monster that sucks the average worker dry in taxes and debt. The reality is that no rational person can look at the economic system in the western world in 2009 and claim that it has protected autonomy, community and property: it has done exactly the opposite. Hence this paper's focus: the creation, basis and reaction of the morally integral person to this crisis.
Thursday, September 5, 2019
Monitoring Therapeutic Drugs: Strategies
Monitoring Therapeutic Drugs: Strategies This article provides an introduction to some of the current techniques and assays utilised in Therapeutic Drug Monitoring (TDM). TDM is a multidisciplinary function that measures specific drugs at intervals to ensure a constant therapeutic concentration in a patient's bloodstream. The selection of an analytical technique for TDM involves a choice between immunoassay and chromatographic techniques. Once the methodology has been chosen, there are also numerous options available within these categories, including FPIA, EMIT, KIMS, HPLC and nephelometric immunoassay. An overview of each method and its processing of drugs is given. The future outlook for the methodology involved in TDM is also explored and discussed.

INTRODUCTION

Therapeutic drug monitoring (TDM) is a multidisciplinary function that measures specific drugs at selected intervals to ensure a constant therapeutic concentration in a patient's bloodstream (Ju-Seop Kang and Min Hoe Lee). The response at a given drug concentration is therapeutic, sub-therapeutic or toxic, and the main objective of TDM is to optimize the response so that the serum drug concentration is retained within the therapeutic range. When the clinical effect can be easily measured, such as heart rate or blood pressure, adjusting the dose according to the response is adequate (D.J. Birkett et al). The practice of TDM is required if the drug meets the following criteria:
- It has a narrow therapeutic range
- The level of drug in the plasma is directly related to its therapeutic or adverse (toxic) effect
- Appropriate applications and systems are available for the management of therapeutic drugs
- The drug effect cannot be assessed by clinically observing the patient (Suthakaran and C. Adithan)
A list of commonly monitored drugs is given in Table 1. Advances in TDM have been assisted by the availability of immunoassay and chromatographic methods linked to detection systems.
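As a toy illustration of the dosing logic above, the sketch below classifies a measured serum level against a drug's therapeutic range. The drug names and ranges in the table are illustrative placeholders, not clinical reference values.

```python
# Toy sketch of the core TDM decision: classify a measured serum
# concentration against a drug's therapeutic range. The ranges below are
# illustrative examples only, not clinical reference values.
THERAPEUTIC_RANGES_MG_L = {
    "theophylline": (10.0, 20.0),   # example range, mg/L
    "digoxin": (0.0008, 0.002),     # example range, mg/L
}

def classify_level(drug: str, serum_mg_l: float) -> str:
    """Return 'sub-therapeutic', 'therapeutic' or 'toxic' for a measurement."""
    low, high = THERAPEUTIC_RANGES_MG_L[drug]
    if serum_mg_l < low:
        return "sub-therapeutic"
    if serum_mg_l > high:
        return "toxic"
    return "therapeutic"

print(classify_level("theophylline", 14.2))  # prints: therapeutic
```

In a real laboratory system the ranges would come from validated reference data and the decision would also consider sampling time relative to the last dose.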
Both techniques meet the essential requirements of sensitivity, precision and accuracy. Numerous options exist within both methods and will be explored further in this article. Ideally, the analytical method chosen should distinguish between drug molecules and substances of similar composition, detect minute quantities, be easy to adapt within the laboratory and be unaffected by other drugs administered. An overview of the current analytical techniques and future trends in TDM, and their role in laboratory medicine, is given in this article.

NEPHELOMETRIC IMMUNOASSAY AND ITS USE IN TDM

Immunoassays play a critical role in the monitoring of therapeutic drugs, and a range of immunoassay formats exists. Nephelometric immunoassays are widely used for TDM and are based on the principle of hapten (drug) inhibition of immunoprecipitation. The precipitation is measured using nephelometric principles, i.e. by the degree of light scattering produced. In some cases turbidimetric principles can be applied instead, measuring precipitation via the reduction in transmitted light. In nephelometric immunoassays, if the drug molecule is a monovalent antigenic substance, a soluble immunocomplex is formed. However, if the drug molecule is made a multivalent antigenic substance, whereby two drug moieties are conjugated to a carrier protein, the conjugate reacts with the antibody to form an insoluble complex. The insoluble complex may be composed of numerous antigens and antibodies, thus scattering light, so nephelometric or turbidimetric techniques are required to measure the reaction. On this principle, precipitation inhibition by a drug can be measured. The test sample (serum) is introduced to a fixed quantity of polyhaptenic antigen and anti-drug antibody. The serum drug antigen competes with the polyhaptenic antigen for binding to the anti-drug antibody. Any free drug present in the sample inhibits the precipitation between the antibody and polyhaptenic antigen.
Therefore the drug concentration is inversely proportional to the formation of precipitate, which is quantified by a nephelometer. The more polyhaptenic antigen present, the more precipitate is formed until a maximum is encountered; further addition of antigen causes a reduction in the amount of precipitate formed due to antigen excess. The use of nephelometric immunoassay for TDM is termed competitive due to the competitive binding of the antigens for the sites on the antibody. This also distinguishes the drug assay system from the conventional nephelometric immunoassay for proteins. Variations of this assay exist, including:
- Saliva or CSF may be used as an alternative to serum. Both alternative matrices contain fewer light-scattering molecules, so a larger volume of sample is used in order to compensate.
- Turbidimetric methods may also be applied to quantitative immunoprecipitation. Turbidimetric analysis is performed at a lower wavelength and detects immunoprecipitation in a similar way to nephelometric techniques.
- End-point analysis of immunoprecipitation is commonly employed, although rate analysis is also applicable. Addition of formaldehyde blocks further precipitation and is utilised in end-point analysis.
- Agglutination inhibition immunoassay can also be read on nephelometric systems, in which the drug or hapten is directly linked onto the surface of a particle; it is generally suitable for low serum drug concentrations, while precipitation inhibition detects concentrations above 1 ug/ml.
- If homologous and heterologous drug conjugates are utilized for the antibody and polyhaptenic antigen preparations, sensitivity and specificity may be increased.
- Polyclonal and monoclonal antibodies may be employed in this assay. The use of monoclonal antibodies removes any interference caused by antibody cross-reactivity.
Choosing a hybrid cell with the most desirable antibody is difficult, however, so the monoclonal approach is likely to be less sensitive than the use of polyclonal antibodies. Overall, the nephelometric immunoassay is an excellent assay system for TDM. Advantages over other assay systems include its simplicity, speed and low cost. It is a homogeneous method that requires no separation steps or isotopes. Only two reagents are required, in limited amounts: if the antibody-to-antigen ratio is not optimal the sensitivity is decreased, because less precipitate forms in the absence of drug and inhibition is less efficient in its presence. The sensitivity of the assay depends on antibody-hapten binding, yet it yields high specificity. Nephelometric precipitation inhibition immunoassays are therefore a novel technique in the clinical practice of TDM. (Takaski Nishikawa, Vol 1, 1984)

FLUORESCENCE POLARIZATION IMMUNOASSAY AND ITS USE IN TDM

Fluorescence polarization immunoassay (FPIA) is a widely used two-step homogeneous assay conducted in the solution phase; it is based on the rise in fluorescence polarization that occurs when a fluorescently labelled antigen binds to antibody. The first step of the immunoassay involves the incubation of the serum sample with unlabelled anti-drug antibody. If the patient sample contains drug molecules, immune complexes will form between antibody and antigen. The second stage of the assay involves the addition of a fluorescein-labelled antigen (tracer) into the mixture (Jacqueline Stanley 2002). The purpose of the fluorescein tracer is to bind any available sites on the drug-specific antibody for detection purposes. If the first stage occurred, in which the anti-drug antibody formed a complex with drug from the sample, fewer or no antigen-binding sites will be available for the tracer to bind to. Consequently a higher proportion of the fluorescein tracer remains unbound in the solution.
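The competitive relationship at the heart of FPIA, where the signal falls as drug concentration rises, is usually quantified against a calibrator-derived standard curve. The sketch below assumes a four-parameter logistic (4PL) curve shape; every parameter value (top, bottom, ic50, slope) is invented for illustration and is not taken from any real assay.

```python
# Hedged sketch of an FPIA standard curve: polarization (in mP) FALLS as
# drug concentration rises, because free drug displaces the fluorescein
# tracer from the antibody. All parameter values are illustrative only.
def polarization_4pl(conc, top=250.0, bottom=40.0, ic50=5.0, slope=1.2):
    """Predicted polarization (mP) for a given drug concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

def conc_from_polarization(mp, top=250.0, bottom=40.0, ic50=5.0, slope=1.2):
    """Invert the 4PL curve: recover a concentration from a measured signal."""
    return ic50 * ((top - bottom) / (mp - bottom) - 1.0) ** (1.0 / slope)

# Higher concentration gives a lower polarization reading.
assert polarization_4pl(1.0) > polarization_4pl(10.0)
print(conc_from_polarization(polarization_4pl(5.0)))  # round-trips to 5.0
```

In practice the four parameters would be fitted to measured calibrator readings rather than fixed by hand.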
If the sample contains no drug antigen, step 1 does not occur and the anti-drug antibodies will bind the fluorescein antigen tracer. In this assay the degree of polarization is inversely proportional to the concentration of drug present (Chris Maragos 2009). Fluorescence polarization is measured to determine the concentration of drug present. Fluorescein-labelled molecules rotate when they interact with polarised light; larger complexes rotate less than smaller complexes and therefore remain in the light path. When the large immune complex is labelled with a fluorescent tracer, it is easily detected once present in the light path. If no drug was present in the sample, the availability of binding sites on the antibody allows the fluorescein tracer to bind, restricting its motion and resulting in a higher degree of polarisation. Thus polarization is inversely proportional to the concentration of drug present. The benefits of utilising FPIA in TDM include the elimination of steps to separate bound and free labels, which makes this assay time efficient. A unique feature of this assay is that the label used is a fluorophore and the analytical signal involves the measurement of fluorescence polarization (Jacqueline Stanley 2002). A standard curve is constructed to determine the concentration of drug present and is easily reproducible due to the stability of the reagents utilized and the simplicity of the method. However, FPIA has some limitations and is prone to interference from light scattering and endogenous fluorescent compounds in the samples. To help overcome these limitations, variations on the technique are employed, including:
- Use of a long-wavelength label. The fluorescein tracers utilized produce adequate signals; however, light-scattering events can interfere with these signals.
The use of a long-wavelength label permits extended fluorescence relaxation times, which may be more sensitive for the detection of high-molecular-weight antigens or drugs.
- Use of CE-LIF. The use of capillary electrophoresis with laser-induced fluorescence detection enhances the sensitivity of this method. This competitive FPIA separates free and antibody-bound tracers and utilizes LIFP as a detection system (David S. Smith, Sergei A., 2008).
Overall, FPIA has proven to be a time- and cost-effective, accurate and sensitive technique in TDM and remains one of the most promising methods in this clinical field.

ENZYME MULTIPLIED IMMUNOASSAY TECHNIQUE AND ITS USE IN TDM

Enzyme Multiplied Immunoassay Technique (EMIT) is an advanced version of the general immunoassay technique utilising an enzyme as a marker. EMIT is a two-stage assay that qualitatively detects the presence of drugs in urine and quantitatively detects the presence of drugs in serum (David S. Smith, Sergei A.). Both the competitive and non-competitive forms of this assay are homogeneous binding assays that rapidly analyze microgram quantities of drug in a sample. In the competitive assay, the patient sample is incubated with anti-drug antibodies. Antibody-antigen reactions occur if there is any drug present in the sample; the number of unbound antibody sites inversely correlates with the drug concentration present. The second step involves the addition of an enzyme-labelled drug, which binds to available binding sites on the antibody, inactivating the enzyme. An enzyme widely used in EMIT assays is glucose-6-phosphate dehydrogenase, which oxidises the substrate added (glucose-6-phosphate). The co-factor NAD+ is reduced to NADH by the active enzyme. Any enzyme-drug conjugate that is unbound remains active; therefore only in this case can the reduction of NAD+ to NADH occur. An increase in absorbance measured photometrically at 340 nm correlates with the amount of NADH produced.
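The photometric readout just described can be expressed with the Beer-Lambert law, A = epsilon * c * l. The NADH molar absorptivity at 340 nm (about 6220 L mol^-1 cm^-1) is a commonly cited value; the absorbance readings below are invented example data, not from a real run.

```python
# Sketch of the EMIT readout step: NADH produced by the unbound enzyme-drug
# conjugate absorbs at 340 nm, so the rate of absorbance increase tracks
# NADH formation. Beer-Lambert: A = epsilon * c * path.
EPSILON_NADH_340 = 6220.0  # L mol^-1 cm^-1, commonly cited for NADH at 340 nm
PATH_CM = 1.0              # standard 1 cm cuvette assumed

def nadh_rate_mol_per_l_min(absorbances, interval_min=1.0):
    """Convert a series of A340 readings into an NADH formation rate."""
    delta_a_per_min = (absorbances[-1] - absorbances[0]) / (
        (len(absorbances) - 1) * interval_min
    )
    return delta_a_per_min / (EPSILON_NADH_340 * PATH_CM)

readings = [0.100, 0.131, 0.162, 0.193]  # invented A340 values, one per minute
print(f"{nadh_rate_mol_per_l_min(readings):.2e} mol/L/min")  # 4.98e-06
```

The rate, rather than a single end-point absorbance, is what a calibrated EMIT analyzer maps onto drug concentration via a standard curve.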
(Jacqueline Stanley 2002) A non-competitive format of this assay also exists, whereby drug-specific antibodies are added in excess to the sample, resulting in antigen-antibody interactions if the drug is present. A fixed amount of enzyme-drug conjugate is then added, which occupies any unbound sites present on the antibody. The active enzyme that remains unbound reduces NAD+ to NADH, indicating the presence of free enzyme conjugate and hence of drug molecules in the sample. (chemistry.hull.ac.uk/) EMIT technology is becoming increasingly popular as a method to monitor therapeutic drug levels. Drugs monitored using this technique include anti-asthmatic drugs, anti-epileptic drugs and cardioactive drugs. Radioimmunoassays work on the same principle as competitive EMIT with the exception of the use of a radioisotope as a marker. Gamma radiation is emitted from the marker, leading to a high level of sensitivity and specificity. However, because it uses radioisotopes, radioimmunoassay is not the most cost-effective option in today's environment.

MICROPARTICLE IMMUNOASSAY AND ITS USE IN TDM

Microparticle agglutination technology uses latex microparticles and plays a leading role in TDM in the quantitative measurement of carbamazepine, phenytoin, theophylline and phenobarbital. Kinetic interaction of microparticles in solution (KIMS) is a homogeneous assay based on competitive binding between free drug and drug covalently coupled to microparticles. When free drug exists in the patient sample, it binds to the antibody present. As a result, the microparticle-antigen complex fails to bind with the antibody and a particle aggregate does not form. Unaggregated microparticles in solution scatter little light, causing a low absorbance reading. If the patient sample is negative for the drug, the microparticle-drug complex binds to the antibodies. The complex that is formed upon binding blocks the transmitted light and causes light scattering, resulting in increased absorbance readings.
Hence the degree of light scattering is inversely related to the concentration of drug present. Light-scattering spectroscopy improves the sensitivity and quantitation of particle-based immunoassays, making KIMS a highly sensitive and accurate technique in TDM. Its popularity has developed over the years for many reasons. The reagents required for this assay are inexpensive and have high stability. KIMS is a universal assay and can be performed on a variety of analyzers. The assay suffers minimal interference because the change in absorbance is measured as a function of time, while absorbance readings from interfering substances do not alter with time. (Frederick P. Smith, Sotiris A. Athanaselis)

CHROMATOGRAPHY AND ITS USE IN TDM

For many years liquid chromatography has been linked to detection systems, and its application in TDM is becoming increasingly popular. Liquid chromatography was initially employed in response to difficulties arising in gas chromatography (GC) due to heat instability and non-specific adsorption on surfaces. High-performance liquid chromatography (HPLC) is the main chromatographic technique utilized for TDM. Thin-layer chromatography (TLC) and GC are other alternatives but have limitations that suppress their use in TDM: a derivatization step must be performed on highly polar and thermolabile drugs for GC to be successful, while TLC has a poor detection limit and is unable to detect low concentrations of drug. HPLC has revolutionized TDM with its rapid speed and sensitivity of analysis, and it can separate a wider variety of drugs than GC and TLC. For this reason, HPLC is considered the most widely adaptable chromatographic technique when coupled with UV detection or mass spectrometry for TDM. (Phyllis R. Brown, Eli Grushka)

BASIC PRINCIPLES IN HPLC

HPLC is a separation technique performed in the mobile phase in which a sample is separated into its individual components.
HPLC is a separation technique that exploits distribution differences of a compound between a stationary and a mobile phase. The stationary phase is a thin layer created on the surface of fine particles, and the mobile phase flows over the fine particles while carrying the sample. Each component in the analyte moves through the column at a different speed depending on its solubility in the two phases and on molecular size. As a result the sample components move at different paces over the stationary phase and become separated from one another. Drugs that partition into the mobile phase migrate faster than those that are retained in the stationary phase. The drug molecules are eluted off the column by gradient elution, which refers to the steady change of the eluent composition and strength over the run of the column. As the drug molecules elute off the column, the HPLC system is linked to a detection system to quantify the drug present in the sample. Detection systems include mass spectrometry and UV detection. (Mahmoud A. Alabdalla, Journal of Clinical Forensic Medicine) DETECTION SYSTEMS USED IN HPLC FOR TDM HPLC coupled with a diode-array ultraviolet detector has proved to be a reliable system for identifying compounds after HPLC separation. The use of UV detection allows online acquisition of a compound's UV spectrum. These detectors measure light absorbance in the range 180-350 nm. Transmitted UV light passes through a sensor to a photoelectric cell, and the output is processed for display on a potentiometric recorder. By placing a monochromator between the light source and the cell, a specific wavelength is selected for detection, improving the detector's specificity. A wide-band light source can also be used as an alternative; in this case the light from the cell is optically dispersed and allowed to fall on the diode array. (Mahmoud A.
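UV absorbance detection of this kind rests on the Beer-Lambert law, A = εlc: absorbance is proportional to molar absorptivity, flow-cell path length and concentration. A minimal sketch of recovering concentration from a measured absorbance (the absorptivity and path-length values are illustrative assumptions, not figures from this text):

```python
# Beer-Lambert law as used in UV absorbance detection: A = epsilon * l * c,
# so c = A / (epsilon * l). Values below are illustrative only.

def concentration_from_absorbance(absorbance, epsilon, path_cm):
    """Return molar concentration (mol/L) given absorbance (dimensionless),
    molar absorptivity epsilon (L mol^-1 cm^-1) and flow-cell path (cm)."""
    return absorbance / (epsilon * path_cm)

epsilon = 9800.0   # L mol^-1 cm^-1, hypothetical drug chromophore
path = 1.0         # cm, typical flow-cell path length
a = 0.49           # measured absorbance at the selected wavelength
print(concentration_from_absorbance(a, epsilon, path))  # 5e-05 mol/L
```

In practice the detector reports absorbance versus time; peak areas are then converted to concentration against calibration standards rather than from ε directly, but the underlying relation is the same.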
Alabdalla, Journal of Clinical Forensic Medicine) HPLC can also be coupled to a mass spectrometer as a detection method. Mass spectrometry (MS) elucidates the chemical structure of a drug. The technique is sensitive, detecting low drug concentrations in a sample. Its specificity can be further enhanced by tandem mass spectrometric analysis, which involves multiple stages of mass spectrometry, accomplished by separating individual mass spectrometer elements in space or by separating MS phases in time. (Franck Saint-Marcoux et al.) FUTURE TRENDS IN TDM METHODOLOGY AGILENT'S 1200 HPLC MICRO CHIP Agilent's 1200 HPLC microchip technology combines microfluidics with an easy-to-use interface that confines the HPLC procedure to this dynamic chip. The microchip integrates analytical columns, micro-cuvette connections and a metal-coated electrospray tip into the chip to function as a regular HPLC analyzer. The compact chip reduces peak dispersion for a sensitive and precise technique. The microchip comes complete with an LC system integrating a sample-enrichment and a separation column. The operation of the chip is well defined and manageable once it is inserted into the Agilent interface, which mounts onto the mass spectrometer. The built-in autosampler loads the samples, and the sample moves into the trapping column with the mobile phase. Gradient flow from the pump moves the sample from the trapping column to the separation column. The drug is separated as in conventional methods; however, the reduced peak dispersion gives better separation efficiency than conventional systems. This form of technology is currently in use in the United States but has not spread outside the U.S. (http://www.agilent.com) PHYZIOTYPE SYSTEM This is the latest application on the market for the treatment and monitoring of drugs associated with metabolic disorders.
The PhyzioType system utilizes DNA markers from several genes, coupled with biostatistical analysis, to predict a patient's risk of developing adverse drug reactions. (Kristen K. Reynolds, Roland Valdes) AMPLICHIP CYP450 TEST The AmpliChip CYP450 Test is a new technology that has revolutionised the TDM of antipsychotic drugs. The test was approved by the FDA in 2006 but is not currently in use in laboratories in Ireland. It analyses the CYP2D6 and CYP2C19 genes, both of which influence drug metabolism. The function of the test is to identify a patient's genotype so that their phenotype can be inferred; based on the phenotype, a clinician determines the therapeutic strategy to commence. (Kristen K. Reynolds, Roland Valdes) DISCUSSION This paper illustrates the increasing role of immunoassay and chromatography techniques in routine clinical laboratory monitoring of therapeutic drugs. Before an analytical technique is introduced into TDM it must meet the sensitivity, accuracy and specificity requirements of most TDM applications. The methodology of TDM in today's clinical setting revolves around immunoassays and chromatographic techniques. A range of immunoassays was discussed in terms of their principles, advantages and limitations. The majority of immunoassays utilised in TDM are homogeneous, allowing rapid analysis and efficient turnaround time for drug monitoring. Most immunoassays involved in TDM are based on the same principle of competitive binding to antibody; the factor that distinguishes each immunoassay is the detection method used. Detection methods discussed in this review include nephelometric techniques, fluorescein labels, enzyme labels and the use of microparticles. Each method relies on different detection principles, but characteristics common to all include accuracy, sensitivity and specificity.
The methodologies discussed are also time- and cost-efficient, both essential qualities in laboratory assays. Chromatographic techniques were also discussed, with HPLC having the greatest impact on TDM. Gas and thin-layer chromatography are other chromatographic techniques, but neither is widely utilised in TDM owing to the limitations described above. HPLC is a rapid, sensitive method for the quantitation of drugs in a sample and for this reason is the most widely adaptable chromatographic technique applied in TDM. As in all chromatographic techniques, drugs are separated based on the interaction of the drug with the stationary phase, which determines the elution time. The detection methods primarily used are UV detection and mass spectrometry. The final part of this overview of TDM was an insight into the future of its methodology and applications; emerging and approved methods were discussed, with a brief outline of each. The constant development of methodologies and techniques keeps TDM one of the fastest-moving and most interesting areas of clinical medicine. Literature Review: The Impact Of Legalized Abortion Literature Review: The Impact Of Legalized Abortion The publication of the controversial paper on legalised abortion and its effect on the rate of crime by Levitt and Donohue (2001) resulted in widespread condemnation from a variety of sources; for example, Joseph Scheidler, executive director of the Pro-Life Action League, described the paper as "so fraught with stupidity that I hardly know where to start refuting it". Crime fell sharply in the United States in the 1990s, in all categories of crime and all parts of the nation. Homicide rates plunged 43 percent from the peak in 1991 to 2001, reaching the lowest levels in 35 years. The Federal Bureau of Investigation's (FBI) violent and property crime indexes fell 34 and 29 percent, respectively, over that same period.
(Levitt, 2004) In his paper "The Impact of Legalized Abortion on Crime", Levitt attempts to offer evidence that the legalization of abortion in 1973 was the chief contributor to the crime reductions of the 1990s. Levitt's hypothesis is that legalized abortion may lead to reduced crime either through reductions in cohort sizes or through lower per capita offending rates for affected cohorts. The smaller cohort that results from abortion legalization means that when that cohort reaches the late teens and twenties, there will be fewer young males in their peak crime years, and thus less crime. He argues that the decision in Roe v. Wade constitutes an abrupt legal development that could have an abrupt influence 15-20 years later, when the cohorts born in the wake of liberalized abortion would start reaching their peak crime years. In essence, Levitt puts forward the theory that unwanted children are more likely than wanted children to become troubled adolescents, prone to crime and drug use. Once abortion is legalized, a whole generation of unwanted births is averted, leading to a drop in crime two decades later, when this phantom generation would have grown up. To back up this point, Levitt builds on previous work such as Levine et al. (1996) and Comanor and Phillips (1999), who suggest that women who have abortions are those most likely to give birth to children who would engage in criminal activity. He also draws on earlier work by Loeber and Stouthamer-Loeber (1986), who conclude that an adverse family environment is strongly linked to future criminality.
Although keen not to encroach on the moral and ethical implications of abortion, Levitt, mainly through empirical evidence, is able to back up his hypothesis by concluding that a negative relationship between abortion and crime does in fact exist, showing that an increase of 100 abortions per 1000 live births reduces a cohort's crime by roughly 10 percent, and stating in his conclusion that legalized abortion is a primary explanation for the large drops in crime seen in the 1990s. One criticism that can be levied against this study is its failure to take into consideration the effect other factors may have had on crime rates during the 1980s and 1990s, such as the crack wave; accounting for this factor, the abortion effect may be mitigated somewhat. Also, Levitt's empirical work failed to take into account the greater number of abortions among African Americans, whom he identifies as the group committing the most violent crime, and his evidence does not establish whether the drop in crime was due to a relative drop in the number of African Americans. The list of possible explanations for the sudden and sharp decrease in crime during the 1990s does not stop at Levitt's abortion/crime theory, and Levitt himself in his 2004 paper identifies three other factors that played a critical role. The first is the rising prison population seen over the same period, which Kuziemko and Levitt (2003) attribute to a sharp rise in incarceration for drug-related offences, increased parole revocation and longer sentences for those convicted of crimes, although there is the possibility of a substitution effect: where punishment increases for one crime, potential criminals may choose to commit alternative crimes instead. There are two ways that increasing the number of persons incarcerated could influence crime rates.
Physically removing offenders from the community avoids any future crime they might plausibly commit during the time they are in prison, known as the incapacitation effect. There is also the deterrence effect: by raising the expected punishment cost, potential criminals become less inclined to commit a crime. As criminals face bounded rationality, the expected utility gained from crime will affect the amount of time devoted to crime (Becker, 1968). A study conducted by Spelman (2000) examined the effect the incarceration rate would have on the rate of crime and found the relationship to have an elasticity of -0.4, meaning that a one percent increase in the level of incarceration leads to a drop in crime of 0.4 percent. In economic models of crime such as Becker (1968), improvements in the legitimate labor market make crime less attractive as the return earned from legitimate work increases. Using this model, the sustained economic growth seen in the 1990s (real GDP per capita grew by almost 30% between 1991 and 2001, and unemployment over the same period fell from 6.8 to 4.8 percent) could be seen as a contributing factor to the drop in crime, and many scholars (such as) have come to that conclusion. However, the improved macroeconomic performance of the 90s is more likely to be relevant to crimes with financial gains, such as burglary and auto theft, and does not explain the sharp decrease in homicide rates. Moreover, the large increase in crime seen in the 1960s coincided with a decade of strong economic growth, further corroborating the weak link between macroeconomics and crime (Levitt, 2004). One other explanation for the drop in crime, and the most commonly cited reason, is the growing use of police innovation and the adoption of community policing.
The idea stemmed from the broken-window theory, which argues that minor nuisances, if left unchecked, turn into major nuisances (Freakonomics). The main problem with the policing explanation is that innovative police practices were implemented after the crime rate had already begun declining, and, perhaps more importantly, the rate of crime dropped in cities that had not experienced any major changes in policing (Ouimet, 2004).
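Spelman's elasticity estimate cited above can be applied mechanically: percent change in crime is approximately the elasticity times the percent change in incarceration. A minimal sketch (the incarceration changes plugged in are illustrative, not figures from the paper):

```python
# Applying Spelman's (2000) estimated crime-incarceration elasticity of -0.4:
# pct change in crime ~= elasticity * pct change in incarceration.
# This linear approximation only holds for small changes.

ELASTICITY = -0.4  # Spelman (2000)

def crime_change_pct(incarceration_change_pct, elasticity=ELASTICITY):
    """Approximate percent change in crime for a given percent change
    in the incarceration level."""
    return elasticity * incarceration_change_pct

print(crime_change_pct(1.0))   # -0.4: 1% more incarceration -> 0.4% less crime
print(crime_change_pct(10.0))  # -4.0 (illustrative extrapolation)
```

As an elasticity, the estimate is strictly local; extrapolating it over the very large incarceration growth of the 1980s-90s overstates what the point estimate licenses.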
Wednesday, September 4, 2019
Distillation Essay -- essays papers
Distillation Abstract This report outlines the steps taken to separate a 50:50 by volume ethanol and isopropanol side stream. The resulting separation must contain no more than 3% alcohol impurity in each product. A laboratory column, run at total reflux, was utilized to scale up to a forty-foot-high by one-foot-diameter column. The laboratory column allowed the team to determine vapor velocities and HETP values for the 0.24 inch Pro-Pak packing. HETP is defined as the height of packing divided by the number of theoretical column stages. The column consisted of four main sections: packing, controls, a reboiler, and a condenser. To complete the vapor velocity vs. HETP relationship, the vapor velocity must be found; it was determined from a system energy balance. The design vapor velocity was determined to be 4.85 ft/hr. However, this vapor velocity did not result in the column flooding; therefore the scaled-up column is not designed to its full potential. Ideally, distillation columns should be designed at 70-80% of the flooding velocity. The column HETP was found using the Fenske equation and averaged 4.55 inches. As a result of the design parameters from the experimental column, the following design is proposed: the column will run at a vapor velocity of 4.85 ft/hr and will have a HETP of 4.30 inches. This results in a packing height of 38.7 feet. The reboiler will have a heat-exchange area of 113.52 ft2 and the condenser an area of 45.54 ft2. Introduction A chemical plant spends approximately 50 to 90% of capital investment on separation equipment (1,1). Therefore, the ability to utilize a small laboratory column and to scale up a column is an important skill for a chemical engineer. This report will outline the steps taken to design a packed distillation column.
The column needs to separate a 50:50 mixture of ethanol and isopropanol into a distillate stream containing no more than 3 wt% isopropanol and a bottoms stream containing no more than 3 wt% ethanol. The design of the full-scale column was based on a laboratory simulation column. This column allowed the team to determine vapor velocities and HETP values for the 0.24 inch Pro-Pak packing. Once the simulation vapor velocities are determined, they can ...

.../hr) * (1/0.0154 kmol/L) * (1/(π(0.25)² ft²)) * (0.0159 kmol/L) * (π(1)² ft²) = 6.857567 kmol/hr
MW_AVG,D = 46.493 kg/kmol
V_D = (6.857567 kmol/hr) * (46.493 kg/kmol) = 318.82886 kg/hr

*Equation of Top Operating Line
y = (L/V)x + (1 - (L/V))x_D = (R_ACT/(R_ACT + 1))x + (1/(R_ACT + 1))(0.97) = 0.912779x + 0.084605

*Distillate Rate
R = (V - D)/D = 10.4651
318.82886 (kg/hr) - D = 10.4651 D, so D = 27.808642 (kg/hr)
R = L/D, so L = 10.4651 * 27.808642 (kg/hr) = 291.02022 (kg/hr)

*Bottoms Flow Rate
L/V = [R(z - x_B) + q(x_D - x_B)] / [R(z - x_B) + q(x_D - x_B) - (x_D - z)]
where z = feed mole fraction of ethanol, q = 1 (feed assumed to be liquid)
L/V = [10.4651(0.567 - 0.03) + 1(0.97 - 0.03)] / [10.4651(0.567 - 0.03) + 1(0.97 - 0.03) - (0.97 - 0.567)]
L/V = 1.05
L/V = (V_B + 1)/V_B = 1.05, so V_B = 20
B = V/V_B = (318.82886 kg/hr)/20 = 15.941443 (kg/hr)

*Feed Flow Rate
F = D + B = 15.941443 (kg/hr) + 318.82886 (kg/hr) = 334.7703 kg/hr

*Bottom Operating Line
y = (L/V)x - ((L/V) - 1)x_B = 1.05x - 0.0015

*Condenser Heat Duty
Q_COND = V * ΔH_VAP
ΔH_VAP = x_ETOH * ΔH_VAP,ETOH + x_ISOP * ΔH_VAP,ISOP
Q_COND =
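The Fenske and operating-line relationships used in the report can be sketched in code. The reflux ratio, product purities and packing height below are taken from the report; the relative volatility value is an assumed illustrative number for a close-boiling pair, not a figure from the report, so the stage count printed should not be read as reproducing the report's HETP result:

```python
import math

# Fenske equation at total reflux: minimum number of theoretical stages
# for given distillate/bottoms purities and a constant relative volatility.
def fenske_n_min(x_dist, x_bot, alpha):
    return math.log((x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)) / math.log(alpha)

# HETP definition used in the report: packing height / theoretical stages.
def hetp_inches(packing_height_ft, n_stages):
    return packing_height_ft * 12.0 / n_stages

# Top operating line from the actual reflux ratio R:
# y = (R/(R+1)) x + x_D/(R+1)
def top_operating_line(reflux_ratio, x_dist):
    slope = reflux_ratio / (reflux_ratio + 1.0)
    intercept = x_dist / (reflux_ratio + 1.0)
    return slope, intercept

# Report values: 97% pure products, R = 10.4651.
slope, intercept = top_operating_line(10.4651, 0.97)
print(round(slope, 6), round(intercept, 6))  # 0.912779 0.084605, as in the report

# alpha = 1.17 is an ASSUMED ethanol/isopropanol relative volatility.
n_min = fenske_n_min(0.97, 0.03, 1.17)
print(round(n_min, 1))  # minimum stages at total reflux under that assumption
```

Note how slowly the Fenske stage count grows with purity but how sharply it blows up as alpha approaches 1, which is why a close-boiling pair like ethanol/isopropanol needs tens of stages and, hence, a tall packed column.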
Tuesday, September 3, 2019
From Taco Bell to Tanzania Essay example -- Graduate Admissions Essays
From Taco Bell to Tanzania I lived until the age of 18 in Lacey, Washington, a small town made up mostly of the strip malls and Taco Bell fast food restaurants that line Interstate 5 from Portland to Seattle. Very few of my high school classmates left this town, and instead moved back into the service industries and lower rungs of state bureaucracy where their parents had worked before them. For those of us who wanted to leave, the only routes, at the time, seemed to be the military or higher education. Since, by middle school, I had been tracked into college prep courses, I assumed that I would go to college but did not know where or what to study. In our garage, my grandfather kept back issues of National Geographic dating to the 1920's. The summer before starting high school, he paid me to dust them and it was then that I discovered something called "Anthropology" which, when studied, appeared to lead to a more interesting life in a more interesting place. For my Freshman Physical Science course's "SCIENCE CAREERS DAY," I wrote "Anthropology" down as my career goal, though I knew nothing at the time about the discipline besides the name. I likewise chose a college which I knew nothing about - Lewis and Clark in Oregon - because the brochure mentioned that there were several dozen overseas programs available through the school. Though I could have gone to India, Indonesia, Ecuador, Australia, Korea or many other countries, I decided to apply for Kenya because the year before I had read a book about nomads and the program included a unit on nomadic pastoralism and ecology. After rereading this book much later, I discovered it to be an incredibly sappy, melodramatic and condescending ... ...conflicts in other areas of social life. In the summer of 1994, I had the opportunity to travel to Tanzania on an SSRC Predissertation Grant to begin to establish affiliation, research clearance and possible fieldsites. 
I have also made contacts at the district level with officials and academics in the area. Though I already speak Kiswahili, the national language of Tanzania, I also have made arrangements to study Maa, the language of the Kisongo Maasai and WaArusha who live in the district in which I will be working. I am looking forward to working in Tanzania not only because of its political stability and unique history as a nation, but also because of the opportunity to generate information about children and education in pastoral communities there, a topic which is still under-researched despite the restructuring of national curriculum in recent years.