Tuesday, August 6, 2019
History of Gaming Essay
Gaming has been around since the late 1960s, far earlier than I had thought. I had always assumed the Atari 2600 was the first gaming system known to mankind, and after starting my research I found out how wrong I was. In this part of the team's paper I will discuss the different types of this technology throughout its history.

In 1967 the first gaming console ever was created, named the "Brown Box." Ralph Baer, a German-born television engineer, and his colleagues created the Brown Box, which worked on a standard television set. Working at a firm called Sanders Associates, Baer and his co-workers drew up schematics for a chase game and built a vacuum tube circuit that connected to the television. Two players controlled two squares that chased each other on the screen; at that point in time, gaming was born. Baer and his associates later added a light gun and developed a total of 12 games for the Brown Box.

Fast forward five years after Baer and his associates created the Brown Box: Magnavox began production of the Magnavox Odyssey. The Odyssey has been called the first commercial video-game console and was marketed in Magnavox TV dealerships. The Odyssey used six cartridges to play up to twelve games. The Odyssey's downfall was that many TV dealers didn't see any potential in it, and that, along with the false rumor that it only worked on Magnavox televisions, hurt the system's popularity.

In 1972 Nolan Bushnell founded Atari, which had its first smash hit with the arcade game Pong; in 1975 Atari sold a home version of Pong through Sears under the Sears Tele-Games label. What made home Pong unique was that it produced an on-screen score and sound all from a single chip. With this first system, Nolan Bushnell and Atari shot to the top of household gaming. Knowing that nothing lasts forever, especially a single-game system's popularity, Bushnell and Atari started working on a cartridge-based system, one that could play multiple games. In 1976 Bushnell sold Atari to Warner Communications, Warner immediately backed the project, and the Atari VCS, later renamed the Atari 2600, was introduced. The Atari was a big hit and had multiple hit games to go along with it, like the ever-popular Space Invaders, Breakout, Missile Command, and Combat, just to name a few. Atari's was the first mass-produced home video gaming system, selling 400,000 units and hitting sales figures of $120 million during the first Christmas season in 1977.

In 1980 Mattel released the Intellivision and was the first to take a run at Atari's gaming throne. This sparked what some would call a console war between Atari and Mattel. The Intellivision featured better game graphics and synthesized voices in video games. Both systems brought on other game developers like Coleco and Activision, but a flood of unlicensed games and the lack of a dominant household system led to the gaming industry crash of 1983-84, causing many companies to file for bankruptcy.

Nintendo, a Japanese company, entered the gaming industry in America in 1985 with its release of the Nintendo Entertainment System. With some of the most popular titles ever, like Super Mario Brothers, Metroid, and The Legend of Zelda, plus third-party games like Tecmo Bowl and R.B.I. Baseball, the NES turned the non-believers who thought it a bad idea to enter gaming after a crash into believers.
In 1987 Sega released a gaming system that dwarfed the capabilities of the NES. It was called the Master System, but it didn't make the splash that many in the Sega camp thought it would; its lack of third-party games, among other things, spelled the demise of the Master System and the first failure for Sega. Sega went straight back to the drawing board and soon after released its first hit, the Sega Genesis, in 1989. Backed by many game developers like Electronic Arts, the Sega Genesis opened the door to the first real battle for gaming supremacy.

Four years after the release of the Nintendo Entertainment System, Nintendo released its second smash hit, the Gameboy. The Gameboy was dubbed the first handheld gaming console and had a black-and-white screen. Tetris, one of the most popular Gameboy games, sent Gameboy sales through the roof. With many makeovers over the course of its history, the Gameboy continued to stay relevant in gaming.

1990 saw the release of SNK's Neo-Geo, a 24-bit home gaming system said to be many years ahead of its competitors and its time. The Neo-Geo had one problem. It wasn't the 2-D graphics; those were great. It wasn't the arcade-level quality; that was what everyone wanted. It was its $650.00 price tag and its $200.00 games that took the Neo-Geo from great innovation to underachiever.

In 1991 Nintendo released the Super Nintendo Entertainment System. Beaten to the punch by Sega's Genesis, the 16-bit SNES had some catching up to do. It wasn't long before the SNES, thanks to its technological superiority and big-name games like Super Mario Brothers, Metroid, and Zelda, had overtaken the Sega Genesis. By the end of the 90s the Sega Genesis was no more than an afterthought next to the SNES, which had taken over the 16-bit era and the gaming industry.

It wasn't until 1995 that Nintendo was again challenged for its gaming dominance. Sony introduced the PlayStation, a 32-bit CD-ROM-based system that has been called the most popular gaming system ever. Its games had three-dimensional gameplay, much better than the 2-D of the past. Most attractive to any gamer was the price of the games, which dropped dramatically compared to cartridge-based games. It was also in 1995, on the second day of September, that the Sega franchise basically began its demise. That day marked the release of the Sega Saturn, dubbed by many the best worst system. Because the Saturn was rushed out ahead of the newly announced PlayStation, Sony had time to sit back, polish the upcoming PlayStation, and learn from the mistakes the Saturn was making.

In 1996 Nintendo countered the PlayStation with the Nintendo 64, the last cartridge-based home console. Although the cartridges loaded faster than the PlayStation's CD-ROMs, the Nintendo 64 couldn't keep up with the number of games released for the PlayStation, leaving the PlayStation on top of gaming for the time being. It was in 1998 that Sega made its final push as a contender in the world of gaming with the release of the CD-based Dreamcast, which incorporated a 128-bit processor.
The Dreamcast was the first console with a built-in modem, which allowed for internet connectivity and brought the ability to play online, download extra content, and update the system itself. This would be the last time anyone would hear from Sega in the console industry: in 2001 Sega discontinued the Dreamcast and announced that it was leaving the console market for good, taking its Sonic franchise with it.

Four years after the release of the Nintendo 64 and five years after the original PlayStation, Sony released the PlayStation 2. The first 128-bit system, it featured backwards compatibility, allowing the PS2 to play older PlayStation games, and it could also be used as a DVD player. Not only was the PS2 the most popular 128-bit system, but its release marked the rise of console gaming played over the internet.

2001 brought both the old guard with a new system and a new kid on the block. Nintendo hit the gaming market again, moving away from its usual cartridge-based systems and introducing its first disc-based console. The GameCube was marketed more toward younger demographics, which is said to be what made the system flop and sell short. Along with its inability to connect to the internet, it was called a one-trick pony, behind the times in both gaming and home entertainment. The new kid on the block, Microsoft, introduced its own gaming system, the Xbox. The PC giant used PC technology when building the Xbox, which allowed for greater performance. Although the Xbox's technology outperformed the 128-bit PS2, its sales still trailed the PS2's, leaving Sony alone at the top of the gaming world.

In 2001 Nintendo stepped back into handhelds with the Gameboy Advance, and again in 2004 with the Nintendo DS. Some would argue, and go as far as to say, that all Nintendo did was step on its own toes by releasing the two versions so closely together; by doing so, experts say, it made the Gameboy Advance obsolete by releasing the DS so soon. PlayStation soon joined Nintendo in the handheld gaming evolution, challenging Nintendo's dominance. The PlayStation Portable, like the Gameboy and DS, featured wireless capability, high-end graphics, and non-gaming functions mimicking PC-based devices.

2005 marked the release of the Microsoft Xbox 360. The processing power of the Xbox 360 was said to be unrivaled; every unit came with an internal hard drive, played DVDs, could connect four controllers wirelessly at one time, and connected to the internet. The Nintendo Wii was released in 2006 and featured built-in WiFi for online play, wireless controllers, and Bluetooth. Wireless controllers weren't new to gaming, but what Nintendo brought to the table was motion-sensor technology, allowing the game to mimic the movements of the actual player. 2006 is also when Sony again asserted its dominance and answered the Microsoft Xbox 360 with the PlayStation 3. The PS3 came with an internal hard drive as well and the ability to play not only DVDs but also Blu-ray discs; with its ability to connect to the internet wirelessly and connect multiple controllers wirelessly, it is said to still sit atop the gaming industry, but again, that's all in who you ask.
These two systems, the PlayStation 3 and Xbox 360, along with the Nintendo Wii, bring us to where we are now: the seventh generation of gaming. Every year the three gaming giants, Sony, Microsoft, and Nintendo, continue to wow us with new and greater gaming innovation that keeps the rivalry for gaming dominance going strong. In the last few years the Xbox 360 stepped up the gaming war and released technology that gets rid of handheld controllers altogether: the Kinect from Microsoft has done away with the controller, period. The Kinect earned a Guinness World Record as the fastest-selling consumer electronics device and comes highly recommended by many consumers and game developers. Recently it was reported that Microsoft is looking to compete with cable television providers, announcing that the Xbox 360 will let gamers watch television through the console. All of the consoles already let you watch services like Netflix and surf the internet, but rivaling cable providers is a big step. I expect Sony and Nintendo to sit back and wait to see where this goes before they follow in Microsoft's footsteps, as they have done before; I don't think they will jump right on board, for fear of failure and losing money.

If you are a hardcore gamer, you have to be excited for the future of gaming; from any standpoint it looks bright and exciting. With rumors of Sony releasing the PlayStation 4 with touch-screen technology, gamers are grinning from ear to ear about what the future of gaming may actually hold. Gaming has already developed full 3D technology, 3D motion sensors, and powerful graphics, and the industry looks to continue improving on them. Many have argued that consoles will eventually fall by the wayside, making way for handheld portable gaming to become the dominant platform. Of course this upsets many hardcore gamers, who don't see it as a plus since they like to play their games on a larger screen. With that being said, many say the iPhone 8 will probably be the future of gaming, going as far as to say that although it is a handheld device, it will be able to plug into a television set or even connect wirelessly to keep giving that big-screen experience.

Many say the future of gaming is mobile, digital, and cloud gaming. The better gaming system is said to be one you can take anywhere and everywhere. Many gamers use mobile gaming even when at home; some experts say up to 47% of gamers play mobile games at home. Mobile gaming is not just the casual gaming it used to be and has changed the way all gamers view games. With gamers spending approximately 16 hours a week on digital games compared to 18 hours on traditional gaming, some say digital is the way of the future and hardware doesn't matter anymore. Arguably the type of hardware never really mattered; it has always been about the software, and hardware will only become less and less relevant in the future. Cloud gaming is becoming more and more of a reality, and with two client services already launched, it is coming full circle.
The angle of cloud gaming is to offer high-definition gaming on any television or device that can receive a broadband signal, with even high-end games playable on low-powered devices. A CNNTech report back in September 2008 described the upcoming release of the Emotiv EPOC headset. The Emotiv EPOC headset is said to be the first brain-computer interface and would be able to pick up on over 30 different expressions by processing real-time brain activity. With this technology, many enthusiasts say that movies like The Matrix and the Star Trek Holodeck become more of a future reality. Me personally, I am kind of skeptical about that type of technology; even though experts say that being able to control a computer with your mind is the ultimate quest of human-to-machine interaction, I still don't feel any better. In those movies, well, all except Star Trek, the machine had a way to take over. So all I see is me playing the video game, the machine taking over my mind and controlling me, and then the earth being run by machines. Well, that is how it works in the movies; hopefully this won't be the case. The goal, though, is for interactions with machines to mimic interactions with humans, and they say it will ultimately bring communities closer together, so I guess that is one thing we will have to wait and see about.

So with that being said, there is no clear-cut way to know which way the future of gaming is actually going, but seeing how far it has come since the Brown Box, it is very interesting to sit back and see what the technology produces. At one point in time it was all about transistors and diodes, and now it's about multiple processors, internal memory, and lifelike graphics. What's more, gaming consoles are no longer just gaming consoles; with internet connections, streaming music and video, and hard drives, they are now the full, true definition of home entertainment. It is truly a new age for gaming.

Social media and social networks have become a big part of many of our lives, well, I really should say all of our lives, and gaming has become part of that as it is built into social media platforms like Facebook. Many game makers look to social media to see exactly what gamers are taking to and what they are not. World of Warcraft is one game whose makers use social media to help develop more of what they think users want to see and how they want the game to operate, and it is actually one of the largest virtual game worlds due to just that, along with Second Life, which heralds itself as a virtual social world. Social media focuses on building blocks: identity, conversation, sharing, presence, relationships, reputation, and groups, and many gaming systems that can connect to the World Wide Web can do just that. These gaming technologies give you the opportunity to engage with others from around the world no matter where they may be. Facebook has games that let you connect with others, build a farm or a city, share with each other, and even help keep up someone else's farm, building relationships with people anywhere, anytime. No matter what you think about gaming and gaming technology, it has come a long way from where it started, and there is no telling where it will go. It is a mainstay in the lives of many people, and I am sure the technology will only get better and more interesting from here into the future.
I do wonder myself what is next: will the whole computer-to-brain interaction ever come full circle, or is it just a passing thought? Will Microsoft, Sony, and Nintendo continue to stay on top, or will an unknown company come along and knock all of them off the throne? I guess all we can do is sit back, watch, wait, and see what the next big technology is and who will be the first to come out with it. Right now it's more or less a waiting game. The Kinect was a big jump in gaming technology, and I love to sit and watch my kids play on it; I have even been known to play on it myself. Gaming technology is what keeps many of us in touch and connected, so there is nothing wrong with that.
Monday, August 5, 2019
Synthesis of IBT Scaffolds Experiment
Chapter III, Section A: Present Work

Over the years, multi-component reactions (MCRs), or three-component coupling reactions, have gained much attention in synthetic as well as medicinal chemistry as a way to generate structural diversity for drug discovery programs.31 An MCR is a process in which at least three simple building blocks are combined in one pot to provide a diversity-oriented product with a minimum of synthetic time and effort. The imidazo[2,1-b]benzothiazole (IBT) motif is an important pharmacophore known to exhibit significant biological activities such as antimicrobial, antitumor, antituberculosis, and anti-inflammatory effects.13-30 Consequently, different methods23-26 have been developed for the synthesis of IBT scaffolds. Of these, the MCR of a 2-aminobenzothiazole, an aldehyde, and an alkyne is one of the most attractive methods for the synthesis of 2-arylimidazo[2,1-b]benzothiazoles. To the best of our knowledge, there is only one report14c to date on the synthesis of the title compounds, and it is a multistep procedure. In continuation of our interest in the synthesis32 of biologically active scaffolds, we herein describe a three-component, one-pot synthesis of IBTs using catalytic amounts of La(OTf)3 and CuI in acetonitrile.

Accordingly, we first treated 2-aminobenzothiazole (1a) and benzaldehyde (31a) with ethyl propiolate (32) in acetonitrile in the presence of 10 mol % La(OTf)3 and CuI. The reaction proceeded well at room temperature and afforded the desired IBT 33a in 86% yield (Scheme 10).

Scheme 10

The formation of 33a was confirmed by spectral analysis. The 1H NMR spectrum of IBT 33a shows a characteristic methylene singlet at δ 4.16 (s, 2H); doublets at δ 7.78 (d, J = 7.9 Hz, 1H) and 7.70 (d, J = 8.3 Hz, 1H) belong to the benzothiazole ring (ortho-hydrogens with respect to sulphur and nitrogen) in the fused system. Another doublet at δ 7.68 (d, J = 7.9 Hz, 2H) belongs to the ortho-hydrogens of the phenyl ring at the C-2 position, and the remaining aromatic protons are observed at δ 7.42 (t, J = 7.5 Hz, 1H), 7.33 (t, J = 7.5 Hz, 2H) and 7.01 (t, J = 7.9 Hz, 2H). The ethyl ester protons resonate at δ 4.26 (q, J = 6.8 Hz, 2H) and 1.27 (t, J = 8.1 Hz, 3H), belonging to the OCH2 and CH3 of the ethyl group, respectively. In the 13C NMR spectrum, a peak at δ 171.1 corresponds to the ester carbonyl (C=O), a peak at δ 61.1 belongs to the CH3-CH2-O-CO- carbon, and the remaining aromatic carbons resonate at their expected chemical shift values. A new absorption maximum at 1735 cm-1 in the IR spectrum also supports the presence of the ester C=O. Finally, ESI-MS also supports IBT 33a, showing a peak at m/z 337 for its molecular ion.

Plausible mechanism

Scheme 11 Plausible mechanism for the CuI-La(OTf)3 catalyzed multi-component reaction

In accordance with the mechanism described by Mishra et al.,33 the reaction may proceed by one of the two paths shown above. In path I, benzaldehyde first reacts with 2-aminobenzothiazole in the presence of La(OTf)3, and the resulting imine then reacts with ethyl propiolate to form the desired imidazobenzothiazole 33a via initial formation of propargylamine I. Tautomerization of propargylamine I followed by 5-exo-dig cyclization forms intermediate III, which finally isomerizes to product 33a.
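As a quick consistency check on the ESI-MS data: the three-component stoichiometry (2-aminobenzothiazole C7H6N2S + benzaldehyde C7H6O + ethyl propiolate C5H6O2, with loss of one H2O on imine formation) implies a molecular formula of C19H16N2O2S for 33a. This formula is an inference from the scheme rather than a value quoted in the text. With nominal atomic masses,

M(33a) = 19(12) + 16(1) + 2(14) + 2(16) + 32 = 336

so the observed peak at m/z 337 corresponds to the protonated molecule, [M+H]+ = 336 + 1 = 337.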
Optimization study

Table 1 Screening of the catalyst for the three-component reaction

Entry  Catalyst   Additive   Solvent  Yield [%]
1      CuCl       -          CH3CN    39
2      CuBr       -          CH3CN    42
3      CuI        -          CH3CN    65
4      FeCl3      -          CH3CN    -
5      InCl3      -          CH3CN    48
6      InBr3      -          CH3CN    51
7      CuI        Cu(OAc)2   CH3CN    66
8      CuI        Cu(OTf)2   CH3CN    69
9      CuI        Sc(OTf)3   CH3CN    59
10     CuI        La(OTf)3   CH3CN    86
11     CuI        Yb(OTf)3   CH3CN    63
12     CuI        TMEDA      CH3CN    54
13     La(OTf)3   -          CH3CN    55
14     -          -          CH3CN    -

Initially, we screened the reaction between 2-aminobenzothiazole 1a, benzaldehyde 31a, and ethyl propiolate 32 as starting materials, using different catalysts to optimize the reaction conditions (Table 1). The desired product was obtained in 65% yield when 10 mol % of CuI was used in CH3CN. Further optimization was performed to improve the yield of the product. The best result was obtained when La(OTf)3 was used together with CuI, giving a high yield in a short reaction time at room temperature. In the absence of any catalyst, however, the reaction proceeds in low yield even after a longer reaction time (24 h). Our attempts to optimize the conditions for the synthesis of the 2-arylimidazo[2,1-b]benzothiazoles 33a-o are summarized in Table 1.

To check the generality and scope of the present protocol, a variety of benzaldehydes containing electron-withdrawing or electron-donating substituents were reacted under these conditions with 2-aminobenzothiazole, affording the corresponding IBTs (Scheme 12).

Scheme 12

To explore the limitations of this reaction, we extended it to various para-substituted benzaldehydes with 6-methyl-2-aminobenzothiazole. As can be seen in Table 2, the yields of the products appear to be affected by the nature of the substituents and their positions on the benzothiazole as well as the benzaldehyde. The yields decreased when electron-withdrawing substituents were present on the reactants (Scheme 13).

Scheme 13

The compounds 33f-j were characterized by 1H NMR, 13C NMR, IR and ESI-MS; the results are shown in the experimental section. For instance, the spectral analysis of 33h is explained here. The 1H NMR spectrum of IBT 33h shows a characteristic methylene singlet at δ 4.27 (s, 2H), methyl protons at δ 2.35 (s, 3H), methoxy protons at δ 3.86 (s, 3H), and the hydrogen adjacent to the sulphur-attached carbon at δ 7.71 (s, 1H), all as singlets; doublets at δ 7.64 (d, J = 7.9 Hz, 1H) and 7.35 (d, J = 7.1 Hz, 1H) belong to the benzothiazole ring (ortho- and meta-hydrogens with respect to nitrogen) in the fused system. Another two doublets at δ 7.55 (d, J = 7.8 Hz, 2H) and 7.01 (d, J = 7.6 Hz, 2H) belong to the methoxy-substituted phenyl ring, whereas the ethyl ester protons resonate at δ 4.15 (q, J = 8.1 Hz, 2H) and 1.27 (t, J = 8.2 Hz, 3H), belonging to the OCH2 and CH3 of the ethyl group, respectively. In the 13C NMR spectrum, a peak at δ 169.1 corresponds to the ester carbonyl (C=O), a peak at 160.8 ppm belongs to the OMe-attached carbon of the phenyl ring, a peak at δ 61.2 belongs to the CH3-CH2-O-CO- carbon, and the remaining aromatic carbons resonate at their expected chemical shift values. New absorption maxima at 1738 and 1210 cm-1 in the IR spectrum also support the presence of the ester C=O. Finally, ESI-MS also supports IBT 33h, showing a peak at m/z 381 for its molecular ion.

Furthermore, a variety of aromatic aldehydes such as p-methyl-, p-methoxy-, p-nitro- and p-cyanobenzaldehyde participated well in this MCR with 6-nitro-2-aminobenzothiazole and gave excellent yields. The synthesized compounds 33k-o were characterized by 1H NMR, 13C NMR, IR and ESI-MS; the results are shown in the experimental section.
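As with 33a, a quick nominal-mass check (using the molecular formula inferred from the stoichiometry, not a figure quoted in the text) is consistent with the ESI-MS data for 33h:

M(33h) = 336 + 14 (CH3 replacing H on the benzothiazole) + 30 (OCH3 replacing H on the phenyl ring) = 380, so [M+H]+ = 381.

The same arithmetic, with two hydrogens replaced by NO2 groups (+45 each), gives M = 426 and hence [M+H]+ = 427 for the dinitro product 33n discussed below.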
For example, the spectral analysis of IBT 33n is explained here. The 1H NMR spectrum of IBT 33n shows a characteristic methylene singlet at δ 4.19 (s, 2H), and the hydrogen adjacent to the sulphur-attached carbon and the NO2 group resonates at δ 8.55 (s, 1H) as a singlet; a doublet at δ 8.01 (d, J = 7.7 Hz, 2H) belongs to the meta-hydrogens of the nitro-substituted phenyl ring, and a multiplet between δ 8.40-8.50 (m, 3H) belongs to a hydrogen of the benzothiazole ring merged with the ortho-hydrogens of the nitro-substituted phenyl ring, whereas the ethyl ester protons resonate at δ 4.11 (q, J = 8.0 Hz, 2H) and 1.21 (t, J = 8.2 Hz, 3H), belonging to the OCH2 and CH3 of the ethyl group, respectively. In the 13C NMR spectrum, a peak at δ 169.1 corresponds to the ester carbonyl (C=O), peaks at 147.8 and 144.7 ppm belong to the NO2-attached carbons, a peak at δ 61.5 belongs to the CH3-CH2-O-CO- carbon, and the remaining aromatic carbons resonate at their expected chemical shift values. A new absorption maximum at 1735 cm-1 in the IR spectrum also supports the presence of the ester C=O, and bands at 1536 and 1365 cm-1 correspond to the NO2 group. Finally, ESI-MS also supports IBT 33n, showing a peak at m/z 427 for its molecular ion.

The imidazobenzothiazole derivatives were synthesized with the La(OTf)3-CuI catalytic combination in good to excellent yields, as shown in Table 2.

Table 2 The new ethyl 2-(2-arylimidazo[2,1-b][1,3]benzothiazol-1-yl)acetates 33a-o (the benzothiazole, aldehyde and product structures were shown graphically in the original table; only the yields are reproduced here)

Product  Yield (%)
33a      86
33b      89
33c      91
33d      81
33e      79
33f      92
33g      91
33h      95
33i      85
33j      88
33k      82
33l      84
33m      85
33n      78
33o      79

Conclusion

In summary, a novel method for the synthesis of ethyl 2-(2-arylimidazo[2,1-b][1,3]benzothiazol-1-yl)acetates was demonstrated from a benzaldehyde, ethyl propiolate, and a 2-aminobenzothiazole in the presence of the La(OTf)3-CuI catalyst system, in good to excellent yields. The reaction takes place under mild conditions and tolerates a wide range of functionalities. This methodology therefore offers an alternative to multistep routes.
Sunday, August 4, 2019
Optimum Currency Area (OCA) Theory
What criteria did Mundell use to identify an optimum currency area, and how relevant are these criteria today in deciding whether two countries constitute an optimum currency area?

An Optimum Currency Area (OCA) is a geographical region in which economic efficiency is maximised by the entire region sharing a single currency (a monetary union), or by several currencies pegged to each other via fixed exchange rates. National authorities have come to the realisation that by merging with other countries to share a currency, everyone might benefit from gains in economic efficiency. An example of this can be seen in the formation of the euro, where the countries involved do not individually match the criteria of an OCA but believe that together they come close. The aim of national authorities is to establish the correct form of economic integration to maximise efficiency.

One of the original founders of the OCA theory was the economist Robert Mundell. In his first paper, 'A Theory of Optimum Currency Areas' (1961), he presented several principal criteria for creating a functioning monetary union. To illustrate these criteria I shall on occasion refer to an example, due to Paul De Grauwe (2003), of consumer preferences switching from French to German-made products. The change in consumer preferences causes an upward shift in aggregate demand in Germany and a downward shift in France, as shown in Figure 1. The output decline in France and increase in Germany is most likely to cause unemployment to rise in France and fall in Germany.

The first of the criteria for an OCA is price and wage flexibility throughout the geographical area. This means that the market forces of supply and demand automatically distribute money and goods to where they are needed. For example, with regard to France and Germany, under perfect wage flexibility the unemployed workers in France will reduce their wage claims, and conversely the excess demand for labour in Germany will push up the wage rate there. This shifts aggregate supply in France outwards, making French products more competitive and stimulating demand, whereas the opposite occurs in Germany. Figure 2 shows the effect of wage flexibility as an automatic adjustment mechanism.

Mundell cited the importance of factor mobility as an "essential ingredient of a common currency" (Mundell, 1961), and thus labour mobility across the geographical region is one of Mundell's main criteria for an OCA. In the case of De Grauwe's example, French unemployed workers would move to Germany, where there is excess demand for labour. This free movement of labour eliminates the need to let wages decline in France and increase in Germany, solving both the unemployment problem in France and the inflationary wage pressures in Germany. The existence of labour mobility relies on the unrealistic assumption of free movement of workers between regions regardless of physical barriers such as work permits, cultural barriers such as language difficulties, and institutional barriers such as superannuation transferrals. Indeed, Peter Kenen referred to the additional costs of retraining workers and the "unrealistic assumption of perfect occupational mobility" (Kenen, 1969). Ronald McKinnon observed that "in practice this does not work perfectly as there is no true wage flexibility" (McKinnon, 1979).
McKinnon is simply highlighting the point that in reality wage flexibility, as well as perfect labour and capital mobility, do not always exist. Consider a case where wages in France do not decline despite the unemployment situation (no wage flexibility), and French workers do not move to Germany (no labour mobility): both Germany and France would be stuck in the original position of disequilibrium. In Germany the excess demand for labour would put pressure on the wage rate, causing an upward shift in the supply curve. The adjustment from the position of disequilibrium would in this case come exclusively from price increases in Germany, making French goods more competitive once more. Therefore, if wage flexibility and labour mobility do not exist, the adjustment process will rely entirely on inflation in Germany.

Mundell also stated that product diversification over the geographical area is an important determinant of the suitability of a region to share a currency. This has been supported by many economists, such as Peter Kenen, who says that "groups of countries with diversified domestic production are more likely to constitute optimum currency areas than groups whose members are highly specialised" (Kenen, 1969). Finally, Mundell stated that an automatic fiscal transfer mechanism is required to redistribute money to sectors adversely affected by labour and capital mobility. This usually takes the form of taxation redistribution to less developed areas of the OCA. Whilst this is theoretically ideal and necessary, in practice it is extremely difficult to get the well-off regions of the OCA to give away their wealth.

Mundell produced two models in relation to OCA theory. In the first, a model of Stationary Expectations (SE), he takes a pessimistic view of monetary integration; in his second paper he counters this and focuses on the benefits of a monetary union under a model of International Risk Sharing (IRS), which has conversely been used to argue for the forming of monetary unions. 'A Theory of Optimum Currency Areas' (1961) portrays OCAs under stationary expectations. The assumption is made that asymmetric shocks undermine the real economy, and thus flexible exchange rates are considered preferable, because a shared monetary policy would not be precisely tuned to the specific situation of each constituent region. This paper led to the formation of the Mundell-Fleming model of an open economy, which has been used to argue against the forming of monetary unions, since an economy cannot simultaneously maintain a fixed exchange rate, free capital movement, and an independent monetary policy.

Whilst Mundell's criteria for an OCA are held in high regard by many economists, there are some criticisms levelled at him. Capital mobility is seen to have been a "greater adjustment mechanism than labour mobility" (Eichengreen, 1990), and this is a factor John Ingram criticises Mundell for ignoring. Clearly the openness of the region to capital mobility is crucial to the makeup of an OCA, as for trade to exist between participating regions, free movement of capital is necessary. However, in the years that followed his 1961 paper on OCAs, Mundell took note of these criticisms and began to doubt the basic argument for flexible exchange rates as an adjustment mechanism.
He became more appreciative of the adjustment mechanism under fixed exchange rates: "It was not that I had forgotten the Mundell-Fleming model, but that I had gone beyond it" (Mundell, 1997). In his 1973 paper, 'Uncommon Arguments for Common Currencies', Mundell discarded his earlier assumption of static expectations to look at how future uncertainty about the exchange rate could disrupt capital markets by restraining international portfolio diversification and risk-sharing. Here he introduces his second model of OCAs, under IRS. He counters his previous idea that asymmetric shocks weaken the case for a common currency by suggesting that a common currency can cushion such shocks by sharing the burden of loss.

He uses the example of two countries, Capricorn and Cancer. In spring, Cancer ships half of its crop to Capricorn and in return receives evidence of Capricorn's debt: a claim to half of Capricorn's food crop in autumn. While one country is expanding its money supply and running a balance of payments surplus, the other will be running a balance of payments deficit, and the process is reversed during the next period. Mundell points out that this system is very satisfactory in a world of certainty; in reality, however, there is speculation about the convertibility of foreign currencies. If Cancer had a bad harvest and produced less crop, redeeming all of the notes held by Capricorn would mean providing Capricorn with its promised share of crop as usual, leaving Cancer short. The only defence against paying out the promised share of crop would be a devaluation of Cancer's currency, and thus a reduction in Capricorn's claim on the crop. Capricorn needs enough crop to survive and produce food in the autumn, so that Cancer will not also be left short of supplies in the next period. The solution would appear to be a partial devaluation of Cancer's currency, so that the burden of loss is shared between the two countries. Mundell has shown that with different currencies comes the uncertainty of devaluation, a problem a common currency would not have. Under a common "world" currency, if Cancer has a bad crop, the total amount of world currency will exchange for the full quantity of crop, irrespective of who holds the money, as competition and freedom of arbitrage assure a single price. So long as competition exists, and there are no time lags in the transmission of goods or information, the price of food will rise for both countries, and so the burden of the shock is shared automatically and equally by the two countries.

To reconcile Mundell's two papers and assess the appropriateness of the criteria in determining two countries' suitability as a currency area, I have decided to look at the case of the European Monetary Union (EMU) and its success as a monetary union. There are many examples of countries within Europe that would struggle to maintain international competitiveness without the currency area. The areas of the EU with low labour mobility are furthest from meeting the criteria of a currency area. While the removal of legal barriers (such as visas) has improved labour mobility, issues such as language barriers remain: a French worker may not wish to move to Spain because they cannot speak Spanish, and people tend to have ties to the places where they currently live and may not be willing to move away.
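Before turning to the empirical evidence, it may help to put rough numbers on Mundell's risk-sharing example above. The following is a stylized quantity-theory illustration with invented figures; it is not a calculation from Mundell's paper. Let the single price level satisfy

P = M / Q

where M is the common money supply and Q is the combined crop. In a normal season, M = 100 and Q = 100, so P = 1. If Cancer's bad harvest cuts the combined crop to Q = 80 while M is unchanged, the price rises to P = 100/80 = 1.25, and the real value of every unit of money falls to 1/1.25 = 0.8 of its former level, regardless of whether Cancer or Capricorn holds it. The 20 per cent shortfall is thus borne pro rata by all money holders, which is exactly the automatic burden-sharing Mundell describes.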
Bayoumi and Eichengreen (1992) compared the US and Europe with respect to how disturbances in separate regions match shocks in a selected benchmark region. They chose Germany as the benchmark for Europe and found a relatively high symmetry of disturbances within the core of the EU, such as Austria, the Benelux countries, Denmark, France and Germany. They also found that the symmetry was lower for other western European countries. Compared with the USA, the EMU had a higher probability of asymmetric shocks, although according to Fidrmuc and Korhonen (2001) the extent of asymmetric shocks is declining in the EU economies. Bayoumi and Eichengreen believe that countries within Europe are further from an OCA than regions of the USA, and so are less appropriate as a currency area. These studies suggest that two countries in the EU are less suited to forming a monetary union than the regions of the USA, although the situation is improving.

Frankel and Rose (1998) argued that the higher the trade integration, the higher the correlation of business cycles among countries; in other words, there is greater symmetry of shocks. They also propose that business cycles and trade integration are inter-related processes endogenous to establishing a currency union. Frankel and Rose's empirical findings noted that EMU entry encourages trade linkages among countries and causes the business cycle to become more symmetrical among the union's participants. Rose and Stanley (2005) find that a common currency generally increases trade among its members by between 30% and 90%. These findings agree with Mundell's argument that a common currency can help to deal with asymmetric shocks, and they suggest that although two countries considering a common currency may not meet the criteria before they join the currency area, they may do so afterwards.

Economists are divided in opinion between Mundell's two OCA models. The contrasting views Mundell presents in his papers have earned him a title as "the intellectual father to both sides of the debate". While some economists support the theory of stationary expectations, preferring flexible exchange rates, and conclude against the euro, others advocate the IRS model, preferring fixed exchange rates, and conclude in favour of the euro. Mundell himself seems to have eventually settled in favour of fixed exchange rates in a monetary union; however, he still advocates the use of flexible exchange rates in two cases: unstable countries whose inflation differs significantly from that of the regions sharing their currency, and large countries where there is no established international monetary system, e.g. the USA. From Mundell's studies I can conclude that two countries which are heavily integrated through highly mobile factors of production, and which are highly diversified in their goods, should join a common currency. With regard to the relevance of Mundell's theory today, I would say his studies are still valid and are used heavily as complementary theory to the monetary integration occurring in Europe and throughout the world.

References

Mundell, R. 'A Theory of Optimum Currency Areas', 1961.
Mundell, R. 'Uncommon Arguments for Common Currencies', p. 115, 1973.
Mundell, R. Remarks at a conference on optimum currency areas, Tel-Aviv University, 5 December 1997.
De Grauwe, P. 'Economics of Monetary Union', p. 7, 2003.
McKinnon, R. 'Money in International Exchange: The Convertible Currency System', 1979.
Kenen, P. 'The Theory of Optimum Currency Areas: An Eclectic View', in 'Monetary Problems of the International Economy', pp. 95-100, 1969.
Eichengreen, B. 'One Money for Europe? Lessons from the US Currency Union', 1990.
Eichengreen, B. 'Is Europe an Optimal Currency Area?', 1991.
Fidrmuc, J. and Korhonen, I. 'Similarity of Supply and Demand Shocks between the Euro Area and the CEECs', 2001.
Frankel, J. A. and Rose, A. K. 'The Endogeneity of the Optimum Currency Area Criteria', pp. 1009-25, July 1998.
Rose, A. K. and Stanley, T. D. 'A Meta-Analysis of the Effect of Common Currencies on International Trade', pp. 347-365, 2005.
Windows NT Proposal Essay
Migration to Windows NT: Proposal Plan

As technology advances, so should the products and services provided by companies. In every industry, technology is becoming the key success factor for growth and profit. The ability to communicate with people all around the world has created a new marketplace for business. In order to remain competitive, it is important for companies to utilize the most current technology. At ABC Inc., the use of the latest technological tools allows the company to provide first-rate, quality architectural engineering services to its clients.

As part of the company's strategic goal to increase profits and clients, the board of directors established an information technology steering committee to look at how the company could improve its technology. The committee was tasked with making sure the company had the latest available industry computer tools and that all employees were uniform in terms of the technology. One of the most important findings of the committee's research was that the company and its branches were using varying types of software and hardware systems. The findings also showed that this lack of uniformity caused numerous communication problems, not only between the branches and corporate offices but also with the clientele. These findings were reported to senior management.

Based on the findings, senior management recently decided to ensure that all employees, branch offices, and the corporate office were working on the same software and hardware systems. Management decided to move the entire company to a Windows NT environment in order to improve productivity, create uniformity, create a more functional network infrastructure, and develop Intranet and Internet web sites. The Information Technology department (referred to as the "Team") was asked to look at the pros and cons of moving to Windows. The Team has prepared the following report based on its research efforts.

Business Requirement(s)

ABC Inc. is a progressive company based in San Diego, California. Since 1980, the company has offered a full range of architectural engineering services, from planning and analysis to design and implementation. We currently employ over 50 people in our home and branch offices, which include Las Vegas and San Francisco. Like many companies that implemented computer... ...ipment. This problem would cause headaches when one network was not in synch with the others. Centralized manageability would increase the stability of the network system.

Although Windows NT will be the operating system of choice, some of the company's UNIX systems will have to be retained. The UNIX servers provide the high-end graphics and geometric functionality so necessary in the architectural engineering field. However, once Windows NT 5.0 arrives with its 64-bit processor support, the company will migrate its graphics functions to the NT format. Integrating the UNIX servers into the Windows NT system will be accomplished by using the public-domain software known as Samba. Samba allows a UNIX server to "...behave similarly to a Windows-based server...", allowing clients to access and share UNIX applications seamlessly via NT (a minimal example share configuration is sketched after this section's citation). Communication within our network has much improved with Windows NT. We are now capable of sharing files and data between all offices. Our Fast Ethernet Intranet provides speedy and stable communication transport.

Justification

{Explain and justify the selected operating system}

1. Brian Honan, "Benefits of Migrating to Windows NT", February 1998, p. 186.
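To make the Samba point concrete, the following is a minimal sketch of the kind of smb.conf entry such an integration would involve. The workgroup name, share name and path are hypothetical, not taken from the proposal above.

; /etc/smb.conf -- minimal Samba configuration (hypothetical values)
[global]
; the NT workgroup/domain in which the UNIX server will appear
workgroup = ABCINC
server string = UNIX graphics server
; require users to authenticate before accessing shares
security = user

; expose a UNIX directory to NT clients as the share "graphics"
[graphics]
path = /export/graphics
read only = no
browseable = yes

With a configuration like this in place, an NT workstation can map the share (for example, net use G: \\unixserver\graphics) and read and write the UNIX-hosted files as if they resided on a Windows NT server.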
Saturday, August 3, 2019
Focal Dystonia of the Hand, and What the Brain Has to Do with It
Focal Dystonia of the Hand, And What the Brain Has To Do With It The body is complicated, and often the origins of a condition are all but obvious. Focal dystonia of the hand is one disorder whose underlying cause has been found in the more recent past. Although it can be genetic (1), the form of focal dystonia of the hand I look at here is caused by environmental factors (2). Focal dystonia of the hand is a condition characterized by a loss in motor control of one or more fingers. A single muscle or group of muscles is involved: muscles in the hand and forearm tense and tighten, with the result of making the hand (or part of it) curl (2). Musicians who have intensively practiced their instruments over a number of years are a group most affected by this condition. The reason is that focal dystonia can be caused by the repetitive movement of the fingers over a significant period of time. The condition was long known as "occupational hand cramp." (3). It can easily be misdiagnosed as simple overuse or stress of the hand (1). Although it may not be obvious at first sight of the symptoms, the level at which the problem is caused is not the hand, but the brain. Researchers at the University of Konstanz report "overlap or smearing of the homuncular organization of the representation of the digits in the primary somatosensory cortex" (3). Given that functions such as motor control cross over from the right side of the body to be represented in the left hemisphere, they found that the distance between the representations of individual fingers was smaller in the somatosensory cortex side corresponding to the hand that had undergone continued repetitious training (the left hand in case of violin players for example). What does all this mean in terms of the brain? Looking at the central nervous system as an input-output system, in very simple terms we can observe that a specific input is presented over and over again - in this case the stimulation of the fingers that play the violin - and as a result the organization within the box changes. More specifically, there is a one-to-one correspondence between input and internal representations of this input: all fingers are individually represented on the somatosensory cortex. But somehow, as these regions of representation begin to smear or overlap, the one-to-one correspondence is blurred.
Friday, August 2, 2019
Acorn Industries Essay
It is the rare corporation that recognizes the need to integrate its resources, policies, people, assets and procedures with changing business strategies. Rarer still is the organization that acts on this need. Yet, in today's competitive global market, an integrated strategy is increasingly necessary. Given the speed with which change occurs in the global business environment, standard planning techniques and asset allocation methods have become woefully outdated. Indeed, achieving new levels of business sophistication is a never-ending process, requiring Acorn to rapidly undertake a strategic organizational transformation to meet changing conditions. To accomplish this transformation effectively, the company needs a system that provides continuous evaluation and improvement, ensuring effective use of both business (hard) and organizational (soft) assets. In particular, what is required is a balance and alignment between customer, organizational and business investment. In today's market, organizations not taking such an approach run the serious risk of failing to meet the expectations of shareholders. The case of Acorn Industries highlights the lack of strong leadership, the need for a transformation in its organizational structure, the need for a balanced scorecard system, the need for a programme manager, and the effective operation and utilization of such a structure.

Among the distinguishing characteristics of companies achieving sustainable shareholder value is that the management in these organizations constantly evaluates the key operational drivers of the business and, in response to changes in the business environment, strategically redeploys the company's resources among those drivers, whether they are in marketing and sales or in some area of production. This process must occur every time the business changes marketing strategies, experiences a merger, acquisition or spin-off, or moves to a new level of sophistication and globalization maturity. The result is a company experiencing an ongoing process of active, bottom line-oriented self-assessment and growth. When a company's organizational and business assets are in alignment, adjustments occur naturally. For this alignment to occur, however, the business must measure its organizational and business assets differently than it did at previous levels of maturity. It also must be able to transform organizational assets rapidly to meet changing conditions.

PROBLEMS AND CAUSES

Many new projects implemented within organizations either partially or fully fail because the intervention does not adequately address the enabling environment within which the organization operates (UNDP, 1993). For example, Acorn tried to keep the commercial and government contracts separate, and they were managed as separate entities based on marketing and the resources from the functional departments. Any effort to diagnose and improve the performance of this organization requires an understanding of the forces inside and outside the organization that can facilitate or inhibit that performance (Savedoff, 1998). Enabling environments support effective and efficient organizations and individuals, and creating such environments is becoming an increasingly important aspect of developing this organization toward one that can operate with a programme-based organizational structure.

The organization's natural resources, human resources, financial resources, infrastructure and technology together form what are called "capabilities." They combine with rules and institutional ethos to create an enabling or inhibiting environment for the organization's growth and development. This point illustrates the overriding influence of rules and, as noted earlier, the interdependence of the various components of an enabling environment. Acorn embarked on launching ambitious programs to develop capabilities but neglected the importance of conducting a thorough institutional analysis. Such an analysis involves mapping the institutional environment in terms of politics, administrative capacity, culture, leadership, organizational structures, etc., in a manner that includes all stakeholders and measures their level of ownership of and commitment to reform.

Acorn had numerous projects underway with no formal project management process in place to manage outcomes successfully. They had not embraced programme management as the discipline to hold people accountable and execute the implementation of strategic change initiatives. Acorn had failed to establish a process to manage all of its projects effectively. Projects emanate from the strategic plan; therefore, to increase project success at the strategic level, a process must be established to select and monitor projects and to ensure projects and resources are in alignment with the strategic plan. For success to occur, synergy is required from all project participants at all levels.

RECOMMENDATIONS

Strategic leadership is associated with the organization's vision, as well as with the ideas and actions that make the organization unique. It is the process of setting clear organizational goals and directing the efforts of staff and other stakeholders toward fulfilling organizational objectives (Mintzberg and Quinn, 1995). In essence, therefore, strategic leadership has to do with the organization's ability to influence its internal and external stakeholders so that they will support organizational directions. Strategic leadership needs to empower its members to create the changes that are necessary for an organization to perform and survive (Byrd, 1987). It goes beyond simple planning, in that it creates ways of clarifying and attaining organizational goals by looking within and outside the organization. It sets the stage for organizational action and the methodologies the organization will use to produce the results required. Thus, an organization's strategic leadership involves developing ways of inspiring organizational members and stakeholders to perform in ways that fulfil the mission, while adapting to or buffering external forces. Strategic leadership consists of three main dimensions: leadership, strategic planning and niche management.

LEADERSHIP
The organizations natural resources, human resources, financial resources, infrastructure and technology together form what is call ââ¬Å"capabilities.â⬠They combine with rules and institutional ethos to create an enabling or inhibiting environment for organizationââ¬â¢s growth and development. This point illustrates the overriding influence of rules and, as noted earlier, the interdependence of the various components of an enabling environment. Acorn embarked on launching ambitious programs to develop capabilities but neglected the importance of conducting a thorough institutional analysis. It involves mapping the institutional environment in terms of politics,à administrative capacity, culture, leadership, organizational structures, etc. in a manner that includes all stakeholders and measures their level of ownership and commitment to reform. Acorn had numerous projects underway with no formal project management process in place to effectively manage successful outcomes. They have not embraced programme management as the discipline to hold people accountable and execute the implementation of strategic change initiatives. Acorn had failed at the process to effectively manage all their projects. Projects emanate from the strategic plan, therefore to increase project success at the strategic level a process must be established to select and monitor projects and ensure projects and resources are in alignment with the strategic plan. For success to occur, synergy is required from all project participants at all levels. RECOMMENDATIONS Strategic leadership is associated with the organizationââ¬â¢s vision, as well as with the ideas and actions that make the organization unique. It is the process of setting clear organizational goals and directing the efforts of staff and other stakeholders toward fulfilling organizational objectives (Mintzberg and Quinn, 1995). In essence, therefore, strategic leadership has to do with the organizationââ¬â¢s ability to influence its internal and external stakeholders so that they will support organizational directions. Strategic leadership needs to empower its members to create the changes that are necessary for an organization to perform and survive (Byrd, 1987). It goes beyond simple planning, in that it creates ways of clarifying and obtaining organizational goals by looking within and outside the organization. It sets the stage for organizational action and the methodologies the organization will use to produce the results required. Thus, an organizationââ¬â¢s strategic leadership involves developing ways of inspiring organizational members and stakeholders to perform in ways that attain the mission, while adapting to or buffering external forces. Strategic leadership consists of three main dimensions: leadership, strategicà planning and niche management: LEADERSHIP _Leadership is basically the process through which leaders influence the attitudes, behaviors and values of others towards organizational goals_ (Vecchio, 1995). Indeed, no one can deny its critical importance to the success of any organization, no matter where the organization is located or what it does. Salopek (1998) outlines four fundamental qualities of leadership, each of which has several specialized and associated competencies. 
These qualities relate to the ability to become and act as the following:

Collaborators, skilled at facilitating, coaching and fostering dialogue;
Innovators, skilled at visioning, championing and diffusing;
Integrators, skilled at organizing, improving and bridging;
Producers, skilled at targeting, improving and measuring.

The need for leadership qualities is not restricted to senior executive managers, but extends to workers at all levels of the organization. Leadership exists in many places inside the organization, both formally and informally. Formal leadership, exercised by those appointed or elected to positions of authority, entails activities such as setting direction, providing symbols of the mission, ensuring that tasks are done, supporting resource development, and modeling the importance of clients.

_STRATEGIC PLANNING_

Strategic planning entails formulating and implementing activities that lead to long-term organizational success. It is essentially a decision-making process that involves a search for answers to simple but critical and fundamental questions: What is the organization doing? How is it doing what it does? Where should it be going in the future? What should it be doing now to get there? Strategic planning encompasses issues spanning the entire spectrum of the organization, from introspective questions of what the organization's personality is or ought to be, to strategic operational issues connecting the focus on the future with the work to be done to move the organization forward. The strategic plan itself is a written document setting out the specific goals, priorities and tactics the organization intends to employ to ensure good performance (Kaplan and Norton, 1996). Thus, strategic planning must typically include a scan of the opportunities, threats and constraints presented by the environment. This means that the organization must repeatedly ask itself: what potential or pending actions are likely to influence (positively or negatively) what it does and plans to do? How can the organization forestall or mitigate the negative influences, as well as take advantage of the potential opportunities?

Another strategic issue for the survival of an organization is the acquisition of resources in the vital areas of funding, technology, infrastructure and personnel. Strategic planning must adequately pursue these resources by anticipating and capitalizing on opportunities in the external environment that might yield or support them. It also means predicting threats to organizational resources and intervening (politically, in general) to ensure that organizational performance and survival are safeguarded (Korey, 1995). This level of leadership and intervention generally transpires between the senior executive of the organization and the organization's directors. Resource acquisition entails constantly being on the lookout to create opportunities that will augment the organization's resources. For strategies to become operational, they need to be communicated, processed and revised according to feedback from stakeholders, both internal and external. All members of the organization need to work toward making the strategic plan a reality, from senior management down to the most junior worker (Mintzberg and Quinn, 1995).

ORGANIZATIONAL STRUCTURE

The ability of an organization to structure and restructure itself to adapt to changing internal and external conditions is important for maximizing organizational performance.
Unlike other capacities, the structuring and transformation of an organization does not formally occur on a constant basis; however, adaptations of structure are always occurring. Organizational structure is defined as the ability of an organization to divide labor and assign roles and responsibilities to individuals and groups in the organization, as well as the process by which the organization attempts to coordinate its labor and groups. It is also concerned with the relative relationships between the divisions of labor: Who has authority over whom? How and why should an organization divide labor individually and by grouping people? How should organizations coordinate their work to maximize the benefits of the division of labor? What do people look for to indicate that problems are structural in nature rather than some other type of problem, such as one of leadership?

OPERATING STRUCTURE

The operating structure of an organization is the system of working relationships arrived at to divide and coordinate the tasks of people and groups working toward a common purpose. Most people visualize an organization's structure in terms of the familiar organizational chart. The task of creating appropriate and manageable work units or departments has challenged managers and students of organizational development for decades. In looking at structure, we are interested in the extent to which individuals, departments or other groupings understand their roles in the organization; whether they have the authority to carry out their roles; and whether they are accountable for their work. Structure also includes coordination issues (Mintzberg and Quinn, 1995). Coordination is the process of linking the specialized activities of individuals or groups so they can and will work toward common ends. The coordination process helps people to work in harmony by providing systems and mechanisms for understanding and communicating about their activities. In organizations where innovation and productivity are key, interdisciplinary project teams are a competitive advantage. Entire networks are formed in which the best minds collectively tackle difficult projects, with each contributor bringing his or her special perspective and expertise. The ease with which the programme office facilitates interdisciplinary approaches to projects is an indicator of organizational health. Many variables influence organizational structure, including history, size, technology, organizational goals, strategy, governance, funding and other pressures from the external environment, and the specific fields of research.

HUMAN RESOURCES

The human resources of any organization are its most valuable assets. In the view of many top-level executives, employees are the key source of an organization's competitive advantage (Brown and Kraft, 1998; Chilton, 1994). Critically important to effective human resource management is developing and instilling core values throughout the organization (Down, Mardis, Connolly and Johnson, 1997). These values include integrity and honesty, commitment to the organizational mission, accountability for and pride in one's work, commitment to excellence, and building trust. They form the basis for developing cohesiveness and teamwork, as well as for developing policies, procedures and programs that focus on meeting the needs of customers or clients.
In the case of Acorn Industries, the human resources management function is charged with planning and controlling human resources to make sure that people's needs are met so they can work to achieve organizational goals. Commitment to meeting employees' needs is not merely an altruistic function; it is highly likely that staff who are reasonably comfortable with working conditions, and stimulated by the environment, will be productive (Miron, Leichtman and Atkins, 1993). From an organizational perspective, control over human resources is critical for holding managers accountable for organizational performance. Nevertheless, progress in this area has been slow.

HUMAN RESOURCES PLANNING

Human resources planning involves forecasting the human resources needs of the organization and planning the steps necessary to meet them. This planning is the first step in any effective human resources management function, and it should be closely linked to the organization's strategic objectives and mission. Even in regions of the world with a plentiful, well-educated workforce, such planning is a challenge because the needs of the organization are constantly changing and sometimes do not converge (Cockerill, Hunt and Schroder, 1995). The challenge is even greater if the pool of people from which the organization recruits is limited by such factors as brain drain, or because labor market wages in the private sector are more attractive (Colvard, 1994). Forecasting in these environments is quite difficult.

PROGRAMME MANAGEMENT

According to Booth (1998), the term "programme management" is used mainly by two groups of professionals in ways that are consistent. The first group, those involved with information systems, employs the term to describe the management of _big projects, especially system implementations._ The second group, corporate strategists, uses it to mean the _practical task of translating grand strategies into operational reality._ In many organizations, individual managers typically pursue their own projects and cite their own successes. In fact, the link between their efforts and organizational performance is generally quite obscure. By coordinating and linking the cascade of corporate goals reflected in diverse projects into specific sets of common-goal actions, programme management helps to avoid this problem. Programme management is regarded as "an additional layer of management sitting above the projects and ensuring that they remain pertinent to the wider organization" (Booth, 1998). In the context of funded organizations in developing countries, organizations often receive financing from different donors or funding agencies for different projects that are not necessarily congruent with organizational goals. In such a situation, there is a clear need for programme management to align different projects with wider organizational goals and coordinate them into common-goal actions.

PROGRAMME PLANNING

Programme planning ranges from working out what to do on a day-by-day basis to long-term strategic planning, and it should be happening constantly within a project and programme. Programme planning must take into account what an organization has to do to create its goods and services, as well as the resources it needs to do so. Programme planning requires thinking ahead and, as such, involves several concurrent questions: Whom are we serving? What demand are we supplying, and at what cost? What are our objectives? What must be done to meet these objectives? Who will do this? How will they do it? How long will it take? How much will it cost? How will we know whether we have met our objectives?
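These questions map naturally onto a simple planning record. The following is a minimal sketch, in Python, of how a programme office might capture the answers so that plans are written down and usable as management tools; all field names and the example values are invented for illustration and are not drawn from Acorn's case or from any source cited here.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class ProjectPlan:
    """A toy planning record answering the programme-planning questions above.
    Every field name and the example below are hypothetical."""
    beneficiaries: str             # Whom are we serving?
    demand: str                    # What demand are we supplying, and at what cost?
    objectives: List[str]          # What are our objectives?
    activities: List[str]          # What must be done to meet these objectives?
    owner: str                     # Who will do this?
    method: str                    # How will they do it?
    deadline: date                 # How long will it take?
    budget: float                  # How much will it cost?
    success_indicators: List[str]  # How will we know whether we have met our objectives?

# Illustrative usage with made-up values.
plan = ProjectPlan(
    beneficiaries="Division X customers",
    demand="On-time delivery of contracted systems",
    objectives=["Deliver phase 1 by year end"],
    activities=["Design review", "Prototype build"],
    owner="Programme office",
    method="Stage-gated reviews",
    deadline=date(2024, 12, 31),
    budget=250_000.0,
    success_indicators=["Phase 1 accepted by the customer"],
)
print(plan.owner, plan.deadline, plan.budget)
```

A record like this is the kind of written plan that the assessment described next looks for.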
Programme planning has many levels and is time bound, so it can be short, medium or long term. However, when conducting an assessment, the extent to which the organization's plans are well communicated and used as management tools must be determined; this requires written plans.

PROGRAMME IMPLEMENTATION

The major task of the leaders of an organization is to put the organization's programme into practice. It is all well and good to have a great plan; making it work is the hard part. Programme implementation requires organization and staff who can put their skills to work. It requires integration of the management skills needed to allocate resources and the technical skills needed to do what has to be done (for example, to run several projects concurrently while sharing resources). Programme implementation is the stage at which an organization integrates all its resources to concretely achieve its goals.

PROGRAMME MONITORING AND EVALUATION

Sound project monitoring and evaluation need to be built into projects during their planning stage and carried out throughout the project (IDB, 1997). For example, an assessment of the evaluability of a programme or project ensures that it contains the basic elements required to monitor results and ultimately determine whether development objectives are being met. The planning function should have a growing array of tools that help project planners develop quality projects. The logical framework can be incorporated into a project both as a planning tool and to provide indicators for monitoring and evaluation (IDB, 1997). Similarly, outcome mapping (Earl, Carden and Smutylo, 2001) is used as a tool to support better planning, monitoring and evaluation.

PROCESS MANAGEMENT

Functional managers in many organizations today view their business as a series of functional silos, each concerned with its own requirements (Dent and Hughes, 1998). This perspective is particularly pervasive among managers accustomed to being rewarded for optimizing the performance of their functions relative to the rest of the organization. Although managers talk about "big picture" processes, their efforts are often focused inwardly on their own requirements and are measured accordingly. In such situations, there is an obvious need for common systems and operations that apply uniformly throughout the organization and, like a thread, sew the various functional parts together into a common purpose. There is also a need for compatible strategies to optimize organizational performance. In other words, process management is required. Taking a vision and making it a reality through smooth-flowing daily work in an organization is largely dependent on ongoing "processes." These are the internal value-adding management systems and operations that cut across functional and departmental boundaries. They are the mechanisms that guide interactions among all groups of people in an organization to ensure that ongoing work is accomplished rather than hindered or blocked. Thus, _process management is the task of aligning and integrating the various practices and cultures of different segments of an organization through the introduction of common systems and operations that apply uniformly to all segments of the organization_.
These common operations or processes include problem-solving, planning, decision-making, communication, and monitoring and evaluation. If the processes are all working, the outcome is an organization that is learning and accomplishing a great deal. Process management takes place at every level of an organization, from the board of directors to the line worker. The board and senior managers must know how to solve problems, plan and make timely decisions; if they are deficient in these areas, organizational direction is often hampered. As in the case of Acorn Industries, programme units, departments and other functional segments of the organization must plan and set short- and medium-term goals, as well as solve problems, make decisions and generate strategies to carry out appropriate activities to achieve results.

VISION AND MISSION

The vision and the mission of an organization emerge from important social, economic, spiritual and political values. They are meant to inspire and promote organizational loyalty. Vision and mission are the parts of an organization that appeal to the heart; that is, they represent the organization's emotional appeal. They motivate people and draw upon staff and stakeholders' hopes and aspirations. In this sense, the vision and mission of an organization provide inspirational motivation. Clarifying the vision and mission is important in private organizations. Private sector organizations often identify the importance of serving their customers, and have created visions and missions to support this theme. At issue for many organizations is not only writing the statements but then living them. When vision and mission statements are not lived up to, the result is not enhanced motivation but cynicism. Assessing an organization's motivation primarily involves looking at its mission, since this is more closely linked to what the organization wants to do. However, in examining the mission, the link to the larger vision, as well as to more operational components, must also be assessed.

DEFINITION

_An organization's vision defines the kind of world to which it wants to contribute._ Visions lie beyond the scope of any one organization. They represent the hopes and dreams of organizational members. The vision describes the changes in the prevailing economic, political, social or environmental situation that the programme hopes to bring about. Missions, on the other hand, are a step toward bringing the operational aspects into the vision; they are an organization's raison d'être. _The mission is an expression of how people see the organization operating._ In this context, the mission lays a foundation for future action (Bart, 1997) and guides the organization's choice of strategies and activities. Some of the main reasons for an organization to have a vision and mission expressed in clear statements are to: promote clarity of purpose; function as a foundation for making decisions; gain commitment to goals; and foster understanding and support for those goals. Whereas the vision locates the organization within a cluster of organizations, it is the mission that answers the questions: Why does this organization exist? Whom does it serve? By what means does it serve them? Those responsible for the performance of an organization increasingly recognize the benefits of clearly and simply communicating the direction in which their organization is going.
Such descriptions of the organization's future, whom it serves, what it values, and how it defines success can have a powerful impact on the organization's personality.

ASSESSING THE MISSION

Those seeking to diagnose and analyze the mission of an organization often find themselves dealing with multiple realities: those that are written down, and those that are perceived by organization members. One task in an organizational assessment is to determine the degree to which the formal mission statement is understood and internalized by members and stakeholders of the organization; that is, to measure the congruence of the perceived and stated missions.

CULTURE

While the mission statement formally articulates organizational purpose, it is the organization's culture that gives life to the organization and helps make the realization of its mission possible. The concept of organizational culture has been the focus of much attention, with analysts associating it with superior corporate performance (Peters and Waterman, 1988), increased productivity (Ouchi, 1981), improved morale, and high rates of return on investment. _Organizational culture is the collectively accepted meaning that manifests itself in the formal and informal rules of an organization or a sub-group._ The culture embodies the collective symbols, myths, visions and heroes of the organization's past and present. For instance, culture finds expression in the collective pride (and even embellishment) of the accomplishments of individuals. Values important to the organization are illustrated through stories about past successes and failures; these form a living history that guides managers and drives members' behavior.

DIMENSIONS

Diagnosing organizational culture helps you understand the relative levels of consistency or inconsistency of "meaning" that exist in an organization. In some ways, culture is like an iceberg; it has both seen and unseen aspects. From an anthropological perspective, culture has material and non-material dimensions. Culture has physical artifacts, such as mission statements and policy guides, as well as basic beliefs that direct the thinking, feelings, perceptions and behaviors of the people in the culture. To know why some people are in trouble, are rejected or punished, or are not appreciated by an organization, you need to know the belief system and norms that underlie the organization's behavior. In this context, four dimensions of organizational culture can be identified: artifacts, perspectives, values, and assumptions (Bloor and Dawson, 1994). _Artifacts_ are the most tangible aspects of an organization's culture: the type of office, the logo, dress, rituals (such as Christmas parties), stories, language and so forth. Artifacts are the physical manifestations of the organization's culture. _Perspectives_ are the ideas that people hold and use to act appropriately. For example, a perspective includes how the organization handles customer complaints or, for that matter, employee complaints. In some organizations, people go to great lengths to help customers obtain the products and services they say they need; in others, customers are ignored. _Values_ relate to the ideals held by the organization, including concepts of standards, honesty, quality and integrity. Underlying or basic _assumptions_ are the "taken for granted" beliefs of an organization.
This refers to what members of the organization feel is appropriate behavior for themselves and others. Since assumptions are considered a given, they are rarely if ever questioned. The set of tacit assumptions helps form the uniqueness of the organizational culture (Denison, 1996).

BALANCED SCORECARDS

The Balanced Scorecard (BSC) is a popular tool for the implementation of strategy (Kaplan and Norton, 1996a). As the founders of the concept, Kaplan and Norton promote it primarily as a tool that can aid in the implementation of strategy. They argue that the main causes of poor strategy implementation are: visions and strategies that are not actionable; strategies that are not linked to departmental, team and individual goals; strategies that are not linked to resource allocation; and feedback that is tactical rather than strategic. The name reflects the need for balance between short- and long-term goals, between financial and non-financial measures, between lag and lead indicators, and between internal and external perspectives (Kaplan and Norton, 1996a). The authors argue that _"what you measure is what you get"_: measurements shape behavior. To accomplish a strategic effect, the organization must measure what is strategically important, and this is what the Balanced Scorecard concept is designed to achieve. Hence, the concept is not a control tool but a strategic tool that helps managers look ahead. In addition, the BSC shows how results are achieved, not only that they are achieved. With its four dimensions, the financial perspective, the internal business perspective, the customer perspective, and the innovation and learning perspective, the BSC combines a number of flows that are going on in the organization. By understanding the organization in this context, the manager can learn what connections exist between the different perspectives. This common picture of the four dimensions is one of the contributions of the BSC concept.

ARCHITECTURE OF THE BSC

Kaplan and Norton (2001a) describe the building of a BSC as a process of defining a set of near-term objectives and activities (the drivers) that will differentiate a company from its competitors and create long-term customer and shareholder value (the outcomes). The process begins in a top-down fashion, clearly defining strategy from the perspective of the shareholders and the customers. In other words, the scorecard is supposed to define the short-term goals and activities, the strategic drivers, that will differentiate the organization from its competitors and create long-term value for customers and owners. The financial goals for growth and productivity are the most important, and the causes of growth are to be defined. Once the financial goals are defined, we must ask: _"Who are the target customers that will generate revenue growth and more profitable products and services? What are their objectives, and how do we measure success with them?"_ The customer perspective should also include a value proposition that defines how the company differentiates itself to attract, retain and deepen relationships with targeted customers. The measurements defined in the customer and financial perspectives should not describe explicitly how the goals will be achieved internally. It is the internal processes, such as product design, marketing, development, sales, service and production, that define the activities necessary to achieve the goals set in the customer and financial perspectives.
The fourth perspective, learning and growth, should create the pressure to execute internal business processes in new and differentiated ways, based on the organization's infrastructure: the skills, capabilities and knowledge of employees, the technology they use, and the climate in which they work; in other words, what Kaplan and Norton (2001a) refer to as the learning and growth factors.

IMPLEMENTATION OF THE BSC

Kaplan and Norton suggest implementing the BSC to overcome the strategy implementation problems noted above: visions and strategies that are not actionable, strategies that are not linked to resource allocation, and feedback that is tactical rather than strategic. However, in the Balanced Scorecard literature there is no common theory or model for implementation. Some practitioners use more perspectives than Kaplan and Norton's initial four, others fewer; for example, some have added a human focus or an environmental focus. Kaplan and Norton do not include a separate human focus, as they believe the human element is contained in all of their focus areas. This variety may be a result of the stepwise development of the BSC. The first concrete model for building a BSC is presented by Kaplan and Norton (1993), where they use a systematic eight-step model to create a BSC that links the measurements to the strategy. In the article _"Using the Balanced Scorecard as a Strategic Management System"_ (Kaplan and Norton, 1996b), the development of the BSC is extended from the eight-step to a ten-step model. According to the authors, after the tenth step the BSC has become a routine part of the strategic management system. Communication within the organization follows the different units in the business plan and lies in line with the BSC. Through follow-up of the BSC, learning in the organization is enabled through performance and deviation assessments. However, Kaplan and Norton (1996a) acknowledge that this might not be as easy as it looks, which is probably an understatement: they show failures in several cases with structural and organizational problems. The stepwise development by Kaplan and Norton is also influenced by other research findings, and this applies to the implementation of the BSC system as well. Kaplan and Norton start out with an eight-step implementation model, while their 1996b article presents a ten-step model; across all these models, a common theory for building and implementing the BSC is missing. Despite this observation, Kaplan and Norton have developed principles for how to become a successful strategy-focused organization, although these principles do not tell _how_, but rather _what_, matters in implementing strategy successfully. In Kaplan and Norton (2001c), the authors show how organizations use their scorecards to align key management processes and systems to the strategy. Although each organization achieved strategic alignment and focus in different ways, at different paces and in different sequences, each eventually used a common set of principles, which Kaplan and Norton call the principles of the strategy-focused organization. The five principles are: 1. Translate the strategy into operational terms. 2. Align the organization to the strategy. 3. Make strategy everyone's everyday job. 4. Make strategy a continual process. 5. Mobilize leadership for change.

When Kaplan and Norton (2001c, 2001a) discuss the first principle, _"TRANSLATE THE STRATEGY INTO OPERATIONAL TERMS"_, they mean that the scorecard creates a common and understandable frame of reference for all organizational units and employees, through the translation of strategy into the logical architecture of a strategy map and a Balanced Scorecard that specify the details of the critical elements of the growth strategy. The second principle, _"ALIGN THE ORGANIZATION TO THE STRATEGY"_ (Kaplan and Norton, 2001c, 2001a), is about making organizational performance more than the sum of its parts: the parts must be linked and integrated. The Balanced Scorecard defines what is expected to create synergy and ensures that the linkage actually occurs, preventing the strategies of different units from pulling in opposite directions. As many organizations have difficulty communicating and coordinating across different functions, suboptimal behavior may become a major barrier to strategy implementation. The third principle, _"MAKE STRATEGY EVERYONE'S EVERYDAY JOB"_, means that the BSC should be used to communicate and educate the organization about the strategy. Scepticism towards unlimited communication to the entire organization, with its risk of leaking valuable information to competitors, is answered: _"Knowing the strategy will do little good unless they execute it. On the other hand, we have no chance to execute it if people don't know about it."_ This is also in line with Kotter (1996), who argues that real power first occurs when those involved in an enterprise or activity have a common understanding of goals and directions; it is not top-down direction, but rather top-down communication. When Kaplan and Norton (2001a, 2001c) discuss _"MAKE STRATEGY A CONTINUAL PROCESS"_, they claim that the BSC introduces a new "double-loop process" to manage strategy, integrating the management of tactics with the management of strategy through three important processes. First, organizations link strategy to the budget process, using the BSC as a screen to evaluate potential investments and initiatives; just as the BSC attempts to protect long-term objectives from short-term sub-optimization, the budget process must protect long-term initiatives from the pressure to deliver short-term financial performance. Second, strategy is made a continual process by introducing a simple management meeting to review strategy, with information feedback systems changed to support these new meetings. Finally, a process for learning emerges and the strategy evolves: the initial BSC represents a hypothesis about the strategy, the best estimate at the time of formulation of the actions expected to create long-term financial success. The design process of the scorecard makes the cause-and-effect linkages of the strategic hypothesis explicit, and as the scorecard is put into action and the feedback system starts reporting actual results, the organization can test the hypotheses of its strategy. With the fifth principle, _"MOBILIZE LEADERSHIP FOR CHANGE"_, also called _"mobilize change through leadership"_ (Kaplan and Norton, 2001a), the authors note that the first four principles focus on the BSC tool, the framework and the process that supports it, and they argue that active involvement of the executive is the single most important condition for success. A minimal sketch of the scorecard structure these principles act on follows.
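To make the scorecard mechanics concrete, here is a small Python sketch of a scorecard as a data structure, with a toy screening rule in the spirit of the fourth principle. Only the four perspective names come from Kaplan and Norton's framework as described above; every objective, measure, target and the screening rule itself are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class Measure:
    """One scorecard measure; 'lead' measures are drivers, 'lag' measures are outcomes."""
    name: str
    target: float
    actual: float
    kind: str  # "lead" or "lag"

    def on_track(self) -> bool:
        # Toy rule: all example measures are higher-is-better.
        return self.actual >= self.target

@dataclass
class Perspective:
    name: str
    objective: str  # hypothetical objective, for illustration only
    measures: List[Measure] = field(default_factory=list)

def build_scorecard() -> Dict[str, Perspective]:
    """The four perspectives are Kaplan and Norton's; everything else is made up."""
    return {
        "financial": Perspective("Financial", "Grow revenue", [
            Measure("revenue_growth_pct", target=10.0, actual=6.5, kind="lag")]),
        "customer": Perspective("Customer", "Retain target customers", [
            Measure("retention_rate_pct", target=90.0, actual=92.0, kind="lag")]),
        "internal": Perspective("Internal business", "Improve on-time delivery", [
            Measure("on_time_delivery_pct", target=95.0, actual=88.0, kind="lead")]),
        "learning": Perspective("Innovation and learning", "Build project skills", [
            Measure("staff_trained_pct", target=80.0, actual=75.0, kind="lead")]),
    }

def screen_initiative(addresses: Set[str], scorecard: Dict[str, Perspective]) -> bool:
    """Use the scorecard as a screen (fourth principle): favor initiatives that
    address at least one perspective whose measures are currently off track."""
    gaps = {key for key, p in scorecard.items()
            if any(not m.on_track() for m in p.measures)}
    return bool(gaps & addresses)

if __name__ == "__main__":
    bsc = build_scorecard()
    print(screen_initiative({"internal"}, bsc))  # True: on-time delivery is below target
    print(screen_initiative({"customer"}, bsc))  # False: retention is on track
```

Even this toy version shows the balance the name refers to: lag outcomes (financial, customer) sit alongside lead drivers (internal process, learning and growth), and a proposed initiative is judged against the whole set rather than against financial results alone.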
Returning to the fifth principle: if top management are not active leaders of the process, change will not occur, strategy is not implemented, and the opportunity for breakthrough performance is lost. Over time, a new management system evolves: a strategic management system that institutionalizes the new cultural values and processes. This is also in line with Kotter (1996), where he describes how transformational change occurs. By linking traditional processes such as compensation and resource allocation to a BSC that describes the strategy, organizations create a _strategic management system_. Furthermore, the authors claim that strategy must be a continual process that reflects shifts in opportunities and threats; it is important that the integration of the new strategy into the organization does not create a barrier to future progress.

CONCLUSION

The relationship between organization and innovation is complex, dynamic and multilevel, and the existing literature is voluminous and diverse. To assess what Acorn needs to be a successful organization, I examined organizational structure, human resources, programme management, process management, vision and mission, organizational culture and the Balanced Scorecard. These are the different aspects of the relationships that form a coherent conceptual framework for understanding the phenomenon of 'organizational innovation'. Executive management needs to engage organizational functions in programme execution in order to obtain information, evaluate progress and learn from failures regarding strategic change initiatives; if it does not, these initiatives, like so many projects, will fail. Committed leadership is required to provide the right environment for people to succeed when implementing change initiatives. Projects are essential to the growth and survival of their entities because, when executed successfully, they help deal with changes in the environment, fiscal conditions and citizens' needs. Directors must be held accountable for managing change, and the best way to manage change is to employ a project management methodology that enables the organization to manage strategic project initiatives as a portfolio of budget investments and prioritize them in accordance with their importance to the organization's strategy. Acorn Industries needs to focus on making change happen to improve its organizational performance, and programme management is its ticket to that success: it will enable Acorn to implement strategic initiatives more quickly and keep moving forward. Organizations that want to be successful need to establish an integrated programme management process in order to execute strategic initiatives and enhance organizational and individual project management capability.

BIBLIOGRAPHY

Bart, C. 1997. Sex, Lies and Mission Statements. _Business Horizons_ (November/December): 9-18.

Bloor, G., and P. Dawson. 1994. Understanding Professional Culture in Organizational Context. _Organization Studies_ 15(2): 275-95.

Booth, R. 1996. Accountants Do It by Proxy. _Management Accounting (London)_ 74(5): 48.

Booth, R. 1998. Program Management: Measures for Program Action. _Management Accounting (London)_ 76(7): 26-28.

Brown, S. J., and R. J. Kraft. 1998. A Strategy for the Emerging HR Role. _Human Resources Professional_ 11(2): 28-32.

Brudney, J., and S. Condrey. 1993. Pay for Performance: Explaining the Differences in Managerial Motivation. _Public Productivity and Management Review_ 17(2): 129-44.
Byrd, R. E. 1987. Corporate Leadership Skills: A Synthesis. _Organizational Dynamics_ 16(1): 34-43.

Cockerill, T., J. Hunt, and H. Schroder. 1995. Managerial Competencies: Fact or Fiction? _Business Strategy Review_ 6: 1-12.

Colvard, J. E. 1994. In Defense of Middle Management. _Government Executive_ 26(5): 57-58.

Denison, D. 1996. What Is the Difference between Organizational Culture and Organizational Climate? _Academy of Management Review_ 21(3): 619-54.

Down, J. W., W. Mardis, T. R. Connolly, and S. Johnson. 1997. A Strategic Model Emerges. _HR Focus_ 74(6): 22-23.

Earl, S., F. Carden, and T. Smutylo. 2001. _Outcome Mapping: Building Learning and Reflection into Development Programs._ Ottawa: International Development Research Centre.

Inter-American Development Bank. 1997. _Evaluation: A Management Tool for Improving Project Performance._ Washington, D.C.: IDB.

Kaplan, R. S., and D. P. Norton. 1996. Using the Balanced Scorecard as a Strategic Management System. _Harvard Business Review_ 74(1): 75-85.

Kaplan, R. S., and D. P. Norton. 2001b. Transforming the Balanced Scorecard from Performance Measurement to Strategic Management: Part I. _Accounting Horizons_ 15(1): 87-105.

Kaplan, R. S., and D. P. Norton. 2001c. Transforming the Balanced Scorecard from Performance Measurement to Strategic Management: Part II. _Accounting Horizons_ 15(2): 147-161.

Korey, G. 1995. TDM Grid: An Effective Tool for Implementing Strategic Plans in Academic Institutions. _Management Decision_ 33(2): 40-47.

Mintzberg, H., and J. B. Quinn. 1995. _The Strategy Process: Concepts, Context and Cases._ New York: Prentice Hall.

Miron, D., S. Leichtman, and A. Atkins. 1993. Reengineering Human Resource Processes. _Human Resources Professional_ 6(1): 19-23.

Ouchi, W. 1981. _Theory Z: How American Business Can Meet the Japanese Challenge._ Reading, MA: Addison-Wesley.

Peters, T. J., and R. H. J. Waterman. 1982. _In Search of Excellence._ New York: Warner Books.

Peters, T. J., and R. H. J. Waterman. 1988. _In Search of Excellence: Lessons from America's Best Run Companies._ New York: Warner Books.

Salopek, J. 1998. The New Managerial Mentor: Becoming a Learning Leader to Build Communities of Purpose. _Training and Development_ 52(12): 61.

Savedoff, W. D. (ed.). 1998. _Organization Matters: Agency Problems in Health and Education in Latin America._ Washington, D.C.: Inter-American Development Bank.

Vecchio, R. P. 1995. _Organizational Behaviour._ Orlando, FL: Harcourt Brace and Co.
Thursday, August 1, 2019
Immigration and Islam: Netherlands and France
Immigration and Islam in France and the Netherlands

After the post-war (WWII) era, Europe faced a shortage of labor at the same time as it had to rebuild its infrastructure and economy. France and the Netherlands both faced this problem and, like their counterparts in Europe, found the answer in guest workers. These guest workers were immigrants from former colonies and other developing countries. However, the guest workers later settled down and brought their families, which led to a larger influx of immigration. The largest, most significant, and most controversial group has been Muslim immigrants. This study will focus on the different approaches to integration that France and the Netherlands have implemented, the growing discrimination against Muslim immigrants, and the role Islam plays in this dilemma. France had a long colonial history in the Maghreb in North Africa, mainly Algeria. To fill its labor gap, many male immigrants flocked to France in need of work. There was also large immigration from the Mediterranean, in this case Turkey. The largest share of French immigrants has been Algerians and others from the Maghreb. The Netherlands, similar to France, had immigrants from the Mediterranean, the Maghreb, and former colonies (Surinam and the Antilles), the largest groups being Turkish and Moroccan immigrants. These immigrants became a large factor in the rebuilding of the economy, but as the economy slowed, immigration became more of a problem for Western European countries. France proposed an assimilation model, in which it endorsed pluralism only in the private sphere. Laïcité, the separation of Church and State and of private and public, played a large role in the French system of assimilation. In the public sphere you were expected to be French in language and ideals; the private sphere was left for your own beliefs and customs. This can be seen in the treatment of religious symbols in public schools, where wearing the veil is not allowed because it threatens the division of public and private. However, this still does not resolve the dilemma that many second- and third-generation French citizens from immigrant backgrounds face. Even though they are "French," they are not accepted by the system, supposedly because they still are not "French" enough. This creates disparity on both sides: the French politicize the dilemma by taking a harder stance on immigration and assimilation, giving rise to far-right parties like the National Front under Le Pen (GS, page 123), while the immigrants unfortunately at times resort to violent riots in protest and anger at the discrimination they face. The end result is stricter immigration regulation, more assimilation, and the perception of Islam as incompatible with European standards. The Netherlands has reached the same end result but from a totally different background, having endorsed multicultural integration from the beginning (Coenders, M., Lubbers, M., Scheepers, P., & Verkuyten, M.; hereafter C.L.S.V.). The Netherlands has been one of the foremost countries in democracy and liberalism, ranking third in the world, and its capital, Amsterdam, is a hub of the liberal and free lifestyle; since the early 2000s, homosexual marriage and euthanasia have been legal (GS 192). Compared to France's full assimilation, the Netherlands put forward a multiculturalist approach; however, this has changed drastically in the last decade, especially with regard to Muslim immigrants.
In the early 2000s Pim Fortuyn, the leader of an anti-immigration and pro-assimilation party (Liveable Netherlands and later List Pim Fortuyn), voiced his opinion that Islam was a backwards religion and a threat to liberal European and Dutch ideals. Even though the two countries had different policies of integration, they both ended up in the same position against Muslim immigration. How can these phenomena be explained in democracies that embrace liberal ideals? It was France that, in 1789, coined the motto "All men are born free and equal," but now it can be seen that some men are born more free and equal than others. Realistic conflict theory explains this situation as a reaction to material scarcity in jobs and housing. In the post-war era there was a surplus of jobs and a need for cheap labor, and immigrants rushed in and filled these positions. After the economic boom slowed, however, employment became scarcer. This led to higher unemployment rates, and native citizens started to see immigrants as a threat, leading to discrimination and pro-assimilation sentiment. Though this explains a significant factor, there is still the growth of anti-Islamic sentiment to account for. Muslims are not the only immigrants in France or the Netherlands, but they are the ones who bear the brunt of the attention (C.L.S.V.), so realistic conflict theory falls short of explaining this. _More than Two Decades of Changing Ethnic Attitudes in the Netherlands_, a study done to explain Dutch attitudes toward immigration, found that social and ideological factors affect people's outlook as much as material ones (C.L.S.V.). This helps explain far-right parties such as List Pim Fortuyn and Le Pen's National Front, parties that use anti-Islamic and anti-immigration sentiment as political platforms and play on people's fears. Yet how is it that these fears can grow and flourish in such liberal and democratic societies, the "heralds of democracy"? People fear what they do not understand. Islam is this "other," and the media and politicians play on it. The Muslim immigrant populations do not help either, because they themselves are in a transition phase: they are trying to find a way to live with an Islamic background and Western ideals. Some see total assimilation as the answer, others find a compromise, and yet others turn to radicalism. This struggle between the "West" and Islam has been going on since the mid-19th century. Some essentialists, like Huntington and Fukuyama, see this as the next power struggle for the "West" after the fall of the Soviet Union, "the Clash of Civilizations." According to some, 9/11 and other terrorist acts prove this theory; however, even though there are radicals, they are in the minority. The majority of Muslims do not have a problem with the "West"; most are even pro-Western and support democracy and liberal views. It may not look the same as in Europe or America, but they are trying to find the middle ground and negotiate between the two. This is no different for the immigrants in France or the Netherlands. Ahmet Yukleyen, in his study of social movements in the Netherlands, has focused on Turkish immigrants and the role religious movements have played in their lives. His studies show that there is not one Islamic front in Europe, nor a "Euro-Islam" as some have advocated. Even though the Islamic community is one ummah, its members interpret and practice Islam in slightly different ways.
The fundamental tenets are the same, but Islam is flexible according to time and place, as can be seen from a historic or even contemporary perspective. "Euro-Islam" was supposed to be a liberal Islam for European standards, an Islam with laïcité: a secular Islam. This view has been supported by pro-assimilationists, as in France. However, it did not turn out that way, because it would have compromised too much of Islam; it would no longer be "Islam." What happened instead, as in the Netherlands, was that people joined different social and religious movements and institutions. This was truer for second- and third-generation Turks, who felt the need for religion more than for Turkish nationalism in their lives. They saw themselves as Dutch, liberal and democratic in their views, but still Muslim. Yukleyen names a few organizations, like Milli Gorus, the Gulen Movement, and the Suleymanli. Each movement represents a different set of ideals, but each represents a facet of Islamic life in Europe. It also shows that Muslims can negotiate between European and Islamic ideals, finding a niche in their society. Moreover, through dialogue and inter-faith organizations, an atmosphere of tolerance and multiculturalism can flourish. Perhaps dialogue and negotiation are the answer to the dilemma facing Europe and the Muslim immigrants: the inability to understand one another.

Work Cited

Coenders, M., Lubbers, M., Scheepers, P., & Verkuyten, M. (2008). More than Two Decades of Changing Ethnic Attitudes in the Netherlands. _Journal of Social Issues_, 64(2), 269-285. doi:10.1111/j.1540-4560.2008.00561.x

Maillard, Dominique (2005). The Muslims in France and the French Model of Integration. _Mediterranean Quarterly_.

Yukleyen, A. (2009). Localizing Islam in Europe: Religious Activism among Turkish Islamic Organizations in the Netherlands. _Journal of Muslim Minority Affairs_, 29(3), 291-309. doi:10.1080/13602000903166556

E. Gene Frankland. (2009). _Global Studies: Europe_. McGraw-Hill Companies.