Tuesday, January 28, 2020

Contemporary Theories of Reasoning: An Analysis

Computational and algorithmic challenges to contemporary theories of reasoning

Kattja Madrell

Reasoning is the process of using given information to draw valid conclusions and produce new information (Goel & Dolan, 2003) based on a combination of beliefs and a language of thought (Fodor, 2001). The language of thought hypothesis proposed by Fodor (2001) states that thought and thinking occur in a mental language; mental representations in reasoning are like sentences, which is why the language of thought is sometimes also known as Mentalese (Murat, 2010). Fodor (2001) admitted, however, that the language of thought alone could not be used to explain reasoning; instead, a combination of the language of thought and a person's beliefs is now accepted as the basis of human reasoning. Evans, Barston and Pollard (1983) found that a person's beliefs about the conclusion of an argument influenced whether or not they deemed that conclusion to be valid; the truth value of a conclusion was based upon its logical relationship to a belief (Goel & Dolan, 2003).

Marr's Levels of Analysis (1982) is a tri-level hypothesis that provides us with a critical framework to analyse and evaluate models of psychology thoroughly and consistently. There are three different levels: the computational level, the algorithmic level and the implementational level. In the field of cognitive psychology these levels have also been referred to as the semantic, the syntactic and the physical (Pylyshyn, 1984). Marr (1982) describes the three levels of analysis as follows:

"1. Computational theory: What is the goal of the computation, why is it appropriate, and what is the logic of the strategy by which it can be carried out?
2. Representation and algorithm: How can this computational theory be implemented? In particular, what is the representation for the input and output, and what is the algorithm for the transformation?
3. Hardware implementation: How can the representation and algorithm be realised physically?"

In other words, the computational level of analysis is concerned with what the model or system in question does and why it does so. The algorithmic level builds upon this and analyses the way in which the system performs its computation, whilst the implementational level is concerned with the way in which the system is physically implemented. Each level is a realisation of the level before it, providing a more complete explanation of the system than its predecessor. This allows for the preservation of many of the properties of inter-level relationships in complex systems (McClamrock, 1991). This essay will discuss some of the critical issues and challenges facing various contemporary theories of reasoning, using Marr's levels of analysis.

Monotonic reasoning is based upon a series of logical rules. These rules are strict and rigid and cannot be altered by the addition of new information; instead, new information leads to the production of new beliefs (Brachman & Levesque, 2004). In the absence of justifications that would make a rule non-monotonic, we use monotonic reasoning as a default (Lakemeyer & Nebel, 1994). For example:

A bass guitar (A) has four strings (B): A = B
James's instrument (C) is a bass guitar (A): A = C
Therefore James's instrument (C) has four strings (B): C = B

This is an example of monotonic reasoning; the rules are consistent, based on logic, and do not appear to be problematic. But what happens when we learn that James's bass guitar actually has five strings?
Reasoning monotonically forces us to learn a new rule (A = ¬B) that contradicts a rule that is already known to be true (A = B). The principle of contradiction proposes that statements which contradict each other – such as "a bass guitar has four strings" and "a bass guitar does not have four strings" – are mutually exclusive and cannot both be true in the same sense at the same time (Whitehead & Russell, 1912). Monotonic reasoning displays a computational crisis when faced with logically contradicting information; as the rules cannot be manipulated or altered, the goal of the reasoning cannot be achieved. Because we gain new information about various things on a regular basis, it is inappropriate to reason monotonically, as in classical logic (Isaac, Szymanik & Verbrugge, 2013), since we would not be able to incorporate any new information into our established beliefs. It stands to reason that the only appropriate time to rely on monotonic reasoning is when one has complete knowledge; this, however, is still risky, as one may believe that one has complete knowledge of a situation simply because one is not aware of any reason or evidence to suspect otherwise, demonstrating a false belief in what is known as the closed world assumption, an example of non-monotonic reasoning (Etherington, 1986).

Non-monotonic reasoning is computationally more complex than monotonic reasoning, with its main forms all sharing the same level of complexity (Eiter & Gottlob, 1992). This is because the system is malleable and based on various different connections being made. Unlike in monotonic reasoning, the addition of new information that may contradict beliefs already held can alter what is already known; this occurs in two main ways: belief revision and belief update. Belief revision is the addition of new information into a set of old beliefs without any logical contradictions or inconsistencies, preserving as much information as possible. Belief update is the changing (or 'updating') of old beliefs to take account of any differences (Gärdenfors, 2003). Non-monotonic reasoning leads to common-sense conclusions being drawn that are based upon the combination of supporting evidence and the lack of contradictory evidence; monotonic reasoning encounters problems here because the beliefs being reasoned about do not consider the absence of knowledge (Etherington, 1986). Non-monotonic reasoning shows a level of tautology that is not present in its monotonic counterpart; as beliefs are revised or updated to incorporate new information, they become harder to negate.

Take the previous example:

A bass guitar (A) has four strings (B): A = B
James's instrument (C) is a bass guitar (A): A = C
Therefore James's instrument (C) has four strings (B): C = B

We now know that the bass guitar in question has five strings. Using non-monotonic reasoning we can now amend our initial belief that a bass guitar has four strings so that it reads:

A bass guitar (A) usually has four strings (B) unless it does not have four strings (¬B): A = B unless A = ¬B

This example demonstrates a common display of default reasoning (Reiter, 1980): statistically, most A's are B's, so it is acceptable to make a general assumption based on the statistical majority. As well as making general assumptions, default reasoning is also based upon conventional and persistent assumptions, along with a lack of contradictory information (Brachman & Levesque, 2004).
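The contrast between the two styles of reasoning can be sketched in a few lines of Python. This is a hypothetical illustration of the bass-guitar example, not part of the original essay: a strictly monotonic rule base cannot absorb the contradicting fact, whereas a default rule ("bass guitars usually have four strings") simply gives way to the exception.

```python
# Hypothetical sketch: monotonic facts vs. a default rule for the bass-guitar example.

class MonotonicKB:
    """Facts, once added, can never be retracted or overridden."""
    def __init__(self):
        self.facts = set()

    def add(self, fact):
        # The negation of ("p", ...) is ("not", "p", ...) and vice versa.
        negation = ("not",) + fact if fact[0] != "not" else fact[1:]
        if negation in self.facts:
            raise ValueError(f"Contradiction: {fact} conflicts with {negation}")
        self.facts.add(fact)


def default_string_count(instrument_facts):
    """Default reasoning: assume four strings unless told otherwise."""
    if ("has_strings", 5) in instrument_facts:
        return 5   # an explicit exception overrides the default
    return 4       # the default conclusion, drawn in the absence of contrary evidence


# Monotonic reasoning breaks down when the contradicting fact arrives.
kb = MonotonicKB()
kb.add(("bass_guitar", "has_four_strings"))
try:
    kb.add(("not", "bass_guitar", "has_four_strings"))   # James's five-string bass
except ValueError as err:
    print("Monotonic KB:", err)

# Non-monotonic (default) reasoning revises the conclusion instead.
print("Before learning:", default_string_count(set()))                  # 4
print("After learning:", default_string_count({("has_strings", 5)}))    # 5
```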
Various rules of inference in non-monotonic reasoning have been proposed and explored, including circumscription (McCarthy, 1980) and negation as failure (Clark, 1978). The closed world assumption is a form of non-monotonic reasoning based on the assumption of complete knowledge. Proposed by Reiter in 1978, the closed world assumption is described as follows: "If we assume all relevant positive information is known, anything which is not known to be true must be false. Negative facts may simply be inferred from absence of positive counterparts" (Reiter, 1978). To put it in other terms, if P is not provable from the available knowledge base then we must assume not P (¬P) (Etherington, 1986). This assumption has one major flaw: should a person not be in possession of all the relevant information, then the assumption can no longer apply. Only when there is complete and expert knowledge of the matter being reasoned about is it truly appropriate to employ the closed world assumption.

In order to prevent unwanted inferences in non-monotonic logic, such as a false belief in the closed world assumption, it is necessary to retract any assumption of complete knowledge; this leads to the use of implicit general assumptions (Brachman & Levesque, 2004). If any newly learned information contradicts these general assumptions, adjustments are made (Etherington, 1986) and beliefs are updated or revised (Gärdenfors, 2003). The general assumptions made when reasoning non-monotonically are based upon normalcy obtained from knowledge and experience; we may assume that James's bass guitar has four strings because bass guitars normally do. But what statistical probability can be assigned to an assumption to label it as 'normal', and what situational factors determine which assumptions can be made? When does a situation make it appropriate to assume? The complexity of the ever-changing algorithms behind non-monotonic reasoning leads to different results being produced, for example due to slight changes in situation, individual differences and varying information.

Default reasoning is arguably one of the most popular forms of non-monotonic reasoning (Reiter, 1978). Based on the principles of default logic (see Nebel, 1991; Goldszmidt & Pearl, 1996), default reasoning demonstrates a serious computational crisis known as the specificity principle. The specificity principle states that, when faced with a logical conflict, people more commonly make assumptions based upon more specific defaults than general ones (Brachman & Levesque, 2004); this can lead to stronger conclusions and, although these conclusions are at times correct, the assumption itself that more specific defaults should be preferred is logically lacking (Brewka, 1994). In order to "make up" for this problem of specificity, one would have to overtly assign the appropriate priority levels to the defaults with regard to the situation in question.
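The preference for more specific defaults can be illustrated with the classic "Tweety" example from the default-logic literature; the following Python sketch is a hypothetical illustration, not taken from the essay, in which the class hierarchy itself supplies the priority ordering between conflicting defaults.

```python
# Hypothetical sketch of the specificity principle: when two defaults conflict,
# the one whose precondition is more specific is preferred.

# Class hierarchy: a penguin is a (more specific) kind of bird.
SUBCLASS = {"penguin": "bird"}

def ancestors(cls):
    """Return cls and all of its superclasses, most specific first."""
    chain = [cls]
    while cls in SUBCLASS:
        cls = SUBCLASS[cls]
        chain.append(cls)
    return chain

# Conflicting defaults, keyed by the class they talk about.
DEFAULTS = {
    "bird": ("flies", True),      # birds normally fly
    "penguin": ("flies", False),  # penguins normally do not
}

def conclude(individual_class):
    """Apply the most specific default that matches the individual's class."""
    for cls in ancestors(individual_class):   # most specific class is checked first
        if cls in DEFAULTS:
            return DEFAULTS[cls]
    return None

print(conclude("bird"))     # ('flies', True)  -- only the general default applies
print(conclude("penguin"))  # ('flies', False) -- the more specific default wins
```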
According to the principle of contradiction proposed by Whitehead & Russell in 1912, when faced with a logical contradiction, a logical person should be able to disregard the restrictions of their system of reasoning to arrive at a logical conclusion. This, however, is not the case. In fact, much literature to date has shown human beings to behave in an illogical manner, demonstrating various logical fallacies that people reason with when using argumentation to negotiate life in a complex world (Hahn & Oaksford, 2013). A few examples of this are ad hominem, ad Hitlerum and the slippery slope argument.

When the character of an individual is attacked, it is suggested that any proposition they put forward should be disregarded; this is known as ad hominem (Hahn & Oaksford, 2013). Ad hominem is a logical fallacy that proposes that once the character or credibility of an individual has been questioned, it is no longer possible for one to have absolute confidence in what that individual says (Harris, 2012). The term ad Hitlerum was coined by Leo Strauss in 1953; it is the name given to the argument that an idea or a view can be refuted if it is compared to one that may be held by Adolf Hitler, leader of the Nazi Party. Harris et al. (2012) conducted a series of experiments to see whether or not participants agreed with an opinion that may have been similar to a view shared by Hitler. They found that participants demonstrated sensitivity to probabilistic information when evaluating whether or not the ad Hitlerum argument was convincing. This showed that people based some of their conclusions on the origin of an argument rather than on current facts.

The slippery slope argument is another logical fallacy based upon belief or assumption rather than evidence, in this case not doing something for fear of the negative consequences that action may lead to. Corner, Hahn and Oaksford (2011) outlined four defining components of the slippery slope argument:

1. "An initial proposal (A).
2. An undesirable outcome (C).
3. The belief that allowing (A) will lead to a re-evaluation of (C) in the future.
4. The rejection of (A) based on this belief."

Within beliefs in the slippery slope argument there appears to be some implied mechanism which leads to the consequent action (C) directly from the antecedent action (A), even though this belief is based upon neither prior knowledge nor empirical findings (Hahn & Oaksford, 2013). These argumentation fallacies provide a computational challenge: if human beings operated logically, conclusions should not be drawn based upon these fallacies; however, empirical evidence has shown that they frequently are (Harris et al., 2012).

Bayes' theorem is a formula proposed by Thomas Bayes that can be used to calculate probability in everyday reasoning (Bayes & Price, 1763). Bayesian reasoning is the process of reasoning probabilistically under uncertain circumstances, when not all information is known or available (Korb & Nicholson, 2011). Using Bayes' theorem, we can calculate the likelihood of different outcomes based on prior knowledge and experience of the world, assign probabilistic values and act accordingly (Oaksford & Chater, 2007). The use of Bayesian reasoning has provided a new perspective in the analysis of psychological research; results from empirical studies have shown great deficits in the human ability to reason logically (Wason, 1972). Where it would be most logical for participants to seek evidence that negated their hypothesis, they instead searched for and selected evidence that could only lead to the confirmation of their hypotheses (Hahn, Harris & Oaksford, 2013). Using Bayes' theorem, however, Oaksford & Chater (1994) demonstrated that this confirmatory response was actually the most probabilistically logical response; it involved the selection of data that provided the most information about the truth or falsity of the hypotheses (Hahn, Harris & Oaksford, 2013).
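For reference, the theorem discussed above can be written out explicitly. This is the standard statement of Bayes' theorem rather than a formula quoted from the essay: the posterior probability of a hypothesis H given evidence E is obtained from the prior P(H) and the likelihoods.

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)}
            \;=\; \frac{P(E \mid H)\, P(H)}{P(E \mid H)\, P(H) + P(E \mid \neg H)\, P(\neg H)}
```

In this notation, P(H) is the prior belief, P(E | H) the likelihood of the evidence under the hypothesis, and P(H | E) the updated (posterior) belief.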
Persuasion is the process of sending a message to change a belief or incite an action. As well as its personal use, persuasion plays a major role in advertising, politics, law and many other public activities (Kamenica & Gentzkow, 2009). There are a variety of different Bayesian persuasion mechanisms, such as cheap-talk games (Crawford & Sobel, 1982), persuasion games (Milgrom & Roberts, 1986) and signalling games (Spence, 1973); Bénabou and Tirole (2004) further adapted Bayesian persuasion to investigate mechanisms of self-signalling and self-regulation. Throughout all aspects of Bayesian persuasion, one thing remains constant: a person (A) can affect the actions of another (B) only by first changing the beliefs of B (Kamenica & Gentzkow, 2009).

Bayesian persuasion has been criticised in terms of its computational properties. Unlike argumentation, persuasion is concerned with which persuasive techniques work and why, regardless of whether or not the reasoning is rational (Madsen et al., 2013). Empirically, the results of studies of persuasion have shown that the effects on a person's beliefs rarely persist (Cook & Flay, 1978). There is also a lack of evidence in the literature demonstrating that belief change resulting from a persuasive argument produces behaviour that corresponds with the change in belief (Festinger, 1964).

Bayesian reasoning shows a great deal of algorithmic complexity. The type of information being reasoned about has an effect upon the conclusions drawn, with people showing greater difficulty in reasoning with conditional information than with joint information (Lewis & Keren, 1999). The probability estimates for a hypothesis are frequently updated with the addition of new relevant information using Bayesian inference. Gigerenzer & Hoffrage (1995) analysed thousands of Bayesian problems and found that adapting Bayes' theorem using frequency formats can reduce algorithmic complexity. Bayesian persuasion is also a very complex process; the most successful persuasion of belief happens after multiple persuasion attempts over a long period of time (Kamenica & Gentzkow, 2009). Hahn and Oaksford (2013) proposed that the most influential factor in persuasion is the quality of the argument being put forward; because the quality of an argument is subject to personal opinion, this provokes the question 'what makes an argument good or bad?' Human beings are not perfect Bayesians (Mullainathan, Schwartzstein & Shleifer, 2008) and, while some persuasive activities may reflect a person's failures of rationality, Kamenica and Gentzkow (2009) concluded that a complete understanding of Bayesian persuasion is needed in order to fully assess results in the literature.

Recently, psychological research has begun to address the current issues at the computational and algorithmic levels of different types of reasoning. The effects of emotion upon the ability to reason logically have been called into question (see Blanchette, 2013; Ayesh, 2003), as has the much greater issue of subjectivity in Bayesian reasoning (see Press, 2009; Ben-David & Ben-Eliyahu-Zohary, 2000).
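Returning to the frequency-format finding of Gigerenzer & Hoffrage noted above, the following hypothetical worked example (with invented round numbers, not taken from their paper) shows how restating a Bayesian problem as natural frequencies reduces the calculation to simple counting.

```python
# Hypothetical worked example: the same diagnostic problem in two formats.

# Probability format: base rate 1%, hit rate 80%, false-positive rate 10%.
prior, hit_rate, false_alarm = 0.01, 0.80, 0.10
posterior = (hit_rate * prior) / (hit_rate * prior + false_alarm * (1 - prior))

# Frequency format: think of 1,000 people instead of probabilities.
with_condition = 10              # 1% of 1,000 have the condition
true_positives = 8               # 80% of those 10 test positive
false_positives = 99             # 10% of the remaining 990 test positive
posterior_freq = true_positives / (true_positives + false_positives)

print(round(posterior, 3))       # ~0.075
print(round(posterior_freq, 3))  # ~0.075 -- same answer, reached by counting
```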

Monday, January 20, 2020

GMO Labeling Essay -- Genetically Modified Consumer Food Essays

GMO labeling

Ever since their entrance onto the consumer market in the last two decades of the twentieth century, genetically modified organisms (often referred to as GMOs) have been getting mixed reviews from the public. Genetically modified consumer products (primarily food) have pushed the barriers of some people's comfort levels. Born out of either a lack of knowledge or a sincere concern for public health or the environment, a consumer rights movement has been planted around the world pushing for the labeling of genetically modified food products. This movement has matured in many places to a degree where interest groups have successfully lobbied governments into adopting criteria for labeling transgenic food products. In other parts of the world, strong agriculture interests have clashed with the aforementioned movements. A simple label on a can of beans would seem to easily solve this problem; however, governments have found that GMO product labeling is more complex than that. Considerations such as costs, international markets and cultures must also be taken into account, not to mention the public's perception and their level of trust in this relatively new product.

Research in both medical microbiology and agriculture laid the groundwork for what is modern biotechnology. This is a newer science, seen by many to have officially begun with the discovery of recombinant DNA technology by Stanley Cohen and Herbert Boyer in 1970 (biotech.ca 1). Recombinant DNA technology, aided by the use of restriction enzymes, allows humans to cut out the part of the genome of one species that codes for a desirable trait and insert it into a different species in the hope of producing the same effect (biote... ... label to fix.

Works Cited

"Detailed Description of new GMO labeling in the E.U." Organic Consumers Association. 2001. 10/5/04. http://www.organicconsumers.org/gefood/gmolabing080101.cfm

Diani, Hera. "Indonesians Demand GMO Labeling." The Jakarta Post. November 4th 2001. 10/2/04. http://organicconsumers.org/gefood/indonesia110801.cfm

"Economic Impacts of Genetically Modified Crops on the Agri-food Sector." European Commission Directorate General for Agriculture. 2003. The European Commission. 10/4/04.

"History of Biotechnology." Biotechnology in Canada. 2004. 10/5/04. http://www.biotech.ca/EN/history.html

Le Meur, Herve. "Re: Have Ground Rules been set for GMO definition?" lemeur@diligo.fr. November 26th 2000.

"Sticky Labels." The Economist. April 29th 1999. 10/5/04. www.economist.com

Sunday, January 12, 2020

Paradise Lost Essay

While contrasting the attitudes and results of Jesus in the Bible with the attitudes and results of Satan in the book Paradise Lost, I discovered many contrasting themes. The attitudes of Jesus that we find in the Bible are great examples for us all of how to live our lives, compared with the attitudes of Satan, whose life we may not want to follow as an example. Whenever we understand Jesus' true character, we find that Satan's true character is the exact opposite of His.

Pride is a preoccupation with one's own selfish needs and desires, and a lack of interest in everyone else's needs and desires. Evidence of Satan's prideful attitude is abundant throughout Milton's epic. Satan displayed "obdurate pride", which can be translated as stubborn pride, in Paradise Lost. Satan was stubborn in his ways and would not relent or give in to living the lifestyle of Jesus Christ. The antonym of pride is humility. Humility consists of the actions and thoughts of being humble and modest. Humility was a character trait that Jesus exemplified perfectly. In John 13 we find an example of humility: Jesus knelt down and washed His own disciples' feet. Jesus did all this with a grateful heart and an attitude of humility.

Without happiness one cannot experience joy. When Satan was cast to hell he developed a strong hatred towards God. Paradise Lost describes Satan's attitude of "steadfast hate" toward God and men. In contrast, Matthew 5:44 states "...love your enemies and pray for those who persecute you." Jesus modeled love daily in His life and also through His dying; He covered all our sins even though we were unworthy of His everlasting love.

The final contrasting attitudes are rebellion and obedience. In Paradise Lost Satan displayed the attitude of rebellion. This was seen when he said, "better to reign in hell, than serve in Heaven." He was firmly stating that, no matter what the circumstances, he would never want to serve the Most High in His Kingdom.

Friday, January 3, 2020

Analysis Of Different Research Methods Finance Essay

This chapter aims to describe the methodology employed in this research. There are basically two approaches, namely the quantitative and qualitative approaches. Research methodology is defined by Fellows and Liu (2008) as the principles and procedures of logical thought processes which are applied to a scientific investigation.

Quantitative approach

Quantitative methodology tends to gather factual data, to study relationships between facts and how such facts and relationships accord with theories (Fellows and Liu, 2008). The purpose of this strategy is to explain social phenomena rather than to understand the meaning of things in themselves (Smith, 2003). The main advantage of the quantitative approach is that it is fast and economical; meanwhile, the procedures are more reliable and the results of the research can be replicated (Naoum, 2009). However, the principles of the quantitative approach focus on facts in the natural world. Thus, as Bryman (2008) argued, quantitative methods can ignore human values and interpretation and distance people from the world. The analysis of variables creates a statistical view of social life which is apart from people's lives. Furthermore, quantitative research relies on instruments and procedures that hinder the connection between research and everyday life (Fellows and Liu, 2008).

Qualitative approach

In contrast, qualitative methodology seeks to address subjective issues in the natural world and lays stress on people's internal perceptions, meanings, experiences, opinions, views and understandings rather than numerical testing and verification of the world (Creswell, 2007). As Bryman (2008) comments, qualitative methods can be influenced by outside learning and cannot be replicated in the way quantitative data can. Furthermore, due to resource constraints, the sample size cannot stand for the whole population. In addition, the information might be interpreted subjectively, with bias; meanwhile, misunderstanding by the interviewer can also introduce unquantifiable risk into the research. Each methodology has its own strengths and weaknesses. The comparison of quantitative and qualitative research methodologies is shown in the table below:

Table 3.1 The comparison of quantitative and qualitative methodologies. Source: Naoum (2009).

Triangulation Research Approach

A mixed method can synthesise the strengths and reduce the weaknesses of the quantitative and qualitative approaches. When a combination of quantitative and qualitative methodologies is used to study the same problem or phenomenon, it forms a triangulation (Kumar, 2005). As asserted by Fellows and Liu (2008), adopting triangulation in research can be very powerful for gaining insights and results, assisting in making inferences and in drawing conclusions. The process of triangulation is illustrated in the figure below.

Figure 3.1 Triangulation of quantitative and qualitative data. Source: Fellows and Liu (2008), Research Methods for Construction, 3rd edition.

Research Design

Functionally, this research design intends to provide a general plan and procedures to answer the research questions (Kumar, 2005), considering the data collection methods and analysis within the constraints of data access, time, capital and ethical issues (Saunders et al., 2007).
3.2.1 Literature Review process

This research intends to commence with a literature review, which provides secondary information for data collection and forms the secondary research included in this research paper. Quantitative data collection will then be conducted, based on the previous literature, in order to examine the practical situation of the quota and its pricing system as used in construction procurement procedures in the Chinese construction industry.

3.2.2 Data collection

Primary data collection will be conducted in this dissertation. Prior to analysing the results, primary data will be collected to establish and examine how professionals perceive and implement the quota and pricing system, based on its factual execution, professional knowledge and views, and the practical situation of this system during and after the introduction of the National Bills of Quantities in 2003 in China's construction procurement process. During the collection process, a triangulation methodology combining structured questionnaire, semi-structured interview and case study methods is employed to gather both quantitative and qualitative data.

3.2.1.1 Questionnaire Survey

Structured questionnaire

The purpose of the structured questionnaire is to obtain primary information which can be quantified and analysed to show the facts, attitudes and perceptions of professionals regarding the quota and pricing system in construction procurement in China. Adopting a questionnaire survey has a range of benefits. As Naoum (2009) comments, this method of data collection is perceived to offer relatively higher validity of results, as this survey covers different areas of China, including Shenyang, Beijing, Shanghai, Xinjiang and other provinces. Because it is administered and transmitted via the internet, it is a quick method of data collection within the time permitted. Meanwhile, questionnaires can be completed flexibly, at the respondents' convenience, and provide less opportunity for the errors arising from perception or attitude that occur in interview methods.

Whilst these advantages are attractive to researchers, the limitations of a questionnaire cannot be ignored. Because this survey depends on the construction industry and professional knowledge, all the questions set in the questionnaire should be simple and straightforward to understand and answer. Furthermore, this method is inflexible for researchers wishing to probe and follow up other interesting points (Bryman, 2008). Meanwhile, there is no opportunity to clarify ambiguity in the respondents' answers. Moreover, respondents may answer what they think the researchers want to hear, which can also lead to inaccuracy in the research results. In addition, under the pressures of modern business, respondents in the construction industry may suffer fatigue and give questionnaires low priority. However, it is possible that not all responses will be returned; typically, response rates lie between 40% and 60%, or considerably lower than this (Naoum, 2009).

Closed-ended and Open-ended questions

In line with the aims of the questionnaire, all questions in this structured questionnaire are devised from the findings of the literature review. The questionnaire was organised using mainly closed-ended questions, which can be analysed straightforwardly and do not require written answers from respondents. However, two open-ended questions were included to allow respondents to express their views if they wish.
Following Naoum (2009), a pilot study was carried out with several professionals, providing a trial run for the questionnaire, testing the wording and identifying ambiguous questions. After feedback from the pilot study, a few minor amendments were made to ensure better clarity of the questions and to eliminate ambiguous wording in the questionnaire. Furthermore, the background information provided by the pilot study respondents allowed the content and formation of the questionnaire to be refined.

Sampling in structured questionnaires

A sample is a specimen or part of a whole population which is drawn to show what the rest is like (Naoum, 2009). The aim in selecting a sample is to obtain maximum precision in research estimations within the given sample, while avoiding bias in the selection of the sample (Kumar, 2005). Saunders et al. (2007) classified sampling into two general types: probability (random) and non-probability (non-random) sampling. In this questionnaire survey, non-random sampling will be used to address the research questions and objectives, due to constraints of time and resources. The questionnaire will be sent to professional construction cost estimators working in different regions of China. The intended sample size is 60. Ideally, the sample would be large enough to elicit a better response rate; however, the sample size is restricted by time and capital resources. An internet questionnaire method has been chosen as it is cheaper than the postal type and can obtain wide coverage of cities.

3.2.1.2 Qualitative Semi-structured Interviews

In order to compensate for the deficiencies of structured questionnaires, semi-structured interviews will be conducted to collect more in-depth subjective information from respondents.

Semi-structured Interview

In a structured interview, a set of predetermined questions is presented in the same order and wording, including open-ended or closed-ended questions, and the interviewer maintains control of the interview process (Kumar, 2005). In contrast, there is no predetermined list of questions in an unstructured interview. The semi-structured interview lies on the spectrum between the structured and unstructured interview (Fellows and Liu, 2008). Semi-structured interviews will be employed with several professionals working in the industry, and the literature review has provided the basis for the interview questions by identifying the functions of the quota pricing system in variation valuation and resource optimisation and as an effective tool for resolving disputes in the Chinese construction industry. Semi-structured interviewing is a major technique for collecting qualitative factual information and opinions, using purposive samples to gather detailed information as well as valid and reliable data relevant to the research question and objectives (Saunders et al., 2007). In this research, the semi-structured interview is a flexible qualitative method which can obtain a high response rate, provides opportunities to correct misunderstandings arising from the questionnaires, and is intended to verify and validate the results derived from the previous literature and the completed questionnaires (Kumar, 2005).

Sampling in semi-structured interviews

In order to gather valid and reliable primary qualitative data, the semi-structured interviews will be carried out with experienced professionals working in the construction cost estimating field.
Three purposive samples have been chosen, based on contactability and availability, from different cities in China; one interview was face-to-face and two were telephone interviews. The interviewees come from a development company, a building company and a construction consultancy, so that information can be gathered from different participants in a project; this provides the chance to examine the differing needs of the various parties for the quota and quota pricing system in construction activities.

3.2.1.3 Case Study

In order to qualitatively support and illustrate the hypotheses derived from the literature review, case study research will be carried out in this research. As argued by Kumar (2005), the case study is a method used to study a social phenomenon through a thorough analysis of an individual case. Meanwhile, it provides opportunities for intensive analysis of specific details in order to generalise to a broader theory.

Case study of education system

A collective case study will be employed to investigate how the education system impacts on the practice of the quota pricing system, by examining three types of textbook used in the education system in China. The book Construction Surveying and Cost Estimation Management, published by Tongji University in 2007, is widely used by a number of universities in quantity surveying and related courses in China. In this book there is a dedicated chapter introducing the principles of the quota pricing system. Furthermore, Construction Quota is chosen from the textbooks of the self-study higher education examination in China. In the construction cost estimation course, the construction quota system and related knowledge is taught as a basic part of further study. Construction Cost Estimating and Control, published in 2009, is one of the fascicles used in professional training and examination. In the qualification examination for professional cost estimators, the principles of the quota and the quota pricing method likewise occupy a considerable proportion as an important part of the syllabus. Through the analysis of the quota system as presented in these books from different fields of the education system in China, the fundamental influence of the quota pricing system on the academic preparation of professionals will be explored.

Sampling in case study of education system

In the case study, three purposive case samples will be chosen. These three cases are drawn from different educational fields: general university education, the self-study higher education examination and professional training. Analysing these different fields makes the results more convincing.

3.3 Ethics Statement

All data in this research will be collected and analysed in compliance with the ethical and professional guidelines of Nottingham Trent University (2009). Following Bryman (2004), to ensure that respondents are protected during the research process, a statement explaining the purpose of the research and the use of the data gathered from the questionnaire will be provided before the questions, so that informed consent is obtained from each respondent. Meanwhile, an additional statement of intent will emphasise that participation is voluntary and that the data will be analysed and treated with confidentiality and anonymity. This confidentiality will avoid any potential harm to the respondents. Each respondent will be provided with a final report of the results (Bryman, 2004).
3.4 Summary of Research Methodology

This research will review the relevant previously published literature as the theoretical and empirical basis for determining the facts of the practice of the quota system in construction procedures in China. In order to achieve the research aims, this paper will combine structured questionnaires, semi-structured interviews and case studies to examine and establish the reasons why the quota pricing system persists in construction procurement in China under the mandatory National Bills of Quantities procurement method. This dissertation will therefore utilise a triangulation methodology, in which both quantitative and qualitative data will be involved.