Monday, October 16, 2017

Contribution of postmodern marketing thought

Since “postmodernism” began to spread among academics as a new philosophical and scientific concept, management theory has also witnessed, though with different tones in some of its components, a debate concerning a new interpretation of its issues as well as of the discipline itself. This debate offers new horizons to academics and could bring about developments as interesting as those experienced in other disciplinary fields. With this work, we try to interpret the epistemology of marketing, a specific part of management theory, by analysing the literature as it has developed so far and by constantly creating links between the level of philosophical elaboration and that of marketing research. What matters is the enrichment that research experiences, and not only these, bring to the knowledge of the individual and of the community, as witnessed by the researcher. It is clear that the process of knowledge creation is endless and unstoppable: the more the individual (and the community) is enriched, and therefore learns, the more he/she realises his/her lack of knowledge. Knowledge “calls” knowledge.
The body of marketing
The origins of marketing lie in the American management literature of the late Fifties and early Sixties, when some researchers started to investigate management practice and, above all, the origins of market success. Those articles are now considered the landmarks of marketing and established the main concepts upon which this field of human knowledge has been developing for many decades and which are still accepted today (Felton, 1959; Borden, 1964). The concept of the marketing mix was then defined. This concept has since been considered a pillar of the discipline thanks to its simplicity and the possibility it offered of linking the concept of economic value (which is crucial in marketing) to actual managerial action.
Over the years, these approaches have been the object of a constant process of systematisation and refinement aimed at defining tools and techniques to meet the needs of the market and, at the same time, those of the manager. This process has strongly contributed to the spread of the discipline, creating a language universally shared and recognised as the principles of Marketing Management by the academic community, by practitioners and, last but not least, by undergraduate and postgraduate students. Customer orientation is thus the management philosophy which legitimises marketing actions and makes their ensemble coherent and harmonious.
Since the Seventies, this definition of marketing has met a growing, rapid and general consensus, transforming it into an evergreen or, as critics have put it, into a “marketing megalomania” or even a “Kotlerite” (Brown, 2002). The considerable ease of use of the marketing mix, which was initially created to translate the marketing concept into operative terms, has been one of the main reasons for the great diffusion and credibility of marketing as a discipline on a global scale.
The perceived need to improve the effect of marketing policies, in a context whose evolution is increasingly fast and less intelligible, has necessarily induced researchers to increase their specialisation, thus becoming great experts in single tools and aspects of marketing. This evolutionary trend in marketing contributions has been fostered by the editorial choices of the A journals, which tend to publish very specialised papers, supported by solid empirical analysis, but which turn out to be scarcely comprehensible. For this reason, they have also been critically labelled the Journal of Marketing Obscurity (Piercy, 2000; Baker, 2001).
If on the one hand this trend tends toward a specialisation of competencies and allows the discipline to progress in a “scientific” way, on the other hand there is a considerable risk of losing sight of the conceptual frame of reference of the domain in which such competencies are applicable. Partly owing to this tendency, marketing has been the object of strong criticism over time, which nevertheless has not changed its initial general approach and has only affected the subsequent phase of its evolution. This relatively simplistic approach to marketing has turned both the general theory of marketing and consumers themselves into victims. Consumers have too often been reduced to mere numbers (that is, quantitative data) by marketing operators, not to mention university students, who have been forced to learn the principles of marketing as a simple recipe book made of ingredients that can be mixed and formulas (most of the time quite rigid) that can be applied according to the circumstances. In most businesses, many marketing activities are still today the exclusive task of a few specialists who are considered the only ones responsible for the firm’s orientation and who merely apply formulas and recipes learned in their education and refined by practice (whose partially dynamic and heterogeneous matrix has been recognised).
Dissatisfaction with this situation has started to manifest itself in a plurality of forms: from consumerism to feminism, sociological incursions and so on. These are generally very fragmented and unsystematic trends, with the exception of two which have acquired an identity in their own right: relationship marketing and experiential marketing.
Criticism directed at marketing
Scientific advancement in the domain of human knowledge proceeds through what is now a standard process: theory, criticism and new theory. Marketing follows this general approach as well. In practical terms, this means that every contribution to marketing – as in every discipline – starts with an analysis of the literature, pinpoints a critical point due to a poor correspondence between theory and reality, and continues the reconstruction of knowledge for that specific area, thus contributing to the improvement of society’s knowledge.
Within the marketing literature, it is possible to trace some critical approaches that share common features and give rise to actual movements for the refounding of the discipline. In particular, the main trends which have addressed strong criticism at marketing are two: relationship marketing and experiential marketing. Both have accused the discipline of involution, and of being devoted only to the modelling of interpretation schemes which have proved too far removed from reality and were therefore not suitable for providing an exhaustive and generalisable explanation.
Chronologically, the first trend to cause a crisis within the discipline was what later came to be called relationship marketing. During the Seventies, a part of the marketing literature started to question the object of the discipline and its extendibility to other realities. In particular, the Swedish School of Industrial Marketing and the Nordic School of Services simultaneously criticised marketing by maintaining that it adjusted well to the exchange relations of the mass consumption goods market, for which it was initially devised, but lost analytical and interpretative effectiveness when used in exactly the same way in other kinds of situations, especially in the industrial goods and services industries.
The mass consumption goods market is characterised by a strongly atomistic demand in which the personal features of the purchaser lose relevance and give way to anonymous and homogeneous expectations. These can be analysed, in the most sophisticated cases, through segmentation techniques that are sometimes quite refined.
According to this scheme, the consumer is clearly passive and is subjected to company policy without any possibility of affecting it. The only possible action is the choice among the alternatives of a predetermined supply. Exchange power is therefore asymmetric and unbalanced: the single purchaser has no decisional weight, as his/her contractual force is proportional to the share of his/her purchases in the company’s total turnover and is therefore almost nil. According to the representatives of the “relationship” vision, the situation described above is considerably different in the industrial goods and services market. The peculiar features of this industry make it a different kind of market altogether, one in which the customer retains a particular and active participation and emerges as consumer, producer and production resource. This calls for a reconsideration of marketing. Whereas, for the market of mass consumption goods, the literature had put the exchange at the centre of the relation between demand and supply, and consequently at the centre of the analysis too, the new reflections developing between the Seventies and the Eighties replaced the concept of exchange with that of relationship, that is, the relationship established (in a more or less continuous way) between purchaser and seller: in the analysis, this is what really counts, not the single exchange act (which is often sporadic). Both the Swedish School of Industrial Marketing and the Nordic School of Services stressed how crucial the long-term perspective is in the management of these markets.
From this common point, the two schools independently developed their lines of thought: industrial markets researchers focused mainly on the relations among companies, in particular on the role of trust and on the concept of relationship networks (Håkansson, Östberg, 1975; Håkansson, 1982; Jackson, 1985; Hallén, Sandström, 1991; Ganesan, 1994; Morgan, Hunt, 1994; Doney, Cannon, 1997; Smith, Barclay, 1997; Duncan, Moriarty, 1998). Services researchers concentrated on the differences between services and goods, and in particular on the continuous and necessary interaction between producer and consumer (Berry, 1980; Normann, 1985; Turnbull, Valla, 1985; Grönroos, 1991; Grönroos, 1994; Vavra, 1995). Moreover, over time, the relationship approach has spread to consumer markets as well, making it necessary to consider the consumer perspective in all marketing choices. In the light of the specificity of the newly analysed contexts, these authors have highlighted the weak points of the traditional approach, defining it “traditional marketing” or the “traditional paradigm” of marketing, which does not prove suitable in contexts in which the firm can pinpoint the counterpart and treat it individually.
The second important criticism of “universal” marketing was put forward some years later by the experiential marketing trend. The experiential interpretation of consumer behaviour started at the beginning of the Eighties, in contrast with the traditional and prevailing view of consumer behaviour studies, whose first contributions date back to the Sixties and constitute what the experiential authors consider a utilitarian view (still today the major research trend within consumer behaviour).
Since the middle of the Eighties, some researchers have suggested an extension of the interpretation of consumer behaviour, highlighting some limits of the utilitarian school of thought, such as the thesis of the univocal rationality of the individual (Hirschman, Holbrook, 1982; Holbrook, Hirschman, 1982). By focusing on the mere act of purchasing, the utilitarian view has highlighted the rational component that leads the purchaser toward the resolution of the decisional problem he/she faces – a problem of choice among product alternatives. The resolution of the decisional problem is, in fact, an area which can easily be the object of a rationalistic interpretation of consumption, and especially of a sophisticated modelling which sometimes becomes extreme. Therefore, if on the one hand consumer behaviour researchers have acquired a considerable store of knowledge concerning the issue, on the other they have almost completely neglected all the other aspects of consumption which do not have a rationalistic component, especially the interaction between consumer – not purchaser – and product. This is the real experience of consumption, whose definition is, by nature, elusive and difficult.
Although the experiential view openly criticised traditional marketing only in 1999 (Schmitt, 1999), the first critical attacks can actually be dated back to 1982, when Hirschman and Holbrook carried out an initial comparison between the traditional and the experiential approaches to the study of consumer behaviour. The two researchers, pioneers of this trend of study, which has gradually gained agreement and support over the years, ascribed the differences between the two approaches to the mental construction used, to the categories of products analysed, to the use of the product and, finally, to the consideration of differences among individuals. They carried out this comparison by defining the essential features of the experiential interpretation of consumer behaviour. Criticism of the traditional approach concerns in particular the thesis of the rationality and utilitarianism of the consumer. According to the founders of the traditional approach, the behaviour of the consumer is regulated by a general rationality which allows an easy resolution of every decisional problem, in particular the purchasing decision, in order to pinpoint the supply which maximises utility for the consumer. In this sense, the object of study of these researchers is the decisional process which leads an individual to make a specific purchasing choice, with the final objective of building, from the same process, a universal model of reference. It is clear that the origins of such an interpretation of consumer behaviour can be traced to the utilitarian vision of general economic theory (Sherry, 1991).
Seventeen years later, Schmitt (1999) resumed the same comparative analysis, slightly modifying the categories compared and, especially, highlighting once more the contrast between the traditional view and the emerging experiential one. In Schmitt’s contribution, however, the comparison concerns traditional marketing, an expression that seems to denote the ensemble of principles, models and tools of marketing management. In any case, the experiential view initially developed in an antithetical way in relation to the prevailing trend, constituting a real reaction to the traditional model of consumer behaviour and aiming at a revision of models and tools in order to improve their adherence to reality. It is, in fact, with the objective of studying the consumption behaviour of hedonistic products (considered not strictly “rational”) that the concept of experience is defined, making the importance of individual emotions emerge (Carù, Cova, 2002).
Modernism and postmodernism
Although the term modernism refers to a system of thought that developed over the last four centuries, its actual definition can be traced to recent decades, when a new thought, postmodernism, emerged in contrast with the preceding one. In order to understand the postmodern system of thought, whose unsettling effects are being felt in every aspect of human knowledge, it is necessary to start with an analysis of modernism. Modernism is the vision of the world which shaped human action in the era of modernity. The latter is conventionally said to have started with the Industrial Revolution and experienced its highest moment between the Nineteenth and the Twentieth Centuries. During these centuries, western societies witnessed an extraordinary development: from the second half of the Eighteenth Century, the European continent enjoyed a period of great stability and wellbeing. The numerous innovations, the great scientific and geographic discoveries and the demographic growth that took place in that period gave a new impulse to the economy (from the industrial sector to agriculture) and fostered a diffused and generalised wellbeing, thus stimulating the growth and development of populations. In Europe, the First Industrial Revolution, more than the Second, gave rise to a process of wellbeing and improvement of general living standards which seemed unstoppable. The machine was considered the solution to the problems of humanity which, if on the one hand was freed from the servile oppression of physical work, on the other was elated by a surge of wellbeing expressed in the possession of goods whose physicality was the tangible sign of their existence. In a geo-political context of stability and economic prosperity, the twists and turns of hope were superseded by the certainty of optimism.
First the members of the Enlightenment and then those of the Positivist movement were convinced that with the support and guidance of rationality alone, humanity could reach higher levels of economic and social wellbeing and, therefore, of happiness, thus building a fair society and dominating nature (Best, Kellner, 1997). It all took the form of a continuous and linear process of social progress, made possible and justified by rationality. Human thought was obviously affected by this approach and resulted in “modernism”, which gathers the philosophical currents of Neopositivism, Logical Empiricism, Logical Positivism and Neo-empiricism: going back to Descartes and Kant, Smith, Locke and Hume, the members of the Positivist movement are generally considered the pioneers of modernism, which received a considerable contribution from Newton’s research (Cobb, 1990; Abbagnano, 1995). According to modern thought, machine and science have the same role: they are both at the service of the individual. The machine allows economic wellbeing to be reached, whereas science contributes to social prosperity. Both seem to be led by reason, by an omniscient rationality able to reach certainties, knowledge of reality and therefore the truth. In the modern perspective, the recognised ability of the individual to understand nature, reality and its truths allowed him to intervene in the state of things and to guide and improve it. Thinkers’ and researchers’ attention was therefore aimed at defining the laws regulating economic and scientific phenomena in order to understand their applications, allowing, above all, their replication and improvement (Chiurazzi, 1999). In this sense, history was considered a linear evolution of society which proceeded through a continuous process of accumulation and, therefore, of progress.
On the contrary, everything that seemed foreign to this evolutionary logic did not hold any interesting secret to be discovered and was, therefore, neglected. A goal-directed pragmatism dominated scientific analysis and its disciplines. Knowledge advanced toward reality and truth, from which laws governing repeatable and perfectible phenomena, and therefore suitable codes of conduct, were derived. Knowledge was aimed at “good”, as it was fostered by the certainty of the truth of the real. The will to reach ever higher levels of wellbeing necessarily extended the concept of science to every discipline of knowledge: the mere application of the “scientific” method transformed every discipline into Science. At the beginning of the Twentieth Century, in fact, with the birth of psychology, sociology and psychoanalysis, the rationality of the individual was further valued and emphasised; in those years, modernism established itself as the prevailing trend and was considered an uncontested point of reference for every science. In the light of the breakthroughs obtained by humanity, the term modern acquired strongly positive meanings, coinciding with the term “advanced”. Today, instead, the term modern indicates a past era that is ending, at least for those who are more sensitive to social change (Cobb, 1990).
In the second half of the Nineteenth Century, some philosophers – Kierkegaard, Nietzsche and Heidegger above all – started to doubt their contemporaries’ inflexible faith in rationality and in the ability to define, circumscribe and know the truth (Jackson, 1996; Best, Kellner, 1997). The very meaning of truth lost the immanent sense of holistic and salvific heurism which had distinguished it in previous thought. Although it was only an opposing trend at the time, their thought emerged and developed in the reflections of a group of French philosophers connected with Poststructuralism – Derrida, Foucault, Lyotard and Baudrillard, to mention the most famous – who are today known as the first postmodern theorists (Best, Kellner, 1991; Williams, 1998; Chiurazzi, 1999). It was only during the Eighties, however, that their thought began to spread all over the world and to gain new followers, such as the American philosopher Rorty. In Kierkegaard’s thought there was already strong criticism of the faith in human rationality, as well as of every reflection aimed at knowing the real and the true. These concepts, according to Kierkegaard, imprison humanity and delude it into possessing certainties, thus destroying the feelings, inspiration and spontaneity which constitute the essential part of human beings and of their inclination toward God. In this perspective, Kierkegaard reclaimed the role that irrationality, spontaneity and subjectivity play in making a human being, a role that the prevailing thought reduced to a series of rules and norms which limit his/her potential, causing spiritual frustration and alienation. Kierkegaard demanded the rebirth of the inner passion and spirituality which motivate individual actions and unify individuals: a clear reference to Christ’s spiritual Passion as a unifying force for all humanity, which always and in any case reigns over rationality.
The reference to the Christian religion allows the philosopher to consider subjective passion as a different concept of truth, translated into daily life. According to Kierkegaard, religious redemption replaces the exasperated truth of the real as a principle of life. In Kierkegaard’s thought, passion and rationality, feelings and calculation, instinct and reasoning are in constant opposition and synthesise the contrast between the individual and the machine, spirit and physicality (of the individual and of things).
Nietzsche’s criticism of modern thought is even stronger, as it lacks any religious reference. Nietzsche extols individuality, its power and autonomy, in strong contrast with any form of aprioristic, immanent, rational, definitive and in any case salvific ideology. In the philosopher’s perspective, any ideology is nothing but an attempt by the individual to protect himself/herself from the daily course of life, and is a false source of truth and certainty. Once the deception of ideology is revealed, God ceases to exist too, and every faith tracing everything back to a unique explanation vanishes (Chiurazzi, 1999). Rationality, modern science and its utility in life, the search for truth, objectivity – all concepts exalted by the Enlightenment and Positivist movements – are the object of a strong attack by the philosopher, an attack that would find its resolution in postmodernism. There are no eternal truths, nor demonstrable or univocal ones. Everything should be contextualised to its place and historical period. Metaphysics, the idea of a permanent knowledge and of a transcendental reality, are nothing but constructions created to alleviate human suffering, which prevent the individual from fully accomplishing his/her capacities and from experiencing the true sense of his/her life, made of opposing forces and passions. At the same time, there is no unique and absolute truth, but only the perspectives (visions) of each individual concerning different events. Moreover, these perspectives need to be relativised to that individual, to the moment and to the historical and social context. According to the philosopher, true knowledge is the simultaneous existence of a multiplicity of interpretations, each of which is the result of a particular perspective that is essential and should therefore be valued; this manifold knowledge leads the individual to the appreciation of difference.
However, it is also the result of a long process requiring considerable effort and a will to know, an unappeasable and humble desire to know, in which the knowable is endless, as all knowledge is the source of further research and knowledge. Knowledge generates knowledge. Nietzsche also reconsiders the concept of the subject and interprets it as a mere idealised construction encapsulating a multiplicity of emotions, thoughts, ideas and stimuli, created by modern thinkers to delude individuals into believing they have an identity and to fictitiously remove them from the anonymous mass in which they live. In this situation, humanity can advance, but only thanks to the efforts of individuals who are free and open to knowledge, who are able to liberate their individuality and creativity and are not afraid of not possessing a truth. The truth does not exist; individual interpretations of the real do. But the failure to possess the truth, if perceived and accepted, is the strength and the inspiration that makes knowledge possible. Through the knowledge of the new, the individual questions him/herself and encounters risks, but can find fulfilment.
The third father of postmodernism is Heidegger, who can be considered an essential point of reference for the most recent postmodern thought. As a matter of fact, Heidegger is one of the major critics of the basic theses of modernism and one of the philosophers who have most influenced contemporary philosophy. His thought developed around the question of being, an issue which allowed him to face several problems and to detach himself from the prevailing modern thought. It is not surprising that he chose to focus on issues that the philosophy of his time neglected and took for granted, as almost obvious. Heidegger distanced himself from the traditional concept of truth – what corresponds to reality is true – and considered freedom the original truth. The modern notion of knowledge and of being appears, therefore, to be the result of the dominance and power of one individual over another. Heidegger criticised the interpretation of theoretical conceptualisation as the only way to knowledge, highlighting that the primary knowledge of the individual is not conceptual at all and that it is not possible to make everything the individual knows completely explicit. Heidegger gave art and poetry, neglected by modern thought, a particular meaning, as he considered them an alternative way of learning and therefore of knowing. At the same time, Heidegger defied the modern distinction between subject and object, since objectivity is itself the result of an interpretation. Moreover, according to Heidegger, metaphysics cannot be considered a branch of philosophy; it is, instead, a global perspective which concerns every human activity. In this sense, in Heidegger’s thought, language is superior to the individual, as it is not a simple means of communication, as modernism considered it, but a privileged manifestation of being. Language is the extension of being and is therefore a way of being in its own right.
From a social point of view, Heidegger accused modernity of having transformed peoples into amorphous masses and of having levelled their tastes, ideas, languages and habits through a constant process of homogenisation which eliminated every manifestation of individuality. Truth and knowledge are sought at the expense of individual specificity, which is, instead, the individual’s expression and richness. This evolution is heavily supported by modern technological development which, in Heidegger’s perspective, only creates powerful tools of dominance over individuals, who are considered mere replaceable resources (Best, Kellner, 1997) that can be surrogated by machines – to which they are reduced. As a result of his ideas and critiques of modernism, Heidegger does not offer a unifying system of thought, but only fragmented reflections on certain issues (Clark, 2002), parts of a truth to be invented and not discovered.
The thoughts of those philosophers, sometimes fragmented, which constituted an opposing trend during the Nineteenth Century, became an actual system of thought during the following century, when historical events highlighted the frailty of human certainties. Political conflicts and crises, world wars, the fall of political blocs and nations, the weakening of social groups and the family, the proliferation of impersonal and standardised communication technologies, and social dissatisfaction are all phenomena which characterise postmodernity and have undermined the idea of political stability and of the possibility of a univocal convergence of different and multiple interests. At the same time, they have imparted new vigour to the criticism of the preceding philosophers and started a process of revision of the idea of progress, in the light of the evidence that the objective of a minimal level of wellbeing, common to all social classes, cannot be achieved and is, on the contrary, the result of a concept that is historicised, relative and questionable, certainly not absolute: postmodernism “refers to the consistent deconstructing of the entire program of early modernism” (Cobb, 1990, p. 150).
Despite the benefits induced by the development achieved thanks to biotechnology, economic globalisation, the information highways and the new genetic technologies, the current situation is uncertain and discriminating. Apart from a few rich peoples, most of the world population is poor, unemployed and alienated. The world is full of contradictions, increasingly insecure, dissatisfied and deprived of any certainty: certainty has become a good in its own right. During the Sixties, the voices of Gilles Deleuze, Roland Barthes, Jean Baudrillard and Julia Kristeva started to rise in unison against the certainties of rationality and reported the failure of the Enlightenment movement and of Descartes’ subject (Best, Kellner, 1997). A few years later, the poststructuralists’ thought became part of postmodernism, a term that can be attributed to Charles Jencks.
The word “postmodernism” does not have a precise meaning and refers to many fragmented cultural phenomena, to the extent that some have suggested the need to use the plural and therefore to refer to “postmodernisms”, in line with the postmodern spirit (Featherstone, 1991; Brown, 1994; 1995; 1997; Chiurazzi, 1999). In spite of that, it is possible to recognise in this complexity, fragmentation and even unknowability of reality – hitherto denied by modernism – the central element of the new philosophy (Cova, 1996). The very concept of reality is then questioned, together with that of truth. More generally, it is possible to suggest that postmodernism doubts every certainty of modernism (Cobb, 1990). As a matter of fact, each philosopher has developed his/her own thought in a specific way. The deconstructionists, in particular Derrida and Lyotard, emphasised the concept of difference and highlighted its link with language in such a way that difference and language are complementary (Chiurazzi, 1999; Best, Kellner, 1997). Lyotard specifically worked on the relationship between fragmentation and globalisation and defined the features of the postmodern condition (Williams, 1998). Vattimo (1983) focused his attention on the critique of rationality, eventually denying the possibility of identity and opting for a celebration of difference and tolerance: a “weak thought”, in fact. Bocchi, Ceruti, Morin and others developed the theory of complexity, that is, the celebration of multiformity as the basis of a world whose ambiguity and confusion make it impossible for science to develop an interpretation scheme valid always and in absolute terms (Bocchi, Ceruti, 1985). Foucault concentrated on the subject and denounced its submission to society and its false constructions, considering the subject a mere construction indicating unity and identity, a result of social logic and rules.
Rorty analysed Western philosophy and questioned its role within the political life of society, which turns out to lack its own critical interpretative conscience and therefore searches for original certainties.
All these trends of thought, though with their own peculiarities, claimed the validity of the differences among historic periods, geographic places and single individuals. The idea of the existence of a linear development of history, leading over the course of time to increased wellbeing and emancipation for humanity, was strongly rejected. There is no core, no structure to be known. Every single thing cohabits with the other, without a precise aprioristic and absolutist meaning. The end of universalism, fundamentalism, hierarchies and boundaries was declared and, at the same time, contingency and diversity were exalted (Firat, Venkatesh, 1993; 1995). The individual is elated by the thought of dominating the machine and of having been freed from the servitude of work. However, it is the machine that dominates the individual, thus denying his/her own specificity and depriving him/her of the freedom of being different, of being him/herself. As long as the individual can be reduced to a machine, s/he has no qualities.

Reference: 
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.200.8440&rep=rep1&type=pdf
Stefano Podestà (Università L. Bocconi), Michela Addis
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.509.5508&rep=rep1&type=pdf



Monday, May 9, 2016

The purpose of theory

Q1.      Rudner defines a theory as "a systematically related set of statements, including some lawlike generalizations, that is empirically testable. The purpose of theory is to increase scientific understanding through a systematized structure capable of both explaining and predicting phenomena." Agree or disagree, and why? What are the requisites of a fully formalized theory?
The role of theory in science is contained in the following consensus definition of 'theory' proposed by Hunt (1983) following Rudner (1966): A theory is a systematically related set of statements, including some lawlike generalizations, that is empirically testable. The purpose of theory is to increase scientific understanding through a systematized structure capable of both explaining and predicting phenomena (emphases ours).
This definition incorporates the key elements of the nature of theory as proposed by philosophers of science from different branches of the social sciences, such as Kaplan (1964) in sociology, Blalock (1969) in statistics, Alderson (1957) in marketing, and Friedman (1953) in economics. We feel therefore that this is a reasonably complete specification of the essential criteria and purpose of theory.
MWB make two normative claims in their Propositions 1 and 2 regarding the preferred method of theory construction, i.e. 'all theory generation should depend on some past observation' and 'all observations should be guided and interpreted through some theory.' Both propositions imply that the reverse statements could also hold but would be dysfunctional ways of theory-building. MWB thus seem to allow that theories can be generated in the absence of any observations and that data can be interpreted in the absence of any theory, i.e. that theorizing is possible independent of past experience and that data are theoretically neutral. However, a number of philosophers of science (Kaplan, 1964; Churchman, 1971) have shown that these reverse statements cannot be true. The mere process of deriving law-like generalizations about a phenomenon involves the experience of the researcher, and all interpretation of data is conducted within the context of a framework imposed by the researcher. Therefore, MWB's first two propositions should be expressed as positive rather than as normative statements, and as such we are in general agreement with them. However, an interesting issue with regard to the role of observations in theory-building remains. Churchman (1971) has pointed out that initial observations have different roles in two methods of theory generation, i.e. the inductive method and the deductive method. The inductive method of the Lockean Inquiring System is 'the process of starting with highly warranted (or well agreed upon) observational statements about specific events and inferring a generalization' (Churchman, 1971: 94). Thus, observations are the very basis of the theory. On the other hand, the deductive mode is 'the process of using a set of assumptions to prove a theorem by some standard set of rules of inference' (Churchman, 1971: 94).
In this method the role of initial observations is to provide a basis for speculation about the phenomenon, which is then followed by development of assumptions and the hypothetical model from which generalizations are deduced.
Now, MWB state that their Proposition 1 is 'the basis of empiricism, or what Churchman (1971) calls the Lockean Inquiry System' (p. 190). They therefore seem to imply that all theory generation should be based on the pure inductive method (i.e. the Lockean Inquiring System).
It is useful to view the pure inductive method and the pure deductive method of building theory as representing the extremes of a continuum. In contrast to MWB, we propose that methods falling along all points of the continuum, including the deductive method, represent valid ways to generate theory for strategy researchers. While the inductivist route has had a prime role in the development of strategy theory, there is no clear reason why this must be normatively true. The research question and the phenomenon of interest dictate whether 'more inductively-oriented' techniques with greater emphasis on the role of initial observations, or 'more deductively oriented techniques' are likely to be useful (see Karnani, 1984, for a good example of the productive use of deductive techniques). MWB seem to feel that because operations researchers and economists, who often use deductive techniques, sometimes sacrifice relevance for mathematical elegance, this somehow makes deductive techniques deficient. However, it is not the deductive procedures which underlie the deficiencies in the theories, but rather the failure of these theories to correspond to the norms underlying 'good' theories. These norms are discussed in the next section.
MWB's notion of what constitutes 'good theory' is contained in their Proposition 3 and discussion of Assertion 1. Proposition 3 states that 'a theory is better, ceteris paribus, (a) if it is refutable and (b) if it is consistent with a body of existing theories' (p. 190). Assertion 1 makes the suggestion that 'well-reasoned' theory should underlie strategy research, and MWB offer a series of examples of 'well-reasoned theory' in their discussion.
Because of the interdisciplinary and integrative orientation of the field, most phenomena of interest to strategy researchers are highly complex in nature. Therefore, the complete specification of necessary and sufficient conditions, as well as the explicit formulation of assumptions, are extremely difficult tasks. For the most part we deal in partial explanations and the predictive power of our theorizing is limited.
There is an intrinsic tension between the requirement that theoretical statements have 'precise' explanatory and predictive power and that they be applicable to a 'wide' range of circumstances. This issue is of particular concern to strategy researchers, who seek an understanding of strategic behavior at different levels of aggregation (i.e. at the firm level, at the strategic group level, at the industry level, and across industries). Often, predictive precision can be obtained for phenomena at lower levels of aggregation (such as strategic groups) but this may imply sacrificing generalizability (for the operation of the phenomena across industries, for instance).
Do these objections make the task of seeking lawlike generalizations futile? Not at all. We believe that it is a worthwhile endeavor to continue to build mid-range theories, for this can be a useful path to the development of more complete theories. Our view is that a theory is better if it explains a wider range of phenomena while striving to aim for predictive precision. For example, a theory of the relationship between market share and profitability would be 'better' if it isolated the particular circumstances which govern the nature of this relationship at different levels of aggregation.
If strategic management is to become a science it must strive towards 'explaining by law' the phenomena of interest. It is true that the field has traditionally been concerned with generating normative implications for practicing managers. However, since the goal of science is to explain and predict phenomena, the role of positive research must be recognized. Three related issues merit further discussion.
The first issue has to do with the model of science that is appropriate to follow. In our view it is doubtful whether physics, where even ultimate applications often do not constitute a primary factor for directing research efforts, would be the best model. Rather, it must be recognized that strategic management is more an applied discipline (similar, perhaps, to engineering), and therefore concerned with ultimate application of research findings. However, these applications may sometimes take a while to emerge. Therefore, we concur with MWB's argument that direct practical applications should not be required of all papers.
The second issue concerns the identification of the audiences for whom the applications are being generated. Strategy researchers should be encouraged to seek knowledge which is generalizable beyond the confines of their own field. To approach the status of a 'science' it is beneficial to examine issues which are valued in the larger community of scholars and practitioners. Strategy research findings may be extremely relevant for public policy-makers, researchers from other disciplines, consultants, the popular press, or the public at large.
The third issue concerns the role of application in the development of the field. MWB provide an interesting discussion of the 'division of labor' among researchers. However, this division may be even broader than MWB suggest. Some philosophers of science suggest that there are two distinct roles necessary for the advancement of knowledge (Manicas and Secord, 1983). It is important for scientists (either 'pure' or applied) to conduct theoretical and empirical research in order to uncover causal structures. Under this model, it is then the role of the technician to apply these decision rules to the situations faced by particular organizations. For instance, a manager or consultant may make strategic plans for a corporation using an analytical framework which is developed from research findings.

It is important to realize that the theoretical issues raised by MWB are present (though often only implied or assumed) in every empirical article published in the strategy literature. We feel that the strategy field is at an important juncture. Research can continue in a rather unfocused fashion (as well described by MWB), or attempts can be made to coordinate research efforts to conform with certain guidelines. The guidelines we suggest are of necessity somewhat broad. It is important to guard against forming arbitrary rules to guide the research process, because these may in fact hinder the transition to an organized and recognized science. 

Friday, January 8, 2016

Elaboration Likelihood Model

Title: From the Periphery To The Center: An Emotional Perspective Of The Elaboration Likelihood Model            
Author: Ajatshatru Singh           

Model/Theory

Elaboration Likelihood Model

The link between conceptual framework and theory/model

The current research notes that most theories of attitude change and persuasion are concerned with a cognitive route to longer-lasting attitude change and long-term persuasion. The Elaboration Likelihood Model (ELM) is one such theory. It emphasizes that cognition is the central element in the route to attitude change and treats emotion as a relevant, but less important, aspect of the attitude-change process.
This study attempts to place emotion at the center of the attitude-change process by examining spontaneous reactions to two car advertisements and then linking those reactions to purchase intentions. The results of this study show that emotion is an important element in the process of attitude change and may play a more central role in that process than has previously been shown.
The ANOVA conducted between the two groups reveals a significant presence of emotion in the "cognitive" group. This shows that even though information is being processed cognitively, the processing does not take place in an emotional vacuum. In fact, the "cognitive" group shows a substantially higher emotional response on the emotional variable of pleasure. These traces of emotion in cognitive processing support the need to re-examine the role of emotion in the ELM.
Dependent Variable: Attitude
Independent Variables: CAR (vehicle being tested); P = Pleasure; A = Arousal; D = Dominance; C = Cognitive; E = Emotional; Cognitive SR; Emotional SR




AIDA Model

Title: Explain the effectiveness of advertising using the AIDA model
Author: Sahar Gharibi, Dr. Syed Yahyah Danesh, Dr. Kambiz Shahrodi

Model/Theory

AIDA Model is used in the conceptual framework.

The link between conceptual framework and theory/model

The primary objective of this research is to explain the effectiveness of advertising using the AIDA model in the private insurance companies of Tehran, the metropolitan capital of Iran. In this study, in order to evaluate advertising effectiveness, the AIDA model, one of the empirical models for assessing the best advertising strategy, is used. This model was introduced by Elmo Lewis in 1898.
The AIDA model is used for organising promotional messages, suggesting four general purposes: to attract attention, create interest, stimulate desire and push people to purchase (Birch, 2010). Over the past century several models were proposed for measuring the effectiveness of advertising and marketing promotions, known as hierarchy-of-effects models. The most widely used of them is the AIDA model presented by Elmo Lewis.
In the years since, scholars have proposed numerous models that adopt different strategies within the AIDA framework, and even after about a century the AIDA model retains a great many supporters (Howard, 1990). Marketing and promotions must be seen, read and understood, and must be acted upon: advertising leads people from ignorance to knowledge, perception, persuasion and the eagerness to buy (action).
According to this model, the first task of advertising and marketing is to attract the viewer's attention (Gharibi, Shahrodi & Danesh, 2012).

Figure 3: Conceptual Framework (Gharibi et al., 2011)



Dependent & Independent Variables

Table 4: Dependent & Independent Variables
Dependent Variable: Effectiveness of Environmental Advertising
Independent Variables: Attention; Interest; Desire; Action

Thursday, January 7, 2016

A study on Theory of Reasoned Action

Title: Using the Theory of Reasoned Action to examine the gambling behaviors of college athletes and other students.
Author: Robert Gene Thrasher

Model/Theory

Theory of Reasoned Action

The link between conceptual framework and theory/model

The general aim of the present study was to examine the gambling behavior of college students and, specifically, undergraduate athletes. This study examined the relationships among subjective norms, gambling attitudes, gambling motivations, locus of control, and gambling intentions on the gambling behavior of students. The objective of this study was to survey gambling in a clearly defined population with easy access to gambling and to assess the adequacy of a modified Theory of Reasoned Action (TRA) (Ajzen & Fishbein, 1980) for predicting gambling frequency and gambling behavior. Several studies have used the TRA to analyze gambling behavior (Oh & Hsu, 2001; Moore & Ohtsuka, 1997, 1999).
These studies found the TRA to be an effective instrument for analyzing gambling behavior. However, the researcher of the current study opted to modify the TRA by integrating two moderating variables, intrinsic motivation and locus of control, into the conceptual framework, which may provide a better model for future research (Thrasher, 2006).

Figure 1: Conceptual Framework (Thrasher, 2009)

Dependent & Independent Variables

Table 1: Dependent & Independent Variables
Dependent Variables: Gambling Intentions; Gambling Behaviors (Frequency, Expenditure)
Independent Variables: Gambling Attitudes; Gambling Motivations; Subjective Norms; Locus of Control
Moderating Variables: Student Athletes; Gender; Grade Point Average


Tuesday, January 5, 2016

Evolution of Marketing Tools

The history of marketing thought was categorized by Robert Bartels into decades, starting from the initial years of the 20th century. Some people disagree with this characterization ("History of Marketing", n.d.). The details are given below:
·         In the 1900s: the discovery and exploration of the basic concepts of marketing
·         In the 1910s: the definition of terms, conceptualization and classification
·         In the 1920s: the integration of the discipline's principles
·         In the 1930s: the development of specialization and variation in theory
·         In the 1940s: reappraisal in the light of new demands and a more scientific approach
·         In the 1950s: re-conceptualization in the light of managerialism, social development and quantitative approaches
·         In the 1960s: differentiation on bases such as managerialism, holism, environmentalism, systems, and internationalism
·         In the 1970s: socialization, i.e. the adaptation of marketing to social change
With the growing importance of marketing departments and their managers, the field has become ripe for the propagation of management fads that do not always lend themselves to periodization.

Sixties (60s)

·         1960–1969: The Beginning Decade
o   Micro-economic changes and their approaches to marketing problems (Nerlove & Arrow, 1962)
o   Marketing issues formulated as known operations research (OR) problems (Engel & Warshaw, 1964; Montgomery & Urban, 1969)
Microeconomics is the literature where the initial mathematical approaches to marketing problems can be found. The best-known result of this decade, however, is the theorem on marketing-mix optimization in the Dorfman and Steiner (1954) paper (Wierenga, 2008). Later in the decade more work was done in Operations Research and Management Science, and OR concepts were combined with concepts from the theory of decision making (Pratt et al., 1965).
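The Dorfman and Steiner (1954) result referred to above states that, at the jointly profit-maximizing price and advertising budget, the advertising-to-sales ratio equals the ratio of the advertising elasticity to the (absolute) price elasticity. The sketch below is a rough numerical check, not the original derivation: it assumes an invented constant-elasticity demand function with made-up parameters (unit cost 2, scale 100, price elasticity 2, advertising elasticity 0.25), none of which come from the text.

```python
# Hypothetical numerical check of the Dorfman-Steiner condition.
# Demand: q = k * p^(-e_p) * a^(e_a); profit = (p - c) * q - a.

def profit(p, a, c=2.0, k=100.0, e_p=2.0, e_a=0.25):
    q = k * p ** (-e_p) * a ** e_a        # constant-elasticity demand
    return (p - c) * q - a                # margin times volume, minus ad spend

# Crude grid search for the joint optimum over price and ad budget.
best_profit, p_star, a_star = float("-inf"), None, None
for i in range(41):                        # price from 3.00 to 5.00
    for j in range(181):                   # ad budget from 1.00 to 10.00
        p, a = 3.0 + 0.05 * i, 1.0 + 0.05 * j
        if profit(p, a) > best_profit:
            best_profit, p_star, a_star = profit(p, a), p, a

q_star = 100.0 * p_star ** -2.0 * a_star ** 0.25
ad_sales_ratio = a_star / (p_star * q_star)
# Dorfman-Steiner predicts e_a / e_p = 0.25 / 2.0 = 0.125 at the optimum.
```

Under these assumed parameters the grid optimum lands near a price of 4 and an advertising budget of about 4.6, and the resulting advertising-to-sales ratio sits close to the predicted 0.125.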

Seventies (70s)

·         1970–1979: The Golden Decade
o   Stochastic models (Massy et al., 1970)
o   Models for marketing instruments (Kotler, 1971)
o   Market response models (Little, 1979a)
o   Marketing decision models such as CALLPLAN (Lodish, 1971)
o   Marketing decision support systems (Little, 1979b)
In the seventies the field of marketing models grew exponentially; this decade can thus be called the Golden Decade of marketing decision models (Wierenga, 2008). Researchers realized that OR algorithms can be very hard to implement for real-world marketing problems, and marketing problems often had to be distorted to fit an existing OR method (Montgomery, 1973). The application of linear programming to media planning is the most notable example (Engel & Warshaw, 1964). Many models were developed and presented in this decade, as the list above shows. The concept of Marketing Decision Support Systems (MDSS) is another major development of the seventies (Little, 1979b).

Eighties (80s)

·         1980–1989: Marketing moves towards generalizations and enhanced marketing knowledge
o   Meta-analyses of the effects of marketing instruments (Asmus et al., 1984)
o   Knowledge-based models and expert systems (Abraham & Lodish, 1987; McCann & Gallagher, 1990)
o   Conjoint analysis models (Green et al., 1981)
The work on marketing models in the seventies was noteworthy and led to generalization in the eighties (Wierenga, 2008). The meta-analyses for advertising (Asmus et al., 1984) and price (Tellis, 1988) are the most often cited studies. The purpose of generalization is basically to condense our knowledge about a particular area or topic. After the mid-eighties the prominent topic was marketing knowledge. Computer science and artificial intelligence (AI) techniques support decision making by storing marketing data in computers and then making that data available to the decision-making process. This contributed considerably to the rise of expert systems and knowledge-based systems; most of these systems in marketing were developed for sales promotions and advertising (Wierenga, 2008). Conjoint analysis models were very prominent in this decade, although the first conjoint work in marketing had already appeared in the seventies (Green & Srinivasan, 1978).
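Conjoint analysis, mentioned above, estimates the utility ("part-worth") of each attribute level from overall ratings of product profiles, typically via regression on dummy-coded levels. The sketch below is a hypothetical illustration, assuming NumPy is available; the attributes (brand, price), the dummy coding and the respondent ratings are all invented.

```python
# Hypothetical conjoint-style part-worth estimation via least squares.
import numpy as np

# Four profiles of a product with brand in {A, B} and price in {high, low}.
# Columns: intercept, brand_B dummy, price_low dummy.
X = np.array([
    [1, 0, 0],   # brand A, high price (baseline profile)
    [1, 1, 0],   # brand B, high price
    [1, 0, 1],   # brand A, low price
    [1, 1, 1],   # brand B, low price
], dtype=float)
ratings = np.array([3.0, 5.0, 6.0, 8.0])   # invented respondent ratings

# Ordinary least squares recovers the part-worth (utility) of each level.
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, pw_brand_b, pw_price_low = coef
```

With these invented ratings the part-worths come out additive (brand B adds 2 points, low price adds 3), which is exactly the decomposition a conjoint study seeks; real studies use many more profiles and respondents.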

Nineties (90s)

·         1990–1999: The Marketing Information Revolution (Relationship Era)
o   Involves long-term, value-added relationships
o   Carried customer orientation considerably further
o   Focuses on establishing and maintaining relationships with both customers and suppliers
o   Marketing moved from being "transaction-based" to focusing on relationships and networking
o   Scanner-data-based consumer choice modeling (Neslin, 1990)
o   Neural networks & data mining (Hruschka, 1993)
o   Theoretical modeling (Moorthy, 1993)
Arguments were made that conventional marketing concentrated on attracting new customers rather than retaining existing ones. It is just as critical to hold on to existing customers so that they become repeat purchasers and long-term loyal customers.

Point-of-purchase (scanner) data became available on a large scale in the nineties. This is also called the "marketing information revolution" and is considered a major stimulus behind the surge in consumer choice modeling (Blattberg et al., 1994). The most prominent instrument for these analyses was the multinomial logit model (Guadagni and Little, 1983). The rapidly growing volume of data encouraged the use of computer science and artificial intelligence to create new techniques, including inductive procedures (e.g., artificial neural networks) that help find regularities in large databases and extract knowledge from data, a concept we call data mining. During the nineties a style of theoretical modeling known as stylized modeling became popular: in such a model the marketing problem and marketing phenomenon are described by a few mathematical equations (Wierenga, 2008).
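The multinomial logit model cited above (Guadagni and Little, 1983) assigns each brand a choice probability proportional to the exponential of its deterministic utility. A minimal sketch follows, with invented brand utilities; in real scanner-data applications the utilities are derived from variables such as price, promotion and brand loyalty.

```python
# Hypothetical multinomial logit choice probabilities (softmax over
# deterministic brand utilities). Utility values below are invented.
import math

def logit_probabilities(utilities):
    """Return choice probabilities P(i) = exp(u_i) / sum_j exp(u_j)."""
    m = max(utilities)                       # subtract max for stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Invented utilities for three competing brands on one purchase occasion.
probs = logit_probabilities([1.0, 0.5, -0.2])
```

The probabilities always sum to one, and the brand with the highest utility gets the largest (but not certain) share, which is what makes the model suitable for estimating brand choice from panel purchase histories.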