Monday, May 9, 2016

The purpose of theory

Q1.      Rudner defines a theory as "a systematically related set of statements, including some law-like generalizations, that is empirically testable. The purpose of theory is to increase scientific understanding through a systematized structure capable of both explaining and predicting phenomena." Agree or disagree, and why? What are the requisites of a fully formalized theory?
The role of theory in science is contained in the following consensus definition of 'theory' proposed by Hunt (1983) following Rudner (1966): A theory is a systematically related set of statements, including some law-like generalizations, that is empirically testable. The purpose of theory is to increase scientific understanding through a systematized structure capable of both explaining and predicting phenomena (emphases ours).
This definition incorporates the key elements of the nature of theory as proposed by philosophers of science from different branches of the social sciences, such as Kaplan (1964)—sociology, Blalock (1969)—statistics, Alderson (1957)—marketing, and Friedman (1953)—economics. We feel, therefore, that this is a reasonably complete specification of the essential criteria and purpose of theory.
MWB make two normative claims in their Propositions 1 and 2 regarding the preferred method of theory construction, i.e. 'all theory generation should depend on some past observation' and 'all observations should be guided and interpreted through some theory.' Both propositions imply that the reverse statements could also be true, but would represent dysfunctional ways of theory-building. MWB thus seem to argue that theories can be generated in the absence of any observations and that data can be interpreted in the absence of any theory, i.e. that theorizing is possible independent of past experience and that data are theoretically neutral. However, a number of philosophers of science (Kaplan, 1964; Churchman, 1971) have shown that these reverse statements cannot be true. The mere process of deriving law-like generalizations about a phenomenon involves the experience of the researcher, and all interpretation of data is conducted within the context of a framework imposed by the researcher. Therefore, MWB's first two propositions should be expressed as positive rather than as normative statements, and as such we are in general agreement with them.
However, an interesting issue with regard to the role of observations in theory-building remains. Churchman (1971) has pointed out that initial observations play different roles in the two methods of theory generation, i.e. the inductive method and the deductive method. The inductive method of the Lockean Inquiring System is 'the process of starting with highly warranted (or well agreed upon) observational statements about specific events and inferring a generalization' (Churchman, 1971: 94). Thus, observations are the very basis of the theory. On the other hand, the deductive mode is 'the process of using a set of assumptions to prove a theorem by some standard set of rules of inference' (Churchman, 1971: 94).
In this method the role of initial observations is to provide a basis for speculation about the phenomenon, which is then followed by development of assumptions and the hypothetical model from which generalizations are deduced.
Now, MWB state that their Proposition 1 is 'the basis of empiricism, or what Churchman (1971) calls the Lockean Inquiry System' (p. 190). They therefore seem to imply that all theory generation should be based on the pure inductive method (i.e. the Lockean Inquiring System).
It is useful to view the pure inductive method and the pure deductive method of building theory as representing the extremes of a continuum. In contrast to MWB, we propose that methods falling along all points of the continuum, including the deductive method, represent valid ways to generate theory for strategy researchers. While the inductivist route has had a prime role in the development of strategy theory, there is no clear reason why this must be normatively true. The research question and the phenomenon of interest dictate whether 'more inductively-oriented' techniques with greater emphasis on the role of initial observations, or 'more deductively oriented techniques' are likely to be useful (see Karnani, 1984, for a good example of the productive use of deductive techniques). MWB seem to feel that because operations researchers and economists, who often use deductive techniques, sometimes sacrifice relevance for mathematical elegance, this somehow makes deductive techniques deficient. However, it is not the deductive procedures which underlie the deficiencies in the theories, but rather the failure of these theories to correspond to the norms underlying 'good' theories. These norms are discussed in the next section.
MWB's notion of what constitutes 'good theory' is contained in their Proposition 3 and discussion of Assertion 1. Proposition 3 states that 'a theory is better, ceteris paribus, (a) if it is refutable and (b) if it is consistent with a body of existing theories' (p. 190). Assertion 1 suggests that 'well-reasoned' theory should underlie strategy research, and MWB offer a series of examples of 'well-reasoned theory' in their discussion.
Because of the interdisciplinary and integrative orientation of the field, most phenomena of interest to strategy researchers are highly complex in nature. Therefore, the complete specification of necessary and sufficient conditions, as well as the explicit formulation of assumptions, are extremely difficult tasks. For the most part we deal in partial explanations and the predictive power of our theorizing is limited.
There is an intrinsic tension between the requirement that theoretical statements have 'precise' explanatory and predictive power and the requirement that they be applicable to a 'wide' range of circumstances. This issue is of particular concern to strategy researchers, who seek an understanding of strategic behavior at different levels of aggregation (i.e. at the firm level, at the strategic group level, at the industry level, and across industries). Often, predictive precision can be obtained for phenomena at lower levels of aggregation (such as strategic groups), but this may imply sacrificing generalizability (for the operation of the phenomena across industries, for instance).
Do these objections make the task of seeking law-like generalizations futile? Not at all. We believe it is a worthwhile endeavor to continue to build mid-range theories, for this can be a useful path to the development of more complete theories. Our view is that a theory is better if it explains a wider range of phenomena while still striving for predictive precision. For example, a theory of the relationship between market share and profitability would be 'better' if it isolated the particular circumstances which govern the nature of this relationship at different levels of aggregation.
If strategic management is to become a science it must strive towards 'explaining by law' the phenomena of interest. It is true that the field has traditionally been concerned with generating normative implications for practicing managers. However, since the goal of science is to explain and predict phenomena, the role of positive research must be recognized. Three related issues merit further discussion.
The first issue has to do with the model of science that is appropriate to follow. In our view it is doubtful whether physics, where even ultimate applications often do not constitute a primary factor in directing research efforts, would be the best model. Rather, it must be recognized that strategic management is more of an applied discipline (similar, perhaps, to engineering), and therefore concerned with the ultimate application of research findings. However, these applications may sometimes take a while to emerge. Therefore, we concur with MWB's argument that direct practical applications should not be required of all papers.
The second issue concerns the identification of the audiences for whom the applications are being generated. Strategy researchers should be encouraged to seek knowledge which is generalizable beyond the confines of their own field. To approach the status of a 'science', it is beneficial to examine issues which are valued in the larger community of scholars and practitioners. Strategy research findings may be extremely relevant for public policy-makers, researchers from other disciplines, consultants, the popular press, or the public at large.
The third issue concerns the role of application in the development of the field. MWB provide an interesting discussion of the 'division of labor' among researchers. However, this division may be even broader than MWB suggest. Some philosophers of science suggest that there are two distinct roles necessary for the advancement of knowledge (Manicas and Secord, 1983). It is important for scientists (either 'pure' or applied) to conduct theoretical and empirical research in order to uncover causal structures. Under this model, it is then the role of the technician to apply these decision rules to the situations faced by particular organizations. For instance, a manager or consultant may make strategic plans for a corporation using an analytical framework which is developed from research findings.

It is important to realize that the theoretical issues raised by MWB are present (though often only implied or assumed) in every empirical article published in the strategy literature. We feel that the strategy field is at an important juncture. Research can continue in a rather unfocused fashion (as well described by MWB), or attempts can be made to coordinate research efforts to conform with certain guidelines. The guidelines we suggest are of necessity somewhat broad. It is important to guard against forming arbitrary rules to guide the research process, because these may in fact hinder the transition to an organized and recognized science. 

Friday, January 8, 2016

Elaboration Likelihood Model

Title: From the Periphery To The Center: An Emotional Perspective Of The Elaboration Likelihood Model            
Author: Ajatshatru Singh           

Model/Theory

Elaboration Likelihood Model

The link between conceptual framework and theory/model

The current research notes that most theories of attitude change and persuasion are concerned with a cognitive route to long-term persuasion and longer-lasting attitude change. The Elaboration Likelihood Model (ELM) is one such theory. It emphasizes that cognition is the central element in the route to attitude change, and treats emotion as an aspect of the process, but a less important one.
This study attempts to place emotion at the center of the attitude-change process by examining spontaneous reactions to two automobile commercials and then linking these reactions to purchase intentions in order to make its case for emotion. The results of this study demonstrate that emotion is an important element in the process of attitude change and that it may play a more central role in this process than has previously been shown.
The ANOVA conducted between the two groups revealed a significant presence of emotion in the "cognitive" group. This shows that even though information is being processed cognitively, the processing is not taking place in an emotional vacuum. In fact, the "cognitive" group showed a significantly higher emotional response on the emotional variable of pleasure. These indications of emotion in cognitive processing support the need to reexamine the role of emotion in the ELM.
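The two-group comparison described above can be illustrated with a one-way ANOVA computed from first principles. The pleasure ratings below are hypothetical values invented purely for illustration (the paper's actual data are not reproduced here); with two groups, the F-test is equivalent to a two-sample t-test.

```python
# One-way ANOVA (two groups), computed from the standard sums of squares.
# The pleasure ratings below are hypothetical, for illustration only.

cognitive = [5.2, 4.8, 5.5, 5.0, 4.9, 5.3]   # "cognitive" ad condition
emotional = [5.6, 5.9, 5.4, 6.1, 5.8, 5.7]   # "emotional" ad condition

def one_way_anova(*groups):
    """Return (F, df_between, df_within) for a one-way ANOVA."""
    all_obs = [x for g in groups for x in g]
    grand_mean = sum(all_obs) / len(all_obs)
    # Between-groups sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: squared deviations inside each group
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_obs) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

F, df1, df2 = one_way_anova(cognitive, emotional)
print(f"F({df1}, {df2}) = {F:.2f}")
```

A large F here would indicate that mean pleasure differs reliably between the two ad conditions, which is the kind of evidence the study uses to argue emotion is present even in "cognitive" processing.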
Dependent Variable: Attitude
Independent Variables: CAR = vehicle being tested; P = Pleasure; A = Arousal; D = Dominance; C = Cognitive; E = Emotional; Cognitive SR; Emotional SR




AIDA Model

Title: Explain the effectiveness of advertising using the AIDA model
Author: Sahar Gharibi, Dr. Syed Yahyah Danesh, Dr. Kambiz Shahrodi

Model/Theory

AIDA Model is used in the conceptual framework.

The link between conceptual framework and theory/model

The primary objective of this research is to explain the effectiveness of advertising, using the AIDA model, in the private insurance companies present in Tehran, the metropolitan capital of Iran. In order to assess advertising effectiveness, the study uses the AIDA model, one of the empirical hierarchy-of-effects models, to evaluate the best promotional strategy. The model was introduced by Elmo Lewis in 1898.
The AIDA model is used for planning promotional messages in a way that serves four general purposes: to attract Attention, create Interest, stimulate Desire, and push people to purchase (Action) (Birch, 2010). Over the past century, several models have been proposed for the effectiveness of advertising and marketing promotions, which have become known as hierarchy-of-effects models. Foremost among them is the AIDA model, presented by Elmo Lewis.
In the years since then, many models have been proposed by scholars using different strategies adapted from the AIDA model. And still, after around a century, the AIDA model retains a great many adherents (Howard, 1990). Marketing and promotions ought to be seen, read, and understood, and then acted upon. Advertising leads people from ignorance to knowledge, perception, and persuasion, creating eagerness to purchase (action).
According to this model, the greater part of what advertising and marketing ought to do is create attentiveness in the viewer (Gharibi, Shahrodi & Danesh, 2012).

Figure 3: Conceptual Framework (Gharibi et al., 2011)



Dependent & Independent Variables

Table 4: Dependent & Independent Variables
Dependent Variable: Effectiveness of Environmental Advertising
Independent Variables: Attention, Interest, Desire, Action

Thursday, January 7, 2016

A study on Theory of Reasoned Action

Title: Using the Theory of Reasoned Action to examine the gambling behaviors of college athletes and other students.
Author: Robert Gene Thrasher

Model/Theory

Theory of Reasoned Action

The link between conceptual framework and theory/model

The general aim of the present study was to examine the gambling behavior of college students and, specifically, undergraduate athletes. The study examined the relationships among subjective norms, gambling attitudes, gambling motivations, locus of control, and gambling intentions on the gambling behavior of students. The objective was to survey gambling in a clearly defined population with easy access to gambling and to assess the adequacy of a modified Theory of Reasoned Action (TRA) (Ajzen & Fishbein, 1980) for predicting gambling frequency and gambling behavior. Several studies have used the TRA to analyze gambling behavior (Oh & Hsu, 2001; Moore & Ohtsuka, 1997, 1999).
These studies found the TRA to be an efficacious instrument for analyzing gambling behavior. However, the researcher of the current study opted to modify the TRA by integrating two moderating variables, intrinsic motivation and locus of control, into the conceptual framework, which can be a good option for future research (Thrasher, 2006).

Figure 1: Conceptual Framework (Thrasher, 2009)

Dependent & Independent Variables

Table 1: Dependent & Independent Variables
Dependent Variables: Gambling Intentions; Gambling Behaviors (Frequency, Expenditure)
Independent Variables: Gambling attitudes; Subjective norms; Student Athletes; Gender; Grade Point Average
Moderating Variables: Gambling motivations (intrinsic motivation); Locus of control


Tuesday, January 5, 2016

Evolution of Marketing Tools

The history of marketing thought was categorized by Robert Bartels. He categorized it by decade, starting from the initial years of the 20th century. Some people take issue with this characterization ("History of Marketing", n.d.). The details are given below:
·         In the 1900s: the discovery of primary marketing concepts and their exploration
·         In the 1910s: the definition of terms, conceptualization & classification
·         In the 1920s: the integration of the field on the basis of principles
·         In the 1930s: the development of specialization and variation in theory
·         In the 1940s: reappraisal in the light of new demands and a more scientific approach
·         In the 1950s: re-conceptualization in the light of managerialism, social development & quantitative approaches
·         In the 1960s: differentiation on bases such as managerialism, holism, environmentalism, systems, and internationalism
·         In the 1970s: socialization, i.e. the adaptation of marketing to social change
With the growth in the significance of marketing departments and their associated managers, the field has become ripe for the propagation of management fads which do not always lend themselves to periodization.

Sixties (60s)

·         1960 to 1969: It was The Beginning Decade
o   Micro-economic changes and their approaches to marketing problems (Nerlove & Arrow, 1962)
o   Marketing issues formulated as known operations research (OR) problems (Engel & Warshaw, 1964); (Montgomery & Urban, 1969)
Micro-economics is the literature where the initial mathematical techniques and their application to marketing problems can be found. The theorem for marketing-mix optimization in the Dorfman and Steiner paper (1954) was especially famous during this decade (Wierenga, 2008). Later in the decade, more work was done in Operations Research and Management Science, and OR concepts were combined with concepts from the theory of decision making (Pratt et al., 1965).
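For reference, the Dorfman and Steiner (1954) result mentioned above can be stated compactly. In its standard textbook form (the notation here is ours, not the paper's), the profit-maximizing advertising-to-sales ratio equals the ratio of the advertising elasticity of demand to the (absolute) price elasticity:

```latex
% Dorfman-Steiner condition (standard form): at the profit maximum,
% the advertising-to-sales ratio equals the ratio of the advertising
% elasticity of demand to the (absolute) price elasticity.
\frac{A}{pq} = \frac{\varepsilon_A}{\varepsilon_p},
\qquad
\varepsilon_A = \frac{\partial q}{\partial A}\,\frac{A}{q},
\qquad
\varepsilon_p = -\,\frac{\partial q}{\partial p}\,\frac{p}{q}
```

where A is advertising expenditure, p is price, and q is quantity demanded.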

Seventies (70s)

·         1970–1979 It was The Golden Decade
o   Stochastic models (Massy et al., 1970)
o   Models for marketing instruments by Kotler (1971)
o   Market response models by Little (1979a)
o   Marketing decision models such as CALLPLAN (Lodish, 1971)
o   Marketing decision support systems by Little (1979b)
In the seventies the field of marketing decision models developed and grew exponentially; this decade can therefore be called the Golden Decade of marketing decision models (Wierenga, 2008). Researchers came to understand that OR algorithms can be very hard to implement for real-world marketing problems; indeed, marketing problems often had to be distorted to fit an existing OR method (Montgomery, 1973). The application of linear programming to media planning is the most notable example (Engel & Warshaw, 1964). Many models were developed and presented in this decade, as can be seen in the list above. The concept of "Marketing Decision Support Systems (MDSS)" was also one of the big developments of the seventies (Little, 1979b).
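The media-planning application mentioned above can be made concrete with a toy allocation problem. All vehicle names, costs, audience figures, and the budget below are invented; and rather than calling an LP solver, this sketch simply enumerates small integer insertion counts, keeping the same objective (maximize exposures) and budget constraint that the linear-programming formulations encoded.

```python
from itertools import product

# Toy media-planning problem (all numbers invented for illustration):
# choose how many insertions to buy in each vehicle so as to maximize
# total audience exposures subject to a budget, mirroring the LP
# formulations of the 1960s-70s (solved here by plain enumeration).

vehicles = {               # name: (cost per insertion, exposures per insertion)
    "tv_spot":   (30.0, 900.0),
    "radio":     (8.0, 220.0),
    "newspaper": (12.0, 310.0),
}
budget = 100.0
max_insertions = 5         # upper bound on insertions per vehicle

names = list(vehicles)
best_plan, best_exposures = None, -1.0
for counts in product(range(max_insertions + 1), repeat=len(names)):
    cost = sum(n * vehicles[v][0] for n, v in zip(counts, names))
    if cost > budget:      # budget constraint
        continue
    exposures = sum(n * vehicles[v][1] for n, v in zip(counts, names))
    if exposures > best_exposures:
        best_plan, best_exposures = dict(zip(names, counts)), exposures

print(best_plan, best_exposures)
```

A real LP formulation would relax the insertion counts to continuous variables and hand the same objective and constraints to a solver; the enumeration here is only practical because the example is tiny.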

Eighties (80s)

·         1980–1989: Marketing moves towards generalizations and enhanced marketing knowledge
o   Meta-analyses of the effects of marketing instruments (Assmus et al., 1984)
o   Knowledge-based models and expert systems (Abraham & Lodish, 1987); (McCann & Gallagher, 1990)
o   Conjoint analysis models (Green et al., 1981)
The work on marketing and promotion models in the seventies was noteworthy, and in the eighties this knowledge was generalized (Wierenga, 2008). The meta-analyses for advertising (Assmus et al., 1984) and price (Tellis, 1988) are the most often cited studies. The purpose of generalization is essentially to condense our knowledge about a particular area or subject. After the mid-eighties, the prominent topic was marketing knowledge. Computer science and artificial intelligence (AI) techniques support decision making by storing marketing data in computers and then making that data available to the decision-making process. This contributed a considerable amount to the development of expert systems and knowledge-based systems. The greater part of these systems in marketing were developed for sales promotions and advertising (Wierenga, 2008). Conjoint analysis models were very prominent in this decade, although the first work on conjoint analysis in marketing had appeared in the 70s (Green & Srinivasan, 1978).
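At the heart of the conjoint models mentioned above is an additive part-worth rule: a product profile's utility is the sum of the part-worths of its attribute levels. The sketch below uses invented attributes and part-worth values purely to illustrate this mechanic; estimating the part-worths from respondents' preference data is the actual job of conjoint analysis.

```python
# Additive part-worth model underlying conjoint analysis:
# a product profile's utility is the sum of the part-worths of its
# attribute levels. All part-worth values are invented for illustration.

part_worths = {
    "brand":   {"A": 0.6, "B": 0.1},
    "price":   {"$1.99": 0.5, "$2.49": -0.4},
    "package": {"glass": 0.2, "plastic": -0.1},
}

def utility(profile):
    """Sum the part-worths of the levels making up one product profile."""
    return sum(part_worths[attr][level] for attr, level in profile.items())

profiles = [
    {"brand": "A", "price": "$2.49", "package": "glass"},
    {"brand": "B", "price": "$1.99", "package": "plastic"},
]
# Rank profiles from most to least preferred under the additive model.
ranked = sorted(profiles, key=utility, reverse=True)
print([(p["brand"], round(utility(p), 2)) for p in ranked])
```

Here a strong brand does not guarantee the top rank: the cheaper profile wins because its price part-worth outweighs the brand difference, which is exactly the kind of trade-off conjoint analysis is designed to quantify.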

Nineties (90s)

·         1990–1999: It was the Marketing Information Revolution (Relationship Era)
o   Involves long-term, value-added relationships
o   Carried customer orientation considerably further
o   Focuses on establishing and maintaining relationships with both customers and suppliers
o   Marketing has moved from being "transaction-based" to focusing on relationships and networking
o   Scanner-data-based consumer choice modeling (Neslin, 1990)
o   Neural networks and data mining (Hruschka, 1993)
o   Theoretical modeling (Moorthy, 1993)
The argument was made that conventional marketing concentrated on attracting new customers rather than retaining existing ones. It is just as important to hold on to existing customers so that they become repeat purchasers and long-term loyal customers.

Point-of-purchase data (scanner data) became available on a large scale in the nineties. This is also called the "marketing information revolution", and it is considered a major stimulus behind the surge in consumer choice modeling (Blattberg et al., 1994). The most prominent instrument for these studies was the multinomial logit model (Guadagni and Little, 1983). The rapidly growing volume of data encouraged the use of computer science and artificial intelligence to create new techniques, including inductive procedures (e.g., artificial neural networks) that help find regularities in large databases and extract knowledge from data, a concept we call data mining. During the nineties, a style of theoretical modeling called "stylized" modeling became popular, in which a marketing problem or marketing phenomenon is described by a few mathematical equations (Wierenga, 2008).
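The multinomial logit model mentioned above assigns brand-choice probabilities by a softmax over linear brand utilities. The coefficients and attribute values below are invented for illustration; this shows the general MNL mechanic, not the actual Guadagni and Little (1983) specification.

```python
import math

# Multinomial logit (MNL) brand choice: P(brand i) = exp(V_i) / sum_j exp(V_j),
# where V_i is a linear utility of the brand's marketing-mix attributes.
# Coefficients and attribute values are invented for illustration.

beta_price, beta_promo = -2.0, 0.8    # hypothetical utility weights
brands = {                            # name: (shelf price, on promotion?)
    "A": (1.99, 1),
    "B": (2.49, 0),
    "C": (1.79, 0),
}

def choice_probabilities(brands):
    """Softmax of linear utilities over the choice set."""
    utilities = {b: beta_price * price + beta_promo * promo
                 for b, (price, promo) in brands.items()}
    denom = sum(math.exp(v) for v in utilities.values())
    return {b: math.exp(v) / denom for b, v in utilities.items()}

probs = choice_probabilities(brands)
print({b: round(p, 3) for b, p in probs.items()})
```

Fitted to scanner panels, models of this form let researchers read price and promotion effects directly off the estimated coefficients, which is why the MNL became the workhorse of the scanner-data era.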